Tagged: LSST-Large Synoptic Survey Telescope

  • richardmitnick 3:17 pm on May 14, 2018 Permalink | Reply
    Tags: LSST-Large Synoptic Survey Telescope, The next big discovery in astronomy? Scientists probably found it years ago – but they don’t know it yet

    From The Conversation: “The next big discovery in astronomy? Scientists probably found it years ago – but they don’t know it yet” 

    Conversation
    From The Conversation

    May 14, 2018
    Eileen Meyer

    An artist’s illustration of a black hole “eating” a star. NASA/JPL-Caltech

    Earlier this year, astronomers stumbled upon a fascinating finding: Thousands of black holes likely exist near the center of our galaxy.

    Hundreds — Perhaps Thousands — of Black Holes Occupy the Center of the Milky Way

    The X-ray images that enabled this discovery weren’t from some state-of-the-art new telescope. Nor were they even recently taken – some of the data was collected nearly 20 years ago.

    No, the researchers discovered the black holes by digging through old, long-archived data.

    Discoveries like this will only become more common, as the era of “big data” changes how science is done. Astronomers are gathering an exponentially greater amount of data every day – so much that it will take years to uncover all the hidden signals buried in the archives.

    The evolution of astronomy

    Sixty years ago, the typical astronomer worked largely alone or in a small team. They likely had access to a respectably large ground-based optical telescope at their home institution.

    Their observations were largely confined to optical wavelengths – more or less what the eye can see. That meant they missed signals from a host of astrophysical sources, which can emit non-visible radiation from very low-frequency radio all the way up to high-energy gamma rays. For the most part, if you wanted to do astronomy, you had to be an academic or eccentric rich person with access to a good telescope.

    Old data was stored in the form of photographic plates or published catalogs. But accessing archives from other observatories could be difficult – and it was virtually impossible for amateur astronomers.

    Today, there are observatories that cover the entire electromagnetic spectrum. No longer operated by single institutions, these state-of-the-art observatories are usually launched by space agencies and are often joint efforts involving many countries.

    With the coming of the digital age, almost all data are publicly available shortly after they are obtained. This makes astronomy very democratic – anyone who wants to can reanalyze almost any data set that makes the news. (You too can look at the Chandra data that led to the discovery of thousands of black holes!)

    These observatories generate a staggering amount of data. For example, the Hubble Space Telescope, operating since 1990, has made over 1.3 million observations and transmits around 20 GB of raw data every week, which is impressive for a telescope first designed in the 1970s.

    NASA/ESA Hubble Telescope

    The Atacama Large Millimeter Array in Chile now anticipates adding 2 TB of data to its archives every day.

    ESO/NRAO/NAOJ ALMA Array on the Chajnantor plateau in the Atacama Desert of Chile, at 5,000 metres

    Data firehose

    The archives of astronomical data are already impressively large. But things are about to explode.

    Each generation of observatories is usually at least 10 times more sensitive than the previous one, either because of improved technology or because the mission is simply larger. Depending on how long a new mission runs, it can detect hundreds of times more astronomical sources than previous missions at that wavelength.

    For example, compare the early EGRET gamma ray observatory, which flew in the 1990s, to NASA’s flagship mission Fermi, which turns 10 this year. EGRET detected only about 190 gamma ray sources in the sky. Fermi has seen over 5,000.

    NASA/Fermi LAT


    NASA/Fermi Gamma Ray Space Telescope

    The Large Synoptic Survey Telescope, an optical telescope currently under construction in Chile, will image the entire sky every few nights. It will be so sensitive that it will generate 10 million alerts per night on new or transient sources, leading to a catalog of over 15 petabytes after 10 years.
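
    To get a feel for those numbers, here is a back-of-envelope multiplication (a rough sketch, assuming 10 million alerts on every one of roughly 3,650 observing nights; real surveys lose nights to weather and maintenance):

```python
# Rough scale of the LSST alert stream and catalog, using the figures quoted above.
alerts_per_night = 10_000_000      # new or transient sources flagged each night
nights = 10 * 365                  # ~10-year survey, ignoring weather and downtime
catalog_pb = 15                    # quoted final catalog size in petabytes

print(f"Alerts over the survey: ~{alerts_per_night * nights:.1e}")        # ~3.7e10
print(f"Catalog size: ~{catalog_pb} PB = ~{catalog_pb * 1_000_000} GB")   # 15,000,000 GB
```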

    LSST

    LSST Camera, built at SLAC



    LSST telescope, currently under construction on the El Peñón peak at Cerro Pachón Chile, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    The Square Kilometre Array, when completed in 2020, will be the most sensitive telescope in the world, capable of detecting airport radar stations of alien civilizations up to 50 light-years away. In just one year of activity, it will generate more data than the entire internet.


    SKA/ASKAP radio telescope at the Murchison Radio-astronomy Observatory (MRO) in the Mid West region of Western Australia


    SKA Murchison Widefield Array, Boolardy station in outback Western Australia, at the Murchison Radio-astronomy Observatory (MRO)


    SKA Meerkat telescope, 90 km outside the small Northern Cape town of Carnarvon, SA


    SKA LOFAR core (“superterp”) near Exloo, Netherlands


    These ambitious projects will test scientists’ ability to handle data. Images will need to be automatically processed – meaning that the data will need to be reduced down to a manageable size or transformed into a finished product. The new observatories are pushing the envelope of computational power, requiring facilities capable of processing hundreds of terabytes per day.

    The resulting archives – all publicly searchable – will contain 1 million times more information than can be stored on your typical 1 TB backup disk.

    Unlocking new science

    The data deluge will make astronomy a more collaborative and open science than ever before. Thanks to internet archives, robust learning communities and new outreach initiatives, citizens can now participate in science. For example, with the computer program Einstein@Home, anyone can use their computer’s idle time to help search for gravitational waves from rapidly spinning neutron stars.

    It’s an exciting time for scientists, too. Astronomers like myself often study physical phenomena on timescales so wildly beyond the typical human lifetime that watching them in real-time just isn’t going to happen. Events like a typical galaxy merger – which is exactly what it sounds like – can take hundreds of millions of years. All we can capture is a snapshot, like a single still frame from a video of a car accident.

    However, there are some phenomena that occur on shorter timescales, taking just a few decades, years or even seconds. That’s how scientists discovered those thousands of black holes in the new study. It’s also how they recently realized that the X-ray emission from the center of a nearby dwarf galaxy has been fading since first detected in the 1990s. These new discoveries suggest that more will be found in archival data spanning decades.

    In my own work, I use Hubble archives to make movies of “jets,” high-speed plasma ejected in beams from black holes. I used over 400 raw images spanning 13 years to make a movie of the jet in nearby galaxy M87. That movie showed, for the first time, the twisting motions of the plasma, suggesting that the jet has a helical structure.
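
    A minimal sketch of the kind of archival workflow described here, assuming a directory of already-calibrated, already-registered FITS cutouts of the same field (the filenames and stretch are illustrative, not the actual data products used in the study):

```python
# Sketch: turn a stack of archival FITS images into time-ordered movie frames.
# Assumes the images are already registered to a common pixel grid.
import glob
import numpy as np
import matplotlib.pyplot as plt
from astropy.io import fits

frames = []
for path in sorted(glob.glob("m87_jet/*.fits")):          # hypothetical cutout files
    with fits.open(path) as hdul:
        data = hdul[0].data.astype(float)
        date = hdul[0].header.get("DATE-OBS", "unknown")  # standard FITS keyword
    frames.append((date, data))

frames.sort(key=lambda f: f[0])                           # put the epochs in time order

for i, (date, data) in enumerate(frames):
    # Arcsinh stretch keeps both the bright core and the faint jet visible.
    stretched = np.arcsinh(data / np.nanpercentile(data, 99))
    plt.imsave(f"frame_{i:03d}_{date}.png", stretched, cmap="gray", origin="lower")
```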

    This kind of work was only possible because other observers, for other purposes, just happened to capture images of the source I was interested in, back when I was in kindergarten. As astronomical images become larger, higher resolution and ever more sensitive, this kind of research will become the norm.

    See the full article here.

    Please help promote STEM in your local schools.

    stem

    Stem Education Coalition

    The Conversation US launched as a pilot project in October 2014. It is an independent source of news and views from the academic and research community, delivered direct to the public.
    Our team of professional editors work with university and research institute experts to unlock their knowledge for use by the wider public.
    Access to independent, high quality, authenticated, explanatory journalism underpins a functioning democracy. Our aim is to promote better understanding of current affairs and complex issues. And hopefully allow for a better quality of public discourse and conversation.

     
  • richardmitnick 12:53 pm on April 17, 2018 Permalink | Reply
    Tags: LSST-Large Synoptic Survey Telescope

    From Symmetry: “The world’s largest astronomical movie” 

    Symmetry Mag
    Symmetry

    04/17/18
    Manuel Gnida

    Artwork by Sandbox Studio, Chicago with Ana Kova

    When the Large Synoptic Survey Telescope begins to survey the night sky in the early 2020s, it’ll collect a treasure trove of data.

    LSST


    LSST Camera, built at SLAC



    LSST telescope, currently under construction on the El Peñón peak at Cerro Pachón Chile, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    The information will benefit a wide range of groundbreaking astronomical and astrophysical research, addressing topics such as dark matter, dark energy, the formation of galaxies and detailed studies of objects in our very own cosmic neighborhood, the Milky Way.

    LSST’s centerpiece will be its 3.2-gigapixel camera, which is being assembled at the US Department of Energy’s SLAC National Accelerator Laboratory. Every few days, the largest digital camera ever built for astronomy will compile a complete image of the Southern sky. Moreover, it’ll do so over and over again for a period of 10 years. It’ll track the motions and changes of tens of billions of stars, galaxies and other objects in what will be the world’s largest stop-motion movie of the universe.

    Fulfilling this extraordinary task requires extraordinary technology. The camera will be the size of a small SUV, weigh in at a whopping 3 tons, and use state-of-the-art optics, imaging technology and data management tools. But how exactly will it work?

    Artwork by Sandbox Studio, Chicago with Ana Kova

    Collecting ancient light

    It all starts with choosing the right location for the telescope. Astronomers want the sharpest images of the dimmest objects for their analyses, and they also want to maximize their observation time. They need the nights to be dark and the air to be dry and stable.

    It turns out that the Atacama Desert, a plateau in the foothills of the Andes Mountains, scores very high for these criteria. That’s where LSST will be located—at nearly 8700 feet altitude on the Cerro Pachón ridge in Chile, 60 miles from the coastal town of La Serena.

    The next challenge is that most objects LSST researchers want to study are so far away that their light has been traveling through space for millions to billions of years. It arrives on Earth merely as a faint glow, and astronomers need to collect as much of that glow as possible. For this purpose, LSST will have a large primary mirror with a diameter close to 28 feet.

    The mirror will be part of a sophisticated three-mirror system that will reflect and focus the cosmic light into the camera.

    The unique optical design is crucial for the telescope’s extraordinary field of view—a measure of the area of sky captured with every snapshot. At 9.6 square degrees, corresponding to 40 times the area of the full moon, the large field of view will allow astronomers to put together a complete map of the Southern night sky every few days.

    After bouncing off the mirrors, the ancient cosmic light will enter the camera through a set of three large lenses. The largest one will have a diameter of more than 5 feet.

    Together with the mirrors, the lenses’ job is to focus the light as sharply as possible onto the focal plane—a grid of light-sensitive sensors at the back of the camera where the light from the sky will be detected.

    A filter changer will insert filters in front of the third lens, allowing astronomers to take images with different kinds of cosmic light that range from the ultraviolet to the near-infrared. This flexibility enhances the range of possible observations with LSST. For example, with an infrared filter researchers can look right through dust and get a better view of objects obscured by it. By comparing how bright an object is when seen through different filters, astronomers also learn how its emitted light varies with the wavelength, which reveals details about how the light is produced.
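
    Comparing the same object’s brightness through two filters reduces to simple magnitude arithmetic: the ratio of fluxes maps to a “color index.” A minimal sketch with made-up flux values:

```python
import math

# Fluxes of the same object measured through two filters (arbitrary units, invented).
flux_g = 1200.0   # bluer filter
flux_i = 3000.0   # redder filter

# Color index: m_g - m_i = -2.5 * log10(f_g / f_i)
color = -2.5 * math.log10(flux_g / flux_i)
print(f"g - i color: {color:.2f} mag")   # positive value -> the object is brighter in the red
```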

    Artwork by Sandbox Studio, Chicago with Ana Kova

    An Extraordinary Imaging Device

    The heart of LSST’s camera is its 25-inch-wide focal plane. That’s where the light of stars and galaxies will be turned into electrical signals, which will then be used to reconstruct images of the sky. The focal plane will hold 189 imaging sensors, called charge-coupled devices, that perform this transformation.

    Each CCD is 4096 pixels wide and long, and together they’ll add up to the camera’s 3.2 gigapixels. A “good” star will be the size of only a handful of pixels, whereas distant galaxies might appear as somewhat larger fuzzballs.
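
    The camera’s pixel count quoted here is just the number of CCDs times the pixels per CCD; the one-liner below checks the arithmetic:

```python
ccds = 189
pixels_per_side = 4096
total_pixels = ccds * pixels_per_side ** 2
print(f"{total_pixels:,} pixels (~{total_pixels / 1e9:.1f} gigapixels)")  # ~3.2 gigapixels
```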

    The focal plane will consist of 21 smaller square arrays, called rafts, with nine CCDs each. This modular structure will make it easier and less costly to replace imaging sensors if needed in the future.

    To the delight of astronomers interested in extremely dim objects, the camera will have a large aperture (f/1.2, for the photographers among us), meaning that it’ll let a lot of light onto the imaging sensors. However, the large aperture will also make the depth of field very shallow, which means that objects will become blurry very quickly if they are not precisely projected onto the focal plane. That’s why the focal plane will need to be extremely flat, demanding that individual CCDs don’t stick out or recess by more than 0.0004 inches.

    To eliminate unwanted background signals, known as dark currents, the sensors will also need to be cooled to minus 150 degrees Fahrenheit. The temperature will need to be kept stable to half a degree. Because water vapor inside the camera housing would form ice on the sensors at this chilly temperature, the focal plane must also be kept in a vacuum.

    In addition to the 189 “science” sensors that will capture images of the sky, the focal plane will also have three specialty sensors in each of its four corners. Two so-called guiders will frequently monitor the position of a reference star and help LSST stay in sync with the Earth’s rotation. The third sensor, called a wavefront sensor, will be split into two halves that will be positioned six-hundredths of an inch above and below the focal plane. It’ll see objects as blurry “donuts” and provide information that will be used to adjust the telescope’s focus.

    Cinematography of astronomical dimension

    Once the camera has taken enough data from a patch in the sky, about every 36 seconds, the telescope will be repositioned to look at the next spot. A computer algorithm will determine the patches in the sky that will be surveyed by LSST on any given night.

    While the telescope is moving, a shutter between the filter and the camera’s third lens will close to prevent more light from falling onto the imaging sensors. At the same time, the CCDs will be read out and their information digitized.

    The data will be sent into the processing and analysis pipeline that will handle LSST’s enormous flood of information (about 20 terabytes of data every single night). There, it will be turned into usable images. The system will also flag potentially interesting events and send out alerts to astronomers within a minute.
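
    As a rough consistency check on that nightly data volume, assume about two bytes per raw pixel and one 3.2-gigapixel exposure roughly every 18 seconds over a 10-hour night (simplifying assumptions, not the project’s exact numbers):

```python
pixels = 3.2e9              # camera pixels
bytes_per_pixel = 2         # assumed raw depth (simplification)
exposure_bytes = pixels * bytes_per_pixel

night_seconds = 10 * 3600   # ~10-hour observing night
seconds_per_exposure = 18   # roughly two exposures per ~36-second visit
exposures = night_seconds / seconds_per_exposure

nightly_tb = exposure_bytes * exposures / 1e12
print(f"~{exposure_bytes/1e9:.1f} GB per exposure, ~{exposures:.0f} exposures, ~{nightly_tb:.0f} TB per night")
# ~13 TB of raw pixels -- the same order as the ~20 TB/night quoted above once
# overheads, calibration frames and metadata are included.
```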

    This way—patch by patch—a complete image of the entire Southern sky will be stitched together every few days. Then the imaging process will start over and repeat for the 10-year duration of the survey, ultimately creating the largest time-lapse movie of the universe ever made and providing researchers with unprecedented research opportunities.

    For more information on LSST, visit LSST’s website or SLAC’s LSST camera website.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 1:27 pm on February 9, 2018 Permalink | Reply
    Tags: LSST-Large Synoptic Survey Telescope

    From LSST: “LSST’s Auxiliary Telescope” 

    LSST

    Large Synoptic Survey Telescope

    February 6, 2018


    In tandem with LSST’s construction on Cerro Pachón, a smaller telescope will soon be assembled on nearby calibration hill, a short distance away from the main LSST Facility. LSST’s 1.2-meter Auxiliary Telescope will measure atmospheric transmission, which refers to how directly light is transmitted through the Earth’s atmosphere at a given spot, as opposed to being absorbed or scattered. Because the presence of certain molecules and particles in the atmosphere will change the color of light detected by the LSST telescope, data collected by the Auxiliary Telescope, as it mirrors the nightly movements of LSST, will inform the catalog corrections that need to be made to LSST data in order to render it more accurate.

    Elements in the atmosphere that affect how light is detected by a ground based telescope like LSST include water, oxygen, and ozone, as well as aerosols like sea salt, dust from volcanoes, and smoke from forest fires. The presence and quantity of these elements varies from night to night, so the Auxiliary Telescope will provide this important complementary data for LSST throughout survey operations. According to Calibration Hardware Scientist Patrick Ingraham, “Having a dedicated auxiliary telescope supporting the main telescope is somewhat unique, and it will increase the quality of data produced by LSST.”
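
    At its core, the correction the Auxiliary Telescope enables is a flux rescaling: if the atmosphere transmits a fraction T of the light in a passband on a given night, the source appears fainter by 2.5·log10(1/T) magnitudes. A minimal sketch, with an invented transmission value:

```python
import math

def atmospheric_correction(observed_mag: float, transmission: float) -> float:
    """Remove the dimming caused by a known atmospheric transmission fraction."""
    dimming = -2.5 * math.log10(transmission)   # how much fainter the atmosphere made the source
    return observed_mag - dimming

# Example: 85% transmission (made-up value for one night, one passband)
print(atmospheric_correction(observed_mag=21.30, transmission=0.85))  # ~21.12: intrinsically brighter
```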

    The Auxiliary Telescope itself wasn’t built from scratch; it’s an existing telescope that has been repurposed for its role in the LSST survey. Since being moved from its original location on nearby Kitt Peak in May 2014, it’s been housed in the workshop at LSST’s Project Office in Tucson, AZ. Refurbishment work has included replacement of all the telescope’s electrical parts, including the motors and the position encoders, which record the exact position of the telescope at any given time. Mechanically speaking, the telescope is largely unchanged. Its mirrors, which were removed while work was done, will be recoated and reinstalled once the telescope arrives on Cerro Pachón; they are currently stored in separate crates that will protect them during shipping.

    Currently, the subcontractor working on the refurbishment project is almost finished with the wiring of the telescope’s electrical components. Once that’s complete, the telescope will undergo functional testing of its mechanical and electrical systems. Individual tasks that make up this testing include driving the telescope toward its upper and lower limits and ensuring the system will shut off before those limits are reached (preventing damage to the telescope), testing for excessive vibration, and testing the speed at which the telescope slews, or moves from one spot to the next. Extensive functional testing is critical now, because once the telescope is on Cerro Pachón there won’t be sufficient facilities to easily make repairs. Optical testing of the telescope will occur after the telescope is installed in its facility on the summit and re-integrated with its mirrors.
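
    Purely as an illustration of what such functional checks might look like in software, the sketch below drives a hypothetical `mount` object toward a limit and times a slew; none of these method names come from the actual LSST control system:

```python
import time

# Hypothetical functional-test sketch for a telescope mount; the `mount` API is invented.
def test_software_limits(mount, axis: str, hardware_limit_deg: float, margin_deg: float = 2.0):
    """Drive toward a limit and confirm the software stop engages before the hardware limit."""
    mount.move_axis(axis, target_deg=hardware_limit_deg)   # command a move past the soft limit
    stopped_at = mount.wait_until_stopped(axis)             # position where motion actually ended
    assert stopped_at <= hardware_limit_deg - margin_deg, (
        f"{axis} axis reached {stopped_at} deg; software limit failed to protect the telescope")

def test_slew_speed(mount, axis: str, start_deg: float, end_deg: float, min_deg_per_s: float):
    """Time a slew between two positions and check it meets the required speed."""
    mount.move_axis(axis, target_deg=start_deg)
    mount.wait_until_stopped(axis)
    t0 = time.monotonic()
    mount.move_axis(axis, target_deg=end_deg)
    mount.wait_until_stopped(axis)
    rate = abs(end_deg - start_deg) / (time.monotonic() - t0)
    assert rate >= min_deg_per_s, f"slew rate {rate:.2f} deg/s below requirement"
```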

    Once the telescope is officially ready to be shipped from Tucson to Chile, the individual telescope assemblies will be packed in custom crates, and these crates will be loaded into a shipping container. It will take about 2 months for the shipping container to get from Tucson to Cerro Pachón. Once there, the telescope will be installed in a few pieces, with a crane, through the dome of its facility on calibration hill. Photos of the Auxiliary Telescope in the workshop, as well as the facility on Cerro Pachón, can be viewed and downloaded from the LSST Gallery.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    LSST telescope, currently under construction at Cerro Pachón Chile
    LSST telescope, currently under construction at Cerro Pachón Chile, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    LSST Interior
    LSST telescope, currently under construction at Cerro Pachón Chile.

    LSST/Camera, built at SLAC

    The LSST is a new kind of telescope. Currently under construction in Chile, it is being built to rapidly survey the night-time sky. Compact and nimble, the LSST will move quickly between images, yet its large mirror and large field of view—almost 10 square degrees of sky, or 40 times the size of the full moon—work together to deliver more light from faint astronomical objects than any optical telescope in the world.

    From its mountaintop site in the foothills of the Andes, the LSST will take more than 800 panoramic images each night with its 3.2 billion-pixel camera, recording the entire visible sky twice each week. Each patch of sky it images will be visited 1000 times during the survey. With a light-gathering power equal to a 6.7-m diameter primary mirror, each of its 30-second observations will be able to detect objects 10 million times fainter than visible with the human eye. A powerful data system will compare new with previous images to detect changes in brightness and position of objects as big as far-distant galaxy clusters and as small as near-by asteroids.
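
    For context, “10 million times fainter” translates into the astronomers’ magnitude scale as follows (taking the commonly quoted naked-eye limit of about magnitude 6 as a reference):

```python
import math

flux_ratio = 1e7                               # "10 million times fainter"
delta_mag = 2.5 * math.log10(flux_ratio)       # magnitude difference for that flux ratio
naked_eye_limit = 6.0                          # approximate naked-eye limiting magnitude

print(f"10 million times fainter = {delta_mag:.1f} magnitudes")                            # 17.5
print(f"Single-visit limit would be roughly magnitude {naked_eye_limit + delta_mag:.1f}")  # ~23.5
```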

    The LSST’s combination of telescope, mirror, camera, data processing, and survey will capture changes in billions of faint objects and the data it provides will be used to create an animated, three-dimensional cosmic map with unprecedented depth and detail, giving us an entirely new way to look at the Universe. This map will serve a myriad of purposes, from locating that mysterious substance called dark matter and characterizing the properties of the even more mysterious dark energy, to tracking transient objects, to studying our own Milky Way Galaxy in depth. It will even be used to detect and track potentially hazardous asteroids—asteroids that might impact the Earth and cause significant damage.

    As with past technological advances that opened new windows of discovery, such a powerful system for exploring the faint and transient Universe will undoubtedly serve up surprises.

    Plans for sharing the data from LSST with the public are as ambitious as the telescope itself. Anyone with a computer will be able to view the moving map of the Universe created by the LSST, including objects a hundred million times fainter than can be observed with the unaided eye. The LSST project will provide analysis tools to enable both students and the public to participate in the process of scientific discovery. We invite you to learn more about LSST science.

    The LSST will be unique: no existing telescope or proposed camera could be retrofitted or re-designed to cover ten square degrees of sky with a collecting area of forty square meters. Named the highest priority for ground-based astronomy in the 2010 Decadal Survey, the LSST project formally began construction in July 2014.

     
  • richardmitnick 12:47 pm on January 23, 2018 Permalink | Reply
    Tags: LSST-Large Synoptic Survey Telescope

    From Texas A&M: “A&M professors help develop new telescope” 

    Texas A&M logo

    Texas A&M

    Dec 4, 2017
    Elaine Soliman

    The new telescope will help change how astronomers study space.

    LSST


    LSST Camera, built at SLAC



    LSST telescope, currently under construction on the El Peñón peak at Cerro Pachón Chile, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    The LSST Facility will change how astronomers study the sky by providing a new method of examination.

    The ability to analyze and learn more about dark matter and dark energy is just around the corner thanks to the innovative Large Synoptic Survey Telescope, or LSST. It is being developed by an international team of thousands of people, including three professors at Texas A&M.

    The LSST is a groundbreaking telescope that will build a digital picture of the entire sky continuously, covering it over each three-night period. The project is funded by the National Science Foundation and the Department of Energy, according to Lucas Macri, institutional board representative of the LSST. The LSST is expected to become operational as soon as 2022. This project wasn’t feasible fifteen years ago, but the LSST will bring in all sorts of new data about the universe around us, according to Macri.

    “Imagine if our only knowledge of biology was one picture of a cell that you took once,” Macri said. “The nice thing about a microscope … is you can actually see a cell … that dynamic and that temporal coverage, of in this case, a cell, gives you a lot of information and it is the same thing with the sky. We’ve been able to study small patches of the sky repeatedly. Many pictures of them see things that change, discover new stars, exploding stars, asteroid, whatever. But we have never been able to do an unbiased complete survey of the sky.”

    The LSST is able to do this using charge-coupled devices that are sensitive to light. It also requires a large mirror to be cast, with a lot of glass melted into the right shape. The LSST has two mirror surfaces figured into a single piece of glass, which collect light with enough quality that eventually one can make these pictures of the sky, according to Macri.

    “The telescope was able to be designed to look at a large part of the sky,” Nicholas Suntzeff, university distinguished professor and head of the TAMU Astronomy Group, said. “The digital detector for the LSST is the size of [a] table. Imagine covering [a] table with silicon chips and cramming them all together. And so every image you take with this telescope is the size of [a] table and you’re taking images every twenty seconds all night long. So this is an unbelievable size of an image of a focal plane. And compare that with the camera that’s being built for the LSST and so that’s what [a] table is, it’s the size of the image. As a person who’s built instruments, that just blows my mind that we’re able to do something like that.”

    This flood of new information about the entire universe can be utilized to further understand dark matter and dark energy. With the development of the LSST, astronomers can learn more about dark matter and energy than was ever possible before. They will also be able to better understand transient objects, according to Suntzeff.

    “This telescope will be the first big telescope to devote itself to searching for what we call in astronomy the transient sky,” Suntzeff said. “Stars that vary get brighter and fainter. Stars that explode. Galaxies that get brighter and fainter. Black holes that rip apart stars. Gamma Ray explosions at the edge of the universe. And we’ll discover things that we can’t even imagine right now. That’s one of the beauties of astronomy.”

    Every time a telescope is built, that opens up a new way of looking at the universe, Suntzeff said.

    “We anticipate cool things to discover that ultimately what was really exciting was to discover things that we had no idea existed,” Suntzeff said. “So, in this case we’re opening up the transient sky and we will find things beyond our imaginations.”

    The LSST will also be able to help predict if an asteroid is projected to hit the earth, according to Macri. Macri said if an asteroid the size of Kyle Field hit the earth, the impact wouldn’t be the problem, but the amount of dust would eventually black out the whole earth.

    The LSST is currently being developed as a worldwide project. The LSST headquarters are in Tucson, Arizona. Scientists at SLAC National Accelerator Laboratory, operated by Stanford University, are developing the camera, which will be the largest digital camera ever assembled. The telescope itself is being built in Chile.

    Suntzeff, who picked the mountain in Chile on which to build the telescope, was actually one of the first people involved with the project approximately twenty years ago. According to Suntzeff, the LSST has brought together the astronomy and statistics departments.

    “It’s unbelievable how much data is going to come from this telescope,” Suntzeff said. “And in order to sift through the data we can’t just be normal astronomers. We have to use advanced mathematical and statistical techniques. So we’ve begun a program in collaboration with the statistics department in studying something that’s called astrostatistics. And astrostatistics will allow us to have tools to allow us to search very large databases for objects of interest.”
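
    As one concrete flavor of “searching very large databases for objects of interest,” the sketch below runs an off-the-shelf anomaly detector over a table of per-object features; the features and data are invented for illustration and are not an LSST or TAMU pipeline:

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Invented per-object features: e.g. mean brightness, variability amplitude, color.
catalog = rng.normal(size=(100_000, 3))
catalog[:50] += 6.0          # plant a handful of artificial outliers

detector = IsolationForest(contamination=0.001, random_state=0)
labels = detector.fit_predict(catalog)          # -1 marks anomalous objects

interesting = np.flatnonzero(labels == -1)
print(f"{interesting.size} objects flagged for human follow-up")
```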

    Currently, these TAMU professors are preparing their graduate students for what is to come in the next few years with the completion of the LSST.

    “Well, I am preparing some software; I was thinking about getting students to work on LSST-related problems, in particular to identify objects that may be interesting to us,” said Lifan Wang, professor in physics and astronomy at TAMU and member of the LSST dark energy science collaboration.

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition
    Located in College Station, Texas, about 90 miles northwest of Houston and within a two to three-hour drive from Austin and Dallas.
    Home to more than 50,000 students, ranking as the sixth-largest university in the country, with more than 370,000 former students worldwide.
    Holds membership in the prestigious Association of American Universities, one of only 62 institutions with this distinction.
    More than $820 million in research expenditures generated by faculty-researchers
    Has an endowment valued at more than $5 billion, which ranks fourth among U.S. public universities and 10th overall.

     
  • richardmitnick 12:20 pm on December 22, 2017 Permalink | Reply
    Tags: LSST-Large Synoptic Survey Telescope

    From Astronomy: “The LSST and big data science” 

    Astronomy magazine

    Astronomy Magazine

    December 15, 2017
    Steve Murray

    A new kind of telescope will need a new kind of astronomer.

    Construction of the Large Synoptic Survey Telescope (LSST) in Chile is about halfway between first brick and first light. Its 3-ton camera, built with National Science Foundation support, will be the largest digital instrument ever built for ground-based astronomy and will take pictures fast enough to capture the entire southern sky every three nights.

    LSST


    LSST Camera, built at SLAC



    LSST telescope, currently under construction on the El Peñón peak at Cerro Pachón Chile, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    According to a TED talk by Andy Connolly, Professor of Astronomy at the University of Washington and Team Lead for LSST Simulations, the Hubble Space Telescope would need 120 years to image an equivalent area of sky.

    Imaging at this rate will generate about 15 terabytes (15 trillion bytes) of raw data per night and 30 petabytes over its 10-year survey life. (A petabyte is approximately the amount of data in 200,000 movie-length DVDs.) Even after processing, that’s still a 15 PB (15,000 TB) store.

    Such huge datasets will give astronomers a ten-year time-lapse “movie” of the southern sky, yielding new subject matter for time-domain studies and a deeper understanding of the dynamic behavior of the universe. It will also change the way science is done – astronomer-and-telescope is giving way to astronomer-and-data as an engine of new knowledge.

    Preparing the information

    The LSST’s biggest strength may be its ability to capture transients – rare or changing events usually missed in narrow-field searches and static images. The good news is that software will alert astronomers almost immediately when a transient is detected to enable fast follow-up observations by other instruments. The not-so-good news is that up to 10 million such events are possible each night. With detection rates like these, good data handling is essential.

    An innovative method developed by the LSST Data Management team will allow the storage of large volumes of data for rapid access. LSST Project/NSF/AURA.

    The LSST Data Management Team is designing user tools that can operate on a variety of computing systems without the need for large downloads, all based on open-source software. Their system includes two basic types of products: those produced for nightly observing and those produced for annual science releases.

    Nightly processing will subtract two exposures of each image field to quickly highlight changes. The data stream from the camera will be pipeline processed and continuously updated in real time, with a transient alert triggered within 60 seconds of completing an image readout.
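
    In its simplest form that subtraction is a pixel-wise difference of two registered exposures followed by a threshold; production pipelines also match the image quality (point-spread function) between exposures, which this toy sketch omits:

```python
import numpy as np

def difference_candidates(new_exposure: np.ndarray, reference: np.ndarray, n_sigma: float = 5.0):
    """Subtract a reference image from a new exposure and flag pixels that changed significantly."""
    diff = new_exposure - reference                        # assumes the two images are already aligned
    noise = np.std(diff)                                   # crude noise estimate, for illustration only
    flagged = np.argwhere(np.abs(diff) > n_sigma * noise)  # pixel positions of candidate transients
    return diff, flagged

# Toy example with synthetic 100x100 images and one injected "new source".
rng = np.random.default_rng(0)
ref = rng.normal(100.0, 5.0, size=(100, 100))
new = ref + rng.normal(0.0, 5.0, size=(100, 100))
new[40:43, 60:63] += 200.0                                 # a transient appears
_, candidates = difference_candidates(new, ref)
print(f"{len(candidates)} pixels flagged")
```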

    Data compiled into scheduled science releases will get considerable reprocessing to ensure that all contents are consistent, that false detections are filtered and that faint signal sources are confirmed. Reprocessing will also classify objects using both standard categories (position, movement, brightness, etc.) and dimensions derived mathematically from the data themselves. Products will be reprocessed at time intervals from nightly to annually, which means that their quality will improve as additional observations are accumulated.

    Preparing the science

    The LSST program includes Science Collaborations, teams of scientists and technical experts that work to grow the observatory’s science agendas. There are currently eight collaborations in such areas as galaxies, dark energy and active galactic nuclei. One of the most distinctive, however, is the Informatics and Statistics Science Collaboration (ISSC), which, unlike the other teams, doesn’t focus on a specific astronomy topic but cuts across them all. New methods will be needed to handle heavy computational loads, to optimize data representations, and to guide astronomers through the discovery process. The ISSC focus is on such new approaches to ensure that astronomers realize the best return from the anticipated flood of new data.

    “Data analysis is changing because of the volume of data we’re facing,” says Kirk Borne, an astrophysicist and data scientist with Booz Allen Hamilton, and a core member of the ISSC. “Traditional data analysis is more about fitting a physical model to observed data. When I was growing up, we didn’t have sample sizes like this. We were trying to understand a particular phenomenon with our small sample sets. Now, it’s more unsupervised. Instead of asking ‘tell me about my model,’ you ask ‘tell me what you know.’ Data become the model, which means that more is different.”

    LSST data will almost certainly expand the chances for surprise. “When we start adding different measurement domains like gravitational wave physics and neutrino astrophysics for exploration,” adds Borne, “we start seeing these interesting new associations. Ultraluminous infrared galaxies are connected with colliding starbursting galaxies, for example, but it was a discovery made by combining optical radiation with infrared. Quasars were discovered when people compared bright radio observations of galaxies with optical images of galaxies.”

    A depiction of the observatory interior. LSST Project/NSF/AURA.

    Preparing the people

    The LSST Data Management Team is starting to orient the astronomy community to what’s coming with a series of conferences and workshops. “We try to cover as many meetings as we can, giving talks and hosting hack sessions,” says William O’Mullane, the team Project Manager.

    Science notebooks, which allow users to collaborate, analyze data and publish their results online, will be an integral tool for LSST research communities and one that’s being introduced early. “We rolled out Jupyterlab [an upgraded type of science notebook] at a recent workshop,” he adds, “which is a much faster way to get people working with the stack [the image manipulation code set].”

    The next generation of big data astronomers is also being groomed through graduate curricula and a special fellowship program. “Getting students involved early is a very good thing, both for the field and for them,” says Mario Juric, Associate Professor of Astronomy at the University of Washington, and the LSST Data Management System Science Team Coordinator. “Students need to understand early on what it’s like to do large-scale experiments, to design equipment and software, and to collaborate with very large teams. Astronomy today is entering the age of big data just like particle physics did 20 or 30 years ago.

    “We also have a Data Science Fellowship Program,” adds Juric, “a cooperative effort a few of us initiated in 2015 to educate the next generation of astronomer data scientists through a two-year series of workshops.” The program is funded by the LSST Corporation, a non-profit organization dedicated to enabling science with the telescope, and student interest has been intense. Only about a dozen people were admitted from among 200 applicants in a recent selection cycle.

    Telescope data are being packaged for a wide audience, too. The LSST Education and Public Outreach (EPO) program is working to involve classrooms, citizen scientists and the general public as deeply in big data astronomy as they want (or dare) to go. Primary EPO goals are to help educators integrate real LSST data into classrooms and introductory astronomy courses, and to help non-specialists access LSST data in ways similar to those of professional astronomers. Working through platforms like Zooniverse, almost anyone will be able to conduct serious research projects. “Citizen volunteers should be thought of as members of the science collaboration,” says Amanda Bauer, Head of LSST EPO.

    The future IS the data

    The LSST will cement an age where software is as critical to astronomy as the telescope. “When I was in graduate school,” says Juric, “I worked on the Sloan Digital Sky Survey (SDSS) and I didn’t touch a telescope; I did all my research out of a database.

    SDSS Telescope at Apache Point Observatory, NM, USA, Altitude 2,788 meters (9,147 ft)

    I know many students who have done the same. So we’re already seeing that kind of migration.”

    O’Mullane would agree. “Large surveys like SDSS, Gaia and now LSST provide enough data for a different approach,” he says. “Astronomers are not always reaching for a telescope. In fact, missions like LSST basically only offer you the archive; you can’t even request the observatory to make a specific observation.”

    ESA/GAIA satellite

    Observatory construction on the El Peñon summit, Chile as of November 2017. LSST Project/NSF/AURA.

    Given the enormous information streams that LSST will deliver, it soon won’t be possible for scientists to directly look at even a representative fraction of available data. Instead, they’ll increasingly rely on skillful manipulation of algorithms to examine relationships within the entirety of a dataset. The best insights will be obtained by those who ask the best questions of all those numbers.

    And, because more people will have ready access to those data, the biggest discoveries may come not only from the professionals, but from dedicated amateurs working at home on their laptops.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 8:04 am on November 27, 2017 Permalink | Reply
    Tags: LSST-Large Synoptic Survey Telescope, Simulating the universe using Einstein’s theory of gravity may solve cosmic puzzles

    From ScienceNews: “Simulating the universe using Einstein’s theory of gravity may solve cosmic puzzles” 

    ScienceNews bloc

    ScienceNews

    November 25, 2017
    Emily Conover

    Until recently, simulations of the universe haven’t given its lumps their due.

    UNEVEN TERRAIN Universe simulations that consider general relativity (one shown) may shift knowledge of the cosmos. James Mertens

    If the universe were a soup, it would be more of a chunky minestrone than a silky-smooth tomato bisque.

    Sprinkled with matter that clumps together due to the insatiable pull of gravity, the universe is a network of dense galaxy clusters and filaments — the hearty beans and vegetables of the cosmic stew. Meanwhile, relatively desolate pockets of the cosmos, known as voids, make up a thin, watery broth in between.

    Until recently, simulations of the cosmos’s history haven’t given the lumps their due. The physics of those lumps is described by general relativity, Albert Einstein’s theory of gravity. But that theory’s equations are devilishly complicated to solve. To simulate how the universe’s clumps grow and change, scientists have fallen back on approximations, such as the simpler but less accurate theory of gravity devised by Isaac Newton.

    Relying on such approximations, some physicists suggest, could be mucking with measurements, resulting in a not-quite-right inventory of the cosmos’s contents. A rogue band of physicists suggests that a proper accounting of the universe’s clumps could explain one of the deepest mysteries in physics: Why is the universe expanding at an increasingly rapid rate?

    The accepted explanation for that accelerating expansion is an invisible pressure called dark energy. In the standard theory of the universe, dark energy makes up about 70 percent of the universe’s “stuff” — its matter and energy. Yet scientists still aren’t sure what dark energy is, and finding its source is one of the most vexing problems of cosmology.

    Perhaps, the dark energy doubters suggest, the speeding up of the expansion has nothing to do with dark energy. Instead, the universe’s clumpiness may be mimicking the presence of such an ethereal phenomenon.

    Most physicists, however, feel that proper accounting for the clumps won’t have such a drastic impact. Robert Wald of the University of Chicago, an expert in general relativity, says that lumpiness is “never going to contribute anything that looks like dark energy.” So far, observations of the universe have been remarkably consistent with predictions based on simulations that rely on approximations.

    _____________________________________________________________________________

    Growing a lumpy universe

    The universe has gradually grown lumpier throughout its history. During inflation, rapid expansion magnified tiny quantum fluctuations into minute density variations. Over time, additional matter glommed on to dense spots due to the stronger gravitational pull from the extra mass. After 380,000 years, those blips were imprinted as hot and cold spots in the cosmic microwave background, the oldest light in the universe. Lumps continued growing for billions of years, forming stars, planets, galaxies and galaxy clusters.


    _____________________________________________________________________________
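
    As a rough mathematical shorthand for the growth described in the box above: in the linear regime of a matter-dominated universe, a small fractional overdensity grows in proportion to the cosmic scale factor (a standard result, stated here without derivation):

```latex
% Linear growth of small density perturbations in a matter-dominated universe
\delta \equiv \frac{\delta\rho}{\rho}, \qquad \delta(t) \propto a(t) \propto t^{2/3}
```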

    As observations become more detailed, though, even slight inaccuracies in simulations could become troublesome. Already, astronomers are charting wide swaths of the sky in great detail, and planning more extensive surveys. To translate telescope images of starry skies into estimates of properties such as the amount of matter in the universe, scientists need accurate simulations of the cosmos’s history. If the detailed physics of clumps is important, then simulations could go slightly astray, sending estimates off-kilter. Some scientists already suggest that the lumpiness is behind a puzzling mismatch of two estimates of how fast the universe is expanding.

    Researchers are attempting to clear up the debate by conquering the complexities of general relativity and simulating the cosmos in its full, lumpy glory. “That is really the new frontier,” says cosmologist Sabino Matarrese of the University of Padua in Italy, “something that until a few years ago was considered to be science fiction.” In the past, he says, scientists didn’t have the tools to complete such simulations. Now researchers are sorting out the implications of the first published results of the new simulations. So far, dark energy hasn’t been explained away, but some simulations suggest that certain especially sensitive measurements of how light is bent by matter in the universe might be off by as much as 10 percent.

    Soon, simulations may finally answer the question: How much do lumps matter? The idea that cosmologists might have been missing a simple answer to a central problem of cosmology incessantly nags some skeptics. For them, results of the improved simulations can’t come soon enough. “It haunts me. I can’t let it go,” says cosmologist Rocky Kolb of the University of Chicago.

    Smooth universe

    By observing light from different eras in the history of the cosmos, cosmologists can compute the properties of the universe, such as its age and expansion rate. But to do this, researchers need a model, or framework, that describes the universe’s contents and how those ingredients evolve over time. Using this framework, cosmologists can perform computer simulations of the universe to make predictions that can be compared with actual observations.

    COSMIC WEB Clumps and filaments of matter thread through a simulated universe 2 billion light years across. This simulation incorporates some aspects of Einstein’s theory of general relativity, allowing for detailed results while avoiding the difficulties of the full-fledged theory.

    After Einstein introduced his theory in 1915, physicists set about figuring out how to use it to explain the universe. It wasn’t easy, thanks to general relativity’s unwieldy, difficult-to-solve suite of equations. Meanwhile, observations made in the 1920s indicated that the universe wasn’t static as previously expected; it was expanding. Eventually, researchers converged on a solution to Einstein’s equations known as the Friedmann-Lemaître-Robertson-Walker metric. Named after its discoverers, the FLRW metric describes a simplified universe that is homogeneous and isotropic, meaning that it appears identical at every point in the universe and in every direction. In this idealized cosmos, matter would be evenly distributed, no clumps. Such a smooth universe would expand or contract over time.
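
    For reference, the smooth-universe description boils down to two standard equations: the FLRW line element and the Friedmann equation that sets the expansion rate (symbols: a(t) is the scale factor, k the spatial curvature, ρ the average density, Λ the cosmological constant):

```latex
% FLRW line element for a homogeneous, isotropic universe
ds^2 = -c^2\,dt^2 + a(t)^2\left[\frac{dr^2}{1 - k r^2}
     + r^2\left(d\theta^2 + \sin^2\theta\, d\phi^2\right)\right]

% Friedmann equation: expansion rate set by density, curvature and the cosmological constant
\left(\frac{\dot a}{a}\right)^2 = \frac{8\pi G}{3}\rho - \frac{k c^2}{a^2} + \frac{\Lambda c^2}{3}
```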

    A smooth-universe approximation is sensible, because when we look at the big picture, averaging over the structures of galaxy clusters and voids, the universe is remarkably uniform. It’s similar to the way that a single spoonful of minestrone soup might be mostly broth or mostly beans, but from bowl to bowl, the overall bean-to-broth ratios match.

    In 1998, cosmologists revealed that not only was the universe expanding, but its expansion was also accelerating (SN: 2/2/08, p. 74). Observations of distant exploding stars, or supernovas, indicated that the space between us and them was expanding at an increasing clip. But gravity should slow the expansion of a universe evenly filled with matter. To account for the observed acceleration, scientists needed another ingredient, one that would speed up the expansion. So they added dark energy to their smooth-universe framework.
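
    The “extra ingredient” enters through the acceleration equation: ordinary matter can only decelerate the expansion, so a positive cosmological constant (or some component with sufficiently negative pressure) is needed to make the expansion speed up:

```latex
% Second Friedmann (acceleration) equation
\frac{\ddot a}{a} = -\frac{4\pi G}{3}\left(\rho + \frac{3p}{c^2}\right) + \frac{\Lambda c^2}{3}

% Ordinary matter and radiation (p >= 0) make the right-hand side negative, so an
% accelerating expansion requires \Lambda > 0 or a component with p < -\rho c^2 / 3.
```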

    Now, many cosmologists follow a basic recipe to simulate the universe — treating the cosmos as if it has been run through an imaginary blender to smooth out its lumps, adding dark energy and calculating the expansion via general relativity. On top of the expanding slurry, scientists add clumps and track their growth using approximations, such as Newtonian gravity, which simplifies the calculations.

    In most situations, Newtonian gravity and general relativity are near-twins. Throw a ball while standing on the surface of the Earth, and it doesn’t matter whether you use general relativity or Newtonian mechanics to calculate where the ball will land — you’ll get the same answer. But there are subtle differences. In Newtonian gravity, matter directly attracts other matter. In general relativity, gravity is the result of matter and energy warping spacetime, creating curves that alter the motion of objects (SN: 10/17/15, p. 16). The two theories diverge in extreme gravitational environments. In general relativity, for example, hulking black holes produce inescapable pits that reel in light and matter (SN: 5/31/14, p. 16). The question, then, is whether the difference between the two theories has any impact in lumpy-universe simulations.
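
    The one-line versions of the two theories make the contrast concrete: Newtonian gravity ties the potential directly to the matter density, while general relativity ties spacetime curvature to the full energy-momentum content:

```latex
% Newtonian gravity: Poisson equation for the potential \Phi
\nabla^2 \Phi = 4\pi G \rho

% General relativity: Einstein field equations (with cosmological constant)
G_{\mu\nu} + \Lambda g_{\mu\nu} = \frac{8\pi G}{c^4} T_{\mu\nu}
```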

    Most cosmologists are comfortable with the status quo simulations because observations of the heavens seem to fit neatly together like interlocking jigsaw puzzle pieces. Predictions based on the standard framework agree remarkably well with observations of the cosmic microwave background — ancient light released when the universe was just 380,000 years old (SN: 3/21/15, p. 7). And measurements of cosmological parameters — the fraction of dark energy and matter, for example — are generally consistent, whether they are made using the light from galaxies or the cosmic microwave background [CMB].

    CMB per ESA/Planck


    ESA/Planck

    An image from the Two-Micron All Sky Survey of 1.6 million galaxies in infrared light reveals how matter clumps into galaxy clusters and filaments. Future large-scale surveys may require improved simulations that use general relativity to track the evolution of lumps over time. T.H. Jarrett, J. Carpenter & R. Hurt, obtained as part of 2MASS, a joint project of Univ. of Massachusetts and the Infrared Processing and Analysis Center/Caltech, funded by NASA and NSF.


    Caltech 2MASS Telescopes, a joint project of the University of Massachusetts and the Infrared Processing and Analysis Center (IPAC) at Caltech, at the Whipple Observatory on Mt. Hopkins south of Tucson, AZ, and at the Cerro Tololo Inter-American Observatory near La Serena, Chile.

    Dethroning dark energy

    Some cosmologists hope to explain the universe’s accelerating expansion by fully accounting for the universe’s lumpiness, with no need for the mysterious dark energy.

    These researchers argue that clumps of matter can alter how the universe expands, when the clumps’ influence is tallied up over wide swaths of the cosmos. That’s because, in general relativity, the expansion of each local region of space depends on how much matter is within. Voids expand faster than average; dense regions expand more slowly. Because the universe is mostly made up of voids, this effect could produce an overall expansion and potentially an acceleration. Known as backreaction, this idea has lingered in obscure corners of physics departments for decades, despite many claims that backreaction’s effect is small or nonexistent.
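
    One common way backreaction is made quantitative is Buchert’s spatial averaging of Einstein’s equations, in which the averaged expansion of a domain D obeys a Friedmann-like equation with an extra kinematical term; the form below is the one usually quoted in that literature and is included here only as a schematic reminder:

```latex
% Averaged expansion over a domain D: an effective Friedmann equation
3\left(\frac{\dot a_D}{a_D}\right)^2
  = 8\pi G \langle \rho \rangle_D - \tfrac{1}{2}\langle \mathcal{R} \rangle_D - \tfrac{1}{2} Q_D

% Kinematical backreaction: variance of the local expansion rate minus shear;
% it vanishes for a perfectly homogeneous universe
Q_D = \tfrac{2}{3}\left(\langle \theta^2 \rangle_D - \langle \theta \rangle_D^2\right)
      - 2\langle \sigma^2 \rangle_D
```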

    Backreaction continues to appeal to some researchers because they don’t have to invent new laws of physics to explain the acceleration of the universe. “If there is an alternative which is based only upon traditional physics, why throw that away completely?” Matarrese asks.

    Most cosmologists, however, think explaining away dark energy just based on the universe’s lumps is unlikely. Previous calculations have indicated any effect would be too small to account for dark energy, and would produce an acceleration that changes in time in a way that disagrees with observations.

    “My personal view is that it’s a much smaller effect,” says astrophysicist Hayley Macpherson of Monash University in Melbourne, Australia. “That’s just basically a gut feeling.” Theories that include dark energy explain the universe extremely well, she points out. How could that be if the whole approach is flawed?

    New simulations by Macpherson and others that model how lumps evolve in general relativity may be able to gauge the importance of backreaction once and for all. “Up until now, it’s just been too hard,” says cosmologist Tom Giblin of Kenyon College in Gambier, Ohio.

    To perform the simulations, researchers needed to get their hands on supercomputers capable of grinding through the equations of general relativity as the simulated universe evolves over time. Because general relativity is so complex, such simulations are much more challenging than those that use approximations, such as Newtonian gravity. But, a seemingly distinct topic helped lay some of the groundwork: gravitational waves, or ripples in the fabric of spacetime.

    SPECKLED SPACETIME A lumpy universe, recently simulated using general relativity, shows clumps of matter (pink and yellow) that beget stars and galaxies. H. Macpherson, Paul Lasky, Daniel Price.

    The Advanced Laser Interferometer Gravitational-Wave Observatory, LIGO, searches for the tremors of cosmic dustups such as colliding black holes (SN: 10/28/17, p. 8).


    VIRGO Gravitational Wave interferometer, near Pisa, Italy

    Caltech/MIT Advanced aLigo Hanford, WA, USA installation


    Caltech/MIT Advanced aLigo detector installation Livingston, LA, USA

    Cornell SXS, the Simulating eXtreme Spacetimes (SXS) project

    Gravitational waves. Credit: MPI for Gravitational Physics/W.Benger-Zib

    ESA/eLISA the future of gravitational wave research

    Sky map showing how adding Virgo to LIGO helps reduce the size of the likely source region on the sky. Credit: Giuseppe Greco (Virgo Urbino group).

    In preparation for this search, physicists honed their general relativity skills on simulations of the spacetime storm kicked up by black holes, predicting what LIGO might see and building up the computational machinery to solve the equations of general relativity. Now, cosmologists have adapted those techniques and unleashed them on entire, lumpy universes.

    The first lumpy universe simulations to use full general relativity were unveiled in the June 2016 Physical Review Letters. Giblin and colleagues reported their results simultaneously with Eloisa Bentivegna of the University of Catania in Italy and Marco Bruni of the University of Portsmouth in England.

    So far, the simulations have not been able to account for the universe’s acceleration. “Nearly everybody is convinced [the effect] is too small to explain away the need for dark energy,” says cosmologist Martin Kunz of the University of Geneva. Kunz and colleagues reached the same conclusion in their lumpy-universe simulations, which have one foot in general relativity and one in Newtonian gravity. They reported their first results in Nature Physics in March 2016.

    Backreaction aficionados still aren’t dissuaded. “Before saying the effect is too small to be relevant, I would, frankly, wait a little bit more,” Matarrese says. And the new simulations have potential caveats. For example, some simulated universes behave like an old arcade game — if you walk to one edge of the universe, you cross back over to the other side, like Pac-Man exiting the right side of the screen and reappearing on the left. That geometry would suppress the effects of backreaction in the simulation, says Thomas Buchert of the University of Lyon in France. “This is a good beginning,” he says, but there is more work to do on the simulations. “We are in infancy.”

    Different assumptions in a simulation can lead to disparate results, Bentivegna says. As a result, she doesn’t think that her lumpy, general-relativistic simulations have fully closed the door on efforts to dethrone dark energy. For example, tricks of light might be making it seem like the universe’s expansion is accelerating, when in fact it isn’t.

    When astronomers observe far-away sources like supernovas, the light has to travel past all of the lumps of matter between the source and Earth. That journey could make it look like there’s an acceleration when none exists. “It’s an optical illusion,” Bentivegna says. She and colleagues see such an effect in a simulation reported in March in the Journal of Cosmology and Astroparticle Physics. But, she notes, this work simulated an unusual universe, in which matter sits on a grid — not a particularly realistic scenario.

    For most other simulations, the effect of optical illusions remains small. That leaves many cosmologists, including Giblin, even more skeptical of the possibility of explaining away dark energy: “I feel a little like a downer,” he admits.

    6
    Lumps (gray) within this simulated universe change the path light takes (yellow lines), potentially affecting observations. Matter bends space, slightly altering the light’s trajectory from that in a smooth universe. James Mertens.

    Surveying the skies

    Subtle effects of lumps could still be important. In Hans Christian Andersen’s The Princess and the Pea, the princess felt a tiny pea beneath an impossibly tall stack of mattresses. Likewise, cosmologists’ surveys are now so sensitive that even if the universe’s lumps have a small impact, estimates could be thrown out of whack.

    The Dark Energy Survey, for example, has charted 26 million galaxies using the Victor M. Blanco Telescope in Chile, measuring how the light from those galaxies is distorted by the intervening matter on the journey to Earth.

    Dark Energy Survey


    Dark Energy Camera [DECam], built at FNAL


    NOAO/CTIO Victor M Blanco 4m Telescope, which houses DECam, at Cerro Tololo, Chile, at an altitude of 7,200 feet

    In a set of papers posted online August 4 at arXiv.org, scientists with the Dark Energy Survey reported new measurements of the universe’s properties, including the amount of matter (both dark and normal) and how clumpy that matter is (SN: 9/2/17, p. 32). The results are consistent with those from the cosmic microwave background [CMB] — light emitted billions of years earlier.

    To make the comparison, cosmologists took the measurements from the cosmic microwave background, early in the universe, and used simulations to extrapolate to what galaxies should look like later in the universe’s history. It’s like taking a baby’s photograph, precisely computing the number and size of wrinkles that should emerge as the child ages and finding that your picture agrees with a snapshot taken decades later. The matching results so far confirm cosmologists’ standard picture of the universe — dark energy and all.
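    That "aging the baby photo" step leans on the linear growth factor, which scales the early universe's tiny fluctuations forward in time. The sketch below computes the standard Lambda-CDM growth factor with placeholder parameters; it illustrates the textbook calculation, not the survey teams' actual pipelines.

    # Minimal sketch: linear growth factor D(a) used to extrapolate early-universe
    # fluctuations to late times, assuming flat Lambda-CDM with placeholder values.
    import numpy as np
    from scipy.integrate import quad

    omega_m, omega_de = 0.3, 0.7   # illustrative, not the surveys' fitted values

    def E(a):
        return np.sqrt(omega_m / a**3 + omega_de)   # H(a) / H0

    def growth(a):
        integral, _ = quad(lambda x: 1.0 / (x * E(x))**3, 1e-6, a)
        return E(a) * integral

    D1 = growth(1.0)
    for a in (1 / 1101, 0.1, 0.5, 1.0):   # a ~ 1/1101 is when the CMB was emitted
        print(f"a = {a:8.5f}   D(a)/D(1) = {growth(a) / D1:.4f}")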

    “So far, it has not yet been important for the measurements that we’ve made to actually include general relativity in those simulations,” says Risa Wechsler, a cosmologist at Stanford University and a founding member of the Dark Energy Survey. But, she says, for future measurements, “these effects could become more important.” Cosmologists are edging closer to Princess and the Pea territory.

    Those future surveys include the Dark Energy Spectroscopic Instrument, DESI, set to kick off in 2019 at Kitt Peak National Observatory near Tucson; the European Space Agency’s Euclid satellite, launching in 2021; and the Large Synoptic Survey Telescope in Chile, which is set to begin collecting data in 2023.

    LBNL/DESI spectroscopic instrument on the Mayall 4-meter telescope at Kitt Peak National Observatory, Altitude 2,120 m (6,960 ft)


    NOAO/Mayall 4 m telescope at Kitt Peak, Arizona, USA, Altitude 2,120 m (6,960 ft)

    ESA/Euclid spacecraft

    LSST


    LSST Camera, built at SLAC



    LSST telescope, currently under construction at Cerro Pachón, Chile, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    If cosmologists keep relying on simulations that don’t use general relativity to account for lumps, certain kinds of measurements of weak lensing — the bending of light due to matter acting like a lens — could be off by up to 10 percent, Giblin and colleagues reported at arXiv.org in July. “There is something that we’ve been ignoring by making approximations,” he says.

    That 10 percent could screw up all kinds of estimates, from how dark energy changes over the universe’s history to how fast the universe is currently expanding, to the calculations of the masses of ethereal particles known as neutrinos. “You have to be extremely certain that you don’t get some subtle effect that gets you the wrong answers,” Geneva’s Kunz says, “otherwise the particle physicists are going to be very angry with the cosmologists.”

    Some estimates may already be showing problem signs, such as the conflicting estimates of the cosmic expansion rate (SN: 8/6/16, p. 10). Using the cosmic microwave background, cosmologists find a slower expansion rate than they do from measurements of supernovas. If this discrepancy is real, it could indicate that dark energy changes over time. But before jumping to that conclusion, there are other possible causes to rule out, including the universe’s lumps.

    Until the issue of lumps is smoothed out, scientists won’t know how much lumpiness matters to the cosmos at large. “I think it’s rather likely that it will turn out to be an important effect,” Kolb says. Whether it explains away dark energy is less certain. “I want to know the answer so I can get on with my life.”

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 9:25 am on November 22, 2017 Permalink | Reply
    Tags: , , , , LSST-Large Synoptic Survey Telescope, Preparing to Light Up the LSST Network   

    From LSST: “Preparing to Light Up the LSST Network” 

    LSST

    Large Synoptic Survey Telescope

    November 16, 2017
    No writer credit found

    November 12, 2017 – LSST’s fiber-optic network, which will provide the necessary 100 Gbps connectivity to move data from the summit of Cerro Pachón to all LSST operational sites and to multiple data centers, came one milestone closer to activation last week: the AURA LSST Dense Wavelength Division Multiplexing (DWDM) network equipment that LSST will use initially was installed in several key locations. DWDM equipment sends pulses of light down the fiber to transmit data, so a DWDM box is needed at each end of a fiber link for the network to be operational. In this installation project, the Summit-Base Network DWDM equipment was set up in the La Serena computer room and in the communications hut on the summit of Cerro Pachón. The Santiago portion of the Base-Archive Network was also addressed, with DWDM hardware installed in La Serena as well as at the National University Network (REUNA) facility in Santiago. The DWDM hardware in Santiago will be connected to AmLight DWDM equipment, which will transfer the data to Florida. There, it will be picked up by Florida LambdaRail (FLR), ESnet, and Internet2 for its journey to NCSA via Chicago.

    The primary South to North network traffic will be the transfer of raw image data from Cerro Pachón to the National Center for Supercomputing Applications (NCSA), where the data will be processed into scientific data products, including transient alerts, calibrated images, and catalogs. From there, a backup of the raw data will be made over the international network to IN2P3 in Lyon, France. IN2P3 will also perform half of the annual catalog processing. The network will also transfer data from North to South, returning the processed scientific data products to the Chilean Data Access Center (DAC), where they will be made available to the Chilean scientific community.

    The LSST Summit-Base and Base-Archive networks are on new fibers all the way to Santiago; there is also an existing fiber that provides a backup path from La Serena to Santiago. From Santiago to Florida, the data will travel on a new submarine fiber cable, with a backup on existing fiber cables. LSST currently shares the AURA fiber-optic network (connecting La Serena and the Summit) with the Gemini and CTIO telescopes, but will have its own dedicated DWDM equipment in 2018. Additional information on LSST data flow during LSST Operations is available here.
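    For a sense of scale, a back-of-the-envelope calculation shows why 100 Gbps links matter. The nightly raw-data volume used below is an assumption borrowed from the roughly 30-terabytes-per-night figure quoted for LSST elsewhere on this page, not an official network specification.

    # Back-of-the-envelope: moving one night of LSST raw data over a 100 Gbps link.
    # The ~30 TB/night figure is an assumption taken from data-rate estimates quoted
    # elsewhere in this post; actual volumes and protocol overheads will differ.
    nightly_volume_tb = 30                    # terabytes of raw images per night (assumed)
    link_gbps = 100                           # summit-to-base / international link capacity

    volume_bits = nightly_volume_tb * 1e12 * 8          # TB -> bits
    ideal_seconds = volume_bits / (link_gbps * 1e9)     # assuming no protocol overhead
    print(f"Ideal transfer time: {ideal_seconds / 3600:.1f} hours "
          f"({ideal_seconds / 60:.0f} minutes) at full link rate")

    # With, say, 50% effective throughput (sharing, overhead, retransmission):
    print(f"At 50% efficiency: {2 * ideal_seconds / 3600:.1f} hours")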

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    LSST telescope, currently under construction at Cerro Pachón, Chile, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.
    LSST Interior

    LSST/Camera, built at SLAC

    The LSST is a new kind of telescope. Currently under construction in Chile, it is being built to rapidly survey the night-time sky. Compact and nimble, the LSST will move quickly between images, yet its large mirror and large field of view—almost 10 square degrees of sky, or 40 times the size of the full moon—work together to deliver more light from faint astronomical objects than any optical telescope in the world.

    From its mountaintop site in the foothills of the Andes, the LSST will take more than 800 panoramic images each night with its 3.2 billion-pixel camera, recording the entire visible sky twice each week. Each patch of sky it images will be visited 1000 times during the survey. With a light-gathering power equal to a 6.7-m diameter primary mirror, each of its 30-second observations will be able to detect objects 10 million times fainter than those visible to the human eye. A powerful data system will compare new with previous images to detect changes in brightness and position of objects as big as far-distant galaxy clusters and as small as nearby asteroids.
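    Those figures can be sanity-checked with simple arithmetic. The bytes-per-pixel value in the sketch below is an assumption for illustration; calibration frames, metadata and processed data products push the real nightly totals well above the raw-pixel count.

    # Rough cross-check of LSST's nightly raw-pixel volume from the figures above.
    # bytes_per_pixel is an illustrative assumption, not the camera's actual format.
    images_per_night = 800
    pixels_per_image = 3.2e9        # 3.2-gigapixel camera
    bytes_per_pixel = 2             # assumed 16-bit raw pixels

    raw_bytes = images_per_night * pixels_per_image * bytes_per_pixel
    print(f"~{raw_bytes / 1e12:.1f} TB of raw pixels per night, before overheads")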

    The LSST’s combination of telescope, mirror, camera, data processing, and survey will capture changes in billions of faint objects, and the data it provides will be used to create an animated, three-dimensional cosmic map with unprecedented depth and detail, giving us an entirely new way to look at the Universe. This map will serve a myriad of purposes, from locating that mysterious substance called dark matter and characterizing the properties of the even more mysterious dark energy, to tracking transient objects, to studying our own Milky Way Galaxy in depth. It will even be used to detect and track potentially hazardous asteroids—asteroids that might impact the Earth and cause significant damage.

    As with past technological advances that opened new windows of discovery, such a powerful system for exploring the faint and transient Universe will undoubtedly serve up surprises.

    Plans for sharing the data from LSST with the public are as ambitious as the telescope itself. Anyone with a computer will be able to view the moving map of the Universe created by the LSST, including objects a hundred million times fainter than can be observed with the unaided eye. The LSST project will provide analysis tools to enable both students and the public to participate in the process of scientific discovery. We invite you to learn more about LSST science.

    The LSST will be unique: no existing telescope or proposed camera could be retrofitted or re-designed to cover ten square degrees of sky with a collecting area of forty square meters. Named the highest priority for ground-based astronomy in the 2010 Decadal Survey, the LSST project formally began construction in July 2014.

     
  • richardmitnick 7:07 am on October 21, 2017 Permalink | Reply
    Tags: ADASS, , , , , , , LSST-Large Synoptic Survey Telescope, ,   

    From ALMA: “ALMA Organizes International Astroinformatics Conference in Chile” 

    ESO/NRAO/NAOJ ALMA Array in Chile in the Atacama at Chajnantor plateau, at 5,000 metres

    ALMA

    20 October, 2017

    Nicolás Lira
    Education and Public Outreach Coordinator
    Joint ALMA Observatory, Santiago – Chile
    Phone: +56 2 2467 6519
    Cell phone: +56 9 9445 7726
    nicolas.lira@alma.cl

    Andrea Riquelme P.
    Journalist
    ADASS – Chile
    Cell phone: +56 9 93 96 96 38
    acriquelme@gmail.com

    Related Posts
    Launch of ChiVO, the first Chilean Virtual Observatory

    1
    Experts from 33 countries will attend the global Astronomical Data Analysis Software & Systems (ADASS) conference, which brings together astronomy and computer science. Organized by the Atacama Large Millimeter/submillimeter Array (ALMA), the European Southern Observatory (ESO) and the Universidad Técnica Federico Santa María (UTFSM), ADASS will be held in Chile for the first time, from October 22 to 26, and will seek to advance astronomy and related industries while providing an opportunity to showcase local talent to the rest of the world.

    Chile is a privileged setting for astronomical observation and data collection, generating an enormous amount of public data. The ALMA observatory alone generates a terabyte of data per day; the LSST will reach 30 terabytes per night by 2022 and the SKA 360 terabytes per hour by 2030.

    LSST


    LSST Camera, built at SLAC



    LSST telescope, currently under construction at Cerro Pachón Chile, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    This evolution implies an unprecedented data storage and analysis challenge, and Chile is in a position to lead this progress, supported by data, communication and technology platforms, expert human capital, and the power of today’s cloud computing era. Herein lies the importance of Chile’s debut as Latin American host of the International Astronomical Data Analysis Software & Systems (ADASS) Conference, which, after 27 years, has chosen the country as its meeting location for the first time.

    2
    ADASS Invited speakers. Credit: ADASS 2017 website (www.adass.cl)

    “A modern observatory today is a true data factory, and the creation of systems and infrastructure capable of storing this data and analyzing and sharing it will contribute to the democratization of access to current, critical and unique information, necessary for the hundreds of groups of researchers of the Universe around the world,” says Jorge Ibsen, Head of the ALMA Computing Department and Co-Chair of ADASS.

    The Chilean Virtual Observatory (ChiVO) and The International Virtual Observatory Alliance (IVOA) have worked together for years to define standards for sharing data between observatories around the world and to create public access protocols. Mauricio Solar, Director of ChiVO and Co-Chair of the ADASS conference, says that Chile can contribute to astronomy, not just through astronomers, but also through the development of applications in astroinformatics that, for example, can help find evidence of extraterrestrial life.

    3
    Local Organizing Committee. Credit: ADASS 2017 website (http://www.adass.cl)

    Astroinformatics combines advanced computing, statistics applied to massive, complex data sets, and astronomy. Topics to be addressed at ADASS include high-performance computing (HPC) for astronomical data, human-computer interaction and interfaces for large data collections, challenges in the operation of large-scale, highly complex instrumentation, network infrastructure and data centers in the era of mass data transfer, machine learning applied to astronomical data, software for the operation of ground- and space-based observatories, diversity and inclusion, and citizen science and education, among other subjects.

    The ADASS Conference will bring together 350 experts from 33 countries at the Sheraton Hotel in Santiago, and will be followed by an Interoperability Meeting of the International Virtual Observatories Alliance (IVOA), organized by ChiVO, from October 27 to 29. More information at http://www.adass.cl.

    See the full article here .

    Please help promote STEM in your local schools.
    STEM Icon
    Stem Education Coalition

    The Atacama Large Millimeter/submillimeter Array (ALMA), an international astronomy facility, is a partnership of Europe, North America and East Asia in cooperation with the Republic of Chile. ALMA is funded in Europe by the European Organization for Astronomical Research in the Southern Hemisphere (ESO), in North America by the U.S. National Science Foundation (NSF) in cooperation with the National Research Council of Canada (NRC) and the National Science Council of Taiwan (NSC) and in East Asia by the National Institutes of Natural Sciences (NINS) of Japan in cooperation with the Academia Sinica (AS) in Taiwan.

    ALMA construction and operations are led on behalf of Europe by ESO, on behalf of North America by the National Radio Astronomy Observatory (NRAO), which is managed by Associated Universities, Inc. (AUI) and on behalf of East Asia by the National Astronomical Observatory of Japan (NAOJ). The Joint ALMA Observatory (JAO) provides the unified leadership and management of the construction, commissioning and operation of ALMA.

    NRAO Small
    ESO 50 Large
    NAOJ

     
  • richardmitnick 1:24 pm on September 28, 2017 Permalink | Reply
    Tags: , “ExaSky” - “Computing the Sky at Extreme Scales” project or, Cartography of the cosmos, , , , LSST-Large Synoptic Survey Telescope, Salman Habib, , The computer can generate many universes with different parameters, There are hundreds of billions of stars in our own Milky Way galaxy   

    From ALCF: “Cartography of the cosmos” 

    Argonne Lab
    News from Argonne National Laboratory

    ALCF

    September 27, 2017
    John Spizzirri

    2
    Argonne’s Salman Habib leads the ExaSky project, which takes on the biggest questions, mysteries, and challenges currently confounding cosmologists.

    1
    No image caption or credit

    There are hundreds of billions of stars in our own Milky Way galaxy.

    Milky Way NASA/JPL-Caltech /ESO R. Hurt

    Estimates indicate a similar number of galaxies in the observable universe, each with its own large assemblage of stars, many with their own planetary systems. Beyond and between these stars and galaxies are all manner of matter in various phases, such as gas and dust. Another form of matter, dark matter, exists in a very different and mysterious form, announcing its presence indirectly only through its gravitational effects.

    This is the universe Salman Habib is trying to reconstruct, structure by structure, using precise observations from telescope surveys combined with next-generation data analysis and simulation techniques currently being primed for exascale computing.

    “We’re simulating all the processes in the structure and formation of the universe. It’s like solving a very large physics puzzle,” said Habib, a senior physicist and computational scientist with the High Energy Physics and Mathematics and Computer Science divisions of the U.S. Department of Energy’s (DOE) Argonne National Laboratory.

    Habib leads the “Computing the Sky at Extreme Scales” project or “ExaSky,” one of the first projects funded by the recently established Exascale Computing Project (ECP), a collaborative effort between DOE’s Office of Science and its National Nuclear Security Administration.

    From determining the initial cause of primordial fluctuations to measuring the sum of all neutrino masses, this project’s science objectives represent a laundry list of the biggest questions, mysteries, and challenges currently confounding cosmologists.

    One is the question of dark energy, the potential cause of the accelerated expansion of the universe; another is the nature and distribution of dark matter in the universe.

    Dark Energy Survey


    Dark Energy Camera [DECam], built at FNAL


    NOAO/CTIO Victor M Blanco 4m Telescope, which houses DECam, at Cerro Tololo, Chile, at an altitude of 7,200 feet

    Dark Matter Research

    Universe map Sloan Digital Sky Survey (SDSS) 2dF Galaxy Redshift Survey

    Scientists studying the cosmic microwave background hope to learn about more than just how the universe grew—it could also offer insight into dark matter, dark energy and the mass of the neutrino.

    Dark matter cosmic web and the large-scale structure it forms. The Millennium Simulation, V. Springel et al

    Dark Matter Particle Explorer China

    DEAP Dark Matter detector, The DEAP-3600, suspended in the SNOLAB deep in Sudbury’s Creighton Mine

    LUX Dark matter Experiment at SURF, Lead, SD, USA

    ADMX Axion Dark Matter Experiment, U Washington

    These are immense questions that demand equally expansive computational power to answer. The ECP is readying science codes for exascale systems, the new workhorses of computational and big data science.

    Initiated to drive the development of an “exascale ecosystem” of cutting-edge, high-performance architectures, codes and frameworks, the ECP will allow researchers to tackle data and computationally intensive challenges such as the ExaSky simulations of the known universe.

    In addition to the magnitude of their computational demands, ECP projects are selected based on whether they meet specific strategic areas, ranging from energy and economic security to scientific discovery and healthcare.

    “Salman’s research certainly looks at important and fundamental scientific questions, but it has societal benefits, too,” said Paul Messina, Argonne Distinguished Fellow. “Human beings tend to wonder where they came from, and that curiosity is very deep.”

    HACC’ing the night sky

    For Habib, the ECP presents a two-fold challenge — how do you conduct cutting-edge science on cutting-edge machines?

    The cross-divisional Argonne team has been working on the science through a multi-year effort at the Argonne Leadership Computing Facility (ALCF), a DOE Office of Science User Facility. The team is running cosmological simulations for large-scale sky surveys on the facility’s 10-petaflop high-performance computer, Mira. The simulations are designed to work with observational data collected from specialized survey telescopes, like the forthcoming Dark Energy Spectroscopic Instrument (DESI) and the Large Synoptic Survey Telescope (LSST).

    LBNL/DESI Dark Energy Spectroscopic Instrument for the Nicholas U. Mayall 4-meter telescope at Kitt Peak National Observatory near Tucson, Ariz, USA

    LSST


    LSST Camera, built at SLAC



    LSST telescope, currently under construction at Cerro Pachón, Chile, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    Survey telescopes look at much larger areas of the sky — up to half the sky, at any point — than does the Hubble Space Telescope, for instance, which focuses more on individual objects.

    NASA/ESA Hubble Telescope

    One night concentrating on one patch, the next night another, survey instruments systematically examine the sky to develop a cartographic record of the cosmos, as Habib describes it.

    Working in partnership with Los Alamos and Lawrence Berkeley National Laboratories, the Argonne team is readying itself to chart the rest of the course.

    Their primary code, which Habib helped develop, is already among the fastest science production codes in use. Called HACC (Hardware/Hybrid Accelerated Cosmology Code), this particle-based cosmology framework supports a variety of programming models and algorithms.

    Unique among codes used in other exascale computing projects, it can run on all current and prototype architectures, from the basic X86 chip used in most home PCs, to graphics processing units, to the newest Knights Landing chip found in Theta, the ALCF’s latest supercomputing system.
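    HACC’s real solvers use tree, particle-mesh and architecture-specific methods, but the core idea behind a particle-based gravity code can be shown with a toy example. The sketch below is purely pedagogical — a direct-summation leapfrog step on a few hundred particles — and is not HACC’s algorithm.

    # Toy direct-summation N-body step, illustrating the idea behind particle-based
    # cosmology codes. HACC's production solvers are far more sophisticated.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 256                                    # real runs use billions to trillions of particles
    pos = rng.uniform(0.0, 1.0, size=(n, 3))   # positions in a unit box
    vel = np.zeros((n, 3))
    mass = np.full(n, 1.0 / n)
    G, soft, dt = 1.0, 1e-2, 1e-3              # illustrative units, softening and time step

    def accelerations(p):
        """Direct O(N^2) gravitational acceleration with Plummer softening."""
        diff = p[None, :, :] - p[:, None, :]               # diff[i, j] = p[j] - p[i]
        inv_d3 = ((diff**2).sum(-1) + soft**2) ** -1.5
        np.fill_diagonal(inv_d3, 0.0)                      # no self-force
        return G * (diff * (mass[None, :, None] * inv_d3[:, :, None])).sum(axis=1)

    # One kick-drift-kick (leapfrog) step with periodic boundaries
    acc = accelerations(pos)
    vel += 0.5 * dt * acc
    pos = (pos + dt * vel) % 1.0
    vel += 0.5 * dt * accelerations(pos)
    print("mean particle speed after one step:", np.linalg.norm(vel, axis=1).mean())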

    As robust as the code is already, the HACC team continues to develop it further, adding significant new capabilities, such as hydrodynamics and associated subgrid models.

    “When you run very large simulations of the universe, you can’t possibly do everything, because it’s just too detailed,” Habib explained. “For example, if we’re running a simulation where we literally have tens to hundreds of billions of galaxies, we cannot follow each galaxy in full detail. So we come up with approximate approaches, referred to as subgrid models.”

    Even with these improvements and its successes, the HACC code still will need to increase its performance and memory to be able to work in an exascale framework. In addition to HACC, the ExaSky project employs the adaptive mesh refinement code Nyx, developed at Lawrence Berkeley. HACC and Nyx complement each other with different areas of specialization. The synergy between the two is an important element of the ExaSky team’s approach.

    Melding multiple simulation approaches allows the verification of difficult-to-resolve cosmological processes involving gravitational evolution, gas dynamics and astrophysical effects at very high dynamic ranges. New computational methods like machine learning will help scientists quickly and systematically recognize features in both the observational and simulation data that represent unique events.

    A trillion particles of light

    The work produced under the ECP will serve several purposes, benefitting both the future of cosmological modeling and the development of successful exascale platforms.

    On the modeling end, the computer can generate many universes with different parameters, allowing researchers to compare their models with observations to determine which models fit the data most accurately. Alternatively, the models can make predictions for observations yet to be made.
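    At its simplest, that comparison is a goodness-of-fit search over a grid of candidate parameters. The sketch below uses an invented observable, scaling relation and error bar purely to illustrate the workflow; none of the numbers come from the ExaSky project.

    # Minimal sketch of comparing simulated "universes" against an observation:
    # run a grid of models, compute chi^2 against the measured value, keep the best.
    # The observable, scaling and error bar below are invented for illustration.
    import numpy as np

    omega_m_grid = np.linspace(0.20, 0.40, 21)          # candidate matter fractions

    def simulated_clumpiness(omega_m):
        """Stand-in for an expensive simulation's predicted clumpiness statistic."""
        return 0.8 * (omega_m / 0.3)**0.5               # toy scaling, not a real model

    observed, sigma = 0.78, 0.02                        # invented measurement and error

    chi2 = ((simulated_clumpiness(omega_m_grid) - observed) / sigma)**2
    best = omega_m_grid[np.argmin(chi2)]
    print(f"best-fit Omega_m on this toy grid: {best:.2f}")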

    Models also can produce extremely realistic pictures of the sky, which is essential when planning large observational campaigns, such as those by DESI and LSST.

    “Before you spend the money to build a telescope, it’s important to also produce extremely good simulated data so that people can optimize observational campaigns to meet their data challenges,” said Habib.

    But realism comes at a steep cost. Simulations can range into the trillion-particle realm and produce several petabytes — quadrillions of bytes — of data in a single run. As exascale becomes prevalent, these simulations will produce 10 to 100 times as much data.

    The work that the ExaSky team is doing, along with that of the other ECP research teams, will help address these challenges and those faced by computer manufacturers and software developers as they create coherent, functional exascale platforms to meet the needs of large-scale science. By working with their own codes on pre-exascale machines, the ECP research team can help guide vendors in chip design, I/O bandwidth and memory requirements and other features.

    “All of these things can help the ECP community optimize their systems,” noted Habib. “That’s the fundamental reason why the ECP science teams were chosen. We will take the lessons we learn in dealing with this architecture back to the rest of the science community and say, ‘We have found a solution.’”

    The Exascale Computing Project is a collaborative effort of two DOE organizations — the Office of Science and the National Nuclear Security Administration. As part of President Obama’s National Strategic Computing Initiative, ECP was established to develop a capable exascale ecosystem, encompassing applications, system software, hardware technologies and architectures and workforce development to meet the scientific and national security mission needs of DOE in the mid-2020s timeframe.

    ANL ALCF Cetus IBM supercomputer

    ANL ALCF Theta Cray supercomputer

    ANL ALCF Cray Aurora supercomputer

    ANL ALCF MIRA IBM Blue Gene Q supercomputer at the Argonne Leadership Computing Facility

    See the full article here .

    Please help promote STEM in your local schools.
    STEM Icon
    Stem Education Coalition

    Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation’s first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America’s scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science. For more visit http://www.anl.gov.

    About ALCF

    The Argonne Leadership Computing Facility’s (ALCF) mission is to accelerate major scientific discoveries and engineering breakthroughs for humanity by designing and providing world-leading computing facilities in partnership with the computational science community.

    We help researchers solve some of the world’s largest and most complex problems with our unique combination of supercomputing resources and expertise.

    ALCF projects cover many scientific disciplines, ranging from chemistry and biology to physics and materials science. Examples include modeling and simulation efforts to:

    Discover new materials for batteries
    Predict the impacts of global climate change
    Unravel the origins of the universe
    Develop renewable energy technologies

    Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science

    Argonne Lab Campus

     
  • richardmitnick 4:21 pm on August 4, 2017 Permalink | Reply
    Tags: , , , , , , , LSST-Large Synoptic Survey Telescope,   

    From Quanta: “Scientists Unveil a New Inventory of the Universe’s Dark Contents” 

    Quanta Magazine
    Quanta Magazine

    August 3, 2017
    Natalie Wolchover

    In a much-anticipated analysis of its first year of data, the Dark Energy Survey (DES) telescope experiment has gauged the amount of dark energy and dark matter in the universe by measuring the clumpiness of galaxies — a rich and, so far, barely tapped source of information that many see as the future of cosmology.

    Dark Energy Survey


    Dark Energy Camera [DECam], built at FNAL


    NOAO/CTIO Victor M Blanco 4m Telescope, which houses DECam, at Cerro Tololo, Chile, at an altitude of 7,200 feet

    The analysis, posted on DES’s website today and based on observations of 26 million galaxies in a large swath of the southern sky, tweaks estimates only a little. It draws the pie chart of the universe as 74 percent dark energy and 21 percent dark matter, with galaxies and all other visible matter — everything currently known to physicists — filling the remaining 5 percent sliver.

    The results are based on data from the telescope’s first observing season, which began in August 2013 and lasted six months. Since then, three more rounds of data collection have passed; the experiment begins its fifth and final planned observing season this month. As the 400-person team analyzes more of this data in the coming years, they’ll begin to test theories about the nature of the two invisible substances that dominate the cosmos — particularly dark energy, “which is what we’re ultimately going after,” said Joshua Frieman, co-founder and director of DES and an astrophysicist at Fermi National Accelerator Laboratory (Fermilab) and the University of Chicago. Already, with their first-year data, the experimenters have incrementally improved the measurement of a key quantity that will reveal what dark energy is.

    Both terms — dark energy and dark matter — are mental place holders for unknown physics. “Dark energy” refers to whatever is causing the expansion of the universe to accelerate, as astronomers first discovered it to be doing in 1998. And great clouds of missing “dark matter” have been inferred from 80 years of observations of their apparent gravitational effect on visible matter (though whether dark matter consists of actual particles or something else, nobody knows).

    The balance of the two unknown substances sculpts the distribution of galaxies. “As the universe evolves, the gravity of dark matter is making it more clumpy, but dark energy makes it less clumpy because it’s pushing galaxies away from each other,” Frieman said. “So the present clumpiness of the universe is telling us about that cosmic tug-of-war between dark matter and dark energy.”

    2
    The Dark Energy Survey uses a 570-megapixel camera mounted on the Victor M. Blanco Telescope in Chile. The camera is made out of 74 individual light-gathering wafers.

    A Dark Map

    Until now, the best way to inventory the cosmos has been to look at the Cosmic Microwave Background [CMB]: pristine light from the infant universe that has long served as a wellspring of information for cosmologists, but which — after the Planck space telescope mapped it in breathtakingly high resolution in 2013 — has less and less to offer.

    CMB per ESA/Planck

    ESA/Planck

    Cosmic microwaves come from the farthest point that can be seen in every direction, providing a 2-D snapshot of the universe at a single moment in time, 380,000 years after the Big Bang (the cosmos was dark before that). Planck’s map of this light shows an extremely homogeneous young universe, with subtle density variations that grew into the galaxies and voids that fill the universe today.

    Galaxies, after undergoing billions of years of evolution, are more complex and harder to glean information from than the cosmic microwave background, but according to experts, they will ultimately offer a richer picture of the universe’s governing laws since they span the full three-dimensional volume of space. “There’s just a lot more information in a 3-D volume than on a 2-D surface,” said Scott Dodelson, co-chair of the DES science committee and an astrophysicist at Fermilab and the University of Chicago.

    To obtain that information, the DES team scrutinized a section of the universe spanning an area 1,300 square degrees wide in the sky — the total area of 6,500 full moons — and stretching back 8 billion years (the data were collected by the half-billion-pixel Dark Energy Camera mounted on the Victor M. Blanco Telescope in Chile). They statistically analyzed the separations between galaxies in this cosmic volume. They also examined the distortion in the galaxies’ apparent shapes — an effect known as “weak gravitational lensing” that indicates how much space-warping dark matter lies between the galaxies and Earth. These two probes — galaxy clustering and weak lensing — are two of the four approaches that DES will eventually use to inventory the cosmos. Already, the survey’s measurements are more precise than those of any previous galaxy survey, and for the first time, they rival Planck’s.
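    “Statistically analyzing the separations between galaxies” usually means estimating a two-point correlation function: counting galaxy pairs as a function of separation and comparing against a random catalog. The sketch below shows the widely used Landy-Szalay estimator on toy, unclustered points; it illustrates the generic technique, not DES’s actual pipeline.

    # Minimal sketch of a two-point correlation function via the Landy-Szalay
    # estimator, on toy 2-D positions. DES's real pipeline is far more involved
    # (weights, masks, photometric redshift bins).
    import numpy as np
    from scipy.spatial.distance import pdist

    rng = np.random.default_rng(1)
    data = rng.uniform(0, 1, size=(2000, 2))      # toy "galaxy" positions
    rand = rng.uniform(0, 1, size=(2000, 2))      # random catalog, same footprint
    bins = np.linspace(0.01, 0.2, 11)

    def pair_counts(a, b=None):
        """Histogram of pairwise separations (auto if b is None, cross otherwise)."""
        if b is None:
            d = pdist(a)
        else:
            d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1).ravel()
        return np.histogram(d, bins=bins)[0].astype(float)

    dd, rr, dr = pair_counts(data), pair_counts(rand), pair_counts(data, rand)
    nd, nr = len(data), len(rand)
    dd /= nd * (nd - 1) / 2
    rr /= nr * (nr - 1) / 2
    dr /= nd * nr
    xi = (dd - 2 * dr + rr) / rr                  # Landy-Szalay estimator
    print(np.round(xi, 3))                        # ~0 everywhere for unclustered points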

    4

    “This is entering a new era of cosmology from galaxy surveys,” Frieman said. With DES’s first-year data, “galaxy surveys have now caught up to the cosmic microwave background in terms of probing cosmology. That’s really exciting because we’ve got four more years where we’re going to go deeper and cover a larger area of the sky, so we know our error bars are going to shrink.”

    For cosmologists, the key question was whether DES’s new cosmic pie chart based on galaxy surveys would differ from estimates of dark energy and dark matter inferred from Planck’s map of the cosmic microwave background. Comparing the two would reveal whether cosmologists correctly understand how the universe evolved from its early state to its present one. “Planck measures how much dark energy there should be” at present by extrapolating from its state at 380,000 years old, Dodelson said. “We measure how much there is.”

    The DES scientists spent six months processing their data without looking at the results along the way — a safeguard against bias — then “unblinded” the results during a July 7 video conference. After team leaders went through a final checklist, a member of the team ran a computer script to generate the long-awaited plot: DES’s measurement of the fraction of the universe that’s matter (dark and visible combined), displayed together with the older estimate from Planck. “We were all watching his computer screen at the same time; we all saw the answer at the same time. That’s about as dramatic as it gets,” said Gary Bernstein, an astrophysicist at the University of Pennsylvania and co-chair of the DES science committee.

    Planck pegged matter at 33 percent of the cosmos today, plus or minus two or three percentage points. When DES’s plots appeared, applause broke out as the bull’s-eye of the new matter measurement centered on 26 percent, with error bars that were similar to, but barely overlapped with, Planck’s range.

    “We saw they didn’t quite overlap,” Bernstein said. “But everybody was just excited to see that we got an answer, first, that wasn’t insane, and which was an accurate answer compared to before.”

    Statistically speaking, there’s only a slight tension between the two results: Considering their uncertainties, the 26 and 33 percent appraisals are between 1 and 1.5 standard deviations or “sigma” apart, whereas in modern physics you need a five-sigma discrepancy to claim a discovery. The mismatch stands out to the eye, but for now, Frieman and his team consider their galaxy results to be consistent with expectations based on the cosmic microwave background. Whether the hint of a discrepancy strengthens or vanishes as more data accumulate will be worth watching as the DES team embarks on its next analysis, expected to cover its first three years of data.
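    The “sigma” language translates into a one-line calculation: the gap between two measurements divided by their combined uncertainty. The values in the sketch below are illustrative stand-ins; with the actual (asymmetric) published error bars, the comparison lands in the 1-to-1.5-sigma range quoted above.

    # How a "sigma" separation between two measurements is computed: the gap
    # divided by the combined uncertainty. Values below are illustrative only.
    def tension(x1, err1, x2, err2):
        return abs(x1 - x2) / (err1**2 + err2**2) ** 0.5

    # e.g. matter fractions of roughly 0.27 +/- 0.03 and 0.32 +/- 0.02 (illustrative)
    print(f"{tension(0.27, 0.03, 0.32, 0.02):.1f} sigma")   # ~1.4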

    If the possible discrepancy between the cosmic-microwave and galaxy measurements turns out to be real, it could create enough of a tension to lead to the downfall of the “Lambda-CDM model” of cosmology, the standard theory of the universe’s evolution. Lambda-CDM is in many ways a simple model that starts with Albert Einstein’s general theory of relativity, then bolts on dark energy and dark matter. A replacement for Lambda-CDM might help researchers uncover the quantum theory of gravity that presumably underlies everything else.

    What Is Dark Energy?

    According to Lambda-CDM, dark energy is the “cosmological constant,” represented by the Greek symbol lambda Λ in Einstein’s theory; it’s the energy that infuses space itself, when you get rid of everything else. This energy has negative pressure, which pushes space away and causes it to expand. New dark energy arises in the newly formed spatial fabric, so that the density of dark energy always remains constant, even as the total amount of it relative to dark matter increases over time, causing the expansion of the universe to speed up.

    The universe’s expansion is indeed accelerating, as two teams of astronomers discovered in 1998 by observing light from distant supernovas. The discovery, which earned the leaders of the two teams the 2011 Nobel Prize in physics, suggested that the cosmological constant has a positive but “mystifyingly tiny” value, Bernstein said. “There’s no good theory that explains why it would be so tiny.” (This is the “cosmological constant problem” that has inspired anthropic reasoning and the dreaded multiverse hypothesis.)

    On the other hand, dark energy could be something else entirely. Frieman, whom colleagues jokingly refer to as a “fallen theorist,” studied alternative models of dark energy before co-founding DES in 2003 in hopes of testing his and other researchers’ ideas. The leading alternative theory envisions dark energy as a field that pervades space, similar to the “inflaton field” that most cosmologists think drove the explosive inflation of the universe during the Big Bang. The slowly diluting energy of the inflaton field would have exerted a negative pressure that expanded space, and Frieman and others have argued that dark energy might be a similar field that is dynamically evolving today.

    DES’s new analysis incrementally improves the measurement of a parameter that distinguishes between these two theories — the cosmological constant on the one hand, and a slowly changing energy field on the other. If dark energy is the cosmological constant, then the ratio of its negative pressure and density has to be fixed at −1. Cosmologists call this ratio w. If dark energy is an evolving field, then its density would change over time relative to its pressure, and w would be different from −1.
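    The parameter w feeds directly into how dark energy’s density scales with the size of the universe: the density goes as a^(-3(1+w)), so w = −1 keeps it exactly constant while any other value makes it drift over cosmic time. A minimal sketch, using illustrative values of w:

    # How the equation-of-state parameter w changes dark energy's density history:
    # rho_DE(a) scales as a**(-3 * (1 + w)). w = -1 (the cosmological constant)
    # keeps the density exactly constant; other values make it drift. The non -1
    # values below are chosen for illustration only.
    for w in (-1.0, -0.96, -1.04):
        for a in (0.5, 1.0, 2.0):            # past, today, future scale factors
            rho_ratio = a ** (-3 * (1 + w))  # density relative to today (a = 1)
            print(f"w = {w:+.2f}  a = {a:3.1f}  rho_DE/rho_DE0 = {rho_ratio:.3f}")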

    Remarkably, DES’s first-year data, when combined with previous measurements, pegs w’s value at −1, plus or minus roughly 0.04. However, the present level of accuracy still isn’t enough to tell if we’re dealing with a cosmological constant rather than a dynamic field, which could have w within a hair of −1. “That means we need to keep going,” Frieman said.

    The DES scientists will tighten the error bars around w in their next analysis, slated for release next year; they’ll also measure the change in w over time, by probing its value at different cosmic distances. (Light takes time to reach us, so distant galaxies reveal the universe’s past). If dark energy is the cosmological constant, the change in w will be zero. A nonzero measurement would suggest otherwise.

    Larger galaxy surveys might be needed to definitively measure w and the other cosmological parameters. In the early 2020s, the ambitious Large Synoptic Survey Telescope (LSST) will start collecting light from 20 billion galaxies and other cosmological objects, creating a high-resolution map of the universe’s clumpiness that will yield a big jump in accuracy.

    LSST


    LSST Camera, built at SLAC



    LSST telescope, currently under construction at Cerro Pachón, Chile, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    The data might confirm that we occupy a Lambda-CDM universe, infused with an inexplicably tiny cosmological constant and full of dark matter whose nature remains elusive. But Frieman doesn’t discount the possibility of discovering that dark energy is an evolving quantum field, which would invite a deeper understanding by going beyond Einstein’s theory and tying cosmology to quantum physics.

    “With these surveys — DES and LSST that comes after it — the prospects are quite bright,” Dodelson said. “It is more complicated to analyze these things because the cosmic microwave background is simpler, and that is good for young people in the field because there’s a lot of work to do.”

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Formerly known as Simons Science News, Quanta Magazine is an editorially independent online publication launched by the Simons Foundation to enhance public understanding of science. Why Quanta? Albert Einstein called photons “quanta of light.” Our goal is to “illuminate science.” At Quanta Magazine, scientific accuracy is every bit as important as telling a good story. All of our articles are meticulously researched, reported, edited, copy-edited and fact-checked.

     