Tagged: LSST-Large Synoptic Survey Telescope

  • richardmitnick 3:58 pm on February 19, 2019
    Tags: A simplified version of that interface will make some of that data accessible to the public, Every 40 seconds LSST’s camera will snap a new image of the sky, Hundreds of computer cores at NCSA will be dedicated to this task, International data highways, LSST Data Journey, LSST-Large Synoptic Survey Telescope, National Center for Supercomputing Applications at the University of Illinois Urbana-Champaign, NCSA will be the central node of LSST’s data network, The two data centers NCSA and IN2P3 will provide petascale computing power corresponding to several million billion computing operations per second, They are also developing machine learning algorithms to help classify the different objects LSST finds in the sky

    From Symmetry: “An astronomical data challenge” 

    Symmetry Mag
    From Symmetry

    Illustration by Sandbox Studio, Chicago with Ana Kova

    02/19/19
    Manuel Gnida

    The Large Synoptic Survey Telescope will manage unprecedented volumes of data produced each night.

    LSST


    LSST Camera, built at SLAC



    LSST telescope, currently under construction on the El Peñón peak at Cerro Pachón Chile, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    The Large Synoptic Survey Telescope—scheduled to come online in the early 2020s—will use a 3.2-gigapixel camera to photograph a giant swath of the heavens. It’ll keep it up for 10 years, every night with a clear sky, creating the world’s largest astronomical stop-motion movie.

    The results will give scientists both an unprecedented big-picture look at the motions of billions of celestial objects over time, and an ongoing stream of millions of real-time updates each night about changes in the sky.

    Illustration by Sandbox Studio, Chicago with Ana Kova

    Accomplishing both of these tasks will require dealing with a lot of data, more than 20 terabytes each day for a decade. Collecting and storing the enormous volume of raw data, turning it into processed data that scientists can use, distributing it among institutions all over the globe, and doing all of this reliably and fast requires elaborate data management and technology.
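
    To put that in perspective, here is a quick back-of-the-envelope calculation. Only the 20-terabyte nightly figure comes from the article; the assumption that LSST observes essentially every night for 10 years is ours.

    ```python
    # Rough total raw-data volume for the survey, from ~20 TB per night.
    TB_PER_NIGHT = 20
    NIGHTS_PER_YEAR = 365      # assumes observing nearly every night
    SURVEY_YEARS = 10

    total_tb = TB_PER_NIGHT * NIGHTS_PER_YEAR * SURVEY_YEARS
    print(f"~{total_tb:,} TB over the survey (~{total_tb / 1000:.0f} PB)")
    # -> ~73,000 TB, i.e. on the order of 70 petabytes of raw images
    ```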

    International data highways

    This type of data stream can be handled only with high-performance computing, the kind available at the National Center for Supercomputing Applications at the University of Illinois, Urbana-Champaign.

    NCSA U Illinois Urbana-Champaign Blue Waters Cray Linux XE/XK hybrid machine supercomputer

    Unfortunately, the U of I is a long way from Cerro Pachón, the remote Chilean mountaintop where the telescope will actually sit.

    But a network of dedicated data highways will make it feel like the two are right next door.

    LSST Data Journey. Illustration by Sandbox Studio, Chicago with Ana Kova

    Every 40 seconds, LSST’s camera will snap a new image of the sky. The camera’s data acquisition system will read out the data, and, after some initial corrections, send them hurtling down the mountain through newly installed high-speed optical fibers. These fibers have a bandwidth of up to 400 gigabits per second, thousands of times larger than the bandwidth of your typical home internet.

    Within a second, the data will arrive at the LSST base site in La Serena, Chile, which will store a copy before sending them to Chile’s capital, Santiago.

    From there, the data will take one of two routes across the ocean.

    The main route will take them to São Paulo, Brazil, then fire them through cables across the ocean floor to Florida, from where they will be passed to Chicago and finally rerouted to the NCSA facility at the University of Illinois.

    If the primary path is interrupted, the data will take an alternative route through the Republic of Panama instead of Brazil. Either way, the entire trip—covering a distance of about 5000 miles—will take no more than 5 seconds.
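
    A rough sketch of why that timing budget is plausible. The 3.2-gigapixel image size and 400-gigabit link come from the article; the 2 bytes per pixel and the speed of light in optical fiber (about two-thirds of c) are our assumptions.

    ```python
    # Estimate transmission time and propagation delay for one LSST image.
    GIGAPIXELS = 3.2
    BYTES_PER_PIXEL = 2                    # assumed raw readout depth
    LINK_GBPS = 400                        # fiber bandwidth, gigabits/s
    DISTANCE_MILES = 5000
    FIBER_MILES_PER_S = 186_000 * 2 / 3    # light in glass: ~2/3 of c

    image_bits = GIGAPIXELS * 1e9 * BYTES_PER_PIXEL * 8
    transmit_s = image_bits / (LINK_GBPS * 1e9)
    propagate_s = DISTANCE_MILES / FIBER_MILES_PER_S

    print(f"~{transmit_s:.2f} s to push the bits, "
          f"~{propagate_s * 1000:.0f} ms one-way latency")
    # Both are far under the 5-second budget; in practice, routing and
    # protocol overhead along the way dominate the total trip time.
    ```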

    Curating LSST data for the world

    NCSA will be the central node of LSST’s data network. It will archive a second copy of the raw data and maintain key connections to two US-based facilities, the LSST headquarters in Tucson, which will manage science operations, and SLAC National Accelerator Laboratory in Menlo Park, California, which will provide support for the camera. But NCSA will also serve as the main data processing center, getting raw data ready for astrophysics research.

    NCSA will prepare the data at two speeds: quickly, for use in nightly alerts about changes to the sky, and at a more leisurely pace, for release as part of the annual catalogs of LSST data.

    Illustration by Sandbox Studio, Chicago with Ana Kova

    Alert production has to be quick, to give scientists at LSST and other instruments time to respond to transient events, such as a sudden flare from an active galaxy or dying star, or the discovery of a new asteroid streaking across the firmament. LSST will send out about 10 million of these alerts per night, each within a minute after the event.

    Hundreds of computer cores at NCSA will be dedicated to this task. With the help of event brokers—software that facilitates the interaction with the alert stream—everyone in the world will be able to subscribe to all or a subset of these alerts.
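
    As a sketch of what subscribing through an event broker might look like, here is a minimal Kafka consumer. The broker address, topic name, and alert fields are placeholders, not LSST’s actual endpoints, and real alert packets will be Avro-serialized rather than JSON.

    ```python
    # Hypothetical subscription to an LSST-style alert stream via a broker.
    import json
    from confluent_kafka import Consumer

    consumer = Consumer({
        "bootstrap.servers": "broker.example.org:9092",  # placeholder address
        "group.id": "my-transient-search",
        "auto.offset.reset": "latest",
    })
    consumer.subscribe(["lsst-alerts"])                   # placeholder topic

    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None or msg.error():
            continue
        alert = json.loads(msg.value())  # real packets would be Avro-decoded
        # Act only on the subset we care about, e.g. supernova candidates:
        if alert.get("classification") == "SN":
            print("candidate:", alert.get("alertId"))
    ```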

    NCSA will share the task of processing data for the annual data releases with IN2P3, the French National Institute of Nuclear and Particle Physics, which will also archive a copy of the raw data.

    The two data centers will provide petascale computing power, corresponding to several million billion computing operations per second.

    Illustration by Sandbox Studio, Chicago with Ana Kova

    The releases will be curated catalogs of billions of objects containing calibrated images and measurements of object properties, such as positions, shapes and the power of their light emissions. To pull these details from the data, LSST’s data experts are creating advanced software for image processing and analysis. They are also developing machine learning algorithms to help classify the different objects LSST finds in the sky.

    Annual data releases will be made available to scientists in the US and Chile and institutions supporting LSST operations.

    Last but not least, LSST’s data management team is working on an interface that will make it easy for scientists to use the data LSST collects. What’s even better: A simplified version of that interface will make some of that data accessible to the public.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 9:38 am on December 3, 2018
    Tags: LSST-Large Synoptic Survey Telescope, NASA NEOCam

    From Science Alert: “Astronaut Warns This Neglected NASA Telescope Is Our Best Chance to Avoid Death by Asteroid” 

    ScienceAlert

    From Science Alert

    3 DEC 2018
    DAVE MOSHER

    A former NASA astronaut says the agency he used to work for has a duty to protect civilians from killer asteroids, but that it isn’t meeting that obligation.

    The threat of asteroid strikes might seem as abstract as outer space itself. But the risk, while infrequent, is real – and potentially more deadly than the threat posed by some of the most powerful nuclear weapons ever detonated.

    Risk of death from above

    In 1908, a space rock estimated to be several hundred feet in diameter screamed into Earth’s atmosphere at many thousands of miles per hour, causing the foreign body to explode over the remote Tunguska region of Russia with the force of a thermonuclear weapon.

    The resulting blast flattened trees over an area nearly twice the size of New York City.

    More recently, in 2013, a roughly 70-foot-wide meteorite shot over Chelyabinsk, Russia.

    The concussive fireball smashed windows for miles around and sent more than 1,000 people in multiple cities to hospitals, several dozen of them with serious injuries.

    We know they’re out there

    NASA is acutely aware of such risks – and so are lawmakers.

    In 2005, Congress made it one of the agency’s seven core goals to track down 90 percent of near-Earth asteroids 460 feet (140 meters) and larger, objects big enough to cause a worse-than-Tunguska-level event. The deadline for this legally mandated goal is 2020.

    So far, however, telescopes on Earth and in space have found less than one third of these near-Earth objects (NEOs) and NASA will almost certainly fail to hit its deadline.

    Practically, this means tens of thousands of NEOs big enough to wipe out a city have yet to be found, according to a June 2018 report published by the White House.

    The same report concludes that even with current and planned capabilities, less than half of such space rocks will be located by 2033.

    We have the technology to confront the problem

    Russell “Rusty” Schweickart, an aerospace engineer and retired astronaut who flew on the Apollo 9 mission, says there is a solution waiting for this problem: NASA can launch the Near-Earth Object Camera (NEOCam), a small infrared observatory, into space.

    NASA NEOCAM

    “It’s a critical discovery telescope to protect life on Earth, and it’s ready to go,” Schweickart told Business Insider at The Economist Space Summit on November 1.

    NEOCam’s designers have pitched the mission to NASA multiple times. In response to those proposals, the mission has received several million dollars here and there to continue its development, but the agency has denied full funding in every instance because NEOCam was not judged the strongest purely science-focused mission.

    “For God’s sake, fund it as a mainline program. Don’t put it in yet another competition with science,” Schweickart said. “This is a public safety program.”

    How NEOCam would hunt for ‘city killer’ asteroids

    Telescopes that are looking in the right place at the right time can detect a dot of reflected sunlight sneaking across the blackness of space. This allows scientists to calculate an NEO’s mass, speed, orbit, and the odds that it will eventually smack into Earth.

    Small NEOs, though, aren’t very bright. This means a telescope has to be big, see a lot of the sky, and use very advanced hardware to pick them up. These monstrous telescopes take a very long time to build and calibrate and are budget-crushingly expensive.

    Take the Large Synoptic Survey Telescope (LSST), for example, which is one of Earth’s best current hopes of finding killer asteroids.

    LSST


    LSST Camera, built at SLAC



    LSST telescope, currently under construction on the El Peñón peak at Cerro Pachón Chile, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    The project broke ground in 2015 and is expected to cost about half a billion dollars to build.

    Based on its current construction schedule, it won’t be fully operational until late 2021, at the soonest, or able to fulfill the 90 percent detection goal set by Congress until the mid-2030s.

    LSST, like all ground-based observatories, also comes with two major limitations.

    The first: “You can’t see asteroids near the Sun. You’re blinded by the sky,” Mark Sykes, director of the Planetary Science Institute and a scientist on the NEOCam team, previously told Business Insider.

    “Right now we have to wait until those pop out in front of us.”

    Sykes said the second snag is that ground-based telescopes mainly rely on visible light for detection. “If [an asteroid] has a dark surface, it’s going to be very hard to see,” he said.

    NEOCam addresses these two problems by being in space, where Sykes says “you’re not blinded by the sky.”

    The telescope would also use an advanced, high-resolution infrared camera. Infrared is a longer wavelength of light that’s invisible to our eyes, but if a source is strong enough – say, a roaring fire – we can feel invisible light as warmth on our skin.

    Asteroids warmed by the Sun, radioactive elements, or both will emit infrared light, even when they’re too small or dark for ground-based telescopes to see, which means NEOCam could spot them merely by their heat signatures.

    This approach is already proven to work.

    The prime example is NASA’s eight-year-old Wide-field Infrared Survey Explorer (WISE) telescope, which has found roughly 275 NEOs, including 50 potentially hazardous objects, or PHOs (so named because they come within 4.6 million miles of Earth at some point in their orbits).

    NASA Wise Telescope

    1
    (NASA/JPL-Caltech)

    However, it’s a less powerful telescope with a smaller field of view and an older camera that requires cryogenic coolant that eventually runs out (NEOCam’s doesn’t need it), and it wasn’t designed just to hunt asteroids.

    The telescope, now called NEOWISE, may end operations in December 2018.

    NEOCam is Earth’s best immediate hope for quick detection of asteroids

    According to a recent study in The Astronomical Journal, neither NEOCam nor LSST alone would ever achieve Congress’ 90 percent detection mandate – only by working together, the research found, could the observatories achieve that goal over a decade.

    But NEOCam offers significant upgrades over what LSST alone can deliver.

    In its latest pitch to NASA, the NEOCam team proposed to launch in 2021 and find two-thirds of missing objects in the larger-than-460-feet (140 meters) category within four years, or about a decade ahead of LSST’s schedule.

    About 70 percent of all NEOs that are 460 feet (140 meters) or larger have yet to be found, according to a report published by the White House’s National Science and Technology Council (NSTC) in December 2016.

    This amounts to about 25,000 nearby asteroids and roughly 2,300 potentially hazardous ones.

    The NSTC report suggests that an orbiting telescope like NEOCam could also help root out asteroids that would strike with a force somewhere between a Tunguska-type event (occurring about once every 100-200 years) and a Chelyabinsk-type event (occurring about once every 10 years), of which less than 1 percent have been located.

    So if launching a more-capable replacement for NEOWISE is a top priority, why might NASA not fully fund NEOCam for a 2024 launch?

    ‘NASA has a responsibility to do it’

    The team behind NEOCam has pitched the mission to NASA three times – in 2006, 2010, and 2015 – and three times NASA has punted on fully funding the telescope.

    The last time it was denied, sources told Business Insider, the proposal had no major technical weaknesses. Instead, it was a case of trying to jam a square peg into a round bureaucratic hole.

    The NASA competition it was a part of, called Discovery, values scientific firsts – not ensuring humanity’s safety – and thus did not grant NEOCam nearly US$450 million to develop its spacecraft and a rocket with which to launch it.

    NASA instead picked two new space missions to explore the Solar System: Lucy, a probe that will visit swarms of ancient asteroids lurking near Jupiter, and Psyche, which will orbit the all-metal core of a dead planet.

    For Schweickart’s part, he doesn’t care about the distinction.

    “NASA has a responsibility to do it, and it’s not happening,” he said. “It needs to be put into the NASA budget both by NASA and by the Congress.”

    NEOCam did get US$35 million in the 2018 government funding bill to keep itself going, but proponents say this is not enough to get the telescope to a launch pad.

    “In the meantime, NEOCam is in a zombie state and all the while Earth waits inevitably in the crosshairs,” Richard Binzel, a planetary scientist and expert on the hazards posed by asteroids at Massachusetts Institute of Technology, told Business Insider in an email.

    Binzel is one of three scientists who wrote a recent op-ed in Space News in support of fully funding the project, even though they’re not on the project’s team.

    Binzel and others argue NEOCam could get launched by raising the House of Representatives’ proposed budget for NASA planetary defence by another US$40 million (from US$160 million to US$200 million) and by sharing a rocket ride with a spacecraft called IMAP, which the agency plans to launch in 2024.

    By working in coordination with ground-based telescopes, NEOCam could achieve nearly 70 percent detection in four years, and the agency’s target of 90 percent detection in less than 10 years.

    Finding such money is not easy, though. Binzel said the infrequency of asteroid strikes makes it politically painless to fund other initiatives instead, year after year.

    “But the consequences of being wrong are irresponsible, especially when the capability to gain the necessary knowledge is easily within our grasp,” he said.

    “We should simply act like responsible adults and ‘just do it.’ What are we waiting for?”

    It’s now up to President Trump and Congress

    Schweickart acknowledged that NASA’s budgeting and culture have, for decades, been focused on pushing top-tier scientific exploration and that deviating from this norm – Congressional mandate or not – isn’t easy.

    “You’re going upstream. You’re fighting a pretty strong headwind within NASA,” he said, adding that pulling money from science budgets to fund anything is extremely unpopular. “But government agencies are not at liberty to ask for increases in their budget.”

    Schweickart and fellow retired astronaut Ed Lu tried years ago to end-run around the problem by co-founding the B612 Foundation, which is a nonprofit dedicated to developing NEO-detecting capabilities.

    But the group tabled its longest-running (and most expensive) idea, the Sentinel space telescope, in part to improve NEOCam’s chances of getting funded. On Oct. 29, the organisation even publicized its strong support for lawmakers fully funding its rival.

    The public also appears to be on-board with NASA making asteroid detection projects like NEOCam happen.

    In a June poll by Pew Research Center, nearly two-thirds of 2,500 American adults surveyed said that asteroid monitoring should be a top priority for NASA. (Only monitoring climate change was higher.)

    It remains to be seen what the Trump administration will decide to do with NEOCam in the next NASA budget, and if Congress authorizes that funding.

    “That’s a February discussion,” Stephen Jurczyk, NASA’s associate administrator, told Business Insider at the Economist Space Summit.

    “All of that’s all embargoed until the president releases his budget to Congress.”

    Jurczyk acknowledged the tension between NASA’s duty to locate dangerous asteroids and the internal changes required to make that work happen.

    “It is to some extent a cultural issue, where we kind of have this mentality of pure science and pure competition,” he said.

    “I think we’re starting to evolve to a more diverse and more balanced approach between pure science and other things that we need to do.”

    The question is whether those changes will happen before the next Tunguska-type asteroid arrives at Earth. Given enough warning, we might fly out to such a space rock and prevent a calamity or, if there isn’t enough time for that, try to move people out of harm’s way.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 2:49 pm on October 16, 2018
    Tags: Deep Skies Lab, Galaxy Zoo-Citizen Science, Gravitational lenses, LSST-Large Synoptic Survey Telescope

    From Symmetry: “Studying the stars with machine learning” 

    Symmetry Mag
    From Symmetry

    10/16/18
    Evelyn Lamb

    Illustration by Sandbox Studio, Chicago with Corinne Mucha

    To keep up with an impending astronomical increase in data about our universe, astrophysicists turn to machine learning.

    Kevin Schawinski had a problem.

    In 2007 he was an astrophysicist at Oxford University and hard at work reviewing seven years’ worth of photographs from the Sloan Digital Sky Survey—images of more than 900,000 galaxies. He spent his days looking at image after image, noting whether a galaxy looked spiral or elliptical, or logging which way it seemed to be spinning.

    Technological advancements had sped up scientists’ ability to collect information, but scientists were still processing information at the same rate. After working on the task full time and barely making a dent, Schawinski and colleague Chris Lintott decided there had to be a better way to do this.

    There was: a citizen science project called Galaxy Zoo. Schawinski and Lintott recruited volunteers from the public to help out by classifying images online. Showing the same images to multiple volunteers allowed them to check one another’s work. More than 100,000 people chipped in and condensed a task that would have taken years into just under six months.

    Citizen scientists continue to contribute to image-classification tasks. But technology also continues to advance.

    The Dark Energy Spectroscopic Instrument, scheduled to begin in 2019, will measure the velocities of about 30 million galaxies and quasars over five years.

    LBNL/DESI Dark Energy Spectroscopic Instrument for the Nicholas U. Mayall 4-meter telescope at Kitt Peak National Observatory near Tucson, Ariz, USA

    The Large Synoptic Survey Telescope, scheduled to begin in the early 2020s, will collect more than 30 terabytes of data each night—for a decade.

    LSST


    LSST Camera, built at SLAC



    LSST telescope, currently under construction on the El Peñón peak at Cerro Pachón Chile, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    “The volume of datasets [from those surveys] will be at least an order of magnitude larger,” says Camille Avestruz, a postdoctoral researcher at the University of Chicago.

    To keep up, astrophysicists like Schawinski and Avestruz have recruited a new class of non-scientist scientists: machines.

    Researchers are using artificial intelligence to help with a variety of tasks in astronomy and cosmology, from image analysis to telescope scheduling.

    Superhuman scheduling, computerized calibration

    Artificial intelligence is an umbrella term for ways in which computers can seem to reason, make decisions, learn, and perform other tasks that we associate with human intelligence. Machine learning is a subfield of artificial intelligence that uses statistical techniques and pattern recognition to train computers to make decisions, rather than programming more direct algorithms.

    In 2017, a research group from Stanford University used machine learning to study images of strong gravitational lensing, a phenomenon in which an accumulation of matter in space is dense enough that it bends light waves as they travel around it.

    Gravitational Lensing NASA/ESA

    Because many gravitational lenses can’t be accounted for by luminous matter alone, a better understanding of gravitational lenses can help astronomers gain insight into dark matter.

    In the past, scientists have conducted this research by comparing actual images of gravitational lenses with large numbers of computer simulations of mathematical lensing models, a process that can take weeks or even months for a single image. The Stanford team showed that machine learning algorithms can speed up this process by a factor of millions.
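
    For a sense of what such an algorithm looks like, here is a deliberately small convolutional network that classifies image cutouts as lensed or not lensed. This is a sketch, not the Stanford group’s actual model; the layer sizes and the 64x64 single-band input are illustrative assumptions.

    ```python
    # Minimal CNN sketch for lens/no-lens classification of image cutouts.
    import torch
    import torch.nn as nn

    class LensClassifier(nn.Module):
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            )
            # 64x64 input -> 16x16 feature maps after two poolings
            self.classifier = nn.Linear(32 * 16 * 16, 2)

        def forward(self, x):
            x = self.features(x)                 # extract spatial features
            return self.classifier(x.flatten(1))

    model = LensClassifier()
    batch = torch.randn(8, 1, 64, 64)            # 8 fake single-band cutouts
    logits = model(batch)                        # scores: [not lensed, lensed]
    ```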

    Greg Stewart, SLAC National Accelerator Laboratory

    Schawinski, who is now an astrophysicist at ETH Zürich, uses machine learning in his current work. His group has used tools called generative adversarial networks, or GANs, to recover clean versions of images that have been degraded by random noise. They recently published a paper [Astronomy and Astrophysics] about using AI to generate and test new hypotheses in astrophysics and other areas of research.

    Another application of machine learning in astrophysics involves solving logistical challenges such as scheduling. There are only so many hours in a night that a given high-powered telescope can be used, and it can only point in one direction at a time. “It costs millions of dollars to use a telescope for on the order of weeks,” says Brian Nord, a physicist at the University of Chicago and part of Fermilab’s Machine Intelligence Group, which is tasked with helping researchers in all areas of high-energy physics deploy AI in their work.

    Machine learning can help observatories schedule telescopes so they can collect data as efficiently as possible. Both Schawinski’s lab and Fermilab are using a technique called reinforcement learning to train algorithms to solve problems like this one. In reinforcement learning, an algorithm isn’t trained on “right” and “wrong” answers but through differing rewards that depend on its outputs. The algorithms must strike a balance between the safe, predictable payoffs of understood options and the potential for a big win with an unexpected solution.
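
    The following toy sketch shows the reinforcement-learning idea in miniature, as an epsilon-greedy bandit choosing which sky field to observe next. It is a stand-in for the real schedulers: the reward function, field count, and epsilon value are all made up.

    ```python
    # Epsilon-greedy "scheduler" toy: learn which field pays off best.
    import random

    N_FIELDS = 5
    EPSILON = 0.1                      # how often to try an unexpected option
    value_estimate = [0.0] * N_FIELDS  # learned payoff estimate per field
    counts = [0] * N_FIELDS

    def observe(field):
        """Made-up reward; in reality it would reflect seeing, airmass,
        time since last visit, and scientific priority."""
        return random.gauss(field * 0.1, 0.05)

    for step in range(1000):
        if random.random() < EPSILON:
            field = random.randrange(N_FIELDS)                             # explore
        else:
            field = max(range(N_FIELDS), key=lambda f: value_estimate[f])  # exploit
        reward = observe(field)
        counts[field] += 1
        # Incremental average: nudge the estimate toward the observed reward.
        value_estimate[field] += (reward - value_estimate[field]) / counts[field]
    ```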

    Illustration by Sandbox Studio, Chicago with Corinne Mucha

    A growing field

    When computer science graduate student Shubhendu Trivedi of the Toyota Technological Institute at the University of Chicago started teaching a graduate course on deep learning with one of his mentors, Risi Kondor, he was pleased with how many researchers from the physical sciences signed up for it. They didn’t know much about how to use AI in their research, and Trivedi realized there was an unmet need for machine learning experts to help scientists in different fields find ways of exploiting these new techniques.

    The conversations he had with researchers in his class evolved into collaborations, including participation in the Deep Skies Lab, an astronomy and artificial intelligence research group co-founded by Avestruz, Nord and astronomer Joshua Peek of the Space Telescope Science Institute. Earlier this month, they submitted their first peer-reviewed paper demonstrating the efficiency of an AI-based method to measure gravitational lensing in the Cosmic Microwave Background [CMB].

    Similar groups are popping up across the world, from Schawinski’s group in Switzerland to the Centre for Astrophysics and Supercomputing in Australia. And adoption of machine learning techniques in astronomy is increasing rapidly. In an arXiv search of astronomy papers, the terms “deep learning” and “machine learning” appear more in the titles of papers from the first seven months of 2018 than from all of 2017, which in turn had more than 2016.
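
    A search of that kind can be reproduced against arXiv’s public export API. This is a sketch; the query string (title contains “deep learning”, category astro-ph) is our assumption about how to scope “astronomy papers”, and adding date restrictions would reveal the year-over-year trend.

    ```python
    # Count arXiv astro-ph papers with "deep learning" in the title.
    import re
    import urllib.request

    url = ("http://export.arxiv.org/api/query?"
           "search_query=ti:%22deep+learning%22+AND+cat:astro-ph*&max_results=0")
    with urllib.request.urlopen(url) as resp:
        feed = resp.read().decode()

    match = re.search(r"<opensearch:totalResults[^>]*>(\d+)<", feed)
    print("matching titles:", match.group(1) if match else "parse failed")
    ```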

    “Five years ago, [machine learning algorithms in astronomy] were esoteric tools that performed worse than humans in most circumstances,” Nord says. Today, more and more algorithms are consistently outperforming humans. “You’d be surprised at how much low-hanging fruit there is.”

    But there are obstacles to introducing machine learning into astrophysics research. One of the biggest is the fact that machine learning is a black box. “We don’t have a fundamental theory of how neural networks work and make sense of things,” Schawinski says. Scientists are understandably nervous about using tools without fully understanding how they work.

    Another related stumbling block is uncertainty. Machine learning often depends on inputs that all have some amount of noise or error, and the models themselves make assumptions that introduce uncertainty. Researchers using machine learning techniques in their work need to understand these uncertainties and communicate those accurately to each other and the broader public.

    The state of the art in machine learning is changing so rapidly that researchers are reluctant to make predictions about what will be coming even in the next five years. “I would be really excited if as soon as data comes off the telescopes, a machine could look at it and find unexpected patterns,” Nord says.

    No matter what form future advances take, the data keeps coming faster and faster, and researchers are increasingly convinced that artificial intelligence is going to be necessary to help them keep up.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 3:17 pm on May 14, 2018
    Tags: LSST-Large Synoptic Survey Telescope, The next big discovery in astronomy? Scientists probably found it years ago – but they don’t know it yet

    From The Conversation: “The next big discovery in astronomy? Scientists probably found it years ago – but they don’t know it yet” 

    Conversation
    From The Conversation

    May 14, 2018
    Eileen Meyer

    An artist’s illustration of a black hole “eating” a star. NASA/JPL-Caltech

    Earlier this year, astronomers stumbled upon a fascinating finding: Thousands of black holes likely exist near the center of our galaxy.

    Hundreds — Perhaps Thousands — of Black Holes Occupy the Center of the Milky Way

    The X-ray images that enabled this discovery weren’t from some state-of-the-art new telescope. Nor were they even recently taken – some of the data was collected nearly 20 years ago.

    No, the researchers discovered the black holes by digging through old, long-archived data.

    Discoveries like this will only become more common, as the era of “big data” changes how science is done. Astronomers are gathering an exponentially greater amount of data every day – so much that it will take years to uncover all the hidden signals buried in the archives.

    The evolution of astronomy

    Sixty years ago, the typical astronomer worked largely alone or in a small team. They likely had access to a respectably large ground-based optical telescope at their home institution.

    Their observations were largely confined to optical wavelengths – more or less what the eye can see. That meant they missed signals from a host of astrophysical sources, which can emit non-visible radiation from very low-frequency radio all the way up to high-energy gamma rays. For the most part, if you wanted to do astronomy, you had to be an academic or eccentric rich person with access to a good telescope.

    Old data was stored in the form of photographic plates or published catalogs. But accessing archives from other observatories could be difficult – and it was virtually impossible for amateur astronomers.

    Today, there are observatories that cover the entire electromagnetic spectrum. No longer operated by single institutions, these state-of-the-art observatories are usually launched by space agencies and are often joint efforts involving many countries.

    With the coming of the digital age, almost all data are publicly available shortly after they are obtained. This makes astronomy very democratic – anyone who wants to can reanalyze almost any data set that makes the news. (You too can look at the Chandra data that led to the discovery of thousands of black holes!)

    These observatories generate a staggering amount of data. For example, the Hubble Space Telescope, operating since 1990, has made over 1.3 million observations and transmits around 20 GB of raw data every week, which is impressive for a telescope first designed in the 1970s.

    NASA/ESA Hubble Telescope

    The Atacama Large Millimeter Array in Chile now anticipates adding 2 TB of data to its archives every day.

    ESO/NRAO/NAOJ ALMA Array in Chile in the Atacama at Chajnantor plateau, at 5,000 metres

    Data firehose

    The archives of astronomical data are already impressively large. But things are about to explode.

    Each generation of observatories is usually at least 10 times more sensitive than the previous, either because of improved technology or because the mission is simply larger. Depending on how long a new mission runs, it can detect hundreds of times more astronomical sources than previous missions at that wavelength.

    For example, compare the early EGRET gamma ray observatory, which flew in the 1990s, to NASA’s flagship mission Fermi, which turns 10 this year. EGRET detected only about 190 gamma ray sources in the sky. Fermi has seen over 5,000.

    NASA/Fermi LAT


    NASA/Fermi Gamma Ray Space Telescope

    The Large Synoptic Survey Telescope, an optical telescope currently under construction in Chile, will image the entire sky every few nights. It will be so sensitive that it will generate 10 million alerts per night on new or transient sources, leading to a catalog of over 15 petabytes after 10 years.

    LSST

    LSST Camera, built at SLAC



    LSST telescope, currently under construction on the El Peñón peak at Cerro Pachón Chile, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    The Square Kilometre Array, when completed in 2020, will be the most sensitive telescope in the world, capable of detecting airport radar stations of alien civilizations up to 50 light-years away. In just one year of activity, it will generate more data than the entire internet.


    SKA/ASKAP radio telescope at the Murchison Radio-astronomy Observatory (MRO) in Mid West region of Western Australia


    SKA Murchison Widefield Array, Boolardy station in outback Western Australia, at the Murchison Radio-astronomy Observatory (MRO)


    SKA Meerkat telescope, 90 km outside the small Northern Cape town of Carnarvon, SA


    SKA LOFAR core (“superterp”) near Exloo, Netherlands


    These ambitious projects will test scientists’ ability to handle data. Images will need to be automatically processed – meaning that the data will need to be reduced down to a manageable size or transformed into a finished product. The new observatories are pushing the envelope of computational power, requiring facilities capable of processing hundreds of terabytes per day.

    The resulting archives – all publicly searchable – will contain 1 million times more information than can be stored on your typical 1 TB backup disk.

    Unlocking new science

    The data deluge will make astronomy become a more collaborative and open science than ever before. Thanks to internet archives, robust learning communities and new outreach initiatives, citizens can now participate in science. For example, with the computer program Einstein@Home, anyone can use their computer’s idle time to help search for gravitational waves from colliding black holes.

    It’s an exciting time for scientists, too. Astronomers like myself often study physical phenomena on timescales so wildly beyond the typical human lifetime that watching them in real-time just isn’t going to happen. Events like a typical galaxy merger – which is exactly what it sounds like – can take hundreds of millions of years. All we can capture is a snapshot, like a single still frame from a video of a car accident.

    However, there are some phenomena that occur on shorter timescales, taking just a few decades, years or even seconds. That’s how scientists discovered those thousands of black holes in the new study. It’s also how they recently realized that the X-ray emission from the center of a nearby dwarf galaxy has been fading since first detected in the 1990s. These new discoveries suggest that more will be found in archival data spanning decades.

    In my own work, I use Hubble archives to make movies of “jets,” high-speed plasma ejected in beams from black holes. I used over 400 raw images spanning 13 years to make a movie of the jet in nearby galaxy M87. That movie showed, for the first time, the twisting motions of the plasma, suggesting that the jet has a helical structure.
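
    A minimal version of that workflow, not the author’s actual pipeline, might look like the sketch below. It assumes a directory of already-aligned FITS cutouts named epoch_*.fits, and saving the movie requires ffmpeg.

    ```python
    # Turn a stack of archival FITS images into a time-lapse movie.
    import glob
    from astropy.io import fits
    import matplotlib.pyplot as plt
    from matplotlib.animation import FuncAnimation

    frames = [fits.getdata(f) for f in sorted(glob.glob("epoch_*.fits"))]

    fig, ax = plt.subplots()
    im = ax.imshow(frames[0], origin="lower", cmap="gray")

    def update(i):
        im.set_data(frames[i])        # swap in the next epoch's image
        ax.set_title(f"epoch {i}")
        return [im]

    anim = FuncAnimation(fig, update, frames=len(frames), interval=200)
    anim.save("jet_movie.mp4")        # requires ffmpeg on the system
    ```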

    This kind of work was only possible because other observers, for other purposes, just happened to capture images of the source I was interested in, back when I was in kindergarten. As astronomical images become larger, higher resolution and ever more sensitive, this kind of research will become the norm.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    The Conversation US launched as a pilot project in October 2014. It is an independent source of news and views from the academic and research community, delivered direct to the public.
    Our team of professional editors work with university and research institute experts to unlock their knowledge for use by the wider public.
    Access to independent, high quality, authenticated, explanatory journalism underpins a functioning democracy. Our aim is to promote better understanding of current affairs and complex issues. And hopefully allow for a better quality of public discourse and conversation.

     
  • richardmitnick 12:53 pm on April 17, 2018
    Tags: LSST-Large Synoptic Survey Telescope

    From Symmetry: “The world’s largest astronomical movie” 

    Symmetry Mag
    Symmetry

    04/17/18
    Manuel Gnida

    Artwork by Sandbox Studio, Chicago with Ana Kova

    When the Large Synoptic Survey Telescope begins to survey the night sky in the early 2020s, it’ll collect a treasure trove of data.

    LSST


    LSST Camera, built at SLAC



    LSST telescope, currently under construction on the El Peñón peak at Cerro Pachón Chile, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    The information will benefit a wide range of groundbreaking astronomical and astrophysical research, addressing topics such as dark matter, dark energy, the formation of galaxies and detailed studies of objects in our very own cosmic neighborhood, the Milky Way.

    LSST’s centerpiece will be its 3.2-gigapixel camera, which is being assembled at the US Department of Energy’s SLAC National Accelerator Laboratory. Every few days, the largest digital camera ever built for astronomy will compile a complete image of the Southern sky. Moreover, it’ll do so over and over again for a period of 10 years. It’ll track the motions and changes of tens of billions of stars, galaxies and other objects in what will be the world’s largest stop-motion movie of the universe.

    Fulfilling this extraordinary task requires extraordinary technology. The camera will be the size of a small SUV, weigh in at a whopping 3 tons, and use state-of-the-art optics, imaging technology and data management tools. But how exactly will it work?

    Artwork by Sandbox Studio, Chicago with Ana Kova

    Collecting ancient light

    It all starts with choosing the right location for the telescope. Astronomers want the sharpest images of the dimmest objects for their analyses, and they also want to maximize their observation time. They need the nights to be dark and the air to be dry and stable.

    It turns out that the Atacama Desert, a plateau in the foothills of the Andes Mountains, scores very high for these criteria. That’s where LSST will be located—at nearly 8700 feet altitude on the Cerro Pachón ridge in Chile, 60 miles from the coastal town of La Serena.

    The next challenge is that most objects LSST researchers want to study are so far away that their light has been traveling through space for millions to billions of years. It arrives on Earth merely as a faint glow, and astronomers need to collect as much of that glow as possible. For this purpose, LSST will have a large primary mirror with a diameter close to 28 feet.

    The mirror will be part of a sophisticated three-mirror system that will reflect and focus the cosmic light into the camera.

    The unique optical design is crucial for the telescope’s extraordinary field of view—a measure of the area of sky captured with every snapshot. At 9.6 square degrees, corresponding to 40 times the area of the full moon, the large field of view will allow astronomers to put together a complete map of the Southern night sky every few days.
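
    The full-moon comparison is easy to check. The moon’s apparent diameter, roughly 0.5 degrees (it varies between about 0.49 and 0.57), is common knowledge rather than a figure from the article.

    ```python
    # Compare LSST's 9.6-square-degree field of view to the full moon.
    import math

    fov_sq_deg = 9.6
    moon_diameter_deg = 0.5
    moon_area = math.pi * (moon_diameter_deg / 2) ** 2   # ~0.196 sq deg
    print(f"~{fov_sq_deg / moon_area:.0f}x the full moon")
    # -> ~49 with a 0.5-degree moon; the commonly quoted "40 times" follows
    #    from adopting a slightly larger apparent diameter.
    ```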

    After bouncing off the mirrors, the ancient cosmic light will enter the camera through a set of three large lenses. The largest one will have a diameter of more than 5 feet.

    Together with the mirrors, the lenses’ job is to focus the light as sharply as possible onto the focal plane—a grid of light-sensitive sensors at the back of the camera where the light from the sky will be detected.

    A filter changer will insert filters in front of the third lens, allowing astronomers to take images with different kinds of cosmic light that range from the ultraviolet to the near-infrared. This flexibility enhances the range of possible observations with LSST. For example, with an infrared filter researchers can look right through dust and get a better view of objects obscured by it. By comparing how bright an object is when seen through different filters, astronomers also learn how its emitted light varies with the wavelength, which reveals details about how the light is produced.
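
    In practice, comparing how bright an object is through different filters boils down to a color index computed from the standard magnitude relation m = -2.5 log10(flux) + const. A small illustration, with made-up flux values (g and r are real LSST bands):

    ```python
    # Color index from fluxes measured through two filters.
    import math

    flux_g = 1.0e-15   # brightness through the g filter (arbitrary units)
    flux_r = 2.5e-15   # brightness through the r filter

    color_g_minus_r = -2.5 * math.log10(flux_g / flux_r)
    print(f"g - r = {color_g_minus_r:.2f}")   # positive -> redder object
    ```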

    Artwork by Sandbox Studio, Chicago with Ana Kova

    An Extraordinary Imaging Device

    The heart of LSST’s camera is its 25-inch-wide focal plane. That’s where the light of stars and galaxies will be turned into electrical signals, which will then be used to reconstruct images of the sky. The focal plane will hold 189 imaging sensors, called charge-coupled devices, that perform this transformation.

    Each CCD measures 4096 by 4096 pixels, and together they’ll add up to the camera’s 3.2 gigapixels. A “good” star will span only a handful of pixels, whereas distant galaxies might appear as somewhat larger fuzzballs.
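
    The gigapixel figure follows directly from those numbers:

    ```python
    # Where the 3.2 gigapixels come from: 189 CCDs at 4096 x 4096 pixels.
    n_ccds = 189
    pixels = n_ccds * 4096 * 4096
    print(f"{pixels / 1e9:.2f} gigapixels")   # -> 3.17, rounded to 3.2
    ```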

    The focal plane will consist of 21 smaller square arrays, called rafts, with nine CCDs each. This modular structure will make it easier and less costly to replace imaging sensors if needed in the future.

    To the delight of astronomers interested in extremely dim objects, the camera will have a large aperture (f/1.2, for the photographers among us), meaning that it’ll let a lot of light onto the imaging sensors. However, the large aperture will also make the depth of field very shallow, which means that objects will become blurry very quickly if they are not precisely projected onto the focal plane. That’s why the focal plane will need to be extremely flat, demanding that individual CCDs don’t stick out or recess by more than 0.0004 inches.

    To eliminate unwanted background signals, known as dark currents, the sensors will also need to be cooled to minus 150 degrees Fahrenheit. The temperature will need to be kept stable to half a degree. Because water vapor inside the camera housing would form ice on the sensors at this chilly temperature, the focal plane must also be kept in a vacuum.

    In addition to the 189 “science” sensors that will capture images of the sky, the focal plane will also have three specialty sensors in each of its four corners. Two so-called guiders will frequently monitor the position of a reference star and help LSST stay in sync with the Earth’s rotation. The third sensor, called a wavefront sensor, will be split into two halves that will be positioned six-hundredths of an inch above and below the focal plane. It’ll see objects as blurry “donuts” and provide information that will be used to adjust the telescope’s focus.

    Cinematography of astronomical dimension

    Once the camera has taken enough data from a patch in the sky, about every 36 seconds, the telescope will be repositioned to look at the next spot. A computer algorithm will determine the patches in the sky that will be surveyed by LSST on any given night.

    While the telescope is moving, a shutter between the filter and the camera’s third lens will close to prevent more light from falling onto the imaging sensors. At the same time, the CCDs will be read out and their information digitized.

    The data will be sent into the processing and analysis pipeline that will handle LSST’s enormous flood of information (about 20 terabytes of data every single night). There, it will be turned into usable images. The system will also flag potentially interesting events and send out alerts to astronomers within a minute.

    This way—patch by patch—a complete image of the entire Southern sky will be stitched together every few days. Then the imaging process will start over and repeat for the 10-year duration of the survey, ultimately creating the largest time-lapse movie of the universe ever made and providing researchers with unprecedented research opportunities.

    For more information on LSST, visit LSST’s website or SLAC’s LSST camera website.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 1:27 pm on February 9, 2018
    Tags: LSST-Large Synoptic Survey Telescope

    From LSST: “LSST’s Auxiliary Telescope” 

    LSST

    Large Synoptic Survey Telescope

    February 6, 2018

    In tandem with LSST’s construction on Cerro Pachón, a smaller telescope will soon be assembled on nearby calibration hill, a short distance away from the main LSST Facility. LSST’s 1.2-meter Auxiliary Telescope will measure atmospheric transmission, which refers to how directly light is transmitted through the Earth’s atmosphere in a given spot, as opposed to being absorbed or scattered. Because the presence of certain molecules and particles in the atmosphere will change the color of light detected by the LSST telescope, data collected by the Auxiliary Telescope, as it mirrors the nightly movements of LSST, will inform the catalog corrections that need to be made to LSST data in order to render it more accurate.

    Elements in the atmosphere that affect how light is detected by a ground based telescope like LSST include water, oxygen, and ozone, as well as aerosols like sea salt, dust from volcanoes, and smoke from forest fires. The presence and quantity of these elements varies from night to night, so the Auxiliary Telescope will provide this important complementary data for LSST throughout survey operations. According to Calibration Hardware Scientist Patrick Ingraham, “Having a dedicated auxiliary telescope supporting the main telescope is somewhat unique, and it will increase the quality of data produced by LSST.”
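
    A heavily simplified illustration of the kind of correction such transmission data feeds is the standard airmass-extinction relation, m_true ≈ m_observed − k·X. The coefficient below is made up, and the real LSST calibration is wavelength- and time-dependent, using the Auxiliary Telescope’s measured transmission spectra.

    ```python
    # Toy atmospheric-extinction correction (k in magnitudes per airmass).
    def corrected_magnitude(m_observed, airmass, k=0.12):
        """Remove atmospheric dimming: m_true ~= m_obs - k * X."""
        return m_observed - k * airmass

    # A star observed at airmass 1.8 (fairly low on the sky):
    print(corrected_magnitude(17.50, airmass=1.8))   # -> 17.284 (brighter)
    ```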

    The Auxiliary Telescope itself wasn’t built from scratch; it’s an existing telescope that has been repurposed for its role in the LSST survey. Since being moved from its original location on nearby Kitt Peak in May 2014, it’s been housed in the workshop at LSST’s Project Office in Tucson, AZ. Refurbishment work has included replacement of all the telescope’s electrical parts, including the motors and the position encoders, which record the exact position of the telescope at any given time. Mechanically speaking, the telescope is largely unchanged. Its mirrors, which were removed while work was done, will be recoated and reinstalled once the telescope arrives on Cerro Pachón; they are currently stored in separate crates that will protect them during shipping.

    Currently, the subcontractor working on the refurbishment project is almost finished with the wiring of the telescope’s electrical components. Once that’s complete, the telescope will undergo functional testing of its mechanical and electrical systems. Individual tasks that make up this testing include driving the telescope toward its upper and lower limits and ensuring the system will shut off before those limits are reached (preventing damage to the telescope), testing for excessive vibration, and testing the speed at which the telescope slews, or moves from one spot to the next. Extensive functional testing is critical now, because once the telescope is on Cerro Pachón there won’t be sufficient facilities to easily make repairs. Optical testing of the telescope will occur after the telescope is installed in its facility on the summit and re-integrated with its mirrors.

    Once the telescope is officially ready to be shipped from Tucson to Chile, the individual telescope assemblies will be packed in custom crates, and these crates will be loaded into a shipping container. It will take about 2 months for the shipping container to get from Tucson to Cerro Pachón. Once there, the telescope will be installed in a few pieces, with a crane, through the dome of its facility on calibration hill. Photos of the Auxiliary Telescope in the workshop, as well as of the facility on Cerro Pachón, can be viewed and downloaded from the LSST Gallery.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    LSST telescope, currently under construction at Cerro Pachón Chile, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    LSST Interior
    LSST telescope, currently under construction at Cerro Pachón Chile.

    LSST/Camera, built at SLAC

    The LSST is a new kind of telescope. Currently under construction in Chile, it is being built to rapidly survey the night-time sky. Compact and nimble, the LSST will move quickly between images, yet its large mirror and large field of view—almost 10 square degrees of sky, or 40 times the size of the full moon—work together to deliver more light from faint astronomical objects than any optical telescope in the world.

    From its mountaintop site in the foothills of the Andes, the LSST will take more than 800 panoramic images each night with its 3.2 billion-pixel camera, recording the entire visible sky twice each week. Each patch of sky it images will be visited 1000 times during the survey. With a light-gathering power equal to a 6.7-m diameter primary mirror, each of its 30-second observations will be able to detect objects 10 million times fainter than visible with the human eye. A powerful data system will compare new with previous images to detect changes in brightness and position of objects as big as far-distant galaxy clusters and as small as near-by asteroids.
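
    In magnitude units, “10 million times fainter” works out as follows; the magnitude-6 naked-eye limit is a standard rule of thumb, not a figure from this page.

    ```python
    # Convert a flux ratio of 10 million into astronomical magnitudes.
    import math

    delta_mag = 2.5 * math.log10(1e7)                    # -> 17.5 magnitudes
    print(f"limiting magnitude ~ {6 + delta_mag:.1f}")   # -> ~23.5
    ```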

    The LSST’s combination of telescope, mirror, camera, data processing, and survey will capture changes in billions of faint objects and the data it provides will be used to create an animated, three-dimensional cosmic map with unprecedented depth and detail, giving us an entirely new way to look at the Universe. This map will serve a myriad of purposes, from locating that mysterious substance called dark matter and characterizing the properties of the even more mysterious dark energy, to tracking transient objects, to studying our own Milky Way Galaxy in depth. It will even be used to detect and track potentially hazardous asteroids—asteroids that might impact the Earth and cause significant damage.

    As with past technological advances that opened new windows of discovery, such a powerful system for exploring the faint and transient Universe will undoubtedly serve up surprises.

    Plans for sharing the data from LSST with the public are as ambitious as the telescope itself. Anyone with a computer will be able to view the moving map of the Universe created by the LSST, including objects a hundred million times fainter than can be observed with the unaided eye. The LSST project will provide analysis tools to enable both students and the public to participate in the process of scientific discovery. We invite you to learn more about LSST science.

    The LSST will be unique: no existing telescope or proposed camera could be retrofitted or re-designed to cover ten square degrees of sky with a collecting area of forty square meters. Named the highest priority for ground-based astronomy in the 2010 Decadal Survey, the LSST project formally began construction in July 2014.

     
  • richardmitnick 12:47 pm on January 23, 2018
    Tags: LSST-Large Synoptic Survey Telescope

    From Texas A&M: “A&M professors help develop new telescope” 

    Texas A&M logo

    Texas A&M

    Dec 4, 2017
    Elaine Soliman

    The new telescope will help change how astronomers study space.

    LSST


    LSST Camera, built at SLAC



    LSST telescope, currently under construction on the El Peñón peak at Cerro Pachón Chile, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    The LSST Facility will change how astronomers study the sky by providing a new method of examination.

    The ability to finally analyze and learn more about dark matter and dark energy is just around the corner thanks to the innovative Large Synoptic Survey Telescope, or LSST. It is being developed by an international team of thousands of people, including three professors at Texas A&M.

    The LSST is a groundbreaking telescope that will develop a digital picture of the entire sky continuously over a three-night period. The project is funded by the National Science Foundation and the Department of Energy, according to Lucas Macri, institutional board representative of the LSST. The LSST is expected to become operational as soon as 2022. This project wasn’t feasible fifteen years ago, but the LSST will bring in all sorts of new data about the universe around us, according to Macri.

    “Imagine if our only knowledge of biology was one picture of a cell that you took once,” Macri said. “The nice thing about a microscope … is you can actually see a cell … that dynamic and that temporal coverage, of in this case, a cell, gives you a lot of information and it is the same thing with the sky. We’ve been able to study small patches of the sky repeatedly, take many pictures of them, see things that change, discover new stars, exploding stars, asteroids, whatever. But we have never been able to do an unbiased complete survey of the sky.”

    The LSST is able to do this using charge-coupled devices that are sensitive to light. It also requires a large mirror to be cast, with a lot of glass melted into the right shape. The LSST has two mirror surfaces figured on a single piece of cast glass, which collect the light with enough quality that, eventually, one can make these pictures of the sky, according to Macri.

    “The telescope was able to be designed to look at a large part of the sky,” said Nicholas Suntzeff, university distinguished professor and head of the TAMU Astronomy Group. “The digital detector for the LSST is the size of [a] table. Imagine covering [a] table with silicon chips and cramming them all together. And so every image you take with this telescope is the size of [a] table and you’re taking images every twenty seconds all night long. So this is an unbelievable size of an image of a focal plane. And compare that with the camera that’s being built for the LSST and so that’s what [a] table is, it’s the size of the image. As a person who’s built instruments, that just blows my mind that we’re able to do something like that.”

    This flood of new information about the entire universe can be utilized to further understand dark matter and dark energy. With the development of the LSST, astronomers can learn more about dark matter and energy than was ever possible before. They will also be able to better understand transient objects, according to Suntzeff.

    “This telescope will be the first big telescope to devote itself to searching for what we call in astronomy the transient sky,” Suntzeff said. “Stars that vary get brighter and fainter. Stars that explode. Galaxies that get brighter and fainter. Black holes that rip apart stars. Gamma Ray explosions at the edge of the universe. And we’ll discover things that we can’t even imagine right now. That’s one of the beauties of astronomy.”

    Every time a telescope is built, that opens up a new way of looking at the universe, Suntzeff said.

    “We anticipate cool things to discover, [but] ultimately what [will be] really exciting [is] to discover things that we had no idea existed,” Suntzeff said. “So, in this case we’re opening up the transient sky and we will find things beyond our imaginations.”

    The LSST will also help predict whether an asteroid is projected to hit the Earth, according to Macri. Macri said that if an asteroid the size of Kyle Field hit the Earth, the impact itself wouldn’t be the problem; the dust thrown into the atmosphere would eventually darken skies over the whole Earth.

    The LSST is currently being developed as a worldwide project. Its headquarters are in Tucson, Arizona. Scientists at SLAC National Accelerator Laboratory at Stanford are developing the camera, which will be the largest digital camera ever assembled. The telescope itself is being built in Chile.

    Suntzeff, who picked the mountain in Chile on which to build the telescope, was actually one of the first people involved with the project approximately twenty years ago. According to Suntzeff, the LSST has brought together the astronomy and statistics departments.

    “It’s unbelievable how much data is going to come from this telescope,” Suntzeff said. “And in order to sift through the data we can’t just be normal astronomers. We have to use advanced mathematical and statistical techniques. So we’ve begun a program in collaboration with the statistics department in studying something that’s called astrostatistics. And astrostatistics will allow us to have tools to allow us to search very large databases for objects of interest.”

    Currently, these TAMU professors are preparing their graduate students for what is to come in the next few years with the completion of the LSST.

    “Well, I am preparing some software; I was thinking about getting students to work [on] LSST-related problems, in particular to identify objects that may be interesting to us,” said Lifan Wang, professor of physics and astronomy at TAMU and member of the LSST Dark Energy Science Collaboration.

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition
    Texas A&M University is located in College Station, Texas, about 90 miles northwest of Houston and within a two- to three-hour drive from Austin and Dallas.
    It is home to more than 50,000 students, ranking as the sixth-largest university in the country, with more than 370,000 former students worldwide.
    It holds membership in the prestigious Association of American Universities, one of only 62 institutions with this distinction.
    Faculty-researchers generate more than $820 million in research expenditures.
    Its endowment, valued at more than $5 billion, ranks fourth among U.S. public universities and 10th overall.

     
  • richardmitnick 12:20 pm on December 22, 2017 Permalink | Reply
    Tags: , , , , LSST-Large Synoptic Survey Telescope   

    From Astronomy: “The LSST and big data science” 

    Astronomy magazine

    Astronomy Magazine

    December 15, 2017
    Steve Murray

    A new kind of telescope will need a new kind of astronomer.

    Construction of the Large Synoptic Survey Telescope (LSST) in Chile is about halfway between first brick and first light. Its 3-ton camera, being built at SLAC with Department of Energy support, will be the largest digital instrument ever built for ground-based astronomy and will take pictures fast enough to capture the entire southern sky every three nights.

    LSST


    LSST Camera, built at SLAC



    LSST telescope, currently under construction on the El Peñón peak at Cerro Pachón Chile, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    According to a TED talk by Andy Connolly, Professor of Astronomy at the University of Washington and Team Lead for LSST Simulations, the Hubble Space Telescope would need 120 years to image an equivalent area of sky.

    Imaging at this rate will generate about 15 terabytes (15 trillion bytes) of raw data per night, and about 30 petabytes over the survey’s 10-year life. (A petabyte is approximately the amount of data in 200,000 movie-length DVDs.) Even after processing, that’s still a 15 PB (15,000 TB) store.
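
    Those numbers are easy to sanity-check with a back-of-envelope sketch in Python. The figure of roughly 2,000 usable observing nights is an assumption (weather, maintenance, and scheduling set the real number); the 15 TB per night comes from the article:

    ```python
    # Back-of-envelope check of the survey's raw data volume.
    tb_per_night = 15          # raw data per clear night (from the article)
    observing_nights = 2000    # assumed usable nights over 10 years
    raw_total_pb = tb_per_night * observing_nights / 1000  # TB -> PB
    print(f"Raw data over the survey: ~{raw_total_pb:.0f} PB")  # ~30 PB
    ```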

    Such huge datasets will give astronomers a ten-year time-lapse “movie” of the southern sky, yielding new subject matter for time-domain studies and a deeper understanding of the dynamic behavior of the universe. It will also change the way science is done – astronomer-and-telescope is giving way to astronomer-and-data as an engine of new knowledge.

    Preparing the information

    The LSST’s biggest strength may be its ability to capture transients – rare or changing events usually missed in narrow-field searches and static images. The good news is that software will alert astronomers almost immediately when a transient is detected to enable fast follow-up observations by other instruments. The not-so-good news is that up to 10 million such events are possible each night. With detection rates like these, good data handling is essential.

    An innovative method developed by the LSST Data Management team will allow the storage of large volumes of data for rapid access. LSST Project/NSF/AURA.

    The LSST Data Management Team is designing user tools that can operate on a variety of computing systems without the need for large downloads, all based on open-source software. Their system includes two basic types of products: those produced for nightly observing and those produced for annual science releases.

    Nightly processing will subtract two exposures of each image field to quickly highlight changes. The data stream from the camera will be pipeline-processed and continuously updated in real time, with a transient alert triggered within 60 seconds of completing an image readout.
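
    As a toy illustration of that subtract-and-alert step (simulated data and a simple threshold; the real pipeline does far more, including PSF matching and artifact rejection):

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Two simulated exposures of the same field; the second hides a new transient.
    template = rng.poisson(100.0, size=(256, 256)).astype(float)
    science = rng.poisson(100.0, size=(256, 256)).astype(float)
    science[128, 64] += 500.0            # inject a bright transient

    diff = science - template            # nightly difference image

    # Alert on pixels more than 5 sigma from the difference-image noise
    sigma = np.std(diff)
    alerts = np.argwhere(np.abs(diff) > 5.0 * sigma)
    print(f"{len(alerts)} candidate transient pixel(s):", alerts[:3])
    ```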

    Data compiled into scheduled science releases will undergo considerable reprocessing to ensure that all contents are consistent, that false detections are filtered out, and that faint signal sources are confirmed. Reprocessing will also classify objects using both standard categories (position, movement, brightness, etc.) and dimensions derived mathematically from the data themselves. Products will be reprocessed at intervals from nightly to annually, which means their quality will improve as additional observations accumulate.

    Preparing the science

    The LSST program includes Science Collaborations, teams of scientists and technical experts who work to grow the observatory’s science agendas. There are currently eight collaborations in areas such as galaxies, dark energy and active galactic nuclei. A distinctive one, however, is the Informatics and Statistics Science Collaboration (ISSC), which, unlike the other teams, doesn’t focus on a specific astronomy topic but cuts across them all. New methods will be needed to handle heavy computational loads, to optimize data representations, and to guide astronomers through the discovery process. The ISSC focuses on such new approaches to ensure that astronomers realize the best return from the anticipated flood of new data.

    “Data analysis is changing because of the volume of data we’re facing,” says Kirk Borne, an astrophysicist and data scientist with Booz Allen Hamilton, and a core member of the ISSC. “Traditional data analysis is more about fitting a physical model to observed data. When I was growing up, we didn’t have sample sizes like this. We were trying to understand a particular phenomenon with our small sample sets. Now, it’s more unsupervised. Instead of asking ‘tell me about my model,’ you ask ‘tell me what you know.’ Data become the model, which means that more is different.”

    LSST data will almost certainly expand the chances for surprise. “When we start adding different measurement domains like gravitational wave physics and neutrino astrophysics for exploration,” adds Borne, “we start seeing these interesting new associations. Ultraluminous infrared galaxies are connected with colliding starbursting galaxies, for example, but it was a discovery made by combining optical radiation with infrared. Quasars were discovered when people compared bright radio observations of galaxies with optical images of galaxies.”

    A depiction of the observatory interior. LSST Project/NSF/AURA.

    Preparing the people

    The LSST Data Management Team is starting to orient the astronomy community to what’s coming with a series of conferences and workshops. “We try to cover as many meetings as we can, giving talks and hosting hack sessions,” says William O’Mullane, the team Project Manager.

    Science notebooks, which allow users to collaborate, analyze data and publish their results online, will be an integral tool for LSST research communities and one that’s being introduced early. “We rolled out Jupyterlab [an upgraded type of science notebook] at a recent workshop,” he adds, “which is a much faster way to get people working with the stack [the image manipulation code set].”
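
    To give a flavor of that notebook-style workflow, here is a minimal session using generic community tools (numpy and astropy); this is an illustration only, not the actual LSST stack:

    ```python
    import numpy as np
    from astropy.io import fits

    # Write a toy FITS exposure, then reopen and summarize it, as one
    # might do interactively in a science notebook.
    rng = np.random.default_rng(1)
    fits.PrimaryHDU(rng.poisson(50, (512, 512))).writeto(
        "toy_exposure.fits", overwrite=True)

    with fits.open("toy_exposure.fits") as hdul:
        image = hdul[0].data.astype(float)
        print(f"mean={image.mean():.1f}  std={image.std():.1f}")
    ```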

    The next generation of big data astronomers is also being groomed through graduate curricula and a special fellowship program. “Getting students involved early is a very good thing, both for the field and for them,” says Mario Juric, Associate Professor of Astronomy at the University of Washington, and the LSST Data Management System Science Team Coordinator. “Students need to understand early on what it’s like to do large-scale experiments, to design equipment and software, and to collaborate with very large teams. Astronomy today is entering the age of big data just like particle physics did 20 or 30 years ago.

    “We also have a Data Science Fellowship Program,” adds Juric, “a cooperative effort a few of us initiated in 2015 to educate the next generation of astronomer data scientists through a two-year series of workshops.” The program is funded by the LSST Corporation, a non-profit organization dedicated to enabling science with the telescope, and student interest has been intense. Only about a dozen people were admitted from among 200 applicants in a recent selection cycle.

    Telescope data are being packaged for a wide audience, too. The LSST Education and Public Outreach (EPO) program is working to involve classrooms, citizen scientists and the general public as deeply in big data astronomy as they want (or dare) to go. Primary EPO goals are to help educators integrate real LSST data into classrooms and introductory astronomy courses, and to help non-specialists access LSST data in ways similar to those of professional astronomers. Working through platforms like Zooniverse, almost anyone will be able to conduct serious research projects. “Citizen volunteers should be thought of as members of the science collaboration,” says Amanda Bauer, Head of LSST EPO.

    The future IS the data

    The LSST will cement an age where software is as critical to astronomy as the telescope. “When I was in graduate school,” says Juric, “I worked on the Sloan Digital Sky Survey (SDSS) and I didn’t touch a telescope; I did all my research out of a database.

    SDSS Telescope at Apache Point Observatory, NM, USA, Altitude 2,788 meters (9,147 ft)

    I know many students who have done the same. So we’re already seeing that kind of migration.”

    O’Mullane would agree. “Large surveys like SDSS, Gaia and now LSST provide enough data for a different approach,” he says. “Astronomers are not always reaching for a telescope. In fact, missions like LSST basically only offer you the archive; you can’t even request the observatory to make a specific observation.”

    ESA/GAIA satellite

    Observatory construction on the El Peñon summit, Chile as of November 2017. LSST Project/NSF/AURA.

    Given the enormous information streams that LSST will deliver, it soon won’t be possible for scientists to directly look at even a representative fraction of available data. Instead, they’ll increasingly rely on skillful manipulation of algorithms to examine relationships within the entirety of a dataset. The best insights will be obtained by those who ask the best questions of all those numbers.
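
    One concrete version of that algorithmic approach is unsupervised clustering, in the spirit of Borne’s “tell me what you know.” A toy sketch on an invented two-feature catalog (both the features and the populations are made up for illustration):

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)

    # Invented catalog: two populations described by (color, variability)
    quiet = rng.normal([0.8, 0.1], 0.05, size=(500, 2))
    variables = rng.normal([0.2, 0.9], 0.05, size=(50, 2))
    catalog = np.vstack([quiet, variables])

    # Let the data speak: group objects without assuming a physical model
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(catalog)
    print(np.bincount(labels))  # the two populations emerge unsupervised
    ```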

    And, because more people will have ready access to those data, the biggest discoveries may come not only from the professionals, but from dedicated amateurs working at home on their laptops.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 8:04 am on November 27, 2017 Permalink | Reply
    Tags: , , , , , , LSST-Large Synoptic Survey Telescope, , Simulating the universe using Einstein’s theory of gravity may solve cosmic puzzles   

    From ScienceNews: “Simulating the universe using Einstein’s theory of gravity may solve cosmic puzzles” 

    ScienceNews bloc

    ScienceNews

    November 25, 2017
    Emily Conover

    Until recently, simulations of the universe haven’t given its lumps their due.

    UNEVEN TERRAIN Universe simulations that consider general relativity (one shown) may shift knowledge of the cosmos. James Mertens

    If the universe were a soup, it would be more of a chunky minestrone than a silky-smooth tomato bisque.

    Sprinkled with matter that clumps together due to the insatiable pull of gravity, the universe is a network of dense galaxy clusters and filaments — the hearty beans and vegetables of the cosmic stew. Meanwhile, relatively desolate pockets of the cosmos, known as voids, make up a thin, watery broth in between.

    Until recently, simulations of the cosmos’s history haven’t given the lumps their due. The physics of those lumps is described by general relativity, Albert Einstein’s theory of gravity. But that theory’s equations are devilishly complicated to solve. To simulate how the universe’s clumps grow and change, scientists have fallen back on approximations, such as the simpler but less accurate theory of gravity devised by Isaac Newton.

    Relying on such approximations, some physicists suggest, could be mucking with measurements, resulting in a not-quite-right inventory of the cosmos’s contents. A rogue band of physicists suggests that a proper accounting of the universe’s clumps could explain one of the deepest mysteries in physics: Why is the universe expanding at an increasingly rapid rate?

    The accepted explanation for that accelerating expansion is an invisible pressure called dark energy. In the standard theory of the universe, dark energy makes up about 70 percent of the universe’s “stuff” — its matter and energy. Yet scientists still aren’t sure what dark energy is, and finding its source is one of the most vexing problems of cosmology.

    Perhaps, the dark energy doubters suggest, the speeding up of the expansion has nothing to do with dark energy. Instead, the universe’s clumpiness may be mimicking the presence of such an ethereal phenomenon.

    Most physicists, however, feel that proper accounting for the clumps won’t have such a drastic impact. Robert Wald of the University of Chicago, an expert in general relativity, says that lumpiness is “never going to contribute anything that looks like dark energy.” So far, observations of the universe have been remarkably consistent with predictions based on simulations that rely on approximations.

    _____________________________________________________________________________

    Growing a lumpy universe

    The universe has gradually grown lumpier throughout its history. During inflation, rapid expansion magnified tiny quantum fluctuations into minute density variations. Over time, additional matter glommed on to dense spots due to the stronger gravitational pull from the extra mass. After 380,000 years, those blips were imprinted as hot and cold spots in the cosmic microwave background, the oldest light in the universe. Lumps continued growing for billions of years, forming stars, planets, galaxies and galaxy clusters.


    _____________________________________________________________________________

    As observations become more detailed, though, even slight inaccuracies in simulations could become troublesome. Already, astronomers are charting wide swaths of the sky in great detail, and planning more extensive surveys. To translate telescope images of starry skies into estimates of properties such as the amount of matter in the universe, scientists need accurate simulations of the cosmos’s history. If the detailed physics of clumps is important, then simulations could go slightly astray, sending estimates off-kilter. Some scientists already suggest that the lumpiness is behind a puzzling mismatch of two estimates of how fast the universe is expanding.

    Researchers are attempting to clear up the debate by conquering the complexities of general relativity and simulating the cosmos in its full, lumpy glory. “That is really the new frontier,” says cosmologist Sabino Matarrese of the University of Padua in Italy, “something that until a few years ago was considered to be science fiction.” In the past, he says, scientists didn’t have the tools to complete such simulations. Now researchers are sorting out the implications of the first published results of the new simulations. So far, dark energy hasn’t been explained away, but some simulations suggest that certain especially sensitive measurements of how light is bent by matter in the universe might be off by as much as 10 percent.

    Soon, simulations may finally answer the question: How much do lumps matter? The idea that cosmologists might have been missing a simple answer to a central problem of cosmology incessantly nags some skeptics. For them, results of the improved simulations can’t come soon enough. “It haunts me. I can’t let it go,” says cosmologist Rocky Kolb of the University of Chicago.

    Smooth universe

    By observing light from different eras in the history of the cosmos, cosmologists can compute the properties of the universe, such as its age and expansion rate. But to do this, researchers need a model, or framework, that describes the universe’s contents and how those ingredients evolve over time. Using this framework, cosmologists can perform computer simulations of the universe to make predictions that can be compared with actual observations.

    COSMIC WEB Clumps and filaments of matter thread through a simulated universe 2 billion light years across. This simulation incorporates some aspects of Einstein’s theory of general relativity, allowing for detailed results while avoiding the difficulties of the full-fledged theory.

    After Einstein introduced his theory in 1915, physicists set about figuring out how to use it to explain the universe. It wasn’t easy, thanks to general relativity’s unwieldy, difficult-to-solve suite of equations. Meanwhile, observations made in the 1920s indicated that the universe wasn’t static as previously expected; it was expanding. Eventually, researchers converged on a solution to Einstein’s equations known as the Friedmann-Lemaître-Robertson-Walker metric. Named after its discoverers, the FLRW metric describes a simplified universe that is homogeneous and isotropic, meaning that it appears identical at every point in the universe and in every direction. In this idealized cosmos, matter would be evenly distributed, no clumps. Such a smooth universe would expand or contract over time.
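
    For readers who want the equation, the FLRW metric takes the standard textbook form

    $$ds^2 = -c^2\,dt^2 + a(t)^2\left[\frac{dr^2}{1 - kr^2} + r^2\left(d\theta^2 + \sin^2\theta\,d\phi^2\right)\right],$$

    where a(t) is the scale factor describing expansion or contraction and k = −1, 0, +1 sets the spatial curvature.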

    A smooth-universe approximation is sensible, because when we look at the big picture, averaging over the structures of galaxy clusters and voids, the universe is remarkably uniform. It’s similar to the way that a single spoonful of minestrone soup might be mostly broth or mostly beans, but from bowl to bowl, the overall bean-to-broth ratios match.

    In 1998, cosmologists revealed that not only was the universe expanding, but its expansion was also accelerating (SN: 2/2/08, p. 74). Observations of distant exploding stars, or supernovas, indicated that the space between us and them was expanding at an increasing clip. But gravity should slow the expansion of a universe evenly filled with matter. To account for the observed acceleration, scientists needed another ingredient, one that would speed up the expansion. So they added dark energy to their smooth-universe framework.
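
    In equations, the acceleration of the scale factor obeys

    $$\frac{\ddot a}{a} = -\frac{4\pi G}{3}\left(\rho + \frac{3p}{c^2}\right) + \frac{\Lambda c^2}{3}.$$

    Ordinary matter and radiation (ρ > 0, p ≥ 0) make the right-hand side negative, so expansion slows; a positive cosmological constant Λ, the simplest form of dark energy, can make it positive and drive acceleration.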

    Now, many cosmologists follow a basic recipe to simulate the universe — treating the cosmos as if it has been run through an imaginary blender to smooth out its lumps, adding dark energy and calculating the expansion via general relativity. On top of the expanding slurry, scientists add clumps and track their growth using approximations, such as Newtonian gravity, which simplifies the calculations.
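
    As a rough illustration of the smooth-expansion part of that recipe, here is a minimal sketch that integrates the Friedmann equation for the scale factor, with and without dark energy (textbook cosmology with round parameter values, not anyone’s production code):

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    H0 = 70.0 / 3.086e19     # Hubble constant (70 km/s/Mpc) in 1/s
    Gyr = 3.156e16           # one billion years, in seconds

    def grow(omega_m, omega_l):
        """Integrate da/dt = a H0 sqrt(Om/a^3 + OL) from a tiny initial a."""
        rhs = lambda t, a: a * H0 * np.sqrt(omega_m / a**3 + omega_l)
        return solve_ivp(rhs, [0.0, 25.0 * Gyr], [1e-3], dense_output=True)

    lcdm = grow(0.3, 0.7)    # smooth universe with dark energy
    matter = grow(1.0, 0.0)  # matter only: expansion can only decelerate

    t_now = 13.8 * Gyr
    print(f"a(today), with dark energy: {lcdm.sol(t_now)[0]:.2f}")
    print(f"a(today), matter only:      {matter.sol(t_now)[0]:.2f}")
    ```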

    In most situations, Newtonian gravity and general relativity are near-twins. Throw a ball while standing on the surface of the Earth, and it doesn’t matter whether you use general relativity or Newtonian mechanics to calculate where the ball will land — you’ll get the same answer. But there are subtle differences. In Newtonian gravity, matter directly attracts other matter. In general relativity, gravity is the result of matter and energy warping spacetime, creating curves that alter the motion of objects (SN: 10/17/15, p. 16). The two theories diverge in extreme gravitational environments. In general relativity, for example, hulking black holes produce inescapable pits that reel in light and matter (SN: 5/31/14, p. 16). The question, then, is whether the difference between the two theories has any impact in lumpy-universe simulations.

    Most cosmologists are comfortable with the status quo simulations because observations of the heavens seem to fit neatly together like interlocking jigsaw puzzle pieces. Predictions based on the standard framework agree remarkably well with observations of the cosmic microwave background — ancient light released when the universe was just 380,000 years old (SN: 3/21/15, p. 7). And measurements of cosmological parameters — the fraction of dark energy and matter, for example — are generally consistent, whether they are made using the light from galaxies or the cosmic microwave background [CMB].

    CMB per ESA/Planck


    ESA/Planck

    An image from the Two-Micron All Sky Survey of 1.6 million galaxies in infrared light reveals how matter clumps into galaxy clusters and filaments. Future large-scale surveys may require improved simulations that use general relativity to track the evolution of lumps over time. T.H. Jarrett, J. Carpenter & R. Hurt, obtained as part of 2MASS, a joint project of Univ. of Massachusetts and the Infrared Processing and Analysis Center/Caltech, funded by NASA and NSF.


    Caltech 2MASS Telescopes, a joint project of the University of Massachusetts and the Infrared Processing and Analysis Center (IPAC) at Caltech, at the Whipple Observatory on Mt. Hopkins south of Tucson, AZ, and at the Cerro Tololo Inter-American Observatory near La Serena, Chile.

    Dethroning dark energy

    Some cosmologists hope to explain the universe’s accelerating expansion by fully accounting for the universe’s lumpiness, with no need for the mysterious dark energy.

    These researchers argue that clumps of matter can alter how the universe expands, when the clumps’ influence is tallied up over wide swaths of the cosmos. That’s because, in general relativity, the expansion of each local region of space depends on how much matter is within. Voids expand faster than average; dense regions expand more slowly. Because the universe is mostly made up of voids, this effect could produce an overall expansion and potentially an acceleration. Known as backreaction, this idea has lingered in obscure corners of physics departments for decades, despite many claims that backreaction’s effect is small or nonexistent.
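
    Buchert’s averaging formalism makes this precise: the volume-averaged scale factor a_D of a region D obeys

    $$3\,\frac{\ddot a_D}{a_D} = -4\pi G\,\langle\rho\rangle_D + Q_D, \qquad Q_D = \frac{2}{3}\left(\langle\theta^2\rangle_D - \langle\theta\rangle_D^2\right) - 2\,\langle\sigma^2\rangle_D,$$

    where θ is the local expansion rate and σ the shear. A large enough positive backreaction term Q_D, driven by the variance in expansion between voids and clumps, would mimic acceleration.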

    Backreaction continues to appeal to some researchers because they don’t have to invent new laws of physics to explain the acceleration of the universe. “If there is an alternative which is based only upon traditional physics, why throw that away completely?” Matarrese asks.

    Most cosmologists, however, think explaining away dark energy just based on the universe’s lumps is unlikely. Previous calculations have indicated any effect would be too small to account for dark energy, and would produce an acceleration that changes in time in a way that disagrees with observations.

    “My personal view is that it’s a much smaller effect,” says astrophysicist Hayley Macpherson of Monash University in Melbourne, Australia. “That’s just basically a gut feeling.” Theories that include dark energy explain the universe extremely well, she points out. How could that be if the whole approach is flawed?

    New simulations by Macpherson and others that model how lumps evolve in general relativity may be able to gauge the importance of backreaction once and for all. “Up until now, it’s just been too hard,” says cosmologist Tom Giblin of Kenyon College in Gambier, Ohio.

    To perform the simulations, researchers needed to get their hands on supercomputers capable of grinding through the equations of general relativity as the simulated universe evolves over time. Because general relativity is so complex, such simulations are much more challenging than those that use approximations, such as Newtonian gravity. But, a seemingly distinct topic helped lay some of the groundwork: gravitational waves, or ripples in the fabric of spacetime.

    SPECKLED SPACETIME A lumpy universe, recently simulated using general relativity, shows clumps of matter (pink and yellow) that beget stars and galaxies. H. Macpherson, Paul Lasky, Daniel Price.

    The Advanced Laser Interferometer Gravitational-Wave Observatory, LIGO, searches for the tremors of cosmic dustups such as colliding black holes (SN: 10/28/17, p. 8).


    VIRGO Gravitational Wave interferometer, near Pisa, Italy

    Caltech/MIT Advanced aLigo Hanford, WA, USA installation


    Caltech/MIT Advanced aLigo detector installation Livingston, LA, USA

    Cornell SXS, the Simulating eXtreme Spacetimes (SXS) project

    Gravitational waves. Credit: MPI for Gravitational Physics/W.Benger-Zib

    ESA/eLISA the future of gravitational wave research

    Skymap showing how adding Virgo to LIGO helps in reducing the size of the source-likely region in the sky. (Credit: Giuseppe Greco, Virgo Urbino group)

    In preparation for this search, physicists honed their general relativity skills on simulations of the spacetime storm kicked up by black holes, predicting what LIGO might see and building up the computational machinery to solve the equations of general relativity. Now, cosmologists have adapted those techniques and unleashed them on entire, lumpy universes.

    The first lumpy universe simulations to use full general relativity were unveiled in the June 2016 Physical Review Letters. Giblin and colleagues reported their results simultaneously with Eloisa Bentivegna of the University of Catania in Italy and Marco Bruni of the University of Portsmouth in England.

    So far, the simulations have not been able to account for the universe’s acceleration. “Nearly everybody is convinced [the effect] is too small to explain away the need for dark energy,” says cosmologist Martin Kunz of the University of Geneva. Kunz and colleagues reached the same conclusion in their lumpy-universe simulations, which have one foot in general relativity and one in Newtonian gravity. They reported their first results in Nature Physics in March 2016.

    Backreaction aficionados still aren’t dissuaded. “Before saying the effect is too small to be relevant, I would, frankly, wait a little bit more,” Matarrese says. And the new simulations have potential caveats. For example, some simulated universes behave like an old arcade game — if you walk to one edge of the universe, you cross back over to the other side, like Pac-Man exiting the right side of the screen and reappearing on the left. That geometry would suppress the effects of backreaction in the simulation, says Thomas Buchert of the University of Lyon in France. “This is a good beginning,” he says, but there is more work to do on the simulations. “We are in infancy.”
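
    That Pac-Man geometry is the familiar periodic boundary condition of grid-based simulations; a toy sketch:

    ```python
    import numpy as np

    box = 100.0                     # box size, arbitrary units
    pos = np.array([98.0, 50.0])    # particle near the right edge
    vel = np.array([5.0, 0.0])

    pos = (pos + vel) % box         # modulo wraps coordinates around the box
    print(pos)                      # [3. 50.] -- re-entered on the left side
    ```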

    Different assumptions in a simulation can lead to disparate results, Bentivegna says. As a result, she doesn’t think that her lumpy, general-relativistic simulations have fully closed the door on efforts to dethrone dark energy. For example, tricks of light might be making it seem like the universe’s expansion is accelerating, when in fact it isn’t.

    When astronomers observe far-away sources like supernovas, the light has to travel past all of the lumps of matter between the source and Earth. That journey could make it look like there’s an acceleration when none exists. “It’s an optical illusion,” Bentivegna says. She and colleagues see such an effect in a simulation reported in March in the Journal of Cosmology and Astroparticle Physics. But, she notes, this work simulated an unusual universe, in which matter sits on a grid — not a particularly realistic scenario.

    For most other simulations, the effect of optical illusions remains small. That leaves many cosmologists, including Giblin, even more skeptical of the possibility of explaining away dark energy: “I feel a little like a downer,” he admits.

    Lumps (gray) within this simulated universe change the path light takes (yellow lines), potentially affecting observations. Matter bends space, slightly altering the light’s trajectory from that in a smooth universe. James Mertens.

    Surveying the skies

    Subtle effects of lumps could still be important. In Hans Christian Andersen’s The Princess and the Pea, the princess felt a tiny pea beneath an impossibly tall stack of mattresses. Likewise, cosmologists’ surveys are now so sensitive that even if the universe’s lumps have a small impact, estimates could be thrown out of whack.

    The Dark Energy Survey, for example, has charted 26 million galaxies using the Victor M. Blanco Telescope in Chile, measuring how the light from those galaxies is distorted by the intervening matter on the journey to Earth.

    Dark Energy Survey


    Dark Energy Camera [DECam], built at FNAL


    NOAO/CTIO Victor M. Blanco 4-meter Telescope, which houses DECam, at Cerro Tololo, Chile, at an altitude of 7,200 feet.

    In a set of papers posted online August 4 at arXiv.org, scientists with the Dark Energy Survey reported new measurements of the universe’s properties, including the amount of matter (both dark and normal) and how clumpy that matter is (SN: 9/2/17, p. 32). The results are consistent with those from the cosmic microwave background [CMB] — light emitted billions of years earlier.

    To make the comparison, cosmologists took the measurements from the cosmic microwave background, early in the universe, and used simulations to extrapolate to what galaxies should look like later in the universe’s history. It’s like taking a baby’s photograph, precisely computing the number and size of wrinkles that should emerge as the child ages and finding that your picture agrees with a snapshot taken decades later. The matching results so far confirm cosmologists’ standard picture of the universe — dark energy and all.

    “So far, it has not yet been important for the measurements that we’ve made to actually include general relativity in those simulations,” says Risa Wechsler, a cosmologist at Stanford University and a founding member of the Dark Energy Survey. But, she says, for future measurements, “these effects could become more important.” Cosmologists are edging closer to Princess and the Pea territory.

    Those future surveys include the Dark Energy Spectroscopic Instrument, DESI, set to kick off in 2019 at Kitt Peak National Observatory near Tucson; the European Space Agency’s Euclid satellite, launching in 2021; and the Large Synoptic Survey Telescope in Chile, which is set to begin collecting data in 2023.

    LBNL/DESI spectroscopic instrument on the Mayall 4-meter telescope at Kitt Peak National Observatory, Altitude 2,120 m (6,960 ft)

    NOAO/Mayall 4 m telescope at Kitt Peak, Arizona, USA, Altitude 2,120 m (6,960 ft)

    ESA/Euclid spacecraft

    LSST


    LSST Camera, built at SLAC



    LSST telescope, currently under construction at Cerro Pachón Chile, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    If cosmologists keep relying on simulations that don’t use general relativity to account for lumps, certain kinds of measurements of weak lensing — the bending of light due to matter acting like a lens — could be off by up to 10 percent, Giblin and colleagues reported at arXiv.org in July. “There is something that we’ve been ignoring by making approximations,” he says.

    That 10 percent could screw up all kinds of estimates, from how dark energy changes over the universe’s history to how fast the universe is currently expanding, to the calculations of the masses of ethereal particles known as neutrinos. “You have to be extremely certain that you don’t get some subtle effect that gets you the wrong answers,” Geneva’s Kunz says, “otherwise the particle physicists are going to be very angry with the cosmologists.”

    Some estimates may already be showing problem signs, such as the conflicting estimates of the cosmic expansion rate (SN: 8/6/16, p. 10). Using the cosmic microwave background, cosmologists find a slower expansion rate than they do from measurements of supernovas. If this discrepancy is real, it could indicate that dark energy changes over time. But before jumping to that conclusion, there are other possible causes to rule out, including the universe’s lumps.

    Until the issue of lumps is smoothed out, scientists won’t know how much lumpiness matters to the cosmos at large. “I think it’s rather likely that it will turn out to be an important effect,” Kolb says. Whether it explains away dark energy is less certain. “I want to know the answer so I can get on with my life.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 9:25 am on November 22, 2017 Permalink | Reply
    Tags: , , , , LSST-Large Synoptic Survey Telescope, Preparing to Light Up the LSST Network   

    From LSST: “Preparing to Light Up the LSST Network” 

    LSST

    Large Synoptic Survey Telescope

    November 16, 2017
    No writer credit found

    November 12, 2017 – LSST’s fiber-optic network, which will provide the 100 Gbps connectivity needed to move data from the summit of Cerro Pachón to all LSST operational sites and to multiple data centers, came one milestone closer to activation last week: the AURA LSST Dense Wavelength Division Multiplexing (DWDM) Network Equipment that LSST will use initially was installed in several key locations. DWDM equipment sends pulses of light down the fiber to transmit data, so a DWDM box is needed at each end of a fiber link for the network to be operational. In this installation project, the Summit-Base Network DWDM equipment was set up in the La Serena computer room and in the communications hut on the summit of Cerro Pachón. The Santiago portion of the Base-Archive Network was also addressed, with DWDM hardware installed in La Serena as well as at the National University Network (REUNA) facility in Santiago. The DWDM hardware in Santiago will be connected to AmLight DWDM equipment, which will transfer the data to Florida. There, it will be picked up by Florida LambdaRail (FLR), ESnet, and Internet2 for its journey to NCSA via Chicago.

    The primary South to North network traffic will be the transfer of raw image data from Cerro Pachón to the National Center for Supercomputing Applications (NCSA), where the data will be processed into scientific data products, including transient alerts, calibrated images, and catalogs. From there, a backup of the raw data will be made over the international network to IN2P3 in Lyon, France. IN2P3 will also perform half of the annual catalog processing. The network will also transfer data from North to South, returning the processed scientific data products to the Chilean Data Access Center (DAC), where they will be made available to the Chilean scientific community.

    The LSST Summit-Base and Base-Archive networks are on new fibers all the way to Santiago; there is also an existing fiber that provides a backup path from La Serena to Santiago. From Santiago to Florida, the data will travel on a new submarine fiber cable, with a backup on existing fiber cables. LSST currently shares the AURA fiber-optic network (connecting La Serena and the Summit) with the Gemini and CTIO telescopes, but will have its own dedicated DWDM equipment in 2018. Additional information on LSST data flow during LSST Operations is available here.
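
    For a sense of scale, here is a rough transfer-time estimate for one night of raw images over the new link (assuming ~15 TB per night and the full 100 Gbps available end to end; protocol overhead and shared traffic would stretch this in practice):

    ```python
    tb_per_night = 15                   # assumed raw data per night, terabytes
    gigabits = tb_per_night * 1000 * 8  # terabytes -> gigabits
    seconds = gigabits / 100            # at 100 Gbps
    print(f"~{seconds / 60:.0f} minutes to move one night of raw data")  # ~20 min
    ```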

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    LSST telescope, currently under construction at Cerro Pachón Chile, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.
    LSST Interior

    LSST/Camera, built at SLAC

    The LSST is a new kind of telescope. Currently under construction in Chile, it is being built to rapidly survey the night-time sky. Compact and nimble, the LSST will move quickly between images, yet its large mirror and large field of view—almost 10 square degrees of sky, or 40 times the size of the full moon—work together to deliver more light from faint astronomical objects than any optical telescope in the world.

    From its mountaintop site in the foothills of the Andes, the LSST will take more than 800 panoramic images each night with its 3.2 billion-pixel camera, recording the entire visible sky twice each week. Each patch of sky it images will be visited 1000 times during the survey. With a light-gathering power equal to a 6.7-m diameter primary mirror, each of its 30-second observations will be able to detect objects 10 million times fainter than visible with the human eye. A powerful data system will compare new with previous images to detect changes in brightness and position of objects as big as far-distant galaxy clusters and as small as near-by asteroids.
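
    That “10 million times fainter” figure corresponds, via the standard magnitude relation, to

    $$\Delta m = 2.5\,\log_{10}\!\left(10^{7}\right) = 17.5,$$

    so, taking the conventional naked-eye limit of about magnitude 6, a single 30-second visit reaches roughly magnitude 23.5.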

    The LSST’s combination of telescope, mirror, camera, data processing, and survey will capture changes in billions of faint objects, and the data it provides will be used to create an animated, three-dimensional cosmic map with unprecedented depth and detail, giving us an entirely new way to look at the Universe. This map will serve a myriad of purposes, from locating that mysterious substance called dark matter and characterizing the properties of the even more mysterious dark energy, to tracking transient objects, to studying our own Milky Way Galaxy in depth. It will even be used to detect and track potentially hazardous asteroids—asteroids that might impact the Earth and cause significant damage.

    As with past technological advances that opened new windows of discovery, such a powerful system for exploring the faint and transient Universe will undoubtedly serve up surprises.

    Plans for sharing the data from LSST with the public are as ambitious as the telescope itself. Anyone with a computer will be able to view the moving map of the Universe created by the LSST, including objects a hundred million times fainter than can be observed with the unaided eye. The LSST project will provide analysis tools to enable both students and the public to participate in the process of scientific discovery. We invite you to learn more about LSST science.

    The LSST will be unique: no existing telescope or proposed camera could be retrofitted or re-designed to cover ten square degrees of sky with a collecting area of forty square meters. Named the highest priority for ground-based astronomy in the 2010 Decadal Survey, the LSST project formally began construction in July 2014.

     