Tagged: Large Synoptic Survey Telescope (LSST)

  • richardmitnick 3:20 pm on September 5, 2014
    Tags: Large Synoptic Survey Telescope (LSST)

    From Quanta via FNAL: “A Digital Copy of the Universe, Encrypted” 

    Quanta Magazine

    October 2, 2013
    Natalie Wolchover

    Even as he installed the landmark camera that would capture the first convincing evidence of dark energy in the 1990s, Tony Tyson, an experimental cosmologist now at the University of California, Davis, knew it could be better. The camera’s power lay in its ability to collect more data than any other. But digital image sensors and computer processors were progressing so rapidly that the amount of data they could collect and store would soon be limited only by the size of the telescopes delivering light to them, and those were growing too. Confident that engineering trends would hold, Tyson envisioned a telescope project on a truly grand scale, one that could survey hundreds of attributes of billions of cosmological objects as they changed over time.

    It would record, Tyson said, “a digital, color movie of the universe.”

    Tyson’s vision has come to life as the Large Synoptic Survey Telescope (LSST) project, a joint endeavor of more than 40 research institutions and national laboratories that has been ranked by the National Academy of Sciences as its top priority for the next ground-based astronomical facility. Set on a Chilean mountaintop, and slated for completion by the early 2020s, the 8.4-meter LSST will be equipped with a 3.2-billion-pixel digital camera that will scan 20 billion cosmological objects 800 times apiece over the course of a decade. That will generate well over 100 petabytes of data that anyone in the United States or Chile will be able to peruse at will. Displaying just one of the LSST’s full-sky images would require 1,500 high-definition TV screens.

    LSST Exterior, Camera, Interior

    The LSST epitomizes the new era of big data in physics and astronomy. Less than 20 years ago, Tyson’s cutting-edge digital camera filled 5 gigabytes of disk space per night with revelatory information about the cosmos. When the LSST begins its work, it will collect that amount every few seconds — literally more data than scientists know what to do with.

    Tony Tyson, an experimental cosmologist at the University of California, Davis, with a small test camera for the Large Synoptic Survey Telescope project, which he is helping to launch.
    Peter DaSilva for Quanta Magazine

    “The data volumes we [will get] out of LSST are so large that the limitation on our ability to do science isn’t the ability to collect the data, it’s the ability to understand the systematic uncertainties in the data,” said Andrew Connolly, an astronomer at the University of Washington.

    Typical of today’s costly scientific endeavors, hundreds of scientists from different fields are involved in designing and developing the LSST, with Tyson as chief scientist. “It’s sort of like a federation,” said Kirk Borne, an astrophysicist and data scientist at George Mason University. The group comprises nearly 700 astronomers, cosmologists, physicists, engineers and data scientists.

    Much of the scientists’ time and about one-half of the $1 billion cost of the project are being spent on developing software rather than hardware, reflecting the exponential growth of data since the astronomy projects of the 1990s. For the telescope to be useful, the scientists must answer a single question. As Borne put it: “How do you turn petabytes of data into scientific knowledge?”

    Physics has been grappling with huge databases longer than any other field of science because of its reliance on high-energy machines and enormous telescopes to probe beyond the known laws of nature. This has given researchers a steady succession of models upon which to structure and organize each next big project, in addition to providing a starter kit of computational tools that must be modified for use with ever larger and more complex data sets.

    Even backed by this tradition, the LSST tests the limits of scientists’ data-handling abilities. It will be capable of tracking the effects of dark energy, which is thought to make up a whopping 68 percent of the total contents of the universe, and mapping the distribution of dark matter, an invisible substance that accounts for an additional 27 percent. And the telescope will cast such a wide and deep net that scientists say it is bound to snag unforeseen objects and phenomena too. But many of the tools for disentangling them from the rest of the data don’t yet exist.

    New Dimensions

    Particle physics is the elder statesman of big data science. For decades, high-energy particle accelerators have been bashing particles together millions of times per second in hopes of generating exotic, never-before-seen particles. These facilities, such as the Large Hadron Collider (LHC) at the CERN laboratory in Switzerland, generate so much data that only a tiny fraction (deemed interesting by an automatic selection process) can be kept. A network of hundreds of thousands of computers spread across 36 countries, called the Worldwide LHC Computing Grid, stores and processes the 25 petabytes of LHC data archived from a year’s worth of collisions. The work of thousands of physicists went into finding the bump in that data that was identified last summer as the signature of a new subatomic particle, the Higgs boson.

    CERN, the organization that operates the LHC, is sharing its wisdom by working with other research organizations “so they can benefit from the knowledge and experience that has been gathered in data acquisition, processing and storage,” said Bob Jones, head of CERN openlab, which develops new IT technologies and techniques for the LHC. Scientists at the European Space Agency, the European Molecular Biology Laboratory, other physics facilities and even collaborations in the social sciences and humanities have taken cues from the LHC on data handling, Jones said.

    When the LHC turns back on in 2014 or 2015 after an upgrade, higher energies will mean more interesting collisions, and the amount of data collected will grow by a significant factor. But even though the LHC will continue to possess the biggest data set in physics, its data are much simpler than those obtained from astronomical surveys such as the Sloan Digital Sky Survey and the Dark Energy Survey and — to an even greater extent — those that will be obtained from future sky surveys such as the Square Kilometre Array, a radio telescope project set to begin construction in 2016, and the LSST.

    Sloan Digital Sky Survey Telescope

    Dark Energy Camera (DECam)

    Square Kilometre Array (SKA)

    “The LHC generates a lot more data right at the beginning, but they’re only looking for certain events in that data and there’s no correlation between events in that data,” said Jeff Kantor, the LSST data management project manager. “Over time, they still build up large sets, but each one can be individually analyzed.”

    In combining repeat exposures of the same cosmological objects and logging hundreds rather than a handful of attributes of each one, the LSST will have a whole new set of problems to solve. “It’s the complexity of the LSST data that’s a challenge,” Tyson said. “You’re swimming around in this 500-dimensional space.”

    From color to shape, roughly 500 attributes will be recorded for every one of the 20 billion objects surveyed, and each attribute is treated as a separate dimension in the database. Merely cataloguing these attributes consistently from one exposure of a patch of the sky to the next poses a huge challenge. “In one exposure, the scene might be clear enough that you could resolve two different galaxies in the same spot, but in another one, they might be blurred together,” Kantor said. “You have to figure out if it’s one galaxy or two or N.”

    Beyond N-Squared

    To tease scientific discoveries out of the vast trove of data gathered by the LSST and other sky surveys, scientists will need to pinpoint unexpected relationships between attributes, which is extremely difficult in 500 dimensions. Finding correlations is easy with a two-dimensional data set: If two attributes are correlated, then there will be a one-dimensional curve connecting the data points on a two-dimensional plot of one attribute versus the other. But additional attributes plotted as extra dimensions obscure such curves. “Finding the unexpected in a higher-dimensional space is impossible using the human brain,” Tyson said. “We have to design future computers that can in some sense think for themselves.”

    Algorithms exist for “reducing the dimensionality” of data, or finding surfaces on which the data points lie (like that 1-D curve in the 2-D plot), in order to find correlated dimensions and eliminate “nuisance” ones. For example, an algorithm might identify a 3-D surface of data points coursing through a database, indicating that three attributes, such as the type, size and rotation speed of galaxies, are related. But when swamped with petabytes of data, the algorithms take practically forever to run.
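    The kind of dimensionality reduction described above can be sketched with a toy example (illustrative only, not the LSST pipeline): a principal-component analysis shows when high-dimensional data points actually lie near a low-dimensional surface, exposing the correlated attributes and the nuisance ones.

```python
import numpy as np

# Toy data: 5 recorded "attributes" per object, but only 2 truly
# independent factors; each attribute is a noisy mix of those factors.
rng = np.random.default_rng(42)
n_objects = 10_000
latent = rng.normal(size=(n_objects, 2))     # hidden factors
mixing = rng.normal(size=(2, 5))             # attributes = mixes of factors
attrs = latent @ mixing + 0.05 * rng.normal(size=(n_objects, 5))

# PCA via SVD: squared singular values show how the variance
# splits across dimensions.
centered = attrs - attrs.mean(axis=0)
s = np.linalg.svd(centered, compute_uv=False)
explained = s**2 / np.sum(s**2)

# Nearly all variance lives in the first two components, so the
# 5-D cloud is really a (noisy) 2-D surface; the remaining three
# directions are "nuisance" dimensions that can be discarded.
print(np.round(explained, 4))
```

    Real survey catalogs are messier, of course; the hard part at LSST scale is running anything like this over petabytes and 500 dimensions rather than five.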

    Identifying correlated dimensions is exponentially more difficult than looking for a needle in a haystack. “That’s a linear problem,” said Alex Szalay, a professor of astronomy and computer science at Johns Hopkins University. “You search through the haystack and whatever looks like a needle you throw in one bucket and you throw everything else away.” When you don’t know what correlations you’re looking for, however, you must compare each of the N pieces of hay with every other piece, which takes N-squared operations.

    Adding to the challenge is the fact that the amount of data is doubling every year. “Imagine we are working with an algorithm that if my data doubles, I have to do four times as much computing and then the following year, I have to do 16 times as much computing,” Szalay said. “But by next year, my computers will only be twice as fast, and in two years from today, my computers will only be four times as fast, so I’m falling farther and farther behind in my ability to do this.”
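    Szalay’s arithmetic can be made concrete. A minimal sketch, assuming an all-pairs O(N²) algorithm, data volume doubling every year, and computers doubling in speed every year:

```python
# Hypothetical illustration of Szalay's scaling argument.
def relative_runtime(years: int) -> float:
    data = 2 ** years      # data volume doubles yearly
    work = data ** 2       # O(N^2) pairwise comparisons
    speed = 2 ** years     # hardware speed doubles yearly
    return work / speed    # runtime relative to year 0

# Work quadruples each year while speed only doubles, so the same
# analysis falls behind by a factor of 2 every year.
print([relative_runtime(y) for y in range(4)])  # [1.0, 2.0, 4.0, 8.0]
```

    This is why the field needs algorithms that scale closer to N log N than to N², not just faster computers.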

    A huge amount of research has gone into developing scalable algorithms, with techniques such as compressed sensing, topological analysis and the maximal information coefficient emerging as especially promising tools of big data science. But more work remains to be done before astronomers, cosmologists and physicists will be ready to fully exploit the multi-petabyte digital movie of the universe that premieres next decade. Progress is hampered by the fact that researchers in the physical sciences get scant academic credit for developing algorithms — a problem that the community widely recognizes but has yet to solve.

    “It’s always been the case that the people who build the instrumentation don’t get as much credit as the people who use the instruments to do the cutting-edge science,” Connolly said. “Ten years ago, it was people who built physical instruments — the cameras that observe the sky — and today, it’s the people who build the computational instruments who don’t get enough credit. There has to be a career path for someone who wants to work on the software — because they can go get jobs at Google. So if we lose these people, it’s the science that loses.”

    Coffee and Kudos

    In December 2010, in an effort to encourage the development of better algorithms, an international group of astronomers issued a challenge to computer geeks everywhere: What is the best way to measure gravitational lensing, or the distorting effect that dark matter has on the light from distant galaxies? David Kirkby read about the GREAT10 (GRavitational lEnsing Accuracy Testing 2010) Challenge on Wired.com and decided to give it a go.

    David Kirkby, a physicist at the University of California, Irvine, holds an observing plate designed to capture data for a specific circular patch of the sky. Peter DaSilva for Quanta Magazine

    Kirkby, a physicist at the University of California, Irvine, and his graduate student won the contest using a modified version of a neural network algorithm that he had previously developed for the BABAR experiment, a large physics collaboration investigating the asymmetry of matter and antimatter. The victory earned Kirkby a co-author credit on the recent paper detailing the contest, easing his switch from the field of particle physics to astrophysics. Also, with the prize money, “we bought a top-of-the-line espresso machine for the lab,” he said.

    GREAT10 was one of a growing number of “data challenges” designed to find solutions to specific problems faced in creating and analyzing large physics and astronomy databases, such as the best way to reconstruct the shapes of two galaxies that are aligned relative to Earth and so appear blended together.

    “One group produces a set of data — it could be blended galaxies — and then anybody can go out and try and estimate the shape of the galaxies using their best algorithm,” explained Connolly, who is involved in generating simulations of future LSST images that are used to test the performance of algorithms. “It’s quite a lot of kudos to the person who comes out on top.”

    Many of the data challenges, including the GREAT series, focus on teasing out the effects of dark matter. When light from a distant galaxy travels to Earth, it is bent, or “lensed,” by the gravity of the dark matter it passes through. “It’s a bit like looking at wallpaper through a bathroom window with a rough surface,” Kirkby said. “You determine what the wallpaper would look like if you were looking at it directly, and you use that information to figure out what the shape of the glass is.”
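    The statistical idea behind shear measurement can be sketched with hypothetical numbers (a toy estimator, not a GREAT10 entry): intrinsic galaxy shapes are random and average to zero, so a small coherent distortion from lensing survives as the mean of many observed ellipticities.

```python
import numpy as np

# Toy weak-lensing estimator. Each galaxy has a random intrinsic
# ellipticity ("shape noise") with zero mean; gravitational lensing
# adds a small coherent shear to all of them.
rng = np.random.default_rng(0)
true_shear = 0.03                    # the "shape of the glass"
n_galaxies = 200_000
intrinsic = rng.normal(0.0, 0.25, n_galaxies)  # per-galaxy shape noise
observed = intrinsic + true_shear              # lensed ellipticities

# Shape noise averages down as 1/sqrt(N), leaving the shear signal.
estimated_shear = observed.mean()
print(f"estimated shear = {estimated_shear:.3f}")
```

    Real challenges add the complications the article describes next, such as atmospheric blurring and detector artifacts, which bias this simple average and must be modeled out.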

    Each new data challenge in a series includes an extra complication — additional distortions caused by atmospheric turbulence or a faulty amplifier in one of the detectors, for example — moving the goal posts of the challenge closer and closer to reality.

    Data challenges are “a great way of crowd-sourcing problems in data science, but I think it would be good if software development was just recognized as part of your productivity as an academic,” Kirkby said. “At career reviews, you measure people based on their scientific contributions even though software packages could have a much broader impact.”

    The culture is slowly changing, the scientists said, as the ability to analyze data becomes an ever-tightening bottleneck in research. “In the past, it was usually some post-doc or grad student poring over data who would find something interesting or something that doesn’t seem to work and stumble across some new effect,” Tyson said. “But increasingly, the amount of data is so large that you have to have machines with algorithms to do this.”

    Dark Side of the Universe

    Assuming that physicists can solve the computing problems they face with the LSST, the results could be transformative. There are many reasons to want a 100-petabyte digital copy of the universe. For one, it would help map the expansion of space and time caused by the still-mysterious dark energy, discovered with the help of the LSST’s predecessor, the Big Throughput Camera, which Tyson and a collaborator built in 1996.

    When that camera, which could cover a patch of the sky the size of a full moon in a single exposure, was installed on the Blanco Telescope in Chile, astrophysicists immediately discovered dozens of exploding stars called Type Ia supernovae strewn across the sky that revealed that most stuff in the universe is unknown. Light from nearby supernovae appeared to have stretched more than it should have during its journey through the expanding cosmos compared with light from faraway ones. This suggested that the expansion of the universe had recently sped up, driven by dark energy.

    CTIO Victor M. Blanco 4-meter Telescope (exterior and interior)

    With the LSST, scientists hope to precisely track the accelerating expansion of the universe and thus to better define the nature of dark energy. They aim to do this by mapping a sort of cosmic yardstick called baryon acoustic oscillations. The yardstick was created from sound waves that rippled through the universe when it was young and hot and became imprinted in the distribution of galaxies as it cooled and expanded. The oscillations indicate the size of space at every distance away from Earth — and thus at any point back in time.

    Baryon acoustic oscillations are so enormous that a truly vast astronomical survey is needed to make them a convenient measuring tool. By cataloguing billions of galaxies, the LSST promises to measure the size of these resonances more accurately than any other existing or planned astronomical survey. “The idea is that with the LSST, we will have onion shells of galaxies at different distances and we can look for this pattern and trace the size of the resonant patterns as a function of time,” Szalay said. “This will be beautiful.”

    But, Szalay added, “it will be a nontrivial task to actually milk the information out of the data.”

    See the full article here.

    Formerly known as Simons Science News, Quanta Magazine is an editorially independent online publication launched by the Simons Foundation to enhance public understanding of science. Why Quanta? Albert Einstein called photons “quanta of light.” Our goal is to “illuminate science.” At Quanta Magazine, scientific accuracy is every bit as important as telling a good story. All of our articles are meticulously researched, reported, edited, copy-edited and fact-checked.

    ScienceSprings relies on technology from

    MAINGEAR computers

    Lenovo

    Dell

     
  • richardmitnick 10:23 pm on August 4, 2014
    Tags: AURA, Large Synoptic Survey Telescope (LSST)

    From AURA: “AURA Awarded Support by the National Science Foundation To Begin Constructing LSST” 

    AURA Icon
    Association of Universities for Research in Astronomy

    August 4, 2014

    The National Science Foundation (NSF) agreed on Friday to support the Association of Universities for Research in Astronomy (AURA) to manage the construction of the Large Synoptic Survey Telescope (LSST).

    LSST Telescope

    This marks the official federal start of the LSST project, the top-ranked major ground-based facility recommended by the National Research Council’s Astronomy and Astrophysics decadal survey committee in its 2010 report, New Worlds, New Horizons. It is being carried out as an NSF and Department of Energy (DOE) partnership, with NSF responsible for the telescope and site, education & outreach, and the data management system, and DOE providing the camera and related instrumentation. Both agencies expect to support post-construction operation of the observatory.

    The NSF construction budget for LSST is not to exceed $473M. The DOE Camera fabrication budget will be baselined later this year, but is estimated to be $165M. Operations costs will be around $40M per year for the ten-year survey. With the approved start occurring now, LSST will see first light in 2019 and begin full science operations in 2022. Today’s action is the culmination of more than ten years of developing, planning and reviewing the LSST concept.

    LSST Project Manager, Victor Krabbendam, was delighted to receive the welcome news from NSF: “This agreement is a tribute to the hard work of an exceptional team of highly skilled individuals, many of whom have dedicated more than a decade to bringing LSST to this point. After a rigorous design and development phase, the project team is ready to get down and dirty and actually build this amazing facility.”

    LSST Director, Steven Kahn of Stanford University, commented on the unique contributions LSST will make to astronomy and fundamental physics: “The broad range of science enabled by the LSST survey will change our understanding of the dynamic Universe on timescales ranging from its earliest moments after the Big Bang to the motions of asteroids in the solar system today. The open nature of our data products means that the public will have the opportunity to share in this exciting adventure along with the scientific community. The most exciting discoveries will probably be those we haven’t yet even envisioned!”

    William Smith, the President of AURA, expressed his enthusiasm for AURA’s role in the Project: “AURA is proud to provide management for the construction of LSST, an activity clearly aligned with our mission to promote excellence in astronomical research by providing access to state-of-the-art facilities. Joining the Space Telescope Science Institute, the National Solar Observatory, the National Optical Astronomy Observatory, and the Gemini Telescope as AURA Centers, LSST is a new paradigm in ground-based astronomy that will revolutionize both our cosmic knowledge and the open and collaborative methods of acquiring that knowledge.”

    By digitally imaging the sky for a decade, the LSST will produce a petabyte-scale database enabling new paradigms of knowledge discovery for transformative STEM education. LSST will address the most pressing questions in astronomy and physics, which are driving advances in big data science and computing. LSST is not “just another telescope” but a truly unique discovery engine.

    The early development of LSST was supported by the LSST Corporation (LSSTC), a non-profit consortium of universities and other research institutions. Fabrication of the major mirror components is already underway, thanks to private funding received from the Charles and Lisa Simonyi Foundation for Arts and Sciences, Bill Gates, and other individuals. Receipt of federal construction funds allows major contracts to move forward, including those to build the telescope mount assembly, the figuring of the secondary mirror, the summit facility construction, the focal plane sensors, and the camera lenses.

    LSST’s construction funding will be provided through NSF’s Major Research Equipment and Facilities Construction (MREFC) account. LSST passed its NSF Final Design Review in December of 2013; the National Science Board gave the NSF conditional approval to move the project to construction status in May of 2014. On the DOE side, LSST received Critical Decision-1 approval (CD-1) in 2011 and also just received CD-3a approval, which allows the project to move forward with long-lead procurements. The CD-2 review will take place the first week in November, with approval expected shortly afterward, formally fixing the baseline budget for completion of the camera project. The Particle Physics Project Prioritization Panel (P5), an advisory subpanel of the High Energy Physics Advisory Panel (HEPAP), recommended last month that DOE move forward with LSST under all budget scenarios, even the most pessimistic.

    The Association of Universities for Research in Astronomy (AURA) is a consortium of 39 US institutions and 6 international affiliates that operates world-class astronomical observatories. AURA’s role is to establish, nurture, and promote public observatories and facilities that advance innovative astronomical research. In addition, AURA is deeply committed to public and educational outreach, and to diversity throughout the astronomical and scientific workforce. AURA carries out its role through its astronomical facilities: http://www.aura-astronomy.org


     
  • richardmitnick 11:29 am on July 24, 2014
    Tags: Large Synoptic Survey Telescope (LSST)

    From Brookhaven Lab: “Instrumentation Division Nears Production Phase for LSST Camera Sensors” 

    Brookhaven Lab

    July 21, 2014
    Rebecca Harrington

    Precision assembly is required to capture the clearest and most extensive picture of the cosmos

    A single sensor for the world’s largest digital camera detected light making its way through wind, air turbulence, and Earth’s atmosphere, successfully converting the light into a glimpse of the galactic wonders that this delicate instrument will eventually capture as it scans the night sky. When installed in the camera of the Large Synoptic Survey Telescope (LSST), these sensors will convert light captured from distant galaxies into digital information that will provide unprecedented insight into our understanding of the universe.

    Design Engineer Justine Haupt (left) and Postdoctoral Research Associate Dajun Huang (right) prepare a test chamber that scientists in the Instrumentation Division are using to evaluate the digital sensors they are designing for the Large Synoptic Survey Telescope, which is scheduled to see “first light” in 2020, and start surveying in 2022.

    LSST Telescope

    But the sensor wasn’t on the telescope yet; it was in a clean room at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory. And the “atmosphere” was being projected from a custom piece of glass made to replicate what the sensor will actually see once it is part of the camera inside the LSST, which every three days will survey the entire night sky visible from its location atop a mountain in Chile. The meticulous laboratory test at Brookhaven was one of many that scientists in the Lab’s Instrumentation Division are conducting on the 201 sensors they are designing for the digital “film” of the telescope’s camera.

    Scheduled to see “first light” in 2020, and start surveying in 2022, the LSST will ultimately survey 20 billion galaxies and 17 billion stars over a 10-year period. In working on sensors for the camera, Brookhaven is partnering with dozens of public and private organizations, including universities, national laboratories, and Google, Inc., to make the LSST a reality. The project is jointly sponsored by the National Science Foundation (NSF) and DOE’s Office of Science. NSF leads the overall LSST effort, while DOE is responsible for providing the camera, with the DOE-supported effort led by the SLAC National Accelerator Laboratory.

    “I think it will be an important chapter in the history of physics.”
    Paul O’Connor, Senior Scientist leading the LSST camera team at Brookhaven

    The data gathered from those distant galaxies will offer scientists insight into the seemingly unreal: the dark matter and dark energy that in fact comprise more than 95 percent of our universe (the planets, stars, and other visible matter making up a mere 5 percent). Dark energy, the mysterious force that is accelerating the universe’s expansion, only manifests itself by its effects on large-scale cosmic structures. Dark matter, invisible on its own, can be measured by observing how light bends around it. Understanding these strange phenomena and their role in cosmic acceleration is among the “science drivers” recently identified by a panel reviewing priorities in particle physics, which recommended that DOE’s work on the LSST camera go forward no matter what funding scenario the field may face.

    “This question of dark energy and dark matter is so compelling,” said Senior Scientist Paul O’Connor, who’s leading the LSST camera team at Brookhaven. “There’s incontrovertible evidence that these are the major constituents of the universe; they don’t fit into the rest of physics.”

    LSST’s incredible precision and sensitivity will give scientists access to both.

    To unlock the mysteries of dark energy, LSST needs to be able to measure redshift, a phenomenon observed when the wavelengths of light emitted by galaxies receding at the distant edges of space appear to stretch out, or shift to the red end of the light spectrum. Most galaxies to be detected by LSST are faint and far away, at the limits of current sensor technology for measuring redshifts. So O’Connor said his team needed to design the LSST camera sensors with a much thicker layer of silicon and entirely new electronics.
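    Redshift itself is a simple ratio of wavelengths. A minimal sketch, using an illustrative hydrogen-alpha measurement rather than LSST’s actual photometric-redshift pipeline:

```python
# Redshift z compares a spectral feature's observed wavelength with
# its rest-frame (laboratory) wavelength:
#     z = (observed - emitted) / emitted
def redshift(observed_nm: float, emitted_nm: float) -> float:
    return (observed_nm - emitted_nm) / emitted_nm

# Illustrative example: the hydrogen-alpha line (656.3 nm at rest)
# observed at 984.45 nm gives z = 0.5, meaning the light's wavelength
# stretched by 50% on its way to us through expanding space.
z = redshift(984.45, 656.3)
print(round(z, 3))  # 0.5
```

    For the faint galaxies LSST targets, there is no clean spectrum to read a line from, which is exactly why the sensors need thicker silicon and new electronics to squeeze redshift estimates out of broad color measurements.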

    “Making a contribution on the experimental end exploring these phenomena is quite satisfying,” O’Connor said. “I think it will be an important chapter in the history of physics.”

    But the LSST won’t just be for scientists. The general public will be able to access its images through planned projects such as adopting a patch of sky to monitor and track changes, and interacting with a time-lapse movie shown in science centers depicting a decade of observation. The telescope’s imaging powers will also join the host of other instruments used to detect exploding supernovae and asteroids that could hit our planet, giving scientists more warning before they come close to Earth.

    Building the World’s Largest Digital Camera

    A design of a single raft tower housing the charge-coupled devices (CCDs) — sensors that convert light captured by the telescope into an electrical charge representing a specific detail that a computer can turn into a digital picture. The full camera will have 21 raft towers.

    The LSST sensors that the Brookhaven scientists are designing, building, and testing are known as charge-coupled devices (CCDs). Each pixel on a CCD converts light captured by the telescope into an electrical charge representing a specific detail that a computer can turn into a digital picture. Together, LSST’s CCDs will capture deep space in unprecedented detail at 3.2 gigapixels — nearly 200 times the resolution of a high-end consumer camera.

    Each CCD operates individually, but they will all work together to render a complete image. Nine CCDs sit in a “raft,” or support structure, with their electronics packed underneath. The modularity that the LSST gains because of these rafts will allow for the incredibly quick sky surveys — reading 3 billion pixels in 2 seconds. It will also enable easier telescope maintenance since scientists can fix a single CCD instead of fixing the whole system, which will come in handy when the rafts are housed in a vacuum chamber kept at -100 degrees Celsius inside the telescope.
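    The readout figures above imply the raw data rate. A back-of-envelope check, assuming a hypothetical 2 bytes per pixel (the actual pixel encoding is a project detail not given here):

```python
# Back-of-envelope readout rates for the LSST focal plane, using the
# figures quoted above plus an assumed 2 bytes per pixel.
PIXELS = 3.2e9           # full 3.2-gigapixel focal plane
READOUT_SECONDS = 2      # "reading 3 billion pixels in 2 seconds"
BYTES_PER_PIXEL = 2      # assumption for illustration only

pixels_per_second = PIXELS / READOUT_SECONDS
gigabytes_per_exposure = PIXELS * BYTES_PER_PIXEL / 1e9

# Billions of pixels per second, several gigabytes per exposure:
# this is why the rafts read out in parallel.
print(f"{pixels_per_second:.1e} pixels/s, "
      f"{gigabytes_per_exposure:.1f} GB per exposure")
```

    Under these assumptions each exposure yields several gigabytes, which repeated all night, every night, is what builds the petabyte-scale archive described earlier.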

    The modularity of the rafts will also be a benefit during the installation and testing of the telescope base. Typically, when scientists build a telescope, they use a placeholder camera to test whether the mount and optics are working properly. Later, they install the full camera and sensors, after those instruments have undergone their own functional tests. But O’Connor said the LSST team will be able to use a single raft for initial testing on the mountain, allowing the scientists to measure the success of these components on the telescope itself.

    “We’re now finding some of the instrument effects emerging as we put the CCDs together at the laboratory phase, so we can prepare the type of software we need now,” O’Connor said. “But the sky tells you things you can’t easily measure in the lab.”

    To capture the clearest and most extensive picture of the cosmos, the CCDs must lie perfectly flat and have no more than a 250-micron space between them (a micron is a millionth of a meter). This requires painstaking assembly at Brookhaven, but at some point the sensors have to get to California to join the other parts, and then to Chile for operation. Mechanical engineers at Brookhaven are designing a stabilized shipping container to transport the sensitive CCDs across the country and continents.

    By the end of 2014, O’Connor said, his team hopes to have the first fully functional raft completed and tested. After that, he said, it will take four years to build and test the rest of the CCD rafts, which is on track to meet the “first light” deadline.

    “We have a well-defined job now. We can do our part while the other teams building the rest of the LSST do theirs,” O’Connor said. “This is a big project. This is the way science is going to solve big problems.”

    For more information, go to http://www.lsst.org.

    DOE’s Office of High Energy Physics funds the LSST camera development.

    See the full article here.

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.


    ScienceSprings is powered by MAINGEAR computers

     
  • richardmitnick 1:38 pm on June 27, 2014 Permalink | Reply
    Tags: , , , , Large Synoptic Survey Telescope (LSST),   

    From Symmetry: “Getting the jump on big data for LSST” 

    Symmetry

    June 27, 2014
    Lori Ann White

    Efforts are already underway to ensure that the data the Large Synoptic Survey Telescope collects will be ready to be mined for scientific gold.

    LSST Telescope
    LSST

    On the first night the Large Synoptic Survey Telescope points its 8.4-meter mirror toward the exquisitely dark skies over Chile—probably in the year 2022—its 3.2-billion-pixel camera will record 30 trillion bytes of data about the contents of the universe. On the second night, LSST will do it again. It will do it again and again, collecting unprecedented amounts of data every clear night for 10 years.
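    Those per-night and mission-length figures imply a raw total in the tens of petabytes, which can be sketched with simple arithmetic. The roughly 300 clear nights per year below is an assumption (the article gives no observing-efficiency figure), and processed catalogs and calibration products would add substantially more on top:

```python
# Survey-lifetime data volume. 30 TB/night and the 10-year run come
# from the article; ~300 clear nights per year is an assumption.
bytes_per_night = 30e12
clear_nights_per_year = 300   # assumed observing efficiency
years = 10

total_bytes = bytes_per_night * clear_nights_per_year * years
print(f"raw image data: {total_bytes / 1e15:.0f} PB")  # 90 PB, before processed products
```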

    By the end of its proposed mission, the LSST camera, designed and constructed at SLAC National Accelerator Laboratory, will have captured a full picture of the southern sky hundreds of times over.

    Scientists around the world will search trillions of bytes of LSST data for information about the universe on all scales. They will look for asteroids in the Earth’s backyard; map the Milky Way Galaxy; and study dark energy, the name given to whatever is causing the acceleration of the expansion of the entire universe.

    Cosmic Background Radiation Planck
    CMB Planck

    But getting those terabytes of raw data from the camera processed, polished and to the researchers’ computers in a usable form will be no small task. Cutting-edge computer applications will need to hold the data and mine it for scientific discoveries. These processing and database applications must work together flawlessly.

    Jacek Becla, technology officer for scientific databases at SLAC, leads the group at SLAC constructing the LSST database. Their design recently passed a “stress test” intended to determine whether the software could put more resources to effective use as more was asked of it.

    “We have a very solid prototype,” Becla says. “I’m actually quite confident we’ll be ready for LSST. We just have to stay focused.”

    The LSST processing software, which is being developed by a collaboration led by the Association of Universities for Research in Astronomy, has also proven itself through an ongoing series of “data challenges.” In these challenges, the software is used to analyze data from previous astronomical studies, including nine years of data from the Sloan Digital Sky Survey and a total of 450 nights of data collected over five years by the Legacy Survey at the Canada-France-Hawaii Telescope. The results of the challenges are compared with results from the original surveys, which can highlight bugs and verify that the software does what it’s been written to do.
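    At its core, a data challenge like this compares the new pipeline's output catalog against the trusted catalog from the original survey. The sketch below shows a toy brute-force positional cross-match; the function names, the 1-arcsecond matching tolerance, and the toy catalogs are all illustrative assumptions, not LSST code:

```python
import math

def ang_sep_deg(ra1, dec1, ra2, dec2):
    """Angular separation in degrees via the haversine formula."""
    ra1, dec1, ra2, dec2 = map(math.radians, (ra1, dec1, ra2, dec2))
    h = (math.sin((dec2 - dec1) / 2) ** 2
         + math.cos(dec1) * math.cos(dec2) * math.sin((ra2 - ra1) / 2) ** 2)
    return math.degrees(2 * math.asin(math.sqrt(h)))

def cross_match(new_catalog, reference, tol_arcsec=1.0):
    """Pair each new detection with the nearest reference source within tol."""
    tol_deg = tol_arcsec / 3600.0
    matches = []
    for src in new_catalog:
        best = min(reference,
                   key=lambda ref: ang_sep_deg(src["ra"], src["dec"],
                                               ref["ra"], ref["dec"]))
        if ang_sep_deg(src["ra"], src["dec"], best["ra"], best["dec"]) <= tol_deg:
            # Keep the pair plus the magnitude offset between the catalogs.
            matches.append((src, best, src["mag"] - best["mag"]))
    return matches

# Toy catalogs: one detection agrees with the reference, one is spurious.
ref = [{"ra": 150.0000, "dec": -2.0000, "mag": 21.3}]
new = [{"ra": 150.0001, "dec": -2.0001, "mag": 21.4},
       {"ra": 151.0, "dec": -2.5, "mag": 19.0}]
print(len(cross_match(new, ref)))  # → 1 (the spurious source finds no match)
```

A real comparison would use a spatial index rather than a brute-force scan, and would check fluxes, shapes, and astrometric residuals in addition to positions.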

    Canada-France-Hawaii Telescope
    Canada-France-Hawaii Telescope

    “These challenges have been very successful,” says LSST Director Steven Kahn. “They’ve already proved crucial algorithms are as good as—and in some cases better than—the software originally developed for the data.”

    To help spread the wealth, scientists have made all LSST software open-source.

    “The idea was to create software that’s available to the entire astrophysics community,” Kahn says. The Hyper Suprime-Cam, an 870-megapixel camera recently installed and commissioned on Japan’s Subaru Telescope, is already using an early version of LSST’s processing software.

    Subaru Telescope HyperCam
    Hyper Suprime-Cam

    Subaru Telescope
    Subaru Telescope

    Meanwhile, Becla wants the database technology to be available to anyone who can put it to good use. “There have already been a lot of inquiries about the software: from Germany, from Brazil, from the United Kingdom,” he says.

    US financial support for the LSST construction comes from the National Science Foundation, the Department of Energy and private funding raised by the LSST Corporation, a non-profit 501(c)3 corporation formed in 2003, with its headquarters in Tucson, Arizona.

    Kahn says he sees their work as an indication that the worlds of “big data” and “high performance”—or supercomputing—are converging.

    “You need high-performance computing to run dark energy simulations; you have the big data you must compare the simulations to; and you have the big database to store the data,” he says. “LSST is a fantastic example.”

    See the full article http://www.symmetrymagazine.org/article/june-2014/getting-the-jump-on-big-data-for-lsst.

    Symmetry is a joint Fermilab/SLAC publication.




     
  • richardmitnick 10:58 pm on February 21, 2014 Permalink | Reply
    Tags: , , , , Large Synoptic Survey Telescope (LSST),   

    From SETI: “The Large Synoptic Survey Telescope – A New Way to Scan the Sky” 

    Undated
    Seth Shostak, Senior Astronomer

    It will be the mother of all telescopes, and you can bet it will do for astronomy what genome sequencing is doing for biology.

    The clumsy, if utilitarian, name of this mirrored monster is Large Synoptic Survey Telescope, or LSST. You can’t use it yet, but a peak in the Chilean Andes has been decapitated to provide a level spot for placement. This robotically operated sky-eye, with an aperture of 8.4 meters, should be up and running six years from now.

    LSST Telescope

    OK, but so what? After all, there are many new telescopes rolling down the pike these days, some of which will boast far larger optics than the LSST.

    The difference is in the way this scope will sponge data from the sky, and distribute it to the world. The LSST will be the first instrument designed from the pedestal up to work fast, to pile up petabytes of data, and to quickly notice any cosmic phenomena that go bump in the night.

    That last point is important. Generally speaking, most stuff you see in the heavens doesn’t change very quickly. The stars look the same from night to night. Nebulae and galaxies are dully immutable, maintaining the same overall appearance for thousands or millions of years. Indeed, only the Sun, moon and planets – together with the occasional comet, asteroid or meteor – seem dynamic.

    The principal reason for the universe’s poker face is that its constituents are far away. Stars careen through space, and galaxies spin at speeds thousands of times faster than a jet plane. But given their distance, you’d need the patience of Job to notice much change in their appearance or position.

    Nonetheless, we know of celestial circumstances that do change quickly. Stars can explode in minutes. Nearby asteroids capable of cratering your neighborhood can traverse the sky in hours. And surely the most interesting of all are the things we don’t know about: fast phenomena that have escaped our attention simply because astronomers have always used still cameras to photograph the cosmos.

    According to Mario Juric, the LSST’s Data Management Project Scientist, this new telescope will sport a massive, three-ton digital camera with a wide enough field of view to snap photos of the entire southern sky roughly every three days. Since the current plan is to operate the LSST for at least a decade, that means every object visible to this instrument will be imaged nearly a thousand times. Of course, those photos can be viewed in sequence, like a time-lapse film. As Juric says, “It’s a robot telescope that will make a movie of the sky.”
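    The “nearly a thousand times” figure follows from simple arithmetic: a full pass over the visible sky every three days or so for ten years, discounted for weather and downtime. The 70 percent usable-night fraction below is an assumption, not a number from the article:

```python
# How often each object gets imaged: a full-sky pass every ~3 days
# for 10 years, discounted by an assumed 70% usable-night fraction.
nights = 10 * 365.25
days_per_pass = 3
usable_fraction = 0.7   # assumption; weather, maintenance, bright moon

ideal_visits = nights / days_per_pass        # ~1218 passes in a perfect decade
expected = ideal_visits * usable_fraction
print(round(expected))                       # roughly 850, i.e. "nearly a thousand"
```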

    The LSST camera is designed to provide a wide field of view with better than 0.2 arcsecond sampling and spectral sampling in five or more bands from 400nm to 1060nm. The image surface is flat with a diameter of approximately 64 cm. The detector format will be a circular mosaic providing over 3 Gigapixels per image. The camera includes a filter mechanism and, if necessary, shuttering capability. The camera is positioned in the middle of the telescope. 1/2012 Credit: LSSTC

    And this flick won’t be dull. Juric estimates that ten million transient objects will be photographed each clear night. Many of these will be asteroids prancing through our solar system, and the LSST will catalog millions of them, including 80 percent of the larger ones – rocks as big as a football field or more. Knowing the orbits of these dangerous projectiles will prompt us to deflect those coming our way. To know them is to shove them.

    That’s a nice bit of insurance against collision catastrophe, but the truly revolutionary thing about the LSST is what it will do for fundamental astronomy. It’s a safe bet that this telescope will discover multitudes of extraordinary events such as colliding neutron stars or other exotica worthy of intense, and immediate, study by others. So part of the LSST’s design is to send out electronic alerts within 60 seconds of sensing one of these fast-action, “bump in the night” events.
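    The standard technique behind such rapid alerts is difference imaging: subtract a deep template image of the same patch of sky from each new exposure, and anything left over is a candidate transient. The sketch below is a deliberately minimal pure-Python version; the real LSST pipeline adds PSF matching, source association, and artifact rejection, and the toy images and noise level here are illustrative:

```python
# Minimal difference-imaging sketch: subtract a template of the same
# field from the new exposure and flag pixels that brighten by more
# than n_sigma times the assumed sky noise.
def find_transients(new_image, template, sky_sigma, n_sigma=5.0):
    threshold = n_sigma * sky_sigma
    return [(y, x)
            for y, (nrow, trow) in enumerate(zip(new_image, template))
            for x, (n, t) in enumerate(zip(nrow, trow))
            if n - t > threshold]

# Toy 4x4 field with an assumed sky noise of 10 counts; one source
# brightens by 60 counts at row 2, column 1: a 6-sigma event.
template = [[100] * 4 for _ in range(4)]
new = [row[:] for row in template]
new[2][1] = 160
print(find_transients(new, template, sky_sigma=10))  # → [(2, 1)]
```

In production, each flagged detection would be packaged with cutout images and history and broadcast to subscribers within the 60-second window described above.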

    With this laundry list of extraordinary capabilities, and its formidable cache of data, the LSST will revolutionize astronomy. Yes, that’s a cliché, but sometimes clichés are appropriate.

    Juric notes that, historically speaking, “astronomy has always been a data-starved science.” That’s because the most interesting research topics are inevitably on the edge of visibility, and consequently require using the largest telescopes.

    The result is a bottleneck. In the not-so-good old days, a working astronomer might get a few nights a year on a world-class instrument, and could generally observe a few dozen or a few hundred objects. Sure, there were published surveys that cataloged reasonably big swaths of the sky, but those data were not always adequate to address the kinds of interesting questions scientists cook up. So big-time astronomers were often intimately dependent on big-time telescopes.

    The LSST may cause a breakup of that exclusive relationship. After its first decade of operation, this new scope will have an image collection of about 20 billion galaxies and 17 billion stars. The data set will tally hundreds of millions of gigabytes. That’s truly big data, and the good news is that anyone will be able to scour through it – via the laptop in their office, or using their smart phone while waiting for the bus.

    So here’s the big shift: For four hundred years, the relationship between astronomers and their telescopes has been as fundamental as that between a psychiatrist and his couch. But once the LSST starts digitizing the sky, it will alter the paradigm dramatically. Scientists will interact with data, not with an instrument. Rather than acquiring photos or spectra to prove or disprove their ideas, they will be free to trawl immense quantities of data, and see what falls out.

    It will be as if biologists and zoologists suddenly had access to detailed information for every single species on Earth (and thousands of examples of each). Think of the sort of investigations even amateurs could make.

    The LSST will be a game changer, and it could very well turn out that the biggest discoveries about the past, present and future of the cosmos will come not from tweedy astronomers, but from the keen and curious non-specialists sitting at home, faced off against their laptops. Galileo would have been stupefied.

    See the full article here.

    SETI Institute – 189 Bernardo Ave., Suite 100
    Mountain View, CA 94043
    Phone 650.961.6633 – Fax 650-961-7099



     
  • richardmitnick 9:38 am on February 7, 2014 Permalink | Reply
    Tags: , , , , Large Synoptic Survey Telescope (LSST)   

    From Brookhaven Lab: “Aim for the Sky!” 

    Brookhaven Lab

    February 7, 2014
    Jane Koropsak

    Brookhaven Lab’s Justine Haupt Helping to Build the World’s Largest Digital Camera

    jh
    Design Engineer Justine Haupt is pictured in front of the cryostat she designed for testing LSST’s electro-optic sensor modules. She is holding one of the compact front-end electronic assemblies that will enable the camera to be read out at a remarkable 1.5 billion pixels per second.

    When completed, the Large Synoptic Survey Telescope (LSST) will be the world’s largest digital camera. It’s been called the widest, fastest, deepest eye of the new digital age. This remarkable telescope, to be stationed on a mountaintop in Chile, promises to cast light on mysteries fundamental to our understanding of the universe. It will scan the sky rapidly and chart objects that change or move, including exploding supernovae and potentially hazardous near-Earth asteroids. LSST’s images will trace billions of remote galaxies allowing probing of mysterious dark matter and dark energy. Its uniquely wide field of view will allow LSST to observe large areas of the sky at once and move quickly between images. It will be able to take more than 800 panoramic images each night and cover the entire sky twice a week. And that is just a brief description. The LSST is fascinating—and Brookhaven Lab is playing a big role in the project.

    Meet Engineering Whiz Justine Haupt

    When you meet Justine Haupt you will immediately recognize her calm intelligence and positive attitude. Haupt works in the Lab’s Instrumentation Division designing and building prototypes for the LSST.

    “She thinks ‘outside the box’ and comes up with methods that will streamline testing of the LSST’s focal plane components,” said Brookhaven researcher Paul O’Connor, who mentors Haupt. “She is inventive, spirited, and remarkably creative. In the four years she has worked at the Lab, she has constructed more than two dozen pieces of equipment, ranging from custom catadioptric lens systems to a microprocessor-controlled, in-vacuum induction motor. She also keeps our group’s 3D printer busy turning out parts for her various creations. She has managed to develop a mastery of mechanical, optical, and electrical design.” For her “impressive range of excellence” Haupt received the 2014 Rising Engineering Star award from Mouser Electronics and Design News.

    at work
    At work, Justine Haupt designs and builds prototypes for the LSST, but in her spare time this design engineer takes to the sky in a plane or paragliding.

    You might think work would keep Haupt busy enough, but the young researcher takes her quest for knowledge of the sky and our universe a step further. In her spare time, she is an avid paragliding pilot. She got her pilot’s license at age 18, has performed some flying acrobatics, and holds an FAA certified Advanced Ground Instructor rating. She refurbished the avionics of a 1947 Stinson Voyager (a single-engine plane) and installed a new intercom and strobe system. She volunteers and sits on the board of directors at the Custer Observatory, located on the east end of Long Island in Southold, where she routinely reviews and re-engineers the instrumentation for the largest telescope on Long Island.

    “Flying in either a plane or paragliding is exhilarating,” said Haupt. “And, being part of a team that is designing and building a telescope that may very well be the center of the United States’ astronomy program feels great! This job has far exceeded my expectations. I learn something new every day.”

    In addition, Haupt volunteers at the Laboratory giving talks to students and visitors about her work. She also finds time to play piano, trombone, and do some fiddling during her lunch hour with fellow employees and musicians, Paul O’Connor, Peter Siddons, Sean McCorkle, and Cindy Salwen. “We casually call ourselves the Stochastic Orchestra and have performed at venues like the Custer Observatory, but mostly we just enjoy meeting at lunchtime and jamming,” said Haupt.

    “For fun,” she added, “I’ve been working on developing a new holographic projection technique and exploring an idea for a new class of solar astronomy instrumentation.

    “These are things that could even eventually turn into business ventures. For now, I’m grateful for my job at the Lab and the time I get to take to the sky.”

    See the full article here.




     
  • richardmitnick 3:34 pm on April 24, 2012 Permalink | Reply
    Tags: , , , , Large Synoptic Survey Telescope (LSST),   

    From SLAC News Center: “World’s Largest Digital Camera Project Passes Critical Milestone” 

    April 24, 2012
    by Andy Freeberg

    “A 3.2 billion-pixel digital camera designed by SLAC is now one step closer to reality. The Large Synoptic Survey Telescope camera, which will capture the widest, fastest and deepest view of the night sky ever observed, has received “Critical Decision 1” approval by the U.S. Department of Energy (DOE) to move into the next stage of the project.

    The Large Synoptic Survey Telescope (LSST) will survey the entire visible sky every week, creating an unprecedented public archive of data – about 6 million gigabytes per year, the equivalent of shooting roughly 800,000 images with a regular eight-megapixel digital camera every night, but of much higher quality and scientific value. Its deep and frequent cosmic vistas will help answer critical questions about the nature of dark energy and dark matter and aid studies of near-Earth asteroids, Kuiper belt objects, the structure of our galaxy and many other areas of astronomy and fundamental physics.
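    The camera analogy in that paragraph checks out arithmetically. Assuming 365 observing nights a year (an idealization; real observing would have fewer usable nights), 6 million gigabytes per year spread over 800,000 images a night works out to roughly 20 MB per image, or a plausible couple of bytes per pixel for an eight-megapixel frame:

```python
# Checking the camera analogy: 6 million GB/year vs. 800,000
# eight-megapixel snapshots per night, assuming 365 nights per year.
yearly_bytes = 6e6 * 1e9            # 6 million gigabytes
nightly_bytes = yearly_bytes / 365
per_image = nightly_bytes / 800_000

print(f"{per_image / 1e6:.0f} MB per image")        # ~21 MB
print(f"{per_image / 8e6:.1f} bytes per pixel")     # ~2.6, plausible for 8 MP
```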

    ‘With 189 sensors and over 3 tons of components that have to be packed into an extremely tight space, you can imagine this is a very complex instrument,’ said Nadine Kurita, the project manager for the LSST camera at SLAC. ‘But given the enormous challenges required to provide such a comprehensive view of the universe, it’s been an incredible opportunity to design something so unique.’ “

    See the full article here.

    The effort to build the LSST is led by the LSST Corporation, a non-profit 501(c)3 corporation formed in 2003, with headquarters in Tucson, AZ. Financial support for LSST comes from the National Science Foundation with additional contributions from private foundation gifts, grants to universities, and in-kind support from Department of Energy laboratories and other LSST Member Institutions. In 2011, the LSST construction project was established as an operating center under management of the Association of Universities for Research in Astronomy (AURA).

    Institutional Members
    Last Revision 1/19/2011

    Adler Planetarium
    Brookhaven National Laboratory (BNL)
    California Institute of Technology
    Carnegie Mellon University
    Chile
    Cornell University
    Drexel University
    Fermi National Accelerator Laboratory
    George Mason University
    Google, Inc.
    Harvard-Smithsonian Center for Astrophysics
    Institut de Physique Nucleaire et de Physique des Particules (IN2P3)
    Johns Hopkins University
    Kavli Institute for Particle Astrophysics and Cosmology (KIPAC) – Stanford University
    Las Cumbres Observatory Global Telescope Network, Inc.
    Lawrence Livermore National Laboratory (LLNL)
    Los Alamos National Laboratory (LANL)
    National Optical Astronomy Observatory*
    National Radio Astronomy Observatory
    Princeton University
    Purdue University
    Research Corporation for Science Advancement*
    Rutgers University
    SLAC National Accelerator Laboratory
    Space Telescope Science Institute
    Texas A & M University
    The Pennsylvania State University
    The University of Arizona*
    University of California at Davis
    University of California at Irvine
    University of Illinois at Urbana-Champaign
    University of Michigan
    University of Pennsylvania
    University of Pittsburgh
    University of Washington*
    Vanderbilt University

    SLAC is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. Located in Menlo Park, California, SLAC is operated by Stanford University for the DOE’s Office of Science.

     
  • richardmitnick 3:26 pm on March 28, 2012 Permalink | Reply
    Tags: , , , Large Synoptic Survey Telescope (LSST)   

    Brookhaven Lab and the LSST – Large Synoptic Survey Telescope 


    LSST

    Brookhaven Builds the Digital Film

    Brookhaven Lab leads the development of the sensors for the LSST, the array of precise and sensitive electronics that capture images within the digital camera. The efficacy of LSST research hinges upon its massive resolution of 3.2 gigapixels – that’s nearly 200 times larger than a high-end consumer camera. The unique charge-coupled device (CCD) sensors, designed in Brookhaven’s Instrumentation Division, are sensitive to light beyond the visible spectrum and have a much faster readout time than those in today’s most advanced astronomical cameras. Brookhaven scientists are constructing a grid of 200 individual sensors, which will act in concert to render a complete image.
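    Dividing the total pixel count by the sensor count gives the scale of each device. The reading of the result as roughly a 4k x 4k chip is an inference from the arithmetic, not a specification quoted here:

```python
# Pixels per sensor implied by the totals above.
total_pixels = 3.2e9
sensors = 200
per_ccd = total_pixels / sensors
print(f"{per_ccd / 1e6:.0f} megapixels per CCD")  # 16 MP, about 4096 x 4096
```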

    lsst

    The Large Synoptic Survey Telescope (LSST) will peer into space as no other telescope can. This new facility will create an unparalleled wide-field astronomical survey of our universe – wider and deeper in volume than all previous telescopes combined. The combination of a 3200 megapixel camera sensor array, a powerful supercomputer, a cutting-edge data processing and distribution network, and a massive telescope stationed on a mountaintop in Chile promises to cast light on mysteries fundamental to our understanding of the universe. From the distant signatures of dark energy to the dangers of near-earth asteroids, LSST will capture it all. Three central considerations dictated the design of LSST: wide, fast, and deep.

    See the full article from Brookhaven here.

    Find out more about the LSST starting here.

    Institutional Members
    Last Revision 1/19/2011
    Adler Planetarium
    Brookhaven National Laboratory (BNL)
    California Institute of Technology
    Carnegie Mellon University
    Chile
    Cornell University
    Drexel University
    Fermi National Accelerator Laboratory
    George Mason University
    Google, Inc.
    Harvard-Smithsonian Center for Astrophysics
    Institut de Physique Nucleaire et de Physique des Particules (IN2P3)
    Johns Hopkins University
    Kavli Institute for Particle Astrophysics and Cosmology (KIPAC) – Stanford University
    Las Cumbres Observatory Global Telescope Network, Inc.
    Lawrence Livermore National Laboratory (LLNL)
    Los Alamos National Laboratory (LANL)
    National Optical Astronomy Observatory
    National Radio Astronomy Observatory
    Princeton University
    Purdue University
    Research Corporation for Science Advancement
    Rutgers University
    SLAC National Accelerator Laboratory
    Space Telescope Science Institute
    Texas A & M University
    The Pennsylvania State University
    The University of Arizona
    University of California at Davis
    University of California at Irvine
    University of Illinois at Urbana-Champaign
    University of Michigan
    University of Pennsylvania
    University of Pittsburgh
    University of Washington
    Vanderbilt University


     
  • richardmitnick 1:39 pm on November 8, 2011 Permalink | Reply
    Tags: , , , Large Synoptic Survey Telescope (LSST),   

    From SLAC News Center: “SLAC-led Project to Build World’s Largest Digital Camera Impresses DOE Panel” 

    November 8, 2011
    Mike Ross

    “A U.S. Department of Energy review panel last week gave a glowing endorsement for the SLAC-led project to create the world’s largest digital camera, which will enable a new telescope being built on a Chilean mountaintop to investigate key astronomical questions ranging from dark matter and dark energy to near-Earth asteroids.

    After two and a half days of presentations and meetings at SLAC, the panel of 19 experts recommended that the 3.2 gigapixel (billion pixel) camera for the Large Synoptic Survey Telescope receive Critical Decision-1 status, the DOE’s project-management milestone that defines a large project’s approach and funding for achieving a specific scientific mission.

    i2
    Once constructed, the Large Synoptic Survey Telescope’s 3.2-billion-pixel camera will be the largest digital camera in the world. Roughly the size of a small car, the camera will take 800 panoramic images each night, surveying the entire southern sky twice a week. These images will enable researchers to create a 3-D map of the universe with unprecedented depth and detail, and could shed light on the fundamental properties of dark energy and dark matter. The camera’s design team, led by SLAC and Stanford University researchers at the Kavli Institute for Particle Astrophysics and Cosmology, recently reached a key milestone in the project and will undergo a preliminary design review this summer.

    The camera, which will be built at SLAC, is expected to cost about one third of the nearly $500 million price tag for the new telescope, which is being borne by the DOE and the National Science Foundation, as well as several public and private organizations in the United States and abroad.

    “The LSST Camera Project team is experienced and has demonstrated a good working relationship,” said Kurt Fisher, DOE/SC Review Chairperson from DOE’s Office of Project Assessment. “The initial, plenary presentations were impressive, and the team was well-prepared for the review.”

    See the full article here.

    SLAC is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. Located in Menlo Park, California, SLAC is operated by Stanford University for the DOE’s Office of Science.

     