Tagged: Large Synoptic Survey Telescope (LSST)

  • richardmitnick 2:28 pm on January 14, 2017 Permalink | Reply
Tags: Large Synoptic Survey Telescope (LSST), Twinkles

    From Symmetry: “Twinkle, twinkle, little supernova” 

    Symmetry Mag

    Ricarda Laasch

    Phil Marshall, SLAC

    Using Twinkles, the new simulation of images of our night sky, scientists get ready for a gigantic cosmological survey unlike any before.

    Almost every worthwhile performance is preceded by a rehearsal, and scientific performances are no exception. Engineers test a car’s airbag deployment using crash test dummies before incorporating them into the newest model. Space scientists fire a rocket booster in a test environment before attaching it to a spacecraft in flight.

    One of the newest “training grounds” for astrophysicists is called Twinkles. The Twinkles dataset, which has not yet been released, consists of thousands of simulated, highly realistic images of the night sky, full of supernovae and quasars. The simulated-image database will help scientists rehearse a future giant cosmological survey called LSST.

LSST/Camera, built at SLAC
LSST Interior
LSST telescope, currently under construction at Cerro Pachón Chile, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    LSST, short for the Large Synoptic Survey Telescope, is under construction in Chile and will conduct a 10-year survey of our universe, covering the entire southern sky once a year. Scientists will use LSST images to explore our galaxy, to learn more about supernovae, and to shine a light on the mysterious dark energy that is driving the accelerating expansion of our universe.

    It’s a tall order, and it needs a well-prepared team. Scientists designed LSST using simulations and predictions of its scientific capabilities. But Twinkles’ thousands of images will give them an even better chance to see how accurately their LSST analysis tools can measure the changing brightness of supernovae and quasars. That’s the advantage of using simulated data: scientists don’t know about all the objects in the sky above our heads, but they do know their simulated sky—there, they already know the answers. If the analysis tools make a calculation error, they’ll see it.
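Because the true properties of every simulated object are known, a check of this kind reduces to comparing recovered measurements against the injected truth. The sketch below is purely illustrative (the function names, tolerances, and data are hypothetical, not the actual Twinkles or LSST analysis API):

```python
# Hypothetical sketch: since the simulated sky's true brightnesses are known,
# a pipeline's recovered measurements can be checked object by object.

def validate_measurements(truth, measured, tolerance=0.05):
    """Return IDs of objects whose recovered brightness deviates from the
    injected (true) value by more than `tolerance` (fractional), or that
    the pipeline failed to recover at all."""
    outliers = []
    for obj_id, true_flux in truth.items():
        recovered = measured.get(obj_id)
        if recovered is None:
            outliers.append(obj_id)  # object lost by the pipeline
        elif abs(recovered - true_flux) / true_flux > tolerance:
            outliers.append(obj_id)  # calculation error revealed
    return outliers

truth    = {"SN1": 100.0, "SN2": 250.0, "QSO1": 40.0}
measured = {"SN1": 101.2, "SN2": 310.0, "QSO1": 40.5}
print(validate_measurements(truth, measured))  # flags SN2 (24% deviation)
```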

    The findings will be a critical addition to LSST’s measurements of certain cosmological parameters, where a small deviation can have a huge impact on the outcome.

    “We want to understand the whole path of the light: From other galaxies through space to our solar system and our planet, then through our atmosphere to the telescope – and from there through our data-taking system and image processing,” says Phil Marshall, a scientist at the US Department of Energy’s SLAC National Accelerator Laboratory who leads the Twinkles project. “Twinkles is our way to go all the way back and study the whole picture instead of one single aspect.”

    Scientists simulate the images as realistically as possible to figure out if some systematic errors add up or intertwine with each other. If they do, it could create unforeseen problems, and scientists of course want to deal with them before LSST starts.

    Twinkles also lets scientists practice sorting out a different kind of problem: a large collaboration, spread across the globe, performing numerous scientific searches simultaneously on the same massive amounts of data.

    Richard Dubois, senior scientist at SLAC and co-leader of the software infrastructure team, works with his team of computing experts to develop methods for handling the data coherently across the whole collaboration, and advises scientists on tools that will make their lives easier.

    “Chaos is a real danger; so we need to keep it in check,” Dubois says. “So with Twinkles, we test software solutions and databases that help us to keep our heads above water.”

    The first test analysis using Twinkles images will start toward the end of the year. During the first go, scientists will extract Type Ia supernovae and quasars and learn how to interpret the automated LSST measurements.

    “We hid both types of objects in the Twinkles data,” Marshall says. “Now we can see whether they look the way they’re supposed to.”

    LSST will start up in 2022, and the first LSST data will be released at the end of 2023.

    “High accuracy cosmology will be hard,” Marshall says. “So we want to be ready to start learning more about our universe right away!”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    STEM Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.

  • richardmitnick 2:40 pm on November 25, 2016 Permalink | Reply
    Tags: GridPP, Large Synoptic Survey Telescope (LSST), Shear brilliance: computing tackles the mystery of the dark universe

    From U Manchester: “Shear brilliance: computing tackles the mystery of the dark universe” 


    University of Manchester

    24 November 2016
    No writer credit found

    Scientists from The University of Manchester working on a revolutionary telescope project have harnessed the power of distributed computing from the UK’s GridPP collaboration to tackle one of the Universe’s biggest mysteries – the nature of dark matter and dark energy.

    Researchers at The University of Manchester have used resources provided by GridPP – which represents the UK’s contribution to the computing grid used to find the Higgs boson at CERN – to run image processing and machine learning algorithms on thousands of images of galaxies from the international Dark Energy Survey.


    The Manchester team are part of the collaborative project to build the Large Synoptic Survey Telescope (LSST), a new kind of telescope currently under construction in Chile and designed to conduct a 10-year survey of the dynamic Universe. LSST will be able to map the entire visible sky.

    LSST/Camera, built at SLAC

    LSST Interior
    LSST telescope, currently under construction at Cerro Pachón Chile

    In preparation for the LSST starting its revolutionary scanning, a pilot research project has helped researchers detect and map the cosmic shear seen across the night sky, one of the tell-tale signs of the dark matter and dark energy thought to make up some 95 per cent of the Universe. This in turn will help prepare for the analysis of the expected 200 petabytes of data the LSST will collect when it starts operating in 2023.

    The pilot research team based at The University of Manchester was led by Dr Joe Zuntz, a cosmologist originally at Manchester’s Jodrell Bank Observatory and now a researcher at the Royal Observatory in Edinburgh.

    “Our overall aim is to tackle the mystery of the dark universe – and this pilot project has been hugely significant. When the LSST is fully operating researchers will face a galactic data deluge – and our work will prepare us for the analytical challenge ahead.”
    Sarah Bridle, Professor of Astrophysics

    Dr George Beckett, the LSST-UK Science Centre Project Manager based at The University of Edinburgh, added: “The pilot has been a great success. Having completed the work, Joe and his colleagues are able to carry out shear analysis on vast image sets much faster than was previously the case. Thanks are due to the members of the GridPP community for their assistance and support throughout.”

    The LSST will produce images of galaxies in a wide variety of frequency bands of the visible electromagnetic spectrum, with each image giving different information about the galaxy’s nature and history. In times gone by, the measurements needed to determine properties like cosmic shear might have been done by hand, or at least with human-supervised computer processing.

    With the billions of galaxies expected to be observed by LSST, such approaches are unfeasible. Specialised image processing and machine learning software (Zuntz 2013) has therefore been developed for use with galaxy images from telescopes like LSST and its predecessors. This can be used to produce cosmic shear maps like those shown in the figure below. The challenge then becomes one of processing and managing the data for hundreds of thousands of galaxies and extracting scientific results required by LSST researchers and the wider astrophysics community.

    As each galaxy is essentially independent of other galaxies in the catalogue, the image processing workflow itself is highly parallelisable. This makes it an ideal problem to tackle with the kind of High-Throughput Computing (HTC) resources and infrastructure offered by GridPP. In many ways, the data from CERN’s Large Hadron Collider particle collision events is like that produced by a digital camera (indeed, pixel-based detectors are used near the interaction points) – and GridPP regularly processes billions of such events as part of the Worldwide LHC Computing Grid (WLCG).
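The "one independent job per galaxy" pattern described above can be sketched in a few lines. This is a toy stand-in, using Python's multiprocessing pool in place of grid-scale HTC, and `measure_shear` is a hypothetical placeholder for the real image-analysis step, not the IM3SHAPE code:

```python
# Minimal sketch of an embarrassingly parallel per-galaxy workflow,
# with a local process pool standing in for distributed grid resources.
from multiprocessing import Pool

def measure_shear(galaxy):
    # Placeholder analysis: each galaxy is processed independently,
    # so no job needs to communicate with any other.
    gal_id, ellipticity = galaxy
    return gal_id, round(ellipticity * 0.98, 4)  # toy "calibration" step

if __name__ == "__main__":
    catalogue = [(i, 0.1 + 0.001 * i) for i in range(1000)]
    with Pool(processes=4) as pool:
        results = pool.map(measure_shear, catalogue)  # trivially parallel
    print(len(results))  # one result per galaxy
```

On a real grid the pool is replaced by a job-submission layer (Ganga/DIRAC in the pilot described below), but the independence property that makes the split possible is the same.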

    A pilot exercise, led by Dr Joe Zuntz while at The University of Manchester and supported by one of the longest serving and most experienced GridPP experts, Senior System Administrator Alessandra Forti, saw the porting of the image analysis workflow to GridPP’s distributed computing infrastructure. Data from the Dark Energy Survey (DES) was used for the pilot.

    After transferring this data from the US to GridPP Storage Elements, and enabling the LSST Virtual Organisation on a number of GridPP Tier-2 sites, the IM3SHAPE analysis software package (Zuntz, 2013) was tested on local, grid-friendly client machines to ensure smooth running on the grid. Analysis jobs were then submitted and managed using the Ganga software suite, which is able to coordinate the thousands of individual analyses associated with each batch of galaxies. Initial runs were submitted using Ganga to local grid sites, but the pilot progressed to submission to multiple sites via the GridPP DIRAC (Distributed Infrastructure with Remote Agent Control) service. The flexibility of Ganga allows both types of submission, which made the transition from local to distributed running significantly easier.

    By the end of the pilot, Dr Zuntz was able to run the image processing workflow on multiple GridPP sites, regularly submitting thousands of analysis jobs on DES images.

    See the full article here.



    The University of Manchester (UoM) is a public research university in Manchester, England, formed in 2004 by the merger of two institutions. The first, the University of Manchester Institute of Science and Technology, was established in 1956 as the Manchester College of Science and Technology, renamed in 1966, and had its ultimate origins in the Mechanics’ Institute founded in the city in 1824. The second, the Victoria University of Manchester, was founded by charter in 1904 after the dissolution of the federal Victoria University (which also had member colleges in Leeds and Liverpool) and originated in Owens College, founded in Manchester in 1851. The University of Manchester is regarded as a red brick university and was a product of the civic university movement of the late 19th century. It formed a constituent part of the federal Victoria University between 1880, when it received its royal charter, and 1903–1904, when that university was dissolved.

    The University of Manchester is ranked 33rd in the world by QS World University Rankings 2015-16. In the 2015 Academic Ranking of World Universities, Manchester is ranked 41st in the world and 5th in the UK. In an employability ranking published by Emerging in 2015, where CEOs and chairmen were asked to select the top universities which they recruited from, Manchester placed 24th in the world and 5th nationally. The Global Employability University Ranking conducted by THE places Manchester at 27th world-wide and 10th in Europe, ahead of academic powerhouses such as Cornell, UPenn and LSE. It is ranked joint 56th in the world and 18th in Europe in the 2015-16 Times Higher Education World University Rankings. In the 2014 Research Excellence Framework, Manchester came fifth in terms of research power and seventeenth for grade point average quality when including specialist institutions. More students try to gain entry to the University of Manchester than to any other university in the country, with more than 55,000 applications for undergraduate courses in 2014 resulting in 6.5 applicants for every place available. According to the 2015 High Fliers Report, Manchester is the most targeted university by the largest number of leading graduate employers in the UK.

    The university owns and operates major cultural assets such as the Manchester Museum, Whitworth Art Gallery, John Rylands Library and Jodrell Bank Observatory which includes the Grade I listed Lovell Telescope.

  • richardmitnick 9:32 am on October 26, 2016 Permalink | Reply
    Tags: Large Synoptic Survey Telescope (LSST), Scheduling algorithm for LSST

    From Harvard John A Paulson School of Engineering and Applied Sciences: “Eye on the sky” 

    Harvard John A. Paulson School of Engineering and Applied Sciences

    October 26, 2016
    Adam Zewe

    Student uses computer science to chart a course for massive telescope

    When it begins operating in 2022, the $500 million Large Synoptic Survey Telescope (LSST) will capture some of the sharpest night sky images ever produced, giving scientists an unprecedented view of near-Earth asteroids, supernovae, and the Milky Way galaxy.

    But the telescope, under construction atop a peak in Chile’s northern Andes, also presents an unprecedented challenge for astrophysicists—it will require a complicated scheduling algorithm to determine where to point the telescope as it traces the sky. To Harvard student Daniel Rothchild, that sounded like a puzzle he could solve.

    “This is not a well-studied problem in astrophysics because there has never been a telescope that behaved like this,” said Rothchild, A.B. ’17, a physics concentrator who is pursuing a secondary in computer science at the John A. Paulson School of Engineering and Applied Sciences. “But scheduling is a well-studied problem in computer science. It is very important that the scheduler be effective, or the telescope is not going to be looking at the places that will yield the best data.”

    Working with Christopher Stubbs, Samuel C. Moncher Professor of Physics and Astronomy, who is a contributor to the LSST project, Rothchild launched an independent research project to develop a scheduling algorithm that would be effective in this unique situation.

    LSST/Camera, built at SLAC
    LSST Interior
    LSST telescope, currently under construction at Cerro Pachón Chile

    The LSST, which will image the entire night sky every three days, will stop at each point for 30 seconds before moving on to a new field. A more complicated algorithm means longer calculation time, which could easily bog down the telescope’s progress. The algorithm must also overcome the challenge of determining the “best” place for the telescope to look when there are literally 10 billion possibilities.

    “How do you decide if Milky Way astronomy is more important than asteroid science on this particular 30-second exposure?” Rothchild asked. “It’s very difficult for scientists to say, here’s an exact quantification of how important these different areas are.”

    Rather than using machine-learning or mathematical merit functions to determine the ideal next field, Rothchild is writing code that will give the telescope a baseline optimal path to follow, along with instructions for how to respond when faced with adverse weather and unexpected downtime.

    Programming a set path for the entire 10-year span of the project allows scientists to explicitly optimize global properties of the telescope’s data, instead of hoping the merit functions or machine-learning algorithms will perform those optimizations themselves, he said. It also eliminates the headaches of trying to determine why the computer pointed the telescope at a certain location, or troubleshooting a machine-learning algorithm that seems to be aiming the telescope far off the best course.

    “There are certain astronomical elements that are fixed, even 10 years out. We know the moon will be moving a certain way and the stars will appear in specific patterns and locations, and we also know the meridian is generally the best place to point the telescope because there is the least amount of air overhead,” he said. “By programming these considerations into the scheduler explicitly, I hope to create an algorithm that will produce better schedules than those produced with existing methods.”

    His code lays out a path for the telescope to follow using a combination of astronomical data and meteorological predictions. Rothchild’s method involves much faster calculations than other scheduler algorithms because there are no machine-learning elements.
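The fixed astronomical considerations mentioned above (meridian preference, revisit cadence) can be illustrated with a toy greedy step. Everything here is hypothetical, a sketch of the general idea rather than Rothchild's actual scheduler: the scoring weights, field attributes, and function names are invented for illustration.

```python
# Toy illustration of one scheduling decision: among visible fields,
# prefer the one nearest the meridian (least air overhead), traded off
# against how long ago each field was last observed.

def pick_next_field(fields, now):
    """fields: list of dicts with 'id', 'hour_angle' (hours from the
    meridian), and 'last_observed' (time units). Returns the id of the
    field with the best (lowest) score."""
    def score(f):
        airmass_penalty = abs(f["hour_angle"])      # 0 at the meridian
        staleness_bonus = now - f["last_observed"]  # revisit stale fields
        return airmass_penalty - 0.1 * staleness_bonus  # invented weight
    return min(fields, key=score)["id"]

fields = [
    {"id": "A", "hour_angle": 0.2, "last_observed": 90},
    {"id": "B", "hour_angle": 1.5, "last_observed": 10},
    {"id": "C", "hour_angle": 0.3, "last_observed": 50},
]
print(pick_next_field(fields, now=100))  # "B": staleness outweighs air mass
```

Repeating such a step over a simulated decade of nights, with fixed weights, is what makes the resulting schedule reproducible and auditable, in contrast to a machine-learning scheduler whose choices are harder to trace.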

    Several other researchers are working on schedulers, and all have taken a slightly different approach. Once the telescope hardware is complete, the LSST leadership team will test each scheduler and select the one to use.

    Though he still has six years to wait before the LSST has its eye on the sky, Rothchild is excited for the opportunity to contribute to such a significant astrophysics project.

    “The LSST will produce about 15 terabytes of data each night for 10 years. By comparison, the Hubble telescope produces 10 terabytes of data in one year,” he said. “This project is going to enable scientists to take precision measurements of the universe in an unprecedented way. It is very cool to be a part of that.”

    Currently under construction in Chile, the LSST will incorporate the world’s largest digital camera. (Photo credit: LSST.)

    See the full article here.


    Through research and scholarship, the Harvard School of Engineering and Applied Sciences (SEAS) will create collaborative bridges across Harvard and educate the next generation of global leaders. By harnessing the power of engineering and applied sciences we will address the greatest challenges facing our society.

    Specifically, that means that SEAS will provide to all Harvard College students an introduction to and familiarity with engineering and technology as this is essential knowledge in the 21st century.

    Moreover, our concentrators will be immersed in the liberal arts environment and be able to understand the societal context for their problem solving, capable of working seamlessly with others, including those in the arts, the sciences, and the professional schools. They will focus on the fundamental engineering and applied science disciplines for the 21st century; we will not teach legacy 20th-century engineering disciplines.

    Instead, our curriculum will be rigorous but inviting to students, and be infused with active learning, interdisciplinary research, entrepreneurship and engineering design experiences. For our concentrators and graduate students, we will educate “T-shaped” individuals – with depth in one discipline but capable of working seamlessly with others, including arts, humanities, natural science and social science.

    To address current and future societal challenges, knowledge from fundamental science, art, and the humanities must all be linked through the application of engineering principles with the professions of law, medicine, public policy, design and business practice.

    In other words, solving important issues requires a multidisciplinary approach.

    With the combined strengths of SEAS, the Faculty of Arts and Sciences, and the professional schools, Harvard is ideally positioned to both broadly educate the next generation of leaders who understand the complexities of technology and society and to use its intellectual resources and innovative thinking to meet the challenges of the 21st century.

    Ultimately, we will provide to our graduates a rigorous quantitative liberal arts education that is an excellent launching point for any career and profession.

  • richardmitnick 3:20 pm on September 5, 2014 Permalink | Reply
    Tags: Large Synoptic Survey Telescope (LSST)

    From Quanta via FNAL: “A Digital Copy of the Universe, Encrypted” 2013 

    Quanta Magazine

    October 2, 2013
    Natalie Wolchover

    Even as he installed the landmark camera that would capture the first convincing evidence of dark energy in the 1990s, Tony Tyson, an experimental cosmologist now at the University of California, Davis, knew it could be better. The camera’s power lay in its ability to collect more data than any other. But digital image sensors and computer processors were progressing so rapidly that the amount of data they could collect and store would soon be limited only by the size of the telescopes delivering light to them, and those were growing too. Confident that engineering trends would hold, Tyson envisioned a telescope project on a truly grand scale, one that could survey hundreds of attributes of billions of cosmological objects as they changed over time.

    It would record, Tyson said, “a digital, color movie of the universe.”

    Tyson’s vision has come to life as the Large Synoptic Survey Telescope (LSST) project, a joint endeavor of more than 40 research institutions and national laboratories that has been ranked by the National Academy of Sciences as its top priority for the next ground-based astronomical facility. Set on a Chilean mountaintop, and slated for completion by the early 2020s, the 8.4-meter LSST will be equipped with a 3.2-billion-pixel digital camera that will scan 20 billion cosmological objects 800 times apiece over the course of a decade. That will generate well over 100 petabytes of data that anyone in the United States or Chile will be able to peruse at will. Displaying just one of the LSST’s full-sky images would require 1,500 high-definition TV screens.


    LSST Exterior, Camera, Interior

    The LSST epitomizes the new era of big data in physics and astronomy. Less than 20 years ago, Tyson’s cutting-edge digital camera filled 5 gigabytes of disk space per night with revelatory information about the cosmos. When the LSST begins its work, it will collect that amount every few seconds — literally more data than scientists know what to do with.

    Tony Tyson, an experimental cosmologist at the University of California, Davis, with a small test camera for the Large Synoptic Survey Telescope project, which he is helping to launch.
    Peter DaSilva for Quanta Magazine

    “The data volumes we [will get] out of LSST are so large that the limitation on our ability to do science isn’t the ability to collect the data, it’s the ability to understand the systematic uncertainties in the data,” said Andrew Connolly, an astronomer at the University of Washington.

    Typical of today’s costly scientific endeavors, hundreds of scientists from different fields are involved in designing and developing the LSST, with Tyson as chief scientist. “It’s sort of like a federation,” said Kirk Borne, an astrophysicist and data scientist at George Mason University. The group comprises nearly 700 astronomers, cosmologists, physicists, engineers and data scientists.

    Much of the scientists’ time and about one-half of the $1 billion cost of the project are being spent on developing software rather than hardware, reflecting the exponential growth of data since the astronomy projects of the 1990s. For the telescope to be useful, the scientists must answer a single question. As Borne put it: “How do you turn petabytes of data into scientific knowledge?”

    Physics has been grappling with huge databases longer than any other field of science because of its reliance on high-energy machines and enormous telescopes to probe beyond the known laws of nature. This has given researchers a steady succession of models upon which to structure and organize each next big project, in addition to providing a starter kit of computational tools that must be modified for use with ever larger and more complex data sets.

    Even backed by this tradition, the LSST tests the limits of scientists’ data-handling abilities. It will be capable of tracking the effects of dark energy, which is thought to make up a whopping 68 percent of the total contents of the universe, and mapping the distribution of dark matter, an invisible substance that accounts for an additional 27 percent. And the telescope will cast such a wide and deep net that scientists say it is bound to snag unforeseen objects and phenomena too. But many of the tools for disentangling them from the rest of the data don’t yet exist.

    New Dimensions

    Particle physics is the elder statesman of big data science. For decades, high-energy particle accelerators (http://en.wikipedia.org/wiki/Particle_accelerator) have been bashing particles together millions of times per second in hopes of generating exotic, never-before-seen particles. These facilities, such as the Large Hadron Collider (LHC) at the CERN laboratory in Switzerland, generate so much data that only a tiny fraction (deemed interesting by an automatic selection process) can be kept. A network of hundreds of thousands of computers spread across 36 countries, called the Worldwide LHC Computing Grid, stores and processes the 25 petabytes of LHC data that were archived in a year’s worth of collisions. The work of thousands of physicists went into finding the bump in that data that last summer was deemed representative of a new subatomic particle, the Higgs boson.

    CERN, the organization that operates the LHC, is sharing its wisdom by working with other research organizations “so they can benefit from the knowledge and experience that has been gathered in data acquisition, processing and storage,” said Bob Jones, head of CERN openlab, which develops new IT technologies and techniques for the LHC. Scientists at the European Space Agency, the European Molecular Biology Laboratory, other physics facilities and even collaborations in the social sciences and humanities have taken cues from the LHC on data handling, Jones said.

    When the LHC turns back on in 2014 or 2015 after an upgrade, higher energies will mean more interesting collisions, and the amount of data collected will grow by a significant factor. But even though the LHC will continue to possess the biggest data set in physics, its data is much simpler than those obtained from astronomical surveys such as the Sloan Digital Sky Survey and Dark Energy Survey and — to an even greater extent — those that will be obtained from future sky surveys such as the Square Kilometer Array, a radio telescope project set to begin construction in 2016, and the LSST.

    Sloan Digital Sky Survey (SDSS) Telescope

    SKA Square Kilometer Array
    “The LHC generates a lot more data right at the beginning, but they’re only looking for certain events in that data and there’s no correlation between events in that data,” said Jeff Kantor, the LSST data management project manager. “Over time, they still build up large sets, but each one can be individually analyzed.”

    In combining repeat exposures of the same cosmological objects and logging hundreds rather than a handful of attributes of each one, the LSST will have a whole new set of problems to solve. “It’s the complexity of the LSST data that’s a challenge,” Tyson said. “You’re swimming around in this 500-dimensional space.”

    From color to shape, roughly 500 attributes will be recorded for every one of the 20 billion objects surveyed, and each attribute is treated as a separate dimension in the database. Merely cataloguing these attributes consistently from one exposure of a patch of the sky to the next poses a huge challenge. “In one exposure, the scene might be clear enough that you could resolve two different galaxies in the same spot, but in another one, they might be blurred together,” Kantor said. “You have to figure out if it’s one galaxy or two or N.”

    Beyond N-Squared

    To tease scientific discoveries out of the vast trove of data gathered by the LSST and other sky surveys, scientists will need to pinpoint unexpected relationships between attributes, which is extremely difficult in 500 dimensions. Finding correlations is easy with a two-dimensional data set: If two attributes are correlated, then there will be a one-dimensional curve connecting the data points on a two-dimensional plot of one attribute versus the other. But additional attributes plotted as extra dimensions obscure such curves. “Finding the unexpected in a higher-dimensional space is impossible using the human brain,” Tyson said. “We have to design future computers that can in some sense think for themselves.”

    Algorithms exist for “reducing the dimensionality” of data, or finding surfaces on which the data points lie (like that 1-D curve in the 2-D plot), in order to find correlated dimensions and eliminate “nuisance” ones. For example, an algorithm might identify a 3-D surface of data points coursing through a database, indicating that three attributes, such as the type, size and rotation speed of galaxies, are related. But when swamped with petabytes of data, the algorithms take practically forever to run.
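The simplest version of "finding correlated dimensions" is a pairwise scan for linear correlation. The sketch below is only illustrative: real survey analysis needs nonlinear, scalable dimensionality-reduction methods, and the attribute names and threshold here are invented.

```python
# Toy scan for linearly correlated attribute pairs via the Pearson
# correlation coefficient. With 500 attributes this is ~125,000 pairs,
# cheap compared with scanning the objects themselves.
import math

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def correlated_pairs(attributes, threshold=0.9):
    """attributes: dict mapping attribute name -> list of values
    (one value per object). Returns strongly correlated name pairs."""
    names = sorted(attributes)
    return [(a, b) for i, a in enumerate(names) for b in names[i + 1:]
            if abs(pearson(attributes[a], attributes[b])) > threshold]

attrs = {
    "size":     [1.0, 2.0, 3.0, 4.0, 5.0],
    "rotation": [2.1, 3.9, 6.2, 7.8, 10.1],  # roughly 2x size: correlated
    "colour":   [0.3, 0.9, 0.1, 0.7, 0.4],   # unrelated noise
}
print(correlated_pairs(attrs))  # finds the rotation-size relationship
```

A curved (nonlinear) relationship would slip past this linear test entirely, which is one reason the more powerful tools the article describes are needed.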

    Identifying correlated dimensions is exponentially more difficult than looking for a needle in a haystack. “That’s a linear problem,” said Alex Szalay, a professor of astronomy and computer science at Johns Hopkins University. “You search through the haystack and whatever looks like a needle you throw in one bucket and you throw everything else away.” When you don’t know what correlations you’re looking for, however, you must compare each of the N pieces of hay with every other piece, which takes N-squared operations.

    Adding to the challenge is the fact that the amount of data is doubling every year. “Imagine we are working with an algorithm that if my data doubles, I have to do four times as much computing and then the following year, I have to do 16 times as much computing,” Szalay said. “But by next year, my computers will only be twice as fast, and in two years from today, my computers will only be four times as fast, so I’m falling farther and farther behind in my ability to do this.”
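Szalay's arithmetic can be made explicit. Under the stated assumptions (data doubling yearly, an O(N²) algorithm, and hardware speed doubling yearly), the required runtime itself doubles every year:

```python
# Worked version of the scaling argument above. All growth rates are the
# article's assumptions, not measurements.

def relative_runtime(years):
    data = 2 ** years    # data volume doubles each year
    work = data ** 2     # an O(N^2) algorithm: 4x the operations
    speed = 2 ** years   # computers only get ~2x faster each year
    return work / speed  # runtime relative to today

for y in range(4):
    print(y, relative_runtime(y))  # the gap doubles: 1, 2, 4, 8
```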

    A huge amount of research has gone into developing scalable algorithms, with techniques such as compressed sensing, topological analysis and the maximal information coefficient emerging as especially promising tools of big data science. But more work remains to be done before astronomers, cosmologists and physicists will be ready to fully exploit the multi-petabyte digital movie of the universe that premieres next decade. Progress is hampered by the fact that researchers in the physical sciences get scant academic credit for developing algorithms — a problem that the community widely recognizes but has yet to solve.

    “It’s always been the case that the people who build the instrumentation don’t get as much credit as the people who use the instruments to do the cutting-edge science,” Connolly said. “Ten years ago, it was people who built physical instruments — the cameras that observe the sky — and today, it’s the people who build the computational instruments who don’t get enough credit. There has to be a career path for someone who wants to work on the software — because they can go get jobs at Google. So if we lose these people, it’s the science that loses.”

    Coffee and Kudos

    In December 2010, in an effort to encourage the development of better algorithms, an international group of astronomers issued a challenge to computer geeks everywhere: What is the best way to measure gravitational lensing, or the distorting effect that dark matter has on the light from distant galaxies? David Kirkby read about the GREAT10 (GRavitational lEnsing Accuracy Testing 2010) Challenge on Wired.com and decided to give it a go.

    David Kirkby, a physicist at the University of California, Irvine, holds an observing plate designed to capture data for a specific circular patch of the sky. Peter DaSilva for Quanta Magazine

    Kirkby, a physicist at the University of California, Irvine, and his graduate student won the contest using a modified version of a neural network algorithm that he had previously developed for the BABAR experiment, a large physics collaboration investigating the asymmetry of matter and antimatter. The victory earned Kirkby a co-author credit on the recent paper detailing the contest, easing his switch from the field of particle physics to astrophysics. Also, with the prize money, “we bought a top of the line espresso machine for the lab,” he said.

    GREAT10 was one of a growing number of “data challenges” designed to find solutions to specific problems faced in creating and analyzing large physics and astronomy databases, such as the best way to reconstruct the shapes of two galaxies that are aligned relative to Earth and so appear blended together.

    “One group produces a set of data — it could be blended galaxies — and then anybody can go out and try and estimate the shape of the galaxies using their best algorithm,” explained Connolly, who is involved in generating simulations of future LSST images that are used to test the performance of algorithms. “It’s quite a lot of kudos to the person who comes out on top.”

    Many of the data challenges, including the GREAT series, focus on teasing out the effects of dark matter. When light from a distant galaxy travels to Earth, it is bent, or “lensed,” by the gravity of the dark matter it passes through. “It’s a bit like looking at wallpaper through a bathroom window with a rough surface,” Kirkby said. “You determine what the wallpaper would look like if you were looking at it directly, and you use that information to figure out what the shape of the glass is.”
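    Estimating the “shape of the glass” begins with measuring galaxy shapes. One textbook building block (not Kirkby’s neural-network method, which the article doesn’t detail) is the ellipticity estimator built from flux-weighted second moments. A minimal numpy sketch, with a toy Gaussian “galaxy” standing in for real data:

```python
import numpy as np

def ellipticity(img):
    """Estimate (e1, e2) from flux-weighted second moments of an image."""
    y, x = np.indices(img.shape, dtype=float)
    flux = img.sum()
    xc, yc = (x * img).sum() / flux, (y * img).sum() / flux   # centroid
    dx, dy = x - xc, y - yc
    qxx = (dx * dx * img).sum() / flux
    qyy = (dy * dy * img).sum() / flux
    qxy = (dx * dy * img).sum() / flux
    return (qxx - qyy) / (qxx + qyy), 2.0 * qxy / (qxx + qyy)

# Toy elliptical Gaussian "galaxy", elongated along the x axis
y, x = np.indices((64, 64), dtype=float)
galaxy = np.exp(-0.5 * (((x - 32) / 8) ** 2 + ((y - 32) / 4) ** 2))
e1, e2 = ellipticity(galaxy)
print(e1, e2)   # e1 well above zero (x-elongation), e2 near zero
```

    Weak lensing by dark matter shifts these measured ellipticities by a tiny, spatially correlated amount; the challenge algorithms compete on recovering that shift in the presence of noise and blur.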

    Each new data challenge in a series includes an extra complication — additional distortions caused by atmospheric turbulence or a faulty amplifier in one of the detectors, for example — moving the goal posts of the challenge closer and closer to reality.

    Data challenges are “a great way of crowd-sourcing problems in data science, but I think it would be good if software development was just recognized as part of your productivity as an academic,” Kirkby said. “At career reviews, you measure people based on their scientific contributions even though software packages could have a much broader impact.”

    The culture is slowly changing, the scientists said, as the ability to analyze data becomes an ever-tightening bottleneck in research. “In the past, it was usually some post-doc or grad student poring over data who would find something interesting or something that doesn’t seem to work and stumble across some new effect,” Tyson said. “But increasingly, the amount of data is so large that you have to have machines with algorithms to do this.”

    Dark Side of the Universe

    Assuming that physicists can solve the computing problems they face with the LSST, the results could be transformative. There are many reasons to want a 100-petabyte digital copy of the universe. For one, it would help map the expansion of space and time caused by the still-mysterious dark energy, discovered with the help of the LSST’s predecessor, the Big Throughput Camera, which Tyson and a collaborator built in 1996.

    When that camera, which could cover a patch of the sky the size of a full moon in a single exposure, was installed on the Blanco Telescope in Chile, astrophysicists immediately discovered dozens of exploding stars called Type Ia supernovae strewn across the sky that revealed that most stuff in the universe is unknown. Light from nearby supernovae appeared to have stretched more than it should have during its journey through the expanding cosmos compared with light from faraway ones. This suggested that the expansion of the universe had recently sped up, driven by dark energy.

    CTIO Victor M Blanco 4m Telescope
    CTIO Victor M Blanco 4m Telescope interior

    With the LSST, scientists hope to precisely track the accelerating expansion of the universe and thus to better define the nature of dark energy. They aim to do this by mapping a sort of cosmic yardstick called baryon acoustic oscillations. The yardstick was created from sound waves that rippled through the universe when it was young and hot and became imprinted in the distribution of galaxies as it cooled and expanded. The oscillations indicate the size of space at every distance away from Earth — and thus at any point back in time.

    Baryon acoustic oscillations are so enormous that a truly vast astronomical survey is needed to make them a convenient measuring tool. By cataloguing billions of galaxies, the LSST promises to measure the size of these resonances more accurately than any other existing or planned astronomical survey. “The idea is that with the LSST, we will have onion shells of galaxies at different distances and we can look for this pattern and trace the size of the resonant patterns as a function of time,” Szalay said. “This will be beautiful.”

    But, Szalay added, “it will be a nontrivial task to actually milk the information out of the data.”
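    The yardstick itself has a calculable length: the comoving sound horizon at the drag epoch, obtained by integrating the baryon-photon sound speed over the expansion history, r_s = ∫ c_s(z)/H(z) dz from z_drag to infinity. A rough numerical sketch, using illustrative Planck-like parameters that are not from the article:

```python
import math
from scipy.integrate import quad

# Illustrative, roughly Planck-like parameters (not from the article)
c = 299792.458              # speed of light, km/s
H0 = 67.7                   # Hubble constant, km/s/Mpc
h = H0 / 100.0
Om, Ob = 0.31, 0.048        # total matter and baryon fractions
Og = 2.47e-5 / h**2         # photon density fraction
Orad = 1.68 * Og            # radiation incl. massless neutrinos (approx.)
OL = 1.0 - Om - Orad        # dark energy fraction (flat universe)
z_drag = 1060.0             # approximate drag epoch redshift

def H(z):
    """Hubble rate in km/s/Mpc."""
    return H0 * math.sqrt(Om * (1 + z)**3 + Orad * (1 + z)**4 + OL)

def cs(z):
    """Baryon-photon sound speed: c / sqrt(3 (1 + R))."""
    R = 3.0 * Ob / (4.0 * Og * (1.0 + z))
    return c / math.sqrt(3.0 * (1.0 + R))

r_s, _ = quad(lambda z: cs(z) / H(z), z_drag, 1e7, limit=200)
print(round(r_s), "Mpc")   # about 147 Mpc for these parameters
```

    Tracing this roughly 150-megaparsec pattern in shells of galaxies at different distances is the measurement Szalay describes.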

    See the full article here.

    Formerly known as Simons Science News, Quanta Magazine is an editorially independent online publication launched by the Simons Foundation to enhance public understanding of science. Why Quanta? Albert Einstein called photons “quanta of light.” Our goal is to “illuminate science.” At Quanta Magazine, scientific accuracy is every bit as important as telling a good story. All of our articles are meticulously researched, reported, edited, copy-edited and fact-checked.

    ScienceSprings relies on technology from

    MAINGEAR computers



  • richardmitnick 10:23 pm on August 4, 2014 Permalink | Reply
    Tags: , , , , , Large Synoptic Survey Telescope (LSST)   

    From AURA: “AURA Awarded Support by the National Science Foundation To Begin Constructing LSST” 

    AURA Icon
    Association of Universities for Research in Astronomy

    August 4, 2014

    The National Science Foundation (NSF) agreed on Friday to support the Association of Universities for Research in Astronomy (AURA) to manage the construction of the Large Synoptic Survey Telescope (LSST).

    LSST Telescope

    This marks the official federal start of the LSST project, the top-ranked major ground-based facility recommended by the National Research Council’s Astronomy and Astrophysics decadal survey committee in its 2010 report, New Worlds, New Horizons. It is being carried out as an NSF and Department of Energy (DOE) partnership, with NSF responsible for the telescope and site, education & outreach, and the data management system, and DOE providing the camera and related instrumentation. Both agencies expect to support post-construction operation of the observatory.

    The NSF construction budget for LSST is not to exceed $473M. The DOE Camera fabrication budget will be baselined later this year, but is estimated to be $165M. Operations costs will be around $40M per year for the ten-year survey. With the approved start occurring now, LSST will see first light in 2019 and begin full science operations in 2022. Today’s action culminates over ten years of developing, planning and reviewing of the LSST concept.

    LSST Project Manager, Victor Krabbendam, was delighted to receive the welcome news from NSF: “This agreement is a tribute to the hard work of an exceptional team of highly skilled individuals, many of whom have dedicated more than a decade to bringing LSST to this point. After a rigorous design and development phase, the project team is ready to get down and dirty and actually build this amazing facility.”

    LSST Director, Steven Kahn of Stanford University, commented on the unique contributions LSST will make to astronomy and fundamental physics: “The broad range of science enabled by the LSST survey will change our understanding of the dynamic Universe on timescales ranging from its earliest moments after the Big Bang to the motions of asteroids in the solar system today. The open nature of our data products means that the public will have the opportunity to share in this exciting adventure along with the scientific community. The most exciting discoveries will probably be those we haven’t yet even envisioned!”

    William Smith, the President of AURA, expressed his enthusiasm for AURA’s role in the Project: “AURA is proud to provide management for the construction of LSST, an activity clearly aligned with our mission to promote excellence in astronomical research by providing access to state-of-the-art facilities. Joining the Space Telescope Science Institute, the National Solar Observatory, the National Optical Astronomy Observatory, and the Gemini Telescope as AURA Centers, LSST is a new paradigm in ground-based astronomy that will revolutionize both our cosmic knowledge and the open and collaborative methods of acquiring that knowledge.”

    By digitally imaging the sky for a decade, the LSST will produce a petabyte-scale database enabling new paradigms of knowledge discovery for transformative STEM education. LSST will address the most pressing questions in astronomy and physics, which are driving advances in big data science and computing. LSST is not “just another telescope” but a truly unique discovery engine.

    The early development of LSST was supported by the LSST Corporation (LSSTC), a non-profit consortium of universities and other research institutions. Fabrication of the major mirror components is already underway, thanks to private funding received from the Charles and Lisa Simonyi Foundation for Arts and Sciences, Bill Gates, and other individuals. Receipt of federal construction funds allows major contracts to move forward, including those to build the telescope mount assembly, the figuring of the secondary mirror, the summit facility construction, the focal plane sensors, and the camera lenses.

    LSST’s construction funding will be provided through NSF’s Major Research Equipment and Facilities (MREFC) account. LSST passed its NSF Final Design Review in December of 2013; the National Science Board gave the NSF conditional approval to move the project to construction status in May of 2014. On the DOE side, LSST received Critical Decision-1 approval (CD-1) in 2011 and also just received CD-3a approval, which allows the project to move forward with long-lead procurements. The CD-2 review will take place the first week in November, with approval expected shortly afterward, formally fixing the baseline budget for completion of the camera project. The Particle Physics Project Prioritization Panel (P5), an advisory subpanel of the High Energy Physics Advisory Panel (HEPAP), recommended last month that DOE move forward with LSST under all budget scenarios, even the most pessimistic.

    The Association of Universities for Research in Astronomy (AURA) is a consortium of 39 US institutions and 6 international affiliates that operates world-class astronomical observatories. AURA’s role is to establish, nurture, and promote public observatories and facilities that advance innovative astronomical research. In addition, AURA is deeply committed to public and educational outreach, and to diversity throughout the astronomical and scientific workforce. AURA carries out its role through its astronomical facilities: http://www.aura-astronomy.org




  • richardmitnick 11:29 am on July 24, 2014 Permalink | Reply
    Tags: , , , , , Large Synoptic Survey Telescope (LSST)   

    From Brookhaven Lab: “Instrumentation Division Nears Production Phase for LSST Camera Sensors” 

    Brookhaven Lab

    July 21, 2014
    Rebecca Harrington

    Precision assembly is required to capture the clearest and most extensive picture of the cosmos

    A single sensor for the world’s largest digital camera detected light making its way through wind, air turbulence, and Earth’s atmosphere, successfully converting the light into a glimpse of the galactic wonders that this delicate instrument will eventually capture as it scans the night sky. When installed in the camera of the Large Synoptic Survey Telescope (LSST), these sensors will convert light captured from distant galaxies into digital information that will provide unprecedented insight into our understanding of the universe.

    Design Engineer Justine Haupt (left) and Postdoctoral Research Associate Dajun Huang (right) prepare a test chamber that scientists in the Instrumentation Division are using to evaluate the digital sensors they are designing for the Large Synoptic Survey Telescope, which is scheduled to see “first light” in 2020, and start surveying in 2022.

    LSST Telescope

    But the sensor wasn’t on the telescope yet; it was in a clean room at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory. And the “atmosphere” was being projected from a custom piece of glass made to replicate what the sensor will actually see once it is part of the camera inside the LSST, which every three days will survey the entire night sky visible from its location atop a mountain in Chile. The meticulous laboratory test at Brookhaven was one of many that scientists in the Lab’s Instrumentation Division are conducting on the 201 sensors they are designing for the digital “film” of the telescope’s camera.

    Scheduled to see “first light” in 2020, and start surveying in 2022, the LSST will ultimately survey 20 billion galaxies and 17 billion stars in a 10-year period. In working on sensors for the camera, Brookhaven is partnering with dozens of public and private organizations, including universities, national laboratories, and Google, Inc., to make the LSST a reality. The project is jointly sponsored by the National Science Foundation (NSF) and DOE’s Office of Science. NSF leads the overall LSST effort, while DOE is responsible for providing the camera, with the DOE-supported effort led by the SLAC National Accelerator Laboratory.

    I think it will be an important chapter in the history of physics.
    Paul O’Connor, Brookhaven Senior Scientist leading the LSST camera team at Brookhaven

    The data gathered from those distant galaxies will offer scientists insight into the seemingly unreal: the dark matter and dark energy that in fact comprise more than 95 percent of our universe (the planets, stars, and other visible matter making up a mere 5 percent). Dark energy, the mysterious force that is accelerating the universe’s expansion, only manifests itself by its effects on large-scale cosmic structures. Dark matter, invisible on its own, can be measured by observing how light bends around it. Understanding these strange concepts and their role in cosmic acceleration is among the “science drivers” recently identified by a panel reviewing priorities in particle physics, which recommended that DOE’s work on the LSST camera go forward no matter what funding scenario the field may face.

    “This question of dark energy and dark matter is so compelling,” said Senior Scientist Paul O’Connor, who’s leading the LSST camera team at Brookhaven. “There’s incontrovertible evidence that these are the major constituents of the universe; they don’t fit into the rest of physics.”

    LSST’s incredible precision and sensitivity will give scientists access to both.

    To unlock the mysteries of dark energy, LSST needs to be able to measure redshift, a phenomenon observed when the wavelengths of light emitted by galaxies receding at the distant edges of space appear to stretch out, or shift to the red end of the light spectrum. Most galaxies to be detected by LSST are faint and far away, at the limits of current sensor technology for measuring redshifts. So O’Connor said his team needed to design the LSST camera sensors with a much thicker layer of silicon and entirely new electronics.
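    Redshift itself is a simple quantity once a known spectral feature is identified: the fractional stretching of its wavelength. The hard part for LSST is that most of its faint galaxies will yield only broad-band fluxes rather than clean spectra, so redshifts must be estimated statistically. A sketch of the basic definition (the wavelengths are illustrative):

```python
def redshift(lambda_observed, lambda_rest):
    """z = (observed - rest) / rest; wavelengths stretch by (1 + z)."""
    return (lambda_observed - lambda_rest) / lambda_rest

# Hydrogen-alpha is emitted at 656.3 nm; suppose we observe it at 984.45 nm
z = redshift(984.45, 656.3)
print(z)   # 0.5: every wavelength from this galaxy arrives 1.5x longer
```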

    “Making a contribution on the experimental end exploring these phenomena is quite satisfying,” O’Connor said. “I think it will be an important chapter in the history of physics.”

    But the LSST won’t just be for scientists. The general public will be able to access its images through planned projects such as adopting a patch of sky to monitor and track changes, and interacting with a time-lapse movie shown in science centers depicting a decade of observation. The telescope’s imaging powers will also join the host of other instruments used to detect exploding supernovae, and asteroids that could hit our planet, giving scientists more warning before they come close to Earth.
    Building the World’s Largest Digital Camera

    A design of a single raft tower housing the charge-coupled devices (CCDs) — sensors that convert light captured by the telescope into an electrical charge representing a specific detail that a computer can turn into a digital picture. The full camera will have 21 raft towers.

    The LSST sensors that the Brookhaven scientists are designing, building, and testing are known as charge-coupled devices (CCDs). Each pixel on a CCD converts light captured by the telescope into an electrical charge representing a specific detail that a computer can turn into a digital picture. LSST’s CCDs will capture deep space in unprecedented detail: together they form a 3.2-gigapixel focal plane, with nearly 200 times more pixels than a high-end consumer camera.

    Each CCD operates individually, but they will all work together to render a complete image. Nine CCDs sit in a “raft,” or support structure, with their electronics packed underneath. The modularity that the LSST gains because of these rafts will allow for the incredibly quick sky surveys — reading 3 billion pixels in 2 seconds. It will also enable easier telescope maintenance since scientists can fix a single CCD instead of fixing the whole system, which will come in handy when the rafts are housed in a vacuum chamber kept at -100 degrees Celsius inside the telescope.
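    The figures in this section hang together arithmetically: 21 rafts of nine CCDs give the roughly 3.2-gigapixel focal plane, and reading it out in 2 seconds implies a rate well above a gigapixel per second. A back-of-the-envelope check (the 4096-by-4096 per-CCD format is an assumption chosen to match the quoted totals, not a figure from the article):

```python
# Back-of-the-envelope check on the camera figures quoted in the text.
# The 4096 x 4096 per-CCD format is an assumption chosen to match totals.
rafts = 21
ccds_per_raft = 9
pixels_per_ccd = 4096 * 4096            # ~16.8 megapixels per sensor

total_pixels = rafts * ccds_per_raft * pixels_per_ccd
readout_seconds = 2.0

print(f"{total_pixels / 1e9:.1f} gigapixels")                  # ~3.2
print(f"{total_pixels / readout_seconds / 1e9:.1f} GPix/s")    # ~1.6
```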

    The modularity of the rafts will also be a benefit during the installation and testing of the telescope base. Typically, when scientists build a telescope, they use a placeholder camera to test whether the mount and optics are working properly. Later, they install the full camera and sensors, after those instruments have undergone their own functional tests. But O’Connor said the LSST team will be able to use a single raft for initial testing on the mountain, allowing the scientists to measure the success of these components on the telescope itself.

    “We’re now finding some of the instrument effects emerging as we put the CCDs together at the laboratory phase, so we can prepare the type of software we need now,” O’Connor said. “But the sky tells you things you can’t easily measure in the lab.”

    To capture the clearest and most extensive picture of the cosmos, the CCDs must lie perfectly flat and have no more than a 250 micron (millionth of a meter) space between them. This requires painstaking assembly at Brookhaven, but at some point the sensors have to get to California to join the other parts, and then to Chile for operation. Mechanical engineers at Brookhaven are designing a stabilized shipping container to transport the sensitive CCDs across the country and continents.

    By the end of 2014, O’Connor said, his team hopes to have the first fully functional raft completed and tested. After that, he said, it will take four years to build and test the rest of the CCD rafts, which is on track to meet the “first light” deadline.

    “We have a well-defined job now. We can do our part while the other teams building the rest of the LSST do theirs,” O’Connor said. “This is a big project. This is the way science is going to solve big problems.”

    For more information, go to http://www.lsst.org.

    DOE’s Office of High Energy Physics funds the LSST camera development.

    See the full article here.

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.


  • richardmitnick 1:38 pm on June 27, 2014 Permalink | Reply
    Tags: , , , , Large Synoptic Survey Telescope (LSST),   

    From Symmetry: “Getting the jump on big data for LSST” 


    June 27, 2014
    Lori Ann White

    Efforts are already underway to ensure that the data the Large Synoptic Survey Telescope collects will be ready to be mined for scientific gold.

    LSST Telescope

    On the first night the Large Synoptic Survey Telescope points its 8.4-meter mirror toward the exquisitely dark skies over Chile—probably in the year 2022—its 3.2-billion-pixel camera will record 30 trillion bytes of data about the contents of the universe. On the second night, LSST will do it again. It will do it again and again, collecting unprecedented amounts of data every clear night for 10 years.

    By the end of its proposed mission, the LSST camera, designed and constructed at SLAC National Accelerator Laboratory, will have captured a full picture of the southern sky hundreds of times over.

    Scientists around the world will search trillions of bytes of LSST data for information about the universe on all scales. They will look for asteroids in the Earth’s backyard; map the Milky Way Galaxy; and study dark energy, the name given to whatever is causing the acceleration of the expansion of the entire universe.

    Cosmic Background Radiation Planck
    CMB Planck

    But getting those terabytes of raw data from the camera processed, polished and to the researchers’ computers in a usable form will be no small task. Cutting-edge computer applications will need to hold the data and mine it for scientific discoveries. These processing and database applications must work together flawlessly.

    Jacek Becla, technology officer for scientific databases at SLAC, leads the group at SLAC constructing the LSST database. Their design recently passed a “stress test” intended to determine whether the software could put more resources to effective use as more was asked of it.

    “We have a very solid prototype,” Becla says. “I’m actually quite confident we’ll be ready for LSST. We just have to stay focused.”

    The LSST processing software, which is being developed by a collaboration led by the Association of Universities for Research in Astronomy, has also proven itself through an ongoing series of “data challenges.” In these challenges, the software is used to analyze data from previous astronomical studies, including nine years of data from the Sloan Digital Sky Survey and a total of 450 nights of data collected over five years by the Legacy Survey at the Canada-France-Hawaii Telescope. The results of the challenges are compared with results from the original surveys, which can highlight bugs and verify that the software does what it’s been written to do.
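    The comparison step in such a challenge boils down to cross-matching catalogs: pairing each source the new software detects with the nearest source in the reference catalog and flagging anything unmatched. A toy sketch of that step (the positions are invented, and real pipelines use proper spherical geometry rather than this flat approximation):

```python
import numpy as np
from scipy.spatial import cKDTree

def crossmatch(cat_new, cat_ref, max_sep):
    """Match each position in cat_new to its nearest neighbor in cat_ref.

    Returns an index into cat_ref per source, or -1 if nothing lies
    within max_sep. Flat-sky toy version; real pipelines use spherical
    geometry and also compare brightnesses, shapes, and blends.
    """
    dist, idx = cKDTree(cat_ref).query(cat_new, k=1)
    idx[dist > max_sep] = -1
    return idx

# Invented positions (degrees): a reference catalog and a rerun whose
# detections are perturbed by a fraction of an arcsecond
ref = np.array([[10.0, -5.0], [10.2, -5.1], [11.0, -4.8]])
rng = np.random.default_rng(0)
new = ref + rng.normal(scale=1e-4, size=ref.shape)
print(crossmatch(new, ref, max_sep=1e-3))   # [0 1 2]: all recovered
```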

    Canada-France-Hawaii Telescope

    “These challenges have been very successful,” says LSST Director Steven Kahn. “They’ve already proved crucial algorithms are as good as—and in some cases better than—the software originally developed for the data.”

    To help spread the wealth, scientists have made all LSST software open-source.

    “The idea was to create software that’s available to the entire astrophysics community,” Kahn says. The Hyper Suprime-Cam, an 870-megapixel camera recently installed and commissioned on Japan’s Subaru Telescope, is already using an early version of LSST’s processing software.

    Subaru Telescope HyperCam
    Hyper Suprime-Cam

    Subaru Telescope

    Meanwhile, Becla wants the database technology to be available to anyone who can put it to good use. “There have already been a lot of inquiries about the software: from Germany, from Brazil, from the United Kingdom,” he says.

    US financial support for the LSST construction comes from the National Science Foundation, the Department of Energy and private funding raised by the LSST Corporation, a non-profit 501(c)3 corporation formed in 2003, with its headquarters in Tucson, Arizona.

    Kahn says he sees their work as an indication that the worlds of “big data” and “high performance”—or supercomputing—are converging.

    “You need high-performance computing to run dark energy simulations; you have the big data you must compare the simulations to; and you have the big database to store the data,” he says. “LSST is a fantastic example.”

    See the full article at http://www.symmetrymagazine.org/article/june-2014/getting-the-jump-on-big-data-for-lsst.

    Symmetry is a joint Fermilab/SLAC publication.


  • richardmitnick 10:58 pm on February 21, 2014 Permalink | Reply
    Tags: , , , , Large Synoptic Survey Telescope (LSST),   

    From SETI: “The Large Synoptic Survey Telescope – A New Way to Scan the Sky” 

    Seth Shostak, Senior Astronomer

    It will be the mother of all telescopes, and you can bet it will do for astronomy what genome sequencing is doing for biology.

    The clumsy, if utilitarian, name of this mirrored monster is Large Synoptic Survey Telescope, or LSST. You can’t use it yet, but a peak in the Chilean Andes has been decapitated to provide a level spot for placement. This robotically operated sky-eye, with an aperture of 8.4 meters, should be up and running six years from now.

    LSST Telescope

    OK, but so what? After all, there are many new telescopes rolling down the pike these days, some of which will boast far larger optics than the LSST.

    The difference is in the way this scope will sponge data from the sky, and distribute it to the world. The LSST will be the first instrument designed from the pedestal up to work fast, to pile up petabytes of data, and to quickly notice any cosmic phenomena that go bump in the night.

    That last point is important. Generally speaking, most stuff you see in the heavens doesn’t change very quickly. The stars look the same from night to night. Nebulae and galaxies are dully immutable, maintaining the same overall appearance for thousands or millions of years. Indeed, only the Sun, moon and planets – together with the occasional comet, asteroid or meteor – seem dynamic.

    The principal reason for the universe’s poker face is that its constituents are far away. Stars careen through space, and galaxies spin at speeds thousands of times faster than a jet plane. But given their distance, you’d need the patience of Job to notice much change in their appearance or position.

    Nonetheless, we know of celestial circumstances that do change quickly. Stars can explode in minutes. Nearby asteroids capable of cratering your neighborhood can traverse the sky in hours. And surely the most interesting of all are the things we don’t know about: fast phenomena that have escaped our attention simply because astronomers have always used still cameras to photograph the cosmos.

    According to Mario Juric, the LSST’s Data Management Project Scientist, this new telescope will sport a massive, three-ton digital camera with a wide enough field of view to snap photos of the entire southern sky roughly every three days. Since the current plan is to operate the LSST for at least a decade, that means every object visible to this instrument will be imaged nearly a thousand times. Of course, those photos can be viewed in sequence, like a time-lapse film. As Juric says, “It’s a robot telescope that will make a movie of the sky.”

    The LSST camera is designed to provide a wide field of view with better than 0.2 arcsecond sampling and spectral sampling in five or more bands from 400nm to 1060nm. The image surface is flat with a diameter of approximately 64 cm. The detector format will be a circular mosaic providing over 3 Gigapixels per image. The camera includes a filter mechanism and, if necessary, shuttering capability. The camera is positioned in the middle of the telescope. 1/2012 Credit: LSSTC

    And this flick won’t be dull. Juric estimates that ten million transient objects will be photographed each clear night. Many of these will be asteroids prancing through our solar system, and the LSST will catalog millions of them, including 80 percent of the larger ones – rocks as big as a football field or more. Knowing the orbits of these dangerous projectiles will prompt us to deflect those coming our way. To know them is to shove them.

    That’s a nice bit of insurance against collision catastrophe, but the truly revolutionary thing about the LSST is what it will do for fundamental astronomy. It’s a safe bet that this telescope will discover multitudes of extraordinary events such as colliding neutron stars or other exotica worthy of intense, and immediate, study by others. So part of the LSST’s design is to send out electronic alerts within 60 seconds of sensing one of these fast-action, “bump in the night” events.
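    At its core, catching these “bump in the night” events is difference imaging: subtract a template of the static sky from each new exposure and flag whatever brightened or moved. A caricature of the idea (the arrays and threshold are invented; the real LSST alert pipeline also matches image quality, deblends, and classifies before issuing alerts):

```python
import numpy as np

def find_transients(new_image, template, threshold):
    """Pixels where the sky brightened by more than `threshold`.

    A caricature of difference imaging: subtract a template exposure
    and flag significant positive residuals. Real pipelines also match
    the blurring of the two images and classify the detections.
    """
    ys, xs = np.where(new_image - template > threshold)
    return list(zip(ys.tolist(), xs.tolist()))

# Invented example: a flat sky in which one "supernova" appears at (3, 7)
template = np.zeros((10, 10))
new = template.copy()
new[3, 7] = 50.0
print(find_transients(new, template, threshold=10.0))   # [(3, 7)]
```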

    With this laundry list of extraordinary capabilities, and its formidable cache of data, the LSST will revolutionize astronomy. Yes, that’s a cliché, but sometimes clichés are appropriate.

    Juric notes that, historically speaking, “astronomy has always been a data-starved science.” That’s because the most interesting research topics are inevitably on the edge of visibility, and consequently require using the largest telescopes.

    The result is a bottleneck. In the not-so-good old days, a working astronomer might get a few nights a year on a world-class instrument, and could generally observe a few dozen or a few hundred objects. Sure, there were published surveys that cataloged reasonably big swaths of the sky, but those data were not always adequate to address the kinds of interesting questions scientists cook up. So big-time astronomers were often intimately dependent on big-time telescopes.

    The LSST may cause a breakup of that exclusive relationship. After its first decade of operation, this new scope will have an image collection of about 20 billion galaxies and 17 billion stars. The data set will tally hundreds of millions of gigabytes. That’s truly big data, and the good news is that anyone will be able to scour through it – via the laptop in their office, or using their smart phone while waiting for the bus.

    So here’s the big shift: For four hundred years, the relationship between astronomers and their telescopes has been as fundamental as that between a psychiatrist and his couch. But once the LSST starts digitizing the sky, it will alter the paradigm dramatically. Scientists will interact with data, not with an instrument. Rather than acquiring photos or spectra to prove or disprove their ideas, they will be free to trawl immense quantities of data, and see what falls out.

    It will be as if biologists and zoologists suddenly had access to detailed information for every single species on Earth (and thousands of examples of each). Think of the sort of investigations even amateurs could make.

    The LSST will be a game changer, and it could very well turn out that the biggest discoveries about the past, present and future of the cosmos will come not from tweedy astronomers, but from the keen and curious non-specialists sitting at home, faced off against their laptops. Galileo would have been stupefied.

    See the full article here.

    SETI Institute – 189 Bernardo Ave., Suite 100
    Mountain View, CA 94043
    Phone 650.961.6633 – Fax 650-961-7099

    ScienceSprings is powered by MAINGEAR computers

  • richardmitnick 9:38 am on February 7, 2014 Permalink | Reply
    Tags: Large Synoptic Survey Telescope (LSST)

    From Brookhaven Lab: “Aim for the Sky!” 

    Brookhaven Lab

    February 7, 2014
    Jane Koropsak

    Brookhaven Lab’s Justine Haupt Helping to Build the World’s Largest Digital Camera

    Design Engineer Justine Haupt is pictured in front of the cryostat she designed for testing LSST’s electro-optic sensor modules. She is holding one of the compact front-end electronic assemblies that will enable the camera to be read out at a remarkable 1.5 billion pixels per second.

    When completed, the Large Synoptic Survey Telescope (LSST) will carry the world’s largest digital camera. It’s been called the widest, fastest, deepest eye of the new digital age. This remarkable telescope, to be stationed on a mountaintop in Chile, promises to cast light on mysteries fundamental to our understanding of the universe. It will scan the sky rapidly and chart objects that change or move, including exploding supernovae and potentially hazardous near-Earth asteroids. LSST’s images will trace billions of remote galaxies, allowing scientists to probe mysterious dark matter and dark energy. Its uniquely wide field of view will let it observe large areas of the sky at once and move quickly between images. It will be able to take more than 800 panoramic images each night and cover the entire sky twice a week. And that is just a brief description. The LSST is fascinating—and Brookhaven Lab is playing a big role in the project.
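
    Two of the numbers quoted in this post can be cross-checked with quick arithmetic. A minimal sketch, assuming a ~3.2-gigapixel focal plane (a figure from the SLAC announcement later in this archive), a ~9.6-square-degree field of view, and ~20,000 square degrees of accessible southern sky (the last two are LSST design-level assumptions, not stated in this post):

    ```python
    # Rough cross-checks of the LSST figures quoted above.
    # Assumptions (not stated in this post): the focal plane totals
    # ~3.2 billion pixels, each pointing covers ~9.6 square degrees,
    # and the accessible southern sky is ~20,000 square degrees.
    TOTAL_PIXELS = 3.2e9
    READOUT_RATE = 1.5e9           # pixels per second, from the caption above
    FIELD_OF_VIEW_DEG2 = 9.6
    IMAGES_PER_NIGHT = 800
    VISIBLE_SKY_DEG2 = 20_000

    readout_seconds = TOTAL_PIXELS / READOUT_RATE            # ~2.1 s per image
    area_per_night = FIELD_OF_VIEW_DEG2 * IMAGES_PER_NIGHT   # 7,680 deg^2
    passes_per_week = 7 * area_per_night / VISIBLE_SKY_DEG2  # ~2.7 passes

    print(f"readout: {readout_seconds:.1f} s per image")
    print(f"coverage: {passes_per_week:.1f} full-sky passes per week")
    ```

    Both estimates land close to the article’s claims: a roughly two-second readout per exposure, and better than two full passes of the visible sky per week from 800 images a night.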

    Meet Engineering Whiz Justine Haupt

    When you meet Justine Haupt you will immediately recognize her calm intelligence and positive attitude. Haupt works in the Lab’s Instrumentation Division designing and building prototypes for the LSST.

    “She thinks ‘outside the box’ and comes up with methods that will streamline testing of the LSST’s focal plane components,” said Brookhaven researcher Paul O’Connor, who mentors Haupt. “She is inventive, spirited, and remarkably creative. In the four years she has worked at the Lab, she has constructed more than two dozen pieces of equipment, ranging from custom catadioptric lens systems to a microprocessor-controlled, in-vacuum induction motor. She also keeps our group’s 3D printer busy turning out parts for her various creations. She has managed to develop a mastery of mechanical, optical, and electrical design.” For her “impressive range of excellence” Haupt received the 2014 Rising Engineering Star award from Mouser Electronics and Design News.

    At work, Justine Haupt designs and builds prototypes for the LSST, but in her spare time this design engineer takes to the sky in a plane or paragliding.

    You might think work would keep Haupt busy enough, but the young researcher takes her quest for knowledge of the sky and our universe a step further. In her spare time, she is an avid paragliding pilot. She got her pilot’s license at age 18, has performed some flying acrobatics, and holds an FAA certified Advanced Ground Instructor rating. She refurbished the avionics of a 1947 Stinson Voyager (a single-engine plane) and installed a new intercom and strobe system. She volunteers and sits on the board of directors at the Custer Observatory, located on the east end of Long Island in Southold, where she routinely reviews and re-engineers the instrumentation for the largest telescope on Long Island.

    “Flying in either a plane or paragliding is exhilarating,” said Haupt. “And, being part of a team that is designing and building a telescope that may very well be the center of the United States’ astronomy program feels great! This job has far exceeded my expectations. I learn something new every day.”

    In addition, Haupt volunteers at the Laboratory giving talks to students and visitors about her work. She also finds time to play piano, trombone, and do some fiddling during her lunch hour with fellow employees and musicians, Paul O’Connor, Peter Siddons, Sean McCorkle, and Cindy Salwen. “We casually call ourselves the Stochastic Orchestra and have performed at venues like the Custer Observatory, but mostly we just enjoy meeting at lunchtime and jamming,” said Haupt.

    “For fun,” she added, “I’ve been working on developing a new holographic projection technique and exploring an idea for a new class of solar astronomy instrumentation.

    “These are things that could even eventually turn into business ventures. For now, I’m grateful for my job at the Lab and the time I get to take to the sky.”


    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.


  • richardmitnick 3:34 pm on April 24, 2012 Permalink | Reply
    Tags: Large Synoptic Survey Telescope (LSST)

    From SLAC News Center: “World’s Largest Digital Camera Project Passes Critical Milestone” 

    April 24, 2012
    by Andy Freeberg

    “A 3.2 billion-pixel digital camera designed by SLAC is now one step closer to reality. The Large Synoptic Survey Telescope camera, which will capture the widest, fastest and deepest view of the night sky ever observed, has received “Critical Decision 1” approval by the U.S. Department of Energy (DOE) to move into the next stage of the project.

    The Large Synoptic Survey Telescope (LSST) will survey the entire visible sky every week, creating an unprecedented public archive of data – about 6 million gigabytes per year, the equivalent of shooting roughly 800,000 images with a regular eight-megapixel digital camera every night, but of much higher quality and scientific value. Its deep and frequent cosmic vistas will help answer critical questions about the nature of dark energy and dark matter and aid studies of near-Earth asteroids, Kuiper belt objects, the structure of our galaxy and many other areas of astronomy and fundamental physics.
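
    The data-volume comparison in that paragraph checks out with back-of-envelope arithmetic. A minimal sketch, assuming roughly 20 MB per raw eight-megapixel frame (an illustrative figure not given in the article, e.g. 16-bit pixels plus format overhead):

    ```python
    # Back-of-envelope check: 800,000 eight-megapixel images per night
    # should accumulate to roughly 6 million gigabytes in a year.
    # Assumption (not from the article): ~20 MB per raw 8 MP frame.
    MB_PER_IMAGE = 20
    IMAGES_PER_NIGHT = 800_000
    NIGHTS_PER_YEAR = 365

    gb_per_night = IMAGES_PER_NIGHT * MB_PER_IMAGE / 1000    # 16,000 GB/night
    gb_per_year = gb_per_night * NIGHTS_PER_YEAR             # ~5.8 million GB
    print(f"{gb_per_year / 1e6:.1f} million GB per year")
    ```

    At ~20 MB per frame the total comes out near 5.8 million GB, in line with the article’s “about 6 million gigabytes per year.”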

    ‘With 189 sensors and over 3 tons of components that have to be packed into an extremely tight space, you can imagine this is a very complex instrument,’ said Nadine Kurita, the project manager for the LSST camera at SLAC. ‘But given the enormous challenges required to provide such a comprehensive view of the universe, it’s been an incredible opportunity to design something so unique.’ ”


    The effort to build the LSST is led by the LSST Corporation, a non-profit 501(c)3 corporation formed in 2003, with headquarters in Tucson, AZ. Financial support for LSST comes from the National Science Foundation with additional contributions from private foundation gifts, grants to universities, and in-kind support from Department of Energy laboratories and other LSST Member Institutions. In 2011, the LSST construction project was established as an operating center under management of the Association of Universities for Research in Astronomy (AURA).

    Institutional Members
    Last Revision 1/19/2011

    Adler Planetarium
    Brookhaven National Laboratory (BNL)
    California Institute of Technology
    Carnegie Mellon University
    Cornell University
    Drexel University
    Fermi National Accelerator Laboratory
    George Mason University
    Google, Inc.
    Harvard-Smithsonian Center for Astrophysics
    Institut de Physique Nucleaire et de Physique des Particules (IN2P3)
    Johns Hopkins University
    Kavli Institute for Particle Astrophysics and Cosmology (KIPAC) – Stanford University
    Las Cumbres Observatory Global Telescope Network, Inc.
    Lawrence Livermore National Laboratory (LLNL)
    Los Alamos National Laboratory (LANL)
    National Optical Astronomy Observatory*
    National Radio Astronomy Observatory
    Princeton University
    Purdue University
    Research Corporation for Science Advancement*
    Rutgers University
    SLAC National Accelerator Laboratory
    Space Telescope Science Institute
    Texas A & M University
    The Pennsylvania State University
    The University of Arizona*
    University of California at Davis
    University of California at Irvine
    University of Illinois at Urbana-Champaign
    University of Michigan
    University of Pennsylvania
    University of Pittsburgh
    University of Washington*
    Vanderbilt University

    SLAC is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. Located in Menlo Park, California, SLAC is operated by Stanford University for the DOE’s Office of Science.
