Tagged: LSST-Large Synoptic Survey Telescope

  • richardmitnick 11:12 am on June 20, 2019
    Tags: ComCam miniature camera for the LSST, LSST-Large Synoptic Survey Telescope

    From SLAC: “A miniature camera for the Large Synoptic Survey Telescope will help test the observatory and take first images” 

    June 19, 2019
    By Aiko Takeuchi-Demirci

    SLAC completed its work on ComCam, a commissioning device to be installed in Chile later this year.

    LSST ComCam

    Scientists at the Department of Energy’s SLAC National Accelerator Laboratory are building the world’s largest digital camera for astronomy and astrophysics – a minivan-sized 3,200-megapixel ‘eye’ of the future Large Synoptic Survey Telescope (LSST) that will enable unprecedented views of the universe starting in the fall of 2022 and provide new insights into dark energy and other cosmic mysteries.

    LSST Camera, being built at SLAC

    In the meantime, the lab has completed its work on a miniature version that will soon be used for testing the telescope and taking LSST’s first images of the night sky.

    These images will include glimpses of the motions of asteroids and objects in our solar system with orbits beyond that of Neptune, as well as alerts of sudden events such as supernovae, exploding stars that temporarily light up parts of the sky.


    ComCam, a commissioning camera for LSST. (Farrin Abbott/SLAC National Accelerator Laboratory)

    The device, called ComCam (short for Commissioning Camera), will use only four percent of the full LSST camera’s focal plane and produce much smaller images, but it will provide enough “imaging power” to test the observatory while its ultimate camera is still under construction. In fact, ComCam’s 144 megapixels outnumber the pixel count that was available to the Sloan Digital Sky Survey, a pioneering astrophysical survey project in the early 2000s.

    “ComCam will give us a great head start in checking all of the interfaces between the camera, telescope, site infrastructure and data management,” says Kevin Reil, LSST commissioning scientist and SLAC staff scientist.

    After completing the integration of imaging sensors into ComCam and other tasks, the SLAC team today shipped the device to LSST headquarters in Tucson, Arizona. There, more components will be added before the finished ComCam is sent to its final destination in Chile later this year.

    A miniature LSST camera

    The extraordinarily high image quality of the full LSST camera will be largely due to its 189 state-of-the-art imaging sensors. Arranged into square arrays, called rafts, of nine sensors each, they’ll make up the camera’s focal plane. ComCam has only a single raft, which was provided by DOE’s Brookhaven National Laboratory and recently inserted into the ComCam cryostat at SLAC.

    The cryostat, specially designed and built for ComCam, holds the raft in place and cools its imaging sensors to very low temperatures to eliminate unwanted background signals and improve image quality. The ComCam cryostat uses a different refrigeration system from that of the final LSST camera, which requires a more complex system in order to handle 21 rafts.

    The raft also contains electronics boards that will digitize data taken with ComCam. These data will be sent to data management systems at the National Science Foundation-supported National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign and centers at France’s National Institute of Nuclear and Particle Physics and in Chile, where they will be analyzed by scientists around the world.

    SLAC is also building and testing the camera control system, which will allow the observatory software to send commands to ComCam, for instance, to change filters and take images. The LSST camera will use the same control system.

    Toward first images

    Once ComCam arrives in Tucson, LSST scientists will add lenses, a filter changer and a shutter. They will integrate the complete instrument with the observatory software and computing infrastructure and perform crucial tests, including a dry run that will simulate a night of observations.

    “In large projects like LSST, it’s exciting to watch the hardware and software come together into a working system over the years,” says Brian Stalder, LSST commissioning scientist in Tucson.

    Finally, ComCam will be sent to Chile and installed on the actual telescope, paving the way for LSST commissioning.

    In addition, it’ll produce LSST’s first images, albeit at a much smaller scale than the final camera. Although science studies won’t be ComCam’s primary purpose, the team expects the camera to produce images of very good quality, Reil says: “It’ll be exciting to see these early images taken with our brand new, world-class telescope.”

    See the full article here .



    Please help promote STEM in your local schools.


    Stem Education Coalition

    SLAC/LCLS


    SLAC/LCLS II projected view


    SLAC is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. Located in Menlo Park, California, SLAC is operated by Stanford University for the DOE’s Office of Science.

     
  • richardmitnick 11:09 am on June 4, 2019
    Tags: LSST-Large Synoptic Survey Telescope

    From Symmetry: “Engineering the world’s largest digital camera” 

    Symmetry Mag
    From Symmetry

    06/04/19
    Erika K. Carlson

    Building the Large Synoptic Survey Telescope also means solving extraordinary technological challenges.

    LSST


    LSST Camera, built at SLAC



    LSST telescope, currently under construction on the El Peñón peak at Cerro Pachón Chile, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.


    LSST Data Journey, Illustration by Sandbox Studio, Chicago with Ana Kova

    In a brightly lit clean room at the US Department of Energy’s SLAC National Accelerator Laboratory, engineers are building a car-sized digital camera for the Large Synoptic Survey Telescope.

    When it’s ready, LSST will image almost all of the sky visible from its vantage point on a Chilean mountain, Cerro Pachón, every few nights for a decade to make an astronomical movie of unprecedented proportions.

    The camera is a combination of many extremes. Its largest lens is one of the biggest ever created for astronomy and astrophysics. The ceramic grid that will hold its imaging sensors is so flat that no feature larger than a human red blood cell sticks up from its surface. The electronics that control the sensors are customized to fit in a very tight space and use as little power as possible.

    All of these specifications are vital for letting LSST achieve its scientific goals. And not many of them are easy to achieve. The LSST camera will do what no camera has been capable of doing before, and building it requires solving technical problems that have never been solved before.

    A game of ‘Operation’

    “When you consider a project this complex, you can’t just dive in and say ‘Here, I’m going to design and build this in one shot,’ right?” says Tim Bond, head of the LSST Camera Integration and Test team at SLAC. “You have to divide and conquer. So you break it up into smaller pieces that individual groups can work on.”

    One of those pieces is figuring out how to get the camera’s sensors into place.

    The 3.2-billion-pixel LSST camera will be the largest digital camera ever constructed. Much like handheld digital cameras, the LSST camera will be made up of imaging sensors called charge-coupled devices—189 of them. These sensors and their bundles of electronics are arranged into 21 nine-sensor pallets called “rafts.” Each one weighs more than 20 pounds and stands almost 2 feet tall.

    Each sensor is fragile enough to chip if it even touches one of the other rafts. And, to minimize gaps in the sensors’ images, all of the rafts must be installed two hundredths of an inch apart inside the camera’s ceramic grid.

    The LSST engineers couldn’t possibly install the delicate rafts by hand without destroying them, so they took on the challenge of creating a device that could do this very specific task in their place.

    They concocted one concept after another. Travis Lange, a SLAC mechanical engineer, created computer models of each to find a design that could both do the job and be built with the level of machining precision available.

    “One of the bigger challenges for this is just the tolerance of all the individual pieces and how it corresponds to how much motion I am allowed to use,” Lange says. If a part is the wrong size by even just the width of a human hair, it’s a problem. “If you have many of those parts that are off by that much, those errors all stack up.”
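
    To see why those tolerances bite, here is a small, purely illustrative stack-up calculation (the part count and tolerance values are hypothetical, not LSST specifications), comparing the worst-case sum of machining errors with the statistical root-sum-square estimate engineers often use:

```python
# Illustrative tolerance stack-up -- hypothetical numbers, not LSST specs.
import math

# Ten parts in a chain, each machined to +/- 0.001 inch (assumed values).
tolerances = [0.001] * 10

worst_case = sum(tolerances)                      # every error at its limit, same direction
rss = math.sqrt(sum(t ** 2 for t in tolerances))  # root-sum-square (statistical) estimate

print(f"Worst-case stack-up: {worst_case:.4f} in")  # 0.0100 in
print(f"RSS stack-up:        {rss:.4f} in")         # 0.0032 in

# Even this modest chain of parts would, in the worst case, consume half of a
# 0.02-inch gap budget, which is why each individual part tolerance matters.
```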

    One of the designs that the team drew up resembled a claw-machine game. The device would sit on a structure above the cryostat, the apparatus that keeps the camera cold. With a long arm, it would reach through to a raft waiting for installation below. Over the course of several hours, it would pull the raft up through a very precisely sized slot and into place in the grid.

    Four specialized cameras pointing at the edges of the imaging sensors would help steer the raft into place without hitting neighboring sensors, and unique imaging software would measure the gaps between rafts in real time. “It’s a crazy game of ‘Operation,’” Lange says.

    The team went with the claw-machine plan. In May 2018, they put it to the test with its first practice raft and a mock-up of the camera. After most of a day had passed, the raft was successfully in place.

    The installation robot has since gone through several other successful test runs. Now that they’ve figured out the kinks in the process, installing each raft takes about two hours. Engineers plan to start the real installation process this summer.

    Not your everyday refrigerator

    The electronics and sensors crammed together inside the camera heat up as electricity runs through them. But heat is the enemy of astronomical observation. A warm sensor will sabotage its own observations by behaving as if it senses light where there is none. And as anyone who has ever heard their laptop fan working overtime before the computer crashed may know, heat can also cause electronics to stop working.

    To keep the camera cold enough, the engineers needed to create a customized refrigeration system. They eventually made a system of eight refrigeration circuits—two for the electronics and six for the sensors.

    Each of these systems works similarly to a kitchen refrigerator, in which a fluid refrigerant carries heat away from the object or area it’s supposed to cool. Networks of tubes carry the refrigerant into and out of the camera.

    At first, the team used only metal tubes for this job. Metal is good at keeping moisture out, which is important because any water that gets into the tubes from the surrounding air would freeze and clog the system. At parts of the system where the camera would need to move around with the telescope as it points to different parts of the sky, the tubes were corrugated to make them into flexible metal hoses.

    But there was a problem. The refrigeration system’s compressor, a device that forces the refrigerant to dump its absorbed heat outside the camera, uses lubricating oil to work smoothly. As the refrigeration system ran, some of the oil would leave the compressor and travel through the tubes.

    This wouldn’t have been a problem if the oil had traveled at a consistent pace all the way through the circuit, back to the compressor. But that wasn’t happening; the oil was getting slowed down and sometimes trapped by the grooves in the corrugated metal hoses. The compressor was getting oil back in trickles or spurts rather than in a steady stream. This made the refrigeration system unpredictable and harder to maintain.

    So the team switched to a different kind of hose for the refrigeration system’s “joints,” says Diane Hascall, a SLAC mechanical engineer on the LSST camera team. “You can almost think of it like a garden hose. But it’s a very special garden hose that’s made to work with refrigerants.”

    The new hoses, called smooth-bore hoses, are made of layers of rubber, braid and other flexible materials, and they are smooth on the inside. The smooth hose lets oil return to the compressor more effectively, Hascall says.

    But there was a trade-off. Unlike the metal hoses, the smooth-bore hoses do let some moisture in.

    To deal with that, the team installed filter dryers that absorb moisture from the system. They are still figuring out how often the dryers need to be replaced to keep the camera in good shape.

    Building next-gen technology

    Building each component of a piece of technology as sophisticated as LSST is a challenge in itself, but the challenges don’t end there. Engineers must also design specialized equipment, software and procedures to test different pieces; put the pieces together; and determine what maintenance the technology will need to run smoothly.

    “There’s a huge number of subsystems,” Bond says. “All of those subsystems have to present their products. And all those products have to be assembled and tested and work in the final finished full product.”

    Bond says working on the project has been a great boon to the engineering team. He says figuring out all of the unexpected challenges that have come with making such an advanced piece of technology has been a great experience, and he looks forward to seeing what future projects the team will tackle together.

    “It’s like picking a bunch of players to set up a hockey team or something,” Bond says. “We’ve actually put together a very good team, and we’re just getting some of our younger people up to speed and trained for the next generation of experiments and projects that will come along.”

    See the full article here .



    Please help promote STEM in your local schools.


    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 10:52 am on March 25, 2019
    Tags: ExaLearn, LSST-Large Synoptic Survey Telescope

    From insideHPC: “ExaLearn Project to bring Machine Learning to Exascale” 

    From insideHPC

    March 24, 2019

    As supercomputers become ever more capable in their march toward exascale levels of performance, scientists can run increasingly detailed and accurate simulations to study problems ranging from cleaner combustion to the nature of the universe. But those simulations come at a steep computational cost. Enter ExaLearn, a new machine learning project supported by DOE’s Exascale Computing Project (ECP) that aims to develop tools to help scientists overcome this challenge by applying machine learning to very large experimental datasets and simulations.

    The first research area for ExaLearn’s surrogate models will be in cosmology, to support projects such as the LSST (Large Synoptic Survey Telescope), now under construction in Chile and shown here in an artist’s rendering. (Todd Mason, Mason Productions Inc. / LSST Corporation)

    The challenge is that these powerful simulations require lots of computer time. That is, they are “computationally expensive,” consuming 10 to 50 million CPU hours for a single simulation. For example, running a 50-million-hour simulation on all 658,784 compute cores of the Cori supercomputer at NERSC would take more than three days.
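
    A quick back-of-the-envelope check of that runtime, assuming perfect parallel scaling:

```python
# Back-of-the-envelope check of the quoted runtime on Cori.
cpu_hours = 50_000_000   # total compute cost of one simulation
cores = 658_784          # compute cores on Cori, as quoted above

wall_clock_hours = cpu_hours / cores
print(f"{wall_clock_hours:.1f} hours = {wall_clock_hours / 24:.1f} days")
# ~75.9 hours, i.e. a bit more than three days, assuming perfect parallel scaling.
```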

    NERSC

    NERSC Cray Cori II supercomputer at NERSC at LBNL, named after Gerty Cori, the first American woman to win a Nobel Prize in science

    NERSC Hopper Cray XE6 supercomputer


    LBL NERSC Cray XC30 Edison supercomputer


    The Genepool system is a cluster dedicated to the DOE Joint Genome Institute’s computing needs. Denovo is a smaller test system for Genepool that is primarily used by NERSC staff to test new system configurations and software.

    NERSC PDSF


    PDSF is a networked distributed computing cluster designed primarily to meet the detector simulation and data analysis requirements of physics, astrophysics and nuclear science collaborations.

    Future:

    Cray Shasta Perlmutter SC18 AMD Epyc Nvidia pre-exascale supercomputer

    Running thousands of these simulations, which are needed to explore wide ranges in parameter space, would be intractable.

    One of the areas ExaLearn is focusing on is surrogate models. Surrogate models, often known as emulators, are built to provide rapid approximations of more expensive simulations. This allows a scientist to generate additional simulations more cheaply – running much faster on many fewer processors. To do this, the team will need to run thousands of computationally expensive simulations over a wide parameter space to train the computer to recognize patterns in the simulation data. This then allows the computer to create a computationally cheap model, easily interpolating between the parameters it was initially trained on to fill in the blanks between the results of the more expensive models.

    “Training can also take a long time, but then we expect these models to generate new simulations in just seconds,” said Peter Nugent, deputy director for science engagement in the Computational Research Division at LBNL.

    From Cosmology to Combustion

    Nugent is leading the effort to develop the so-called surrogate models as part of ExaLearn. The first research area will be cosmology, followed by combustion. But the team expects the tools to benefit a wide range of disciplines.

    “Many DOE simulation efforts could benefit from having realistic surrogate models in place of computationally expensive simulations,” ExaLearn Principal Investigator Frank Alexander of Brookhaven National Lab said at the recent ECP Annual Meeting.

    “These can be used to quickly flesh out parameter space, help with real-time decision making and experimental design, and determine the best areas to perform additional simulations.”

    The surrogate models and related simulations will aid in cosmological analyses to reduce systematic uncertainties in observations by telescopes and satellites. Such observations generate massive datasets that are currently limited by systematic uncertainties. Since we only have a single universe to observe, the only way to address these uncertainties is through simulations, so creating cheap but realistic and unbiased simulations greatly speeds up the analysis of these observational datasets. A typical cosmology experiment now requires sub-percent level control of statistical and systematic uncertainties. This then requires the generation of thousands to hundreds of thousands of computationally expensive simulations to beat down the uncertainties.

    These parameters are critical in light of two upcoming programs:

    The Dark Energy Spectroscopic Instrument, or DESI, is an advanced instrument on a telescope located in Arizona that is expected to begin surveying the universe this year.

    LBNL/DESI Dark Energy Spectroscopic Instrument for the Nicholas U. Mayall 4-meter telescope at Kitt Peak National Observatory near Tucson, Ariz, USA


    NOAO/Mayall 4 m telescope at Kitt Peak, Arizona, USA, Altitude 2,120 m (6,960 ft)

    DESI seeks to map the large-scale structure of the universe over an enormous volume and a wide range of look-back times (based on “redshift,” or the shift of the light of distant objects toward redder wavelengths). Targeting about 30 million pre-selected galaxies across one-third of the night sky, scientists will use DESI’s redshift data to construct 3D maps of the universe. There will be about 10 terabytes (TB) of raw data per year transferred from the observatory to NERSC. After running the data through the pipelines at NERSC (using millions of CPU hours), about 100 TB per year of data products will be made available as data releases approximately once a year throughout DESI’s five years of operations.

    The Large Synoptic Survey Telescope, or LSST, is currently being built on a mountaintop in Chile.

    LSST


    LSST Camera, built at SLAC



    LSST telescope, currently under construction on the El Peñón peak at Cerro Pachón Chile, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.


    LSST Data Journey, Illustration by Sandbox Studio, Chicago with Ana Kova

    When completed in 2021, the LSST will take more than 800 panoramic images each night with its 3.2 billion-pixel camera, recording the entire visible sky twice each week. Each patch of sky it images will be visited 1,000 times during the survey, and each of its 30-second observations will be able to detect objects 10 million times fainter than visible with the human eye. A powerful data system will compare new with previous images to detect changes in brightness and position of objects as big as far-distant galaxy clusters and as small as nearby asteroids.

    For these programs, the ExaLearn team will first target large-scale structure simulations of the universe since the field is more developed than others and the scale of the problem size can easily be ramped up to an exascale machine learning challenge.

    As an example of how ExaLearn will advance the field, Nugent said a researcher could run one suite of simulations with a universe consisting of 30 percent dark energy and 70 percent dark matter, then a second suite with 25 percent and 75 percent, respectively. Each of these simulations generates three-dimensional maps of tens of billions of galaxies and shows how they cluster and spread apart over time. Using a surrogate model trained on these simulations, the researcher could then quickly generate the output of a simulation in between those values, at 27.5 and 72.5 percent, without running a new, costly simulation; that output, too, would show the evolution of the galaxies in the universe as a function of time. The goal of the ExaLearn software suite is for such results, along with their uncertainties and biases, to emerge as a byproduct of the training, so that one would know the generated models are consistent with a full simulation.
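
    To make the idea concrete, here is a toy sketch of an emulator of the kind described above, written with a standard Gaussian-process library. It is not ExaLearn code: the one-dimensional parameter and the stand-in “expensive simulation” are hypothetical placeholders, but the pattern of training on a few costly runs and then interpolating to an untried parameter value, such as 27.5 percent dark energy, is the same.

```python
# Toy surrogate-model (emulator) sketch -- illustrative only, not ExaLearn code.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def expensive_simulation(dark_energy_fraction):
    """Cheap stand-in for a multi-million-CPU-hour run; returns a fake summary statistic."""
    return np.sin(5 * dark_energy_fraction) + 0.1 * dark_energy_fraction

# "Run" the expensive simulation at a few training points (e.g. 25% and 30% dark energy).
x_train = np.array([[0.20], [0.25], [0.30], [0.35]])
y_train = np.array([expensive_simulation(x[0]) for x in x_train])

# Fit the emulator to the expensive results.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.1), normalize_y=True)
gp.fit(x_train, y_train)

# Interpolate to an untried parameter value (27.5% dark energy) in milliseconds,
# with an uncertainty estimate as a byproduct of the fit.
mean, std = gp.predict(np.array([[0.275]]), return_std=True)
print(f"Emulated statistic at 27.5%: {mean[0]:.3f} +/- {std[0]:.3f}")
```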

    Toward this end, Nugent’s team will build on two projects already underway at Berkeley Lab: CosmoFlow and CosmoGAN. CosmoFlow is a deep learning 3D convolutional neural network that can predict cosmological parameters with unprecedented accuracy using the Cori supercomputer at NERSC. CosmoGAN is exploring the use of generative adversarial networks to create cosmological weak lensing convergence maps — maps of the matter density of the universe as would be observed from Earth — at lower computational costs.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Founded on December 28, 2006, insideHPC is a blog that distills news and events in the world of HPC and presents them in bite-sized nuggets of helpfulness as a resource for supercomputing professionals. As one reader said, we’re sifting through all the news so you don’t have to!

    If you would like to contact me with suggestions, comments, corrections, errors or new company announcements, please send me an email at rich@insidehpc.com. Or you can send me mail at:

    insideHPC
    2825 NW Upshur
    Suite G
    Portland, OR 97239

    Phone: (503) 877-5048

     
  • richardmitnick 9:15 am on March 25, 2019
    Tags: "Women in Physics Group inspires the next generation of physicists and astronomers", LSST-Large Synoptic Survey Telescope

    From University of Pennsylvania: “Women in Physics Group inspires the next generation of physicists and astronomers” 

    U Penn bloc

    From University of Pennsylvania

    March 22, 2019

    Credits

    Erica K. Brockmeier Writer
    Eric Sucar Photographer

    Willman (center) and a group of undergraduates, including physics majors as well as students studying other STEM-related disciplines, chatted informally over breakfast about their personal experiences as STEM students and researchers.

    Earlier this month, Penn’s Women in Physics group hosted its fifth annual spring conference and networking event. Students had the opportunity to meet informally and share their work with Beth Willman, a world-renowned astronomer and deputy director of the Large Synoptic Survey Telescope (LSST).

    LSST


    LSST Camera, built at SLAC



    LSST telescope, currently under construction on the El Peñón peak at Cerro Pachón Chile, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.


    LSST Data Journey, Illustration by Sandbox Studio, Chicago with Ana Kova

    Providing access to strong role models is just one of the goals of the undergraduate-led group, which was founded in 2013 to support women studying physics through scholarship, mentorship, and social activities.

    “It’s a positive message that [Willman] is a strong, leading woman in a field that’s usually dominated by men,” says junior Olivia Sylvester from Mendham, New Jersey, a board member of the group. “In addition to learning about what she has to say about her research, you’re also taking in the fact that she’s probably overcome a lot of barriers to achieve such great success.”

    The conference kicked off with a casual morning get-together as Willman and a group of undergraduates chatted over coffee and breakfast. Students shared their experiences at Penn, with several indicating that they felt the atmosphere in the Department of Physics & Astronomy was generally welcoming and inclusive for women.

    After being introduced to several researchers in the department and sharing lunch with the Society of Physics group, undergraduate students presented the results of their summer research projects to Willman.

    First-year student Jen Locke from Ambler, Pennsylvania, presented her work from the lab of Masao Sako, an associate professor and undergraduate chair of the physics and astronomy department, on visualizing new planet candidates located in the Kuiper belt.

    Kuiper Belt. Minor Planet Center

    Next summer, Locke will work on developing a search strategy for finding new objects in the LSST database, a project that will likely involve Willman to a certain extent.

    Junior Alex Ulin from Los Angeles talked about her NASA internship on the flower-shaped starshade, a complex foldable structure that will make it easier to take pictures of potentially habitable planets that are difficult to visualize because of the brightness of the sun.

    NASA JPL Starshade

    Ulin, who wants to study materials science after graduation, worked on how to cut the nanometers-thin sheets of metal so they can cover the 20-meters-wide, origami-like structure as precisely as possible.

    Senior Abby Lee from St. Paul, Minnesota, who is advised by Gary Bernstein, the Reese W. Flower Professor of Astronomy and Astrophysics, presented the results of her research on selecting features for a physical model that describes dark matter subhalo disruption. These events, which happen when the circular “halo” around stars and galaxies interacts with black holes or large areas of dark matter, can now be visualized thanks to improvements in technology, but they require models that can help describe their behavior.

    Caterpillar Project: a Milky-Way-size dark-matter halo and its subhalos (circled), from an enormous suite of simulations. Griffen et al. 2016

    Throughout the student presentations, Willman asked questions that ranged from the technical to the philosophical. Ulin, who also sits on the board for the Women in Physics group, says that these types of projects, as well as having researchers and mentors who can provide meaningful feedback on results, are instrumental experiences for undergraduate students in physics. “Talking to someone that you see having a success in the field can really inspire you to consider research and a career in STEM,” she says.

    The final event of the conference was a public lecture from Willman. More than 70 students, faculty, and other members of the Penn community attended her presentation, “The Most Magnificent Map Ever Made.” Willman, who is a Philadelphia native, says that the LSST is poised to become one of the most productive scientific endeavors of all time. The project will look at half of the sky over 1,000 times across a 10-year period, and each image it collects will be 3.2 billion pixels large.

    In 2022, the Large Synoptic Survey Telescope (LSST) will embark on a 10-year mission to map half the sky. Willman discussed this ambitious project, as well as how the data could revolutionize the field of astronomy, during a public lecture that was held at Houston Hall.

    But Willman says that LSST’s real impact will come from distributing data in “science-ready” formats that can be used and studied easily. Through open-data initiatives that reduce barriers and enable people from a broad range of backgrounds to get involved with astronomy, Willman says that both scientists and society can benefit. “Everything that’s required in the future of scientific progress requires diversity,” she says. “Bringing ideas and people together is beneficial, and science needs as many viewpoints as possible.”

    Junior Abby Timmel from Baltimore, the third board member of the group, says that researchers like Willman who teach from their own experience instead of a textbook can do a lot to inspire students. “This event shows what it looks like to be really successful in physics, how to take the things that you’re learning about and go further with them to really make an impact,” she says.

    With more than 30 active members and a number of events throughout the year, the members of Women in Physics will continue working on their own “magnificent map” as they chart a course towards improved inclusion in STEM.

    Their annual conference is just one example of how important making connections and providing encouragement are for students in STEM. “It spreads awareness that there is a group for women physicists, but I also think that having an event that we’ve organized helps people respect the idea of a group like this,” says Ulin. “They see that not only are we trying to be a support system, we’re also actively doing things for the community.”

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    U Penn campus

    Academic life at Penn is unparalleled, with 100 countries and every U.S. state represented in one of the Ivy League’s most diverse student bodies. Consistently ranked among the top 10 universities in the country, Penn enrolls 10,000 undergraduate students and welcomes an additional 10,000 students to our world-renowned graduate and professional schools.

    Penn’s award-winning educators and scholars encourage students to pursue inquiry and discovery, follow their passions, and address the world’s most challenging problems through an interdisciplinary approach.

     
  • richardmitnick 3:58 pm on February 19, 2019
    Tags: A simplified version of that interface will make some of that data accessible to the public, Every 40 seconds LSST’s camera will snap a new image of the sky, Hundreds of computer cores at NCSA will be dedicated to this task, International data highways, LSST Data Journey, LSST-Large Synoptic Survey Telescope, National Center for Supercomputing Applications at the University of Illinois Urbana-Champaign, NCSA will be the central node of LSST’s data network, The two data centers NCSA and IN2P3 will provide petascale computing power corresponding to several million billion computing operations per second, They are also developing machine learning algorithms to help classify the different objects LSST finds in the sky

    From Symmetry: “An astronomical data challenge” 

    Symmetry Mag
    From Symmetry

    Illustration by Sandbox Studio, Chicago with Ana Kova

    02/19/19
    Manuel Gnida

    The Large Synoptic Survey Telescope will manage unprecedented volumes of data produced each night.

    LSST


    LSST Camera, built at SLAC



    LSST telescope, currently under construction on the El Peñón peak at Cerro Pachón Chile, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    The Large Synoptic Survey Telescope—scheduled to come online in the early 2020s—will use a 3.2-gigapixel camera to photograph a giant swath of the heavens. It’ll keep it up for 10 years, every night with a clear sky, creating the world’s largest astronomical stop-motion movie.

    The results will give scientists both an unprecedented big-picture look at the motions of billions of celestial objects over time, and an ongoing stream of millions of real-time updates each night about changes in the sky.

    Illustration by Sandbox Studio, Chicago with Ana Kova

    Accomplishing both of these tasks will require dealing with a lot of data, more than 20 terabytes each day for a decade. Collecting and storing the enormous volume of raw data, turning it into processed data that scientists can use, distributing it among institutions all over the globe, and doing all of this reliably and fast requires elaborate data management and technology.

    International data highways

    This type of data stream can be handled only with high-performance computing, the kind available at the National Center for Supercomputing Applications at the University of Illinois, Urbana-Champaign.

    NCSA U Illinois Urbana-Champaign Blue Waters Cray Linux XE/XK hybrid machine supercomputer

    Unfortunately, the U of I is a long way from Cerro Pachón, the remote Chilean mountaintop where the telescope will actually sit.

    But a network of dedicated data highways will make it feel like the two are right next door.

    LSST Data Journey, Illustration by Sandbox Studio, Chicago with Ana Kova

    Every 40 seconds, LSST’s camera will snap a new image of the sky. The camera’s data acquisition system will read out the data, and, after some initial corrections, send them hurtling down the mountain through newly installed high-speed optical fibers. These fibers have a bandwidth of up to 400 gigabits per second, thousands of times larger than the bandwidth of your typical home internet.

    Within a second, the data will arrive at the LSST base site in La Serena, Chile, which will store a copy before sending them to Chile’s capital, Santiago.

    From there, the data will take one of two routes across the ocean.

    The main route will lead them to São Paulo, Brazil, then fire them through cables across the ocean floor to Florida, which will pass them to Chicago, where they will finally be rerouted to the NCSA facility at the University of Illinois.

    If the primary path is interrupted, the data will take an alternative route through the Republic of Panama instead of Brazil. Either way, the entire trip—covering a distance of about 5000 miles—will take no more than 5 seconds.
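
    Some rough numbers behind those figures, using assumed values (about two bytes per raw pixel and light traveling at roughly two-thirds of its vacuum speed in glass fiber) rather than official LSST specifications:

```python
# Rough back-of-the-envelope numbers for the LSST data path (assumed values, not specs).
pixels = 3.2e9
bytes_per_pixel = 2                  # assumed ~16-bit raw pixels
image_bits = pixels * bytes_per_pixel * 8
link_bps = 400e9                     # 400 gigabits per second, as quoted above

print(f"Raw image size: ~{pixels * bytes_per_pixel / 1e9:.1f} GB")
print(f"Time on the 400 Gb/s link: ~{image_bits / link_bps:.2f} s")

# Propagation delay over ~5000 miles of fiber, with light at ~2/3 of c in glass:
meters = 5000 * 1609.34
c = 3.0e8
print(f"One-way fiber propagation: ~{meters / (c * 2 / 3) * 1000:.0f} ms")
# The quoted 5-second budget leaves ample room for routing, buffering and retransmission.
```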

    Curating LSST data for the world

    NCSA will be the central node of LSST’s data network. It will archive a second copy of the raw data and maintain key connections to two US-based facilities, the LSST headquarters in Tucson, which will manage science operations, and SLAC National Accelerator Laboratory in Menlo Park, California, which will provide support for the camera. But NCSA will also serve as the main data processing center, getting raw data ready for astrophysics research.

    NCSA will prepare the data at two speeds: quickly, for use in nightly alerts about changes to the sky, and at a more leisurely pace, for release as part of the annual catalogs of LSST data.

    Illustration by Sandbox Studio, Chicago with Ana Kova

    Alert production has to be quick, to give scientists at LSST and other instruments time to respond to transient events, such as a sudden flare from an active galaxy or dying star, or the discovery of a new asteroid streaking across the firmament. LSST will send out about 10 million of these alerts per night, each within a minute after the event.

    Hundreds of computer cores at NCSA will be dedicated to this task. With the help of event brokers—software that facilitates the interaction with the alert stream—everyone in the world will be able to subscribe to all or a subset of these alerts.
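
    As a sketch of what “subscribing” might look like in practice, the snippet below consumes messages from a stream using Apache Kafka, the kind of streaming technology event brokers in this space are commonly built on. The broker address, topic name, and plain-byte payloads are placeholders; real LSST alerts will arrive as structured, serialized packets through whichever broker services the project and community adopt.

```python
# Minimal sketch of subscribing to an alert stream over Apache Kafka.
# Broker address and topic name are hypothetical placeholders.
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "alerts.example.org:9092",  # placeholder broker
    "group.id": "my-transient-search",
    "auto.offset.reset": "latest",
})
consumer.subscribe(["lsst-alerts"])                  # placeholder topic

try:
    while True:
        msg = consumer.poll(timeout=1.0)             # wait up to 1 s for the next alert
        if msg is None or msg.error():
            continue
        # A real alert would be a serialized packet (e.g. with cutout images and
        # measurements); here we just print the raw bytes.
        print(msg.value())
finally:
    consumer.close()
```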

    NCSA will share the task of processing data for the annual data releases with IN2P3, France’s National Institute of Nuclear and Particle Physics, which will also archive a copy of the raw data.


    The two data centers will provide petascale computing power, corresponding to several million billion computing operations per second.

    Illustration by Sandbox Studio, Chicago with Ana Kova

    The releases will be curated catalogs of billions of objects containing calibrated images and measurements of object properties, such as positions, shapes and the power of their light emissions. To pull these details from the data, LSST’s data experts are creating advanced software for image processing and analysis. They are also developing machine learning algorithms to help classify the different objects LSST finds in the sky.

    Annual data releases will be made available to scientists in the US and Chile and institutions supporting LSST operations.

    Last but not least, LSST’s data management team is working on an interface that will make it easy for scientists to use the data LSST collects. What’s even better: A simplified version of that interface will make some of that data accessible to the public.

    See the full article here .



    Please help promote STEM in your local schools.

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 9:38 am on December 3, 2018
    Tags: LSST-Large Synoptic Survey Telescope

    From Science Alert: “Astronaut Warns This Neglected NASA Telescope Is Our Best Chance to Avoid Death by Asteroid” 

    ScienceAlert

    From Science Alert

    3 DEC 2018
    DAVE MOSHER

    A former NASA astronaut says the agency he used to work for has a duty to protect civilians from killer asteroids, but that it isn’t meeting that obligation.

    The threat of asteroid strikes might seem as abstract as outer space itself. But the risk, while infrequent, is real – and potentially more deadly than the threat posed by some of the most powerful nuclear weapons ever detonated.

    Risk of death from above

    In 1908, a space rock estimated to be several hundred feet in diameter screamed into Earth’s atmosphere at many thousands of miles per hour, causing the foreign body to explode over the remote Tunguska region of Russia with the force of a thermonuclear weapon.

    The resulting blast flattened trees over an area nearly twice the size of New York City.

    More recently, in 2013, a roughly 70-foot-wide meteorite shot over Chelyabinsk, Russia.

    The concussive fireball smashed windows for miles around and sent more than 1,000 people in multiple cities to hospitals, several dozen of them with serious injuries.

    We know they’re out there

    NASA is poignantly aware of such risks – and so are lawmakers.

    In 2005, Congress made it one of the agency’s seven core goals to track down 90 percent of asteroids 460 feet (140 meters) and larger, the kind that could cause a worse-than-Tunguska-level event. The deadline for this legally mandated goal is 2020.

    So far, however, telescopes on Earth and in space have found less than one third of these near-Earth objects (NEOs) and NASA will almost certainly fail to hit its deadline.

    Practically, this means tens of thousands of NEOs big enough to wipe out a city have yet to be found, according to a June 2018 report published by the White House.

    The same report concludes that even with current and planned capabilities, less than half of such space rocks will be located by 2033.

    We have the technology to confront the problem

    Russell “Rusty” Schweickart, an aerospace engineer and retired astronaut who flew on the Apollo 9 mission, says there is a ready solution to this problem: NASA can launch the Near-Earth Object Camera (NEOCam), a small infrared observatory, into space.

    NASA NEOCAM

    “It’s a critical discovery telescope to protect life on Earth, and it’s ready to go,” Schweickart told Business Insider at The Economist Space Summit on November 1.

    NEOCam’s designers have pitched the mission to NASA multiple times. The mission has received several million dollars here and there to continue its development in response to those proposals, but the agency has denied full funding in every instance on account of it not being the best purely science-focused mission.

    “For God’s sake, fund it as a mainline program. Don’t put it in yet another competition with science,” Schweickart said. “This is a public safety program.”

    How NEOCam would hunt for ‘city killer’ asteroids

    Asteroids reflect sunlight, and telescopes that are looking in the right place at the right time can detect that dot of light sneaking across the blackness of space. This allows scientists to calculate an NEO’s mass, speed, orbit, and the odds that it will eventually smack into Earth.

    Small NEOs, though, aren’t very bright. This means a telescope has to be big, see a lot of the sky, and use very advanced hardware to pick them up. These monstrous telescopes take a very long time to build and calibrate and are budget-crushingly expensive.

    Take the Large Synoptic Survey Telescope (LSST), for example, which is one of Earth’s best current hopes of finding killer asteroids.

    LSST


    LSST Camera, built at SLAC



    LSST telescope, currently under construction on the El Peñón peak at Cerro Pachón Chile, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    The project broke ground in 2015 and is expected to cost about half a billion dollars to build.

    Based on its current construction schedule, it won’t be fully operational until late 2021, at the soonest, or able to fulfill the 90 percent detection goal set by Congress until the mid-2030s.

    LSST, like all ground-based observatories, also comes with two major limitations.

    The first: “You can’t see asteroids near the Sun. You’re blinded by the sky,” Mark Sykes, director of the Planetary Science Institute and a scientist on the NEOCam team, previously told Business Insider.

    “Right now we have to wait until those pop out in front of us.”

    Sykes said the second snag is that ground-based telescopes mainly rely on visible light for detection. “If [an asteroid] has a dark surface, it’s going to be very hard to see,” he said.

    NEOCam addresses these two problems by being in space, where Sykes says “you’re not blinded by the sky.”

    The telescope would also use an advanced, high-resolution infrared camera. Infrared is a longer wavelength of light that’s invisible to our eyes, but if a source is strong enough – say, a roaring fire – we can feel invisible light as warmth on our skin.

    Asteroids warmed by the Sun, radioactive elements, or both will emit infrared light, even when they’re too small or dark for ground-based telescopes to see. Which means NEOCam could spot them merely by their heat signatures.
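
    A rough illustration of why the thermal infrared is the right place to look: Wien’s displacement law gives the wavelength at which a body’s thermal emission peaks, and a near-Earth asteroid at an assumed temperature of roughly 250 kelvin (a ballpark figure near Earth’s distance from the Sun) peaks deep in the infrared.

```python
# Where does the thermal emission of a ~250 K asteroid peak? (Wien's displacement law)
wien_b = 2.898e-3      # Wien displacement constant, in meter-kelvin
T_asteroid = 250.0     # assumed asteroid temperature in kelvin (rough ballpark near 1 AU)

peak_wavelength_um = wien_b / T_asteroid * 1e6
print(f"Peak emission near {peak_wavelength_um:.1f} micrometers")
# ~11.6 micrometers: far beyond visible light, which is why a dark asteroid that is
# nearly invisible to an optical survey can still glow brightly to an infrared camera.
```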

    This approach is already proven to work.

    The prime example is NASA’s eight-year-old Wide-field Infrared Survey Explorer (WISE) telescope, which has found roughly 275 NEOs, including 50 potentially hazardous objects, or PHOs (so named because they come within 4.6 million miles of Earth at some point in their orbits).

    NASA Wise Telescope

    (NASA/JPL-Caltech)

    However, it’s a less powerful telescope with a smaller field of view and an older camera that requires a cryogenic coolant that eventually runs out (NEOCam’s doesn’t need it), and it wasn’t designed solely to hunt asteroids.

    The telescope, now called NEOWISE, may end operations in December 2018.

    NEOCam is Earth’s best immediate hope for quick detection of asteroids

    According to a recent study in The Astronomical Journal, neither NEOCam nor LSST alone would ever achieve Congress’ 90 percent detection mandate – only by working together, the research found, could the observatories achieve that goal over a decade.

    But NEOCam offers significant upgrades to the situation under LSST.

    In its latest pitch to NASA, the NEOCam team proposed to launch in 2021 and find two-thirds of missing objects in the larger-than-460-feet (140 meters) category within four years, or about a decade ahead of LSST’s schedule.

    About 70 percent of all NEOs that are 460 feet (140 meters) or larger have not yet been found, according to a report published by the White House’s National Science and Technology Council (NSTC) in December 2016.

    This amounts to about 25,000 nearby asteroids and roughly 2,300 potentially hazardous ones.

    The NSTC report suggests that an orbiting telescope like NEOCam could also help root out asteroids that would strike with a force somewhere between a Tunguska-type event (occurring about once every 100-200 years) and a Chelyabinsk-type event (occurring about once every 10 years), of which less than 1 percent have been located.

    So if launching a more-capable replacement for NEOWISE is a top priority, why might NASA not fully fund NEOCam for a 2024 launch?

    ‘NASA has a responsibility to do it’

    The team behind NEOCam has pitched the mission to NASA three times – in 2006, 2010, and 2015 – and three times NASA has punted on fully funding the telescope.

    The last instance it was denied, sources told Business Insider the proposal had no major technical weaknesses. Instead, it was a case of trying to jam a square peg into a round bureaucratic hole.

    The NASA competition it was a part of, called Discovery, values scientific firsts – not ensuring humanity’s safety – and thus did not grant NEOCam nearly US$450 million to develop its spacecraft and a rocket with which to launch it.

    NASA instead picked two new space missions to explore the Solar System: Lucy, a probe that will visit swarms of ancient asteroids lurking near Jupiter, and Psyche, which will orbit the all-metal core of a dead planet.

    For Schweickart’s part, he doesn’t care about the distinction.

    “NASA has a responsibility to do it, and it’s not happening,” he said. “It needs to be put into the NASA budget both by NASA and by the Congress.”

    NEOCam did get US$35 million in the 2018 government funding bill to keep itself going, but proponents say this is not enough to get the telescope to a launch pad.

    “In the meantime, NEOCam is in a zombie state and all the while Earth waits inevitably in the crosshairs,” Richard Binzel, a planetary scientist and expert on the hazards posed by asteroids at Massachusetts Institute of Technology, told Business Insider in an email.

    Binzel is one of three scientists who wrote a recent op-ed in Space News in support of fully funding the project, even though they’re not on the project’s team.

    Binzel and others argue NEOCam could get launched by raising the House of Representatives’ proposed budget for NASA planetary defence by another US$40 million (from US$160 million to US$200 million) and by sharing a rocket ride with a spacecraft called IMAP, which the agency plans to launch in 2024.

    By working in coordination with ground-based telescopes, NEOCam could achieve nearly 70 percent detection in four years, and the agency’s target of 90 percent detection in less than 10 years.

    Finding such money is not easy though. Binzel said the infrequency of asteroid strikes makes it politically uncostly to instead fund other initiatives year after year.

    “But the consequences of being wrong are irresponsible, especially when the capability to gain the necessary knowledge is easily within our grasp,” he said.

    “We should simply act like responsible adults and ‘just do it.’ What are we waiting for?”

    It’s now up to President Trump and Congress

    Schweickart acknowledged that NASA’s budgeting and culture have, for decades, been focused on pushing top-tier scientific exploration and that deviating from this norm – Congressional mandate or not – isn’t easy.

    “You’re going upstream. You’re fighting a pretty strong headwind within NASA,” he said, adding that pulling money from science budgets to fund anything is extremely unpopular. “But government agencies are not at liberty to ask for increases in their budget.”

    Schweickart and fellow retired astronaut Ed Lu tried years ago to end-run around the problem by co-founding the B612 Foundation, which is a nonprofit dedicated to developing NEO-detecting capabilities.

    But the group tabled its longest-running (and most expensive) idea, the Sentinel space telescope, in part to improve NEOCam’s chances of getting funded. On Oct. 29, the organisation even publicized its strong support for lawmakers fully funding its rival.

    The public also appears to be on-board with NASA making asteroid detection projects like NEOCam happen.

    In a June poll by Pew Research Center, nearly two-thirds of 2,500 American adults surveyed said that asteroid monitoring should be a top priority for NASA. (Only monitoring climate change was higher.)

    It remains to be seen what the Trump administration will decide to do with NEOCam in the next NASA budget, and if Congress authorizes that funding.

    “That’s a February discussion,” Stephen Jurczyk, NASA’s associate administrator, told Business Insider at the Economist Space Summit.

    “All of that’s all embargoed until the president releases his budget to Congress.”

    Jurczyk acknowledged the tension between NASA’s duty to locate dangerous asteroids along with internal changes required to make that work happen.

    “It is to some extent a cultural issue, where we kind of have this mentality of pure science and pure competition,” he said.

    “I think we’re starting to evolve to a more diverse and more balanced approach between pure science and other things that we need to do.”

    The question is whether those changes will happen before the next Tunguska-type asteroid arrives at Earth. Given enough warning, we might fly out to such a space rock and prevent a calamity or, if there isn’t enough time for that, try to move people out of harm’s way.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 2:49 pm on October 16, 2018
    Tags: Deep Skies Lab, Galaxy Zoo-Citizen Science, Gravitational lenses, LSST-Large Synoptic Survey Telescope

    From Symmetry: “Studying the stars with machine learning” 

    Symmetry Mag
    From Symmetry

    10/16/18
    Evelyn Lamb

    Illustration by Sandbox Studio, Chicago with Corinne Mucha

    To keep up with an impending astronomical increase in data about our universe, astrophysicists turn to machine learning.

    Kevin Schawinski had a problem.

    In 2007 he was an astrophysicist at Oxford University and hard at work reviewing seven years’ worth of photographs from the Sloan Digital Sky Survey—images of more than 900,000 galaxies. He spent his days looking at image after image, noting whether a galaxy looked spiral or elliptical, or logging which way it seemed to be spinning.

    Technological advancements had sped up scientists’ ability to collect information, but scientists were still processing information at the same rate. After working on the task full time and barely making a dent, Schawinski and colleague Chris Lintott decided there had to be a better way to do this.

    There was: a citizen science project called Galaxy Zoo. Schawinski and Lintott recruited volunteers from the public to help out by classifying images online. Showing the same images to multiple volunteers allowed them to check one another’s work. More than 100,000 people chipped in and condensed a task that would have taken years into just under six months.

    Citizen scientists continue to contribute to image-classification tasks. But technology also continues to advance.

    The Dark Energy Spectroscopic Instrument, scheduled to begin observations in 2019, will measure the velocities of about 30 million galaxies and quasars over five years.

    LBNL/DESI Dark Energy Spectroscopic Instrument for the Nicholas U. Mayall 4-meter telescope at Kitt Peak National Observatory near Tucson, Ariz, USA

    The Large Synoptic Survey Telescope, scheduled to begin operations in the early 2020s, will collect more than 30 terabytes of data each night—for a decade.

    LSST


    LSST Camera, built at SLAC



    LSST telescope, currently under construction on the El Peñón peak at Cerro Pachón Chile, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    “The volume of datasets [from those surveys] will be at least an order of magnitude larger,” says Camille Avestruz, a postdoctoral researcher at the University of Chicago.

    To keep up, astrophysicists like Schawinski and Avestruz have recruited a new class of non-scientist scientists: machines.

    Researchers are using artificial intelligence to help with a variety of tasks in astronomy and cosmology, from image analysis to telescope scheduling.

    Superhuman scheduling, computerized calibration

    Artificial intelligence is an umbrella term for ways in which computers can seem to reason, make decisions, learn, and perform other tasks that we associate with human intelligence. Machine learning is a subfield of artificial intelligence that uses statistical techniques and pattern recognition to train computers to make decisions, rather than programming more direct algorithms.

    In 2017, a research group from Stanford University used machine learning to study images of strong gravitational lensing, a phenomenon in which an accumulation of matter in space is dense enough that it bends light waves as they travel around it.

    Gravitational Lensing NASA/ESA

    Because many gravitational lenses can’t be accounted for by luminous matter alone, a better understanding of gravitational lenses can help astronomers gain insight into dark matter.

    In the past, scientists have conducted this research by comparing actual images of gravitational lenses with large numbers of computer simulations of mathematical lensing models, a process that can take weeks or even months for a single image. The Stanford team showed that machine learning algorithms can speed up this process by a factor of millions.

    Greg Stewart, SLAC National Accelerator Laboratory
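
    For readers curious what such a classifier looks like in code, here is a minimal, generic sketch of a convolutional neural network that labels small sky cutouts as “lens” or “not lens,” written in PyTorch for concreteness. It is illustrative only, not the Stanford group’s actual architecture, training data, or pipeline.

```python
# Minimal sketch of a CNN lens/non-lens classifier -- illustrative only.
import torch
import torch.nn as nn

class LensClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 64), nn.ReLU(),
            nn.Linear(64, 2),                      # two classes: lens / not lens
        )

    def forward(self, x):                          # x: (batch, 1, 64, 64) image cutouts
        return self.classifier(self.features(x))

model = LensClassifier()
fake_cutouts = torch.randn(8, 1, 64, 64)           # placeholder data, not real images
logits = model(fake_cutouts)
print(logits.shape)                                # torch.Size([8, 2])
```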

    Schawinski, who is now an astrophysicist at ETH Zürich, uses machine learning in his current work. His group has used tools called generative adversarial networks, or GANs, to recover clean versions of images that have been degraded by random noise. They recently published a paper [Astronomy and Astrophysics] about using AI to generate and test new hypotheses in astrophysics and other areas of research.
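
    For those unfamiliar with the setup, a GAN pairs two networks: a generator that produces candidate images and a discriminator that tries to tell them apart from real ones, each trained against the other. The sketch below (PyTorch, illustrative only, not the group’s published models) defines the two pieces for an image-recovery task and runs a single forward pass; the alternating adversarial training loop is omitted for brevity.

```python
# Illustrative GAN building blocks for image recovery -- not the published models.
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Maps a noisy image to a candidate clean image."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, 3, padding=1),
        )

    def forward(self, noisy):
        return self.net(noisy)

class Discriminator(nn.Module):
    """Scores how 'real' (clean) an image looks."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, 1),
        )

    def forward(self, image):
        return self.net(image)

G, D = Generator(), Discriminator()
noisy = torch.randn(4, 1, 64, 64)        # placeholder degraded images
restored = G(noisy)                      # generator's attempt at clean images
realism_score = D(restored)              # discriminator's judgment, used during training
print(restored.shape, realism_score.shape)
# In training, G is rewarded for fooling D, and D for catching G's outputs.
```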

    Another application of machine learning in astrophysics involves solving logistical challenges such as scheduling. There are only so many hours in a night that a given high-powered telescope can be used, and it can only point in one direction at a time. “It costs millions of dollars to use a telescope for on the order of weeks,” says Brian Nord, a physicist at the University of Chicago and part of Fermilab’s Machine Intelligence Group, which is tasked with helping researchers in all areas of high-energy physics deploy AI in their work.

    Machine learning can help observatories schedule telescopes so they can collect data as efficiently as possible. Both Schawinski’s lab and Fermilab are using a technique called reinforcement learning to train algorithms to solve problems like this one. In reinforcement learning, an algorithm isn’t trained on “right” and “wrong” answers but through differing rewards that depend on its outputs. The algorithms must strike a balance between the safe, predictable payoffs of understood options and the potential for a big win with an unexpected solution.
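
    As a toy illustration of that explore-versus-exploit balance, here is a minimal epsilon-greedy sketch in which a simulated scheduler learns which of three hypothetical sky fields yields the best payoff. The field names and reward values are invented; real schedulers are far more elaborate.

        # A toy epsilon-greedy "scheduler": repeatedly pick a sky field, receive a
        # noisy reward (standing in for data quality), and learn which field pays off.
        import random

        true_value = {"field_A": 0.6, "field_B": 0.8, "field_C": 0.4}  # unknown to the agent
        estimates = {f: 0.0 for f in true_value}
        counts = {f: 0 for f in true_value}
        epsilon = 0.1  # fraction of the time we explore a random field

        for night in range(1000):
            if random.random() < epsilon:
                field = random.choice(list(true_value))            # explore
            else:
                field = max(estimates, key=estimates.get)          # exploit best estimate
            reward = true_value[field] + random.gauss(0, 0.1)      # noisy payoff
            counts[field] += 1
            estimates[field] += (reward - estimates[field]) / counts[field]  # running mean

        print(max(estimates, key=estimates.get))  # usually "field_B"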

    Illustration by Sandbox Studio, Chicago with Corinne Mucha

    A growing field

    When computer science graduate student Shubhendu Trivedi of the Toyota Technological Institute at University of Chicago started teaching a graduate course on deep learning with one of his mentors, Risi Kondor, he was pleased with how many researchers from the physical sciences signed up for it. They didn’t know much about how to use AI in their research, and Trivedi realized there was an unmet need for machine learning experts to help scientists in different fields find ways of exploiting these new techniques.

    The conversations he had with researchers in his class evolved into collaborations, including participation in the Deep Skies Lab, an astronomy and artificial intelligence research group co-founded by Avestruz, Nord and astronomer Joshua Peek of the Space Telescope Science Institute. Earlier this month, they submitted their first peer-reviewed paper demonstrating the efficiency of an AI-based method to measure gravitational lensing in the Cosmic Microwave Background [CMB].

    Similar groups are popping up across the world, from Schawinski’s group in Switzerland to the Centre for Astrophysics and Supercomputing in Australia. And adoption of machine learning techniques in astronomy is increasing rapidly. In an arXiv search of astronomy papers, the terms “deep learning” and “machine learning” appear more in the titles of papers from the first seven months of 2018 than from all of 2017, which in turn had more than 2016.

    “Five years ago, [machine learning algorithms in astronomy] were esoteric tools that performed worse than humans in most circumstances,” Nord says. Today, more and more algorithms are consistently outperforming humans. “You’d be surprised at how much low-hanging fruit there is.”

    But there are obstacles to introducing machine learning into astrophysics research. One of the biggest is the fact that machine learning is a black box. “We don’t have a fundamental theory of how neural networks work and make sense of things,” Schawinski says. Scientists are understandably nervous about using tools without fully understanding how they work.

    Another related stumbling block is uncertainty. Machine learning often depends on inputs that all have some amount of noise or error, and the models themselves make assumptions that introduce uncertainty. Researchers using machine learning techniques in their work need to understand these uncertainties and communicate them accurately to each other and to the broader public.

    The state of the art in machine learning is changing so rapidly that researchers are reluctant to make predictions about what will be coming even in the next five years. “I would be really excited if as soon as data comes off the telescopes, a machine could look at it and find unexpected patterns,” Nord says.

    No matter what form future advances take, the data keeps coming faster and faster, and researchers are increasingly convinced that artificial intelligence will be necessary to help them keep up.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 3:17 pm on May 14, 2018 Permalink | Reply
    Tags: , , , , LSST-Large Synoptic Survey Telescope, , , The next big discovery in astronomy? Scientists probably found it years ago – but they don’t know it yet   

    From The Conversation: “The next big discovery in astronomy? Scientists probably found it years ago – but they don’t know it yet” 

    Conversation
    From The Conversation

    May 14, 2018
    Eileen Meyer

    An artist’s illustration of a black hole “eating” a star. NASA/JPL-Caltech

    Earlier this year, astronomers stumbled upon a fascinating finding: Thousands of black holes likely exist near the center of our galaxy.

    Hundreds — Perhaps Thousands — of Black Holes Occupy the Center of the Milky Way

    The X-ray images that enabled this discovery weren’t from some state-of-the-art new telescope. Nor were they even recently taken – some of the data was collected nearly 20 years ago.

    No, the researchers discovered the black holes by digging through old, long-archived data.

    Discoveries like this will only become more common, as the era of “big data” changes how science is done. Astronomers are gathering an exponentially greater amount of data every day – so much that it will take years to uncover all the hidden signals buried in the archives.

    The evolution of astronomy

    Sixty years ago, the typical astronomer worked largely alone or in a small team. They likely had access to a respectably large ground-based optical telescope at their home institution.

    Their observations were largely confined to optical wavelengths – more or less what the eye can see. That meant they missed signals from a host of astrophysical sources, which can emit non-visible radiation from very low-frequency radio all the way up to high-energy gamma rays. For the most part, if you wanted to do astronomy, you had to be an academic or eccentric rich person with access to a good telescope.

    Old data was stored in the form of photographic plates or published catalogs. But accessing archives from other observatories could be difficult – and it was virtually impossible for amateur astronomers.

    Today, there are observatories that cover the entire electromagnetic spectrum. No longer operated by single institutions, these state-of-the-art observatories are usually launched by space agencies and are often joint efforts involving many countries.

    With the coming of the digital age, almost all data are publicly available shortly after they are obtained. This makes astronomy very democratic – anyone who wants to can reanalyze almost any data set that makes the news. (You too can look at the Chandra data that led to the discovery of thousands of black holes!)
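
    In practice, reanalyzing archival data often starts with something as simple as opening a publicly released FITS file. Here is a minimal sketch using the astropy library; the file name is a placeholder for any image downloaded from an archive such as Chandra's.

        # Open a (hypothetical) archival FITS image and inspect it.
        from astropy.io import fits

        with fits.open("archival_observation.fits") as hdul:   # placeholder file name
            hdul.info()                       # list the extensions in the file
            data = hdul[0].data               # the image as a numpy array (may be None
                                              # if the image lives in a later extension)
            header = hdul[0].header
            print(header.get("TELESCOP"), header.get("DATE-OBS"))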

    These observatories generate a staggering amount of data. For example, the Hubble Space Telescope, operating since 1990, has made over 1.3 million observations and transmits around 20 GB of raw data every week, which is impressive for a telescope first designed in the 1970s.

    NASA/ESA Hubble Telescope

    The Atacama Large Millimeter Array in Chile now anticipates adding 2 TB of data to its archives every day.
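
    A quick back-of-envelope comparison of those two figures (my arithmetic, using only the numbers quoted above):

        hubble_bytes_per_week = 20e9       # ~20 GB of raw data per week
        alma_bytes_per_day = 2e12          # ~2 TB added to the archive per day
        print(alma_bytes_per_day * 7 / hubble_bytes_per_week)   # ALMA archives ~700x more per week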

    ESO/NRAO/NAOJ ALMA Array in Chile in the Atacama at Chajnantor plateau, at 5,000 metres

    Data firehose

    The archives of astronomical data are already impressively large. But things are about to explode.

    Each generation of observatories is usually at least 10 times more sensitive than the previous one, either because of improved technology or because the mission is simply larger. Depending on how long a new mission runs, it can detect hundreds of times more astronomical sources than previous missions at that wavelength.

    For example, compare the early EGRET gamma ray observatory, which flew in the 1990s, to NASA’s flagship mission Fermi, which turns 10 this year. EGRET detected only about 190 gamma ray sources in the sky. Fermi has seen over 5,000.

    NASA/Fermi LAT


    NASA/Fermi Gamma Ray Space Telescope

    The Large Synoptic Survey Telescope, an optical telescope currently under construction in Chile, will image the entire sky every few nights. It will be so sensitive that it will generate 10 million alerts per night on new or transient sources, leading to a catalog of over 15 petabytes after 10 years.
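
    Taking those figures at face value, a rough back-of-envelope calculation (assuming a roughly 10-hour observing night, which is my assumption rather than the article's) looks like this:

        alerts_per_night = 10_000_000
        night_hours = 10
        print(alerts_per_night / (night_hours * 3600))   # ~280 alerts every second

        catalog_bytes = 15e15                             # 15 petabytes after 10 years
        print(catalog_bytes / (10 * 365) / 1e12)          # ~4 TB of average catalog growth per night
                                                          # (nightly raw data volumes are larger)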

    LSST

    LSST Camera, built at SLAC



    LSST telescope, currently under construction on the El Peñón peak at Cerro Pachón Chile, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    The Square Kilometre Array, when completed in 2020, will be the most sensitive telescope in the world, capable of detecting airport radar stations of alien civilizations up to 50 light-years away. In just one year of activity, it will generate more data than the entire internet.


    SKA/ASKAP radio telescope at the Murchison Radio-astronomy Observatory (MRO) in Mid West region of Western Australia


    SKA Murchison Widefield Array, Boolardy station in outback Western Australia, at the Murchison Radio-astronomy Observatory (MRO)


    SKA Meerkat telescope, 90 km outside the small Northern Cape town of Carnarvon, SA


    SKA LOFAR core (“superterp”) near Exloo, Netherlands


    These ambitious projects will test scientists’ ability to handle data. Images will need to be automatically processed – meaning that the data will need to be reduced down to a manageable size or transformed into a finished product. The new observatories are pushing the envelope of computational power, requiring facilities capable of processing hundreds of terabytes per day.

    The resulting archives – all publicly searchable – will contain 1 million times more information than what can be stored on your typical 1 TB backup disk.

    Unlocking new science

    The data deluge will make astronomy a more collaborative and open science than ever before. Thanks to internet archives, robust learning communities and new outreach initiatives, citizens can now participate in science. For example, with the computer program Einstein@Home, anyone can use their computer’s idle time to help search for gravitational waves from colliding black holes.

    It’s an exciting time for scientists, too. Astronomers like myself often study physical phenomena on timescales so wildly beyond the typical human lifetime that watching them in real-time just isn’t going to happen. Events like a typical galaxy merger – which is exactly what it sounds like – can take hundreds of millions of years. All we can capture is a snapshot, like a single still frame from a video of a car accident.

    However, there are some phenomena that occur on shorter timescales, taking just a few decades, years or even seconds. That’s how scientists discovered those thousands of black holes in the new study. It’s also how they recently realized that the X-ray emission from the center of a nearby dwarf galaxy has been fading since first detected in the 1990s. These new discoveries suggest that more will be found in archival data spanning decades.

    In my own work, I use Hubble archives to make movies of “jets,” high-speed plasma ejected in beams from black holes. I used over 400 raw images spanning 13 years to make a movie of the jet in nearby galaxy M87. That movie showed, for the first time, the twisting motions of the plasma, suggesting that the jet has a helical structure.
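
    For a sense of what assembling such a movie involves, here is a heavily simplified sketch that stacks a set of already-calibrated FITS frames into an animation with astropy and matplotlib. The file pattern is hypothetical, and real Hubble data would first need alignment and calibration.

        # Stack a series of (hypothetical) FITS frames into a simple movie.
        import glob
        from astropy.io import fits
        import matplotlib.pyplot as plt
        from matplotlib.animation import FuncAnimation

        frames = [fits.getdata(f) for f in sorted(glob.glob("m87_jet_*.fits"))]
        assert frames, "no files matched the (hypothetical) pattern"

        fig, ax = plt.subplots()
        im = ax.imshow(frames[0], origin="lower", cmap="gray")

        def update(i):
            im.set_data(frames[i])       # swap in the next epoch's image
            ax.set_title(f"epoch {i}")
            return [im]

        anim = FuncAnimation(fig, update, frames=len(frames), interval=200)
        anim.save("m87_jet.mp4")         # requires ffmpeg; use a .gif with the pillow writer otherwise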

    This kind of work was only possible because other observers, for other purposes, just happened to capture images of the source I was interested in, back when I was in kindergarten. As astronomical images become larger, higher resolution and ever more sensitive, this kind of research will become the norm.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The Conversation US launched as a pilot project in October 2014. It is an independent source of news and views from the academic and research community, delivered direct to the public.
    Our team of professional editors work with university and research institute experts to unlock their knowledge for use by the wider public.
    Access to independent, high quality, authenticated, explanatory journalism underpins a functioning democracy. Our aim is to promote better understanding of current affairs and complex issues. And hopefully allow for a better quality of public discourse and conversation.

     
  • richardmitnick 12:53 pm on April 17, 2018 Permalink | Reply
    Tags: , , , LSST-Large Synoptic Survey Telescope,   

    From Symmetry: “The world’s largest astronomical movie” 

    Symmetry Mag
    Symmetry

    04/17/18
    Manuel Gnida

    Artwork by Sandbox Studio, Chicago with Ana Kova

    When the Large Synoptic Survey Telescope begins to survey the night sky in the early 2020s, it’ll collect a treasure trove of data.

    LSST


    LSST Camera, built at SLAC



    LSST telescope, currently under construction on the El Peñón peak at Cerro Pachón Chile, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    The information will benefit a wide range of groundbreaking astronomical and astrophysical research, addressing topics such as dark matter, dark energy, the formation of galaxies and detailed studies of objects in our very own cosmic neighborhood, the Milky Way.

    LSST’s centerpiece will be its 3.2-gigapixel camera, which is being assembled at the US Department of Energy’s SLAC National Accelerator Laboratory. Every few days, the largest digital camera ever built for astronomy will compile a complete image of the Southern sky. Moreover, it’ll do so over and over again for a period of 10 years. It’ll track the motions and changes of tens of billions of stars, galaxies and other objects in what will be the world’s largest stop-motion movie of the universe.

    Fulfilling this extraordinary task requires extraordinary technology. The camera will be the size of a small SUV, weigh in at a whopping 3 tons, and use state-of-the-art optics, imaging technology and data management tools. But how exactly will it work?

    Artwork by Sandbox Studio, Chicago with Ana Kova

    Collecting ancient light

    It all starts with choosing the right location for the telescope. Astronomers want the sharpest images of the dimmest objects for their analyses, and they also want to maximize their observation time. They need the nights to be dark and the air to be dry and stable.

    It turns out that the Atacama Desert, a plateau in the foothills of the Andes Mountains, scores very high for these criteria. That’s where LSST will be located—at nearly 8700 feet altitude on the Cerro Pachón ridge in Chile, 60 miles from the coastal town of La Serena.

    The next challenge is that most objects LSST researchers want to study are so far away that their light has been traveling through space for millions to billions of years. It arrives on Earth merely as a faint glow, and astronomers need to collect as much of that glow as possible. For this purpose, LSST will have a large primary mirror with a diameter close to 28 feet.

    The mirror will be part of a sophisticated three-mirror system that will reflect and focus the cosmic light into the camera.

    The unique optical design is crucial for the telescope’s extraordinary field of view—a measure of the area of sky captured with every snapshot. At 9.6 square degrees, corresponding to 40 times the area of the full moon, the large field of view will allow astronomers to put together a complete map of the Southern night sky every few days.
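
    A quick sanity check of that comparison, treating the full moon as a half-degree disk (a round number, which is why the result lands near rather than exactly at 40):

        # The full moon covers roughly 0.2 square degrees, so LSST's 9.6-square-degree
        # field holds a few dozen of them -- consistent with the factor of ~40 above.
        import math

        moon_area_sq_deg = math.pi * (0.5 / 2) ** 2    # ~0.196 square degrees
        print(round(9.6 / moon_area_sq_deg))           # ~49 full moons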

    After bouncing off the mirrors, the ancient cosmic light will enter the camera through a set of three large lenses. The largest one will have a diameter of more than 5 feet.

    Together with the mirrors, the lenses’ job is to focus the light as sharply as possible onto the focal plane—a grid of light-sensitive sensors at the back of the camera where the light from the sky will be detected.

    A filter changer will insert filters in front of the third lens, allowing astronomers to take images with different kinds of cosmic light that range from the ultraviolet to the near-infrared. This flexibility enhances the range of possible observations with LSST. For example, with an infrared filter researchers can look right through dust and get a better view of objects obscured by it. By comparing how bright an object is when seen through different filters, astronomers also learn how its emitted light varies with the wavelength, which reveals details about how the light is produced.

    Artwork by Sandbox Studio, Chicago with Ana Kova

    An extraordinary imaging device

    The heart of LSST’s camera is its 25-inch-wide focal plane. That’s where the light of stars and galaxies will be turned into electrical signals, which will then be used to reconstruct images of the sky. The focal plane will hold 189 imaging sensors, called charge-coupled devices, that perform this transformation.

    Each CCD measures 4096 by 4096 pixels, and together they’ll add up to the camera’s 3.2 gigapixels. A “good” star will span only a handful of pixels, whereas distant galaxies might appear as somewhat larger fuzzballs.
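
    That is exactly where the 3.2-gigapixel figure comes from:

        # 189 CCDs, each 4096 x 4096 pixels.
        print(189 * 4096 * 4096)   # 3_170_893_824 pixels, i.e. ~3.2 gigapixels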

    The focal plane will consist of 21 smaller square arrays, called rafts, with nine CCDs each. This modular structure will make it easier and less costly to replace imaging sensors if needed in the future.

    To the delight of astronomers interested in extremely dim objects, the camera will have a large aperture (f/1.2, for the photographers among us), meaning that it’ll let a lot of light onto the imaging sensors. However, the large aperture will also make the depth of field very shallow, which means that objects will become blurry very quickly if they are not precisely projected onto the focal plane. That’s why the focal plane will need to be extremely flat, demanding that individual CCDs don’t stick out or recess by more than 0.0004 inches.

    To eliminate unwanted background signals, known as dark currents, the sensors will also need to be cooled to minus 150 degrees Fahrenheit. The temperature will need to be kept stable to half a degree. Because water vapor inside the camera housing would form ice on the sensors at this chilly temperature, the focal plane must also be kept in a vacuum.

    In addition to the 189 “science” sensors that will capture images of the sky, the focal plane will also hold three specialty sensors in each of its four corners. Two so-called guiders will frequently monitor the position of a reference star and help LSST stay in sync with the Earth’s rotation. The third sensor, called a wavefront sensor, will be split into two halves, positioned six-hundredths of an inch above and below the focal plane. It’ll see objects as blurry “donuts” and provide information that will be used to adjust the telescope’s focus.

    Cinematography of astronomical dimension

    About every 36 seconds, once the camera has collected enough light from one patch of sky, the telescope will be repositioned to look at the next spot. A computer algorithm will determine which patches of sky LSST surveys on any given night.

    While the telescope is moving, a shutter between the filter and the camera’s third lens will close to prevent more light from falling onto the imaging sensors. At the same time, the CCDs will be read out and their information digitized.

    The data will be sent into the processing and analysis pipeline that will handle LSST’s enormous flood of information (about 20 terabytes of data every single night). There, it will be turned into usable images. The system will also flag potentially interesting events and send out alerts to astronomers within a minute.
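
    Rough numbers implied by that cadence and data volume (the 10-hour observing night is my assumption, not a figure from the article):

        night_seconds = 10 * 3600
        cadence = 36                                 # one pointing about every 36 seconds
        pointings = night_seconds // cadence
        print(pointings)                             # ~1000 pointings per night

        nightly_bytes = 20e12                        # ~20 terabytes per night
        print(nightly_bytes / pointings / 1e9)       # ~20 GB of data per pointing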

    This way—patch by patch—a complete image of the entire Southern sky will be stitched together every few days. Then the imaging process will start over and repeat for the 10-year duration of the survey, ultimately creating the largest time-lapse movie of the universe ever made and providing researchers with unprecedented research opportunities.

    For more information on LSST, visit LSST’s website or SLAC’s LSST camera website.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 1:27 pm on February 9, 2018 Permalink | Reply
    Tags: , , , , LSST-Large Synoptic Survey Telescope   

    From LSST: “LSST’s Auxiliary Telescope” 

    LSST

    Large Synoptic Survey Telescope

    February 6, 2018


    In tandem with LSST’s construction on Cerro Pachón, a smaller telescope will soon be assembled on nearby Calibration Hill, a short distance away from the main LSST facility. LSST’s 1.2-meter Auxiliary Telescope will measure atmospheric transmission, which refers to how directly light is transmitted through the Earth’s atmosphere at a given spot, as opposed to being absorbed or scattered. Because the presence of certain molecules and particles in the atmosphere changes the color of light detected by the LSST telescope, data collected by the Auxiliary Telescope, as it mirrors the nightly movements of LSST, will inform the corrections needed to make the LSST catalog more accurate.

    Elements in the atmosphere that affect how light is detected by a ground-based telescope like LSST include water, oxygen, and ozone, as well as aerosols like sea salt, dust from volcanoes, and smoke from forest fires. The presence and quantity of these elements vary from night to night, so the Auxiliary Telescope will provide this important complementary data for LSST throughout survey operations. According to Calibration Hardware Scientist Patrick Ingraham, “Having a dedicated auxiliary telescope supporting the main telescope is somewhat unique, and it will increase the quality of data produced by LSST.”
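
    Schematically, the correction such measurements enable amounts to dividing the flux the telescope records by the fraction of light the atmosphere let through at each wavelength. The numbers below are invented purely to illustrate the idea.

        # Toy example of an atmospheric-transmission correction (made-up values).
        import numpy as np

        wavelength_nm = np.array([400, 500, 600, 700, 800])
        observed_flux = np.array([0.80, 1.10, 1.05, 0.90, 0.70])     # what the telescope records
        transmission  = np.array([0.75, 0.85, 0.90, 0.88, 0.80])     # fraction of light that gets through

        corrected_flux = observed_flux / transmission                 # estimate of the flux above the atmosphere
        print(np.round(corrected_flux, 2))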

    The Auxiliary Telescope itself wasn’t built from scratch; it’s an existing telescope that has been repurposed for its role in the LSST survey. Since being moved from its original location on nearby Kitt Peak in May 2014, it’s been housed in the workshop at LSST’s Project Office in Tucson, AZ. Refurbishment work has included replacement of all the telescope’s electrical parts, including the motors and the position encoders, which record the exact position of the telescope at any given time. Mechanically speaking, the telescope is largely unchanged. Its mirrors, which were removed while this work was done, will be recoated and reinstalled once the telescope arrives on Cerro Pachón; for now they sit in separate crates that will protect them during shipping.

    Currently, the subcontractor working on the refurbishment project is almost finished with the wiring of the telescope’s electrical components. Once that’s complete, the telescope will undergo functional testing of its mechanical and electrical systems. Individual tasks that make up this testing include driving the telescope toward its upper and lower limits and ensuring the system will shut off before those limits are reached (preventing damage to the telescope), testing for excessive vibration, and testing the speed at which the telescope slews, or moves from one spot to the next. Extensive functional testing is critical now, because once the telescope is on Cerro Pachón there won’t be sufficient facilities to easily make repairs. Optical testing of the telescope will occur after the telescope is installed in its facility on the summit and re-integrated with its mirrors.

    Once the telescope is officially ready to be shipped from Tucson to Chile, the individual telescope assemblies will be packed in custom crates, and these crates will be loaded into a shipping container. It will take about two months for the shipping container to get from Tucson to Cerro Pachón. Once there, the telescope will be installed in a few pieces, with a crane, through the dome of its facility on Calibration Hill. Photos of the Auxiliary Telescope in the workshop, as well as of the facility on Cerro Pachón, can be viewed and downloaded from the LSST Gallery.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    LSST telescope, currently under construction at Cerro Pachón Chile, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    LSST Interior
    LSST telescope, currently under construction at Cerro Pachón Chile.

    LSST/Camera, built at SLAC

    The LSST is a new kind of telescope. Currently under construction in Chile, it is being built to rapidly survey the night-time sky. Compact and nimble, the LSST will move quickly between images, yet its large mirror and large field of view—almost 10 square degrees of sky, or 40 times the size of the full moon—work together to deliver more light from faint astronomical objects than any optical telescope in the world.

    From its mountaintop site in the foothills of the Andes, the LSST will take more than 800 panoramic images each night with its 3.2 billion-pixel camera, recording the entire visible sky twice each week. Each patch of sky it images will be visited 1000 times during the survey. With a light-gathering power equal to a 6.7-m diameter primary mirror, each of its 30-second observations will be able to detect objects 10 million times fainter than visible with the human eye. A powerful data system will compare new with previous images to detect changes in brightness and position of objects as big as far-distant galaxy clusters and as small as near-by asteroids.
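
    For readers who think in magnitudes, “10 million times fainter than visible with the human eye” translates roughly as follows (the naked-eye limit of about magnitude 6 is a common rule of thumb, not a figure from the article):

        import math

        delta_mag = 2.5 * math.log10(1e7)    # ~17.5 magnitudes fainter
        print(round(6 + delta_mag, 1))       # single visits reach roughly magnitude 23-24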

    The LSST’s combination of telescope, mirror, camera, data processing, and survey will capture changes in billions of faint objects and the data it provides will be used to create an animated, three-dimensional cosmic map with unprecedented depth and detail, giving us an entirely new way to look at the Universe. This map will serve a myriad of purposes, from locating that mysterious substance called dark matter and characterizing the properties of the even more mysterious dark energy, to tracking transient objects, to studying our own Milky Way Galaxy in depth. It will even be used to detect and track potentially hazardous asteroids—asteroids that might impact the Earth and cause significant damage.

    As with past technological advances that opened new windows of discovery, such a powerful system for exploring the faint and transient Universe will undoubtedly serve up surprises.

    Plans for sharing the data from LSST with the public are as ambitious as the telescope itself. Anyone with a computer will be able to view the moving map of the Universe created by the LSST, including objects a hundred million times fainter than can be observed with the unaided eye. The LSST project will provide analysis tools to enable both students and the public to participate in the process of scientific discovery. We invite you to learn more about LSST science.

    The LSST will be unique: no existing telescope or proposed camera could be retrofitted or re-designed to cover ten square degrees of sky with a collecting area of forty square meters. Named the highest priority for ground-based astronomy in the 2010 Decadal Survey, the LSST project formally began construction in July 2014.

     