Tagged: LSST-Large Synoptic Survey Telescope

  • richardmitnick 9:15 am on August 19, 2019
    Tags: "Brookhaven Completes LSST's Digital Sensor Array", LSST-Large Synoptic Survey Telescope

    From Brookhaven National Lab: “Brookhaven Completes LSST’s Digital Sensor Array” 


    August 19, 2019

    Stephanie Kossman
    (631) 344-8671

    Peter Genzer,
    (631) 344-3174

    Brookhaven National Lab has finished constructing the 3.2 gigapixel “digital film” for the world’s largest camera for cosmology, physics, and astronomy.

    SLAC National Accelerator Laboratory installs the first of Brookhaven’s 21 rafts that make up LSST’s digital sensor array. Photo courtesy SLAC National Accelerator Laboratory.

    After 16 years of dedicated planning and engineering, scientists at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory have completed a 3.2 gigapixel sensor array for the camera that will be used in the Large Synoptic Survey Telescope (LSST), a massive telescope that will observe the universe like never before.


    LSST Camera, built at SLAC

    LSST telescope, currently under construction on the El Peñón peak at Cerro Pachón, Chile, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    LSST Data Journey, Illustration by Sandbox Studio, Chicago with Ana Kova

    “This is the biggest charge-coupled device (CCD) array that has ever been built,” said Paul O’Connor, senior scientist at Brookhaven Lab’s instrumentation division. “It’s three billion pixels. No telescope has ever put this many sensors into one camera.”

    The digital sensor array is composed of about 200 16-megapixel sensors, divided into 21 modules called “rafts.” Each raft can function on its own, but when combined, they will view an area of sky that can fit more than 40 full moons in a single image. Researchers will stitch these images together to create a time-lapse movie of the complete visible universe accessible from Chile.
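The quoted figures are easy to cross-check. A minimal sketch: the 21 rafts of nine 16-megapixel CCDs each (the nine-per-raft figure appears later in this post) give the 189 science sensors and the "three billion pixels" O'Connor cites; the "about 200" total presumably also rounds in the camera's additional guide and wavefront sensors, which is an assumption here.

```python
# Cross-check of the sensor-array figures: 21 science rafts, each holding
# nine 16-megapixel CCDs.
rafts = 21
sensors_per_raft = 9
pixels_per_sensor = 16_000_000

science_sensors = rafts * sensors_per_raft
total_pixels = science_sensors * pixels_per_sensor

print(science_sensors)     # 189
print(total_pixels / 1e9)  # 3.024, i.e. "three billion pixels"
```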

    Currently under construction on a mountaintop in Chile, LSST is designed to capture the most complete images of our universe that have ever been achieved. The project to build the telescope facility and camera is a collaborative effort among more than 30 institutions from around the world, and it is primarily funded by DOE’s Office of Science and the National Science Foundation. DOE’s SLAC National Accelerator Laboratory is leading the overall effort to construct the camera—the world’s largest camera for astronomy—while Brookhaven led the design, construction, and qualification of the digital sensor array—the “digital film” for the camera.

    “It’s the heart of the camera,” said Bill Wahl, science raft subsystem manager of the LSST project at Brookhaven Lab. “What we’ve done here at Brookhaven represents years of great work by many talented scientists, engineers, and technicians. Their work will lead to a collection of images that has never been seen before by anyone. It’s an exciting time for the project and for the Lab.”

    Members of the LSST project team at Brookhaven Lab are shown with a prototype raft cryostat. In addition to the rafts, Brookhaven scientists designed and built the cryostats that hold and cool the rafts to -100° Celsius.

    Brookhaven began its LSST research and development program in 2003, with construction of the digital sensor array starting in 2014. In the time leading up to construction, Brookhaven designed and fabricated the assembly and test equipment for the science rafts used both at Brookhaven and SLAC. The Laboratory also created an entire automated production facility and cleanroom, along with production and tracking software.

    “We made sure to automate as much of the production facility as possible,” O’Connor said. “Testing a single raft could take up to three days. We were working on a tight schedule, so we had our automated facility running 24/7. Of course, out of a concern for safety, we always had someone monitoring the facility throughout the day and night.”

    Constructing the complex sensor array, which operates in a vacuum and must be cooled to -100° Celsius, is a challenge on its own. But the Brookhaven team was also tasked with testing each fully assembled raft, as well as individual sensors and electronics. Once each raft was complete, it needed to be carefully packaged in a protective environment to be safely shipped across the country to SLAC.

    The LSST team at Brookhaven completed the first raft in 2017. But soon after, they were presented with a new challenge.

    “We later discovered that design features inadvertently led to the possibility that electrical wires in the rafts could get shorted out,” O’Connor said. “The rate at which this effect was impacting the rafts was only on the order of 0.2%, but to avoid any possibility of degradation, we went through the trouble of refitting almost every raft.”

    Now, just two years after the start of raft production, the team has successfully built and shipped the final raft to SLAC for integration into the camera. This marks the end of a 16-year project at Brookhaven, which will be followed by many years of astronomical observation.

    Many of the talented team members recruited to Brookhaven for the LSST project were young engineers and technicians hired right out of graduate school. Now, they’ve all been assigned to ongoing physics projects at the Lab, such as upgrading the PHENIX detector at the Relativistic Heavy Ion Collider—a DOE Office of Science User Facility for nuclear physics research—to sPHENIX [see RHIC components below], as well as ongoing work with the ATLAS detector at CERN’s Large Hadron Collider. Brookhaven is the U.S. host laboratory for the ATLAS collaboration.

    CERN ATLAS. Image: Claudia Marcelloni.

    “Brookhaven’s role in the LSST camera project afforded new and exciting opportunities for engineers, technicians, and scientists in electro-optics, where very demanding specifications must be met,” Wahl said. “The multi-disciplined team we assembled did an excellent job achieving design objectives and I am proud of our time together. Watching junior engineers and scientists grow into very capable team members was extremely rewarding.”

    Brookhaven Lab will continue to play a strong role in LSST going forward. As the telescope undergoes its commissioning phase, Brookhaven scientists will serve as experts on the digital sensor array in the camera. They will also provide support during LSST’s operations, which are projected to begin in 2022.


    “The commissioning of such a complex camera will be an exciting and challenging endeavor,” said Brookhaven physicist Andrei Nomerotski, who is leading Brookhaven’s contributions to the commissioning and operation phases of the LSST project. “After years of using artificial signal sources for the sensor characterization, we are looking forward to seeing real stars and galaxies in the LSST CCDs.”

    Once operational in the Andes Mountains, LSST will serve nearly every subset of the astrophysics community. Perhaps most importantly, LSST will enable scientists to investigate dark energy and dark matter—two puzzles that have baffled physicists for decades. It is also estimated that LSST will find millions of asteroids in our solar system, in addition to offering new information about the creation of our galaxy. The images captured by LSST will be made available to physicists and astronomers in the U.S. and Chile immediately, making LSST one of the most advanced and accessible cosmology experiments ever created. Over time, the data will be made available to the public worldwide.

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

    BNL Campus

    BNL Center for Functional Nanomaterials



    BNL RHIC Campus

    BNL/RHIC Star Detector


    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

  • richardmitnick 11:12 am on June 20, 2019
    Tags: , , , ComCam miniature camers for the LSST, , LSST-Large Synoptic Survey Telescope,   

    From SLAC: “A miniature camera for the Large Synoptic Survey Telescope will help test the observatory and take first images” 

    June 19, 2019
    By Aiko Takeuchi-Demirci

    SLAC completed its work on ComCam, a commissioning device to be installed in Chile later this year.

    LSST ComCam

    Scientists at the Department of Energy’s SLAC National Accelerator Laboratory are building the world’s largest digital camera for astronomy and astrophysics – a minivan-sized 3,200-megapixel ‘eye’ of the future Large Synoptic Survey Telescope (LSST) that will enable unprecedented views of the universe starting in the fall of 2022 and provide new insights into dark energy and other cosmic mysteries.

    LSST Camera, being built at SLAC

    In the meantime, the lab has completed its work on a miniature version that will soon be used for testing the telescope and taking LSST’s first images of the night sky.

    These images will include glimpses of the motions of asteroids and objects in our solar system with orbits beyond that of Neptune, as well as alerts of sudden events such as supernovae, exploding stars that temporarily light up parts of the sky.

    ComCam, a commissioning camera for LSST. (Farrin Abbott/SLAC National Accelerator Laboratory)

    The device, called ComCam (short for Commissioning Camera), will use only four percent of the full LSST camera’s focal plane and produce much smaller images, but it will provide enough “imaging power” to test the observatory while its ultimate camera is still under construction. In fact, ComCam’s 144 megapixels exceed the pixel count that was available to the Sloan Digital Sky Survey, a pioneering astrophysical survey project in the early 2000s.
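ComCam's figures follow directly from a single raft; a quick sketch using the raft and sensor counts quoted in these posts:

```python
# One raft: nine 16-megapixel CCDs, versus 21 rafts in the full camera.
sensors_per_raft = 9
pixels_per_sensor = 16_000_000
full_camera_rafts = 21

comcam_megapixels = sensors_per_raft * pixels_per_sensor / 1e6
raft_fraction = 100 / full_camera_rafts  # percent of the full camera's rafts

print(comcam_megapixels)  # 144.0
print(raft_fraction)      # ~4.8, in the ballpark of the quoted "four percent"
```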

    “ComCam will give us a great head start in checking all of the interfaces between the camera, telescope, site infrastructure and data management,” says Kevin Reil, LSST commissioning scientist and SLAC staff scientist.

    After completing the integration of imaging sensors into ComCam and other tasks, the SLAC team today shipped the device to LSST headquarters in Tucson, Arizona. There, more components will be added before the finished ComCam is sent to its final destination in Chile later this year.

    A miniature LSST camera

    The extraordinarily high image quality of the full LSST camera will be largely due to its 189 state-of-the-art imaging sensors. Arranged into square arrays, called rafts, of nine sensors each, they’ll make up the camera’s focal plane. ComCam has only a single raft, which was provided by DOE’s Brookhaven National Laboratory and recently inserted into the ComCam cryostat at SLAC.

    The cryostat, specially designed and built for ComCam, holds the raft in place and cools its imaging sensors to very low temperatures to eliminate unwanted background signals and improve image quality. The ComCam cryostat uses a different refrigeration system from that of the final LSST camera, which requires a more complex system in order to handle 21 rafts.

    The raft also contains electronics boards that will digitize data taken with ComCam. These data will be sent to data management systems at the National Science Foundation-supported National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign and centers at France’s National Institute of Nuclear and Particle Physics and in Chile, where they will be analyzed by scientists around the world.

    SLAC is also building and testing the camera control system, which will allow the observatory software to send commands to ComCam, for instance, to change filters and take images. The LSST camera will use the same control system.

    Toward first images

    Once ComCam arrives in Tucson, LSST scientists will add lenses, a filter changer and a shutter. They will integrate the complete instrument with the observatory software and computing infrastructure and perform crucial tests, including a dry run that will simulate a night of observations.

    “In large projects like LSST, it’s exciting to watch the hardware and software come together into a working system over the years,” says Brian Stalder, LSST commissioning scientist in Tucson.

    Finally, ComCam will be sent to Chile and installed on the actual telescope, paving the way for LSST commissioning.

    In addition, it’ll produce LSST’s first images, albeit at a much smaller scale than the final camera. Although science studies won’t be ComCam’s primary purpose, the team expects the camera to produce images of very good quality, Reil says: “It’ll be exciting to see these early images taken with our brand new, world-class telescope.”

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition


    SLAC/LCLS II projected view

    SLAC is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. Located in Menlo Park, California, SLAC is operated by Stanford University for the DOE’s Office of Science.

  • richardmitnick 11:09 am on June 4, 2019
    Tags: LSST-Large Synoptic Survey Telescope

    From Symmetry: “Engineering the world’s largest digital camera” 


    Erika K. Carlson

    Building the Large Synoptic Survey Telescope also means solving extraordinary technological challenges.



    In a brightly lit clean room at the US Department of Energy’s SLAC National Accelerator Laboratory, engineers are building a car-sized digital camera for the Large Synoptic Survey Telescope.

    When it’s ready, LSST will image almost all of the sky visible from its vantage point on a Chilean mountain, Cerro Pachón, every few nights for a decade to make an astronomical movie of unprecedented proportions.

    The camera is a combination of many extremes. Its largest lens is one of the biggest ever created for astronomy and astrophysics. The ceramic grid that will hold its imaging sensors is so flat that no feature larger than a human red blood cell sticks up from its surface. The electronics that control the sensors are customized to fit in a very tight space and use as little power as possible.

    All of these specifications are vital for letting LSST achieve its scientific goals. And not many of them are easy to achieve. The LSST camera will do what no camera has been capable of doing before, and building it requires solving technical problems that have never been solved before.

    A game of ‘Operation’

    “When you consider a project this complex, you can’t just dive in and say ‘Here, I’m going to design and build this in one shot,’ right?” says Tim Bond, head of the LSST Camera Integration and Test team at SLAC. “You have to divide and conquer. So you break it up into smaller pieces that individual groups can work on.”

    One of those pieces is figuring out how to get the camera’s sensors into place.

    The 3.2-billion-pixel LSST camera will be the largest digital camera ever constructed. Much like handheld digital cameras, the LSST camera will be made up of imaging sensors called charge-coupled devices—189 of them. These sensors and their bundles of electronics are arranged into 21 nine-sensor pallets called “rafts.” Each one weighs more than 20 pounds and stands almost 2 feet tall.

    Each sensor is fragile enough to chip if it even touches one of the other rafts. And, to minimize gaps in the sensors’ images, all of the rafts must be installed two hundredths of an inch apart inside the camera’s ceramic grid.

    The LSST engineers couldn’t possibly install the delicate rafts by hand without destroying them, so they took on the challenge of creating a device that could do this very specific task in their place.

    They concocted one concept after another. Travis Lange, a SLAC mechanical engineer, created computer models of each to find a design that could both do the job and be built with the level of machining precision available.

    “One of the bigger challenges for this is just the tolerance of all the individual pieces and how it corresponds to how much motion I am allowed to use,” Lange says. If a part is the wrong size by even just the width of a human hair, it’s a problem. “If you have many of those parts that are off by that much, those errors all stack up.”

    One of the designs that the team drew up resembled a claw-machine game. The device would sit on a structure above the cryostat, the apparatus that keeps the camera cold. With a long arm, it would reach through to a raft waiting for installation below. Over the course of several hours, it would pull the raft up through a very precisely sized slot and into place in the grid.

    Four specialized cameras pointing at the edges of the imaging sensors would help steer the raft into place without hitting neighboring sensors, and unique imaging software would measure the gaps between rafts in real time. “It’s a crazy game of ‘Operation,’” Lange says.

    The team went with the claw-machine plan. In May 2018, they put it to the test with its first practice raft and a mock-up of the camera. After most of a day had passed, the raft was successfully in place.

    The installation robot has since gone through several other successful test runs. Now that they’ve figured out the kinks in the process, installing each raft takes about two hours. Engineers plan to start the real installation process this summer.

    Not your everyday refrigerator

    The electronics and sensors crammed together inside the camera heat up as electricity runs through them. But heat is the enemy of astronomical observation. A warm sensor will sabotage its own observations by behaving as if it senses light where there is none. And as anyone who has ever heard their laptop fan working overtime before the computer crashed may know, heat can also cause electronics to stop working.

    To keep the camera cold enough, the engineers needed to create a customized refrigeration system. They eventually made a system of eight refrigeration circuits—two for the electronics and six for the sensors.

    Each of these systems works similarly to a kitchen refrigerator, in which a fluid refrigerant carries heat away from the object or area it’s supposed to cool. Networks of tubes carry the refrigerant into and out of the camera.

    At first, the team used only metal tubes for this job. Metal is good at keeping moisture out, which is important because any water that gets into the tubes from the surrounding air would freeze and clog the system. At parts of the system where the camera would need to move around with the telescope as it points to different parts of the sky, the tubes were corrugated to make them into flexible metal hoses.

    But there was a problem. The refrigeration system’s compressor, a device that forces the refrigerant to dump its absorbed heat outside the camera, uses lubricating oil to work smoothly. As the refrigeration system ran, some of the oil would leave the compressor and travel through the tubes.

    This wouldn’t have been a problem if the oil had traveled at a consistent pace all the way through the circuit, back to the compressor. But that wasn’t happening; the oil was getting slowed down and sometimes trapped by the grooves in the corrugated metal hoses. The compressor was getting oil back in trickles or spurts rather than in a steady stream. This made the refrigeration system unpredictable and harder to maintain.

    So the team switched to a different kind of hose for the refrigeration system’s “joints,” says Diane Hascall, a SLAC mechanical engineer on the LSST camera team. “You can almost think of it like a garden hose. But it’s a very special garden hose that’s made to work with refrigerants.”

    The new hoses, called smooth-bore hoses, are made of layers of rubber, braid and other flexible materials, and they are smooth on the inside. The smooth hose lets oil return to the compressor more effectively, Hascall says.

    But there was a trade-off. Unlike the metal hoses, the smooth-bore hoses do let some moisture in.

    To deal with that, the team installed filter dryers that absorb moisture from the system. They are still figuring out how often the dryers need to be replaced to keep the camera in good shape.

    Building next-gen technology

    Building each component of a piece of technology as sophisticated as LSST is a challenge in itself, but the challenges don’t end there. Engineers must also design specialized equipment, software and procedures to test different pieces; put the pieces together; and determine what maintenance the technology will need to run smoothly.

    “There’s a huge number of subsystems,” Bond says. “All of those subsystems have to present their products. And all those products have to be assembled and tested and work in the final finished full product.”

    Bond says working on the project has been a great boon to the engineering team. He says figuring out all of the unexpected challenges that have come with making such an advanced piece of technology has been a great experience, and he looks forward to seeing what future projects the team will tackle together.

    “It’s like picking a bunch of players to set up a hockey team or something,” Bond says. “We’ve actually put together a very good team, and we’re just getting some of our younger people up to speed and trained for the next generation of experiments and projects that will come along.”

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.

  • richardmitnick 10:52 am on March 25, 2019
    Tags: ExaLearn, LSST-Large Synoptic Survey Telescope

    From insideHPC: “ExaLearn Project to bring Machine Learning to Exascale” 


    March 24, 2019

    As supercomputers become ever more capable in their march toward exascale levels of performance, scientists can run increasingly detailed and accurate simulations to study problems ranging from cleaner combustion to the nature of the universe. But these simulations are computationally expensive, and running enough of them to explore a parameter space can be intractable. Enter ExaLearn, a new machine learning project supported by DOE’s Exascale Computing Project (ECP) that aims to develop new tools to help scientists overcome this challenge by applying machine learning to very large experimental datasets and simulations.

    The first research area for ExaLearn’s surrogate models will be in cosmology to support projects such as the LSST (Large Synoptic Survey Telescope), now under construction in Chile and shown here in an artist’s rendering. (Todd Mason, Mason Productions Inc. / LSST Corporation)

    The challenge is that these powerful simulations require lots of computer time. That is, they are “computationally expensive,” consuming 10 to 50 million CPU hours for a single simulation. For example, running a 50-million-hour simulation on all 658,784 compute cores of the Cori supercomputer at NERSC would take more than three days.
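The "more than three days" figure follows directly from the quoted numbers; a minimal sketch, assuming an idealized, perfectly parallel run:

```python
# Wall-clock time for a 50-million-CPU-hour simulation spread across all
# of Cori's compute cores, assuming perfect parallel efficiency.
cpu_hours = 50_000_000
cori_compute_cores = 658_784

wall_clock_days = cpu_hours / cori_compute_cores / 24
print(round(wall_clock_days, 2))  # 3.16 -> "more than three days"
```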


    NERSC Cray Cori II supercomputer at NERSC at LBNL, named after Gerty Cori, the first American woman to win a Nobel Prize in science


    Running thousands of these simulations, which are needed to explore wide ranges in parameter space, would be intractable.

    One of the areas ExaLearn is focusing on is surrogate models. Surrogate models, often known as emulators, are built to provide rapid approximations of more expensive simulations. This allows a scientist to generate additional simulations more cheaply – running much faster on many fewer processors. To do this, the team will need to run thousands of computationally expensive simulations over a wide parameter space to train the computer to recognize patterns in the simulation data. This then allows the computer to create a computationally cheap model, easily interpolating between the parameters it was initially trained on to fill in the blanks between the results of the more expensive models.

    “Training can also take a long time, but then we expect these models to generate new simulations in just seconds,” said Peter Nugent, deputy director for science engagement in the Computational Research Division at LBNL.

    From Cosmology to Combustion

    Nugent is leading the effort to develop the so-called surrogate models as part of ExaLearn. The first research area will be cosmology, followed by combustion. But the team expects the tools to benefit a wide range of disciplines.

    “Many DOE simulation efforts could benefit from having realistic surrogate models in place of computationally expensive simulations,” ExaLearn Principal Investigator Frank Alexander of Brookhaven National Lab said at the recent ECP Annual Meeting.

    “These can be used to quickly flesh out parameter space, help with real-time decision making and experimental design, and determine the best areas to perform additional simulations.”

    The surrogate models and related simulations will aid in cosmological analyses to reduce systematic uncertainties in observations by telescopes and satellites. Such observations generate massive datasets that are currently limited by systematic uncertainties. Since we only have a single universe to observe, the only way to address these uncertainties is through simulations, so creating cheap but realistic and unbiased simulations greatly speeds up the analysis of these observational datasets. A typical cosmology experiment now requires sub-percent level control of statistical and systematic uncertainties. This then requires the generation of thousands to hundreds of thousands of computationally expensive simulations to beat down the uncertainties.

    These parameters are critical in light of two upcoming programs:

    The Dark Energy Spectroscopic Instrument, or DESI, is an advanced instrument on a telescope located in Arizona that is expected to begin surveying the universe this year.

    LBNL/DESI Dark Energy Spectroscopic Instrument for the Nicholas U. Mayall 4-meter telescope at Kitt Peak National Observatory near Tucson, Arizona, USA

    NOAO/Mayall 4 m telescope at Kitt Peak, Arizona, USA, Altitude 2,120 m (6,960 ft)

    DESI seeks to map the large-scale structure of the universe over an enormous volume and a wide range of look-back times (based on “redshift,” or the shift in the light of distant objects toward redder wavelengths). Targeting about 30 million pre-selected galaxies across one-third of the night sky, scientists will use DESI’s redshift data to construct 3D maps of the universe. There will be about 10 terabytes (TB) of raw data per year transferred from the observatory to NERSC. After running the data through the pipelines at NERSC (using millions of CPU hours), about 100 TB per year of data products will be made available as data releases approximately once a year throughout DESI’s five years of operations.

    The Large Synoptic Survey Telescope, or LSST, is currently being built on a mountaintop in Chile.



    When completed in 2021, the LSST will take more than 800 panoramic images each night with its 3.2 billion-pixel camera, recording the entire visible sky twice each week. Each patch of sky it images will be visited 1,000 times during the survey, and each of its 30-second observations will be able to detect objects 10 million times fainter than those visible to the human eye. A powerful data system will compare new images with previous ones to detect changes in the brightness and position of objects as big as far-distant galaxy clusters and as small as nearby asteroids.
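The quoted visit count is consistent with the "every few nights" cadence mentioned earlier for the decade-long survey; a quick check:

```python
# 1,000 visits per sky patch over a ten-year survey implies a revisit
# every few nights, matching the stated cadence.
survey_years = 10
visits_per_patch = 1_000

nights_between_visits = survey_years * 365 / visits_per_patch
print(nights_between_visits)  # 3.65
```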

    For these programs, the ExaLearn team will first target large-scale structure simulations of the universe since the field is more developed than others and the scale of the problem size can easily be ramped up to an exascale machine learning challenge.

    As an example of how ExaLearn will advance the field, Nugent said a researcher could run a suite of simulations with the parameters of the universe consisting of 30 percent dark energy and 70 percent dark matter, then a second suite with 25 percent and 75 percent, respectively. Each of these simulations generates three-dimensional maps of tens of billions of galaxies in the universe and shows how they cluster and spread apart as time goes by. Using a surrogate model trained on these simulations, the researcher could then quickly generate the output of a simulation in between these values, at 27.5 and 72.5 percent, without needing to run a new, costly simulation; that output, too, would show the evolution of the galaxies in the universe as a function of time. The goal of the ExaLearn software suite is that such results, and their uncertainties and biases, would be a byproduct of the training, so that one would know the generated models are consistent with a full simulation.

    Toward this end, Nugent’s team will build on two projects already underway at Berkeley Lab: CosmoFlow and CosmoGAN. CosmoFlow is a deep learning 3D convolutional neural network that can predict cosmological parameters with unprecedented accuracy using the Cori supercomputer at NERSC. CosmoGAN is exploring the use of generative adversarial networks to create cosmological weak lensing convergence maps — maps of the matter density of the universe as would be observed from Earth — at lower computational costs.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Founded on December 28, 2006, insideHPC is a blog that distills news and events in the world of HPC and presents them in bite-sized nuggets of helpfulness as a resource for supercomputing professionals. As one reader said, we’re sifting through all the news so you don’t have to!

    If you would like to contact me with suggestions, comments, corrections, errors or new company announcements, please send me an email at rich@insidehpc.com. Or you can send me mail at:

    2825 NW Upshur
    Suite G
    Portland, OR 97239

    Phone: (503) 877-5048

  • richardmitnick 9:15 am on March 25, 2019 Permalink | Reply
    Tags: "Women in Physics Group inspires the next generation of physicists and astronomers", LSST-Large Synoptic Survey Telescope

    From University of Pennsylvania: “Women in Physics Group inspires the next generation of physicists and astronomers” 

    U Penn bloc

    From University of Pennsylvania

    March 22, 2019


    Erica K. Brockmeier, Writer
    Eric Sucar, Photographer

    Willman (center) and a group of undergraduates, including physics majors as well as students studying other STEM-related disciplines, chatted informally over breakfast about their personal experiences as STEM students and researchers.

    Earlier this month, Penn’s Women in Physics group hosted its fifth annual spring conference and networking event. Students had the opportunity to meet informally and share their work with Beth Willman, a world-renowned astronomer and deputy director of the Large Synoptic Survey Telescope (LSST).


    LSST Camera, built at SLAC

    LSST telescope, currently under construction on the El Peñón peak at Cerro Pachón Chile, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    LSST Data Journey, Illustration by Sandbox Studio, Chicago with Ana Kova

    Providing access to strong role models is just one of the goals of the undergraduate-led group, which was founded in 2013 to support women studying physics through scholarship, mentorship, and social activities.

    “It’s a positive message that [Willman] is a strong, leading woman in a field that’s usually dominated by men,” says junior Olivia Sylvester from Mendham, New Jersey, a board member of the group. “In addition to learning about what she has to say about her research, you’re also taking in the fact that she’s probably overcome a lot of barriers to achieve such great success.”

    The conference kicked off with a casual morning get-together as Willman and a group of undergraduates chatted over coffee and breakfast. Students shared their experiences at Penn, with several indicating that they felt the atmosphere in the Department of Physics & Astronomy was generally welcoming and inclusive for women.

    After being introduced to several researchers in the department and sharing lunch with the Society of Physics group, undergraduate students presented the results of their summer research projects to Willman.

    First-year student Jen Locke from Ambler, Pennsylvania, presented her work from the lab of Masao Sako, an associate professor and undergraduate chair of the physics and astronomy department, on visualizing new planet candidates located in the Kuiper belt.

    Kuiper Belt. Minor Planet Center

    Next summer, Locke will work on developing a search strategy for finding new objects in the LSST database, a project that will likely involve Willman to a certain extent.

    Junior Alex Ulin from Los Angeles talked about her NASA internship on the flower-shaped starshade, a complex foldable structure that will make it easier to take pictures of potentially habitable planets that are difficult to visualize because of the brightness of the stars they orbit.

    NASA JPL Starshade

    Ulin, who wants to study materials science after graduation, worked on how to cut the nanometers-thin sheets of metal so they can cover the 20-meter-wide, origami-like structure as precisely as possible.

    Senior Abby Lee from St. Paul, Minnesota, who is advised by Gary Bernstein, the Reese W. Flower Professor of Astronomy and Astrophysics, presented the results of her research on selecting features for a physical model that describes dark matter subhalo disruption. These events, which happen when the circular “halo” around stars and galaxies interacts with black holes or large areas of dark matter, can now be visualized thanks to improvements in technology, but they require models that can help describe their behavior.

    Caterpillar Project: a Milky-Way-size dark-matter halo and its subhalos (circled), from an enormous suite of simulations. Griffen et al. 2016

    Throughout the student presentations, Willman asked questions that ranged from the technical to the philosophical. Ulin, who also sits on the board for the Women in Physics group, says that these types of projects, as well as having researchers and mentors who can provide meaningful feedback on results, are instrumental experiences for undergraduate students in physics. “Talking to someone that you see having a success in the field can really inspire you to consider research and a career in STEM,” she says.

    The final event of the conference was a public lecture from Willman. More than 70 students, faculty, and other members of the Penn community attended her presentation, “The Most Magnificent Map Ever Made.” Willman, who is a Philadelphia native, says that the LSST is poised to become one of the most productive scientific endeavors of all time. The project will look at half of the sky over 1,000 times across a 10-year period, and each image it collects will be 3.2 billion pixels large.

    In 2022, the Large Synoptic Survey Telescope (LSST) will embark on a 10-year mission to map half the sky. Willman discussed this ambitious project, as well as how the data could revolutionize the field of astronomy, during a public lecture that was held at Houston Hall.

    But Willman says that LSST’s real impact will come from distributing data in “science-ready” formats that can be used and studied easily. Through open-data initiatives that reduce barriers and enable people from a broad range of backgrounds to get involved with astronomy, Willman says that both scientists and society can benefit. “Everything that’s required in the future of scientific progress requires diversity,” she says. “Bringing ideas and people together is beneficial, and science needs as many viewpoints as possible.”

    Junior Abby Timmel from Baltimore, the third board member of the group, says that researchers like Willman who teach from their own experience instead of a textbook can do a lot to inspire students. “This event shows what it looks like to be really successful in physics, how to take the things that you’re learning about and go further with them to really make an impact,” she says.

    With more than 30 active members and a number of events throughout the year, the members of Women in Physics will continue working on their own “magnificent map” as they chart a course towards improved inclusion in STEM.

    Their annual conference is just one example of how important making connections and providing encouragement are for students in STEM. “It spreads awareness that there is a group for women physicists, but I also think that having an event that we’ve organized helps people respect the idea of a group like this,” says Ulin. “They see that not only are we trying to be a support system, we’re also actively doing things for the community.”


    U Penn campus

    Academic life at Penn is unparalleled, with 100 countries and every U.S. state represented in one of the Ivy League’s most diverse student bodies. Consistently ranked among the top 10 universities in the country, Penn enrolls 10,000 undergraduate students and welcomes an additional 10,000 students to our world-renowned graduate and professional schools.

    Penn’s award-winning educators and scholars encourage students to pursue inquiry and discovery, follow their passions, and address the world’s most challenging problems through an interdisciplinary approach.

  • richardmitnick 3:58 pm on February 19, 2019 Permalink | Reply
    Tags: A simplified version of that interface will make some of that data accessible to the public, Every 40 seconds LSST’s camera will snap a new image of the sky, Hundreds of computer cores at NCSA will be dedicated to this task, International data highways, LSST Data Journey, LSST-Large Synoptic Survey Telescope, National Center for Supercomputing Applications at the University of Illinois Urbana-Champaign, NCSA will be the central node of LSST’s data network, The two data centers NCSA and IN2P3 will provide petascale computing power corresponding to several million billion computing operations per second, They are also developing machine learning algorithms to help classify the different objects LSST finds in the sky

    From Symmetry: “An astronomical data challenge” 

    Symmetry Mag
    From Symmetry

    Illustration by Sandbox Studio, Chicago with Ana Kova

    Manuel Gnida

    The Large Synoptic Survey Telescope will manage unprecedented volumes of data produced each night.


    LSST Camera, built at SLAC

    LSST telescope, currently under construction on the El Peñón peak at Cerro Pachón Chile, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    The Large Synoptic Survey Telescope—scheduled to come online in the early 2020s—will use a 3.2-gigapixel camera to photograph a giant swath of the heavens. It’ll keep it up for 10 years, every night with a clear sky, creating the world’s largest astronomical stop-motion movie.

    The results will give scientists both an unprecedented big-picture look at the motions of billions of celestial objects over time, and an ongoing stream of millions of real-time updates each night about changes in the sky.

    Illustration by Sandbox Studio, Chicago with Ana Kova

    Accomplishing both of these tasks will require dealing with a lot of data, more than 20 terabytes each day for a decade. Collecting and storing the enormous volume of raw data, turning it into processed data that scientists can use, distributing it among institutions all over the globe, and doing all of this reliably and fast requires elaborate data management and technology.
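    The quoted daily rate implies a formidable total. A rough tally, treating “more than 20 terabytes each day” as exactly 20 TB:

```python
# Rough total raw-data volume implied by the figures above.
tb_per_day = 20
days_per_year = 365
years = 10
total_pb = tb_per_day * days_per_year * years / 1000  # decimal petabytes
print(f"~{total_pb:.0f} PB of raw data over the decade-long survey")
```

    On the order of 73 petabytes of raw data, before any of the processed data products are counted.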

    International data highways

    This type of data stream can be handled only with high-performance computing, the kind available at the National Center for Supercomputing Applications at the University of Illinois, Urbana-Champaign.

    NCSA U Illinois Urbana-Champaign Blue Waters Cray Linux XE/XK hybrid machine supercomputer

    Unfortunately, the U of I is a long way from Cerro Pachón, the remote Chilean mountaintop where the telescope will actually sit.

    But a network of dedicated data highways will make it feel like the two are right next door.

    LSST Data Journey, Illustration by Sandbox Studio, Chicago with Ana Kova

    Every 40 seconds, LSST’s camera will snap a new image of the sky. The camera’s data acquisition system will read out the data, and, after some initial corrections, send them hurtling down the mountain through newly installed high-speed optical fibers. These fibers have a bandwidth of up to 400 gigabits per second, thousands of times larger than the bandwidth of your typical home internet.
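    Those numbers imply each exposure clears the link quickly. A rough estimate, assuming 16 bits per raw pixel (an illustrative assumption, not a published LSST spec):

```python
# Time for one exposure to cross the 400 Gb/s link, assuming 16-bit pixels.
pixels = 3.2e9                        # pixels per exposure
bits_per_pixel = 16                   # ASSUMED raw depth, for illustration
image_bits = pixels * bits_per_pixel  # ~51 gigabits per exposure
link_bps = 400e9                      # 400 gigabits per second
transfer_s = image_bits / link_bps
print(f"~{transfer_s:.2f} s to transmit one exposure")
```

    About an eighth of a second per image, which is consistent with the one-second arrival figure quoted below for the base site.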

    Within a second, the data will arrive at the LSST base site in La Serena, Chile, which will store a copy before sending them to Chile’s capital, Santiago.

    From there, the data will take one of two routes across the ocean.

    The main route will lead them to São Paulo, Brazil, then fire them through cables across the ocean floor to Florida, which will pass them to Chicago, where they will finally be rerouted to the NCSA facility at the University of Illinois.

    If the primary path is interrupted, the data will take an alternative route through the Republic of Panama instead of Brazil. Either way, the entire trip—covering a distance of about 5000 miles—will take no more than 5 seconds.
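    Most of that 5-second budget is not spent in flight. Light in optical fiber travels at roughly two-thirds of its vacuum speed, so raw propagation over ~5000 miles accounts for only tens of milliseconds; routing, buffering, and store-and-forward hops dominate. A quick check:

```python
# One-way propagation delay over ~5000 miles of optical fiber.
miles = 5000
km = miles * 1.609
c_km_per_s = 299_792                   # speed of light in vacuum, km/s
fiber_km_per_s = c_km_per_s * 2 / 3    # typical group velocity in fiber
propagation_s = km / fiber_km_per_s
print(f"~{propagation_s * 1000:.0f} ms one-way propagation")
```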

    Curating LSST data for the world

    NCSA will be the central node of LSST’s data network. It will archive a second copy of the raw data and maintain key connections to two US-based facilities, the LSST headquarters in Tucson, which will manage science operations, and SLAC National Accelerator Laboratory in Menlo Park, California, which will provide support for the camera. But NCSA will also serve as the main data processing center, getting raw data ready for astrophysics research.

    NCSA will prepare the data at two speeds: quickly, for use in nightly alerts about changes to the sky, and at a more leisurely pace, for release as part of the annual catalogs of LSST data.

    Illustration by Sandbox Studio, Chicago with Ana Kova

    Alert production has to be quick, to give scientists at LSST and other instruments time to respond to transient events, such as a sudden flare from an active galaxy or dying star, or the discovery of a new asteroid streaking across the firmament. LSST will send out about 10 million of these alerts per night, each within a minute after the event.

    Hundreds of computer cores at NCSA will be dedicated to this task. With the help of event brokers—software that facilitates the interaction with the alert stream—everyone in the world will be able to subscribe to all or a subset of these alerts.
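    Conceptually, subscribing to a subset of the stream is just a filter over alerts. The sketch below uses invented alert fields; it is not the real LSST alert schema or any particular broker’s API:

```python
# Toy event-broker filter over a stream of alerts (hypothetical fields).
alerts = [
    {"id": 1, "kind": "supernova_candidate", "mag_change": 2.1},
    {"id": 2, "kind": "asteroid", "mag_change": 0.3},
    {"id": 3, "kind": "variable_star", "mag_change": 0.8},
    {"id": 4, "kind": "supernova_candidate", "mag_change": 0.4},
]

def subscribe(stream, kinds, min_mag_change=0.0):
    """Yield only the alerts a subscriber has asked for."""
    for alert in stream:
        if alert["kind"] in kinds and alert["mag_change"] >= min_mag_change:
            yield alert

# A subscriber interested only in large-amplitude supernova candidates:
selected = list(subscribe(alerts, {"supernova_candidate"}, min_mag_change=1.0))
print([a["id"] for a in selected])  # [1]
```

    Real brokers do this at a scale of millions of alerts per night, with richer selection criteria and machine-learned classifications attached to each alert.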

    NCSA will share the task of processing data for the annual data releases with IN2P3, the French National Institute of Nuclear and Particle Physics, which will also archive a copy of the raw data.


    The two data centers will provide petascale computing power, corresponding to several million billion computing operations per second.

    Illustration by Sandbox Studio, Chicago with Ana Kova

    The releases will be curated catalogs of billions of objects containing calibrated images and measurements of object properties, such as positions, shapes and the power of their light emissions. To pull these details from the data, LSST’s data experts are creating advanced software for image processing and analysis. They are also developing machine learning algorithms to help classify the different objects LSST finds in the sky.

    Annual data releases will be made available to scientists in the US and Chile and to institutions supporting LSST operations.

    Last but not least, LSST’s data management team is working on an interface that will make it easy for scientists to use the data LSST collects. What’s even better: A simplified version of that interface will make some of that data accessible to the public.


    Symmetry is a joint Fermilab/SLAC publication.

  • richardmitnick 9:38 am on December 3, 2018 Permalink | Reply
    Tags: LSST-Large Synoptic Survey Telescope

    From Science Alert: “Astronaut Warns This Neglected NASA Telescope Is Our Best Chance to Avoid Death by Asteroid” 


    From Science Alert

    3 DEC 2018

    A former NASA astronaut says the agency he used to work for has a duty to protect civilians from killer asteroids, but that it isn’t meeting that obligation.

    The threat of asteroid strikes might seem as abstract as outer space itself. But the risk, while infrequent, is real – and potentially more deadly than the threat posed by some of the most powerful nuclear weapons ever detonated.

    Risk of death from above

    In 1908, a space rock estimated to be several hundred feet in diameter screamed into Earth’s atmosphere at many thousands of miles per hour, causing the foreign body to explode over the remote Tunguska region of Russia with the force of a thermonuclear weapon.

    The resulting blast flattened trees over an area nearly twice the size of New York City.

    More recently, in 2013, a roughly 70-foot-wide meteor shot over Chelyabinsk, Russia.

    The concussive fireball smashed windows for miles around and sent more than 1,000 people in multiple cities to hospitals, several dozen of them with serious injuries.

    We know they’re out there

    NASA is acutely aware of such risks – and so are lawmakers.

    In 2005, Congress made it one of the agency’s seven core goals to track down 90 percent of asteroids 460 feet (140 meters) and larger, objects big enough to cause a worse-than-Tunguska-level event. The deadline for this legally mandated goal is 2020.

    So far, however, telescopes on Earth and in space have found less than one-third of these near-Earth objects (NEOs), and NASA will almost certainly fail to hit its deadline.

    Practically, this means tens of thousands of NEOs big enough to wipe out a city have yet to be found, according to a June 2018 report published by the White House.

    The same report concludes that even with current and planned capabilities, less than half of such space rocks will be located by 2033.

    We have the technology to confront the problem

    Russell “Rusty” Schweickart, an aerospace engineer and retired astronaut who flew on the Apollo 9 mission, says there is a solution waiting for this problem: NASA can launch the Near-Earth Object Camera (NEOCam), a small infrared observatory, into space.


    “It’s a critical discovery telescope to protect life on Earth, and it’s ready to go,” Schweickart told Business Insider at The Economist Space Summit on November 1.

    NEOCam’s designers have pitched the mission to NASA multiple times. The mission has received several million dollars here and there to continue its development in response to those proposals, but the agency has denied full funding in every instance on account of it not being the best purely science-focused mission.

    “For God’s sake, fund it as a mainline program. Don’t put it in yet another competition with science,” Schweickart said. “This is a public safety program.”

    How NEOCam would hunt for ‘city killer’ asteroids

    Telescopes that are looking in the right place at the right time can detect a dot of reflected sunlight sneaking across the blackness of space. This allows scientists to calculate an NEO’s mass, speed, orbit, and the odds that it will eventually smack into Earth.

    Small NEOs, though, aren’t very bright. This means a telescope has to be big, see a lot of the sky, and use very advanced hardware to pick them up. These monstrous telescopes take a very long time to build and calibrate and are budget-crushingly expensive.

    Take the Large Synoptic Survey Telescope (LSST), for example, which is one of Earth’s best current hopes of finding killer asteroids.


    LSST Camera, built at SLAC

    LSST telescope, currently under construction on the El Peñón peak at Cerro Pachón Chile, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    The project broke ground in 2015 and is expected to cost about half a billion dollars to build.

    Based on its current construction schedule, it won’t be fully operational until late 2021 at the soonest, and it won’t be able to fulfill the 90 percent detection goal set by Congress until the mid-2030s.

    LSST, like all ground-based observatories, also comes with two major limitations.

    The first: “You can’t see asteroids near the Sun. You’re blinded by the sky,” Mark Sykes, director of the Planetary Science Institute and a scientist on the NEOCam team, previously told Business Insider.

    “Right now we have to wait until those pop out in front of us.”

    Sykes said the second snag is that ground-based telescopes mainly rely on visible light for detection. “If [an asteroid] has a dark surface, it’s going to be very hard to see,” he said.

    NEOCam addresses these two problems by being in space, where Sykes says “you’re not blinded by the sky.”

    The telescope would also use an advanced, high-resolution infrared camera. Infrared is a longer wavelength of light that’s invisible to our eyes, but if a source is strong enough – say, a roaring fire – we can feel invisible light as warmth on our skin.

    Asteroids warmed by the Sun, radioactive elements, or both will emit infrared light, even when they’re too small or dark for ground-based telescopes to see. Which means NEOCam could spot them merely by their heat signatures.
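    A blackbody estimate shows why the infrared is the right band. Using Wien’s displacement law with an assumed asteroid surface temperature of about 250 K (an illustrative figure for a sun-warmed near-Earth object, not a measurement):

```python
# Wien's displacement law: peak emission wavelength = b / T.
wien_b_um_K = 2898.0     # displacement constant, micrometer-kelvins
t_asteroid_K = 250.0     # ASSUMED surface temperature of a sun-warmed NEO
peak_um = wien_b_um_K / t_asteroid_K
print(f"peak thermal emission near {peak_um:.1f} micrometers")
```

    The peak falls near 11.6 micrometers, in the mid-infrared and far outside the visible band, which is why dark asteroids that elude optical telescopes still glow brightly for an infrared camera.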

    This approach is already proven to work.

    The prime example is NASA’s eight-year-old Wide-field Infrared Survey Explorer (WISE) telescope, which has found roughly 275 NEOs, including 50 potentially hazardous asteroids, or PHOs (so named because they come within 4.6 million miles of Earth at some point in their orbits).

    NASA Wise Telescope


    However, WISE is a less powerful telescope with a smaller field of view; its older camera requires cryogenic cooling that eventually runs out (NEOCam’s doesn’t need it); and it wasn’t designed solely to hunt asteroids.

    The telescope, now called NEOWISE, may end operations in December 2018.

    NEOCam is Earth’s best immediate hope for quick detection of asteroids

    According to a recent study in The Astronomical Journal, neither NEOCam nor LSST alone would ever achieve Congress’ 90 percent detection mandate – only by working together, the research found, could the observatories achieve that goal over a decade.

    But NEOCam would significantly improve on the situation under LSST alone.

    In its latest pitch to NASA, the NEOCam team proposed to launch in 2021 and find two-thirds of missing objects in the larger-than-460-feet (140 meters) category within four years, or about a decade ahead of LSST’s schedule.

    Roughly 70 percent of all NEOs that are 460 feet (140 meters) or larger have not been found, according to a report published by the White House’s National Science and Technology Council (NSTC) in December 2016.

    This amounts to about 25,000 nearby asteroids and roughly 2,300 potentially hazardous ones.

    The NSTC report suggests that an orbiting telescope like NEOCam could also help root out asteroids that would strike with a force somewhere between a Tunguska-type event (occurring about once every 100-200 years) and a Chelyabinsk-type event (occurring about once every 10 years), of which less than 1 percent have been located.
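    Treating those recurrence intervals as a Poisson process (a common simplification, not a claim made in the report), the quoted rates translate into concrete odds:

```python
import math

# Probability of at least one Tunguska-class strike in the next 50 years,
# assuming one event per ~150 years (midpoint of the 100-200 year range).
mean_interval_years = 150.0
horizon_years = 50.0
p_at_least_one = 1.0 - math.exp(-horizon_years / mean_interval_years)
print(f"P(at least one Tunguska-class event in 50 yr) ~ {p_at_least_one:.0%}")
```

    Under these assumptions, there is roughly a one-in-four chance of a city-flattening impact somewhere on Earth within 50 years, which is the kind of number driving the funding argument above.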

    So if launching a more-capable replacement for NEOWISE is a top priority, why might NASA not fully fund NEOCam for a 2024 launch?

    ‘NASA has a responsibility to do it’

    The team behind NEOCam has pitched the mission to NASA three times – in 2006, 2010, and 2015 – and three times NASA has punted on fully funding the telescope.

    The last instance it was denied, sources told Business Insider the proposal had no major technical weaknesses. Instead, it was a case of trying to jam a square peg into a round bureaucratic hole.

    The NASA competition it was a part of, called Discovery, values scientific firsts – not ensuring humanity’s safety – and thus did not grant NEOCam the nearly US$450 million it needed to develop its spacecraft and a rocket with which to launch it.

    NASA instead picked two new space missions to explore the Solar System: Lucy, a probe that will visit swarms of ancient asteroids lurking near Jupiter, and Psyche, which will orbit the all-metal core of a dead planet.

    For Schweickart’s part, he doesn’t care about the distinction.

    “NASA has a responsibility to do it, and it’s not happening,” he said. “It needs to be put into the NASA budget both by NASA and by the Congress.”

    NEOCam did get US$35 million in the 2018 government funding bill to keep itself going, but proponents say this is not enough to get the telescope to a launch pad.

    “In the meantime, NEOCam is in a zombie state and all the while Earth waits inevitably in the crosshairs,” Richard Binzel, a planetary scientist and expert on the hazards posed by asteroids at Massachusetts Institute of Technology, told Business Insider in an email.

    Binzel is one of three scientists who wrote a recent op-ed in Space News in support of fully funding the project, even though they’re not on the project’s team.

    Binzel and others argue NEOCam could get launched by raising the House of Representatives’ proposed budget for NASA planetary defence by another US$40 million (from US$160 million to US$200 million) and by sharing a rocket ride with a spacecraft called IMAP, which the agency plans to launch in 2024.

    By working in coordination with ground-based telescopes, NEOCam could achieve nearly 70 percent detection in four years, and the agency’s target of 90 percent detection in less than 10 years.

    Finding such money is not easy, though. Binzel said the infrequency of asteroid strikes makes it politically easy to fund other initiatives instead, year after year.

    “But the consequences of being wrong are irresponsible, especially when the capability to gain the necessary knowledge is easily within our grasp,” he said.

    “We should simply act like responsible adults and ‘just do it.’ What are we waiting for?”

    It’s now up to President Trump and Congress

    Schweickart acknowledged that NASA’s budgeting and culture has, for decades, been focused on pushing top-tier scientific exploration and that deviating from this norm – Congressional mandate or not – isn’t easy.

    “You’re going upstream. You’re fighting a pretty strong headwind within NASA,” he said, adding that pulling money from science budgets to fund anything is extremely unpopular. “But government agencies are not at liberty to ask for increases in their budget.”

    Schweickart and fellow retired astronaut Ed Lu tried years ago to end-run around the problem by co-founding the B612 Foundation, which is a nonprofit dedicated to developing NEO-detecting capabilities.

    But the group tabled its longest-running (and most expensive) idea, the Sentinel space telescope, in part to improve NEOCam’s chances of getting funded. On Oct. 29, the organisation even publicized its strong support for lawmakers fully funding its rival.

    The public also appears to be on-board with NASA making asteroid detection projects like NEOCam happen.

    In a June poll by Pew Research Center, nearly two-thirds of 2,500 American adults surveyed said that asteroid monitoring should be a top priority for NASA. (Only monitoring climate change was higher.)

    It remains to be seen what the Trump administration will decide to do with NEOCam in the next NASA budget, and if Congress authorizes that funding.

    “That’s a February discussion,” Stephen Jurczyk, NASA’s associate administrator, told Business Insider at the Economist Space Summit.

    “All of that’s all embargoed until the president releases his budget to Congress.”

    Jurczyk acknowledged the tension between NASA’s duty to locate dangerous asteroids along with internal changes required to make that work happen.

    “It is to some extent a cultural issue, where we kind of have this mentality of pure science and pure competition,” he said.

    “I think we’re starting to evolve to a more diverse and more balanced approach between pure science and other things that we need to do.”

    The question is whether those changes will happen before the next Tunguska-type asteroid arrives at Earth. Given enough warning, we might fly out to such a space rock and prevent a calamity or, if there isn’t enough time for that, try to move people out of harm’s way.


  • richardmitnick 2:49 pm on October 16, 2018 Permalink | Reply
    Tags: Deep Skies Lab, Galaxy Zoo-Citizen Science, LSST-Large Synoptic Survey Telescope

    From Symmetry: “Studying the stars with machine learning” 

    Symmetry Mag
    From Symmetry

    Evelyn Lamb

    Illustration by Sandbox Studio, Chicago with Corinne Mucha

    To keep up with an impending astronomical increase in data about our universe, astrophysicists turn to machine learning.

    Kevin Schawinski had a problem.

    In 2007 he was an astrophysicist at Oxford University and hard at work reviewing seven years’ worth of photographs from the Sloan Digital Sky Survey—images of more than 900,000 galaxies. He spent his days looking at image after image, noting whether a galaxy looked spiral or elliptical, or logging which way it seemed to be spinning.

    Technological advancements had sped up scientists’ ability to collect information, but scientists were still processing information at the same rate. After working on the task full time and barely making a dent, Schawinski and colleague Chris Lintott decided there had to be a better way to do this.

    There was: a citizen science project called Galaxy Zoo. Schawinski and Lintott recruited volunteers from the public to help out by classifying images online. Showing the same images to multiple volunteers allowed them to check one another’s work. More than 100,000 people chipped in and condensed a task that would have taken years into just under six months.
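    The consensus step that let volunteers check one another’s work amounts to a majority vote per image. A minimal sketch, with invented votes:

```python
from collections import Counter

# Consensus by majority vote, as in Galaxy Zoo's redundant classifications.
def consensus(votes):
    """Return the winning label and its share of the votes."""
    label, n = Counter(votes).most_common(1)[0]
    return label, n / len(votes)

votes = ["spiral", "spiral", "elliptical", "spiral", "spiral"]
label, share = consensus(votes)
print(label, share)  # spiral 0.8
```

    The vote share doubles as a rough confidence score: images where volunteers split evenly are exactly the ambiguous cases later flagged for expert review or used to train classifiers.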

    Citizen scientists continue to contribute to image-classification tasks. But technology also continues to advance.

    The Dark Energy Spectroscopic Instrument, scheduled to begin operations in 2019, will measure the velocities of about 30 million galaxies and quasars over five years.

    LBNL/DESI Dark Energy Spectroscopic Instrument for the Nicholas U. Mayall 4-meter telescope at Kitt Peak National Observatory near Tucson, Ariz, USA

    The Large Synoptic Survey Telescope, scheduled to begin in the early 2020s, will collect more than 30 terabytes of data each night—for a decade.


    LSST Camera, built at SLAC

    LSST telescope, currently under construction on the El Peñón peak at Cerro Pachón Chile, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    “The volume of datasets [from those surveys] will be at least an order of magnitude larger,” says Camille Avestruz, a postdoctoral researcher at the University of Chicago.

    To keep up, astrophysicists like Schawinski and Avestruz have recruited a new class of non-scientist scientists: machines.

    Researchers are using artificial intelligence to help with a variety of tasks in astronomy and cosmology, from image analysis to telescope scheduling.

    Superhuman scheduling, computerized calibration

    Artificial intelligence is an umbrella term for ways in which computers can seem to reason, make decisions, learn, and perform other tasks that we associate with human intelligence. Machine learning is a subfield of artificial intelligence that uses statistical techniques and pattern recognition to train computers to make decisions, rather than programming more direct algorithms.

    In 2017, a research group from Stanford University used machine learning to study images of strong gravitational lensing, a phenomenon in which an accumulation of matter in space is dense enough that it bends light waves as they travel around it.

    Gravitational Lensing NASA/ESA

    Because many gravitational lenses can’t be accounted for by luminous matter alone, a better understanding of gravitational lenses can help astronomers gain insight into dark matter.

    In the past, scientists have conducted this research by comparing actual images of gravitational lenses with large numbers of computer simulations of mathematical lensing models, a process that can take weeks or even months for a single image. The Stanford team showed that machine learning algorithms can speed up this process by a factor of millions.

    Greg Stewart, SLAC National Accelerator Laboratory

    Schawinski, who is now an astrophysicist at ETH Zürich, uses machine learning in his current work. His group has used tools called generative adversarial networks, or GANs, to recover clean versions of images that have been degraded by random noise. They recently published a paper [Astronomy and Astrophysics] about using AI to generate and test new hypotheses in astrophysics and other areas of research.

    Another application of machine learning in astrophysics involves solving logistical challenges such as scheduling. There are only so many hours in a night that a given high-powered telescope can be used, and it can only point in one direction at a time. “It costs millions of dollars to use a telescope for on the order of weeks,” says Brian Nord, a physicist at the University of Chicago and part of Fermilab’s Machine Intelligence Group, which is tasked with helping researchers in all areas of high-energy physics deploy AI in their work.

    Machine learning can help observatories schedule telescopes so they can collect data as efficiently as possible. Both Schawinski’s lab and Fermilab are using a technique called reinforcement learning to train algorithms to solve problems like this one. In reinforcement learning, an algorithm isn’t trained on “right” and “wrong” answers but through differing rewards that depend on its outputs. The algorithms must strike a balance between the safe, predictable payoffs of understood options and the potential for a big win with an unexpected solution.
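That balance between safe payoffs and risky exploration can be sketched as a toy epsilon-greedy "bandit" scheduler. The sky fields and reward values here are invented, and real telescope schedulers weigh far more constraints (weather, slew time, moon phase, and so on).

```python
import random

def epsilon_greedy_schedule(true_values, epsilon=0.1, rounds=5000, seed=0):
    """Toy epsilon-greedy bandit: each 'arm' is a candidate sky field whose
    (hypothetical) reward is the scientific value of observing it. The
    scheduler balances exploiting the best-known field against exploring
    the others."""
    rng = random.Random(seed)
    n = len(true_values)
    estimates = [0.0] * n
    counts = [0] * n
    for _ in range(rounds):
        if rng.random() < epsilon:
            arm = rng.randrange(n)  # explore: try a random field
        else:
            arm = max(range(n), key=lambda i: estimates[i])  # exploit
        reward = true_values[arm] + rng.gauss(0, 0.1)  # noisy observation
        counts[arm] += 1
        estimates[arm] += (reward - estimates[arm]) / counts[arm]
    return estimates, counts

estimates, counts = epsilon_greedy_schedule([0.2, 0.5, 0.9])
# The highest-value field should end up dominating the observing time.
```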

    Illustration by Sandbox Studio, Chicago with Corinne Mucha

    A growing field

    When computer science graduate student Shubhendu Trivedi of the Toyota Technological Institute at Chicago started teaching a graduate course on deep learning with one of his mentors, Risi Kondor, he was pleased with how many researchers from the physical sciences signed up for it. They didn’t know much about how to use AI in their research, and Trivedi realized there was an unmet need for machine learning experts to help scientists in different fields find ways of exploiting these new techniques.

    The conversations he had with researchers in his class evolved into collaborations, including participation in the Deep Skies Lab, an astronomy and artificial intelligence research group co-founded by Avestruz, Nord and astronomer Joshua Peek of the Space Telescope Science Institute. Earlier this month, they submitted their first peer-reviewed paper demonstrating the efficiency of an AI-based method to measure gravitational lensing in the Cosmic Microwave Background [CMB].

    Similar groups are popping up across the world, from Schawinski’s group in Switzerland to the Centre for Astrophysics and Supercomputing in Australia. And adoption of machine learning techniques in astronomy is increasing rapidly. In an arXiv search of astronomy papers, the terms “deep learning” and “machine learning” appear more in the titles of papers from the first seven months of 2018 than from all of 2017, which in turn had more than 2016.

    “Five years ago, [machine learning algorithms in astronomy] were esoteric tools that performed worse than humans in most circumstances,” Nord says. Today, more and more algorithms are consistently outperforming humans. “You’d be surprised at how much low-hanging fruit there is.”

    But there are obstacles to introducing machine learning into astrophysics research. One of the biggest is the fact that machine learning is a black box. “We don’t have a fundamental theory of how neural networks work and make sense of things,” Schawinski says. Scientists are understandably nervous about using tools without fully understanding how they work.

    Another related stumbling block is uncertainty. Machine learning often depends on inputs that all have some amount of noise or error, and the models themselves make assumptions that introduce uncertainty. Researchers using machine learning techniques in their work need to understand these uncertainties and communicate those accurately to each other and the broader public.
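One common way to expose such uncertainty, shown here as a hypothetical sketch, is to train an ensemble of models and report the spread of their predictions as an error bar.

```python
import statistics

# Hypothetical predictions from five independently trained models
# (e.g. for a lensing mass parameter); the values are invented.
ensemble_predictions = [0.82, 0.79, 0.85, 0.80, 0.84]

mean = statistics.mean(ensemble_predictions)
spread = statistics.stdev(ensemble_predictions)  # disagreement across models
print(f"{mean:.2f} +/- {spread:.2f}")
```

Reporting the spread alongside the mean is one simple way to communicate model uncertainty to other researchers and to the public.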

    The state of the art in machine learning is changing so rapidly that researchers are reluctant to make predictions about what will be coming even in the next five years. “I would be really excited if as soon as data comes off the telescopes, a machine could look at it and find unexpected patterns,” Nord says.

    No matter exactly the form future advances take, the data keeps coming faster and faster, and researchers are increasingly convinced that artificial intelligence is going to be necessary to help them keep up.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.

  • richardmitnick 3:17 pm on May 14, 2018 Permalink | Reply
    Tags: LSST-Large Synoptic Survey Telescope, The next big discovery in astronomy? Scientists probably found it years ago – but they don’t know it yet

    From The Conversation: “The next big discovery in astronomy? Scientists probably found it years ago – but they don’t know it yet” 

    From The Conversation

    May 14, 2018
    Eileen Meyer

    An artist’s illustration of a black hole “eating” a star. NASA/JPL-Caltech

    Earlier this year, astronomers stumbled upon a fascinating finding: Thousands of black holes likely exist near the center of our galaxy.

    Hundreds — Perhaps Thousands — of Black Holes Occupy the Center of the Milky Way

    The X-ray images that enabled this discovery weren’t from some state-of-the-art new telescope. Nor were they even recently taken – some of the data was collected nearly 20 years ago.

    No, the researchers discovered the black holes by digging through old, long-archived data.

    Discoveries like this will only become more common, as the era of “big data” changes how science is done. Astronomers are gathering an exponentially greater amount of data every day – so much that it will take years to uncover all the hidden signals buried in the archives.

    The evolution of astronomy

    Sixty years ago, the typical astronomer worked largely alone or in a small team. They likely had access to a respectably large ground-based optical telescope at their home institution.

    Their observations were largely confined to optical wavelengths – more or less what the eye can see. That meant they missed signals from a host of astrophysical sources, which can emit non-visible radiation from very low-frequency radio all the way up to high-energy gamma rays. For the most part, if you wanted to do astronomy, you had to be an academic or eccentric rich person with access to a good telescope.

    Old data was stored in the form of photographic plates or published catalogs. But accessing archives from other observatories could be difficult – and it was virtually impossible for amateur astronomers.

    Today, there are observatories that cover the entire electromagnetic spectrum. No longer operated by single institutions, these state-of-the-art observatories are usually launched by space agencies and are often joint efforts involving many countries.

    With the coming of the digital age, almost all data are publicly available shortly after they are obtained. This makes astronomy very democratic – anyone who wants to can reanalyze almost any data set that makes the news. (You too can look at the Chandra data that led to the discovery of thousands of black holes!)

    These observatories generate a staggering amount of data. For example, the Hubble Space Telescope, operating since 1990, has made over 1.3 million observations and transmits around 20 GB of raw data every week, which is impressive for a telescope first designed in the 1970s.

    NASA/ESA Hubble Telescope

    The Atacama Large Millimeter Array in Chile now anticipates adding 2 TB of data to its archives every day.

    ESO/NRAO/NAOJ ALMA Array in Chile in the Atacama at Chajnantor plateau, at 5,000 metres
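A quick comparison of the two data rates quoted above shows how steep the growth is, treating both figures as rough averages:

```python
# Rough comparison of archival growth rates: Hubble vs. ALMA.
hubble_gb_per_week = 20
alma_tb_per_day = 2

hubble_tb_per_year = hubble_gb_per_week * 52 / 1000  # ~1 TB per year
alma_tb_per_year = alma_tb_per_day * 365             # 730 TB per year

ratio = alma_tb_per_year / hubble_tb_per_year
print(round(ratio))  # ALMA accumulates data roughly 700x faster
```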

    Data firehose

    The archives of astronomical data are already impressively large. But things are about to explode.

    Each generation of observatories is usually at least 10 times more sensitive than the previous one, either because of improved technology or because the mission is simply larger. Depending on how long a new mission runs, it can detect hundreds of times more astronomical sources than previous missions at that wavelength.

    For example, compare the early EGRET gamma ray observatory, which flew in the 1990s, to NASA’s flagship mission Fermi, which turns 10 this year. EGRET detected only about 190 gamma ray sources in the sky. Fermi has seen over 5,000.

    NASA/Fermi LAT

    NASA/Fermi Gamma Ray Space Telescope

    The Large Synoptic Survey Telescope, an optical telescope currently under construction in Chile, will image the entire sky every few nights. It will be so sensitive that it will generate 10 million alerts per night on new or transient sources, leading to a catalog of over 15 petabytes after 10 years.
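Averaged over the survey, those numbers imply a nightly data rate of several terabytes (a crude average that ignores downtime and the uneven pace of accumulation):

```python
# Back-of-the-envelope: average nightly growth of a 15 PB, 10-year catalog.
catalog_pb = 15
survey_years = 10

nights = survey_years * 365
avg_tb_per_night = catalog_pb * 1000 / nights  # 1 PB = 1000 TB
print(round(avg_tb_per_night, 1))  # roughly 4 TB per night on average
```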


    LSST Camera, built at SLAC

    LSST telescope, currently under construction on the El Peñón peak of Cerro Pachón, a 2,682-meter-high mountain in Chile's Coquimbo Region, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    The Square Kilometre Array, when completed in 2020, will be the most sensitive telescope in the world, capable of detecting airport radar stations of alien civilizations up to 50 light-years away. In just one year of activity, it will generate more data than the entire internet.

    SKA/ASKAP radio telescope at the Murchison Radio-astronomy Observatory (MRO) in Mid West region of Western Australia

    SKA Murchison Widefield Array, Boolardy station in outback Western Australia, at the Murchison Radio-astronomy Observatory (MRO)

    SKA Meerkat telescope, 90 km outside the small Northern Cape town of Carnarvon, SA

    SKA LOFAR core (“superterp”) near Exloo, Netherlands

    These ambitious projects will test scientists’ ability to handle data. Images will need to be automatically processed – meaning that the data will need to be reduced down to a manageable size or transformed into a finished product. The new observatories are pushing the envelope of computational power, requiring facilities capable of processing hundreds of terabytes per day.

    The resulting archives – all publicly searchable – will contain 1 million times more information than what can be stored on your typical 1 TB backup disk.

    Unlocking new science

    The data deluge will make astronomy a more collaborative and open science than ever before. Thanks to internet archives, robust learning communities and new outreach initiatives, citizens can now participate in science. For example, with the computer program Einstein@Home, anyone can use their computer’s idle time to help search for gravitational waves from colliding black holes.

    It’s an exciting time for scientists, too. Astronomers like myself often study physical phenomena on timescales so wildly beyond the typical human lifetime that watching them in real-time just isn’t going to happen. Events like a typical galaxy merger – which is exactly what it sounds like – can take hundreds of millions of years. All we can capture is a snapshot, like a single still frame from a video of a car accident.

    However, there are some phenomena that occur on shorter timescales, taking just a few decades, years or even seconds. That’s how scientists discovered those thousands of black holes in the new study. It’s also how they recently realized that the X-ray emission from the center of a nearby dwarf galaxy has been fading since first detected in the 1990s. These new discoveries suggest that more will be found in archival data spanning decades.

    In my own work, I use the Hubble archives to make movies of “jets,” high-speed plasma ejected in beams from black holes. I used over 400 raw images spanning 13 years to make a movie of the jet in the nearby galaxy M87. That movie showed, for the first time, the twisting motions of the plasma, suggesting that the jet has a helical structure.

    This kind of work was only possible because other observers, for other purposes, just happened to capture images of the source I was interested in, back when I was in kindergarten. As astronomical images become larger, higher resolution and ever more sensitive, this kind of research will become the norm.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    The Conversation US launched as a pilot project in October 2014. It is an independent source of news and views from the academic and research community, delivered direct to the public.
    Our team of professional editors work with university and research institute experts to unlock their knowledge for use by the wider public.
    Access to independent, high quality, authenticated, explanatory journalism underpins a functioning democracy. Our aim is to promote better understanding of current affairs and complex issues. And hopefully allow for a better quality of public discourse and conversation.

  • richardmitnick 12:53 pm on April 17, 2018 Permalink | Reply
    Tags: LSST-Large Synoptic Survey Telescope

    From Symmetry: “The world’s largest astronomical movie” 

    Symmetry Mag

    Manuel Gnida

    Artwork by Sandbox Studio, Chicago with Ana Kova

    When the Large Synoptic Survey Telescope begins to survey the night sky in the early 2020s, it’ll collect a treasure trove of data.


    LSST Camera, built at SLAC

    LSST telescope, currently under construction on the El Peñón peak of Cerro Pachón, a 2,682-meter-high mountain in Chile's Coquimbo Region, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    The information will benefit a wide range of groundbreaking astronomical and astrophysical research, addressing topics such as dark matter, dark energy, the formation of galaxies and detailed studies of objects in our very own cosmic neighborhood, the Milky Way.

    LSST’s centerpiece will be its 3.2-gigapixel camera, which is being assembled at the US Department of Energy’s SLAC National Accelerator Laboratory. Every few days, the largest digital camera ever built for astronomy will compile a complete image of the Southern sky. Moreover, it’ll do so over and over again for a period of 10 years. It’ll track the motions and changes of tens of billions of stars, galaxies and other objects in what will be the world’s largest stop-motion movie of the universe.

    Fulfilling this extraordinary task requires extraordinary technology. The camera will be the size of a small SUV, weigh in at a whopping 3 tons, and use state-of-the-art optics, imaging technology and data management tools. But how exactly will it work?

    Artwork by Sandbox Studio, Chicago with Ana Kova

    Collecting ancient light

    It all starts with choosing the right location for the telescope. Astronomers want the sharpest images of the dimmest objects for their analyses, and they also want to maximize their observation time. They need the nights to be dark and the air to be dry and stable.

    It turns out that the Atacama Desert, a plateau in the foothills of the Andes Mountains, scores very high for these criteria. That’s where LSST will be located—at nearly 8700 feet altitude on the Cerro Pachón ridge in Chile, 60 miles from the coastal town of La Serena.

    The next challenge is that most objects LSST researchers want to study are so far away that their light has been traveling through space for millions to billions of years. It arrives on Earth merely as a faint glow, and astronomers need to collect as much of that glow as possible. For this purpose, LSST will have a large primary mirror with a diameter close to 28 feet.

    The mirror will be part of a sophisticated three-mirror system that will reflect and focus the cosmic light into the camera.

    The unique optical design is crucial for the telescope’s extraordinary field of view—a measure of the area of sky captured with every snapshot. At 9.6 square degrees, corresponding to 40 times the area of the full moon, the large field of view will allow astronomers to put together a complete map of the Southern night sky every few days.
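The "40 full moons" figure can be checked with a little geometry. Assuming a lunar angular diameter of about 0.52 degrees, the ratio comes out in the same ballpark:

```python
import math

# Rough check: how many full moons fit in a 9.6-square-degree field of view?
fov_sq_deg = 9.6
moon_diameter_deg = 0.52  # approximate angular diameter of the full moon

moon_area_sq_deg = math.pi * (moon_diameter_deg / 2) ** 2
ratio = fov_sq_deg / moon_area_sq_deg
print(round(ratio))  # a few dozen moons, consistent with the quoted figure
```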

    After bouncing off the mirrors, the ancient cosmic light will enter the camera through a set of three large lenses. The largest one will have a diameter of more than 5 feet.

    Together with the mirrors, the lenses’ job is to focus the light as sharply as possible onto the focal plane—a grid of light-sensitive sensors at the back of the camera where the light from the sky will be detected.

    A filter changer will insert filters in front of the third lens, allowing astronomers to take images with different kinds of cosmic light that range from the ultraviolet to the near-infrared. This flexibility enhances the range of possible observations with LSST. For example, with an infrared filter researchers can look right through dust and get a better view of objects obscured by it. By comparing how bright an object is when seen through different filters, astronomers also learn how its emitted light varies with the wavelength, which reveals details about how the light is produced.

    Artwork by Sandbox Studio, Chicago with Ana Kova

    An Extraordinary Imaging Device

    The heart of LSST’s camera is its 25-inch-wide focal plane. That’s where the light of stars and galaxies will be turned into electrical signals, which will then be used to reconstruct images of the sky. The focal plane will hold 189 imaging sensors, called charge-coupled devices, that perform this transformation.

    Each CCD is 4096 pixels on a side, and together they’ll add up to the camera’s 3.2 gigapixels. A “good” star will span only a handful of pixels, whereas distant galaxies might appear as somewhat larger fuzzballs.

    The focal plane will consist of 21 smaller square arrays, called rafts, with nine CCDs each. This modular structure will make it easier and less costly to replace imaging sensors if needed in the future.
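A quick multiplication confirms how the rafts and CCDs add up to the quoted sensor and pixel counts:

```python
# Back-of-the-envelope check of the camera's pixel budget.
rafts = 21
ccds_per_raft = 9
pixels_per_side = 4096

total_ccds = rafts * ccds_per_raft               # 189 science sensors
total_pixels = total_ccds * pixels_per_side ** 2  # ~16.8 MP per CCD

print(total_ccds)
print(round(total_pixels / 1e9, 2))  # ~3.17 gigapixels, quoted as 3.2
```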

    To the delight of astronomers interested in extremely dim objects, the camera will have a large aperture (f/1.2, for the photographers among us), meaning that it’ll let a lot of light onto the imaging sensors. However, the large aperture will also make the depth of field very shallow, which means that objects will become blurry very quickly if they are not precisely projected onto the focal plane. That’s why the focal plane will need to be extremely flat, demanding that individual CCDs don’t stick out or recess by more than 0.0004 inches.

    To eliminate unwanted background signals, known as dark currents, the sensors will also need to be cooled to minus 150 degrees Fahrenheit. The temperature will need to be kept stable to half a degree. Because water vapor inside the camera housing would form ice on the sensors at this chilly temperature, the focal plane must also be kept in a vacuum.

    In addition to the 189 “science” sensors that will capture images of the sky, the focal plane will also have three specialty sensors in each of the four corners of the focal plane. Two so-called guiders will frequently monitor the position of a reference star and help LSST stay in sync with the Earth’s rotation. The third sensor, called a wavefront sensor, will be split into two halves that will be positioned six-hundredths of an inch above and below the focal plane. It’ll see objects as blurry “donuts” and provide information that will be used to adjust the telescope’s focus.

    Cinematography of astronomical dimension

    Once the camera has taken enough data from a patch in the sky, about every 36 seconds, the telescope will be repositioned to look at the next spot. A computer algorithm will determine the patches in the sky that will be surveyed by LSST on any given night.
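A real scheduler optimizes many competing constraints, but the core idea can be sketched as a toy greedy plan that always revisits the stalest patches first (the patch names and values here are invented):

```python
def plan_night(nights_since_visit, slots):
    """Toy greedy scheduler: pick the patches that were imaged longest ago.
    `nights_since_visit` maps patch id -> nights since its last visit."""
    order = sorted(nights_since_visit, key=nights_since_visit.get, reverse=True)
    return order[:slots]

staleness = {"P1": 4, "P2": 1, "P3": 7, "P4": 2}
print(plan_night(staleness, slots=2))  # the two most overdue patches
```

An actual observing algorithm would also weigh airmass, seeing, moonlight and filter changes, but the "most overdue first" heuristic captures how a nightly plan keeps the full-sky map fresh.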

    While the telescope is moving, a shutter between the filter and the third lens camera will close to prevent more light from falling onto the imaging sensors. At the same time, the CCDs will be read out and their information digitized.

    The data will be sent into the processing and analysis pipeline that will handle LSST’s enormous flood of information (about 20 terabytes of data every single night). There, it will be turned into usable images. The system will also flag potentially interesting events and send out alerts to astronomers within a minute.
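Flagging interesting events amounts to comparing each new exposure against a reference of the same patch of sky. The sketch below is a bare-bones, hypothetical version of that idea; real pipelines use PSF-matched image subtraction, not per-pixel thresholds.

```python
def flag_transients(new_image, reference, threshold=5.0, noise=1.0):
    """Return positions where the new exposure differs from the reference
    by more than `threshold` times the noise level (a crude stand-in for
    real difference-imaging alert generation)."""
    alerts = []
    for i, (new, ref) in enumerate(zip(new_image, reference)):
        if abs(new - ref) > threshold * noise:
            alerts.append(i)
    return alerts

reference = [10, 10, 10, 10, 10]
new_image = [10, 10, 25, 10, 11]  # pixel 2 brightened sharply
print(flag_transients(new_image, reference))  # [2]
```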

    This way—patch by patch—a complete image of the entire Southern sky will be stitched together every few days. Then the imaging process will start over and repeat for the 10-year duration of the survey, ultimately creating the largest time-lapse movie of the universe ever made and providing researchers with unprecedented research opportunities.

    For more information on LSST, visit LSST’s website or SLAC’s LSST camera website.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.
