
  • richardmitnick 5:28 pm on January 24, 2020 Permalink | Reply
    Tags: "Confirming New Physics in Space", Basic Research, RER (Rydberg Enhanced Recombination)

    From AAS NOVA: “Confirming New Physics in Space” 


    From AAS NOVA

    24 January 2020
    Susanna Kohler

    Hubble image of the planetary nebula NGC 5189. A recent study has confirmed a new atomic process at work in nebulae such as this one. [NASA, ESA and the Hubble Heritage Team (STScI/AURA)]

    Not all laboratory astrophysics occurs in labs down here on Earth; sometimes, the lab is in space! A new study has used a space laboratory to confirm a new atomic process — with far-reaching implications.

    The Cat’s Eye planetary nebula, as imaged in X-rays and optical light. [X-ray: NASA/CXC/SAO; Optical: NASA/STScI]

    Balancing a Plasma

    Throughout our universe, cosmic soups of electrons and ions — astrophysical nebulae — fill the spaces surrounding dying stars, hot and compact binaries, and even supermassive black holes. The atoms in these nebulae cycle within a delicate balance: they are ionized (electrons are torn off) by the high-energy photons emitted from the hot nearby sources, and then they recombine (electrons are recaptured), emitting glowing radiation in the process.

    After many years of research into atomic processes, we thought we had pretty well pinned down the ways in which photoionization and recombination take place. Getting this right is crucial, since these rates go into the models we use to determine abundances, which in turn inform our understanding of stellar evolution, nucleosynthesis, galactic composition and kinematics, and cosmology.
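As a rough illustration of that balance (not a calculation from the study), the equilibrium ionized fraction follows from setting the photoionization rate equal to the recombination rate. The numbers below are placeholder values for a tenuous nebula, not measured rates.

```python
def ionization_fraction(gamma, alpha, n_e):
    """Equilibrium ionized fraction x from the balance
    gamma * (1 - x) = alpha * n_e * x, where gamma is the
    photoionization rate (1/s), alpha the recombination
    coefficient (cm^3/s), and n_e the electron density (cm^-3)."""
    return gamma / (gamma + alpha * n_e)

# Placeholder values, illustrative only
gamma = 1e-12    # photoionizations per atom per second
alpha = 2.6e-13  # recombination coefficient, cm^3/s
n_e = 100.0      # electrons per cm^3

x = ionization_fraction(gamma, alpha, n_e)
```

Any extra recombination channel, such as RER, effectively increases alpha, which lowers the equilibrium ionized fraction; that is why folding RER into the models shifts the inferred abundances.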

    But what if we’re missing something?

    A New Process

    A diagram of how Rydberg Enhanced Recombination works. [Nemer et al. 2019]

    In 2010, a team of scientists proposed exactly this: that we’re missing an additional type of recombination process that occurs frequently in astrophysical plasmas throughout the universe.

    The catch? This type of recombination — which they termed Rydberg Enhanced Recombination, or RER — had never before been detected, and it’s effectively impossible to study in Earth-based laboratories. Only in cold, low-density cosmic environments like astrophysical nebulae do the conditions necessary for RER exist.

    Laboratories in Space

    When Earth-based labs fail, it’s time to look to space! A team of scientists led by Ahmad Nemer (Auburn University; Princeton University) recently went on the hunt for astrophysical laboratories showing evidence of RER.

    First, Nemer and collaborators developed detailed models of how RER would work, under what conditions it would be effective, and what observable spectral lines this process would produce.

    Illustration of a symbiotic binary system, consisting of a white dwarf and a red giant. [NASA, ESA, and D. Berry (STScI)]

    With sample spectra in hand, they then explored the high-resolution optical spectra of several planetary nebulae (the clouds of ionized plasma that surround dying, low-mass stars) and ultraviolet spectra of symbiotic binaries (systems where ionized plasma surrounds a white dwarf accreting mass from a red giant).

    Time for an Update

    Space lab success! In eight of the planetary nebulae and one of the symbiotic binaries, the authors found spectral lines that provide evidence of the RER process at work, with relative strengths that agree nicely with predictions.

    This confirmation of a predicted new atomic process represents a remarkable discovery with far-reaching implications. Nemer and collaborators show that the addition of RER contributions into our current models of ionization balance makes a significant difference in estimated elemental abundances of astrophysical nebulae — which means we may have a lot of work ahead of us to update our past research!

    Thanks to the power of laboratories in space, however, we now have a clearer idea of what we’ve been missing.

    Citation

    “First Evidence of Enhanced Recombination in Astrophysical Environments and the Implications for Plasma Diagnostics,” A. Nemer et al. 2019 ApJL 887 L9.

    https://iopscience.iop.org/article/10.3847/2041-8213/ab5954

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition


    AAS Mission and Vision Statement

    The mission of the American Astronomical Society is to enhance and share humanity’s scientific understanding of the Universe.

    The Society, through its publications, disseminates and archives the results of astronomical research. The Society also communicates and explains our understanding of the universe to the public.
    The Society facilitates and strengthens the interactions among members through professional meetings and other means. The Society supports member divisions representing specialized research and astronomical interests.
    The Society represents the goals of its community of members to the nation and the world. The Society also works with other scientific and educational societies to promote the advancement of science.
    The Society, through its members, trains, mentors and supports the next generation of astronomers. The Society supports and promotes increased participation of historically underrepresented groups in astronomy.
    The Society assists its members to develop their skills in the fields of education and public outreach at all levels. The Society promotes broad interest in astronomy, which enhances science literacy and leads many to careers in science and engineering.

    Adopted June 7, 2009

     
  • richardmitnick 4:40 pm on January 24, 2020 Permalink | Reply
    Tags: "Simulations Reveal Galaxy Clusters Details", Astrophysicists have developed cosmological computer simulations called RomulusC where the ‘C’ stands for galaxy cluster., Basic Research, RomulusC has produced some of the highest resolution simulations ever of galaxy clusters which can contain hundreds or even thousands of galaxies.

    From Texas Advanced Computing Center: “Simulations Reveal Galaxy Clusters Details” 


    From Texas Advanced Computing Center

    January 23, 2020
    Jorge Salazar

    Galaxy clusters probed with the Stampede2 and Comet supercomputers, among others [see below]

    RomulusC has produced some of the highest resolution simulations ever of galaxy clusters, which can contain hundreds or even thousands of galaxies. The galaxy cluster simulations generated by supercomputers are helping scientists map the unknown universe. Credit: Butsky et al.

    Inspired by the science fiction of the spacefaring Romulans of Star Trek, astrophysicists have developed cosmological computer simulations called RomulusC, where the ‘C’ stands for galaxy cluster. With a focus on black hole physics, RomulusC has produced some of the highest resolution simulations ever of galaxy clusters, which can contain hundreds or even thousands of galaxies.

    On Star Trek, the Romulans powered their spaceships with an artificial black hole. In reality, it turns out that black holes can drive the formation of stars and the evolution of whole galaxies. And this galaxy cluster work is helping scientists map the unknown universe.

    An October 2019 study yielded results from RomulusC simulations, published in the Monthly Notices of the Royal Astronomical Society. It probed the ionized gas of mainly hydrogen and helium within and surrounding the intracluster medium, which fills the space between galaxies in a galaxy cluster.

    Hot, dense gas of more than a million degrees Kelvin fills the inner cluster with roughly uniform metallicity. Cool-warm gas between ten thousand and a million degrees Kelvin lurks in patchy distributions at the outskirts, with greater variety of metals. Looking like the tail of a jellyfish, the cool-warm gas traces the process of galaxies falling into the cluster and losing their gas. The gas gets stripped from the falling galaxy and eventually mixes with the inner region of the galaxy cluster.
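The temperature bands described above translate directly into the kind of phase cut simulators apply to gas cells. This sketch uses the thresholds quoted in the article; the sample temperatures are invented.

```python
def gas_phase(T_kelvin):
    """Classify a gas cell by the temperature bands quoted above."""
    if T_kelvin >= 1e6:
        return "hot"        # inner intracluster medium
    if T_kelvin >= 1e4:
        return "cool-warm"  # patchy gas at the outskirts
    return "cold"

# Invented sample temperatures (K) for a handful of cells
cells = [5e6, 2e5, 3e4, 8e3, 1.2e6]
labels = [gas_phase(T) for T in cells]
```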

    “We find that there’s a substantial amount of this cool-warm gas in galaxy clusters,” said study co-author Iryna Butsky, a PhD student in the Department of Astronomy at the University of Washington. “We see that this cool-warm gas traces extremely different and complementary structures compared to the hot gas. And we also predict that this cool-warm component can be observed now with existing instruments like the Hubble Space Telescope’s Cosmic Origins Spectrograph.”

    Scientists are just beginning to probe the intracluster medium, which is so diffuse that its emissions are invisible to any current telescopes. Scientists are using RomulusC to help see clusters indirectly using the ultraviolet (UV) light from quasars, which act like a beacon shining through the gas. The gas absorbs UV light, and the resulting spectrum yields density, temperature, and metallicity profiles when analyzed with instruments like the Cosmic Origins Spectrograph aboard the Hubble Space Telescope (HST).
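To make the absorption-spectrum step concrete: the standard apparent-optical-depth method converts a normalized spectrum into a column density via N_a(v) = 3.768e14 · τ(v) / (f · λ[Å]) cm⁻² per (km/s). The flat toy trough below stands in for real COS data; the O VI rest wavelength and oscillator strength are standard atomic values, not numbers from this study.

```python
import math

def column_density(velocities_kms, norm_flux, f_osc, wavelength_A):
    """Apparent-optical-depth column density (cm^-2):
    integrate tau(v) = -ln(F/F0) over velocity."""
    N = 0.0
    for i in range(len(velocities_kms) - 1):
        dv = velocities_kms[i + 1] - velocities_kms[i]
        tau = -math.log(max(norm_flux[i], 1e-10))  # guard saturated pixels
        N += 3.768e14 * tau / (f_osc * wavelength_A) * dv
    return N

# Toy absorber: flux depressed to 0.8 across 100 km/s in O VI 1031.93 A
v = list(range(0, 101, 10))
F = [0.8] * len(v)
N_OVI = column_density(v, F, f_osc=0.1325, wavelength_A=1031.93)
```

This comes out near 6×10^13 cm⁻², a plausible O VI column; real analyses fit line profiles and handle saturation far more carefully.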


    A 5×5 megaparsec (~16.3 million light-years) snapshot of the RomulusC simulation at redshift z = 0.31. The top row shows density-weighted projections of gas density, temperature, and metallicity. The bottom row shows the integrated X-ray intensity, O VI column density, and H I column density. Credit: Butsky et al.

    “One really cool thing about simulations is that we know what’s going on everywhere inside the simulated box,” Butsky said. “We can make some synthetic observations and compare them to what we actually see in absorption spectra and then connect the dots and match the spectra that are observed and try to understand what’s really going on in this simulated box.”

    They applied a software tool called Trident, developed by Cameron Hummels of Caltech and colleagues, that takes the synthetic absorption-line spectra and adds a bit of noise and the instrument quirks known for the HST.

    “The end result is a very realistic looking spectrum that we can directly compare to existing observations,” Butsky said. “But what we can’t do with observations is reconstruct three-dimensional information from a one-dimensional spectrum. That’s what’s bridging the gap between observations and simulations.”

    One key assumption behind the RomulusC simulations supported by the latest science is that the gas making up the intracluster medium originates at least partly in the galaxies themselves. “We have to model how that gas gets out of the galaxies, which is happening through supernovae going off, and supernovae coming from young stars,” said study co-author Tom Quinn, a professor of astronomy at the University of Washington. That means a dynamic range of more than a billion to contend with.

    What’s more, clusters don’t form in isolation, so their environment has to be accounted for.

    Then there’s a computational challenge that’s particular to clusters. “Most of the computational action is happening in the very center of the cluster. Even though we’re simulating a much larger volume, most of the computation is happening at a particular spot. There’s a challenge of, as you’re trying to simulate this on a large supercomputer with tens of thousands of cores, how do you distribute that computation across those cores?” Quinn said.
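A toy 1-D decomposition shows why this is hard: if most particles crowd the cluster center, splitting space evenly across cores leaves one or two cores with nearly all the work, while splitting by particle count evens it out. The particle distribution here is invented for illustration.

```python
import random

random.seed(42)
# Invented positions: 90% of particles crowd the "cluster center" at 0.5
positions = ([0.5 + random.gauss(0, 0.05) for _ in range(9000)]
             + [random.random() for _ in range(1000)])

def max_load(positions, boundaries):
    """Largest per-core particle count when cores own the
    intervals between the given sorted boundaries."""
    counts = [0] * (len(boundaries) + 1)
    for x in positions:
        counts[sum(1 for b in boundaries if x > b)] += 1
    return max(counts)

equal_volume = max_load(positions, [0.25, 0.5, 0.75])  # naive spatial split
srt = sorted(positions)
equal_work = max_load(positions, [srt[2500], srt[5000], srt[7500]])
```

The equal-volume split leaves the busiest core with several times the particles of the work-balanced split; that is exactly the imbalance a production code like the one behind RomulusC must manage dynamically across tens of thousands of cores.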

    Quinn is no stranger to computational challenges. Since 1995, he’s used computing resources funded by the National Science Foundation (NSF), most recently those that are part of XSEDE, the Extreme Science and Engineering Discovery Environment.

    “Over the course of my career, NSF’s ability to provide high-end computing has helped the overall development of the simulation code that produced this,” said Quinn. “These parallel codes take a while to develop. And XSEDE has been supporting me throughout that development period. Access to a variety of high-end machines has helped with the development of the simulation code.”

    RomulusC started out as a proof of concept with friendly user time on the Stampede2 [below] system at the Texas Advanced Computing Center (TACC), when the Intel Xeon Phi (“Knights Landing”) processors first became available. “I got help from the TACC staff on getting the code up and running on the many-core, 68-core-per-chip machines,” Quinn said.

    Quinn and colleagues eventually scaled up RomulusC to 32,000 processors and completed the simulation on the Blue Waters system of the National Center for Supercomputing Applications.


    Along the way, they also used the NASA Pleiades supercomputer and the XSEDE-allocated Comet system at the San Diego Supercomputer Center, an Organized Research Unit of the University of California San Diego.


    “Comet fills a particular niche,” Quinn said. “It has large memory nodes available. Particular aspects of the analysis, for example identifying the galaxies, is not easily done on a distributed memory machine. Having the large shared memory machine available was very beneficial. In a sense, we didn’t have to completely parallelize that particular aspect of the analysis.”

    The Stampede2 supercomputer at the Texas Advanced Computing Center (left) and the Comet supercomputer at the San Diego Supercomputer Center (right) are allocated resources of the Extreme Science and Engineering Discovery Environment (XSEDE) funded by the National Science Foundation (NSF). Credit: TACC, SDSC.

    “Without XSEDE, we couldn’t have done this simulation,” Quinn recounted. “It’s essentially a capability simulation. We needed the capability to actually do the simulation, but also the capability of the analysis machines.”

    The next generation of simulations is being made using the NSF-funded Frontera [below] system, the fastest academic supercomputer and currently the #5 fastest system in the world, according to the November 2019 Top500 list.

    “Right now on Frontera, we’re doing runs at higher resolution of individual galaxies,” Quinn said. “Since we started these simulations, we’ve been working on improving how we model star formation. And of course we have more computational power, so just purely higher mass resolution, again, to make our simulations of individual galaxies more realistic. More and bigger clusters would be good too,” Quinn added.

    Said Butsky: “What I think is really cool about using supercomputers to model the universe is that they play a unique role in allowing us to do experiments. In many of the other sciences, you have a lab where you can test your theories. But in astronomy, you can come up with a pen and paper theory and observe the universe as it is. But without simulations, it’s very hard to run these tests because it’s hard to reproduce some of the extreme phenomena in space, like temporal scales and getting the temperatures and densities of some of these extreme objects. Simulations are extremely important in being able to make progress in theoretical work.”

    The study, “Ultraviolet Signatures of the Multiphase Intracluster and Circumgalactic Media in the RomulusC Simulation,” was published in October 2019 in the Monthly Notices of the Royal Astronomical Society. The study’s co-authors are Iryna S. Butsky, Thomas R. Quinn, and Jessica K. Werk of the University of Washington; Joseph N. Burchett of UC Santa Cruz; and Daisuke Nagai and Michael Tremmel of Yale University. Study funding came from the NSF and NASA.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    The Texas Advanced Computing Center (TACC) designs and operates some of the world’s most powerful computing resources. The center’s mission is to enable discoveries that advance science and society through the application of advanced computing technologies.

    TACC Maverick HP NVIDIA supercomputer

    TACC Lonestar Cray XC40 supercomputer

    Dell Poweredge U Texas Austin Stampede Supercomputer. Texas Advanced Computer Center 9.6 PF

    TACC HPE Apollo 8000 Hikari supercomputer


    TACC DELL EMC Stampede2 supercomputer


    TACC Frontera Dell EMC supercomputer fastest at any university

     
  • richardmitnick 3:36 pm on January 24, 2020 Permalink | Reply
    Tags: Basic Research, Microway supercomputer being installed, The new cluster from Microway affords the university five times the compute performance its researchers enjoyed previously with over 85% more total memory and over four times the aggregate memory bandwidth, The UMass Dartmouth cluster reflects a hybrid design to appeal to a wide array of the campus’ workloads.

    From insideHPC: “UMass Dartmouth Speeds Research with Hybrid Supercomputer from Microway” 

    From insideHPC

    Today Microway announced that research activities are accelerating at the University of Massachusetts Dartmouth since the installation of a new supercomputing cluster.

    “UMass Dartmouth’s powerful new cluster from Microway affords the university five times the compute performance its researchers enjoyed previously, with over 85% more total memory and over four times the aggregate memory bandwidth. It includes a heterogeneous system architecture featuring a wide array of computational engines.”


    The UMass Dartmouth cluster reflects a hybrid design to appeal to a wide array of the campus’ workloads.

    Over 50 nodes include Intel Xeon Scalable Processors, DDR4 memory, SSDs and Mellanox ConnectX-5 EDR 100Gb InfiniBand. A subset of systems also feature NVIDIA V100 GPU Accelerators for GPU computing applications.

    Equally important is a second subset of compute nodes: IBM Power Systems AC922 servers built on POWER9 CPUs with 2nd-generation NVLink. These systems are similar to those used in Summit and Sierra, the world’s #1 and #2 most powerful supercomputers, at ORNL and LLNL respectively. The advanced NVLink interfaces built into the POWER9 CPUs and NVIDIA GPUs provide a broad pipeline between CPU and GPU for data-intensive workloads.

    The deployment of the hybrid architecture system was critical to meeting the users’ needs. It also allowed those on the UMass Dartmouth campus to apply to test workloads on the larger national laboratory systems at ORNL.

    Microway was one of the few vendors able to deliver a unified system with a mix of x86 and POWER9 systems, complete software integration across both kinds of nodes in the cluster, and offer a single point of sale and warranty coverage.

    Microway was selected as the vendor for the new cluster through an open bidding process. “They not only competed well on the price,” says Khanna, “but they were also the only company that could deliver the kind of heterogeneous system we wanted with a mixture of architecture.”

    For more information about the UMass Dartmouth Center for Scientific Computing and Visualization Research please navigate to: http://cscvr1.umassd.edu/

    This new cluster purchase was funded through an Office of Naval Research (ONR) DURIP grant award.

    Serving Users Across a Research Campus

    The deployment has helped UMass Dartmouth continue to serve, attract, and retain faculty, undergraduate students, and those seeking advanced degrees. The Center for Scientific Computing and Visualization Research (CSCVR) administers the new compute resource.

    With its new cluster, CSCVR is undertaking cutting-edge work. Mathematics researchers are developing new numerical algorithms on the new deployment. A primary focus is astrophysics, particularly the study of black holes and stars.

    “Our engineering researchers,” says Gaurav Khanna, Co-Director of UMass Dartmouth’s Center for Scientific Computing & Visualization Research, “are very actively focused on computational engineering, and there are people in mechanical engineering who look at fluid and solid object interactions.” This type of research is known as two-phase fluid flow. Practical applications can take the form of modelling windmills and coming up with a better design for the materials on the windmill such as the coatings on the blade, as well as improved designs for the blades themselves.

    This team is also looking at wave energy converters in ocean buoys. “As buoys bob up and down,” Khanna explains, “you can use that motion to generate electricity. You can model that into the computation of that environment and then try to optimize the parameters needed to have the most efficient design for that type of buoy.”

    A final area of interest to this team is ocean weather systems. Here, UMass Dartmouth researchers are building large models to predict regional currents in the ocean, weather patterns, and weather changes.


    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Founded on December 28, 2006, insideHPC is a blog that distills news and events in the world of HPC and presents them in bite-sized nuggets of helpfulness as a resource for supercomputing professionals. As one reader said, we’re sifting through all the news so you don’t have to!

    If you would like to contact me with suggestions, comments, corrections, errors or new company announcements, please send me an email at rich@insidehpc.com. Or you can send me mail at:

    insideHPC
    2825 NW Upshur
    Suite G
    Portland, OR 97239

    Phone: (503) 877-5048

     
  • richardmitnick 12:20 pm on January 24, 2020 Permalink | Reply
    Tags: "NASA’s Kepler Witnesses Vampire Star System Undergoing Super-Outburst", Basic Research

    From NASA/ESA Hubble Telescope and NASA/Kepler: “NASA’s Kepler Witnesses Vampire Star System Undergoing Super-Outburst” 

    From NASA/ESA Hubble Telescope

    and

    From NASA/Kepler

    January 24, 2020

    Christine Pulliam
    Space Telescope Science Institute, Baltimore, Maryland
    410-338-4366
    cpulliam@stsci.edu

    Ryan Ridden-Harper
    Space Telescope Science Institute, Baltimore, Maryland, and
    Australian National University, Canberra, Australia
    rridden@stsci.edu

    This illustration shows a newly discovered dwarf nova system, in which a white dwarf star is pulling material off a brown dwarf companion. The material collects into an accretion disk until reaching a tipping point, causing it to suddenly increase in brightness. Using archival Kepler data, a team observed a previously unseen, and unexplained, gradual intensification followed by a super-outburst in which the system brightened by a factor of 1,600 over less than a day. [NASA and L. Hustak]

    Archival data reveals earliest stages of a dramatic event

    Astronomers searching archival data from NASA’s Kepler exoplanet-hunting mission identified a previously unknown dwarf nova that underwent a super-outburst, brightening by a factor of 1,600 in less than a day. While the outburst itself has a theoretical explanation, the slow rise in brightness that preceded it remains a mystery. Kepler’s rapid cadence of observations was crucial for recording the entire event in detail.

    The dwarf nova system consists of a white dwarf star with a brown dwarf companion. The white dwarf is stripping material from the brown dwarf, sucking its essence away like a vampire. The stripped material forms an accretion disk around the white dwarf, which is the source of the super-outburst. Such systems are rare and may go for years or decades between outbursts, making it a challenge to catch one in the act.

    NASA’s Kepler spacecraft was designed to find exoplanets by looking for stars that dim as a planet crosses the star’s face. Fortuitously, the same design makes it ideal for spotting other astronomical transients – objects that brighten or dim over time. A new search of Kepler archival data has uncovered an unusual super-outburst from a previously unknown dwarf nova. The system brightened by a factor of 1,600 over less than a day before slowly fading away.
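For scale, a back-of-the-envelope aside (only the factor of 1,600 comes from the article): that brightening corresponds to roughly 8 astronomical magnitudes, via Δm = 2.5 · log10(flux ratio).

```python
import math

def magnitude_change(flux_ratio):
    """Magnitude difference for a given brightness ratio."""
    return 2.5 * math.log10(flux_ratio)

dm = magnitude_change(1600)  # about 8.0 magnitudes
```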

    The star system in question consists of a white dwarf star with a brown dwarf companion about one-tenth as massive as the white dwarf. A white dwarf is the leftover core of an aging Sun-like star and contains about a Sun’s worth of material in a globe the size of Earth. A brown dwarf is an object with a mass between 10 and 80 Jupiters that is too small to undergo nuclear fusion.

    The brown dwarf circles the white dwarf star every 83 minutes at a distance of only 250,000 miles (400,000 km) – about the distance from Earth to the Moon. They are so close that the white dwarf’s strong gravity strips material from the brown dwarf, sucking its essence away like a vampire. The stripped material forms a disk as it spirals toward the white dwarf (known as an accretion disk).
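The quoted separation is consistent with Kepler's third law for an 83-minute orbit. The masses below are assumptions for illustration (about a Sun's worth of white dwarf, as the article describes, plus a companion one-tenth as massive); the article does not quote exact masses.

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # solar mass, kg

def separation_m(period_s, total_mass_kg):
    """Orbital separation from Kepler's third law:
    a^3 = G * M * P^2 / (4 * pi^2)."""
    return (G * total_mass_kg * period_s**2 / (4 * math.pi**2)) ** (1 / 3)

P = 83 * 60                      # 83-minute orbital period, in seconds
M = (1.0 + 0.1) * M_SUN          # assumed white dwarf + brown dwarf masses
a_km = separation_m(P, M) / 1e3  # same ballpark as the quoted 400,000 km
```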

    It was sheer chance that Kepler was looking in the right direction when this system underwent a super-outburst, brightening by more than 1,000 times. In fact, Kepler was the only instrument that could have witnessed it, since the system was too close to the Sun from Earth’s point of view at the time. Kepler’s rapid cadence of observations, taking data every 30 minutes, was crucial for catching every detail of the outburst.

    The event remained hidden in Kepler’s archive until identified by a team led by Ryan Ridden-Harper of the Space Telescope Science Institute (STScI), Baltimore, Maryland, and the Australian National University, Canberra, Australia. “In a sense, we discovered this system accidentally. We weren’t specifically looking for a super-outburst. We were looking for any sort of transient,” said Ridden-Harper.

    Kepler captured the entire event, observing a slow rise in brightness followed by a rapid intensification. While the sudden brightening is predicted by theories, the cause of the slow start remains a mystery. Standard theories of accretion disk physics don’t predict this phenomenon, which has subsequently been observed in two other dwarf nova super-outbursts.

    “These dwarf nova systems have been studied for decades, so spotting something new is pretty tricky,” said Ridden-Harper. “We see accretion disks all over – from newly forming stars to supermassive black holes – so it’s important to understand them.”

    Theories suggest that a super-outburst is triggered when the accretion disk reaches a tipping point. As it accumulates material, it grows in size until the outer edge experiences gravitational resonance with the orbiting brown dwarf. This might trigger a thermal instability, causing the disk to get superheated. Indeed, observations show that the disk’s temperature rises from about 5,000–10,000° F (2,700–5,300° C) in its normal state to a high of 17,000–21,000° F (9,700–11,700° C) at the peak of the super-outburst.

    This type of dwarf nova system is relatively rare, with only about 100 known. An individual system may go for years or decades between outbursts, making it a challenge to catch one in the act.

    “The detection of this object raises hopes for detecting even more rare events hidden in Kepler data,” said co-author Armin Rest of STScI.

    The team plans to continue mining Kepler data, as well as data from another exoplanet hunter, the Transiting Exoplanet Survey Satellite (TESS) mission, in search of other transients.


    “The continuous observations by Kepler/K2, and now TESS, of these dynamic stellar systems allows us to study the earliest hours of the outburst, a time domain that is nearly impossible to reach from ground-based observatories,” said Peter Garnavich of the University of Notre Dame in Indiana.

    This work was published in the Oct. 21, 2019 issue of the Monthly Notices of the Royal Astronomical Society.

    The Space Telescope Science Institute is expanding the frontiers of space astronomy by hosting the science operations center of the Hubble Space Telescope, the science and operations center for the James Webb Space Telescope, and the science operations center for the future Wide Field Infrared Survey Telescope (WFIRST). STScI also houses the Mikulski Archive for Space Telescopes (MAST) which is a NASA-funded project to support and provide to the astronomical community a variety of astronomical data archives, and is the data repository for the Hubble, Webb, Kepler, K2, TESS missions and more.


    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    The Hubble Space Telescope is a project of international cooperation between NASA and the European Space Agency. NASA’s Goddard Space Flight Center manages the telescope. The Space Telescope Science Institute (STScI), a free-standing science center located on the campus of The Johns Hopkins University and operated by the Association of Universities for Research in Astronomy (AURA) for NASA, conducts Hubble science operations.


     
  • richardmitnick 10:05 am on January 24, 2020 Permalink | Reply
    Tags: Basic Research, The team has been able to ramp up the machine to 500 milliamperes (mA) of current and to keep this current stable for more than six hours.

    From Brookhaven National Lab: “NSLS-II Achieves Design Beam Current of 500 Milliamperes” 

    From Brookhaven National Lab

    January 22, 2020
    Cara Laasch
    laasch@bnl.gov

    Accelerator division enables new record current during studies.

    1
    The NSLS-II accelerator division proudly gathered to celebrate their recent achievement. The screen above them shows the slow increase of the electron current in the NSLS-II storage ring and its stability.

    The National Synchrotron Light Source II (NSLS-II) [below] at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory is a gigantic x-ray microscope that allows scientists to study the inner structure of all kinds of materials and devices in real time, under realistic operating conditions. Scientists using the machine are seeking answers to questions such as how we can build longer-lasting batteries, when life started on our planet, and what kinds of new materials can be used in quantum computers, along with many other questions in a wide variety of research fields.

    The heart of the facility is a particle accelerator that circulates electrons at nearly the speed of light around the roughly half-a-mile-long ring. Steered by special magnets within the ring, the electrons generate ultrabright x-rays that enable scientists to address the broad spectrum of research at NSLS-II.

    Now, the accelerator division at NSLS-II has reached a new milestone for machine performance. During recent accelerator studies, the team has been able to ramp up the machine to 500 milliamperes (mA) of current and to keep this current stable for more than six hours. Similar to a current in a river, the current in an accelerator is a measure of the number of electrons that circulate the ring at any given time. In NSLS-II’s case, a higher electron current opens the pathway to more intense x-rays for all the experiments happening at the facility.
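
    The stored current translates into an electron count via the ring's revolution time. The following is a back-of-the-envelope sketch, assuming a circumference of about 792 m (consistent with the article's "roughly half a mile," but not a figure stated in the article):

```python
# Back-of-the-envelope: how many electrons does a 500 mA stored beam hold?
# The ~792 m circumference is an assumed value, and the electrons travel
# at essentially the speed of light.
E_CHARGE = 1.602176634e-19   # elementary charge, in coulombs
C_LIGHT = 2.99792458e8       # speed of light, in m/s

def stored_electrons(current_amps, circumference_m):
    """N = Q / e, where the stored charge Q = I * (one revolution time)."""
    revolution_time = circumference_m / C_LIGHT   # seconds per lap
    return current_amps * revolution_time / E_CHARGE

print(f"{stored_electrons(0.500, 792.0):.2e}")   # ~8.2e+12 electrons
```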

    “Since we turned on the machine for the first time in 2014 with 50mA current, we have progressed steadily upwards in current and now – in just five years – we have reached 500mA,” said Timur Shaftan, NSLS-II accelerator division director. “Along the way, we encountered many significant challenges, and it is thanks to the dedication, knowledge, and expertise of the team that we were able to overcome them all to get here.”

    All good things come in threes?

    On their quest to a higher current, the accelerator division faced three major challenges: an increase in power consumption of the radiofrequency (RF) accelerating cavities, more intense “wakefields,” and the unexpected heating of some accelerator components.

    The purpose of the RF accelerating cavities can be compared to pushing a child on a swing – with the child being the electrons. With the correct timing, large amplitudes can be driven with little effort. The cavities feed more and more energy to the electrons to compensate for the energy the electrons lose as they generate x-rays in their trips around the ring.

    “The cavities use electricity to push the electrons forward, and even though our cavities are very efficient, they still draw a good amount of raw power,” said Jim Rose, RF group leader. “To reach 500 mA, we monitored this increase closely to ensure that we wouldn’t cross our limit for power, which we didn’t. However, there is another challenge we now have to face: The cavities compress the groups of electrons—we call them bunches—that rush through the machine, and by doing so they increase the heating issues that we face. To fully address this in the future, we will install other cavities of a different RF frequency that would lengthen the bunches again.”

    Rose is referring to the issue of “wakefields.” As the electrons speed around the ring, they create so-called wakefields—just as when you run your finger through still water and create waves that roll on even though your finger is long gone. In the same way, the rushing electrons generate a front of electric fields that follows them around the ring.

    “Having more intense wakefields causes two challenges: First, these fields influence the next set of electrons, causing them to lose energy and become unstable, and second, they heat up the vacuum chamber in which the beam travels,” said accelerator physicist Alexei Blednykh. “One of the limiting components in our efforts to reach 500mA was the ceramic vacuum chambers, because they were overheating. We mitigated the effect by installing additional cooling fans. However, to fully solve the issue we will need to replace the existing chambers with new chambers that have a thin titanium coating on the inside.”
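
    The extra heating Blednykh describes typically grows roughly with the square of the stored current; that quadratic dependence is a standard storage-ring scaling, not a figure from the article. A minimal sketch of what it implies for the push to 500 mA, with hypothetical reference values:

```python
# Illustrative scaling only: for a fixed bunch pattern, the power that
# wakefields deposit in a chamber component grows roughly as the square of
# the stored current. The 400 mA / 100 W reference values are hypothetical.
def wakefield_heating_w(current_ma, ref_current_ma=400.0, ref_power_w=100.0):
    """Scale a reference heating power by (I / I_ref)**2."""
    return ref_power_w * (current_ma / ref_current_ma) ** 2

# Going from 400 mA to 500 mA raises the heat load by ~56%.
print(wakefield_heating_w(500.0) / wakefield_heating_w(400.0))   # 1.5625
```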

    The accelerator division decided to coat all the new vacuum chambers in house using a technique called direct current magnetron sputtering. During the sputtering process, a titanium target is bombarded with ionized gas so that it ejects millions of titanium atoms, which spray onto the surface of the vacuum chamber to create a thin metal film.

    “At first, coating chambers sounds easy enough, but our chambers are long and narrow, which forces you to think differently about how to apply the coating. We had to design a coating system capable of handling the geometry of our chambers,” said vacuum group leader Charlie Hetzel. “Once we developed a system that could be used to coat the chambers, we had to develop a method that could accurately measure the thickness and uniformity of the coating along the entire length of the chamber.”

    For the vacuum chambers to survive the machine at high current, the coatings had to meet a number of demanding requirements in terms of their adhesion, thickness, and uniformity.

    The third challenge the team needed to overcome was the unexpected heating found between some of the vacuum flanges. Each of the vacuum joints around the half-mile-long accelerator contains a delicate RF bridge. Any errors during installation can result in additional heating and risk to the vacuum seal of the machine.

    “We knew from the beginning that increasing the current to 500 mA would be hard on the machine, however, we needed to know exactly where the real hot spots were,” explained accelerator coordination group leader Guimei Wang. “So, we installed more than 1000 temperature sensors around the whole machine, and we ran more than 400 hours of high-current beam studies over the past three years, where we monitored the temperature, vacuum, and many other parameters of the electrons very closely to really understand how our machine is behaving.”
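
    A hot-spot search over a thousand channels boils down to threshold checks on each sensor. The sketch below is purely hypothetical (sensor names and the alarm level are invented), meant only to illustrate the kind of monitoring described:

```python
# Hypothetical sketch of a hot-spot check: flag every sensor whose
# reading exceeds an alarm threshold. Sensor names and the 60 C limit
# are invented for illustration.
readings_c = {
    "flange_017": 42.5,
    "ceramic_chamber_3": 71.2,   # an overheating ceramic chamber
    "flange_204": 38.9,
}
ALARM_THRESHOLD_C = 60.0

hot_spots = {name: t for name, t in readings_c.items() if t > ALARM_THRESHOLD_C}
print(hot_spots)   # {'ceramic_chamber_3': 71.2}
```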

    Based on all these studies, and many more hours spent analyzing each study run, the accelerator team made the necessary decisions as to which parts needed to be coated or replaced and, most importantly, how to run the machine at such a high current safely and reliably.

    Where do we go from here?

    Achieving 500mA during beam studies was an important step to begin to shed light on the physics within the machine at these high currents, as well as to understand the present limits of the accelerator. Equipped with these new insights, the accelerator division now knows that their machine can reach the 500mA current for a short time, but at this point it’s not possible to sustain high current for operations over extended periods with the RF power necessary to deliver it to users. To run the machine at this current, NSLS-II’s accelerator will need additional RF systems both to lengthen the bunches and to secure high reliability of operations, while providing sufficient RF power to the beam to generate x-rays for the growing set of beamlines.

    “Achieving 500 mA for the first time is a major milestone in the life of NSLS-II, showing that we can reach the aggressive design current goals we set for ourselves when we first started thinking about what NSLS-II could be all those years ago. This success is due to a lot of hard work, expertise, and dedication by many, many people at NSLS-II and I would like to thank them all very much,” said NSLS-II Director John Hill. “The next steps are to fully understand how the machine behaves at this current and ultimately deliver it to our users. This will require further upgrades to our accelerator systems—and we are actively working towards those now.”

    NSLS-II is a DOE Office of Science user facility.

    See the full article here .



    Please help promote STEM in your local schools.

    Stem Education Coalition

    BNL Campus

    BNL Center for Functional Nanomaterials

    BNL NSLS-II

    BNL RHIC Campus

    BNL/RHIC Star Detector

    BNL Phenix Detector

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

     
  • richardmitnick 6:25 pm on January 23, 2020 Permalink | Reply
    Tags: "Astronomers Find a New Explanation for a Super-Bright Supernova", , , Basic Research, , , Finding so much iron means the star that exploded was a white dwarf and not a large massive star that collapsed explosively., Finding the gassy material must have been released outward only 100 years or so before the supernova explosion — barely any time at all on astronomical scales., It was the brightest superluminous supernova they’d ever seen., or the two phenomena are somehow linked., Supernova SN 2006gy, The team found that the stellar explosion must have contained iron — and a lot of it.   

    From Discover Magazine: “Astronomers Find a New Explanation for a Super-Bright Supernova” 

    DiscoverMag

    From Discover Magazine

    January 23, 2020
    Erika K. Carlson

    1
    The supernova SN 2006gy, shown in this illustration, was the brightest supernova discovered yet when it was spotted in 2006. (Credit: NASA/CXC/M.Weiss)

    Many stars end their lives as bright explosions called supernovas. Some of these explosions are much brighter than typical and give off up to 100 times more energy. Astronomers call these “superluminous supernovas,” and they don’t yet understand exactly what makes them super-bright.

    Now, a team of researchers has proposed an origin story for one of these superluminous supernovas, SN 2006gy. They suggest that the explosion happened in a binary star system when a small, dense white dwarf star spiraled into the core of its giant star companion.

    The researchers presented their findings in a new paper published Jan. 23 in Science.

    A Strange Explosion

    When astronomers spotted SN 2006gy in 2006, it was the brightest superluminous supernova they’d ever seen.

    Later, a group of researchers led by Koji Kawabata, now at Hiroshima University in Japan, managed to capture a detailed picture of the light that the supernova was emitting at various wavelengths, or colors. They saw that SN 2006gy was emitting light in combinations of wavelengths that hadn’t been seen in supernovas before.

    “It was kind of a very exciting mystery,” said Anders Jerkstrand, an astronomer at Stockholm University. He teamed up with Kawabata and another researcher to figure out what was going on and write the new paper.

    2
    The supernova SN 2006gy. (Credit: Fox et al 2015)

    A New Explanation

    By modeling what elements could have produced the wavelengths of light that SN 2006gy emitted, the team found that the stellar explosion must have contained iron — and a lot of it. Finding so much iron means the star that exploded was a white dwarf and not a large, massive star that collapsed explosively.

    The wavelengths SN 2006gy emitted also showed that the explosion must have rammed into and interacted with a slower-moving shell of gassy material around it, as other astronomers had previously pointed out. The collision with the surrounding material likely caused the explosion to convert a lot of its energy into light and produce such a bright supernova, Jerkstrand said.

    But the team found that the gassy material must have been released outward only 100 years or so before the supernova explosion — barely any time at all, on astronomical scales. Either it’s a coincidence that the shell of gases was ejected just before the supernova explosion, or the two phenomena are somehow linked.

    So the team came up with a scenario to explain both events. A dense white dwarf and a giant star with a stretched-out gassy atmosphere orbit each other in a binary system. The two stars are close enough that the white dwarf orbits inside the giant star’s gaseous outer layers. The resulting drag sends the white dwarf spiraling in toward the larger star’s core and also pushes gassy material outward.

    If the white dwarf colliding into the larger star’s core caused the supernova, the explosion would interact with the ejected gas on its way out, as astronomers observed. But the team can’t yet say that this is the case with SN 2006gy, because they don’t know for sure that the inspiraling would lead to the white dwarf’s explosion.

    “What we are saying is that if that happens, you get a supernova that looks just like 2006gy,” Jerkstrand said.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 5:54 pm on January 23, 2020 Permalink | Reply
    Tags: "A megalibrary of nanoparticles", , Basic Research, , , , Schaak Laboratory   

    From Pennsylvania State University: “A megalibrary of nanoparticles” 

    Penn State Bloc

    From Pennsylvania State University

    January 23, 2020
    Sam Sholtis

    1
    A simple, modular chemical approach could produce over 65,000 different types of complex nanorods. Electron microscope images are shown for 32 of these nanorods, which form with various combinations of materials. Each color represents a different material. Image: Schaak Laboratory, Penn State

    Using straightforward chemistry and a mix-and-match, modular strategy, researchers have developed a simple approach that could produce over 65,000 different types of complex nanoparticles, each containing up to six different materials and eight segments, with interfaces that could be exploited in electrical or optical applications. These rod-shaped nanoparticles are about 55 nanometers long and 20 nanometers wide — by comparison a human hair is about 100,000 nanometers thick — and many are considered to be among the most complex ever made.

    A paper describing the research, by a team of Penn State chemists, appears Jan. 24 in the journal Science.

    “There is a lot of interest in the world of nanoscience in making nanoparticles that combine several different materials — semiconductors, catalysts, magnets, electronic materials,” said Raymond E. Schaak, DuPont Professor of Materials Chemistry at Penn State and the leader of the research team. “You can think about having different semiconductors linked together to control how electrons move through a material, or arranging materials in different ways to modify their optical, catalytic, or magnetic properties. We can use computers and chemical knowledge to predict a lot of this, but the bottleneck has been in actually making the particles, especially at a large-enough scale so that you can actually use them.”

    The team starts with simple nanorods composed of copper and sulfur. They then sequentially replace some of the copper with other metals using a process called “cation exchange.” By altering the reaction conditions, they can control where in the nanorod the copper is replaced — at one end of the rod, at both ends simultaneously, or in the middle. They can then repeat the process with other metals, which can also be placed at precise locations within the nanorods. By performing up to seven sequential reactions with several different metals, they can create a veritable rainbow of particles — over 65,000 different combinations of metal sulfide materials are possible.
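
    The way these modular choices compound toward a "megalibrary" can be illustrated with a toy enumeration. The metals, placements, and independence assumption below are invented, so this is not the paper's actual tally of 65,000-plus particles, which depends on which reaction sequences yield chemically distinct products:

```python
# Toy count of how modular choices compound. The palette and placements
# are hypothetical; the real library size depends on which sequences
# produce chemically distinct nanoparticles.
from itertools import product

metals = ["Zn", "Cd", "Co", "Ni", "Mn"]          # hypothetical metal palette
placements = ["one end", "both ends", "middle"]  # where copper is replaced

one_step = list(product(metals, placements))
print(len(one_step))        # 15 choices per exchange step
print(len(one_step) ** 7)   # 15**7 = 170,859,375 raw 7-step sequences
```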

    “The real beauty of our method is its simplicity,” said Benjamin C. Steimle, a graduate student at Penn State and the first author of the paper. “It used to take months or years to make even one type of nanoparticle that contains several different materials. Two years ago we were really excited that we could make 47 different metal sulfide nanoparticles using an earlier version of this approach. Now that we’ve made some significant new advances and learned more about these systems, we can go way beyond what anyone has been able to do before. We are now able to produce nanoparticles with previously unimaginable complexity simply by controlling temperature and concentration, all using standard laboratory glassware and principles covered in an Introductory Chemistry course.”

    “The other really exciting aspect of this work is that it is rational and scalable,” said Schaak. “Because we understand how everything works, we can identify a highly complex nanoparticle, plan out a way to make it, and then go into the laboratory and actually make it quite easily. And, these particles can be made in quantities that are useful. In principle, we can now make what we want and as much as we want. There are still limitations, of course — we can’t wait until we are able to do this with even more types of materials — but even with what we have now, it changes how we think about what is possible to make.”

    In addition to Schaak and Steimle, the research team at Penn State included Julie L. Fenton. The research was funded by the U.S. National Science Foundation.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Penn State Campus

    About Penn State

    WHAT WE DO BEST

    We teach students that the real measure of success is what you do to improve the lives of others, and they learn to be hard-working leaders with a global perspective. We conduct research to improve lives. We add millions to the economy through projects in our state and beyond. We help communities by sharing our faculty expertise and research.

    Penn State lives close by no matter where you are. Our campuses are located from one side of Pennsylvania to the other. Through Penn State World Campus, students can take courses and work toward degrees online from anywhere on the globe that has Internet service.

    We support students in many ways, including advising and counseling services for school and life; diversity and inclusion services; social media sites; safety services; and emergency assistance.

    Our network of more than half a million alumni is available to students seeking advice, job networking and mentoring opportunities, and a sense of what to expect after graduation. Through our alumni, Penn State lives all over the world.

    The best part of Penn State is our people. Our students, faculty, staff, alumni, and friends in communities near our campuses and across the globe are dedicated to education and fostering a diverse and inclusive environment.

     
  • richardmitnick 5:24 pm on January 23, 2020 Permalink | Reply
    Tags: "Large Amounts of Oxygen Detected in Ancient Star’s Atmosphere", , , Basic Research, , Halo stars-roughly spherical distribution around the Milky Way, , Old star J0815+4729, ,   

    From UC San Diego: “Large Amounts of Oxygen Detected in Ancient Star’s Atmosphere” 

    From UC San Diego

    Cynthia Dillon, 858-822-0142, cdillon@ucsd.edu


    This animation illustrates the earliest epoch of our universe, just after the Big Bang, when the first elements of hydrogen, helium and lithium were created in the still hot cosmos. These atoms eventually collected to form the first generation of massive stars, which in turn produced heavier elements such as carbon, oxygen and nitrogen. As these massive stars exploded as supernovae, they released these heavier elements into the universe, eventually collecting on next generation stars such as J0815+4729. Video courtesy of Gabriel Pérez, SMM (IAC); IACVideos, YouTube

    An international team of astronomers from the University of California San Diego, the Instituto de Astrofísica de Canarias (IAC) and the University of Cambridge has detected large amounts of oxygen in the atmosphere of one of the oldest and most elementally depleted stars known—a primitive star scientists call “J0815+4729.” This new finding, reported in The Astrophysical Journal Letters, provides an important clue about how oxygen and other important elements were produced in the universe’s first generations of stars.

    1
    Artistic image of the supernova explosions of the first massive stars that formed in the Milky Way. The star J0815+4729 was formed from the material ejected by these first supernovae. Image courtesy of Gabriel Pérez, SMM (IAC).

    After hydrogen and helium, oxygen is the third most abundant element in the universe and important to all life forms on Earth. It serves as a chemical basis of respiration and a building block of carbohydrates, as well as the main element in the Earth’s crust. Absent from the early universe, it emerged through nuclear fusion reactions that occurred deep inside the most massive stars—stars roughly 10 times or more massive than the sun.

    To trace this early production of oxygen and other elements, astronomers study the oldest existing stars. J0815+4729 is one of them. It was first discovered by the IAC team in 2017 using the Gran Telescopio Canarias on La Palma, in the Canaries, Spain.

    Gran Telescopio Canarias at the Roque de los Muchachos Observatory on the island of La Palma, in the Canaries, Spain, sited on a volcanic peak 2,267 metres (7,438 ft) above sea level

    It resides over 5,000 light years away toward the constellation Lynx.

    “Stars like J0815+4729 are referred to as halo stars,” explained UC San Diego Professor of Physics Adam Burgasser, a co-author of the study. “This is due to their roughly spherical distribution around the Milky Way, as opposed to the more familiar flat disk of younger stars that include the sun.”

    Halo stars like J0815+4729 are truly ancient stars, allowing astronomers a peek into the universe’s early history of element production. The research team observed J0815+4729 with the W. M. Keck Observatory’s Keck I 10-meter telescope on Mauna Kea, Hawaii, using a high resolution spectrograph called HIRES.

    Keck High-Resolution Echelle Spectrometer (HIRES) at the Keck I telescope, Keck Observatory, Maunakea, Hawaii, USA, 4,207 m (13,802 ft) above sea level

    Keck Observatory, operated by Caltech and the University of California, Maunakea, Hawaii, USA, 4,207 m (13,802 ft) above sea level

    The data, which required more than five hours of staring at the star over a single night, were used to measure the abundances of 16 chemical species in the star’s atmosphere, including oxygen.

    “The primitive composition of the star indicates that it was formed during the first hundreds of millions of years after the Big Bang, possibly from the material expelled from the first supernovae of the Milky Way,” said Jonay González Hernández, an IAC Ramón y Cajal postdoctoral researcher and lead author of the study.

    The chemical composition of the star was found to be very unusual. While it has relatively large amounts of carbon, nitrogen and oxygen, approximately 10, 8 and 3 percent of the abundances measured in the sun, other elements like calcium and iron have abundances around one millionth that of the sun.
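
    Astronomers usually express such ratios in logarithmic "bracket" notation, [X/H], the base-10 log of the star's abundance of element X relative to the sun's. A minimal sketch using the article's approximate percentages:

```python
# Convert the quoted abundance ratios to [X/H] = log10(star / sun).
# The percentages are the article's approximate figures, not the
# paper's precise measurements.
from math import log10

relative_to_sun = {"C": 0.10, "N": 0.08, "O": 0.03, "Fe": 1e-6}

for element, ratio in relative_to_sun.items():
    print(f"[{element}/H] = {log10(ratio):+.2f}")
# [Fe/H] = -6.00 marks an extremely metal-poor star, and
# [C/H] - [Fe/H] = +5 is the huge carbon-to-iron excess the authors note.
```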

    “Only a few such stars are known in the halo of our galaxy, but none have such an enormous amount of carbon, nitrogen and oxygen compared to their iron content,” said David Aguado, a postdoctoral researcher at the University of Cambridge and co-author of the study.

    The search for stars of this type involves dedicated projects that sift through hundreds of thousands of stellar spectra to uncover a few rare sources like J0815+4729 and follow-up observation to measure their chemical composition. This star was first discovered in data obtained with the Sloan Digital Sky Survey (SDSS).

    SDSS Telescope at Apache Point Observatory, near Sunspot, NM, USA, Altitude 2,788 meters (9,147 ft)

    According to Rafael Rebolo, IAC director and co-author of the paper, the institute began studying the presence of oxygen in the oldest stars of the galaxy 30 years ago, with results indicating that this element was produced enormously in the first generations of supernovae.

    “However, we could not imagine that we would find a case of enrichment as spectacular as that of this star,” Rebolo noted.

    The researchers acknowledge Heather Hershley and Sherry Yeh at Keck Observatory for their assistance with the observations; financial support from the Spanish Ministry of Science, Innovation and Universities (MICIU) under the 2013 Ramón y Cajal program (RYC-2013-14875); the Spanish Ministry project MICIU (AYA2017-86389-P); and the Leverhulme Trust.

    UC San Diego’s Department of Physics in the Division of Physical Sciences offers one of the top graduate programs in the U.S. Many of its faculty are active at the Center for Astrophysics and Space Sciences (CASS), an interdisciplinary research unit for research and graduate study in astronomy, astrophysics and space sciences. Areas of specialization include high-energy astrophysics, optical and ultraviolet astronomy, infrared astronomy, radio astronomy, theoretical astrophysics, cosmology, solar physics, space plasma physics, interferometry and astronomical instrumentation.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The University of California, San Diego (also referred to as UC San Diego or UCSD) is a public research university located in the La Jolla area of San Diego, California, in the United States. The university occupies 2,141 acres (866 ha) near the coast of the Pacific Ocean, with the main campus resting on approximately 1,152 acres (466 ha). Established in 1960 near the pre-existing Scripps Institution of Oceanography, UC San Diego is the seventh oldest of the 10 University of California campuses and offers over 200 undergraduate and graduate degree programs, enrolling about 22,700 undergraduate and 6,300 graduate students. UC San Diego is one of America’s Public Ivy universities, which recognizes top public research universities in the United States. UC San Diego was ranked 8th among public universities and 37th among all universities in the United States, and rated the 18th Top World University, by U.S. News & World Report’s 2015 rankings.

     
  • richardmitnick 3:11 pm on January 23, 2020 Permalink | Reply
    Tags: , , Basic Research, , , , , New Mexico Exoplanet Spectroscopic Survey Instrument or NESSI on the Caltech Palomar 200 inch Hale Telescope located in San Diego County California USA   

    From NASA JPL-Caltech: “NESSI Emerges as New Tool for Exoplanet Atmospheres” 

    NASA JPL Banner

    From NASA JPL-Caltech

    January 23, 2020
    Calla Cofield
    Jet Propulsion Laboratory, Pasadena, Calif.
    626-808-2469
    calla.e.cofield@jpl.nasa.gov

    Written by Elizabeth Landau

    Caltech Palomar 200 inch Hale Telescope, Altitude 1,713 m (5,620 ft), located in San Diego County, California, United States

    The infrared instrument at Palomar Observatory’s Hale Telescope holds the promise of deepening our understanding of planets beyond our Sun.

    The darkness surrounding the Hale Telescope breaks with a sliver of blue sky as the dome begins to open, screeching with metallic, sci-fi-like sounds atop San Diego County’s Palomar Mountain. The historic observatory smells of the oil pumped in to support the bearings that make this giant telescope float ever so slightly as it moves to track the stars.

    Since February 2018, scientists have been testing an instrument at the Hale Telescope called the New Mexico Exoplanet Spectroscopic Survey Instrument, or NESSI.

    New Mexico Exoplanet Spectroscopic Survey Instrument, or NESSI, on the Caltech Palomar 200 inch Hale Telescope, located in San Diego County, California, United States

    A collaboration between NASA’s Jet Propulsion Laboratory in Pasadena, California, and the New Mexico Institute of Mining and Technology, NESSI was built to examine the atmospheres of planets that orbit stars beyond our Sun, or exoplanets, providing new insights into what these worlds are like.

    So far, NESSI has checked out two “hot Jupiters,” massive gas giants orbiting close to their stars and too scorching to sustain life. One, called HD 189733b, has such extreme temperatures and winds that it may rain glass sideways there. The other, WASP-33b, has a “sunscreen” layer of atmosphere, with molecules that absorb ultraviolet and visible light.

    Recently, NESSI observed these planets crossing their host stars, proving the instrument would be able to help confirm possible planets previously observed by other telescopes. Now it is ready for more detailed studies of distant cousins of our solar system. And while the instrument is designed to look at planets much larger than Earth, NESSI’s methods could be used to search for Earth-size planets someday as well once future technologies become available.

    “NESSI is a powerful tool to help us meet the family,” said Mark Swain, an astrophysicist and the JPL lead for NESSI. “Twenty-five years ago, to our best knowledge, we thought we were alone. Now we know that – at least in terms of planets – we’re not, and that this family is extensive and very diverse.”

    Why NESSI

    NESSI views the galaxy in infrared light, which is invisible to the human eye. It stares at individual stars to observe the dimming of light as a planet passes in front of its host star – an event called a transit. From the transit, astronomers can learn how big the planet is relative to its host star. When the planet passes directly behind the star and re-emerges, it’s called an eclipse. NESSI can look for signatures of molecules from the planet’s atmosphere detectable in starlight before and after the eclipse.
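
    The statement that astronomers "can learn how big the planet is relative to its host star" rests on a simple geometric relation: the fractional dimming during transit is approximately the square of the planet-to-star radius ratio. A minimal sketch with illustrative numbers:

```python
# Minimal sketch of the transit relation: the fractional dimming is
# roughly the square of the planet-to-star radius ratio. The radii
# below are illustrative, not NESSI measurements.
def transit_depth(r_planet, r_star):
    """Fraction of starlight blocked as the planet crosses the star."""
    return (r_planet / r_star) ** 2

# A hot Jupiter about 1/10 the radius of its star blocks ~1% of the light.
print(f"{transit_depth(1.0, 10.0):.1%}")   # 1.0%
```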

    Inside NESSI, devices that focus infrared light spread it into a rainbow, or spectrum, filtering it for particular wavelengths that relate to the atmospheric chemistry of distant planets.

    “We can pick out the parts of the spectrum where the molecules are, because that’s really what we’re looking for in the infrared in these exoplanets – molecular signatures of things like carbon dioxide and water and methane to tell us that there’s something interesting going on in that particular planet,” said Michelle Creech-Eakman, principal investigator for NESSI at New Mexico Tech.

    NESSI is equipped to follow up on discoveries from other observatories such as NASA’s Transiting Exoplanet Survey Satellite (TESS).

    NASA/MIT TESS replaced Kepler in search for exoplanets

    TESS scans the entire sky in visible light for planets around bright, nearby stars, but the planet candidates it discovers must be confirmed through other methods, to make sure the signals TESS detects actually come from planet transits and not other sources.

    Planet transit. NASA/Ames

    NESSI can also help bridge the science between TESS and NASA’s James Webb Space Telescope, scheduled to launch in 2021.

    NASA/ESA/CSA Webb Telescope annotated

    The largest, most complex space observatory ever to fly, Webb will study individual planets to learn about their atmospheres and whether they contain molecules associated with habitability. But since Webb’s time will be precious, scientists want to point it only at the most interesting and accessible targets. For example, if NESSI sees no molecular signatures around a planet, that implies clouds are blocking its atmosphere, making it unlikely to be a good target for Webb.

    “This helps us see if a planet is clear or cloudy or hazy,” said Rob Zellem, an astrophysicist and the JPL commissioning lead on NESSI. “And if it’s clear, we’ll see the molecules. And if then we see the molecules, they’ll say, ‘Hey, it’s a great target to look at with James Webb or Hubble or anything else.'”

    NASA/ESA Hubble Telescope

    A Window to the Galaxy

    NESSI began as a concept in 2008 when Swain visited Creech-Eakman’s astrobiology class at New Mexico Tech. Over coffee, Swain told his colleague about exoplanet observations he had done with a ground-based telescope that didn’t turn out well. Creech-Eakman realized a different instrument combined with the right telescope could accomplish Swain’s goals. On a napkin, the two sketched an idea for what would become NESSI.

    They designed the instrument for the Magdalena Ridge Observatory in Magdalena, New Mexico.

    3
    2.4-meter Telescope at Magdalena Ridge, Magdalena, New Mexico, New Mexico Institute of Mining and Technology, Socorro County, New Mexico, USA, Altitude 3,230 m (10,600 ft)

    But once the researchers began using it in April 2014, the instrument didn’t work as expected.

    Swain suggested moving NESSI to Palomar’s 200-inch Hale Telescope, which is much larger and more powerful – and also more accessible for the team. Owned and operated by Caltech, which manages JPL for NASA, Palomar has designated observing nights for researchers from JPL.

    Relocating NESSI – a 5-foot-tall (1.5-meter-tall) blue, cylindrical device with wires coming out of it – wasn’t just a matter of placing it on a truck and driving southwest. The electrical and optical systems needed to be reworked for its new host and then tested again. NESSI also needed a way to communicate with a different telescope, so University of Arizona doctoral student Kyle Pearson developed software to operate the instrument at Palomar. By early 2018, NESSI was ready to climb the mountain.

    A crane lifted NESSI more than 100 feet (30 meters) to the top of the Hale Telescope on Feb. 1, 2018. Technicians installed the instrument in a “cage” at the Hale’s prime focus, which enables all of the light from the 530-ton telescope to be funneled into NESSI’s detectors.

    The team celebrated NESSI’s glimpse of its first star on Feb. 2, 2018, but between limited telescope time and fickle weather, more than a year of testing and troubleshooting would pass (never mind the time the decades-old lift got stuck as Zellem and Swain ascended to the telescope cage).

    “We track down the problems and we fix them. That’s the name of the game,” Creech-Eakman said.

    As the team continued making adjustments in 2019, Swain tapped a local high school student to design a baffle – a cylindrical device to help direct more light to NESSI’s sensors. This piece was then 3D-printed in JPL’s machine shop.

    When NESSI finally detected transiting planets on Sept. 11, 2019, the team didn’t pause to pop open champagne. Researchers are now working out the measurements of HD 189733b’s atmosphere. The team has also compiled a list of exoplanets they want to go after next.

    “It’s really rewarding, finally, to see all of our hard work is paying off and that we’re getting NESSI to work,” Zellem said. “It’s been a long journey, and it’s really gratifying to see this happen, especially in real time.”

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    NASA JPL Campus

    Jet Propulsion Laboratory (JPL) is a federally funded research and development center and NASA field center located in the San Gabriel Valley area of Los Angeles County, California, United States. Although the facility has a Pasadena postal address, it is actually headquartered in the city of La Cañada Flintridge, on the northwest border of Pasadena. JPL is managed by the nearby California Institute of Technology (Caltech) for the National Aeronautics and Space Administration. The Laboratory’s primary function is the construction and operation of robotic planetary spacecraft, though it also conducts Earth-orbit and astronomy missions. It is also responsible for operating NASA’s Deep Space Network.

    Caltech Logo

    NASA image

     
  • richardmitnick 2:25 pm on January 23, 2020 Permalink | Reply
    Tags: , , Basic Research, , , , , NESSI observed its first exoplanet signatures on Sept. 11 2019 proving its readiness for further studies., New Mexico Exoplanet Spectroscopic Survey Instrument or NESSI   

    From NASA JPL-Caltech: “Up All Night: NESSI Comes to Life at Palomar Observatory” 

    NASA JPL Banner

    From NASA JPL-Caltech

    January 23, 2020

    Calla Cofield
    Jet Propulsion Laboratory, Pasadena, Calif.
    626-808-2469
    calla.e.cofield@jpl.nasa.gov

    Elizabeth Landau
    Headquarters, Washington
    818-359-3241
    elandau@jpl.nasa.gov


    Caltech Palomar 200 inch Hale Telescope, located in San Diego County, California, US, at 1,712 m (5,617 ft)

    On Feb. 2, 2018, a handful of researchers began testing an instrument called the New Mexico Exoplanet Spectroscopic Survey Instrument, or NESSI, at the historic 200-inch Hale Telescope at Palomar Observatory in Southern California. NESSI is designed to look at the atmospheres of exoplanets, or planets beyond our solar system.

    New Mexico Exoplanet Spectroscopic Survey Instrument or NESSI

    Here’s what that first night of testing was like:

    4:00 p.m. The NESSI team united for an early dinner at the dormitory called “the monastery” before driving to the telescope. Principal investigator Michelle Creech-Eakman, who grew up under the clear skies of North Dakota, has spent hundreds of nights at Palomar, so she’s familiar with the overnight-astronomy lifestyle. Working there as a Caltech postdoctoral researcher, she once accidentally scared a herd of cows in her quest to tame the mysteries of stars and planets.

    5:20 p.m. Sunset. The telescope dome, with proportions similar to Rome’s Pantheon, opened, synchronized with the theme from “2001: A Space Odyssey,” which Rob Zellem, an astrophysicist at NASA’s Jet Propulsion Laboratory, jokingly played on his phone. Afterward, he climbed up to the outdoor catwalk to admire the fiery sky for a few minutes.

    5:36 p.m. The team convened in the observing room, adjacent to the dome, taking images called “sky flats” to calibrate NESSI using the light of the sky itself. This is so the team can understand how each pixel of NESSI’s detector responds to incoming light. If astronomers spot inconsistencies from pixel to pixel, they can adjust for them and subtract out “noise” when making real observations.
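Flat-field calibration of the sort described here follows a standard recipe: combine the sky-flat exposures into a normalized map of each pixel's relative sensitivity, then divide science frames by that map. A minimal numpy sketch (the frame sizes and counts are invented for illustration, not NESSI's actual detector format):

```python
import numpy as np

def master_flat(sky_flats):
    """Combine sky-flat exposures into a map of each pixel's relative
    sensitivity: median-combine to reject outliers, then normalize so
    the map has a mean of 1."""
    combined = np.median(sky_flats, axis=0)
    return combined / combined.mean()

def flat_correct(science_frame, flat):
    """Divide out the pixel-to-pixel response so uniform illumination
    yields a uniform image."""
    return science_frame / flat

rng = np.random.default_rng(1)
response = 1.0 + rng.normal(0, 0.05, (64, 64))         # pixel sensitivities
flats = np.stack([1000 * response for _ in range(5)])  # repeated sky flats
frame = 500 * response                                 # uniform sky, raw
corrected = flat_correct(frame, master_flat(flats))
print(round(corrected.std() / corrected.mean(), 6))    # ~0: now uniform
```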

    Around 5:49 p.m. NESSI’s detectors were exposed to the sky at Palomar for the first time. To the untrained eye it looked like black-and-white static with lines through it on an old TV.

    Around 6:10 p.m. NESSI saw its first star, Alpha Persei. A round of applause resounded in the observation room. Zellem’s excitement was palpable. “It’s one thing to see it in a lab; it’s another to see a real star,” he said.

    But the team was just getting started. NESSI’s many components needed to be calibrated and examined – so many that Creech-Eakman didn’t expect to get actual data from a star tonight. Zellem opened a bag of turkey jerky for the long night ahead.

    NESSI at first delivered a strange pattern of pixels on Zellem’s computer screen. The researchers examined a star called Eta Aurigae to compare its appearance to Alpha Persei in NESSI’s field of view and tried to figure out whether the changes in brightness were due to NESSI’s detector or to the thin clouds rolling in.

    8:50 p.m. The team got an error message when they tried to get a stellar spectrum, the array of lines corresponding to different wavelengths of light a star produces. When they took the image again, it worked – but not as expected. With clouds coming in and out of view, getting a clear image would prove difficult.
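Getting that spectrum out of a raw frame is, at its simplest, a matter of summing a few detector rows around the star's trace, with columns standing in for wavelength. A toy sketch (the geometry and numbers are invented for illustration, not NESSI's actual extraction code):

```python
import numpy as np

def extract_spectrum(frame, row, half_width=3):
    """Collapse a 2D detector frame into a 1D spectrum by summing a few
    rows around the star's trace; detector columns map to wavelength."""
    lo, hi = row - half_width, row + half_width + 1
    return frame[lo:hi, :].sum(axis=0)

# Toy frame: 32 spatial rows x 100 wavelength columns, trace on rows
# 15-17, with an absorption line at column 40.
frame = np.zeros((32, 100))
frame[15:18, :] = 100.0
frame[15:18, 40] = 60.0
spec = extract_spectrum(frame, 16)
print(spec[40] < spec[39])  # the absorption line shows up as a dip
```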

    The troubleshooting continued through the next hour. “I think we’re missing something fundamental,” Creech-Eakman said.

    Just before 11 p.m. Creech-Eakman and Zellem decided on a new target: a star called Capella. It’s here they realized that the star needed to be in a different part of NESSI’s field of view. With a 10-second exposure, they were at last able to see part of a spectrum. And as they adjusted the positioning of the star with respect to NESSI, the full spectrum came into view. The team exploded in applause.

    Around 2 a.m. Because of clouds, they stopped and ceded the rest of the time to another group of astronomers. By then, the NESSI team had noted a variety of unexpected behavior from the instrument that they would need to investigate in the light of day.

    As with all new technologies, NESSI presented its researchers with challenges that had no immediate solutions; there was no manual to follow or help line to call. But the evening was a tremendous success in taking stock of NESSI’s components and functions. After an additional year-and-a-half of tweaking, testing and observing, NESSI observed its first exoplanet signatures on Sept. 11, 2019, proving its readiness for further studies.

    Between the picturesque mountaintop setting and the engineering marvel of the “Big Eye” Hale Telescope itself, Creech-Eakman doesn’t mind making more trips to Palomar Observatory. It’s been a special place for her since her Caltech days, when she worked there on someone else’s experiment.

    “My father had a small telescope that he had built, and I got to use that when I was little. He had made the mirrors himself – all of it,” she said. “To bring my own instrument to a place like this is – I really don’t have words.”

    See the full article here.



     