Tagged: Astrophysics

  • richardmitnick 9:16 pm on October 14, 2021
    Tags: "Rocky exoplanets and their host stars may have similar composition", , Astrophysics, , ,   

    From IAC Institute of Astrophysics of the Canary Islands [Instituto de Astrofísica de Canarias] (ES) : “Rocky exoplanets and their host stars may have similar composition” 


    14/10/2021

    Garik Israelian
    gil@iac.es

    Illustration of the formation of a planet round a star similar to the Sun, with rocks and iron molecules in the foreground. Credit: Tania Cunha (Harbor Planetarium [Planetário do Porto](PT) – Centro Ciência Viva & Instituto de Astrofísica e Ciências do Espaço).

    Newly formed stars are surrounded by protoplanetary discs. A fraction of the material in the disc condenses into planet-forming chunks, and the rest eventually falls into the star. Because of their common origin, researchers have assumed that the composition of these chunks and that of low-mass rocky planets should be similar to that of their host stars. However, until now the Solar System was the only reference available to astronomers.

    In a new research article, published today in the journal Science, an international team of astronomers led by the researcher Vardan Adibekyan, of the Instituto de Astrofísica e Ciências do Espaço (IA), with participation by the Instituto de Astrofísica de Canarias (IAC), has established for the first time a correlation between the composition of rocky exoplanets and that of their host stars. The study also shows that this relation does not correspond exactly to the one previously assumed.

    “The team found that the composition of rocky planets is closely related to the composition of their host stars, which could help us to identify planets which may be similar to ours”, explains Vardan Adibekyan, the first author on the paper. “In addition, the iron content of these planets is higher than that predicted from the composition of the protoplanetary discs from which they formed, which is due to the specific characteristics of planet formation processes and the chemistry of the discs. Our work supports models of planet formation with an unprecedented level of certainty and detail”, he added.

    For Garik Israelian, an IAC researcher and co-author of the article, this result could not have been imagined in the year 2000. “At that time we tried to find a correlation between the chemical composition of certain solar-type stars and the presence of planets orbiting them (or their orbital characteristics). It was hard to believe that twenty years later these studies would grow to include the metal abundances of planets similar to the Earth”, he emphasises.

    “For us this would have seemed to be science fiction. Planets similar to the Earth were not yet known, and we concentrated only on the planets we could find, and on the parameters of their orbits around their host stars. And today, we are studying the chemical composition of the interiors and of the atmospheres of extrasolar planets. It is a great leap forward”, he added.

    To establish the relation, the team selected the twenty-one rocky planets that have been most accurately characterized, using their measured masses and radii to determine their densities and iron content. They also used high-resolution spectra from the latest generation of spectrographs at the major world observatories: at Mauna Kea (Hawaii), at La Silla and Paranal (Chile), and at the Roque de los Muchachos (Garafía, La Palma, Canary Islands), to determine the compositions of their host stars and of the components most critical for rock formation in the protoplanetary discs.
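
    As an illustration of the first step in that characterization, a planet's bulk density follows directly from its measured mass and radius. Below is a minimal sketch in Python; the example values (Earth itself and a Mercury-like planet) are illustrative and not taken from the study:

```python
import math

M_EARTH_KG = 5.972e24
R_EARTH_M = 6.371e6

def bulk_density_g_cm3(mass_earths, radius_earths):
    """Bulk density in g/cm^3 from a planet's mass and radius in Earth units."""
    m = mass_earths * M_EARTH_KG
    r = radius_earths * R_EARTH_M
    rho_si = m / ((4.0 / 3.0) * math.pi * r**3)  # kg/m^3
    return rho_si / 1000.0

print(bulk_density_g_cm3(1.0, 1.0))      # Earth: ~5.5 g/cm^3
print(bulk_density_g_cm3(0.055, 0.383))  # Mercury-like: ~5.4 g/cm^3
```

    Turning such a density into an iron fraction additionally requires an interior-structure model, which is where the comparison with the host-star abundances comes in.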

    “Understanding the link in the composition between the stars and their planets has been a basic aspect of research in our centre for over a decade. Using the best high-resolution spectrographs, such as HARPS and ESPRESSO at the European Southern Observatory (ESO), our team has collected spectra of the host stars of exoplanets for several years. These spectra were used to determine the stellar parameters and abundances of the host stars, and the results have been put together in the published catalogue SWEET-Cat”, explained Nuno Santos, a researcher at the IA and a co-author of the article.

    The team also found an intriguing result: the iron fractions of the super-Earths and the super-Mercuries differ, implying that these planets constitute distinct populations in terms of composition, with further implications for their formation. This finding will require further study, because simulations of planet formation that incorporate collisions cannot by themselves reproduce the high-density super-Mercuries. “Understanding the formation of the super-Mercuries will help us to understand the especially high density of Mercury”, Adibekyan assures us.

    This research was carried out in the framework of the project “Observational Tests of the Processes of Nucleosynthesis in the Universe” started in the year 2000 by the IAC researcher Garik Israelian; Michel Mayor, Nobel Laureate in Physics, 2019; and Nuno Santos, researcher at the IA.

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

    IAC Institute of Astrophysics of the Canary Islands [Instituto de Astrofísica de Canarias] (ES) operates two astronomical observatories in the Canary Islands:

    Roque de los Muchachos Observatory on La Palma
    Teide Observatory on Tenerife.

    The seeing statistics at the Roque de los Muchachos Observatory (ORM) make it the second-best location for optical and infrared astronomy in the Northern Hemisphere, after Mauna Kea Observatory in Hawaii (US).

    Maunakea Observatories Hawai’i (US) altitude 4,213 m (13,822 ft)

    The site also has some of the most extensive astronomical facilities in the Northern Hemisphere; its fleet of telescopes includes the 10.4 m Gran Telescopio Canarias, the world’s largest single-aperture optical telescope as of July 2009, the William Herschel Telescope (second largest in Europe), and the adaptive optics corrected Swedish 1-m Solar Telescope.

    Gran Telescopio Canarias [Instituto de Astrofísica de Canarias](ES), sited on a volcanic peak 2,267 metres (7,438 ft) above sea level.

    The observatory was established in 1985, after 15 years of international work and cooperation among several countries, with the Spanish island hosting telescopes from Britain, The Netherlands, Spain, and other countries. The island offered better seeing conditions than Herstmonceux, where the Royal Greenwich Observatory's telescopes, including the 98-inch aperture Isaac Newton Telescope (the largest reflector in Europe at that time), had previously been sited. When the Isaac Newton Telescope was moved to the island it was upgraded to 100 inches (2.54 metres), and many even larger telescopes from various nations have since been hosted there.

    Teide Observatory [Observatorio del Teide], IAU code 954, is an astronomical observatory on Mount Teide at 2,390 metres (7,840 ft), located on Tenerife, Spain. It has been operated by the Instituto de Astrofísica de Canarias since its inauguration in 1964. It became one of the first major international observatories, attracting telescopes from different countries around the world because of the good astronomical seeing conditions. Later the emphasis for optical telescopes shifted more towards Roque de los Muchachos Observatory on La Palma.

     
  • richardmitnick 10:35 am on October 14, 2021
    Tags: "Hubble Finds Evidence of Persistent Water Vapor in One Hemisphere of Europa", Astrophysics, , , ,   

    From Hubblesite (US)(EU) and NASA/ESA Hubble: “Hubble Finds Evidence of Persistent Water Vapor in One Hemisphere of Europa” 


    October 14, 2021

    MEDIA CONTACT:

    Ray Villard
    Space Telescope Science Institute (US), Baltimore, Maryland

    Bethany Downer
    ESA/Hubble.org

    SCIENCE CONTACT:

    Lorenz Roth
    KTH Royal Institute of Technology [Kungliga Tekniska högskolan](SE)

    Europa

    Summary

    Ice Sublimating Off the Surface Replenishes a Tenuous Envelope

    You would think that living half-a-billion miles from the Sun would be no place to call home. But planetary astronomers are very interested in exploring the moon Europa in search of life. Slightly smaller than Earth’s moon, Europa orbits monstrous Jupiter. Surface temperatures on the icy moon never rise above a frigid minus 260 degrees Fahrenheit, a temperature so cold that water ice is as hard as rock.

    Yet, beneath the solid ice crust there may be a global ocean with more water than found on Earth. And, where there is water, there could be life. Like a leaky garden hose, the ocean vents water vapor into space from geysers poking through cracks in the surface, as first photographed by the Hubble Space Telescope in 2013.

    The latest twist comes from archival Hubble observations, spanning 1999 to 2015, which show that water vapor is constantly being replenished throughout one hemisphere of the moon. That’s a bit mysterious. Even so, the resulting atmosphere is only one-billionth the surface pressure of Earth’s atmosphere.

    The water vapor wasn’t seen directly, but rather oxygen’s ultraviolet spectral fingerprint was measured by Hubble. Oxygen is one of the constituents of water. Unlike the geysers, this water vapor is not coming from Europa’s interior, but rather sunlight is causing the surface ice to sublimate. A similar water vapor atmosphere was recently found on the Jovian moon Ganymede.

    Europa is so exciting as a potential abode of life that it is a target of NASA’s Europa Clipper and the Jupiter Icy Moons Explorer (JUICE) of the European Space Agency, both planned for launch within a decade.


    _____________________________________________________________________________________

    NASA’s Hubble Space Telescope observations of Jupiter’s icy moon Europa have revealed the presence of persistent water vapor — but, mysteriously, only in one hemisphere.

    Europa harbors a vast ocean underneath its icy surface, which might offer conditions hospitable for life. This result advances astronomers’ understanding of the atmospheric structure of icy moons, and helps lay the groundwork for planned science missions to the Jovian system to, in part, explore whether an environment half-a-billion miles from the Sun could support life.

    Previous observations of water vapor on Europa have been associated with plumes erupting through the ice, as photographed by Hubble in 2013. They are analogous to geysers on Earth, but extend more than 60 miles high. They produce transient blobs of water vapor in the moon’s atmosphere, which is only one-billionth the surface pressure of Earth’s atmosphere.

    The new results, however, show similar amounts of water vapor spread over a larger area of Europa in Hubble observations spanning from 1999 to 2015. This suggests a long-term presence of a water vapor atmosphere only in Europa’s trailing hemisphere — that portion of the moon that is always opposite its direction of motion along its orbit. The cause of this asymmetry between the leading and trailing hemisphere is not fully understood.

    This discovery is gleaned from a new analysis of Hubble archival images and spectra, using a technique that recently resulted in the discovery of water vapor in the atmosphere of Jupiter’s moon Ganymede, by Lorenz Roth of the KTH Royal Institute of Technology, Space and Plasma Physics, Sweden.

    “The observation of water vapor on Ganymede, and on the trailing side of Europa, advances our understanding of the atmospheres of icy moons,” said Roth. “However, the detection of a stable water abundance on Europa is a bit more surprising than on Ganymede because Europa’s surface temperatures are lower than Ganymede’s.”

    Europa reflects more sunlight than Ganymede, keeping its surface about 60 degrees Fahrenheit cooler than Ganymede’s. The daytime high on Europa is a frigid minus 260 degrees Fahrenheit. Yet, even at the lower temperature, the new observations suggest water ice is sublimating — that is, transforming directly from solid to vapor without a liquid phase — off Europa’s surface, just like on Ganymede.
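
    Why the lower temperature makes the detection so surprising can be sketched with an empirical sublimation vapor-pressure fit for water ice; the Murphy & Koop (2005) formula and the rounded temperatures below are assumptions for illustration, not inputs used in the study:

```python
import math

def ice_vapor_pressure_pa(T):
    """Sublimation vapor pressure of water ice in Pa.
    Empirical fit (Murphy & Koop 2005), quoted as valid above ~110 K."""
    return math.exp(9.550426 - 5723.265 / T
                    + 3.53068 * math.log(T) - 0.00728332 * T)

# Approximate daytime highs: Europa ~111 K (minus 260 F), Ganymede ~144 K.
for name, T in [("Europa", 111.0), ("Ganymede", 144.0)]:
    print(f"{name}: {ice_vapor_pressure_pa(T):.1e} Pa")
```

    The Europa value comes out several orders of magnitude below Ganymede’s, which is why a stable sublimated water abundance on the colder moon is the bigger surprise.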

    To make this discovery, Roth delved into archival Hubble datasets, selecting ultraviolet observations of Europa from 1999, 2012, 2014 and 2015 while the moon was at various orbital positions. These observations were all taken with Hubble’s Space Telescope Imaging Spectrograph (STIS). The ultraviolet STIS observations allowed Roth to determine the abundance of oxygen — one of the constituents of water — in Europa’s atmosphere, and by interpreting the strength of emission at different wavelengths he was able to infer the presence of water vapor.

    This detection paves the way for in-depth studies of Europa by future probes including NASA’s Europa Clipper and the Jupiter Icy Moons Explorer (JUICE) mission from the European Space Agency (ESA). Understanding the formation and evolution of Jupiter and its moons also helps astronomers gain insights into Jupiter-like planets around other stars.

    These results have been published in the journal Geophysical Research Letters.

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

    The NASA/ESA Hubble Space Telescope is a space telescope that was launched into low Earth orbit in 1990 and remains in operation. It was not the first space telescope, but it is one of the largest and most versatile, renowned both as a vital research tool and as a public relations boon for astronomy. The Hubble telescope is named after astronomer Edwin Hubble and is one of NASA’s Great Observatories, along with the NASA Compton Gamma Ray Observatory, the Chandra X-ray Observatory, and the NASA Spitzer Infrared Space Telescope.



    Edwin Hubble at the Caltech Palomar Samuel Oschin 48 inch Telescope (US). Credit: Emilio Segre Visual Archives/AIP/SPL.

    Hubble features a 2.4-meter (7.9 ft) mirror, and its four main instruments observe in the ultraviolet, visible, and near-infrared regions of the electromagnetic spectrum. Hubble’s orbit outside the distortion of Earth’s atmosphere allows it to capture extremely high-resolution images with substantially lower background light than ground-based telescopes. It has recorded some of the most detailed visible light images, allowing a deep view into space. Many Hubble observations have led to breakthroughs in astrophysics, such as determining the rate of expansion of the universe.

    The Hubble telescope was built by the United States space agency National Aeronautics Space Agency(US) with contributions from the European Space Agency [Agence spatiale européenne](EU). The Space Telescope Science Institute (STScI) selects Hubble’s targets and processes the resulting data, while the NASA Goddard Space Flight Center(US) controls the spacecraft. Space telescopes were proposed as early as 1923. Hubble was funded in the 1970s with a proposed launch in 1983, but the project was beset by technical delays, budget problems, and the 1986 Challenger disaster. It was finally launched by Space Shuttle Discovery in 1990, but its main mirror had been ground incorrectly, resulting in spherical aberration that compromised the telescope’s capabilities. The optics were corrected to their intended quality by a servicing mission in 1993.

    Hubble is the only telescope designed to be maintained in space by astronauts. Five Space Shuttle missions have repaired, upgraded, and replaced systems on the telescope, including all five of the main instruments. The fifth mission was initially canceled on safety grounds following the Columbia disaster (2003), but NASA Administrator Michael D. Griffin approved the fifth servicing mission, which was completed in 2009. The telescope was still operating as of April 24, 2020, its 30th anniversary, and could last until 2030–2040. One successor to the Hubble telescope is the National Aeronautics Space Agency(USA)/European Space Agency [Agence spatiale européenne](EU)/Canadian Space Agency(CA) Webb Infrared Space Telescope, scheduled for launch in late 2021.

    Proposals and precursors

    In 1923, Hermann Oberth—considered a father of modern rocketry, along with Robert H. Goddard and Konstantin Tsiolkovsky—published Die Rakete zu den Planetenräumen (“The Rocket into Planetary Space“), which mentioned how a telescope could be propelled into Earth orbit by a rocket.

    The history of the Hubble Space Telescope can be traced back as far as 1946, to astronomer Lyman Spitzer’s paper entitled Astronomical advantages of an extraterrestrial observatory. In it, he discussed the two main advantages that a space-based observatory would have over ground-based telescopes. First, the angular resolution (the smallest separation at which objects can be clearly distinguished) would be limited only by diffraction, rather than by the turbulence in the atmosphere, which causes stars to twinkle, known to astronomers as seeing. At that time ground-based telescopes were limited to resolutions of 0.5–1.0 arcseconds, compared to a theoretical diffraction-limited resolution of about 0.05 arcsec for an optical telescope with a mirror 2.5 m (8.2 ft) in diameter. Second, a space-based telescope could observe infrared and ultraviolet light, which are strongly absorbed by the atmosphere.
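
    The quoted figure follows from the Rayleigh criterion, θ ≈ 1.22 λ/D. A quick check in Python (550 nm is an assumed representative visible wavelength, not a value from the text):

```python
import math

def rayleigh_limit_arcsec(wavelength_m, aperture_m):
    """Diffraction-limited resolution theta = 1.22 * lambda / D, in arcseconds."""
    theta_rad = 1.22 * wavelength_m / aperture_m
    return math.degrees(theta_rad) * 3600.0

print(f"{rayleigh_limit_arcsec(550e-9, 2.5):.3f} arcsec")  # ~0.055 arcsec
```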

    Spitzer devoted much of his career to pushing for the development of a space telescope. In 1962, a report by the U.S. National Academy of Sciences recommended development of a space telescope as part of the space program, and in 1965 Spitzer was appointed as head of a committee given the task of defining scientific objectives for a large space telescope.

    Space-based astronomy had begun on a very small scale following World War II, as scientists made use of developments that had taken place in rocket technology. The first ultraviolet spectrum of the Sun was obtained in 1946, and the National Aeronautics and Space Administration (US) launched the Orbiting Solar Observatory (OSO) to obtain UV, X-ray, and gamma-ray spectra in 1962.

    An orbiting solar telescope was launched in 1962 by the United Kingdom as part of the Ariel space program, and in 1966 NASA launched the first Orbiting Astronomical Observatory (OAO) mission. OAO-1’s battery failed after three days, terminating the mission. It was followed by OAO-2, which carried out ultraviolet observations of stars and galaxies from its launch in 1968 until 1972, well beyond its original planned lifetime of one year.

    The OSO and OAO missions demonstrated the important role space-based observations could play in astronomy. In 1968, NASA developed firm plans for a space-based reflecting telescope with a mirror 3 m (9.8 ft) in diameter, known provisionally as the Large Orbiting Telescope or Large Space Telescope (LST), with a launch slated for 1979. These plans emphasized the need for crewed maintenance missions to the telescope to ensure such a costly program had a lengthy working life, and the concurrent development of plans for the reusable Space Shuttle indicated that the technology to allow this was soon to become available.

    Quest for funding

    The continuing success of the OAO program encouraged increasingly strong consensus within the astronomical community that the LST should be a major goal. In 1970, NASA established two committees, one to plan the engineering side of the space telescope project, and the other to determine the scientific goals of the mission. Once these had been established, the next hurdle for NASA was to obtain funding for the instrument, which would be far more costly than any Earth-based telescope. The U.S. Congress questioned many aspects of the proposed budget for the telescope and forced cuts in the budget for the planning stages, which at the time consisted of very detailed studies of potential instruments and hardware for the telescope. In 1974, public spending cuts led to Congress deleting all funding for the telescope project.

    In response, a nationwide lobbying effort was coordinated among astronomers. Many astronomers met congressmen and senators in person, and large scale letter-writing campaigns were organized. The National Academy of Sciences published a report emphasizing the need for a space telescope, and eventually the Senate agreed to half the budget that had originally been approved by Congress.

    The funding issues led to something of a reduction in the scale of the project, with the proposed mirror diameter reduced from 3 m to 2.4 m, both to cut costs and to allow a more compact and effective configuration for the telescope hardware. A proposed precursor 1.5 m (4.9 ft) space telescope to test the systems to be used on the main satellite was dropped, and budgetary concerns also prompted collaboration with the European Space Agency. ESA agreed to provide funding and supply one of the first generation instruments for the telescope, as well as the solar cells that would power it, and staff to work on the telescope in the United States, in return for European astronomers being guaranteed at least 15% of the observing time on the telescope. Congress eventually approved funding of US$36 million for 1978, and the design of the LST began in earnest, aiming for a launch date of 1983. In 1983 the telescope was named after Edwin Hubble, who confirmed one of the greatest scientific discoveries of the 20th century, made by Georges Lemaître, that the universe is expanding.

    Construction and engineering

    Once the Space Telescope project had been given the go-ahead, work on the program was divided among many institutions. NASA Marshall Space Flight Center (MSFC) was given responsibility for the design, development, and construction of the telescope, while Goddard Space Flight Center was given overall control of the scientific instruments and ground-control center for the mission. MSFC commissioned the optics company Perkin-Elmer to design and build the Optical Telescope Assembly (OTA) and Fine Guidance Sensors for the space telescope. Lockheed was commissioned to construct and integrate the spacecraft in which the telescope would be housed.

    Optical Telescope Assembly

    Optically, the HST is a Cassegrain reflector of Ritchey–Chrétien design, as are most large professional telescopes. This design, with two hyperbolic mirrors, is known for good imaging performance over a wide field of view, with the disadvantage that the mirrors have shapes that are hard to fabricate and test. The mirror and optical systems of the telescope determine the final performance, and they were designed to exacting specifications. Optical telescopes typically have mirrors polished to an accuracy of about a tenth of the wavelength of visible light, but the Space Telescope was to be used for observations from the visible through the ultraviolet (shorter wavelengths) and was specified to be diffraction limited to take full advantage of the space environment. Therefore, its mirror needed to be polished to an accuracy of 10 nanometers, or about 1/65 of the wavelength of red light. On the long wavelength end, the OTA was not designed with optimum IR performance in mind—for example, the mirrors are kept at stable (and warm, about 15 °C) temperatures by heaters. This limits Hubble’s performance as an infrared telescope.
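
    Taking representative wavelengths (assumed here: 550 nm for visible light, 650 nm for red light), the two polishing tolerances work out to

$$\frac{\lambda_{\mathrm{vis}}}{10} \approx \frac{550\ \mathrm{nm}}{10} = 55\ \mathrm{nm} \quad \text{(typical telescope)}, \qquad \frac{\lambda_{\mathrm{red}}}{65} \approx \frac{650\ \mathrm{nm}}{65} = 10\ \mathrm{nm} \quad \text{(Hubble's specification)}.$$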

    Perkin-Elmer intended to use custom-built and extremely sophisticated computer-controlled polishing machines to grind the mirror to the required shape. However, in case their cutting-edge technology ran into difficulties, NASA demanded that PE sub-contract to Kodak to construct a back-up mirror using traditional mirror-polishing techniques. (The team of Kodak and Itek also bid on the original mirror polishing work. Their bid called for the two companies to double-check each other’s work, which would have almost certainly caught the polishing error that later caused such problems.) The Kodak mirror is now on permanent display at the National Air and Space Museum. An Itek mirror built as part of the effort is now used in the 2.4 m telescope at the Magdalena Ridge Observatory.

    Construction of the Perkin-Elmer mirror began in 1979, starting with a blank manufactured by Corning from their ultra-low expansion glass. To keep the mirror’s weight to a minimum it consisted of top and bottom plates, each one inch (25 mm) thick, sandwiching a honeycomb lattice. Perkin-Elmer simulated microgravity by supporting the mirror from the back with 130 rods that exerted varying amounts of force. This ensured the mirror’s final shape would be correct and to specification when finally deployed. Mirror polishing continued until May 1981. NASA reports at the time questioned Perkin-Elmer’s managerial structure, and the polishing began to slip behind schedule and over budget. To save money, NASA halted work on the back-up mirror and put the launch date of the telescope back to October 1984. The mirror was completed by the end of 1981; it was washed using 2,400 US gallons (9,100 L) of hot, deionized water and then received a reflective coating of 65 nm-thick aluminum and a protective coating of 25 nm-thick magnesium fluoride.

    Doubts continued to be expressed about Perkin-Elmer’s competence on a project of this importance, as their budget and timescale for producing the rest of the OTA continued to inflate. In response to a schedule described as “unsettled and changing daily”, NASA postponed the launch date of the telescope until April 1985. Perkin-Elmer’s schedules continued to slip at a rate of about one month per quarter, and at times delays reached one day for each day of work. NASA was forced to postpone the launch date until March and then September 1986. By this time, the total project budget had risen to US$1.175 billion.

    Spacecraft systems

    The spacecraft in which the telescope and instruments were to be housed was another major engineering challenge. It would have to withstand frequent passages from direct sunlight into the darkness of Earth’s shadow, which would cause major changes in temperature, while being stable enough to allow extremely accurate pointing of the telescope. A shroud of multi-layer insulation keeps the temperature within the telescope stable and surrounds a light aluminum shell in which the telescope and instruments sit. Within the shell, a graphite-epoxy frame keeps the working parts of the telescope firmly aligned. Because graphite composites are hygroscopic, there was a risk that water vapor absorbed by the truss while in Lockheed’s clean room would later be expressed in the vacuum of space, resulting in the telescope’s instruments being covered by ice. To reduce that risk, a nitrogen gas purge was performed before launching the telescope into space.

    While construction of the spacecraft in which the telescope and instruments would be housed proceeded somewhat more smoothly than the construction of the OTA, Lockheed still experienced some budget and schedule slippage, and by the summer of 1985, construction of the spacecraft was 30% over budget and three months behind schedule. An MSFC report said Lockheed tended to rely on NASA directions rather than take their own initiative in the construction.

    Computer systems and data processing

    The two initial, primary computers on the HST were the 1.25 MHz DF-224 system, built by Rockwell Autonetics, which contained three redundant CPUs, and two redundant NSSC-1 (NASA Standard Spacecraft Computer, Model 1) systems, developed by Westinghouse and GSFC using diode–transistor logic (DTL). A co-processor for the DF-224 was added during Servicing Mission 1 in 1993, which consisted of two redundant strings of an Intel-based 80386 processor with an 80387 math co-processor. The DF-224 and its 386 co-processor were replaced by a 25 MHz Intel-based 80486 processor system during Servicing Mission 3A in 1999. The new computer is 20 times faster, with six times more memory, than the DF-224 it replaced. It increases throughput by moving some computing tasks from the ground to the spacecraft and saves money by allowing the use of modern programming languages.

    Additionally, some of the science instruments and components had their own embedded microprocessor-based control systems. The MATs (Multiple Access Transponder) components, MAT-1 and MAT-2, utilize Hughes Aircraft CDP1802CD microprocessors. The Wide Field and Planetary Camera (WFPC) also utilized an RCA 1802 microprocessor (or possibly the older 1801 version). The WFPC-1 was replaced by the WFPC-2 during Servicing Mission 1 in 1993, which was then replaced by the Wide Field Camera 3 (WFC3) during Servicing Mission 4 in 2009.

    Initial instruments

    When launched, the HST carried five scientific instruments: the Wide Field and Planetary Camera (WF/PC), Goddard High Resolution Spectrograph (GHRS), High Speed Photometer (HSP), Faint Object Camera (FOC) and the Faint Object Spectrograph (FOS). WF/PC was a high-resolution imaging device primarily intended for optical observations. It was built by NASA JPL-Caltech(US), and incorporated a set of 48 filters isolating spectral lines of particular astrophysical interest. The instrument contained eight charge-coupled device (CCD) chips divided between two cameras, each using four CCDs. Each CCD has a resolution of 0.64 megapixels. The wide field camera (WFC) covered a large angular field at the expense of resolution, while the planetary camera (PC) took images at a longer effective focal length than the WF chips, giving it a greater magnification.

    The GHRS was a spectrograph designed to operate in the ultraviolet. It was built by the Goddard Space Flight Center and could achieve a spectral resolution of 90,000. Also optimized for ultraviolet observations were the FOC and FOS, which were capable of the highest spatial resolution of any instruments on Hubble. Rather than CCDs these three instruments used photon-counting digicons as their detectors. The FOC was constructed by ESA, while the University of California, San Diego(US), and Martin Marietta Corporation built the FOS.

    The final instrument was the HSP, designed and built at the University of Wisconsin–Madison(US). It was optimized for visible and ultraviolet light observations of variable stars and other astronomical objects varying in brightness. It could take up to 100,000 measurements per second with a photometric accuracy of about 2% or better.

    HST’s guidance system can also be used as a scientific instrument. Its three Fine Guidance Sensors (FGS) are primarily used to keep the telescope accurately pointed during an observation, but can also be used to carry out extremely accurate astrometry; measurements accurate to within 0.0003 arcseconds have been achieved.

    Ground support

    The Space Telescope Science Institute (STScI) is responsible for the scientific operation of the telescope and the delivery of data products to astronomers. STScI is operated by the Association of Universities for Research in Astronomy(US) (AURA) and is physically located in Baltimore, Maryland on the Homewood campus of Johns Hopkins University(US), one of the 39 U.S. universities and seven international affiliates that make up the AURA consortium. STScI was established in 1981 after something of a power struggle between NASA and the scientific community at large. NASA had wanted to keep this function in-house, but scientists wanted it to be based in an academic establishment. The Space Telescope European Coordinating Facility (ST-ECF), established at Garching bei München near Munich in 1984, provided similar support for European astronomers until 2011, when these activities were moved to the European Space Astronomy Centre.

    One rather complex task that falls to STScI is scheduling observations for the telescope. Hubble is in a low-Earth orbit to enable servicing missions, but this means most astronomical targets are occulted by the Earth for slightly less than half of each orbit. Observations cannot take place when the telescope passes through the South Atlantic Anomaly due to elevated radiation levels, and there are also sizable exclusion zones around the Sun (precluding observations of Mercury), Moon and Earth. The solar avoidance angle is about 50°, to keep sunlight from illuminating any part of the OTA. Earth and Moon avoidance keeps bright light out of the FGSs, and keeps scattered light from entering the instruments. If the FGSs are turned off, the Moon and Earth can be observed. Earth observations were used very early in the program to generate flat-fields for the WFPC1 instrument. There is a so-called continuous viewing zone (CVZ), at roughly 90° to the plane of Hubble’s orbit, in which targets are not occulted for long periods.
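
    The occultation geometry can be sketched in a few lines; the altitude is an assumed round number (Hubble’s orbit has decayed over the years, but has been roughly 540 km), and the calculation ignores the avoidance margin above the bright Earth limb:

```python
import math

R_EARTH_KM = 6371.0
HST_ALT_KM = 540.0  # assumed approximate altitude

# Angular radius of the Earth as seen from the telescope.
rho = math.asin(R_EARTH_KM / (R_EARTH_KM + HST_ALT_KM))

# A target lying in the orbital plane is blocked while the line of sight
# passes within rho of the Earth's centre, i.e. over an arc of 2*rho.
blocked = (2.0 * rho) / (2.0 * math.pi)
print(f"Earth's angular radius: {math.degrees(rho):.1f} deg")        # ~67 deg
print(f"Orbit fraction occulted (in-plane target): {blocked:.0%}")  # ~37%
```

    Adding the limb-avoidance margin pushes the unusable fraction toward the “slightly less than half” quoted above, while a target 90° from the orbital plane is never blocked, which is the continuous viewing zone.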

    Challenger disaster, delays, and eventual launch

    By January 1986, the planned launch date of October looked feasible, but the Challenger explosion brought the U.S. space program to a halt, grounding the Shuttle fleet and forcing the launch of Hubble to be postponed for several years. The telescope had to be kept in a clean room, powered up and purged with nitrogen, until a launch could be rescheduled. This costly situation (about US$6 million per month) pushed the overall costs of the project even higher. This delay did allow time for engineers to perform extensive tests, swap out a possibly failure-prone battery, and make other improvements. Furthermore, the ground software needed to control Hubble was not ready in 1986, and was barely ready by the 1990 launch.

    Eventually, following the resumption of shuttle flights in 1988, the launch of the telescope was scheduled for 1990. On April 24, 1990, Space Shuttle Discovery successfully launched it during the STS-31 mission.

    From its original total cost estimate of about US$400 million, the telescope cost about US$4.7 billion by the time of its launch. Hubble’s cumulative costs were estimated to be about US$10 billion in 2010, twenty years after launch.

    List of Hubble instruments

    Hubble accommodates five science instruments at a given time, plus the Fine Guidance Sensors, which are mainly used for aiming the telescope but are occasionally used for scientific astrometry measurements. Early instruments were replaced with more advanced ones during the Shuttle servicing missions. COSTAR was a corrective optics device rather than a science instrument, but occupied one of the five instrument bays.

    Since the final servicing mission in 2009, the four active instruments have been ACS, COS, STIS and WFC3. NICMOS is kept in hibernation, but may be revived if WFC3 were to fail in the future.

    Advanced Camera for Surveys (ACS; 2002–present)
    Cosmic Origins Spectrograph (COS; 2009–present)
    Corrective Optics Space Telescope Axial Replacement (COSTAR; 1993–2009)
    Faint Object Camera (FOC; 1990–2002)
    Faint Object Spectrograph (FOS; 1990–1997)
    Fine Guidance Sensor (FGS; 1990–present)
    Goddard High Resolution Spectrograph (GHRS/HRS; 1990–1997)
    High Speed Photometer (HSP; 1990–1993)
    Near Infrared Camera and Multi-Object Spectrometer (NICMOS; 1997–present, hibernating since 2008)
    Space Telescope Imaging Spectrograph (STIS; 1997–present (non-operative 2004–2009))
    Wide Field and Planetary Camera (WFPC; 1990–1993)
    Wide Field and Planetary Camera 2 (WFPC2; 1993–2009)
    Wide Field Camera 3 (WFC3; 2009–present)

    Of the former instruments, three (COSTAR, FOS and WFPC2) are displayed in the Smithsonian National Air and Space Museum. The FOC is in the Dornier museum, Germany. The HSP is in the Space Place at the University of Wisconsin–Madison. The first WFPC was dismantled, and some components were then re-used in WFC3.

    Flawed mirror

    Within weeks of the launch of the telescope, the returned images indicated a serious problem with the optical system. Although the first images appeared to be sharper than those of ground-based telescopes, Hubble failed to achieve a final sharp focus and the best image quality obtained was drastically lower than expected. Images of point sources spread out over a radius of more than one arcsecond, instead of having a point spread function (PSF) concentrated within a circle 0.1 arcseconds (485 nrad) in diameter, as had been specified in the design criteria.

    Analysis of the flawed images revealed that the primary mirror had been polished to the wrong shape. Although it was believed to be one of the most precisely figured optical mirrors ever made, smooth to about 10 nanometers, the outer perimeter was too flat by about 2200 nanometers (about 1⁄450 mm or 1⁄11000 inch). This difference was catastrophic, introducing severe spherical aberration, a flaw in which light reflecting off the edge of a mirror focuses on a different point from the light reflecting off its center.

    The effect of the mirror flaw on scientific observations depended on the particular observation—the core of the aberrated PSF was sharp enough to permit high-resolution observations of bright objects, and spectroscopy of point sources was affected only through a sensitivity loss. However, the loss of light to the large, out-of-focus halo severely reduced the usefulness of the telescope for faint objects or high-contrast imaging. This meant nearly all the cosmological programs were essentially impossible, since they required observation of exceptionally faint objects. This led politicians to question NASA’s competence, scientists to rue the cost which could have gone to more productive endeavors, and comedians to make jokes about NASA and the telescope − in the 1991 comedy The Naked Gun 2½: The Smell of Fear, in a scene where historical disasters are displayed, Hubble is pictured with RMS Titanic and LZ 129 Hindenburg. Nonetheless, during the first three years of the Hubble mission, before the optical corrections, the telescope still carried out a large number of productive observations of less demanding targets. The error was well characterized and stable, enabling astronomers to partially compensate for the defective mirror by using sophisticated image processing techniques such as deconvolution.

    Origin of the problem

    A commission headed by Lew Allen, director of the Jet Propulsion Laboratory, was established to determine how the error could have arisen. The Allen Commission found that a reflective null corrector, a testing device used to achieve a properly shaped non-spherical mirror, had been incorrectly assembled—one lens was out of position by 1.3 mm (0.051 in). During the initial grinding and polishing of the mirror, Perkin-Elmer analyzed its surface with two conventional refractive null correctors. However, for the final manufacturing step (figuring), they switched to the custom-built reflective null corrector, designed explicitly to meet very strict tolerances. The incorrect assembly of this device resulted in the mirror being ground very precisely but to the wrong shape. A few final tests, using the conventional null correctors, correctly reported spherical aberration. But these results were dismissed, thus missing the opportunity to catch the error, because the reflective null corrector was considered more accurate.

    The commission blamed the failings primarily on Perkin-Elmer. Relations between NASA and the optics company had been severely strained during the telescope construction, due to frequent schedule slippage and cost overruns. NASA found that Perkin-Elmer did not review or supervise the mirror construction adequately, did not assign its best optical scientists to the project (as it had for the prototype), and in particular did not involve the optical designers in the construction and verification of the mirror. While the commission heavily criticized Perkin-Elmer for these managerial failings, NASA was also criticized for not picking up on the quality control shortcomings, such as relying totally on test results from a single instrument.

    Design of a solution

    Many feared that Hubble would be abandoned. The design of the telescope had always incorporated servicing missions, and astronomers immediately began to seek potential solutions to the problem that could be applied at the first servicing mission, scheduled for 1993. While Kodak had ground a back-up mirror for Hubble, it would have been impossible to replace the mirror in orbit, and too expensive and time-consuming to bring the telescope back to Earth for a refit. Instead, the fact that the mirror had been ground so precisely to the wrong shape led to the design of new optical components with exactly the same error but in the opposite sense, to be added to the telescope at the servicing mission, effectively acting as “spectacles” to correct the spherical aberration.

    The first step was a precise characterization of the error in the main mirror. Working backwards from images of point sources, astronomers determined that the conic constant of the mirror as built was −1.01390±0.0002, instead of the intended −1.00230. The same number was also derived by analyzing the null corrector used by Perkin-Elmer to figure the mirror, as well as by analyzing interferograms obtained during ground testing of the mirror.
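
    Plugging the two conic constants into the standard sag equation for a conic surface reproduces the roughly 2,200-nanometer edge error quoted earlier. The vertex radius of curvature used below, about 11.04 m for an f/2.3 primary, is an assumed value for illustration:

```python
import math

def sag(r, R, K):
    """Depth of a conic surface at radial distance r, for vertex radius
    of curvature R and conic constant K."""
    return r**2 / (R * (1.0 + math.sqrt(1.0 - (1.0 + K) * r**2 / R**2)))

R_CURV = 11.04  # metres (assumed; f/2.3 for the 2.4 m aperture)
r_edge = 1.2    # metres, edge of the primary mirror

delta = sag(r_edge, R_CURV, -1.00230) - sag(r_edge, R_CURV, -1.01390)
print(f"Edge deviation: {delta * 1e9:.0f} nm")  # ~2200 nm too flat at the edge
```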

    Because of the way the HST’s instruments were designed, two different sets of correctors were required. The design of the Wide Field and Planetary Camera 2, already planned to replace the existing WF/PC, included relay mirrors to direct light onto the four separate charge-coupled device (CCD) chips making up its two cameras. An inverse error built into their surfaces could completely cancel the aberration of the primary. However, the other instruments lacked any intermediate surfaces that could be figured in this way, and so required an external correction device.

    The Corrective Optics Space Telescope Axial Replacement (COSTAR) system was designed to correct the spherical aberration for light focused at the FOC, FOS, and GHRS. It consists of two mirrors in the light path with one ground to correct the aberration. To fit the COSTAR system onto the telescope, one of the other instruments had to be removed, and astronomers selected the High Speed Photometer to be sacrificed. By 2002, all the original instruments requiring COSTAR had been replaced by instruments with their own corrective optics. COSTAR was removed and returned to Earth in 2009 where it is exhibited at the National Air and Space Museum. The area previously used by COSTAR is now occupied by the Cosmic Origins Spectrograph.

    Servicing missions and new instruments

    Servicing Mission 1

    The first Hubble servicing mission was scheduled for 1993, before the mirror problem was discovered. It assumed greater importance, as the astronauts would need to do extensive work to install corrective optics; failure would have resulted in either abandoning Hubble or accepting its permanent disability. Other components failed before the mission, causing the repair cost to rise to $500 million (not including the cost of the shuttle flight). A successful repair would, however, also help demonstrate the viability of building Space Station Alpha.

    STS-49 in 1992 demonstrated the difficulty of space work. While its rescue of Intelsat 603 received praise, the astronauts had taken possibly reckless risks in doing so. Neither the rescue nor the unrelated assembly of prototype space station components occurred as the astronauts had trained, causing NASA to reassess planning and training, including for the Hubble repair. The agency assigned Story Musgrave—who had worked on satellite repair procedures since 1976—and six other experienced astronauts to the mission, including two from STS-49. The first mission director since Project Apollo would coordinate a crew with 16 previous shuttle flights between them. The astronauts were trained to use about a hundred specialized tools.

    Heat had been a problem on prior spacewalks, which occurred in sunlight; Hubble needed to be repaired out of sunlight. Musgrave discovered during vacuum training, seven months before the mission, that spacesuit gloves did not sufficiently protect against the cold of space. After STS-57 confirmed the issue in orbit, NASA quickly changed equipment, procedures, and flight plan. Seven total mission simulations occurred before launch, the most thorough preparation in shuttle history. No complete Hubble mockup existed, so the astronauts studied many separate models (including one at the Smithsonian) and mentally combined their varying and contradictory details. Servicing Mission 1 flew aboard Endeavour in December 1993, and involved installation of several instruments and other equipment over ten days.

    Most importantly, the High Speed Photometer was replaced with the COSTAR corrective optics package, and WFPC was replaced with the Wide Field and Planetary Camera 2 (WFPC2) with an internal optical correction system. The solar arrays and their drive electronics were also replaced, as well as four gyroscopes in the telescope pointing system, two electrical control units and other electrical components, and two magnetometers. The onboard computers were upgraded with added coprocessors, and Hubble’s orbit was boosted.

    On January 13, 1994, NASA declared the mission a complete success and showed the first sharper images. The mission was one of the most complex performed up until that date, involving five long extra-vehicular activity periods. Its success was a boon for NASA, as well as for the astronomers who now had a more capable space telescope.

    Servicing Mission 2

    Servicing Mission 2, flown by Discovery in February 1997, replaced the GHRS and the FOS with the Space Telescope Imaging Spectrograph (STIS) and the Near Infrared Camera and Multi-Object Spectrometer (NICMOS), replaced an Engineering and Science Tape Recorder with a new Solid State Recorder, and repaired thermal insulation. NICMOS contained a heat sink of solid nitrogen to reduce the thermal noise from the instrument, but shortly after it was installed, an unexpected thermal expansion resulted in part of the heat sink coming into contact with an optical baffle. This led to an increased warming rate for the instrument and reduced its original expected lifetime of 4.5 years to about two years.

    Servicing Mission 3A

    Servicing Mission 3A, flown by Discovery, took place in December 1999, and was a split-off from Servicing Mission 3 after three of the six onboard gyroscopes had failed. The fourth failed a few weeks before the mission, rendering the telescope incapable of performing scientific observations. The mission replaced all six gyroscopes, replaced a Fine Guidance Sensor and the computer, installed a Voltage/temperature Improvement Kit (VIK) to prevent battery overcharging, and replaced thermal insulation blankets.

    Servicing Mission 3B

    Servicing Mission 3B flown by Columbia in March 2002 saw the installation of a new instrument, with the FOC (which, except for the Fine Guidance Sensors when used for astrometry, was the last of the original instruments) being replaced by the Advanced Camera for Surveys (ACS). This meant COSTAR was no longer required, since all new instruments had built-in correction for the main mirror aberration. The mission also revived NICMOS by installing a closed-cycle cooler and replaced the solar arrays for the second time, providing 30 percent more power.

    Servicing Mission 4

    Plans called for Hubble to be serviced in February 2005, but the Columbia disaster in 2003, in which the orbiter disintegrated on re-entry into the atmosphere, had wide-ranging effects on the Hubble program. NASA Administrator Sean O’Keefe decided all future shuttle missions had to be able to reach the safe haven of the International Space Station should in-flight problems develop. As no shuttles were capable of reaching both HST and the space station during the same mission, future crewed service missions were canceled. This decision was criticised by numerous astronomers who felt Hubble was valuable enough to merit the human risk. HST’s planned successor, the James Webb Space Telescope (JWST), as of 2004 was not expected to launch until at least 2011. A gap in space-observing capabilities between a decommissioning of Hubble and the commissioning of a successor was of major concern to many astronomers, given the significant scientific impact of HST. The consideration that JWST will not be located in low Earth orbit, and therefore cannot be easily upgraded or repaired in the event of an early failure, only made concerns more acute. On the other hand, many astronomers felt strongly that servicing Hubble should not take place if the expense were to come from the JWST budget.

    In January 2004, O’Keefe said he would review his decision to cancel the final servicing mission to HST, due to public outcry and requests from Congress for NASA to look for a way to save it. The National Academy of Sciences convened an official panel, which recommended in July 2004 that the HST should be preserved despite the apparent risks. Their report urged “NASA should take no actions that would preclude a space shuttle servicing mission to the Hubble Space Telescope”. In August 2004, O’Keefe asked Goddard Space Flight Center to prepare a detailed proposal for a robotic service mission. These plans were later canceled, the robotic mission being described as “not feasible”. In late 2004, several Congressional members, led by Senator Barbara Mikulski, held public hearings and carried on a fight with much public support (including thousands of letters from school children across the U.S.) to get the Bush Administration and NASA to reconsider the decision to drop plans for a Hubble rescue mission.

    The nomination in April 2005 of a new NASA Administrator, Michael D. Griffin, changed the situation, as Griffin stated he would consider a crewed servicing mission. Soon after his appointment Griffin authorized Goddard to proceed with preparations for a crewed Hubble maintenance flight, saying he would make the final decision after the next two shuttle missions. In October 2006 Griffin gave the final go-ahead, and the 11-day mission by Atlantis was scheduled for October 2008. Hubble’s main data-handling unit failed in September 2008, halting all reporting of scientific data until its back-up was brought online on October 25, 2008. Since a failure of the backup unit would leave the HST helpless, the service mission was postponed to incorporate a replacement for the primary unit.

    Servicing Mission 4 (SM4), flown by Atlantis in May 2009, was the last scheduled shuttle mission for HST. SM4 installed the replacement data-handling unit, repaired the ACS and STIS systems, installed improved nickel hydrogen batteries, and replaced other components including all six gyroscopes. SM4 also installed two new observation instruments—Wide Field Camera 3 (WFC3) and the Cosmic Origins Spectrograph (COS)—and the Soft Capture and Rendezvous System, which will enable the future rendezvous, capture, and safe disposal of Hubble by either a crewed or robotic mission. Except for the ACS’s High Resolution Channel, which could not be repaired and was disabled, the work accomplished during SM4 rendered the telescope fully functional.

    Major projects

    Cosmic Assembly Near-infrared Deep Extragalactic Legacy Survey [CANDELS]

    The survey “aims to explore galactic evolution in the early Universe, and the very first seeds of cosmic structure at less than one billion years after the Big Bang.” The CANDELS project site describes the survey’s goals as the following:

    The Cosmic Assembly Near-IR Deep Extragalactic Legacy Survey is designed to document the first third of galactic evolution from z = 8 to 1.5 via deep imaging of more than 250,000 galaxies with WFC3/IR and ACS. It will also find the first Type Ia SNe beyond z > 1.5 and establish their accuracy as standard candles for cosmology. Five premier multi-wavelength sky regions are selected; each has multi-wavelength data from Spitzer and other facilities, and has extensive spectroscopy of the brighter galaxies. The use of five widely separated fields mitigates cosmic variance and yields statistically robust and complete samples of galaxies down to 10⁹ solar masses out to z ~ 8.
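
    To put the quoted redshift range on a clock, here is a rough lookback-time integration for a flat ΛCDM cosmology; the parameter values are illustrative assumptions, not CANDELS inputs:

```python
import math

def lookback_gyr(z, H0=70.0, omega_m=0.3, omega_l=0.7, steps=10000):
    """Lookback time in Gyr for a flat Lambda-CDM universe,
    integrated numerically with the midpoint rule."""
    hubble_time_gyr = (3.086e19 / H0) / 3.156e16  # 1/H0 in Gyr
    dz = z / steps
    total = 0.0
    for i in range(steps):
        zp = (i + 0.5) * dz
        e = math.sqrt(omega_m * (1.0 + zp) ** 3 + omega_l)
        total += dz / ((1.0 + zp) * e)
    return hubble_time_gyr * total

print(f"{lookback_gyr(8.0):.1f} Gyr")  # ~12.8 Gyr: universe was ~0.6 Gyr old
print(f"{lookback_gyr(1.5):.1f} Gyr")  # ~9.3 Gyr: universe was ~4.2 Gyr old
```

    With a total age near 13.5 Gyr for these parameters, z = 8 to 1.5 indeed spans roughly the first third of cosmic history.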

    Frontier Fields program

    The program, officially named the Hubble Deep Fields Initiative 2012, aims to advance knowledge of early galaxy formation by studying high-redshift galaxies in blank fields with the help of gravitational lensing to see the “faintest galaxies in the distant universe”. The Frontier Fields web page describes the goals of the program as:

    To reveal hitherto inaccessible populations of z = 5–10 galaxies that are ten to fifty times fainter intrinsically than any presently known
    To solidify our understanding of the stellar masses and star formation histories of sub-L* galaxies at the earliest times
    To provide the first statistically meaningful morphological characterization of star forming galaxies at z > 5
    To find z > 8 galaxies stretched out enough by cluster lensing to discern internal structure and/or magnified enough by cluster lensing for spectroscopic follow-up.

    Cosmic Evolution Survey (COSMOS)

    The Cosmic Evolution Survey (COSMOS) is an astronomical survey designed to probe the formation and evolution of galaxies as a function of both cosmic time (redshift) and the local galaxy environment. The survey covers a two square degree equatorial field with spectroscopy and X-ray to radio imaging by most of the major space-based telescopes and a number of large ground based telescopes, making it a key focus region of extragalactic astrophysics. COSMOS was launched in 2006 as the largest project pursued by the Hubble Space Telescope at the time, and still is the largest continuous area of sky covered for the purposes of mapping deep space in blank fields, 2.5 times the area of the moon on the sky and 17 times larger than the largest of the CANDELS regions. The COSMOS scientific collaboration that was forged from the initial COSMOS survey is the largest and longest-running extragalactic collaboration, known for its collegiality and openness. The study of galaxies in their environment can be done only with large areas of the sky, larger than a half square degree. More than two million galaxies are detected, spanning 90% of the age of the Universe. The COSMOS collaboration is led by Caitlin Casey, Jeyhan Kartaltepe, and Vernesa Smolcic and involves more than 200 scientists in a dozen countries.

    Important discoveries

    Hubble has helped resolve some long-standing problems in astronomy, while also raising new questions. Some results have required new theories to explain them.

    Age of the universe

    Among its primary mission targets was measuring distances to Cepheid variable stars more accurately than ever before, and thus constraining the value of the Hubble constant, the measure of the rate at which the universe is expanding, which is also related to its age. Before the launch of HST, estimates of the Hubble constant typically had errors of up to 50%, but Hubble measurements of Cepheid variables in the Virgo Cluster and other distant galaxy clusters provided a measured value with an accuracy of ±10%, which is consistent with other more accurate measurements made since Hubble’s launch using other techniques. The estimated age is now about 13.7 billion years, but before the Hubble Telescope, scientists predicted an age ranging from 10 to 20 billion years.
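
    To first order the age scale is just the inverse of the Hubble constant, which is why the fractional error in H0 propagates directly into the age estimate (H0 = 70 km/s/Mpc below is an illustrative value):

```python
KM_PER_MPC = 3.086e19
SEC_PER_GYR = 3.156e16

H0 = 70.0  # km/s/Mpc (illustrative)
hubble_time_gyr = (KM_PER_MPC / H0) / SEC_PER_GYR
print(f"1/H0 = {hubble_time_gyr:.1f} Gyr")  # ~14 Gyr
```

    A ±10% measurement of H0 therefore pins this timescale down to ±10%, where the pre-Hubble 50% errors allowed anything from roughly 10 to 20 billion years.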

    Expansion of the universe

    While Hubble helped to refine estimates of the age of the universe, it also cast doubt on theories about its future. Astronomers from the High-z Supernova Search Team and the Supernova Cosmology Project used ground-based telescopes and HST to observe distant supernovae and uncovered evidence that, far from decelerating under the influence of gravity, the expansion of the universe may in fact be accelerating. Three members of these two groups have subsequently been awarded Nobel Prizes for their discovery.

    Saul Perlmutter [The Supernova Cosmology Project] shared the 2006 Shaw Prize in Astronomy, the 2011 Nobel Prize in Physics, and the 2015 Breakthrough Prize in Fundamental Physics with Brian P. Schmidt and Adam Riess [The High-z Supernova Search Team] for providing evidence that the expansion of the universe is accelerating.

The cause of this acceleration remains poorly understood; the most commonly proposed explanation is dark energy.

    Black holes

    The high-resolution spectra and images provided by the HST have been especially well-suited to establishing the prevalence of black holes in the center of nearby galaxies. While it had been hypothesized in the early 1960s that black holes would be found at the centers of some galaxies, and astronomers in the 1980s identified a number of good black hole candidates, work conducted with Hubble shows that black holes are probably common to the centers of all galaxies. The Hubble programs further established that the masses of the nuclear black holes and properties of the galaxies are closely related. The legacy of the Hubble programs on black holes in galaxies is thus to demonstrate a deep connection between galaxies and their central black holes.

    Extending visible wavelength images

A unique window on the Universe enabled by Hubble is the set of Hubble Deep Field, Hubble Ultra-Deep Field, and Hubble Extreme Deep Field images, which used Hubble’s unmatched sensitivity at visible wavelengths to create images of small patches of sky that are the deepest ever obtained at optical wavelengths. The images reveal galaxies billions of light-years away and have generated a wealth of scientific papers, providing a new window on the early Universe. The Wide Field Camera 3 improved the view of these fields in the infrared and ultraviolet, supporting the discovery of some of the most distant objects yet found, such as MACS0647-JD.

    The non-standard object SCP 06F6 was discovered by the Hubble Space Telescope in February 2006.

    On March 3, 2016, researchers using Hubble data announced the discovery of the farthest known galaxy to date: GN-z11. The Hubble observations occurred on February 11, 2015, and April 3, 2015, as part of the CANDELS/GOODS-North surveys.

    Solar System discoveries

    HST has also been used to study objects in the outer reaches of the Solar System, including the dwarf planets Pluto and Eris.

    The collision of Comet Shoemaker-Levy 9 with Jupiter in 1994 was fortuitously timed for astronomers, coming just a few months after Servicing Mission 1 had restored Hubble’s optical performance. Hubble images of the planet were sharper than any taken since the passage of Voyager 2 in 1979, and were crucial in studying the dynamics of the collision of a comet with Jupiter, an event believed to occur once every few centuries.

    During June and July 2012, U.S. astronomers using Hubble discovered Styx, a tiny fifth moon orbiting Pluto.

    In March 2015, researchers announced that measurements of aurorae around Ganymede, one of Jupiter’s moons, revealed that it has a subsurface ocean. Using Hubble to study the motion of its aurorae, the researchers determined that a large saltwater ocean was helping to suppress the interaction between Jupiter’s magnetic field and that of Ganymede. The ocean is estimated to be 100 km (60 mi) deep, trapped beneath a 150 km (90 mi) ice crust.

    From June to August 2015, Hubble was used to search for a Kuiper belt object (KBO) target for the New Horizons Kuiper Belt Extended Mission (KEM) when similar searches with ground telescopes failed to find a suitable target.

This resulted in the discovery of at least five new KBOs, including the eventual KEM target, 486958 Arrokoth, which New Horizons flew past at close range on January 1, 2019.

In August 2020, taking advantage of a total lunar eclipse, astronomers using NASA’s Hubble Space Telescope detected Earth’s own brand of sunscreen, ozone, in our atmosphere. This method simulates how astronomers and astrobiology researchers will search for evidence of life beyond Earth by observing potential “biosignatures” on exoplanets (planets around other stars).
    Hubble and ALMA image of MACS J1149.5+2223.

    Supernova reappearance

    On December 11, 2015, Hubble captured an image of the first-ever predicted reappearance of a supernova, dubbed “Refsdal”, which was calculated using different mass models of a galaxy cluster whose gravity is warping the supernova’s light. The supernova was previously seen in November 2014 behind galaxy cluster MACS J1149.5+2223 as part of Hubble’s Frontier Fields program. Astronomers spotted four separate images of the supernova in an arrangement known as an “Einstein Cross”.

    The light from the cluster has taken about five billion years to reach Earth, though the supernova exploded some 10 billion years ago. Based on early lens models, a fifth image was predicted to reappear by the end of 2015. The detection of Refsdal’s reappearance in December 2015 served as a unique opportunity for astronomers to test their models of how mass, especially dark matter, is distributed within this galaxy cluster.

    Impact on astronomy

    Many objective measures show the positive impact of Hubble data on astronomy. Over 15,000 papers based on Hubble data have been published in peer-reviewed journals, and countless more have appeared in conference proceedings. Looking at papers several years after their publication, about one-third of all astronomy papers have no citations, while only two percent of papers based on Hubble data have no citations. On average, a paper based on Hubble data receives about twice as many citations as papers based on non-Hubble data. Of the 200 papers published each year that receive the most citations, about 10% are based on Hubble data.

    Although the HST has clearly helped astronomical research, its financial cost has been large. A study on the relative astronomical benefits of different sizes of telescopes found that while papers based on HST data generate 15 times as many citations as a 4 m (13 ft) ground-based telescope such as the William Herschel Telescope, the HST costs about 100 times as much to build and maintain.

Deciding between building ground- versus space-based telescopes is complex. Even before Hubble was launched, specialized ground-based techniques such as aperture masking interferometry had obtained higher-resolution optical and infrared images than Hubble would achieve, though restricted to targets about 10⁸ times brighter than the faintest targets observed by Hubble. Since then, advances in “adaptive optics” have extended the high-resolution imaging capabilities of ground-based telescopes to the infrared imaging of faint objects.

    The usefulness of adaptive optics versus HST observations depends strongly on the particular details of the research questions being asked. In the visible bands, adaptive optics can correct only a relatively small field of view, whereas HST can conduct high-resolution optical imaging over a wide field. Only a small fraction of astronomical objects are accessible to high-resolution ground-based imaging; in contrast Hubble can perform high-resolution observations of any part of the night sky, and on objects that are extremely faint.

    Impact on aerospace engineering

In addition to its scientific results, Hubble has also made significant contributions to aerospace engineering, in particular the performance of systems in low Earth orbit. These insights result from Hubble’s long lifetime on orbit, extensive instrumentation, and return of assemblies to the Earth where they can be studied in detail. In particular, Hubble has contributed to studies of the behavior of graphite composite structures in vacuum, optical contamination from residual gas and human servicing, radiation damage to electronics and sensors, and the long-term behavior of multi-layer insulation. One lesson learned was that gyroscopes assembled using pressurized oxygen to deliver suspension fluid were prone to failure due to electric wire corrosion; gyroscopes are now assembled using pressurized nitrogen. Another is that optical surfaces in LEO can have surprisingly long lifetimes: Hubble was only expected to last 15 years before the mirror became unusable, but after 14 years there was no measurable degradation. Finally, Hubble servicing missions, particularly those that serviced components not designed for in-space maintenance, have contributed towards the development of new tools and techniques for on-orbit repair.

    Archives

    All Hubble data is eventually made available via the Mikulski Archive for Space Telescopes at STScI, CADC and ESA/ESAC. Data is usually proprietary—available only to the principal investigator (PI) and astronomers designated by the PI—for twelve months after being taken. The PI can apply to the director of the STScI to extend or reduce the proprietary period in some circumstances.

    Observations made on Director’s Discretionary Time are exempt from the proprietary period, and are released to the public immediately. Calibration data such as flat fields and dark frames are also publicly available straight away. All data in the archive is in the FITS format, which is suitable for astronomical analysis but not for public use. The Hubble Heritage Project processes and releases to the public a small selection of the most striking images in JPEG and TIFF formats.

    Outreach activities

    It has always been important for the Space Telescope to capture the public’s imagination, given the considerable contribution of taxpayers to its construction and operational costs. After the difficult early years when the faulty mirror severely dented Hubble’s reputation with the public, the first servicing mission allowed its rehabilitation as the corrected optics produced numerous remarkable images.

    Several initiatives have helped to keep the public informed about Hubble activities. In the United States, outreach efforts are coordinated by the Space Telescope Science Institute (STScI) Office for Public Outreach, which was established in 2000 to ensure that U.S. taxpayers saw the benefits of their investment in the space telescope program. To that end, STScI operates the HubbleSite.org website. The Hubble Heritage Project, operating out of the STScI, provides the public with high-quality images of the most interesting and striking objects observed. The Heritage team is composed of amateur and professional astronomers, as well as people with backgrounds outside astronomy, and emphasizes the aesthetic nature of Hubble images. The Heritage Project is granted a small amount of time to observe objects which, for scientific reasons, may not have images taken at enough wavelengths to construct a full-color image.

    Since 1999, the leading Hubble outreach group in Europe has been the Hubble European Space Agency Information Centre (HEIC). This office was established at the Space Telescope European Coordinating Facility in Munich, Germany. HEIC’s mission is to fulfill HST outreach and education tasks for the European Space Agency. The work is centered on the production of news and photo releases that highlight interesting Hubble results and images. These are often European in origin, and so increase awareness of both ESA’s Hubble share (15%) and the contribution of European scientists to the observatory. ESA produces educational material, including a videocast series called Hubblecast designed to share world-class scientific news with the public.

    The Hubble Space Telescope has won two Space Achievement Awards from the Space Foundation, for its outreach activities, in 2001 and 2010.

    A replica of the Hubble Space Telescope is on the courthouse lawn in Marshfield, Missouri, the hometown of namesake Edwin P. Hubble.

    Major Instrumentation

    Hubble WFPC2 no longer in service.

    Wide Field Camera 3 [WFC3]

    Advanced Camera for Surveys [ACS]

    Cosmic Origins Spectrograph [COS]

The Hubble Space Telescope is a project of international cooperation between NASA and the European Space Agency. NASA’s Goddard Space Flight Center manages the telescope. The Space Telescope Science Institute (STScI), a free-standing science center located on the campus of The Johns Hopkins University and operated by the Association of Universities for Research in Astronomy (AURA) for NASA, conducts Hubble science operations.


     
  • richardmitnick 9:43 pm on October 13, 2021 Permalink | Reply
    Tags: "Einstein’s Principle of Equivalence verified in quasars for the first time", , Astrophysics, , , ,   

    From IAC Institute of Astrophysics of the Canary Islands [Instituto de Astrofísica de Canarias] (ES) : “Einstein’s Principle of Equivalence verified in quasars for the first time” 

    Instituto de Astrofísica de Andalucía

    From IAC Institute of Astrophysics of the Canary Islands [Instituto de Astrofísica de Canarias] (ES)

    13/10/2021

    Evencio Mediavilla
    emg@iac.es

    1
    Artist impression of a quasar. Credit: M. Kornmesser/ European Southern Observatory [Observatoire européen austral][Europäische Südsternwarte](EU)(CL)

    According to Einstein’s theory of general relativity gravity affects light as well as matter. One consequence of this theory, based on the Principle of Equivalence, is that the light which escapes from a region with a strong gravitational field loses energy on its way, so that it becomes redder, a phenomenon known as the gravitational redshift. Quantifying this gives a fundamental test of Einstein’s theory of gravitation. Until now this test had been performed only on bodies in the nearby universe, but thanks to the use of a new experimental procedure scientists at the Instituto de Astrofísica de Canarias (IAC) and The University of Granada [Universidad de Granada] (ES) have been able to measure the gravitational redshift in quasars, and thus extend the test to very distant regions from where the light was emitted when our universe was young.

Einstein’s Principle of Equivalence is the cornerstone of the General Theory of Relativity, which is our best current description of gravity and one of the basic theories of modern physics. The principle states that it is experimentally impossible to distinguish between a gravitational field and an accelerated motion of the observer. One of its predictions is that light emitted from within an intense gravitational field should undergo a measurable shift to lower energy, which for visible light means a shift toward the red, termed “redshift”.

This prediction has been confirmed frequently and with high precision close to the Earth, from the first measurements by R.V. Pound and G.A. Rebka at Harvard in 1959 to the most recent measurements with satellites. It has also been confirmed using observations of the Sun and of some stars, such as our neighbour Sirius B and the star S2, which orbits close to the supermassive black hole at the centre of the Galaxy. But confirming it with measurements beyond the Galaxy has proved difficult, and there have been only a few tests, with complicated measurements and low precision, in clusters of galaxies relatively near to us in cosmological terms.
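For a sense of the magnitudes involved, the weak-field prediction is z ≈ GM/(rc²) for light escaping from radius r around mass M. A minimal sketch with rounded textbook values (the Sirius B radius below is an illustrative approximation):

```python
# Weak-field gravitational redshift z ~ GM/(r c^2), valid for z << 1.
G = 6.674e-11      # m^3 kg^-1 s^-2
C = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # kg
R_SUN = 6.957e8    # m

def grav_redshift(mass_kg: float, radius_m: float) -> float:
    """First-order redshift of light escaping from radius_m around mass_kg."""
    return G * mass_kg / (radius_m * C**2)

print(grav_redshift(M_SUN, R_SUN))         # Sun: ~2e-6
print(grav_redshift(1.02 * M_SUN, 5.8e6))  # Sirius B (white dwarf): ~3e-4
```

The white dwarf’s compactness boosts the effect by two orders of magnitude over the Sun, which is why Sirius B was an early observational target.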

The reason for this lack of tests in the more distant universe is the difficulty of measuring the redshift: in the majority of situations the effect of gravity on the light is very small. Massive black holes, with their very strong gravitational fields, therefore offer one of the most promising scenarios for measuring gravitational redshifts, in particular the supermassive black holes situated at the centres of the extraordinarily luminous and distant quasars.

A quasar is an object in the sky which looks like a star but is situated at a great distance from us, so that the light we receive from it was emitted when the universe was much younger than now. To be visible at all from so far away, quasars must be extremely bright. The origin of this huge power output is a disc of hot material being swallowed by the supermassive black hole at the centre. The energy is generated in a very small region, barely a few light days in size.
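To put “a few light days” in more familiar units: the standard causality argument limits a source that varies on a timescale Δt to a size of roughly c·Δt. A quick conversion sketch (the three-day figure is illustrative):

```python
# Size scale set by light-travel time: R <~ c * dt.
C = 2.998e8    # m/s
DAY = 86400.0  # seconds per day
AU = 1.496e11  # metres per astronomical unit

light_day_m = C * DAY
print(light_day_m / AU)      # one light-day is ~173 AU
print(3 * light_day_m / AU)  # a 3-light-day region: ~520 AU across
```

A few hundred AU is tiny compared with a host galaxy tens of thousands of light-years across, yet this small region outshines the entire galaxy.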

    In the neighbourhood of the black hole there is a very intense gravitational field and so by studying the light emitted by the chemical elements in this region (mainly hydrogen, carbon, and magnesium) we would expect to measure very large gravitational redshifts. Unfortunately the majority of the elements in quasar accretion discs are also present in regions further out from the central black hole where the gravitational effects are much smaller, so the light we receive from those elements is a mixture in which it is not easy to pick out clearly the gravitational redshifts.

    The measurements cover 80% of the history of the universe

Now a team of researchers at the Instituto de Astrofísica de Canarias (IAC) and the University of Granada (UGR) have found a well defined portion of the ultraviolet light emitted by iron atoms from a region confined to the neighbourhood of the black hole. “Through our research related to gravitational lensing, another of the predictions of Einstein’s theory of General Relativity, we found that a characteristic spectral feature of iron in quasars seemed to be coming from a region very close to the black hole. Our measurements of the redshift confirmed this finding,” explains Evencio Mediavilla, an IAC researcher, Professor at the University of La Laguna (ULL) and first author of the article.

Using this feature the researchers have been able to measure the gravitational redshifts of many quasars clearly and precisely and, from them, estimate the masses of the black holes. “This technique marks an extraordinary advance, because it allows us to measure precisely the gravitational redshifts of individual objects at great distances, which opens up important possibilities for the future,” says Mediavilla.
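The article does not spell out the arithmetic, but in the weak-field limit the mass estimate is simply the redshift formula inverted: M = z·r·c²/G. A hedged sketch with invented, order-of-magnitude inputs (not values from the paper):

```python
# Black hole mass from a measured gravitational redshift:
# invert z = GM/(r c^2) to get M = z * r * c^2 / G.
G, C, M_SUN = 6.674e-11, 2.998e8, 1.989e30
LIGHT_DAY = 2.59e13  # metres in one light-day

def bh_mass_solar(z_grav: float, radius_m: float) -> float:
    """Mass (in solar masses) implied by redshift z_grav at emission radius radius_m."""
    return z_grav * radius_m * C**2 / G / M_SUN

# Illustrative only: iron emission from ~20 light-days with z = 1e-3
print(f"{bh_mass_solar(1e-3, 20 * LIGHT_DAY):.1e}")  # ~3.5e8 solar masses
```

Any uncertainty in the size of the emitting region translates linearly into the mass estimate, which is why tying the iron feature to a well-defined region close to the black hole is the crucial step.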

Jorge Jiménez Vicente, a researcher at the UGR and co-author of the article, stresses the implications of this new experimental procedure, as it allows comparison of the measured redshift with the theoretically predicted value: “this technique allows us for the first time to test Einstein’s Principle of Equivalence, and with it the basis of our understanding of gravity, on cosmological scales.”

This test of the Principle of Equivalence performed by these researchers is based on measurements which range from active galaxies in our neighbourhood (seen some 13,800 million years after the Big Bang) out to individual quasars at large distances, whose light was emitted when the age of the universe was only some 2,200 million years, thus covering around 80% of the history of the universe. “The results, with a precision comparable to those of experiments carried out within our Galaxy, validate the Principle of Equivalence over this vast period of time,” notes Jiménez-Vicente.

The article has been published in the journal The Astrophysical Journal, and was recently selected by The American Astronomical Society (US), which has published an interview with the researchers in the “AAS Journal Author Series” of its YouTube channel, a series whose aim is to connect authors with their articles, their personal histories, and the astronomical community in general.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    IAC Institute of Astrophysics of the Canary Islands [Instituto de Astrofísica de Canarias] (ES) operates two astronomical observatories in the Canary Islands:

    Roque de los Muchachos Observatory on La Palma
    Teide Observatory on Tenerife.

    The seeing statistics at ORM make it the second-best location for optical and infrared astronomy in the Northern Hemisphere, after Mauna Kea Observatory Hawaii (US).

    Maunakea Observatories Hawai’i (US) altitude 4,213 m (13,822 ft)

    The site also has some of the most extensive astronomical facilities in the Northern Hemisphere; its fleet of telescopes includes the 10.4 m Gran Telescopio Canarias, the world’s largest single-aperture optical telescope as of July 2009, the William Herschel Telescope (second largest in Europe), and the adaptive optics corrected Swedish 1-m Solar Telescope.

    Gran Telescopio Canarias [Instituto de Astrofísica de Canarias ](ES) sited on a volcanic peak 2,267 metres (7,438 ft) above sea level.

The observatory was established in 1985, after 15 years of international work and cooperation among several countries, with the Spanish island hosting many telescopes from Britain, The Netherlands, Spain, and other countries. The island provided better seeing conditions than Herstmonceux, the Royal Greenwich Observatory site from which telescopes were relocated, including the 98-inch-aperture Isaac Newton Telescope (the largest reflector in Europe at that time). When it was moved to the island it was given an upgraded 100-inch (2.54-meter) mirror, and many even larger telescopes from various nations would be hosted there.

    Teide Observatory [Observatorio del Teide], IAU code 954, is an astronomical observatory on Mount Teide at 2,390 metres (7,840 ft), located on Tenerife, Spain. It has been operated by the Instituto de Astrofísica de Canarias since its inauguration in 1964. It became one of the first major international observatories, attracting telescopes from different countries around the world because of the good astronomical seeing conditions. Later the emphasis for optical telescopes shifted more towards Roque de los Muchachos Observatory on La Palma.

     
  • richardmitnick 1:40 pm on October 13, 2021 Permalink | Reply
    Tags: "Eerie Discovery of 2 'Identical' Galaxies in Deep Space Is Finally Explained", , Astrophysics, , , ,   

    From Science Alert (US) : “Eerie Discovery of 2 ‘Identical’ Galaxies in Deep Space Is Finally Explained” 


    From Science Alert (US)

    13 OCTOBER 2021
    MICHELLE STARR

    1
    Hamilton’s Object. (Joseph DePasquale/Space Telescope Science Institute (US))

    Galaxies are a bit like fingerprints, or snowflakes. There are many of them out there, and they can have a lot of characteristics in common, but no two are exactly alike.

So, back in 2013, when two galaxies that looked startlingly similar were spotted side by side in the distant reaches of the Universe, astronomers were flummoxed.

    Now, they’ve finally solved the mystery of these strange “identical objects” – and the answer could have implications for understanding dark matter.

    The object, now named Hamilton’s Object, was discovered by astronomer Timothy Hamilton of Shawnee State University (US) by accident, in data obtained by the Hubble Space Telescope nearly a decade ago.

    The two galaxies appeared to be the same shape, and had the same nearly parallel dark streaks across the galactic bulge – the central region of the galaxy where most of the stars live.

    “We were really stumped,” Hamilton said. “My first thought was that maybe they were interacting galaxies with tidally stretched-out arms. It didn’t really fit well, but I didn’t know what else to think.”

    It wasn’t until 2015 that a more plausible answer would emerge. Astronomer Richard Griffiths of The University of Hawaii (US), on seeing Hamilton present his object at a meeting, suggested that the culprit might be a rare phenomenon: gravitational lensing.

    This is a phenomenon that results purely from a chance alignment of massive objects in space. If a massive object sits directly between us and a more distant object, a magnification effect occurs due to the gravitational curvature of space-time around the closer object.

    Any light that then travels through this space-time follows this curvature and enters our telescopes smeared and distorted to varying degrees – but also often magnified and duplicated.
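The characteristic angular scale on which a lens magnifies and duplicates background sources is its Einstein radius, θ_E = sqrt(4GM/c² · D_LS/(D_L·D_S)), where D_L, D_S, and D_LS are angular-diameter distances to the lens, to the source, and between the two. A rough sketch (the mass and distances are invented placeholders, not values from this study):

```python
import math

# Einstein radius of a lens of mass M for given angular-diameter distances.
G, C, M_SUN = 6.674e-11, 2.998e8, 1.989e30
GPC = 3.0857e25          # metres in one gigaparsec
RAD_TO_ARCSEC = 206265.0

def einstein_radius_arcsec(mass_kg, d_lens, d_src, d_lens_src):
    theta = math.sqrt(4 * G * mass_kg / C**2 * d_lens_src / (d_lens * d_src))
    return theta * RAD_TO_ARCSEC

# A ~1e14 solar-mass cluster lensing a much more distant galaxy:
print(einstein_radius_arcsec(1e14 * M_SUN, 1.5 * GPC, 2.4 * GPC, 1.2 * GPC))
# -> roughly 17 arcseconds with these inputs, the scale of cluster-lensing arcs
```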

    This made a lot more sense than two identical galaxies, especially when Griffiths found yet another duplication of the galaxy (as can be seen in the picture below).

    A huge problem, however, remained: What was causing the gravitational curvature? So Griffiths and his team set about searching sky survey data for an object massive enough to produce the lensing effect.

    And they found it. Between us and Hamilton’s Object lurks a cluster of galaxies that had only been poorly documented. Usually, these discoveries go the other way – first the cluster is identified, and then astronomers go looking for lensed galaxies behind them.

The team’s work revealed that Hamilton’s Object is around 11 billion light-years away, and a different team’s work revealed that the cluster is about 7 billion light-years away.

    The galaxy itself is a barred spiral galaxy with its edge facing us, undergoing clumpy and uneven star formation, the researchers determined. Computer simulations then helped determine that the three duplicated images could only be created if the distribution of dark matter is smooth at small scales.

    3
    (Joseph DePasquale/STScI)

“It’s great that we only need two mirror images in order to get the scale of how clumpy or not dark matter can be at these positions,” said astronomer Jenny Wagner of The Ruprecht Karl University of Heidelberg [Ruprecht-Karls-Universität Heidelberg] (DE).

    “Here, we don’t use any lens models. We just take the observables of the multiple images and the fact they can be transformed into one another. They can be folded into one another by our method. This already gives us an idea of how smooth the dark matter needs to be at these two positions.”

    The two identical side-by-side images were created because they straddle a “ripple” in space-time – an area of greatest magnification created by the gravity of a filament of dark matter. Such filaments are thought to connect the Universe in a vast, invisible cosmic web, joining galaxies and galaxy clusters and feeding them with hydrogen gas.

    But we don’t actually know what dark matter is, so any new discovery that lets us map where it is, how it’s distributed, and how it affects the space around it is another drop of evidence that will ultimately help us solve the mystery.

    “We know it’s some form of matter, but we have no idea what the constituent particle is,” Griffiths explained.

    “So we don’t know how it behaves at all. We just know that it has mass and is subject to gravity. The significance of the limits of size on the clumping or smoothness is that it gives us some clues as to what the particle might be. The smaller the dark matter clumps, the more massive the particles must be.”

The research has been published in the Monthly Notices of the Royal Astronomical Society (MNRAS).

    See the full article here .



    Please help promote STEM in your local schools.


    Stem Education Coalition

     
  • richardmitnick 10:56 am on October 13, 2021 Permalink | Reply
    Tags: "A Crystal Ball Into Our Solar System’s Future", , Astrophysics, , , ,   

    From W.M. Keck Observatory (US) : “A Crystal Ball Into Our Solar System’s Future” 

    From W.M. Keck Observatory (US)

    October 13, 2021

    Mari-Ela Chock, Communications Officer
    W. M. Keck Observatory
    (808) 554-0567
    mchock@keck.hawaii.edu

    Giant Gas Planet Orbiting a Dead Star Gives Glimpse Into the Predicted Aftermath of our Sun’s Demise.

    1
Artist’s rendition of a newly-discovered Jupiter-like exoplanet orbiting a white dwarf, or dead star. This system is evidence that planets can survive their host star’s explosive red giant phase, and it is the very first confirmed planetary system that serves as an analog to the fate of the Sun and Jupiter in our own solar system.
    Credit: Adam Makarenko/ W. M. Keck Observatory.

    Astronomers have discovered the very first confirmed planetary system that resembles the expected fate of our solar system, when the Sun reaches the end of its life in about five billion years.

    The researchers detected the system using W. M. Keck Observatory on Maunakea in Hawaiʻi; it consists of a Jupiter-like planet with a Jupiter-like orbit revolving around a white dwarf star located near the center of our Milky Way galaxy.

    “This evidence confirms that planets orbiting at a large enough distance can continue to exist after their star’s death,” says Joshua Blackman, an astronomy postdoctoral researcher at the The University of Tasmania (AU) and lead author of the study. “Given that this system is an analog to our own solar system, it suggests that Jupiter and Saturn might survive the Sun’s red giant phase, when it runs out of nuclear fuel and self-destructs.”

    The study is published in today’s issue of the journal Nature.

    “Earth’s future may not be so rosy because it is much closer to the Sun,” says co-author David Bennett, a senior research scientist at The University of Maryland (US) and The Goddard Space Flight Center | NASA (US). “If humankind wanted to move to a moon of Jupiter or Saturn before the Sun fried the Earth during its red supergiant phase, we’d still remain in orbit around the Sun, although we would not be able to rely on heat from the Sun as a white dwarf for very long.”

A white dwarf is what main sequence stars like our Sun become when they die. In the last stages of the stellar life cycle, a star burns off all of the hydrogen in its core and balloons into a red giant star. It then collapses in on itself, shrinking into a white dwarf: all that is left is a hot, dense core, typically Earth-sized and about half as massive as the Sun. Because these compact stellar corpses are small and no longer have the nuclear fuel to radiate brightly, white dwarfs are very faint and difficult to detect.
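Those two figures imply a startling mean density, easy to check with the numbers quoted above (a back-of-envelope sketch, treating the star as a uniform sphere):

```python
import math

# Mean density of half a solar mass packed into an Earth-sized sphere.
M_SUN = 1.989e30   # kg
R_EARTH = 6.371e6  # m

mass = 0.5 * M_SUN
volume = 4.0 / 3.0 * math.pi * R_EARTH**3
print(mass / volume)  # ~9e8 kg/m^3: roughly a tonne per cubic centimetre
```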

    Animation showing an artist’s rendering of a main sequence star ballooning into a red giant as it burns the last of its hydrogen fuel, then collapses into a white dwarf. What remains is a hot, dense core roughly the size of Earth and about half the mass of the Sun. A gas giant similar to Jupiter orbits from a distance, surviving the explosive transformation. Credit: Adam Makarenko/W. M. Keck Observatory.

    See the full article here .


    Please help promote STEM in your local schools.


    Stem Education Coalition

    Mission
    To advance the frontiers of astronomy and share our discoveries with the world.

    The W. M. Keck Observatory (US) operates the largest, most scientifically productive telescopes on Earth. The two, 10-meter optical/infrared telescopes on the summit of Mauna Kea on the Island of Hawaii feature a suite of advanced instruments including imagers, multi-object spectrographs, high-resolution spectrographs, integral-field spectrometer and world-leading laser guide star adaptive optics systems. Keck Observatory is a private 501(c) 3 non-profit organization and a scientific partnership of the California Institute of Technology, the University of California and NASA.

    Today Keck Observatory is supported by both public funding sources and private philanthropy. As a 501(c)3, the organization is managed by the California Association for Research in Astronomy (CARA), whose Board of Directors includes representatives from the California Institute of Technology and the University of California, with liaisons to the board from NASA and the Keck Foundation.



    Instrumentation

    Keck 1

HIRES – The largest and most mechanically complex of Keck’s main instruments, the High Resolution Echelle Spectrometer breaks up incoming starlight into its component colors to measure the precise intensity of each of thousands of color channels. Its spectral capabilities have resulted in many breakthrough discoveries, such as the detection of planets outside our solar system and direct evidence for a model of the Big Bang theory.

Keck High-Resolution Echelle Spectrometer (HIRES), at the Keck I telescope.

    LRIS – The Low Resolution Imaging Spectrograph is a faint-light instrument capable of taking spectra and images of the most distant known objects in the universe. The instrument is equipped with a red arm and a blue arm to explore stellar populations of distant galaxies, active galactic nuclei, galactic clusters, and quasars.

    VISIBLE BAND (0.3-1.0 Micron)

    MOSFIRE – The Multi-Object Spectrograph for Infrared Exploration gathers thousands of spectra from objects spanning a variety of distances, environments and physical conditions. What makes this huge, vacuum-cryogenic instrument unique is its ability to select up to 46 individual objects in the field of view and then record the infrared spectrum of all 46 objects simultaneously. When a new field is selected, a robotic mechanism inside the vacuum chamber reconfigures the distribution of tiny slits in the focal plane in under six minutes. Eight years in the making with First Light in 2012, MOSFIRE’s early performance results range from the discovery of ultra-cool, nearby substellar mass objects, to the detection of oxygen in young galaxies only 2 billion years after the Big Bang.

OSIRIS – The OH-Suppressing Infrared Imaging Spectrograph is a near-infrared spectrograph for use with the Keck I adaptive optics system. OSIRIS takes spectra in a small field of view to provide a series of images at different wavelengths. The instrument allows astronomers to ignore wavelengths where the Earth’s atmosphere shines brightly due to emission from OH (hydroxyl) molecules, thus allowing the detection of objects 10 times fainter than previously available.

    Keck 2

    DEIMOS – The Deep Extragalactic Imaging Multi-Object Spectrograph is the most advanced optical spectrograph in the world, capable of gathering spectra from 130 galaxies or more in a single exposure. In ‘Mega Mask’ mode, DEIMOS can take spectra of more than 1,200 objects at once, using a special narrow-band filter.

    NIRSPEC – The Near Infrared Spectrometer studies very high redshift radio galaxies, the motions and types of stars located near the Galactic Center, the nature of brown dwarfs, the nuclear regions of dusty starburst galaxies, active galactic nuclei, interstellar chemistry, stellar physics, and solar-system science.


    ESI – The Echellette Spectrograph and Imager captures high-resolution spectra of very faint galaxies and quasars ranging from the blue to the infrared in a single exposure. It is a multimode instrument that allows users to switch among three modes during a night. It has produced some of the best non-AO images at the Observatory.

KCWI – The Keck Cosmic Web Imager is designed to provide visible band, integral field spectroscopy with moderate to high spectral resolution, various fields of view and image resolution formats and excellent sky-subtraction. The astronomical seeing and large aperture of the telescope enable studies of the connection between galaxies and the gas in their dark matter halos, stellar relics, star clusters and lensed galaxies.

    NEAR-INFRARED (1-5 Micron)

ADAPTIVE OPTICS – Adaptive optics senses and compensates for the atmospheric distortions of incoming starlight up to 1,000 times per second. This results in an improvement in image quality on fairly bright astronomical targets by a factor of 10 to 20.

    LASER GUIDE STAR ADAPTIVE OPTICS [pictured above] – The Keck Laser Guide Star expands the range of available targets for study with both the Keck I and Keck II adaptive optics systems. They use sodium lasers to excite sodium atoms that naturally exist in the atmosphere 90 km (55 miles) above the Earth’s surface. The laser creates an “artificial star” that allows the Keck adaptive optics system to observe 70-80 percent of the targets in the sky, compared to the 1 percent accessible without the laser.

    NIRC-2/AO – The second generation Near Infrared Camera works with the Keck Adaptive Optics system to produce the highest-resolution ground-based images and spectroscopy in the 1-5 micron range. Typical programs include mapping surface features on solar system bodies, searching for planets around other stars, and analyzing the morphology of remote galaxies.


    ABOUT NIRES
    The Near Infrared Echellette Spectrograph (NIRES) is a prism cross-dispersed near-infrared spectrograph built at the California Institute of Technology by a team led by Chief Instrument Scientist Keith Matthews and Prof. Tom Soifer. Commissioned in 2018, NIRES covers a large wavelength range at moderate spectral resolution for use on the Keck II telescope and observes extremely faint red objects found with the Spitzer and WISE infrared space telescopes, as well as brown dwarfs, high-redshift galaxies, and quasars.

    Future Instrumentation

    KCRM – The Keck Cosmic Reionization Mapper will complete the Keck Cosmic Web Imager (KCWI), the world’s most capable spectroscopic imager. The design for KCWI includes two separate channels to detect light in the blue and the red portions of the visible wavelength spectrum. KCWI-Blue was commissioned and started routine science observations in September 2017. The red channel of KCWI is KCRM; a powerful addition that will open a window for new discoveries at high redshifts.

    KPF – The Keck Planet Finder (KPF) will be the most advanced spectrometer of its kind in the world. The instrument is a fiber-fed high-resolution, two-channel cross-dispersed echelle spectrometer for the visible wavelengths and is designed for the Keck II telescope. KPF allows precise measurements of the mass-density relationship in Earth-like exoplanets, which will help astronomers identify planets around other stars that are capable of supporting life.

     
  • richardmitnick 1:40 pm on October 12, 2021 Permalink | Reply
    Tags: "Stellar fossils in meteorites point to distant stars", , Astrophysics, , , NanoSIMS: a state-of-the-art mass spectrometer,   

    From Washington University in St. Louis (US) : “Stellar fossils in meteorites point to distant stars” 


    From Washington University in St. Louis (US)

    October 12, 2021

    Talia Ogliore
    talia.ogliore@wustl.edu

    Some pristine meteorites contain a record of the original building blocks of the solar system, including grains that formed in ancient stars that died before the sun formed. One of the biggest challenges in studying these presolar grains is to determine the type of star each grain came from.

    1
    An electron microscope image of a micron-sized silicon carbide stardust grain (lower right). The grain is coated with meteoritic organics (dark gunk on the left side of the grain). Such grains formed in the cooling winds lost from the surface of low-mass carbon-rich stars near the end of their lives, typified here (upper left) by a Hubble Space Telescope image of the asymptotic giant branch (AGB) star U Camelopardalis. Laboratory analysis of such tiny dust grains provides unique information on nuclear reactions in low-mass AGB stars and their evolutions. (Image credits: NASA, Nan Liu and Andrew Davis)

Nan Liu, research assistant professor of physics in Arts & Sciences at Washington University in St. Louis, is first author of a new study in The Astrophysical Journal Letters that analyzes a diverse set of presolar grains with the goal of pinning down their true stellar origins.

Liu and her team used a state-of-the-art mass spectrometer called NanoSIMS to measure isotopes of a suite of elements, including N, Mg, and Al, in presolar silicon carbide (SiC) grains. By refining their analytical protocols and also utilizing a new-generation plasma ion source, the scientists were able to visualize their samples with better spatial resolution than could be accomplished in previous studies.

    “Presolar grains have been embedded in meteorites for 4.6 billion years and are sometimes coated with solar materials on the surface,” Liu said. “Thanks to the improved spatial resolution, our team was able to see Al contamination attached on the surface of a grain and to obtain true stellar signatures by including signals only from the core of the grain during the data reduction.”

    The scientists sputtered the grains using an ion beam for extended periods of time to expose clean, interior grain surfaces for their isotopic analyses. The researchers found that the N isotope ratios of the same grain greatly increased after the grain was exposed to extended ion sputtering.

Isotope ratios can rarely be measured for stars, but C and N isotopes are two exceptions. The new C and N isotope data for the presolar grains reported in this study directly link the grains to different types of carbon stars based on these stars’ observed isotopic ratios.

    “The new isotopic data obtained in this study are exciting for stellar physicists and nuclear astrophysicists like me,” said Maurizio Busso, a co-author of the study who is based at the University of Perugia, in Italy. “Indeed, the ‘strange’ N isotopic ratios of presolar SiC grains have been in the last two decades a remarkable source of concern. The new data explain the difference between what was originally present in the presolar stardust grains and what was attached later, thus solving a long-standing puzzle in the community.”

    2
    NanoSIMS images of a SiC grain. The upper panel shows images taken at a spatial resolution of ~1 μm, the typical resolution of previous analyses. The lower panel shows the same grain’s ion images taken at a spatial resolution of 100 nm, the resolution achieved in this study. (Image courtesy of Nan Liu)

The study also includes a significant exploration of the radioactive isotope aluminum-26 (26Al), an important heat source during the evolution of young planetary bodies in the early solar system and in other extrasolar systems. The scientists inferred the initial presence of large amounts of 26Al in all measured grains, as predicted by current models. The study determined how much 26Al was produced by the “parent stars” of the grains they measured. Liu and her collaborators concluded that stellar model predictions for 26Al are too high by at least a factor of two compared to the grain data.
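The paper’s exact data reduction is not described here, but the conventional way to infer initial 26Al, which is long extinct, is an Al-Mg isochron: grains that condensed with live 26Al now carry excess 26Mg in proportion to their Al/Mg ratio, and the slope of 26Mg/24Mg against 27Al/24Mg gives the initial 26Al/27Al. A minimal sketch with invented spot measurements:

```python
import numpy as np

# Al-Mg isochron: slope of 26Mg/24Mg vs 27Al/24Mg = initial 26Al/27Al.
# The "measurements" below are synthetic, for illustration only.
al27_mg24 = np.array([50.0, 120.0, 300.0, 800.0])  # 27Al/24Mg per spot
true_slope = 1.0e-3                                # assumed initial 26Al/27Al
mg26_mg24 = 0.13932 + true_slope * al27_mg24       # solar ratio + radiogenic excess
mg26_mg24 += np.random.default_rng(0).normal(0.0, 2e-3, al27_mg24.size)

slope, intercept = np.polyfit(al27_mg24, mg26_mg24, 1)
print(f"inferred initial 26Al/27Al ~ {slope:.2e}")  # recovers ~1e-3
```

Comparing such grain-derived ratios with the 26Al/27Al predicted by stellar models is what yields the factor-of-two discrepancy quoted above.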

    The data-model offsets likely point to uncertainties in relevant nuclear reaction rates, Liu noted, and will motivate nuclear physicists to pursue better measurements of these reaction rates in the future.

    The team’s results link some of the presolar grains in this collection to poorly known carbon stars with peculiar chemical compositions.

    The grains’ isotopic data point to H-burning processes occurring in such carbon stars at higher-than-expected temperatures. This information will help astrophysicists to construct stellar models to better understand the evolution of these stellar objects.

    “As we learn more about the sources for dust, we can gain additional knowledge about the history of the universe and how various stellar objects within it evolve,” Liu said.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition


    Washington University in St. Louis (US) is a private research university in Greater St. Louis with its main campus (Danforth) mostly in unincorporated St. Louis County, Missouri, and Clayton, Missouri. It also has a West Campus in Clayton, North Campus in the West End neighborhood of St. Louis, Missouri, and Medical Campus in the Central West End neighborhood of St. Louis, Missouri.

    Founded in 1853 and named after George Washington, the university has students and faculty from all 50 U.S. states and more than 120 countries. Washington University is composed of seven graduate and undergraduate schools that encompass a broad range of academic fields. To prevent confusion over its location, the Board of Trustees added the phrase “in St. Louis” in 1976. Washington University is a member of the Association of American Universities (US) and is classified among “R1: Doctoral Universities – Very high research activity”.

    As of 2020, 25 Nobel laureates in economics, physiology and medicine, chemistry, and physics have been affiliated with Washington University, ten having done the major part of their pioneering research at the university. In 2019, Clarivate Analytics ranked Washington University 7th in the world for most cited researchers. The university also received the 4th highest amount of National Institutes of Health (US) medical research grants among medical schools in 2019.

    Research

    Virtually all faculty members at Washington University engage in academic research, offering opportunities for both undergraduate and graduate students across the university’s seven schools. Known for its interdisciplinary and departmental collaboration, many of Washington University’s research centers and institutes are collaborative efforts between many areas on campus. More than 60% of undergraduates are involved in faculty research across all areas; it is an institutional priority for undergraduates to be allowed to participate in advanced research. According to the Center for Measuring University Performance, it is considered to be one of the top 10 private research universities in the nation. A dedicated Office of Undergraduate Research is located on the Danforth Campus and serves as a resource to post research opportunities, advise students in finding appropriate positions matching their interests, publish undergraduate research journals, and award research grants to make it financially possible to perform research.

According to the National Science Foundation (US), Washington University spent $816 million on research and development in 2018, ranking it 27th in the nation. The university has over 150 National Institutes of Health funded inventions, with many of them licensed to private companies. Governmental agencies and non-profit foundations such as the NIH, Department of Defense (US), National Science Foundation, and National Aeronautics Space Agency (US) provide the majority of research grant funding, with Washington University being one of the top recipients in NIH grants from year-to-year. Nearly 80% of NIH grants to institutions in the state of Missouri went to Washington University alone in 2007. Washington University and its Medical School played a large part in the Human Genome Project, contributing approximately 25% of the finished sequence. The Genome Sequencing Center has decoded the genome of many animals, plants, and cellular organisms, including the platypus, chimpanzee, cat, and corn.

    NASA hosts its Planetary Data System Geosciences Node on the campus of Washington University. Professors, students, and researchers have been heavily involved with many unmanned missions to Mars. Professor Raymond Arvidson has been deputy principal investigator of the Mars Exploration Rover mission and co-investigator of the Phoenix lander robotic arm.

    Washington University professor Joseph Lowenstein, with the assistance of several undergraduate students, has been involved in editing, annotating, making a digital archive of the first publication of poet Edmund Spenser’s collective works in 100 years. A large grant from the National Endowment for the Humanities (US) has been given to support this ambitious project centralized at Washington University with support from other colleges in the United States.

    In 2019, Folding@Home (US), a distributed computing project for performing molecular dynamics simulations of protein dynamics, was moved to Washington University School of Medicine from Stanford University (US). The project, currently led by Dr. Greg Bowman, uses the idle CPU time of personal computers owned by volunteers to conduct protein folding research. Folding@home’s research is primarily focused on biomedical problems such as Alzheimer’s disease, Cancer, Coronavirus disease 2019, and Ebola virus disease. In April 2020, Folding@home became the world’s first exaFLOP computing system with a peak performance of 1.5 exaflops, making it more than seven times faster than the world’s fastest supercomputer, Summit, and more powerful than the top 100 supercomputers in the world, combined.

     
  • richardmitnick 1:05 pm on October 12, 2021 Permalink | Reply
    Tags: "G344.7-0.1- When a Stable Star Explodes", A white dwarf with a nearby companion star can become a cosmic powder keg if the companion's orbit brings it too close., Astrophysics, , , Encounters between white dwarfs and "normal" companion stars are one likely source of Type Ia supernova explosions., , One way to investigate the explosion mechanism is to look at the elements left behind by the supernova in its debris or ejecta., Supernova remnant G344.7-0.1, White dwarfs are among the most stable of stars.,   

    From National Aeronautics and Space Administration (US) Chandra X-ray Telescope (US): “G344.7-0.1- When a Stable Star Explodes” 


    From National Aeronautics and Space Administration (US) Chandra X-ray Telescope (US)

    October 12, 2021

    5
    Composite

    6
    X-ray

    7
    Infrared

    8
    Radio

The supernova remnant G344.7-0.1 lies across the Milky Way, about 19,600 light-years from Earth.

    It belongs to a class of supernovas called “Type Ia” that can result from a white dwarf accumulating material from a companion star until it explodes.

    A new composite image contains X-rays from Chandra (blue), infrared data from Spitzer (yellow and green) and radio data from two telescopes (red).

    National Aeronautics and Space Administration(US) Spitzer Infrared Space Telescope no longer in service. Launched in 2003 and retired on 30 January 2020.

    Chandra’s data reveal different elements such as iron, silicon, sulfur and others found in the aftermath of the stellar explosion.

White dwarfs are among the most stable of stars. Left on their own, these stars, which have exhausted most of their nuclear fuel and shrunk to a relatively small size while typically remaining about as massive as the Sun, can last for billions or even trillions of years.

    However, a white dwarf with a nearby companion star can become a cosmic powder keg. If the companion’s orbit brings it too close, the white dwarf can pull material from it until the white dwarf grows so much that it becomes unstable and explodes. This kind of stellar blast is called a Type Ia supernova.

    While it is generally accepted by astronomers that such encounters between white dwarfs and “normal” companion stars are one likely source of Type Ia supernova explosions, many details of the process are not well understood. One way to investigate the explosion mechanism is to look at the elements left behind by the supernova in its debris or ejecta.

    This new composite image shows G344.7-0.1, a supernova remnant created by a Type Ia supernova, through the eyes of different telescopes. X-rays from NASA’s Chandra X-ray Observatory (blue) have been combined with infrared data from NASA’s Spitzer Space Telescope (yellow and green) as well as radio data from the NSF’s Very Large Array and the Commonwealth Scientific and Industrial Research Organisation’s Australia Telescope Compact Array (red).

    National Radio Astronomy Observatory(US)Karl G Jansky Very Large Array located in central New Mexico on the Plains of San Agustin, between the towns of Magdalena and Datil, ~50 miles (80 km) west of Socorro. The VLA comprises twenty-eight 25-meter radio telescopes.

    Chandra is one of the best tools available for scientists to study supernova remnants and measure the composition and distribution of “heavy” elements — that is, anything other than hydrogen and helium — they contain.

    Astronomers estimate that G344.7-0.1 is about 3,000 to 6,000 years old in Earth’s time frame. On the other hand, the most well-known and widely-observed Type Ia remnants, including Kepler, Tycho, and SN 1006, have all exploded within the last millennium or so as seen from Earth. Therefore, this deep look at G344.7-0.1 with Chandra gives astronomers a window into an important phase later in the evolution of a Type Ia supernova remnant.

Both the expanding blast wave and the stellar debris produce X-rays in supernova remnants. As the debris moves outward from the initial explosion, it encounters resistance from surrounding gas and slows down, creating a reverse shock wave that travels back toward the center of the explosion. The process is analogous to a traffic jam on a highway: as time passes, an increasing number of cars stop or slow down behind the accident, so the jam travels backwards. The reverse shock heats the debris to millions of degrees, causing it to glow in X-rays.

    Type Ia remnants like Kepler, Tycho and SN 1006 are too young for the reverse shock to have time to plausibly travel backwards to heat all of the debris in the remnant’s center. However, the relatively advanced age of G344.7-0.1 means that the reverse shock has moved back through the entire debris field.

A separate color version of only the Chandra data shows X-ray emission from iron (blue) and silicon (red), and X-rays produced by the acceleration of electrons as they are deflected by positively charged atomic nuclei (green). The region with the highest density of iron and the arc-like structures of silicon are labeled.

    1
    Composite

    2
    Iron

    3
    Silicon

    4
    Hard Band (3-6 keV)

The Chandra image of G344.7-0.1 shows that the region with the highest density of iron (blue) is surrounded by arc-like structures (green) containing silicon. Similar arc-like structures are found for sulfur, argon, and calcium. The Chandra data also suggest that the region with the highest density of iron has been heated by the reverse shock more recently than the elements in the arc-like structures, implying that it is located near the true center of the stellar explosion. These results support the predictions of models for Type Ia supernova explosions, which show that heavier elements are produced in the interior of an exploding white dwarf.

    This three-color Chandra image also shows that the densest iron is located to the right of the supernova remnant’s geometric center. This asymmetry is likely caused by gas surrounding the remnant being denser on the right than it is on the left.

    A paper describing these results was published in the July 1st, 2020 issue of The Astrophysical Journal. The authors of the study are Kotaro Fukushima (Tokyo University of Science, Japan), Hiroya Yamaguchi (JAXA), Patrick Slane (Center for Astrophysics | Harvard & Smithsonian), Sangwook Park (University of Texas, Austin), Satoru Katsuda (Saitama University, Japan), Hidetoshi Sano (Nagoya University, Japan), Laura Lopez (The Ohio State University, Columbus), Paul Plucinsky (Center for Astrophysics), Shogo Kobayashi (Tokyo University of Science), and Kyoko Matsushita (Tokyo University of Science). The radio data were provided by Elsa Giacani from the Institute of Astronomy and Space Physics, who led a study of G344.7-0.1 published in 2011 in the journal Astronomy and Astrophysics.


    Quick Look: When a Stable Star Explodes.

    See the full article here .



    Please help promote STEM in your local schools.

    Stem Education Coalition

    NASA’s Marshall Space Flight Center in Huntsville, Ala., manages the Chandra program for NASA’s Science Mission Directorate in Washington. The Smithsonian Astrophysical Observatory controls Chandra’s science and flight operations from Cambridge, Mass.

    In 1976 the Chandra X-ray Observatory (called AXAF at the time) was proposed to National Aeronautics and Space Administration (US) by Riccardo Giacconi and Harvey Tananbaum. Preliminary work began the following year at NASA’s Marshall Space Flight Center(US) and the Harvard Smithsonian Center for Astrophysics(US) . In the meantime, in 1978, NASA launched the first imaging X-ray telescope, Einstein (HEAO-2), into orbit. Work continued on the AXAF project throughout the 1980s and 1990s. In 1992, to reduce costs, the spacecraft was redesigned. Four of the twelve planned mirrors were eliminated, as were two of the six scientific instruments. AXAF’s planned orbit was changed to an elliptical one, reaching one third of the way to the Moon’s at its farthest point. This eliminated the possibility of improvement or repair by the space shuttle but put the observatory above the Earth’s radiation belts for most of its orbit. AXAF was assembled and tested by TRW (now Northrop Grumman Aerospace Systems) in Redondo Beach, California.

    AXAF was renamed Chandra as part of a contest held by NASA in 1998, which drew more than 6,000 submissions worldwide. The contest winners, Jatila van der Veen and Tyrel Johnson (then a high school teacher and high school student, respectively), suggested the name in honor of Nobel Prize–winning Indian-American astrophysicist Subrahmanyan Chandrasekhar. He is known for his work in determining the maximum mass of white dwarf stars, leading to greater understanding of high energy astronomical phenomena such as neutron stars and black holes. Fittingly, the name Chandra means “moon” in Sanskrit.

    Originally scheduled to be launched in December 1998, the spacecraft was delayed several months, eventually being launched on July 23, 1999, at 04:31 UTC by Space Shuttle Columbia during STS-93. Chandra was deployed from Columbia at 11:47 UTC. The Inertial Upper Stage’s first stage motor ignited at 12:48 UTC, and after burning for 125 seconds and separating, the second stage ignited at 12:51 UTC and burned for 117 seconds. At 22,753 kilograms (50,162 lb), it was the heaviest payload ever launched by the shuttle, a consequence of the two-stage Inertial Upper Stage booster rocket system needed to transport the spacecraft to its high orbit.

    Chandra has been returning data since the month after it launched. It is operated by the SAO at the Chandra X-ray Center in Cambridge, Massachusetts, with assistance from the Massachusetts Institute of Technology (US) and Northrop Grumman Space Technology. The ACIS CCDs suffered particle damage during early radiation belt passages. To prevent further damage, the instrument is now removed from the telescope’s focal plane during passages.

    Although Chandra was initially given an expected lifetime of 5 years, on September 4, 2001, NASA extended its lifetime to 10 years “based on the observatory’s outstanding results.” Physically Chandra could last much longer. A 2004 study performed at the Chandra X-ray Center indicated that the observatory could last at least 15 years.

    In July 2008, the International X-ray Observatory, a joint project between the European Space Agency [Agence spatiale européenne][Europäische Weltraumorganisation] (EU), NASA and the Japan Aerospace Exploration Agency (JAXA) [国立研究開発法人宇宙航空研究開発機構] (JP), was proposed as the next major X-ray observatory but was later cancelled. ESA later resurrected a downsized version of the project as the Advanced Telescope for High Energy Astrophysics (ATHENA), with a proposed launch in 2028.

    European Space Agency [Agence spatiale européenne][Europäische Weltraumorganisation](EU) Athena spacecraft depiction

    On October 10, 2018, Chandra entered safe mode operations, due to a gyroscope glitch. NASA reported that all science instruments were safe. Within days, the 3-second error in data from one gyro was understood, and plans were made to return Chandra to full service. The gyroscope that experienced the glitch was placed in reserve and is otherwise healthy.

     
  • richardmitnick 9:37 am on October 12, 2021 Permalink | Reply
    Tags: "Is dark matter cold or warm or hot?", Astrophysics, , , , , , , , ,   

    From Symmetry: “Is dark matter cold or warm or hot?” 

    Symmetry Mag

    From Symmetry

    10/12/21
    Glennda Chui

    The answer has to do with dark matter’s role in shaping the cosmos.

    Milky Way Dark Matter Halo. Credit: L. Calçada / European Southern Observatory [Observatoire européen austral][Europäische Südsternwarte] (EU) (CL).

    Half a century after Vera Rubin and Kent Ford confirmed that a form of invisible matter—now called dark matter—is required to account for the rotation of galaxies, the evidence for its existence is overwhelming.
    _____________________________________________________________________________________
    Dark Matter Background
    Fritz Zwicky discovered dark matter in the 1930s while observing the motion of the Coma Cluster. Vera Rubin, a woman in STEM who was denied the Nobel Prize, did most of the work on dark matter some 30 years later.

    Fritz Zwicky from http://palomarskies.blogspot.com.

    Coma cluster via NASA/ESA Hubble.

    In modern times, it was astronomer Fritz Zwicky, in the 1930s, who made the first observations of what we now call dark matter. His 1933 observations of the Coma Cluster of galaxies seemed to indicate that it has a mass 500 times greater than that previously calculated by Edwin Hubble. Furthermore, this extra mass seemed to be completely invisible. Although Zwicky’s observations were initially met with much skepticism, they were later confirmed by other groups of astronomers.
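
    Zwicky’s argument can be made concrete with the virial theorem, which ties a cluster’s total mass to how fast its member galaxies move. Below is a back-of-the-envelope sketch in Python; the velocity dispersion and radius are round, illustrative values, not Zwicky’s actual figures:

    # Toy virial-mass estimate in the spirit of Zwicky's Coma Cluster argument.
    # All input numbers are illustrative round values.
    G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
    M_SUN = 1.989e30       # solar mass, kg
    MPC = 3.086e22         # one megaparsec in metres

    sigma = 1.0e6          # line-of-sight velocity dispersion, m/s (~1000 km/s)
    radius = 1.0 * MPC     # characteristic cluster radius

    # Order-of-magnitude virial theorem: M ~ 5 * sigma**2 * R / G
    mass_dynamical = 5 * sigma**2 * radius / G
    print(f"Dynamical mass ~ {mass_dynamical / M_SUN:.1e} solar masses")
    # Comparing this (~1e15 solar masses) with the mass implied by the cluster's
    # starlight is what exposed the large discrepancy now attributed to dark matter.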

    Thirty years later, astronomer Vera Rubin provided a huge piece of evidence for the existence of dark matter. She discovered that the outer regions of galaxies rotate just as fast as their centers, whereas, if the visible matter were all there is, the outskirts should orbit more slowly, just as the outer planets orbit the Sun more slowly than the inner ones. The only way to explain the flat rotation curves she measured is if each visible galaxy is merely the central region of some much larger structure, like the label at the middle of a vinyl LP, so that the whole disc turns together from center to edge.
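
    The mismatch Rubin found is easy to sketch numerically. In the toy Python model below, the luminous mass and the flat 220 km/s curve are assumed, illustrative values; the point is the widening gap between the Keplerian prediction and what is actually observed:

    import numpy as np

    # Keplerian prediction (visible mass concentrated toward the centre)
    # versus the roughly flat rotation curves Rubin measured.
    G = 6.674e-11                        # m^3 kg^-1 s^-2
    M_SUN = 1.989e30                     # kg
    KPC = 3.086e19                       # one kiloparsec in metres

    M_visible = 1.0e11 * M_SUN           # illustrative luminous mass of a galaxy
    radii = np.linspace(2, 30, 8) * KPC  # sample radii from 2 to 30 kpc

    v_kepler = np.sqrt(G * M_visible / radii)  # falls off as r**-0.5
    v_observed = np.full_like(radii, 220e3)    # ~220 km/s, roughly flat

    for r, vk, vo in zip(radii / KPC, v_kepler / 1e3, v_observed / 1e3):
        print(f"r = {r:5.1f} kpc   Keplerian: {vk:6.1f} km/s   observed: ~{vo:.0f} km/s")
    # The growing gap at large radii implies unseen mass: a dark matter halo.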

    Vera Rubin, following Zwicky, postulated that the missing structure in galaxies is dark matter. Her ideas were met with much resistance from the astronomical community, but her observations have been confirmed and are seen today as pivotal proof of the existence of dark matter.

    Astronomer Vera Rubin at the Lowell Observatory in 1965, worked on Dark Matter (The Carnegie Institution for Science).

    Vera Rubin measuring spectra, worked on Dark Matter (Emilio Segre Visual Archives AIP SPL).

    Vera Rubin, with Department of Terrestrial Magnetism (DTM) image tube spectrograph attached to the Kitt Peak 84-inch telescope, 1970

    Dark Matter Research

    Inside the Axion Dark Matter eXperiment U Washington (US) Credit : Mark Stone U. of Washington. Axion Dark Matter Experiment.
    _____________________________________________________________________________________

    Although it is known to interact with ordinary matter only through gravity, there is such a massive amount of dark matter out there—85% of all the matter in the universe—that it has played a pivotal behind-the-scenes role in shaping all the stuff we can see, from our own Milky Way galaxy to the wispy filaments of gas that link galaxies across vast distances.

    “We think it exists because there’s evidence for it on many, many scales,” says Kevork Abazajian, a theoretical physicist and astrophysicist at The University of California-Irvine (US).

    There have been a lot of ideas about what form dark matter might take, from planet-sized objects called MACHOs to individual particles like WIMPs—weakly interacting massive particles roughly the size of a proton—and even tinier things like axions and sterile neutrinos.

    In the 1980s, scientists came up with a way to make sense of this growing collection: They started classifying proposed dark-matter particles as cold, warm or hot. These categories are based on how fast each type of dark matter would have traveled through the early universe—a speed that depended on its mass—and on how hot its surroundings were when it popped into existence.

    Light, fast particles are known as hot dark matter; heavy, slow ones are cold dark matter; and warm dark matter falls in between.

    In this way of seeing things, WIMPs are cold, sterile neutrinos are warm, and relic neutrinos from the early universe are hot. (Axions are a special case—both light and extremely cold. We’ll get to them later.)

    Why is their speed so important?

    “If a dark matter particle is lighter and faster, it can travel farther in a given time, and it will smooth out any structure that already exists along the way,” Abazajian says.

    On the other hand, slower, colder forms of dark matter would have helped build structure, and based on what we know and see today it must have been part of the mix.
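
    A rough way to quantify this is the free-streaming length: approximately the particle’s speed multiplied by the time available, scaled up to today’s (comoving) length units. The Python toy estimate below uses assumed, illustrative speeds for each candidate class and is only meant to show the orders of magnitude involved:

    # Order-of-magnitude free-streaming comparison for hot, warm and cold dark matter.
    # Structure smaller than a particle's free-streaming length gets smoothed away.
    # All numbers are illustrative; this is not a real cosmological calculation.
    C = 3.0e8          # speed of light, m/s
    T_EQ = 1.6e12      # rough age at matter-radiation equality, s (~50,000 years)
    Z_EQ = 3400        # rough redshift of equality (converts to comoving length)
    MPC = 3.086e22     # one megaparsec in metres

    candidates = {
        # name: assumed typical speed, as a fraction of c, around equality
        "hot (relic neutrino)": 0.99,
        "warm (sterile neutrino)": 1e-2,
        "cold (WIMP)": 1e-5,
    }

    for name, beta in candidates.items():
        lam = beta * C * T_EQ * (1 + Z_EQ) / MPC  # crude lambda_fs ~ v * t, comoving
        print(f"{name:25s} smooths structure below ~{lam:.1e} Mpc")
    # Hot dark matter would erase galaxy-sized seeds (a few Mpc and below);
    # cold dark matter leaves them intact, which is why it must dominate the mix.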

    Building galaxies

    Although there are theories about when and how each type of dark-matter candidate would have formed, the only thing scientists know for sure is that dark matter was already around about 75,000 years after the Big Bang. It was then that matter started to dominate over radiation and little seeds of structure started to form, says Stanford University (US) theoretical physicist Peter Graham.

    Most types of dark-matter particles would have been created by collisions between other particles in the hot, dense soup of the infant universe, in much the same way that high-energy particle collisions at places like the Large Hadron Collider give rise to exotic new types of particles. As the universe expanded and cooled, dark-matter particles would have wound up being hot, warm or cold—and, in fact, there could have been more than one type.

    Scientists describe them as freely “streaming” through the universe, although this term is a little misleading, Abazajian says. Unlike leaves floating on a river, all headed in the same direction in a coordinated way, “these things are not just in one place and then in another place,” he says. “They’re everywhere and going in every direction.”

    As it streamed, each type of dark matter would have had a distinctive impact on the growth of structure along the way—either adding to its clumpiness, and thus to the building of galaxies, or thwarting their growth.

    Cold dark matter, such as the WIMP, would have been a clump-builder. It moved slowly enough to glom together and form gravitational wells, which would have captured nearby bits of matter.

    Hot dark matter, on the other hand, would have been a clump-smoother, zipping by so fast that it could ignore those gravitational wells. If all dark matter were hot, none of those seeds could have grown into bigger structures, says Silvia Pascoli, a theoretical physicist at The University of Bologna [Alma mater studiorum – Università di Bologna](IT). That’s why scientists now believe that hot dark-matter particles, such as relic neutrinos from the early days of the cosmos, could not constitute more than a sliver of dark matter as a whole.

    Despite their tiny contribution, Pascoli adds, “I say these relic neutrinos are currently the only known component of dark matter. They have an important impact on the evolution of the universe.”

    You might think that warm dark matter would be the best dark matter, filling the universe with a Goldilocks bowl of just-right structure. Sterile neutrinos are considered the top candidate in this category, and in theory they could indeed constitute the vast majority of dark matter.

    But most of the parameter space—the sets of conditions—where they could exist has been ruled out, says Abazajian, who as a graduate student researched how specific types of neutrino oscillations in the early universe could have produced sterile neutrino dark matter.

    Although those same oscillations could be happening today, he says, the probability that a regular neutrino would turn into a sterile one through standard oscillations in the vacuum of space is thought to be very small, with estimates ranging from 1 in 100,000 to 1 in 100 trillion.

    “You’d have to have a very good counting mechanism to count up to 100 trillion hits in your detector without missing the one hit from a sterile neutrino,” Abazajian says.
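
    The counting argument is simple to make concrete: for the probability range quoted above, the number of ordinary-neutrino interactions you would need to record per expected sterile-neutrino event is just the reciprocal.

    # Hits needed per expected sterile-neutrino event, for the quoted range.
    for p in (1e-5, 1e-14):  # 1 in 100,000 up to 1 in 100 trillion
        print(f"P = {p:.0e}  ->  ~{1 / p:,.0f} ordinary hits per sterile event")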

    That said, there are a few experiments out there that are giving it a try, using new approaches that don’t rely on direct hits.

    Then there’s the axion.

    Unlike the other dark-matter candidates, axions would be both extremely light—so light that they are better described as waves whose associated fields can spread over kilometers—and extremely cold, Graham says. They are so weakly coupled to other forms of matter that the frantic collisions of particles in the thermal bath of the early universe would have produced hardly any.

    “They would have been produced in a different way than the other dark matter candidates,” Graham says. “Even though the universe was very hot at the time, axions would have been very cold at birth and would stay cold forever, which means that they are absolutely cold dark matter.”

    Even though axions are very light, Graham says, “because they exist at close to absolute zero, the temperature where all motion stops, they are essentially not moving. They’re kind of this ghostly fluid, and everything else moves through it.”

    Searching for dark matter of all kinds

    Some scientists think it will take more than one type of dark matter to account for all the things we see in the universe.

    And in the past few years, as experiments aimed at detecting WIMPs and producing dark matter particles through collisions at the Large Hadron Collider have so far come up empty-handed, the search for dark matter has broadened.

    SixTRack CERN LHC particles

    The proliferation of ideas for searches has been helped by technological advances and clever approaches that could force much lighter and even more exotic dark-matter particles out of hiding.

    Some of those efforts make use of the very clumpiness that dark matter was instrumental in creating.

    Simona Murgia, an experimentalist at The University of California-Irvine (US), led a team looking for signs of collisions between WIMPs and their antiparticles with the Fermi Gamma-ray Space Telescope while a postdoc at the DOE’s SLAC National Accelerator Laboratory.

    Now she’s joined an international team of scientists who will conduct a vast survey of the Southern sky from the Vera C. Rubin Observatory in Chile using the world’s biggest digital camera, which is under construction at SLAC.

    NSF (US) NOIRLab (US) NOAO (US) Vera C. Rubin Observatory [LSST] Telescope, currently under construction on the El Peñón peak at Cerro Pachón, a 2,682-meter-high mountain in the Coquimbo Region of northern Chile, alongside the existing NSF (US) NOIRLab (US) NOAO (US) AURA (US) Gemini South Telescope and Southern Astrophysical Research Telescope.

    One of the things this survey will do is get a much better handle on the distribution of dark matter in the universe by looking at how it bends light from the galaxies we can see.

    “It will tell us something about the nature of dark matter in a totally different way,” Murgia says. “The more clumpy its distribution is, the more consistent it is with theories that tell you dark matter is cold.”

    The camera is expected to snap images of about 20 billion galaxies over 10 years, and from those images scientists hope to infer the fundamental nature of the dark matter that shaped them.

    “We don’t only want to know the dark matter is there,” Murgia says. “We do want to understand the cosmology, but we also really want to know what dark matter is.”

    See the full article here.



    Please help promote STEM in your local schools.


    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 9:13 pm on October 11, 2021 Permalink | Reply
    Tags: "Nature of unknown gamma-ray sources revealed", Astrophysics, , , , Gamma-ray Astronomy, , , , ,   

    From Xinglong Observatory [兴隆观测站] (CN) via phys.org : “Nature of unknown gamma-ray sources revealed” 

    LAMOST telescope located in Xinglong Station, Hebei Province, China, Altitude 960 m (3,150 ft).

    From Xinglong Observatory [兴隆观测站] (CN)

    Chinese Academy of Sciences [中国科学院] (CN)

    via

    phys.org

    October 11, 2021
    Li Yuan, Chinese Academy of Sciences [中国科学院] (CN)

    1
    Fig. 1 Artistic representation of an active galaxy jet. Credit: M. Kornmesser/European Southern Observatory [Observatoire européen austral][Europäische Südsternwarte](EU)(CL).

    An international team of astronomers has unveiled the nature of hundreds of gamma-ray emitting sources, discovering that most of them belong to the class of active galaxies known as blazars.

    Their recent study was published in The Astronomical Journal.

    One of the most intriguing challenges in modern gamma-ray astronomy is searching for low-energy counterparts of unidentified gamma-ray sources. Unidentified sources constitute about one third of all celestial objects detected to date by the Fermi satellite, the most recent gamma-ray mission, which has unprecedented capabilities for observing the high-energy sky.

    Since the largest population of known gamma-ray sources is blazars, astronomers believe that most unidentified gamma-ray sources can also be classified as blazars. However, their nature can be fully understood only by observing blazar candidates at visible frequencies.

    Blazars are extremely rare, black hole-powered galaxies. They host a supermassive black hole in their central regions that sweeps out matter at almost the speed of light in the form of a powerful jet pointing towards the Earth. Particles accelerated in these jets can emit light up to the most energetic gamma rays, making them visible to instruments onboard the Fermi satellite.

    2
    Fig. 2 Example of the completely featureless optical spectrum of the BL Lac known as J065046.49+250259.6. Credit: Harold A. Peña Herazo.

    The team, led by Dr. Harold Peña Herazo from The National Institute for Astrophysics, Optics and Electronics (MX), analyzed hundreds of optical spectra collected by the Large Sky Area Multi-Object Fiber Spectroscopic Telescope (LAMOST) at the Xinglong Station in China [above].

    LAMOST is hosted by The National Astronomical Observatories of China [国家天文台] of the Chinese Academy of Sciences [中国科学院] (CN). It provided a unique opportunity to unveil the nature of blazar-like sources that can potentially be counterparts of unidentified gamma-ray sources.

    From the list of sources discovered by the Fermi satellite, the researchers selected a sample of Blazar Candidates of Uncertain type (BCUs), which share several properties with blazars. However, optical spectroscopic observations are necessary to determine their proper classification and confirm their nature.

    Using spectroscopic data available in the LAMOST archive, the researchers were able to classify tens of BCUs as blazars. “LAMOST data also permitted verifying the nature of hundreds of additional blazars by searching for the emission or absorption lines used to determine their cosmological distances,” said Prof. GU Minfeng from The Shanghai Astronomical Observatory [上海天文台] of the Chinese Academy of Sciences [中国科学院] (CN).

    The vast majority of sources belong to the blazar class known as BL Lac objects and have a completely featureless optical spectrum. This makes measuring their cosmological distances an extremely challenging task. However, thanks to the LAMOST observations, a few more of them have luckily revealed visible signatures in their optical spectra.
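
    When a line does appear, turning it into a distance estimate is the easy part: the redshift follows directly from how far the line has shifted from its laboratory rest wavelength. A minimal Python sketch, assuming a hypothetical Mg II absorption line and a made-up observed wavelength:

    # Redshift from one identified spectral line: z = lambda_obs / lambda_rest - 1.
    # The line choice and observed wavelength are hypothetical, for illustration.
    LAMBDA_REST_MG2 = 2798.0   # Mg II rest wavelength, Angstroms
    lambda_observed = 4197.0   # where the line appears in the spectrum, Angstroms

    z = lambda_observed / LAMBDA_REST_MG2 - 1
    print(f"Redshift z = {z:.3f}")  # -> z = 0.500
    # A completely featureless BL Lac spectrum offers no such line, which is why
    # distances for most of these objects are so hard to pin down.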

    “Our analysis showed great potential for the LAMOST survey and allowed us to discover a few changing-look blazars,” said Dr. Peña Herazo, currently a postdoctoral fellow at The East Asian Observatory – Hilo, Hawaii (US).

    “It is worth noting that the possibility of using LAMOST observations to estimate blazar cosmological distances is critical to studying this population, its cosmological evolution, the imprint of the extragalactic background light in the gamma-ray spectra, and the blazar contribution to the extragalactic gamma-ray background,” said Prof. Francesco Massaro from the University of Turin.

    “I started working on this optical campaign and analyzing spectroscopic data in 2015, and now, thanks to the observations available in the LAMOST archive, we have certainly made a significant step toward the identification of gamma-ray sources with blazars. Future prospects opened by LAMOST datasets will definitively reveal the nature of hundreds of new blazars in the years to come,” commented Dr. Federica Ricci of The University of Bologna [Alma mater studiorum – Università di Bologna] (IT) and INAF-Institute for Radio Astronomy of Bologna [Istituto di Radioastronomia di Bologna] (IT).

    The group’s previous study was also published in The Astronomical Journal.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The Xinglong Observatory [兴隆观测站] (CN) of the National Astronomical Observatories, Chinese Academy of Sciences (NAOC) (IAU code: 327, coordinates: 40°23′39′′ N, 117°34′30′′ E) was founded in 1968. At present, it is one of the primary observing stations of NAOC. As the largest optical astronomical observatory site on the continent of Asia, it has 9 telescopes with effective apertures larger than 50 cm. These are the Guo Shoujing Telescope, also called the Large Sky Area Multi-Object Fiber Spectroscopic Telescope (LAMOST), the 2.16-m Telescope, a 1.26-m optical & near-infrared telescope, a 1-m Alt-Az telescope, an 85-cm telescope (NAOC-Beijing Normal University [北京師範大學] (CN) Telescope, NBT), an 80-cm telescope (Tsinghua University [清华大学] (CN)-NAOC Telescope, TNT), a 60-cm telescope, a 50-cm telescope and a 60/90-cm Schmidt telescope.

    The average altitude of the Xinglong Observatory is about 900 m. The observatory is located south of the main peak of the Yanshan Mountains, in Xinglong County, Hebei Province, about 120 km (about a 2-hour drive) northeast of Beijing. A shuttle bus runs between the NAOC campus and the Xinglong Observatory every Tuesday and Friday. The mean and median seeing values at the Xinglong Observatory are 1.9′′ and 1.7′′, respectively. On average, there are 117 photometric nights and 230 observable nights per year, based on data from 2007-2014. Most of the time, the wind speed is less than 4 m/s (the mean value is 2 m/s), and the sky brightness at the zenith is about 21.1 mag arcsec⁻² in the V band.

    Each year, more than a hundred astronomers use the telescopes of the Xinglong Observatory for studies of Galactic science (stellar parameters, extinction measurements, Galactic structure, exoplanets, etc.) and extragalactic science (including nearby galaxies, AGNs and high-redshift quasars), as well as time-domain astronomy (supernovae, gamma-ray bursts, stellar tidal disruption events, and different types of variable stars). In recent years, besides the basic daily maintenance of the telescopes, the engineers and technicians of the Xinglong Observatory have explored new techniques and methods to improve observing efficiency. Meanwhile, the Xinglong Observatory is also a national popular-science and education base of China, training students from graduate schools, colleges, high schools and other educational institutions throughout China, and it has hosted a number of international workshops and summer schools.

    The Chinese Academy of Sciences [中国科学院] (CN) is the linchpin of China’s drive to explore and harness high technology and the natural sciences for the benefit of China and the world. Comprising a comprehensive research and development network, a merit-based learned society and a system of higher education, CAS brings together scientists and engineers from China and around the world to address both theoretical and applied problems using world-class scientific and management approaches.

    Since its founding, CAS has fulfilled multiple roles — as a national team and a locomotive driving national technological innovation, a pioneer in supporting nationwide S&T development, a think tank delivering S&T advice and a community for training young S&T talent.

    Now, as it responds to a nationwide call to put innovation at the heart of China’s development, CAS has further defined its development strategy by emphasizing greater reliance on democratic management, openness and talent in the promotion of innovative research. With the adoption of its Innovation 2020 programme in 2011, the academy has committed to delivering breakthrough science and technology, higher caliber talent and superior scientific advice. As part of the programme, CAS has also requested that each of its institutes define its “strategic niche” — based on an overall analysis of the scientific progress and trends in their own fields both in China and abroad — in order to deploy resources more efficiently and innovate more collectively.

    As it builds on its proud record, CAS aims for a bright future as one of the world’s top S&T research and development organizations.

     