Tagged: Astronomy

  • richardmitnick 9:16 pm on October 14, 2021 Permalink | Reply
    Tags: "Rocky exoplanets and their host stars may have similar composition", Astronomy

    From IAC Institute of Astrophysics of the Canary Islands [Instituto de Astrofísica de Canarias] (ES) : “Rocky exoplanets and their host stars may have similar composition” 


    14/10/2021

    Garik Israelian
    gil@iac.es

    Illustration of the formation of a planet around a star similar to the Sun, with rocks and iron molecules in the foreground. Credit: Tania Cunha (Harbor Planetarium [Planetário do Porto] (PT) – Centro Ciência Viva & Instituto de Astrofísica e Ciências do Espaço).

    Newly formed stars are surrounded by protoplanetary discs. A fraction of the material in the disc condenses into planet-forming chunks, and the rest eventually falls into the star. Because of this common origin, researchers have assumed that the composition of these chunks, and of the low-mass rocky planets they build, should be similar to that of their host stars. Until now, however, the Solar System was the only reference available to astronomers.

    In a new research article, published today in the journal Science, an international team of astronomers led by Vardan Adibekyan, of the Instituto de Astrofísica e Ciências do Espaço (IA), with participation by the Instituto de Astrofísica de Canarias (IAC), has established for the first time a correlation between the composition of rocky exoplanets and that of their host stars. The study also shows that this relation does not correspond exactly to the one previously assumed.

    “The team found that the composition of rocky planets is closely related to the composition of their host stars, which could help us to identify planets which may be similar to ours”, explains Vardan Adibekyan, the first author on the paper. “In addition, the iron content of these planets is higher than that predicted from the composition of the protoplanetary discs from which they formed, which is due to the specific characteristics of planet formation processes and the chemistry of the discs. Our work supports models of planet formation with an unprecedented level of certainty and detail”, he added.

    For Garik Israelian, an IAC researcher and co-author of the article, this result could not have been imagined in the year 2000. “At that time we tried to find a correlation between the chemical composition of certain solar type stars and the presence of planets orbiting them (or of their orbital characteristics). It was hard to believe that twenty years later these studies would grow to include the metal abundances of planets similar to the Earth”, he emphasises.

    “For us this would have seemed to be science fiction. Planets similar to the Earth were not yet known, and we concentrated only on the planets we could find, and on the parameters of their orbits around their host stars. And today, we are studying the chemical composition of the interiors and of the atmospheres of extrasolar planets. It is a great leap forward”, he added.

    To establish the relation, the team selected the twenty-one rocky planets that had been most accurately characterized, using their measured masses and radii to determine their densities and iron content. They also used high-resolution spectra from the latest generation of spectrographs at the major world observatories: at Mauna Kea (Hawaii), at La Silla and Paranal (Chile), and at the Roque de los Muchachos (Garafía, La Palma, Canary Islands), to determine the compositions of their host stars and of the components most critical for the formation of rocks in the protoplanetary discs.
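The densities follow directly from the measured masses and radii. As a minimal illustrative sketch (the Earth-unit inputs and the helper below are hypothetical, not the authors' actual pipeline):

```python
import math

M_EARTH_KG = 5.972e24   # mass of the Earth
R_EARTH_M = 6.371e6     # mean radius of the Earth

def bulk_density(mass_earths, radius_earths):
    """Mean density in g/cm^3 for a planet given in Earth masses and radii."""
    mass_kg = mass_earths * M_EARTH_KG
    radius_m = radius_earths * R_EARTH_M
    volume_m3 = (4.0 / 3.0) * math.pi * radius_m ** 3
    return mass_kg / volume_m3 / 1000.0  # kg/m^3 -> g/cm^3

# Sanity check: an Earth-mass, Earth-radius planet recovers Earth's
# mean density of about 5.5 g/cm^3.
print(round(bulk_density(1.0, 1.0), 1))  # -> 5.5
```

Comparing such bulk densities with the iron abundance inferred for the host star is what lets the team connect planetary and stellar composition.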

    “Understanding the link in the composition between the stars and their planets has been a basic aspect of research in our centre for over a decade. Using the best high-resolution spectrographs, such as HARPS and ESPRESSO at the European Southern Observatory (ESO), our team has collected spectra of the host stars of exoplanets for several years.

    These spectra were used to determine the stellar parameters and abundances of the host stars, and the results have been put together in the published catalogue SWEET-Cat”, explained Nuno Santos, a researcher at the IA and a co-author of the article.

    The team also found an intriguing result: the iron fractions of the super-Earths and the super-Mercuries differ, which implies that these planets constitute distinct populations in terms of composition, with further implications for their formation. This finding will need more study, because simulations of planet formation that incorporate collisions cannot by themselves reproduce the high-density super-Mercuries. “Understanding the formation of the super-Mercuries will help us to understand the especially high density of Mercury”, Adibekyan assures us.

    This research was carried out in the framework of the project “Observational Tests of the Processes of Nucleosynthesis in the Universe” started in the year 2000 by the IAC researcher Garik Israelian; Michel Mayor, Nobel Laureate in Physics, 2019; and Nuno Santos, researcher at the IA.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    IAC Institute of Astrophysics of the Canary Islands [Instituto de Astrofísica de Canarias] (ES) operates two astronomical observatories in the Canary Islands:

    Roque de los Muchachos Observatory on La Palma
    Teide Observatory on Tenerife.

    The seeing statistics at ORM make it the second-best location for optical and infrared astronomy in the Northern Hemisphere, after Mauna Kea Observatory Hawaii (US).

    Maunakea Observatories Hawai’i (US) altitude 4,213 m (13,822 ft)

    The site also has some of the most extensive astronomical facilities in the Northern Hemisphere; its fleet of telescopes includes the 10.4 m Gran Telescopio Canarias, the world’s largest single-aperture optical telescope as of July 2009, the William Herschel Telescope (second largest in Europe), and the adaptive-optics-corrected Swedish 1-m Solar Telescope.

    Gran Telescopio Canarias [Instituto de Astrofísica de Canarias] (ES), sited on a volcanic peak 2,267 metres (7,438 ft) above sea level.

    The observatory was established in 1985, after 15 years of international work and cooperation among several countries, with the Spanish island hosting many telescopes from Britain, the Netherlands, Spain, and other countries. The island offered better seeing conditions than Herstmonceux, where the Royal Greenwich Observatory’s telescopes had previously been based, including the 98-inch-aperture Isaac Newton Telescope (the largest reflector in Europe at that time). When the Isaac Newton Telescope was moved to the island it was upgraded to 100 inches (2.54 metres), and many even larger telescopes from various nations would later be hosted there.

    Teide Observatory [Observatorio del Teide], IAU code 954, is an astronomical observatory on Mount Teide at 2,390 metres (7,840 ft), located on Tenerife, Spain. It has been operated by the Instituto de Astrofísica de Canarias since its inauguration in 1964. It became one of the first major international observatories, attracting telescopes from different countries around the world because of the good astronomical seeing conditions. Later the emphasis for optical telescopes shifted more towards Roque de los Muchachos Observatory on La Palma.

     
  • richardmitnick 9:43 pm on October 13, 2021 Permalink | Reply
    Tags: "Einstein’s Principle of Equivalence verified in quasars for the first time", Astronomy

    From IAC Institute of Astrophysics of the Canary Islands [Instituto de Astrofísica de Canarias] (ES) : “Einstein’s Principle of Equivalence verified in quasars for the first time” 


    13/10/2021

    Evencio Mediavilla
    emg@iac.es

    Artist’s impression of a quasar. Credit: M. Kornmesser/European Southern Observatory [Observatoire européen austral][Europäische Südsternwarte] (EU)(CL).

    According to Einstein’s theory of general relativity gravity affects light as well as matter. One consequence of this theory, based on the Principle of Equivalence, is that the light which escapes from a region with a strong gravitational field loses energy on its way, so that it becomes redder, a phenomenon known as the gravitational redshift. Quantifying this gives a fundamental test of Einstein’s theory of gravitation. Until now this test had been performed only on bodies in the nearby universe, but thanks to the use of a new experimental procedure scientists at the Instituto de Astrofísica de Canarias (IAC) and The University of Granada [Universidad de Granada] (ES) have been able to measure the gravitational redshift in quasars, and thus extend the test to very distant regions from where the light was emitted when our universe was young.

    Einstein’s Principle of Equivalence is the cornerstone of the General Theory of Relativity, our best current description of gravity and one of the basic theories of modern physics. The principle states that it is experimentally impossible to distinguish between a gravitational field and an accelerated motion of the observer. One of its predictions is that light emitted from within an intense gravitational field should undergo a measurable shift to lower energy, which for light means a shift towards the red, termed a “redshift”.
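In the weak-field limit this predicted shift takes the standard textbook form (not quoted in the article): for light emitted at radial distance $R$ from a mass $M$,

```latex
z_{\mathrm{grav}} \;=\; \frac{\Delta\lambda}{\lambda} \;\simeq\; \frac{GM}{Rc^{2}}
```

The effect therefore grows with the central mass and shrinks with distance from it, which is why supermassive black holes are the natural place to look for large gravitational redshifts.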

    This prediction has been confirmed well and very frequently close to the Earth, from the first measurements by R.V. Pound and G.A. Rebka at Harvard in 1959 to the most recent measurements with satellites. It has also been confirmed using observations of the Sun and of some stars, such as our neighbour Sirius B and the star S2 close to the supermassive black hole at the centre of the Galaxy. But confirming it with measurements beyond the Galaxy has proved difficult, and there have been only a few tests, with complicated measurements and low precision, in clusters of galaxies relatively near to us in cosmological terms.

    The reason for this lack of testing in the more distant universe is that in most situations the effect of gravity on light is very small, making the redshift hard to measure. Massive black holes, with their very strong gravitational fields, therefore offer the most promising scenarios for measuring gravitational redshifts. In particular, the supermassive black holes found at the centres of galaxies have huge gravitational fields; they are situated at the centres of the extraordinarily luminous and distant quasars.

    A quasar is an object in the sky which looks like a star but is situated at a great distance from us, so that the light we receive from it was emitted when the universe was much younger than now. For a quasar to be visible at such distances it must be extremely bright. The origin of this huge power output is a disc of hot material being swallowed by the supermassive black hole at its centre. This energy is generated in a very small region, barely a few light-days in size.

    In the neighbourhood of the black hole there is a very intense gravitational field and so by studying the light emitted by the chemical elements in this region (mainly hydrogen, carbon, and magnesium) we would expect to measure very large gravitational redshifts. Unfortunately the majority of the elements in quasar accretion discs are also present in regions further out from the central black hole where the gravitational effects are much smaller, so the light we receive from those elements is a mixture in which it is not easy to pick out clearly the gravitational redshifts.

    The measurements cover 80% of the history of the universe

    Now a team of researchers at the Instituto de Astrofísica de Canarias (IAC) and the University of Granada (UGR) have found a well defined portion of the ultraviolet light emitted by iron atoms from a region confined to the neighbourhood of the black hole. “Through our research related to gravitational lensing, another of the predictions of Einstein’s theory of General Relativity, we found that a characteristic spectral feature of iron in quasars seemed to be coming from a region very close to the black hole. Our measurements of the redshift confirmed this finding”, explains Evencio Mediavilla, an IAC researcher, Professor at the University of La Laguna (ULL) and first author of the article.

    Using this feature the researchers have been able to measure clearly and precisely the gravitational redshifts of many quasars and, using them, estimate the masses of the black holes. “This technique marks an extraordinary advance, because it allows us to measure precisely the gravitational redshifts of individual objects at great distances, which opens up important possibilities for the future” says Mediavilla.
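Because the weak-field relation z ≈ GM/(Rc²) links redshift, mass, and emitting radius, a measured gravitational redshift plus a size scale for the iron-emitting region yields a black hole mass estimate. A hedged sketch with purely illustrative numbers (none taken from the paper):

```python
G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8              # speed of light, m/s
M_SUN = 1.989e30         # solar mass, kg
LIGHT_DAY_M = C * 86400  # metres in one light-day

def mass_from_redshift(z_grav, radius_light_days):
    """Black hole mass (in solar masses) implied by a gravitational
    redshift z_grav for light emitted at the given radius."""
    radius_m = radius_light_days * LIGHT_DAY_M
    return z_grav * radius_m * C ** 2 / (G * M_SUN)

# Illustrative: a redshift of 0.001 from a region ~10 light-days out
# implies a black hole of order a hundred million solar masses.
print(f"{mass_from_redshift(1e-3, 10.0):.2e}")
```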

    Jorge Jiménez Vicente, a researcher at the UGR and co-author of the article, stresses the implications of this new experimental procedure, as it allows comparison of the measured redshift with the theoretically predicted value: “this technique allows us for the first time to test Einstein’s Principle of Equivalence, and with it the basis of our understanding of gravity, on cosmological scales.”

    This test of the Principle of Equivalence is based on measurements which range from active galaxies in our neighbourhood (seen as they were some 13,800 million years after the Big Bang, essentially the present day) out to individual quasars at large distances, whose light was emitted when the age of the universe was only some 2,200 million years, thus covering around 80% of the history of the universe. “The results, with a precision comparable to those of experiments carried out within our Galaxy, validate the Principle of Equivalence over this vast period of time”, notes Jiménez-Vicente.
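The "around 80%" figure follows directly from the two epochs quoted:

```python
# Ages quoted in the article, in millions of years after the Big Bang.
age_now = 13_800          # nearby active galaxies, essentially the present day
age_at_emission = 2_200   # when the most distant quasars emitted their light

fraction_covered = (age_now - age_at_emission) / age_now
print(f"{fraction_covered:.0%}")  # -> 84%
```

i.e. slightly above the rounded 80% the researchers quote.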

    The article has been published in the journal The Astrophysical Journal and was recently selected by The American Astronomical Society (US), which has published an interview with the researchers in the “AAS Journal Author Series” section of its YouTube channel, whose aim is to connect authors with their articles, their personal histories, and the astronomical community in general.

    See the full article here .


     
  • richardmitnick 1:40 pm on October 13, 2021 Permalink | Reply
    Tags: "Eerie Discovery of 2 'Identical' Galaxies in Deep Space Is Finally Explained", Astronomy

    From Science Alert (US) : “Eerie Discovery of 2 ‘Identical’ Galaxies in Deep Space Is Finally Explained” 


    13 OCTOBER 2021
    MICHELLE STARR

    Hamilton’s Object. (Joseph DePasquale/Space Telescope Science Institute (US))

    Galaxies are a bit like fingerprints, or snowflakes. There are many of them out there, and they can have a lot of characteristics in common, but no two are exactly alike.

    So, back in 2013, when two galaxies that looked startlingly similar were spotted side by side in the distant reaches of the Universe, astronomers were flummoxed.

    Now, they’ve finally solved the mystery of these strange “identical objects” – and the answer could have implications for understanding dark matter.

    The object, now named Hamilton’s Object, was discovered by astronomer Timothy Hamilton of Shawnee State University (US) by accident, in data obtained by the Hubble Space Telescope nearly a decade ago.

    The two galaxies appeared to be the same shape, and had the same nearly parallel dark streaks across the galactic bulge – the central region of the galaxy where most of the stars live.

    “We were really stumped,” Hamilton said. “My first thought was that maybe they were interacting galaxies with tidally stretched-out arms. It didn’t really fit well, but I didn’t know what else to think.”

    It wasn’t until 2015 that a more plausible answer would emerge. Astronomer Richard Griffiths of The University of Hawaii (US), on seeing Hamilton present his object at a meeting, suggested that the culprit might be a rare phenomenon: gravitational lensing.

    This is a phenomenon that results purely from a chance alignment of massive objects in space. If a massive object sits directly between us and a more distant object, a magnification effect occurs due to the gravitational curvature of space-time around the closer object.

    Any light that then travels through this space-time follows this curvature and enters our telescopes smeared and distorted to varying degrees – but also often magnified and duplicated.
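The size of this bending is given by the standard point-mass deflection formula of general relativity (a textbook result, not stated in the article): a light ray passing a mass $M$ with impact parameter $b$ is deflected by

```latex
\hat{\alpha} = \frac{4GM}{c^{2}b}
```

This is twice the Newtonian value and underlies all gravitational lens modelling; the enormous $M$ of a galaxy cluster is what makes magnification and duplication observable.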

    This made a lot more sense than two identical galaxies, especially when Griffiths found yet another duplication of the galaxy (as can be seen in the picture below).

    A huge problem, however, remained: What was causing the gravitational curvature? So Griffiths and his team set about searching sky survey data for an object massive enough to produce the lensing effect.

    And they found it. Between us and Hamilton’s Object lurks a cluster of galaxies that had only been poorly documented. Usually, these discoveries go the other way: first the cluster is identified, and then astronomers go looking for lensed galaxies behind it.

    The team’s work revealed that Hamilton’s Object is around 11 billion light-years away, and a different team’s work revealed that the cluster is about 7 billion light-years away.

    The galaxy itself is a barred spiral galaxy with its edge facing us, undergoing clumpy and uneven star formation, the researchers determined. Computer simulations then helped determine that the three duplicated images could only be created if the distribution of dark matter is smooth at small scales.

    (Joseph DePasquale/STScI)

    “It’s great that we only need two mirror images in order to get the scale of how clumpy or not dark matter can be at these positions”, said astronomer Jenny Wagner of The Ruprecht Karl University of Heidelberg [Ruprecht-Karls-Universität Heidelberg] (DE).

    “Here, we don’t use any lens models. We just take the observables of the multiple images and the fact they can be transformed into one another. They can be folded into one another by our method. This already gives us an idea of how smooth the dark matter needs to be at these two positions.”

    The two identical side-by-side images were created because they straddle a “ripple” in space-time – an area of greatest magnification created by the gravity of a filament of dark matter. Such filaments are thought to connect the Universe in a vast, invisible cosmic web, joining galaxies and galaxy clusters and feeding them with hydrogen gas.

    But we don’t actually know what dark matter is, so any new discovery that lets us map where it is, how it’s distributed, and how it affects the space around it is another drop of evidence that will ultimately help us solve the mystery.

    “We know it’s some form of matter, but we have no idea what the constituent particle is,” Griffiths explained.

    “So we don’t know how it behaves at all. We just know that it has mass and is subject to gravity. The significance of the limits of size on the clumping or smoothness is that it gives us some clues as to what the particle might be. The smaller the dark matter clumps, the more massive the particles must be.”

    The research has been published in the Monthly Notices of the Royal Astronomical Society (MNRAS).

    See the full article here .



     
  • richardmitnick 10:56 am on October 13, 2021 Permalink | Reply
    Tags: "A Crystal Ball Into Our Solar System’s Future", Astronomy

    From W.M. Keck Observatory (US) : “A Crystal Ball Into Our Solar System’s Future” 


    October 13, 2021

    Mari-Ela Chock, Communications Officer
    W. M. Keck Observatory
    (808) 554-0567
    mchock@keck.hawaii.edu

    Giant Gas Planet Orbiting a Dead Star Gives Glimpse Into the Predicted Aftermath of our Sun’s Demise.

    Artist’s rendition of a newly discovered Jupiter-like exoplanet orbiting a white dwarf, or dead star. This system is evidence that planets can survive their host star’s explosive red giant phase, and it is the very first confirmed planetary system that serves as an analog to the fate of the Sun and Jupiter in our own solar system.
    Credit: Adam Makarenko/ W. M. Keck Observatory.

    Astronomers have discovered the very first confirmed planetary system that resembles the expected fate of our solar system, when the Sun reaches the end of its life in about five billion years.

    The researchers detected the system using W. M. Keck Observatory on Maunakea in Hawaiʻi; it consists of a Jupiter-like planet with a Jupiter-like orbit revolving around a white dwarf star located near the center of our Milky Way galaxy.

    “This evidence confirms that planets orbiting at a large enough distance can continue to exist after their star’s death”, says Joshua Blackman, an astronomy postdoctoral researcher at The University of Tasmania (AU) and lead author of the study. “Given that this system is an analog to our own solar system, it suggests that Jupiter and Saturn might survive the Sun’s red giant phase, when it runs out of nuclear fuel and self-destructs.”

    The study is published in today’s issue of the journal Nature.

    “Earth’s future may not be so rosy because it is much closer to the Sun,” says co-author David Bennett, a senior research scientist at The University of Maryland (US) and The Goddard Space Flight Center | NASA (US). “If humankind wanted to move to a moon of Jupiter or Saturn before the Sun fried the Earth during its red supergiant phase, we’d still remain in orbit around the Sun, although we would not be able to rely on heat from the Sun as a white dwarf for very long.”

    A white dwarf is what main sequence stars like our Sun become when they die. In the last stages of the stellar life cycle, a star burns off all of the hydrogen in its core and balloons into a red giant star. It then collapses into itself, shrinking into a white dwarf, where all that’s left is a hot, dense core, typically Earth-sized and half as massive as the Sun. Because these compact stellar corpses are small and no longer have the nuclear fuel to radiate brightly, white dwarfs are very faint and difficult to detect.
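A quick back-of-envelope calculation shows how extreme the quoted figures ("Earth-sized and half as massive as the Sun") are; the constants are standard, and the result is only indicative:

```python
import math

M_SUN_KG = 1.989e30  # solar mass
R_EARTH_M = 6.371e6  # Earth's mean radius

# A white dwarf with half the Sun's mass packed into an Earth-sized sphere.
mass_kg = 0.5 * M_SUN_KG
volume_m3 = (4.0 / 3.0) * math.pi * R_EARTH_M ** 3
density_g_cm3 = mass_kg / volume_m3 / 1000.0

# Of order a million grams per cubic centimetre, i.e. roughly a million
# times the density of water.
print(f"{density_g_cm3:.1e}")
```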

    Animation showing an artist’s rendering of a main sequence star ballooning into a red giant as it burns the last of its hydrogen fuel, then collapses into a white dwarf. What remains is a hot, dense core roughly the size of Earth and about half the mass of the Sun. A gas giant similar to Jupiter orbits from a distance, surviving the explosive transformation. Credit: Adam Makarenko/W. M. Keck Observatory.

    See the full article here .



    Mission
    To advance the frontiers of astronomy and share our discoveries with the world.

    The W. M. Keck Observatory (US) operates the largest, most scientifically productive telescopes on Earth. The two 10-meter optical/infrared telescopes on the summit of Mauna Kea on the Island of Hawaii feature a suite of advanced instruments including imagers, multi-object spectrographs, high-resolution spectrographs, an integral-field spectrometer, and world-leading laser guide star adaptive optics systems. Keck Observatory is a private 501(c)(3) non-profit organization and a scientific partnership of the California Institute of Technology, the University of California, and NASA.

    Today Keck Observatory is supported by both public funding sources and private philanthropy. As a 501(c)(3), the organization is managed by the California Association for Research in Astronomy (CARA), whose Board of Directors includes representatives from the California Institute of Technology and the University of California, with liaisons to the board from NASA and the Keck Foundation.



    Instrumentation

    Keck 1

    HIRES – The largest and most mechanically complex of Keck’s main instruments, the High Resolution Echelle Spectrometer breaks up incoming starlight into its component colors to measure the precise intensity of each of thousands of color channels. Its spectral capabilities have resulted in many breakthrough discoveries, such as the detection of planets outside our solar system and direct evidence for a model of the Big Bang theory.

    Keck High-Resolution Echelle Spectrometer (HIRES), at the Keck I telescope.

    LRIS – The Low Resolution Imaging Spectrograph is a faint-light instrument capable of taking spectra and images of the most distant known objects in the universe. The instrument is equipped with a red arm and a blue arm to explore stellar populations of distant galaxies, active galactic nuclei, galactic clusters, and quasars.

    VISIBLE BAND (0.3-1.0 Micron)

    MOSFIRE – The Multi-Object Spectrograph for Infrared Exploration gathers thousands of spectra from objects spanning a variety of distances, environments and physical conditions. What makes this huge, vacuum-cryogenic instrument unique is its ability to select up to 46 individual objects in the field of view and then record the infrared spectrum of all 46 objects simultaneously. When a new field is selected, a robotic mechanism inside the vacuum chamber reconfigures the distribution of tiny slits in the focal plane in under six minutes. Eight years in the making with First Light in 2012, MOSFIRE’s early performance results range from the discovery of ultra-cool, nearby substellar mass objects, to the detection of oxygen in young galaxies only 2 billion years after the Big Bang.

    OSIRIS – The OH-Suppressing Infrared Imaging Spectrograph is a near-infrared spectrograph for use with the Keck I adaptive optics system. OSIRIS takes spectra in a small field of view to provide a series of images at different wavelengths. The instrument allows astronomers to ignore wavelengths where the Earth’s atmosphere shines brightly due to emission from OH (hydroxyl) molecules, thus allowing the detection of objects 10 times fainter than previously possible.

    Keck 2

    DEIMOS – The Deep Extragalactic Imaging Multi-Object Spectrograph is the most advanced optical spectrograph in the world, capable of gathering spectra from 130 galaxies or more in a single exposure. In ‘Mega Mask’ mode, DEIMOS can take spectra of more than 1,200 objects at once, using a special narrow-band filter.

    NIRSPEC – The Near Infrared Spectrometer studies very high redshift radio galaxies, the motions and types of stars located near the Galactic Center, the nature of brown dwarfs, the nuclear regions of dusty starburst galaxies, active galactic nuclei, interstellar chemistry, stellar physics, and solar-system science.


    ESI – The Echellette Spectrograph and Imager captures high-resolution spectra of very faint galaxies and quasars ranging from the blue to the infrared in a single exposure. It is a multimode instrument that allows users to switch among three modes during a night. It has produced some of the best non-AO images at the Observatory.

    KCWI – The Keck Cosmic Web Imager is designed to provide visible band, integral field spectroscopy with moderate to high spectral resolution, various fields of view and image resolution formats and excellent sky-subtraction. The astronomical seeing and large aperture of the telescope enables studies of the connection between galaxies and the gas in their dark matter halos, stellar relics, star clusters and lensed galaxies.

    NEAR-INFRARED (1-5 Micron)

    ADAPTIVE OPTICS – Adaptive optics senses and compensates for the atmospheric distortions of incoming starlight up to 1,000 times per second. This results in an improvement in image quality on fairly bright astronomical targets by a factor of 10 to 20.

    LASER GUIDE STAR ADAPTIVE OPTICS [pictured above] – The Keck Laser Guide Star expands the range of available targets for study with both the Keck I and Keck II adaptive optics systems. It uses sodium lasers to excite sodium atoms that naturally exist in the atmosphere 90 km (55 miles) above the Earth’s surface. The laser creates an “artificial star” that allows the Keck adaptive optics system to observe 70-80 percent of the targets in the sky, compared to the 1 percent accessible without the laser.

    NIRC-2/AO – The second generation Near Infrared Camera works with the Keck Adaptive Optics system to produce the highest-resolution ground-based images and spectroscopy in the 1-5 micron range. Typical programs include mapping surface features on solar system bodies, searching for planets around other stars, and analyzing the morphology of remote galaxies.


    ABOUT NIRES
    The Near Infrared Echellette Spectrograph (NIRES) is a prism cross-dispersed near-infrared spectrograph built at the California Institute of Technology by a team led by Chief Instrument Scientist Keith Matthews and Prof. Tom Soifer. Commissioned in 2018, NIRES covers a large wavelength range at moderate spectral resolution for use on the Keck II telescope and observes extremely faint red objects found with the Spitzer and WISE infrared space telescopes, as well as brown dwarfs, high-redshift galaxies, and quasars.

    Future Instrumentation

    KCRM – The Keck Cosmic Reionization Mapper will complete the Keck Cosmic Web Imager (KCWI), the world’s most capable spectroscopic imager. The design for KCWI includes two separate channels to detect light in the blue and the red portions of the visible wavelength spectrum. KCWI-Blue was commissioned and started routine science observations in September 2017. The red channel of KCWI is KCRM, a powerful addition that will open a window for new discoveries at high redshifts.

    KPF – The Keck Planet Finder (KPF) will be the most advanced spectrometer of its kind in the world. The instrument is a fiber-fed, high-resolution, two-channel cross-dispersed echelle spectrometer for visible wavelengths, designed for the Keck II telescope. KPF will allow precise measurement of the masses, and hence densities, of Earth-like exoplanets, which will help astronomers identify planets around other stars that are capable of supporting life.
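    Spectrometers like KPF find planets through the tiny Doppler wobble they induce in their host star. As a rough illustration (standard textbook formula, not KPF's actual pipeline), the radial-velocity semi-amplitude shows why centimeter-per-second precision matters for Earth analogs:

```python
import math

# Standard radial-velocity semi-amplitude formula, in SI units.
# Illustrative sketch only -- not KPF's data pipeline.
G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def rv_semi_amplitude(m_planet, m_star, period_s, inclination=math.pi / 2, ecc=0.0):
    """Stellar reflex-velocity amplitude in m/s induced by an orbiting planet."""
    return ((2 * math.pi * G / period_s) ** (1 / 3)
            * m_planet * math.sin(inclination)
            / ((m_star + m_planet) ** (2 / 3) * math.sqrt(1 - ecc ** 2)))

# Hypothetical example: an Earth analog orbiting a Sun-like star.
M_SUN, M_EARTH, YEAR_S = 1.989e30, 5.972e24, 3.156e7
k = rv_semi_amplitude(M_EARTH, M_SUN, YEAR_S)
print(f"K = {k * 100:.1f} cm/s")  # on the order of 9 cm/s
```

    A Jupiter analog produces a wobble of tens of meters per second; an Earth analog produces only about 9 cm/s, which sets the precision target for an instrument like KPF.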

     
  • richardmitnick 1:40 pm on October 12, 2021 Permalink | Reply
    Tags: "Stellar fossils in meteorites point to distant stars", Astronomy, , , , NanoSIMS: a state-of-the-art mass spectrometer,   

    From Washington University in St. Louis (US) : “Stellar fossils in meteorites point to distant stars” 

    Wash U Bloc

    From Washington University in St. Louis (US)

    October 12, 2021

    Talia Ogliore
    talia.ogliore@wustl.edu

    Some pristine meteorites contain a record of the original building blocks of the solar system, including grains that formed in ancient stars that died before the sun formed. One of the biggest challenges in studying these presolar grains is to determine the type of star each grain came from.

    1
    An electron microscope image of a micron-sized silicon carbide stardust grain (lower right). The grain is coated with meteoritic organics (dark gunk on the left side of the grain). Such grains formed in the cooling winds lost from the surface of low-mass carbon-rich stars near the end of their lives, typified here (upper left) by a Hubble Space Telescope image of the asymptotic giant branch (AGB) star U Camelopardalis. Laboratory analysis of such tiny dust grains provides unique information on nuclear reactions in low-mass AGB stars and their evolutions. (Image credits: NASA, Nan Liu and Andrew Davis)

    Nan Liu, research assistant professor of physics in Arts & Sciences at Washington University in St. Louis, is first author of a new study in The Astrophysical Journal Letters that analyzes a diverse set of presolar grains with the goal of identifying their true stellar origins.

    Liu and her team used a state-of-the-art mass spectrometer called NanoSIMS to measure a suite of isotopes, including N and Mg-Al isotopes, in presolar silicon carbide (SiC) grains. By refining their analytical protocols and utilizing a new-generation plasma ion source, the scientists were able to visualize their samples with better spatial resolution than previous studies could achieve.

    “Presolar grains have been embedded in meteorites for 4.6 billion years and are sometimes coated with solar materials on the surface,” Liu said. “Thanks to the improved spatial resolution, our team was able to see Al contamination attached on the surface of a grain and to obtain true stellar signatures by including signals only from the core of the grain during the data reduction.”

    The scientists sputtered the grains using an ion beam for extended periods of time to expose clean, interior grain surfaces for their isotopic analyses. The researchers found that the N isotope ratios of the same grain greatly increased after the grain was exposed to extended ion sputtering.
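    The idea of restricting the analysis to the clean grain interior can be illustrated with a toy masking calculation (synthetic numbers invented for this sketch, not the study's actual data reduction):

```python
import numpy as np

# Toy illustration (synthetic data, not the study's pipeline): compute an
# isotope ratio from the interior of a grain only, excluding a rim that
# carries surface contamination picked up inside the meteorite.
n = 64
yy, xx = np.mgrid[:n, :n]
r = np.hypot(xx - n / 2, yy - n / 2)

grain = r < 24  # full grain footprint in the ion image
core = r < 16   # interior pixels, away from the contaminated surface

# Synthetic ion counts: the rim carries extra (non-stellar) 27Al.
counts_26mg = np.where(grain, 100.0, 0.0)
counts_27al = np.where(grain, 50.0, 0.0) + np.where(grain & ~core, 200.0, 0.0)

ratio_whole = counts_26mg[grain].sum() / counts_27al[grain].sum()
ratio_core = counts_26mg[core].sum() / counts_27al[core].sum()
print(ratio_whole, ratio_core)  # only the core ratio recovers the true value, 2.0
```

    Averaging over the whole grain dilutes the stellar signature with rim contamination; only the core-restricted ratio recovers the intrinsic value, which is the logic behind the team's improved data reduction.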

    Isotope ratios can rarely be measured for stars, but C and N isotopes are two exceptions. The new C and N isotope data for the presolar grains reported in this study directly link the grains to different types of carbon stars based on these stars’ observed isotopic ratios.

    “The new isotopic data obtained in this study are exciting for stellar physicists and nuclear astrophysicists like me,” said Maurizio Busso, a co-author of the study who is based at the University of Perugia, in Italy. “Indeed, the ‘strange’ N isotopic ratios of presolar SiC grains have been in the last two decades a remarkable source of concern. The new data explain the difference between what was originally present in the presolar stardust grains and what was attached later, thus solving a long-standing puzzle in the community.”

    2
    NanoSIMS images of a SiC grain. The upper panel shows images taken at a spatial resolution of ~1 μm, the typical resolution of previous analyses. The lower panel shows the same grain’s ion images taken at a spatial resolution of 100 nm, the resolution achieved in this study. (Image courtesy of Nan Liu)

    The study also includes a significant exploration of the radioactive isotope aluminum-26 (26Al), an important heat source during the evolution of young planetary bodies in the early solar system and in other extrasolar systems. The scientists inferred the initial presence of large amounts of 26Al in all measured grains, as predicted by current models. The study determined how much 26Al was produced by the “parent stars” of the grains they measured, and Liu and her collaborators concluded that stellar model predictions for 26Al are too high by at least a factor of two compared to the grain data.
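    Because 26Al decays with a half-life of about 0.72 million years, none survives in 4.6-billion-year-old grains; the initial abundance is instead read from the excess of its daughter isotope, magnesium-26. A sketch with hypothetical counts:

```python
# Hypothetical atom counts, for illustration only -- not the study's data.
# 26Al decays to 26Mg, so the 26Mg excess records the initial 26Al content.
excess_26mg_atoms = 3.0e5  # 26Mg above the normal magnesium composition
al27_atoms = 1.5e6         # stable 27Al measured in the same grain volume

# Every excess 26Mg atom was originally a 26Al atom:
initial_ratio = excess_26mg_atoms / al27_atoms
print(f"inferred initial 26Al/27Al = {initial_ratio}")

# Fraction of the original 26Al still present today (effectively zero):
half_life_yr = 7.2e5
remaining = 0.5 ** (4.6e9 / half_life_yr)
```

    A stellar model predicting twice this inferred ratio would show the kind of factor-of-two discrepancy the study reports.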

    The data-model offsets likely point to uncertainties in relevant nuclear reaction rates, Liu noted, and will motivate nuclear physicists to pursue better measurements of these reaction rates in the future.

    The team’s results link some of the presolar grains in this collection to poorly known carbon stars with peculiar chemical compositions.

    The grains’ isotopic data point to H-burning processes occurring in such carbon stars at higher-than-expected temperatures. This information will help astrophysicists to construct stellar models to better understand the evolution of these stellar objects.

    “As we learn more about the sources for dust, we can gain additional knowledge about the history of the universe and how various stellar objects within it evolve,” Liu said.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Wash U campus

    Washington University in St. Louis (US) is a private research university in Greater St. Louis with its main campus (Danforth) mostly in unincorporated St. Louis County, Missouri, and Clayton, Missouri. It also has a West Campus in Clayton, North Campus in the West End neighborhood of St. Louis, Missouri, and Medical Campus in the Central West End neighborhood of St. Louis, Missouri.

    Founded in 1853 and named after George Washington, the university has students and faculty from all 50 U.S. states and more than 120 countries. Washington University is composed of seven graduate and undergraduate schools that encompass a broad range of academic fields. To prevent confusion over its location, the Board of Trustees added the phrase “in St. Louis” in 1976. Washington University is a member of the Association of American Universities (US) and is classified among “R1: Doctoral Universities – Very high research activity”.

    As of 2020, 25 Nobel laureates in economics, physiology and medicine, chemistry, and physics have been affiliated with Washington University, ten having done the major part of their pioneering research at the university. In 2019, Clarivate Analytics ranked Washington University 7th in the world for most cited researchers. The university also received the 4th highest amount of National Institutes of Health (US) medical research grants among medical schools in 2019.

    Research

    Virtually all faculty members at Washington University engage in academic research, offering opportunities for both undergraduate and graduate students across the university’s seven schools. Known for its interdisciplinary and departmental collaboration, many of Washington University’s research centers and institutes are collaborative efforts between many areas on campus. More than 60% of undergraduates are involved in faculty research across all areas; it is an institutional priority for undergraduates to be allowed to participate in advanced research. According to the Center for Measuring University Performance, it is considered to be one of the top 10 private research universities in the nation. A dedicated Office of Undergraduate Research is located on the Danforth Campus and serves as a resource to post research opportunities, advise students in finding appropriate positions matching their interests, publish undergraduate research journals, and award research grants to make it financially possible to perform research.

    According to the National Science Foundation (US), Washington University spent $816 million on research and development in 2018, ranking it 27th in the nation. The university has over 150 National Institutes of Health funded inventions, with many of them licensed to private companies. Governmental agencies and non-profit foundations such as the NIH, Department of Defense (US), National Science Foundation, and National Aeronautics Space Agency (US) provide the majority of research grant funding, with Washington University being one of the top recipients of NIH grants from year to year. Nearly 80% of NIH grants to institutions in the state of Missouri went to Washington University alone in 2007. Washington University and its Medical School played a large part in the Human Genome Project, contributing approximately 25% of the finished sequence. The Genome Sequencing Center has decoded the genome of many animals, plants, and cellular organisms, including the platypus, chimpanzee, cat, and corn.

    NASA hosts its Planetary Data System Geosciences Node on the campus of Washington University. Professors, students, and researchers have been heavily involved with many unmanned missions to Mars. Professor Raymond Arvidson has been deputy principal investigator of the Mars Exploration Rover mission and co-investigator of the Phoenix lander robotic arm.

    Washington University professor Joseph Lowenstein, with the assistance of several undergraduate students, has been involved in editing, annotating, and creating a digital archive of the first publication in 100 years of poet Edmund Spenser’s collected works. A large grant from the National Endowment for the Humanities (US) supports this ambitious project, centralized at Washington University with support from other colleges in the United States.

    In 2019, Folding@Home (US), a distributed computing project for performing molecular dynamics simulations of protein dynamics, was moved to Washington University School of Medicine from Stanford University (US). The project, currently led by Dr. Greg Bowman, uses the idle CPU time of personal computers owned by volunteers to conduct protein folding research. Folding@home’s research is primarily focused on biomedical problems such as Alzheimer’s disease, Cancer, Coronavirus disease 2019, and Ebola virus disease. In April 2020, Folding@home became the world’s first exaFLOP computing system with a peak performance of 1.5 exaflops, making it more than seven times faster than the world’s fastest supercomputer, Summit, and more powerful than the top 100 supercomputers in the world, combined.

     
  • richardmitnick 9:25 am on October 10, 2021 Permalink | Reply
    Tags: "2 Old Open Star Clusters Merging in the Milky Way", 30 Doradus nebula [also known as the Tarantula Nebula], A composite of two clusters that differ in age by about one million years., Astronomy, , , , , Hubble Watches Star Clusters on a Collision Course.,   

    From Hubblesite (US)(EU) and NASA/ESA Hubble via EarthSky : “2 Old Open Star Clusters Merging in the Milky Way” 

    From Hubblesite (US)(EU) and NASA/ESA Hubble

    via

    1

    EarthSky

    October 7, 2021
    Deborah Byrd

    1
    08.16.12
    Hubble Watches Star Clusters on a Collision Course.
    This is a Hubble Space Telescope image of a pair of star clusters that are believed to be in the early stages of merging. The clusters lie in the gigantic 30 Doradus nebula, which is 170,000 light-years from Earth.

    Old open star clusters merging

    Open star clusters tend to be young collections of sibling stars, born together from a cloud or nebula in space. Open clusters are like families of stars. They’re still loosely bound by gravity and still move together through space. We know thousands of them in our Milky Way galaxy, and amateur astronomers love to gaze at them through small telescopes and binoculars, under dark skies. Most open star clusters don’t survive more than several orbits around our galaxy’s center before being disrupted and dispersed. But astronomer Denilso Camargo in Brazil reached out this week (early October 2021) about a new discovery of what he called:

    ” … the first old binary star cluster within our Milky Way galaxy.

    Close encounters between open star clusters are rare, he said. Obviously, the subsequent formation of binary clusters is even rarer. And the evolution to a merger event is extremely unlikely. Yet there it is. And that’s not all. Camargo said this system appears to be undergoing a merger during a close encounter. As the two star clusters merge, Camargo said, they’re leaving in their wake:

    … streams populated by bound substructures.”

    That’s the kind of news you’d expect to hear from ESA’s Gaia space observatory.

    European Space Agency [Agence spatiale européenne][Europäische Weltraumorganisation](EU) GAIA satellite

    And indeed Camargo used Gaia data, along with images from NASA’s WISE infrared space telescope.

    National Aeronautics and Space Administration(US) Wise/NEOWISE Telescope.

    The study is accepted for publication in The Astrophysical Journal.

    One cluster becomes 2

    Camargo is at The Federal University of Rio Grande do Sul [Universidade Federal do Rio Grande do Sul](BR). Via data from the two space observatories, he closely examined a single known open cluster called NGC 1605 and found it was really two clusters. NGC 1605 is one of the open star clusters amateur astronomers like to observe. As deep-sky objects go, it’s relatively bright (about 11th magnitude) and can be seen with a 6-inch or larger telescope under a dark sky. One known cluster has become two, so, as Camargo said:

    “I called the two clusters as NGC 1605a and NGC 1605b.”
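    The visibility claim above (about 11th magnitude, visible in a 6-inch telescope) can be checked against a common rule of thumb for a telescope's limiting magnitude under dark skies. This is only an approximation; the real limit depends on sky brightness, optics, and the observer:

```python
import math

# Rule-of-thumb dark-sky limiting magnitude for a small telescope:
# m_lim ~ 7.7 + 5 * log10(aperture in cm). Approximate only.
def limiting_magnitude(aperture_cm):
    return 7.7 + 5 * math.log10(aperture_cm)

m_lim = limiting_magnitude(15.2)  # a 6-inch (15.2 cm) telescope
print(f"limiting magnitude ~ {m_lim:.1f}")  # comfortably fainter than 11th magnitude
```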

    Camargo said his study reveals one cluster is 2 billion years old, and the other only 600 million years old. And those ages give a clue to the clusters’ histories. Writing for IFLScience on October 4, 2021, Stephen Luntz explained:

    “It’s clear NGC 1605a and b are unlike anything else we have seen. Most binary clusters are young, which probably indicates they formed together from a single cloud that broke apart. That can’t be the case for a pair of such different ages.

    Instead, these two must have been drifting past each other, and come close enough their gravitational fields caused them to interact. The two have now come so close their star populations overlap.”

    2
    The Goddard Space Flight Center | NASA

    The Hubble observations, made with the Wide Field Camera 3, were taken Oct. 20-27, 2009.

    National Aeronautics Space Agency(USA) European Space Agency [Agence spatiale européenne][Europäische Weltraumorganisation](EU)/ Hubble Wide Field Camera 3

    The blue color is light from the hottest, most massive stars; the green from the glow of oxygen; and the red from fluorescing hydrogen. Image Credit: R. O’Connell (University of Virginia (US)), NASA, ESA, and the Wide Field Camera 3 Science Oversight Committee.

    Astronomers using data from NASA’s Hubble Space Telescope have caught two clusters full of massive stars that may be in the early stages of merging. The clusters are 170,000 light-years away in the Large Magellanic Cloud, a small satellite galaxy to our Milky Way.

    lmc Large Magellanic Cloud. ESO’s VISTA telescope reveals a remarkable image of the Large Magellanic Cloud.

    Part of ESO’s Paranal Observatory, the VISTA telescope (Visible and Infrared Survey Telescope for Astronomy) observes the brilliantly clear skies above the Atacama Desert of Chile. It is the largest survey telescope in the world at near-infrared wavelengths, sited at an elevation of about 2,500 metres (8,200 ft) above sea level.

    What at first was thought to be only one cluster in the core of the massive star-forming region 30 Doradus (also known as the Tarantula Nebula) has been found to be a composite of two clusters that differ in age by about one million years.

    The entire 30 Doradus complex has been an active star-forming region for 25 million years, and it is currently unknown how much longer this region can continue creating new stars. Smaller systems that merge into larger ones could help to explain the origin of some of the largest known star clusters.

    Lead scientist Elena Sabbi of the Space Telescope Science Institute in Baltimore, Md., and her team began looking at the area while searching for runaway stars, fast-moving stars that have been kicked out of their stellar nurseries where they first formed. “Stars are supposed to form in clusters, but there are many young stars outside 30 Doradus that could not have formed where they are; they may have been ejected at very high velocity from 30 Doradus itself,” Sabbi said.

    She then noticed something unusual about the cluster when looking at the distribution of the low-mass stars detected by Hubble. It is not spherical, as was expected, but has features somewhat similar to the shape of two merging galaxies where their shapes are elongated by the tidal pull of gravity. Hubble’s circumstantial evidence for the impending merger comes from seeing an elongated structure in one of the clusters, and from measuring a different age between the two clusters.

    According to some models, the giant gas clouds out of which star clusters form may fragment into smaller pieces. Once these small pieces precipitate stars, they might then interact and merge to become a bigger system. This interaction is what Sabbi and her team think they are observing in 30 Doradus.

    Also, there are an unusually large number of high-velocity stars around 30 Doradus. Astronomers believe that these stars, often called “runaway stars,” were expelled from the core of 30 Doradus as the result of dynamical interactions. These interactions are very common during a process called core collapse, in which more-massive stars sink to the center of a cluster by dynamical interactions with lower-mass stars. When many massive stars have reached the core, the core becomes unstable and these massive stars start ejecting each other from the cluster.

    The big cluster R136 in the center of the 30 Doradus region is too young to have already experienced a core collapse. However, since in smaller systems the core collapse is much faster, the large number of runaway stars that has been found in the 30 Doradus region can be better explained if a small cluster has merged into R136.

    Follow-up studies will look at the area in more detail and on a larger scale to see if any more clusters might be interacting with the ones observed. In particular, the infrared sensitivity of NASA’s planned James Webb Space Telescope (JWST) will allow astronomers to look deep into the regions of the Tarantula Nebula that are obscured in visible-light photographs.

    National Aeronautics Space Agency(USA)/European Space Agency [Agence spatiale européenne][Europäische Weltraumorganisation](EU)/ Canadian Space Agency [Agence Spatiale Canadienne](CA) Webb Infrared Space Telescope(US) James Webb Space Telescope annotated. Scheduled for launch in December 2021.

    In these areas cooler and dimmer stars are hidden from view inside cocoons of dust. Webb will better reveal the underlying population of stars in the nebula.

    The 30 Doradus Nebula is particularly interesting to astronomers because it is a good example of how star-forming regions in the young universe may have looked. This discovery could help scientists understand the details of cluster formation and how stars formed in the early universe.

    The members of Sabbi’s team are D.J. Lennon (ESA/STScI), M. Gieles (University of Cambridge (UK)), S.E. de Mink (STScI/Johns Hopkins University (US)), N.R. Walborn, J. Anderson, A. Bellini, N. Panagia, and R. van der Marel (STScI), and J. Maiz Appelaniz (Institute of Astrophysics of Andalusia [Instituto de Astrofísica de Andalucía] CSIC (ES)).

    The Hubble Space Telescope is a project of international cooperation between NASA and the European Space Agency. NASA’s Goddard Space Flight Center in Greenbelt, Md., manages the telescope. The Space Telescope Science Institute (STScI) in Baltimore, Md., conducts Hubble science operations. STScI is operated by the Association of Universities for Research in Astronomy, Inc., in Washington, D.C.

    See the full article here.

    The NASA/ESA Hubble Space Telescope is a space telescope that was launched into low Earth orbit in 1990 and remains in operation. It was not the first space telescope, but it is one of the largest and most versatile, renowned both as a vital research tool and as a public relations boon for astronomy. The Hubble telescope is named after astronomer Edwin Hubble and is one of NASA’s Great Observatories, along with the NASA Compton Gamma Ray Observatory, the Chandra X-ray Observatory, and the NASA Spitzer Infrared Space Telescope.

    National Aeronautics Space Agency(USA) Compton Gamma Ray Observatory



    Edwin Hubble at Caltech Palomar Samuel Oschin 48 inch Telescope(US). Credit: Emilio Segre Visual Archives/AIP/SPL).

    Hubble features a 2.4-meter (7.9 ft) mirror, and its four main instruments observe in the ultraviolet, visible, and near-infrared regions of the electromagnetic spectrum. Hubble’s orbit outside the distortion of Earth’s atmosphere allows it to capture extremely high-resolution images with substantially lower background light than ground-based telescopes. It has recorded some of the most detailed visible light images, allowing a deep view into space. Many Hubble observations have led to breakthroughs in astrophysics, such as determining the rate of expansion of the universe.

    The Hubble telescope was built by the United States space agency National Aeronautics Space Agency(US) with contributions from the European Space Agency [Agence spatiale européenne](EU). The Space Telescope Science Institute (STScI) selects Hubble’s targets and processes the resulting data, while the NASA Goddard Space Flight Center(US) controls the spacecraft. Space telescopes were proposed as early as 1923. Hubble was funded in the 1970s with a proposed launch in 1983, but the project was beset by technical delays, budget problems, and the 1986 Challenger disaster. It was finally launched by Space Shuttle Discovery in 1990, but its main mirror had been ground incorrectly, resulting in spherical aberration that compromised the telescope’s capabilities. The optics were corrected to their intended quality by a servicing mission in 1993.

    Hubble is the only telescope designed to be maintained in space by astronauts. Five Space Shuttle missions have repaired, upgraded, and replaced systems on the telescope, including all five of the main instruments. The fifth mission was initially canceled on safety grounds following the Columbia disaster (2003), but NASA administrator Michael D. Griffin approved the fifth servicing mission, which was completed in 2009. The telescope was still operating as of April 24, 2020, its 30th anniversary, and could last until 2030–2040. One successor to the Hubble telescope is the National Aeronautics Space Agency(USA)/European Space Agency [Agence spatiale européenne](EU)/Canadian Space Agency(CA) Webb Infrared Space Telescope, scheduled for launch in December 2021.

    Proposals and precursors

    In 1923, Hermann Oberth—considered a father of modern rocketry, along with Robert H. Goddard and Konstantin Tsiolkovsky—published Die Rakete zu den Planetenräumen (“The Rocket into Planetary Space“), which mentioned how a telescope could be propelled into Earth orbit by a rocket.

    The history of the Hubble Space Telescope can be traced back as far as 1946, to astronomer Lyman Spitzer’s paper entitled Astronomical advantages of an extraterrestrial observatory. In it, he discussed the two main advantages that a space-based observatory would have over ground-based telescopes. First, the angular resolution (the smallest separation at which objects can be clearly distinguished) would be limited only by diffraction, rather than by the turbulence in the atmosphere, which causes stars to twinkle, known to astronomers as seeing. At that time ground-based telescopes were limited to resolutions of 0.5–1.0 arcseconds, compared to a theoretical diffraction-limited resolution of about 0.05 arcsec for an optical telescope with a mirror 2.5 m (8.2 ft) in diameter. Second, a space-based telescope could observe infrared and ultraviolet light, which are strongly absorbed by the atmosphere.
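    Spitzer's 0.05-arcsecond figure follows from the standard diffraction-limit formula, theta ≈ 1.22 λ/D. A quick check at a visible wavelength of 500 nm:

```python
import math

# Diffraction-limited angular resolution theta ~ 1.22 * lambda / D, converted
# from radians to arcseconds; reproduces the ~0.05 arcsec figure Spitzer
# quoted for a 2.5 m mirror at visible wavelengths.
RAD_TO_ARCSEC = 180 / math.pi * 3600

def diffraction_limit_arcsec(wavelength_m, aperture_m):
    return 1.22 * wavelength_m / aperture_m * RAD_TO_ARCSEC

theta = diffraction_limit_arcsec(500e-9, 2.5)  # 500 nm light, 2.5 m mirror
print(f"{theta:.3f} arcsec")
```

    Ground-based seeing of 0.5–1.0 arcseconds is thus ten to twenty times worse than what the same mirror could achieve above the atmosphere.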

    Spitzer devoted much of his career to pushing for the development of a space telescope. In 1962, a report by the U.S. National Academy of Sciences recommended development of a space telescope as part of the space program, and in 1965 Spitzer was appointed as head of a committee given the task of defining scientific objectives for a large space telescope.

    Space-based astronomy had begun on a very small scale following World War II, as scientists made use of developments that had taken place in rocket technology. The first ultraviolet spectrum of the Sun was obtained in 1946, and the National Aeronautics and Space Administration (US) launched the Orbiting Solar Observatory (OSO) to obtain UV, X-ray, and gamma-ray spectra in 1962.

    An orbiting solar telescope was launched in 1962 by the United Kingdom as part of the Ariel space program, and in 1966 NASA launched the first Orbiting Astronomical Observatory (OAO) mission. OAO-1’s battery failed after three days, terminating the mission. It was followed by OAO-2, which carried out ultraviolet observations of stars and galaxies from its launch in 1968 until 1972, well beyond its original planned lifetime of one year.

    The OSO and OAO missions demonstrated the important role space-based observations could play in astronomy. In 1968, NASA developed firm plans for a space-based reflecting telescope with a mirror 3 m (9.8 ft) in diameter, known provisionally as the Large Orbiting Telescope or Large Space Telescope (LST), with a launch slated for 1979. These plans emphasized the need for crewed maintenance missions to the telescope to ensure such a costly program had a lengthy working life, and the concurrent development of plans for the reusable Space Shuttle indicated that the technology to allow this was soon to become available.

    Quest for funding

    The continuing success of the OAO program encouraged increasingly strong consensus within the astronomical community that the LST should be a major goal. In 1970, NASA established two committees, one to plan the engineering side of the space telescope project, and the other to determine the scientific goals of the mission. Once these had been established, the next hurdle for NASA was to obtain funding for the instrument, which would be far more costly than any Earth-based telescope. The U.S. Congress questioned many aspects of the proposed budget for the telescope and forced cuts in the budget for the planning stages, which at the time consisted of very detailed studies of potential instruments and hardware for the telescope. In 1974, public spending cuts led to Congress deleting all funding for the telescope project.
    In response, a nationwide lobbying effort was coordinated among astronomers. Many astronomers met congressmen and senators in person, and large-scale letter-writing campaigns were organized. The National Academy of Sciences published a report emphasizing the need for a space telescope, and eventually the Senate agreed to half the budget that had originally been approved by Congress.

    The funding issues led to something of a reduction in the scale of the project, with the proposed mirror diameter reduced from 3 m to 2.4 m, both to cut costs and to allow a more compact and effective configuration for the telescope hardware. A proposed precursor 1.5 m (4.9 ft) space telescope to test the systems to be used on the main satellite was dropped, and budgetary concerns also prompted collaboration with the European Space Agency. ESA agreed to provide funding and supply one of the first generation instruments for the telescope, as well as the solar cells that would power it, and staff to work on the telescope in the United States, in return for European astronomers being guaranteed at least 15% of the observing time on the telescope. Congress eventually approved funding of US$36 million for 1978, and the design of the LST began in earnest, aiming for a launch date of 1983. In 1983 the telescope was named after Edwin Hubble, who confirmed one of the greatest scientific discoveries of the 20th century, made by Georges Lemaître, that the universe is expanding.

    Construction and engineering

    Once the Space Telescope project had been given the go-ahead, work on the program was divided among many institutions. NASA Marshall Space Flight Center (MSFC) was given responsibility for the design, development, and construction of the telescope, while Goddard Space Flight Center was given overall control of the scientific instruments and ground-control center for the mission. MSFC commissioned the optics company Perkin-Elmer to design and build the Optical Telescope Assembly (OTA) and Fine Guidance Sensors for the space telescope. Lockheed was commissioned to construct and integrate the spacecraft in which the telescope would be housed.

    Optical Telescope Assembly

    Optically, the HST is a Cassegrain reflector of Ritchey–Chrétien design, as are most large professional telescopes. This design, with two hyperbolic mirrors, is known for good imaging performance over a wide field of view, with the disadvantage that the mirrors have shapes that are hard to fabricate and test. The mirror and optical systems of the telescope determine the final performance, and they were designed to exacting specifications. Optical telescopes typically have mirrors polished to an accuracy of about a tenth of the wavelength of visible light, but the Space Telescope was to be used for observations from the visible through the ultraviolet (shorter wavelengths) and was specified to be diffraction limited to take full advantage of the space environment. Therefore, its mirror needed to be polished to an accuracy of 10 nanometers, or about 1/65 of the wavelength of red light. On the long wavelength end, the OTA was not designed with optimum IR performance in mind—for example, the mirrors are kept at stable (and warm, about 15 °C) temperatures by heaters. This limits Hubble’s performance as an infrared telescope.
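    The quoted polishing tolerance is easy to verify: one sixty-fifth of a red-light wavelength of about 650 nm is indeed 10 nanometers:

```python
# Sanity-check the quoted figure: a 10 nm polishing tolerance is about
# 1/65 of the wavelength of red light (~650 nm).
red_wavelength_nm = 650.0
tolerance_nm = red_wavelength_nm / 65
print(tolerance_nm)
```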

    Perkin-Elmer intended to use custom-built and extremely sophisticated computer-controlled polishing machines to grind the mirror to the required shape. However, in case their cutting-edge technology ran into difficulties, NASA demanded that PE sub-contract to Kodak to construct a back-up mirror using traditional mirror-polishing techniques. (The team of Kodak and Itek also bid on the original mirror polishing work. Their bid called for the two companies to double-check each other’s work, which would have almost certainly caught the polishing error that later caused such problems.) The Kodak mirror is now on permanent display at the National Air and Space Museum. An Itek mirror built as part of the effort is now used in the 2.4 m telescope at the Magdalena Ridge Observatory.

    Construction of the Perkin-Elmer mirror began in 1979, starting with a blank manufactured by Corning from their ultra-low expansion glass. To keep the mirror’s weight to a minimum it consisted of top and bottom plates, each one inch (25 mm) thick, sandwiching a honeycomb lattice. Perkin-Elmer simulated microgravity by supporting the mirror from the back with 130 rods that exerted varying amounts of force. This ensured the mirror’s final shape would be correct and to specification when finally deployed. Mirror polishing continued until May 1981. NASA reports at the time questioned Perkin-Elmer’s managerial structure, and the polishing began to slip behind schedule and over budget. To save money, NASA halted work on the back-up mirror and put the launch date of the telescope back to October 1984. The mirror was completed by the end of 1981; it was washed using 2,400 US gallons (9,100 L) of hot, deionized water and then received a reflective coating of 65 nm-thick aluminum and a protective coating of 25 nm-thick magnesium fluoride.

    Doubts continued to be expressed about Perkin-Elmer’s competence on a project of this importance, as their budget and timescale for producing the rest of the OTA continued to inflate. In response to a schedule described as “unsettled and changing daily”, NASA postponed the launch date of the telescope until April 1985. Perkin-Elmer’s schedules continued to slip at a rate of about one month per quarter, and at times delays reached one day for each day of work. NASA was forced to postpone the launch date until March and then September 1986. By this time, the total project budget had risen to US$1.175 billion.

    Spacecraft systems

    The spacecraft in which the telescope and instruments were to be housed was another major engineering challenge. It would have to withstand frequent passages from direct sunlight into the darkness of Earth’s shadow, which would cause major changes in temperature, while being stable enough to allow extremely accurate pointing of the telescope. A shroud of multi-layer insulation keeps the temperature within the telescope stable and surrounds a light aluminum shell in which the telescope and instruments sit. Within the shell, a graphite-epoxy frame keeps the working parts of the telescope firmly aligned. Because graphite composites are hygroscopic, there was a risk that water vapor absorbed by the truss while in Lockheed’s clean room would later be expressed in the vacuum of space, resulting in the telescope’s instruments being covered by ice. To reduce that risk, a nitrogen gas purge was performed before launching the telescope into space.

    While construction of the spacecraft in which the telescope and instruments would be housed proceeded somewhat more smoothly than the construction of the OTA, Lockheed still experienced some budget and schedule slippage, and by the summer of 1985, construction of the spacecraft was 30% over budget and three months behind schedule. An MSFC report said Lockheed tended to rely on NASA directions rather than take their own initiative in the construction.

    Computer systems and data processing

    The two initial, primary computers on the HST were the 1.25 MHz DF-224 system, built by Rockwell Autonetics, which contained three redundant CPUs, and two redundant NSSC-1 (NASA Standard Spacecraft Computer, Model 1) systems, developed by Westinghouse and GSFC using diode–transistor logic (DTL). A co-processor for the DF-224 was added during Servicing Mission 1 in 1993, which consisted of two redundant strings of an Intel-based 80386 processor with an 80387 math co-processor. The DF-224 and its 386 co-processor were replaced by a 25 MHz Intel-based 80486 processor system during Servicing Mission 3A in 1999. The new computer is 20 times faster, with six times more memory, than the DF-224 it replaced. It increases throughput by moving some computing tasks from the ground to the spacecraft and saves money by allowing the use of modern programming languages.

    Additionally, some of the science instruments and components had their own embedded microprocessor-based control systems. The MATs (Multiple Access Transponder) components, MAT-1 and MAT-2, utilize Hughes Aircraft CDP1802CD microprocessors. The Wide Field and Planetary Camera (WFPC) also utilized an RCA 1802 microprocessor (or possibly the older 1801 version). The WFPC-1 was replaced by the WFPC-2 during Servicing Mission 1 in 1993, which was then replaced by the Wide Field Camera 3 (WFC3) during Servicing Mission 4 in 2009.

    Initial instruments

    When launched, the HST carried five scientific instruments: the Wide Field and Planetary Camera (WF/PC), Goddard High Resolution Spectrograph (GHRS), High Speed Photometer (HSP), Faint Object Camera (FOC) and the Faint Object Spectrograph (FOS). WF/PC was a high-resolution imaging device primarily intended for optical observations. It was built by NASA’s Jet Propulsion Laboratory and incorporated a set of 48 filters isolating spectral lines of particular astrophysical interest. The instrument contained eight charge-coupled device (CCD) chips divided between two cameras, each using four CCDs. Each CCD had a resolution of 0.64 megapixels. The wide field camera (WFC) covered a large angular field at the expense of resolution, while the planetary camera (PC) took images at a longer effective focal length than the WF chips, giving it a greater magnification.

    The GHRS was a spectrograph designed to operate in the ultraviolet. It was built by the Goddard Space Flight Center and could achieve a spectral resolution of 90,000. Also optimized for ultraviolet observations were the FOC and FOS, which were capable of the highest spatial resolution of any instruments on Hubble. Rather than CCDs, these three instruments used photon-counting digicons as their detectors. The FOC was constructed by ESA, while the University of California, San Diego, and Martin Marietta Corporation built the FOS.

    The final instrument was the HSP, designed and built at the University of Wisconsin–Madison. It was optimized for visible and ultraviolet light observations of variable stars and other astronomical objects varying in brightness. It could take up to 100,000 measurements per second with a photometric accuracy of about 2% or better.

    HST’s guidance system can also be used as a scientific instrument. Its three Fine Guidance Sensors (FGS) are primarily used to keep the telescope accurately pointed during an observation, but can also be used to carry out extremely accurate astrometry; measurements accurate to within 0.0003 arcseconds have been achieved.

    Ground support

    The Space Telescope Science Institute (STScI) is responsible for the scientific operation of the telescope and the delivery of data products to astronomers. STScI is operated by the Association of Universities for Research in Astronomy (AURA) and is physically located in Baltimore, Maryland on the Homewood campus of Johns Hopkins University, one of the 39 U.S. universities and seven international affiliates that make up the AURA consortium. STScI was established in 1981 after something of a power struggle between NASA and the scientific community at large. NASA had wanted to keep this function in-house, but scientists wanted it to be based in an academic establishment. The Space Telescope European Coordinating Facility (ST-ECF), established at Garching bei München near Munich in 1984, provided similar support for European astronomers until 2011, when these activities were moved to the European Space Astronomy Centre.

    One rather complex task that falls to STScI is scheduling observations for the telescope. Hubble is in a low-Earth orbit to enable servicing missions, but this means most astronomical targets are occulted by the Earth for slightly less than half of each orbit. Observations cannot take place when the telescope passes through the South Atlantic Anomaly due to elevated radiation levels, and there are also sizable exclusion zones around the Sun (precluding observations of Mercury), Moon and Earth. The solar avoidance angle is about 50°, to keep sunlight from illuminating any part of the OTA. Earth and Moon avoidance keeps bright light out of the FGSs, and keeps scattered light from entering the instruments. If the FGSs are turned off, the Moon and Earth can be observed. Earth observations were used very early in the program to generate flat-fields for the WFPC1 instrument. There is a so-called continuous viewing zone (CVZ), at roughly 90° to the plane of Hubble’s orbit, in which targets are not occulted for long periods.
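The occultation geometry described above can be sketched with a small calculation. The orbital altitude of roughly 540 km used below is an assumed round number, not a figure from the text; note that the purely geometric blocked fraction comes out somewhat below half an orbit, with limb-avoidance zones pushing the practical figure higher.

```python
import math

# Assumed values: mean Earth radius and a ~540 km orbital altitude.
R_EARTH_KM = 6371.0
altitude_km = 540.0
r = R_EARTH_KM + altitude_km

# For a target in the orbital plane, the line of sight grazes Earth's limb
# when sin(alpha) = R_earth / r, where alpha is the half-angle of the
# occulted arc measured from the anti-target point of the orbit.
alpha = math.asin(R_EARTH_KM / r)

# Fraction of each orbit during which the target is geometrically blocked.
occulted_fraction = 2 * alpha / (2 * math.pi)
print(f"Geometrically occulted for ~{occulted_fraction:.0%} of each orbit")
```

For these numbers the target is behind the Earth for roughly a third of each orbit; adding the bright-limb and atmospheric avoidance zones brings the unusable fraction closer to the "slightly less than half" quoted above.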

    Challenger disaster, delays, and eventual launch

    By January 1986, the planned launch date of October looked feasible, but the Challenger explosion brought the U.S. space program to a halt, grounding the Shuttle fleet and forcing the launch of Hubble to be postponed for several years. The telescope had to be kept in a clean room, powered up and purged with nitrogen, until a launch could be rescheduled. This costly situation (about US$6 million per month) pushed the overall costs of the project even higher. This delay did allow time for engineers to perform extensive tests, swap out a possibly failure-prone battery, and make other improvements. Furthermore, the ground software needed to control Hubble was not ready in 1986, and was barely ready by the 1990 launch.

    Eventually, following the resumption of shuttle flights in 1988, the launch of the telescope was scheduled for 1990. On April 24, 1990, Space Shuttle Discovery successfully launched it during the STS-31 mission.

    From its original total cost estimate of about US$400 million, the telescope cost about US$4.7 billion by the time of its launch. Hubble’s cumulative costs were estimated to be about US$10 billion in 2010, twenty years after launch.

    List of Hubble instruments

    Hubble accommodates five science instruments at a given time, plus the Fine Guidance Sensors, which are mainly used for aiming the telescope but are occasionally used for scientific astrometry measurements. Early instruments were replaced with more advanced ones during the Shuttle servicing missions. COSTAR was a corrective optics device rather than a science instrument, but occupied one of the five instrument bays.
    Since the final servicing mission in 2009, the four active instruments have been ACS, COS, STIS and WFC3. NICMOS is kept in hibernation, but may be revived if WFC3 were to fail in the future.

    Advanced Camera for Surveys (ACS; 2002–present)
    Cosmic Origins Spectrograph (COS; 2009–present)
    Corrective Optics Space Telescope Axial Replacement (COSTAR; 1993–2009)
    Faint Object Camera (FOC; 1990–2002)
    Faint Object Spectrograph (FOS; 1990–1997)
    Fine Guidance Sensor (FGS; 1990–present)
    Goddard High Resolution Spectrograph (GHRS/HRS; 1990–1997)
    High Speed Photometer (HSP; 1990–1993)
    Near Infrared Camera and Multi-Object Spectrometer (NICMOS; 1997–present, hibernating since 2008)
    Space Telescope Imaging Spectrograph (STIS; 1997–present (non-operative 2004–2009))
    Wide Field and Planetary Camera (WFPC; 1990–1993)
    Wide Field and Planetary Camera 2 (WFPC2; 1993–2009)
    Wide Field Camera 3 (WFC3; 2009–present)

    Of the former instruments, three (COSTAR, FOS and WFPC2) are displayed in the Smithsonian National Air and Space Museum. The FOC is in the Dornier museum, Germany. The HSP is in the Space Place at the University of Wisconsin–Madison. The first WFPC was dismantled, and some components were then re-used in WFC3.

    Flawed mirror

    Within weeks of the launch of the telescope, the returned images indicated a serious problem with the optical system. Although the first images appeared to be sharper than those of ground-based telescopes, Hubble failed to achieve a final sharp focus and the best image quality obtained was drastically lower than expected. Images of point sources spread out over a radius of more than one arcsecond, instead of having a point spread function (PSF) concentrated within a circle 0.1 arcseconds (485 nrad) in diameter, as had been specified in the design criteria.

    Analysis of the flawed images revealed that the primary mirror had been polished to the wrong shape. Although it was believed to be one of the most precisely figured optical mirrors ever made, smooth to about 10 nanometers, the outer perimeter was too flat by about 2200 nanometers (about 1⁄450 mm or 1⁄11000 inch). This difference was catastrophic, introducing severe spherical aberration, a flaw in which light reflecting off the edge of a mirror focuses on a different point from the light reflecting off its center.

    The effect of the mirror flaw on scientific observations depended on the particular observation—the core of the aberrated PSF was sharp enough to permit high-resolution observations of bright objects, and spectroscopy of point sources was affected only through a sensitivity loss. However, the loss of light to the large, out-of-focus halo severely reduced the usefulness of the telescope for faint objects or high-contrast imaging. This meant nearly all the cosmological programs were essentially impossible, since they required observation of exceptionally faint objects. This led politicians to question NASA’s competence, scientists to rue the cost which could have gone to more productive endeavors, and comedians to make jokes about NASA and the telescope: in the 1991 comedy The Naked Gun 2½: The Smell of Fear, in a scene where historical disasters are displayed, Hubble is pictured with RMS Titanic and LZ 129 Hindenburg. Nonetheless, during the first three years of the Hubble mission, before the optical corrections, the telescope still carried out a large number of productive observations of less demanding targets. The error was well characterized and stable, enabling astronomers to partially compensate for the defective mirror by using sophisticated image processing techniques such as deconvolution.

    Origin of the problem

    A commission headed by Lew Allen, director of the Jet Propulsion Laboratory, was established to determine how the error could have arisen. The Allen Commission found that a reflective null corrector, a testing device used to achieve a properly shaped non-spherical mirror, had been incorrectly assembled—one lens was out of position by 1.3 mm (0.051 in). During the initial grinding and polishing of the mirror, Perkin-Elmer analyzed its surface with two conventional refractive null correctors. However, for the final manufacturing step (figuring), they switched to the custom-built reflective null corrector, designed explicitly to meet very strict tolerances. The incorrect assembly of this device resulted in the mirror being ground very precisely but to the wrong shape. A few final tests, using the conventional null correctors, correctly reported spherical aberration. But these results were dismissed, thus missing the opportunity to catch the error, because the reflective null corrector was considered more accurate.

    The commission blamed the failings primarily on Perkin-Elmer. Relations between NASA and the optics company had been severely strained during the telescope construction, due to frequent schedule slippage and cost overruns. NASA found that Perkin-Elmer did not review or supervise the mirror construction adequately, did not assign its best optical scientists to the project (as it had for the prototype), and in particular did not involve the optical designers in the construction and verification of the mirror. While the commission heavily criticized Perkin-Elmer for these managerial failings, NASA was also criticized for not picking up on the quality control shortcomings, such as relying totally on test results from a single instrument.

    Design of a solution

    Many feared that Hubble would be abandoned. The design of the telescope had always incorporated servicing missions, and astronomers immediately began to seek potential solutions to the problem that could be applied at the first servicing mission, scheduled for 1993. While Kodak had ground a back-up mirror for Hubble, it would have been impossible to replace the mirror in orbit, and too expensive and time-consuming to bring the telescope back to Earth for a refit. Instead, the fact that the mirror had been ground so precisely to the wrong shape led to the design of new optical components with exactly the same error but in the opposite sense, to be added to the telescope at the servicing mission, effectively acting as “spectacles” to correct the spherical aberration.

    The first step was a precise characterization of the error in the main mirror. Working backwards from images of point sources, astronomers determined that the conic constant of the mirror as built was −1.01390±0.0002, instead of the intended −1.00230. The same number was also derived by analyzing the null corrector used by Perkin-Elmer to figure the mirror, as well as by analyzing interferograms obtained during ground testing of the mirror.
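The two conic constants can be tied back to the ~2200 nm edge error quoted earlier by comparing the sag of the two conic surfaces at the mirror's edge. The primary's radius of curvature used below (~11.04 m, consistent with a 2.4 m f/2.3 primary) is an assumed value, not stated in the text.

```python
import math

# Assumed optical prescription for the HST primary (illustrative values).
R = 11.04                 # radius of curvature in metres (assumption)
c = 1.0 / R               # curvature, 1/m
rho = 1.2                 # semi-diameter of the 2.4 m mirror, m

K_built = -1.01390        # conic constant as built
K_intended = -1.00230     # conic constant as designed

def sag(K, c, rho):
    """Sag of a conic surface: z = c*rho^2 / (1 + sqrt(1 - (1+K)*c^2*rho^2))."""
    return c * rho**2 / (1 + math.sqrt(1 - (1 + K) * c**2 * rho**2))

edge_error_nm = abs(sag(K_built, c, rho) - sag(K_intended, c, rho)) * 1e9
print(f"Surface error at the edge: {edge_error_nm:.0f} nm")  # ~2200 nm
```

With these assumed parameters the sag difference at the edge comes out near 2200 nm, matching the figure given for how far the outer perimeter was too flat.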

    Because of the way the HST’s instruments were designed, two different sets of correctors were required. The design of the Wide Field and Planetary Camera 2, already planned to replace the existing WF/PC, included relay mirrors to direct light onto the four separate charge-coupled device (CCD) chips making up its two cameras. An inverse error built into their surfaces could completely cancel the aberration of the primary. However, the other instruments lacked any intermediate surfaces that could be figured in this way, and so required an external correction device.

    The Corrective Optics Space Telescope Axial Replacement (COSTAR) system was designed to correct the spherical aberration for light focused at the FOC, FOS, and GHRS. It consists of two mirrors in the light path with one ground to correct the aberration. To fit the COSTAR system onto the telescope, one of the other instruments had to be removed, and astronomers selected the High Speed Photometer to be sacrificed. By 2002, all the original instruments requiring COSTAR had been replaced by instruments with their own corrective optics. COSTAR was removed and returned to Earth in 2009 where it is exhibited at the National Air and Space Museum. The area previously used by COSTAR is now occupied by the Cosmic Origins Spectrograph.

    Servicing missions and new instruments

    Servicing Mission 1

    The first Hubble servicing mission was scheduled for 1993, before the mirror problem was discovered. It assumed greater importance, as the astronauts would need to do extensive work to install corrective optics; failure would have meant either abandoning Hubble or accepting its permanent disability. Other components failed before the mission, causing the repair cost to rise to $500 million (not including the cost of the shuttle flight). A successful repair, on the other hand, would help demonstrate the viability of building Space Station Alpha.

    STS-49 in 1992 demonstrated the difficulty of space work. While its rescue of Intelsat 603 received praise, the astronauts had taken possibly reckless risks in doing so. Neither the rescue nor the unrelated assembly of prototype space station components occurred as the astronauts had trained, causing NASA to reassess planning and training, including for the Hubble repair. The agency assigned to the mission Story Musgrave—who had worked on satellite repair procedures since 1976—and six other experienced astronauts, including two from STS-49. The first mission director since Project Apollo would coordinate a crew with 16 previous shuttle flights among them. The astronauts were trained to use about a hundred specialized tools.

    Heat had been a problem on prior spacewalks, which occurred in sunlight; Hubble needed to be repaired out of sunlight. Musgrave discovered during vacuum training, seven months before the mission, that spacesuit gloves did not sufficiently protect against the cold of space. After STS-57 confirmed the issue in orbit, NASA quickly changed equipment, procedures, and flight plan. Seven total mission simulations occurred before launch, the most thorough preparation in shuttle history. No complete Hubble mockup existed, so the astronauts studied many separate models (including one at the Smithsonian) and mentally combined their varying and contradictory details. Servicing Mission 1 flew aboard Endeavour in December 1993, and involved installation of several instruments and other equipment over ten days.

    Most importantly, the High Speed Photometer was replaced with the COSTAR corrective optics package, and WFPC was replaced with the Wide Field and Planetary Camera 2 (WFPC2) with an internal optical correction system. The solar arrays and their drive electronics were also replaced, as well as four gyroscopes in the telescope pointing system, two electrical control units and other electrical components, and two magnetometers. The onboard computers were upgraded with added coprocessors, and Hubble’s orbit was boosted.

    On January 13, 1994, NASA declared the mission a complete success and showed the first sharper images. The mission was one of the most complex performed up until that date, involving five long extra-vehicular activity periods. Its success was a boon for NASA, as well as for the astronomers who now had a more capable space telescope.

    Servicing Mission 2

    Servicing Mission 2, flown by Discovery in February 1997, replaced the GHRS and the FOS with the Space Telescope Imaging Spectrograph (STIS) and the Near Infrared Camera and Multi-Object Spectrometer (NICMOS), replaced an Engineering and Science Tape Recorder with a new Solid State Recorder, and repaired thermal insulation. NICMOS contained a heat sink of solid nitrogen to reduce the thermal noise from the instrument, but shortly after it was installed, an unexpected thermal expansion resulted in part of the heat sink coming into contact with an optical baffle. This led to an increased warming rate for the instrument and reduced its original expected lifetime of 4.5 years to about two years.

    Servicing Mission 3A

    Servicing Mission 3A, flown by Discovery, took place in December 1999, and was a split-off from Servicing Mission 3 after three of the six onboard gyroscopes had failed. The fourth failed a few weeks before the mission, rendering the telescope incapable of performing scientific observations. The mission replaced all six gyroscopes, replaced a Fine Guidance Sensor and the computer, installed a Voltage/temperature Improvement Kit (VIK) to prevent battery overcharging, and replaced thermal insulation blankets.

    Servicing Mission 3B

    Servicing Mission 3B, flown by Columbia in March 2002, saw the installation of a new instrument, with the FOC (which, except for the Fine Guidance Sensors when used for astrometry, was the last of the original instruments) being replaced by the Advanced Camera for Surveys (ACS). This meant COSTAR was no longer required, since all new instruments had built-in correction for the main mirror aberration. The mission also revived NICMOS by installing a closed-cycle cooler and replaced the solar arrays for the second time, providing 30 percent more power.

    Servicing Mission 4

    Plans called for Hubble to be serviced in February 2005, but the Columbia disaster in 2003, in which the orbiter disintegrated on re-entry into the atmosphere, had wide-ranging effects on the Hubble program. NASA Administrator Sean O’Keefe decided all future shuttle missions had to be able to reach the safe haven of the International Space Station should in-flight problems develop. As no shuttles were capable of reaching both HST and the space station during the same mission, future crewed service missions were canceled. This decision was criticized by numerous astronomers who felt Hubble was valuable enough to merit the human risk. HST’s planned successor, the James Webb Space Telescope (JWST), as of 2004 was not expected to launch until at least 2011. A gap in space-observing capabilities between a decommissioning of Hubble and the commissioning of a successor was of major concern to many astronomers, given the significant scientific impact of HST. The consideration that JWST will not be located in low Earth orbit, and therefore cannot be easily upgraded or repaired in the event of an early failure, only made concerns more acute. On the other hand, many astronomers felt strongly that servicing Hubble should not take place if the expense were to come from the JWST budget.

    In January 2004, O’Keefe said he would review his decision to cancel the final servicing mission to HST, due to public outcry and requests from Congress for NASA to look for a way to save it. The National Academy of Sciences convened an official panel, which recommended in July 2004 that the HST should be preserved despite the apparent risks. Their report urged “NASA should take no actions that would preclude a space shuttle servicing mission to the Hubble Space Telescope”. In August 2004, O’Keefe asked Goddard Space Flight Center to prepare a detailed proposal for a robotic service mission. These plans were later canceled, the robotic mission being described as “not feasible”. In late 2004, several Congressional members, led by Senator Barbara Mikulski, held public hearings and carried on a fight with much public support (including thousands of letters from school children across the U.S.) to get the Bush Administration and NASA to reconsider the decision to drop plans for a Hubble rescue mission.

    The nomination in April 2005 of a new NASA Administrator, Michael D. Griffin, changed the situation, as Griffin stated he would consider a crewed servicing mission. Soon after his appointment Griffin authorized Goddard to proceed with preparations for a crewed Hubble maintenance flight, saying he would make the final decision after the next two shuttle missions. In October 2006 Griffin gave the final go-ahead, and the 11-day mission by Atlantis was scheduled for October 2008. Hubble’s main data-handling unit failed in September 2008, halting all reporting of scientific data until its back-up was brought online on October 25, 2008. Since a failure of the backup unit would leave the HST helpless, the service mission was postponed to incorporate a replacement for the primary unit.

    Servicing Mission 4 (SM4), flown by Atlantis in May 2009, was the last scheduled shuttle mission for HST. SM4 installed the replacement data-handling unit, repaired the ACS and STIS systems, installed improved nickel hydrogen batteries, and replaced other components including all six gyroscopes. SM4 also installed two new observation instruments—Wide Field Camera 3 (WFC3) and the Cosmic Origins Spectrograph (COS)—and the Soft Capture and Rendezvous System, which will enable the future rendezvous, capture, and safe disposal of Hubble by either a crewed or robotic mission. Except for the ACS’s High Resolution Channel, which could not be repaired and was disabled, the work accomplished during SM4 rendered the telescope fully functional.

    Major projects

    Cosmic Assembly Near-infrared Deep Extragalactic Legacy Survey [CANDELS]

    The survey “aims to explore galactic evolution in the early Universe, and the very first seeds of cosmic structure at less than one billion years after the Big Bang.” The CANDELS project site describes the survey’s goals as the following:

    The Cosmic Assembly Near-IR Deep Extragalactic Legacy Survey is designed to document the first third of galactic evolution from z = 8 to 1.5 via deep imaging of more than 250,000 galaxies with WFC3/IR and ACS. It will also find the first Type Ia SNe beyond z > 1.5 and establish their accuracy as standard candles for cosmology. Five premier multi-wavelength sky regions are selected; each has multi-wavelength data from Spitzer and other facilities, and has extensive spectroscopy of the brighter galaxies. The use of five widely separated fields mitigates cosmic variance and yields statistically robust and complete samples of galaxies down to 10⁹ solar masses out to z ~ 8.
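The phrase "the first third of galactic evolution from z = 8 to 1.5" can be checked numerically by converting those redshifts to cosmic ages. The sketch below assumes a flat Lambda-CDM cosmology with H0 = 70 km/s/Mpc and a matter density of 0.3; these parameter values are illustrative assumptions, not taken from the text.

```python
# Assumed flat Lambda-CDM parameters (illustrative, not from the text).
H0_km_s_Mpc = 70.0
Om, OL = 0.3, 0.7

SEC_PER_GYR = 3.156e16        # seconds per gigayear
MPC_KM = 3.0857e19            # kilometres per megaparsec
H0_inv_gyr = (MPC_KM / H0_km_s_Mpc) / SEC_PER_GYR  # Hubble time in Gyr

def age_at(z, steps=100_000):
    """Age of the universe at redshift z, integrating t = int_0^a da'/(a' H(a')/H0)."""
    a_max = 1.0 / (1.0 + z)
    total, da = 0.0, a_max / steps
    for i in range(steps):
        a = (i + 0.5) * da            # midpoint rule in scale factor a
        E = (Om / a**3 + OL) ** 0.5   # H(a)/H0 for a flat universe
        total += da / (a * E)
    return H0_inv_gyr * total

t0 = age_at(0.0)      # ~13.5 Gyr for these parameters
t_z8 = age_at(8.0)    # well under 1 Gyr
t_z15 = age_at(1.5)   # ~4 Gyr
```

Under these assumptions z = 8 corresponds to an age of roughly 0.6 Gyr and z = 1.5 to roughly 4.2 Gyr, so the interval indeed spans approximately the first third of cosmic history.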

    Frontier Fields program

    The program, officially named the Hubble Deep Fields Initiative 2012, aims to advance knowledge of early galaxy formation by studying high-redshift galaxies in blank fields with the help of gravitational lensing, to see the “faintest galaxies in the distant universe”. The Frontier Fields web page describes the goals of the program as:

    To reveal hitherto inaccessible populations of z = 5–10 galaxies that are ten to fifty times fainter intrinsically than any presently known
    To solidify our understanding of the stellar masses and star formation histories of sub-L* galaxies at the earliest times
    To provide the first statistically meaningful morphological characterization of star forming galaxies at z > 5
    To find z > 8 galaxies stretched out enough by cluster lensing to discern internal structure and/or magnified enough by cluster lensing for spectroscopic follow-up.

    Cosmic Evolution Survey (COSMOS)

    The Cosmic Evolution Survey (COSMOS) is an astronomical survey designed to probe the formation and evolution of galaxies as a function of both cosmic time (redshift) and the local galaxy environment. The survey covers a two-square-degree equatorial field with spectroscopy and with X-ray to radio imaging by most of the major space-based telescopes and a number of large ground-based telescopes, making it a key focus region of extragalactic astrophysics. COSMOS was launched in 2006 as the largest project pursued by the Hubble Space Telescope at the time, and it remains the largest continuous area of sky covered for the purposes of mapping deep space in blank fields, 2.5 times the area of the Moon on the sky and 17 times larger than the largest of the CANDELS regions. Such coverage matters because the study of galaxies in their environment can be done only over large areas of the sky, larger than half a square degree. More than two million galaxies are detected, spanning 90% of the age of the Universe. The COSMOS scientific collaboration that grew out of the initial survey is the largest and longest-running extragalactic collaboration, known for its collegiality and openness; it is led by Caitlin Casey, Jeyhan Kartaltepe, and Vernesa Smolcic and involves more than 200 scientists in a dozen countries.

    Important discoveries

    Hubble has helped resolve some long-standing problems in astronomy, while also raising new questions. Some results have required new theories to explain them.

    Age of the universe

    One of Hubble’s primary mission targets was to measure distances to Cepheid variable stars more accurately than ever before, and thus to constrain the value of the Hubble constant, the measure of the rate at which the universe is expanding, which is also related to its age. Before the launch of HST, estimates of the Hubble constant typically had errors of up to 50%, but Hubble measurements of Cepheid variables in the Virgo Cluster and other distant galaxy clusters provided a measured value with an accuracy of ±10%, consistent with other, more accurate measurements made since Hubble’s launch using other techniques. The estimated age is now about 13.7 billion years; before the Hubble Telescope, scientists’ estimates ranged from 10 to 20 billion years.
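The link between the Hubble constant and the age of the universe can be illustrated with the Hubble time, 1/H0, which sets the rough age scale. The H0 value of 70 km/s/Mpc below is an assumed representative figure, not a number from the text.

```python
# Assumed value for illustration: H0 = 70 km/s/Mpc.
MPC_KM = 3.0857e19            # kilometres per megaparsec
SEC_PER_YR = 3.156e7          # seconds per year

def hubble_time_gyr(H0_km_s_Mpc):
    """Hubble time 1/H0 in gigayears."""
    return (MPC_KM / H0_km_s_Mpc) / SEC_PER_YR / 1e9

t_best = hubble_time_gyr(70.0)   # ~14 Gyr, close to the quoted 13.7 Gyr age
# A +/-10% uncertainty in H0 maps onto roughly +/-10% in 1/H0:
t_low = hubble_time_gyr(77.0)
t_high = hubble_time_gyr(63.0)
print(f"Hubble time: {t_best:.1f} Gyr (range {t_low:.1f}-{t_high:.1f})")
```

This also shows why the pre-HST factor-of-two spread in H0 translated into the wide 10 to 20 billion year range of age estimates: the age scales inversely with the measured expansion rate.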

    Expansion of the universe

    While Hubble helped to refine estimates of the age of the universe, it also cast doubt on theories about its future. Astronomers from the High-z Supernova Search Team and the Supernova Cosmology Project used ground-based telescopes and HST to observe distant supernovae and uncovered evidence that, far from decelerating under the influence of gravity, the expansion of the universe may in fact be accelerating. Three members of these two groups have subsequently been awarded Nobel Prizes for their discovery.

    Saul Perlmutter [The Supernova Cosmology Project] shared the 2006 Shaw Prize in Astronomy, the 2011 Nobel Prize in Physics, and the 2015 Breakthrough Prize in Fundamental Physics with Brian P. Schmidt and Adam Riess [The High-z Supernova Search Team] for providing evidence that the expansion of the universe is accelerating.

    The cause of this acceleration remains poorly understood; the most commonly proposed explanation is dark energy.

    Black holes

    The high-resolution spectra and images provided by the HST have been especially well-suited to establishing the prevalence of black holes in the center of nearby galaxies. While it had been hypothesized in the early 1960s that black holes would be found at the centers of some galaxies, and astronomers in the 1980s identified a number of good black hole candidates, work conducted with Hubble shows that black holes are probably common to the centers of all galaxies. The Hubble programs further established that the masses of the nuclear black holes and properties of the galaxies are closely related. The legacy of the Hubble programs on black holes in galaxies is thus to demonstrate a deep connection between galaxies and their central black holes.

    Extending visible wavelength images

    A unique window on the Universe enabled by Hubble are the Hubble Deep Field, Hubble Ultra-Deep Field, and Hubble Extreme Deep Field images, which used Hubble’s unmatched sensitivity at visible wavelengths to create images of small patches of sky that are the deepest ever obtained at optical wavelengths. The images reveal galaxies billions of light years away, and have generated a wealth of scientific papers, providing a new window on the early Universe. The Wide Field Camera 3 improved the view of these fields in the infrared and ultraviolet, supporting the discovery of some of the most distant objects yet discovered, such as MACS0647-JD.

    The non-standard object SCP 06F6 was discovered by the Hubble Space Telescope in February 2006.

    On March 3, 2016, researchers using Hubble data announced the discovery of the farthest known galaxy to date: GN-z11. The Hubble observations occurred on February 11, 2015, and April 3, 2015, as part of the CANDELS/GOODS-North surveys.

    Solar System discoveries

    HST has also been used to study objects in the outer reaches of the Solar System, including the dwarf planets Pluto and Eris.

    The collision of Comet Shoemaker-Levy 9 with Jupiter in 1994 was fortuitously timed for astronomers, coming just a few months after Servicing Mission 1 had restored Hubble’s optical performance. Hubble images of the planet were sharper than any taken since the passage of Voyager 2 in 1979, and were crucial in studying the dynamics of the collision of a comet with Jupiter, an event believed to occur once every few centuries.

    During June and July 2012, U.S. astronomers using Hubble discovered Styx, a tiny fifth moon orbiting Pluto.

    In March 2015, researchers announced that measurements of aurorae around Ganymede, one of Jupiter’s moons, revealed that it has a subsurface ocean. Using Hubble to study the motion of its aurorae, the researchers determined that a large saltwater ocean was helping to suppress the interaction between Jupiter’s magnetic field and that of Ganymede. The ocean is estimated to be 100 km (60 mi) deep, trapped beneath a 150 km (90 mi) ice crust.

    From June to August 2015, Hubble was used to search for a Kuiper belt object (KBO) target for the New Horizons Kuiper Belt Extended Mission (KEM) when similar searches with ground telescopes failed to find a suitable target.

    This resulted in the discovery of at least five new KBOs, including the eventual KEM target, 486958 Arrokoth, that New Horizons performed a close fly-by of on January 1, 2019.

    In August 2020, taking advantage of a total lunar eclipse, astronomers using NASA’s Hubble Space Telescope detected Earth’s own brand of sunscreen, ozone, in our atmosphere. This method simulates how astronomers and astrobiology researchers will search for evidence of life beyond Earth by observing potential “biosignatures” on exoplanets (planets around other stars).
    Hubble and ALMA image of MACS J1149.5+2223.

    Supernova reappearance

    On December 11, 2015, Hubble captured an image of the first-ever predicted reappearance of a supernova, dubbed “Refsdal”, which was calculated using different mass models of a galaxy cluster whose gravity is warping the supernova’s light. The supernova was previously seen in November 2014 behind galaxy cluster MACS J1149.5+2223 as part of Hubble’s Frontier Fields program. Astronomers spotted four separate images of the supernova in an arrangement known as an “Einstein Cross”.

    The light from the cluster has taken about five billion years to reach Earth, though the supernova exploded some 10 billion years ago. Based on early lens models, a fifth image was predicted to reappear by the end of 2015. The detection of Refsdal’s reappearance in December 2015 served as a unique opportunity for astronomers to test their models of how mass, especially dark matter, is distributed within this galaxy cluster.

    Impact on astronomy

    Many objective measures show the positive impact of Hubble data on astronomy. Over 15,000 papers based on Hubble data have been published in peer-reviewed journals, and countless more have appeared in conference proceedings. Looking at papers several years after their publication, about one-third of all astronomy papers have no citations, while only two percent of papers based on Hubble data have no citations. On average, a paper based on Hubble data receives about twice as many citations as papers based on non-Hubble data. Of the 200 papers published each year that receive the most citations, about 10% are based on Hubble data.

    Although the HST has clearly helped astronomical research, its financial cost has been large. A study on the relative astronomical benefits of different sizes of telescopes found that while papers based on HST data generate 15 times as many citations as a 4 m (13 ft) ground-based telescope such as the William Herschel Telescope, the HST costs about 100 times as much to build and maintain.

    Deciding between building ground- versus space-based telescopes is complex. Even before Hubble was launched, specialized ground-based techniques such as aperture masking interferometry had obtained higher-resolution optical and infrared images than Hubble would achieve, though restricted to targets about 10⁸ times brighter than the faintest targets observed by Hubble. Since then, advances in “adaptive optics” have extended the high-resolution imaging capabilities of ground-based telescopes to the infrared imaging of faint objects.
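    The resolution trade-off described here comes down to the diffraction limit, θ ≈ 1.22 λ/D. A small sketch comparing Hubble's 2.4 m mirror with a 4 m ground telescope (the 550 nm wavelength and ~1 arcsecond seeing figure are illustrative assumptions, not values from this article):

```python
ARCSEC_PER_RAD = 206265  # arcseconds per radian

def diffraction_limit_arcsec(wavelength_m: float, aperture_m: float) -> float:
    """Rayleigh criterion: theta = 1.22 * lambda / D, converted to arcseconds."""
    return 1.22 * wavelength_m / aperture_m * ARCSEC_PER_RAD

# Hubble's 2.4 m mirror observing visible light at an assumed 550 nm:
hst = diffraction_limit_arcsec(550e-9, 2.4)      # ~0.058 arcsec
# A 4 m ground telescope has an even finer diffraction limit, but without
# adaptive optics, atmospheric seeing blurs images to roughly 1 arcsec.
ground = diffraction_limit_arcsec(550e-9, 4.0)

print(f"HST diffraction limit: {hst:.3f} arcsec")
print(f"4 m diffraction limit: {ground:.3f} arcsec")
```

    In space, Hubble actually reaches its diffraction limit; on the ground, adaptive optics is what closes the gap between the seeing-limited and diffraction-limited figures.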

    The usefulness of adaptive optics versus HST observations depends strongly on the particular details of the research questions being asked. In the visible bands, adaptive optics can correct only a relatively small field of view, whereas HST can conduct high-resolution optical imaging over a wide field. Only a small fraction of astronomical objects are accessible to high-resolution ground-based imaging; in contrast Hubble can perform high-resolution observations of any part of the night sky, and on objects that are extremely faint.

    Impact on aerospace engineering

    In addition to its scientific results, Hubble has also made significant contributions to aerospace engineering, in particular the performance of systems in low Earth orbit. These insights result from Hubble’s long lifetime on orbit, extensive instrumentation, and return of assemblies to the Earth where they can be studied in detail. In particular, Hubble has contributed to studies of the behavior of graphite composite structures in vacuum, optical contamination from residual gas and human servicing, radiation damage to electronics and sensors, and the long-term behavior of multi-layer insulation. One lesson learned was that gyroscopes assembled using pressurized oxygen to deliver suspension fluid were prone to failure due to electric wire corrosion. Gyroscopes are now assembled using pressurized nitrogen. Another is that optical surfaces in LEO can have surprisingly long lifetimes; Hubble was only expected to last 15 years before the mirror became unusable, but after 14 years there was no measurable degradation. Finally, Hubble servicing missions, particularly those that serviced components not designed for in-space maintenance, have contributed towards the development of new tools and techniques for on-orbit repair.

    Archives

    All Hubble data is eventually made available via the Mikulski Archive for Space Telescopes at STScI, CADC and ESA/ESAC. Data is usually proprietary—available only to the principal investigator (PI) and astronomers designated by the PI—for twelve months after being taken. The PI can apply to the director of the STScI to extend or reduce the proprietary period in some circumstances.

    Observations made on Director’s Discretionary Time are exempt from the proprietary period, and are released to the public immediately. Calibration data such as flat fields and dark frames are also publicly available straight away. All data in the archive is in the FITS format, which is suitable for astronomical analysis but not for public use. The Hubble Heritage Project processes and releases to the public a small selection of the most striking images in JPEG and TIFF formats.

    Outreach activities

    It has always been important for the Space Telescope to capture the public’s imagination, given the considerable contribution of taxpayers to its construction and operational costs. After the difficult early years when the faulty mirror severely dented Hubble’s reputation with the public, the first servicing mission allowed its rehabilitation as the corrected optics produced numerous remarkable images.

    Several initiatives have helped to keep the public informed about Hubble activities. In the United States, outreach efforts are coordinated by the Space Telescope Science Institute (STScI) Office for Public Outreach, which was established in 2000 to ensure that U.S. taxpayers saw the benefits of their investment in the space telescope program. To that end, STScI operates the HubbleSite.org website. The Hubble Heritage Project, operating out of the STScI, provides the public with high-quality images of the most interesting and striking objects observed. The Heritage team is composed of amateur and professional astronomers, as well as people with backgrounds outside astronomy, and emphasizes the aesthetic nature of Hubble images. The Heritage Project is granted a small amount of time to observe objects which, for scientific reasons, may not have images taken at enough wavelengths to construct a full-color image.

    Since 1999, the leading Hubble outreach group in Europe has been the Hubble European Space Agency Information Centre (HEIC). This office was established at the Space Telescope European Coordinating Facility in Munich, Germany. HEIC’s mission is to fulfill HST outreach and education tasks for the European Space Agency. The work is centered on the production of news and photo releases that highlight interesting Hubble results and images. These are often European in origin, and so increase awareness of both ESA’s Hubble share (15%) and the contribution of European scientists to the observatory. ESA produces educational material, including a videocast series called Hubblecast designed to share world-class scientific news with the public.

    The Hubble Space Telescope has won two Space Achievement Awards from the Space Foundation, for its outreach activities, in 2001 and 2010.

    A replica of the Hubble Space Telescope is on the courthouse lawn in Marshfield, Missouri, the hometown of namesake Edwin P. Hubble.

    Major Instrumentation

    Wide Field and Planetary Camera 2 [WFPC2] (no longer in service)

    Wide Field Camera 3 [WFC3]

    Advanced Camera for Surveys [ACS]

    Cosmic Origins Spectrograph [COS]

    The Hubble Space Telescope is a project of international cooperation between NASA and the European Space Agency. NASA’s Goddard Space Flight Center manages the telescope. The Space Telescope Science Institute (STScI), a free-standing science center located on the campus of The Johns Hopkins University and operated by the Association of Universities for Research in Astronomy (AURA) for NASA, conducts Hubble science operations.


     
  • richardmitnick 12:28 pm on October 8, 2021 Permalink | Reply
    Tags: A series of ground-based telescopes located on Cerro Toco in northern Chile., , Astronomy, , The Simons Observatory Large Aperture Telescope.   

    From Penn Today : “Penn cosmology team ready to field the largest ever cosmic microwave background camera” 

    From Penn Today

    at


    University of Pennsylvania

    October 7, 2021
    Erica K. Brockmeier

    The shell of the Large Aperture Telescope Receiver (LATR) arrived at Penn’s campus and was installed in the Devlin High Bay in the summer of 2019. Since then, researchers have been addressing key engineering and technical challenges, which include holding temperatures at very precise stages, maintaining optical alignment, limiting temperature gradients, and minimizing the time it takes to move between temperatures.

    A new study published in The Astrophysical Journal Supplement Series details the design and validation of the Large Aperture Telescope Receiver (LATR), the cryogenic camera that will be the “heart” of the Simons Observatory Large Aperture Telescope.

    Led by Ningfeng Zhu, a Ph.D. candidate working in the lab of Mark Devlin, this paper describes the optimization, construction, and performance of the camera, an essential step towards the final tests and integrations that will allow the LATR to be installed at the observatory in Chile in 2022.

    The Simons Observatory will comprise a series of ground-based telescopes located on Cerro Toco in northern Chile. Here, researchers will study the cosmic microwave background (CMB), the residual radiation left behind by the Big Bang, to understand more about the evolution of the universe. Coupled with the largest of these telescopes will be the LATR, an instrument that, once completed, will be the largest cryogenic millimeter-wave camera to date, able to map the sky four times faster than the current generation of instruments.

    In order to detect the CMB, the detectors inside the LATR, which will accommodate 13 optics tubes holding a total of 62,000 detectors, must be kept incredibly cold, down to 0.1 kelvin, nearly -459.49 degrees Fahrenheit. The LATR must also be able to cool quickly from room temperature to these extremely low and precise temperatures.
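    The Fahrenheit figure quoted for 0.1 kelvin is just the exact kelvin-to-Fahrenheit conversion; a one-line sketch:

```python
def kelvin_to_fahrenheit(k: float) -> float:
    """Exact conversion: F = K * 9/5 - 459.67 (0 K = -459.67 F, absolute zero)."""
    return k * 9.0 / 5.0 - 459.67

print(kelvin_to_fahrenheit(0.1))  # -459.49, as quoted above
```

    At this scale the distinction between 0.1 K and absolute zero is only 0.18 degrees Fahrenheit, which is why cryogenic work is always described in kelvin.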

    2
    A new study led by Zhu details the design and validation of the cryogenic camera that will be the “heart” of the Simons Observatory Large Aperture Telescope. (Image courtesy of Ningfeng Zhu)

    Zhu, who helped build cryogenic cameras for a cosmology project while an undergraduate at The University of Chicago (US), used his previous expertise as he led the overall design of the cryostat. “When I moved here, the Simons Observatory was just starting to get off the ground, and they were looking for people to design this next-generation camera,” Zhu says. “It’s a lot of engineering and technical challenges, but in the end we’re after the science.”

    These engineering and technical challenges include the need to verify that all of the components perform to required technical specifications, which include maintaining stable temperatures from 80 kelvin all the way down to 0.1 kelvin, maintaining optical alignment, limiting temperature gradients, and minimizing the time it takes to move between temperatures. All of this requires careful design considerations which take into account a variety of inputs, such as the materials being used and the overall size of the LATR.

    In this paper, the researchers detail these design considerations and how they struck a balance to achieve the technical specifications required for the LATR. The paper also details the results of essential validation tests that confirmed that the instrument worked as predicted, a key step in being able to install LATR for on-site testing at the observatory.

    “The testing is still ongoing. We will receive more detectors to test, more components to integrate, and we will be gearing towards getting ready for shipment,” Zhu says. After the LATR is installed in Chile, he says, there will be a series of engineering runs and then, if it passes all the tests, the large telescope will be fully operational. While the original project timelines were delayed due to COVID, the team hopes to be ready to ship the LATR by early 2022.

    The designs detailed in this paper not only allow for future improvements at the Simons Observatory but can also be used to help researchers in other fields, such as quantum computing, that also require instruments that can work at ultra-low temperatures.

    This work was all a massive team effort, Zhu says, and he is excited for the science that will be possible once the telescope is up and running. “We’ll have a level of accuracy that’s never been done before, and with this amount of accuracy there’s a lot of science to be done,” says Zhu.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition


    Academic life at University of Pennsylvania (US) is unparalleled, with 100 countries and every U.S. state represented in one of the Ivy League’s most diverse student bodies. Consistently ranked among the top 10 universities in the country, Penn enrolls 10,000 undergraduate students and welcomes an additional 10,000 students to our world-renowned graduate and professional schools.

    Penn’s award-winning educators and scholars encourage students to pursue inquiry and discovery, follow their passions, and address the world’s most challenging problems through an interdisciplinary approach.

    The University of Pennsylvania (US) is a private Ivy League research university in Philadelphia, Pennsylvania. The university claims a founding date of 1740 and is one of the nine colonial colleges chartered prior to the U.S. Declaration of Independence. Benjamin Franklin, Penn’s founder and first president, advocated an educational program that trained leaders in commerce, government, and public service, similar to a modern liberal arts curriculum.

    Penn has four undergraduate schools as well as twelve graduate and professional schools. Schools enrolling undergraduates include the College of Arts and Sciences; the School of Engineering and Applied Science; the Wharton School; and the School of Nursing. Penn’s “One University Policy” allows students to enroll in classes in any of Penn’s twelve schools. Among its highly ranked graduate and professional schools are a law school whose first professor wrote the first draft of the United States Constitution, the first school of medicine in North America (Perelman School of Medicine, 1765), and the first collegiate business school (Wharton School, 1881).

    Penn is also home to the first “student union” building and organization (Houston Hall, 1896), the first Catholic student club in North America (Newman Center, 1893), the first double-decker college football stadium (Franklin Field, 1924 when second deck was constructed), and Morris Arboretum, the official arboretum of the Commonwealth of Pennsylvania. The first general-purpose electronic computer (ENIAC) was developed at Penn and formally dedicated in 1946. In 2019, the university had an endowment of $14.65 billion, the sixth-largest endowment of all universities in the United States, as well as a research budget of $1.02 billion. The university’s athletics program, the Quakers, fields varsity teams in 33 sports as a member of the NCAA Division I Ivy League conference.

    As of 2018, distinguished alumni and/or Trustees include three U.S. Supreme Court justices; 32 U.S. senators; 46 U.S. governors; 163 members of the U.S. House of Representatives; eight signers of the Declaration of Independence and seven signers of the U.S. Constitution (four of whom signed both representing two-thirds of the six people who signed both); 24 members of the Continental Congress; 14 foreign heads of state and two presidents of the United States, including Donald Trump. As of October 2019, 36 Nobel laureates; 80 members of the American Academy of Arts and Sciences(US); 64 billionaires; 29 Rhodes Scholars; 15 Marshall Scholars and 16 Pulitzer Prize winners have been affiliated with the university.

    History

    The University of Pennsylvania considers itself the fourth-oldest institution of higher education in the United States, though this is contested by Princeton University(US) and Columbia University(US). The university also considers itself the first university in the United States with both undergraduate and graduate studies.

    In 1740, a group of Philadelphians joined together to erect a great preaching hall for the traveling evangelist George Whitefield, who toured the American colonies delivering open-air sermons. The building was designed and built by Edmund Woolley and was the largest building in the city at the time, drawing thousands of people the first time sermons were preached there. It was initially planned to serve as a charity school as well, but a lack of funds forced plans for the chapel and school to be suspended. According to Franklin’s autobiography, it was in 1743 when he first had the idea to establish an academy, “thinking the Rev. Richard Peters a fit person to superintend such an institution”. However, Peters declined a casual inquiry from Franklin and nothing further was done for another six years. In the fall of 1749, now more eager to create a school to educate future generations, Benjamin Franklin circulated a pamphlet titled Proposals Relating to the Education of Youth in Pensilvania, his vision for what he called a “Public Academy of Philadelphia”. Unlike the other colonial colleges that existed in 1749—Harvard University(US), William & Mary(US), Yale University(US), and The College of New Jersey(US)—Franklin’s new school would not focus merely on education for the clergy. He advocated an innovative concept of higher education, one which would teach both the ornamental knowledge of the arts and the practical skills necessary for making a living and doing public service. The proposed program of study could have become the nation’s first modern liberal arts curriculum, although it was never implemented because Anglican priest William Smith (1727-1803), who became the first provost, and other trustees strongly preferred the traditional curriculum.

    Franklin assembled a board of trustees from among the leading citizens of Philadelphia, the first such non-sectarian board in America. At the first meeting of the 24 members of the board of trustees on November 13, 1749, the issue of where to locate the school was a prime concern. Although a lot across Sixth Street from the old Pennsylvania State House (later renamed and famously known since 1776 as “Independence Hall”), was offered without cost by James Logan, its owner, the trustees realized that the building erected in 1740, which was still vacant, would be an even better site. The original sponsors of the dormant building still owed considerable construction debts and asked Franklin’s group to assume their debts and, accordingly, their inactive trusts. On February 1, 1750, the new board took over the building and trusts of the old board. On August 13, 1751, the “Academy of Philadelphia”, using the great hall at 4th and Arch Streets, took in its first secondary students. A charity school also was chartered on July 13, 1753, in keeping with the intentions of the original “New Building” donors, although it lasted only a few years. On June 16, 1755, the “College of Philadelphia” was chartered, paving the way for the addition of undergraduate instruction. All three schools shared the same board of trustees and were considered to be part of the same institution. The first commencement exercises were held on May 17, 1757.

    The institution of higher learning was known as the College of Philadelphia from 1755 to 1779. In 1779, not trusting then-provost the Reverend William Smith’s “Loyalist” tendencies, the revolutionary State Legislature created a University of the State of Pennsylvania. The result was a schism, with Smith continuing to operate an attenuated version of the College of Philadelphia. In 1791, the legislature issued a new charter, merging the two institutions into a new University of Pennsylvania with twelve men from each institution on the new board of trustees.

    Penn has three claims to being the first university in the United States, according to university archives director Mark Frazier Lloyd: the 1765 founding of the first medical school in America made Penn the first institution to offer both “undergraduate” and professional education; the 1779 charter made it the first American institution of higher learning to take the name of “University”; and existing colleges were established as seminaries (although, as detailed earlier, Penn adopted a traditional seminary curriculum as well).

    After being located in downtown Philadelphia for more than a century, the campus was moved across the Schuylkill River to property purchased from the Blockley Almshouse in West Philadelphia in 1872, where it has since remained in an area now known as University City. Although Penn began operating as an academy or secondary school in 1751 and obtained its collegiate charter in 1755, it initially designated 1750 as its founding date; this is the year that appears on the first iteration of the university seal. Sometime later in its early history, Penn began to consider 1749 as its founding date and this year was referenced for over a century, including at the centennial celebration in 1849. In 1899, the board of trustees voted to adjust the founding date earlier again, this time to 1740, the date of “the creation of the earliest of the many educational trusts the University has taken upon itself”. The board of trustees voted in response to a three-year campaign by Penn’s General Alumni Society to retroactively revise the university’s founding date to appear older than Princeton University, which had been chartered in 1746.

    Research, innovations and discoveries

    Penn is classified as an “R1” doctoral university: “Highest research activity.” Its economic impact on the Commonwealth of Pennsylvania for 2015 amounted to $14.3 billion. Penn’s research expenditures in the 2018 fiscal year were $1.442 billion, the fourth largest in the U.S. In fiscal year 2019 Penn received $582.3 million in funding from the National Institutes of Health(US).

    In line with its well-known interdisciplinary tradition, Penn’s research centers often span two or more disciplines. In the 2010–2011 academic year alone, five interdisciplinary research centers were created or substantially expanded; these include the Center for Health-care Financing; the Center for Global Women’s Health at the Nursing School; the $13 million Morris Arboretum’s Horticulture Center; the $15 million Jay H. Baker Retailing Center at Wharton; and the $13 million Translational Research Center at Penn Medicine. With these additions, Penn now counts 165 research centers hosting a research community of over 4,300 faculty and over 1,100 postdoctoral fellows, 5,500 academic support staff and graduate student trainees. To further assist the advancement of interdisciplinary research President Amy Gutmann established the “Penn Integrates Knowledge” title awarded to selected Penn professors “whose research and teaching exemplify the integration of knowledge”. These professors hold endowed professorships and joint appointments between Penn’s schools.

    Penn is also among the most prolific producers of doctoral students. With 487 PhDs awarded in 2009, Penn ranks third in the Ivy League, only behind Columbia University(US) and Cornell University(US) (Harvard University(US) did not report data). It also has one of the highest numbers of post-doctoral appointees (933 in number for 2004–2007), ranking third in the Ivy League (behind Harvard and Yale University(US)) and tenth nationally.

    In most disciplines Penn professors’ productivity is among the highest in the nation and first in the fields of epidemiology, business, communication studies, comparative literature, languages, information science, criminal justice and criminology, social sciences and sociology. According to the National Research Council nearly three-quarters of Penn’s 41 assessed programs were placed in ranges including the top 10 rankings in their fields, with more than half of these in ranges including the top five rankings in these fields.

    Penn’s research tradition has historically been complemented by innovations that shaped higher education. In addition to establishing the first medical school; the first university teaching hospital; the first business school; and the first student union Penn was also the cradle of other significant developments. In 1852, Penn Law was the first law school in the nation to publish a law journal still in existence (then called The American Law Register, now the Penn Law Review, one of the most cited law journals in the world). Under the deanship of William Draper Lewis, the law school was also one of the first schools to emphasize legal teaching by full-time professors instead of practitioners, a system that is still followed today. The Wharton School was home to several pioneering developments in business education. It established the first research center in a business school in 1921 and the first center for entrepreneurship in 1973 and it regularly introduced novel curricula for which BusinessWeek wrote, “Wharton is on the crest of a wave of reinvention and change in management education”.

    Several major scientific discoveries have also taken place at Penn. The university is probably best known as the place where the first general-purpose electronic computer (ENIAC) was born in 1946 at the Moore School of Electrical Engineering.

    ENIAC UPenn

    It was here also where the world’s first spelling and grammar checkers were created, as well as the popular COBOL programming language. Penn can also boast some of the most important discoveries in the field of medicine. The dialysis machine used as an artificial replacement for lost kidney function was conceived and devised out of a pressure cooker by William Inouye while he was still a student at Penn Med; the Rubella and Hepatitis B vaccines were developed at Penn; the discovery of cancer’s link with genes; cognitive therapy; Retin-A (the cream used to treat acne), Resistin; the Philadelphia gene (linked to chronic myelogenous leukemia) and the technology behind PET Scans were all discovered by Penn Med researchers. More recent gene research has led to the discovery of the genes for fragile X syndrome, the most common form of inherited mental retardation; spinal and bulbar muscular atrophy, a disorder marked by progressive muscle wasting; and Charcot–Marie–Tooth disease, a progressive neurodegenerative disease that affects the hands, feet and limbs.

    Conductive polymer was also developed at Penn by Alan J. Heeger, Alan MacDiarmid and Hideki Shirakawa, an invention that earned them the Nobel Prize in Chemistry. On faculty since 1965, Ralph L. Brinster developed the scientific basis for in vitro fertilization and the transgenic mouse at Penn and was awarded the National Medal of Science in 2010. The theory of superconductivity was also partly developed at Penn, by then-faculty member John Robert Schrieffer (along with John Bardeen and Leon Cooper). The university has also contributed major advancements in the fields of economics and management. Among the many discoveries are conjoint analysis, widely used as a predictive tool especially in market research; Simon Kuznets’s method of measuring Gross National Product; the Penn effect (the observation that consumer price levels in richer countries are systematically higher than in poorer ones) and the “Wharton Model” developed by Nobel-laureate Lawrence Klein to measure and forecast economic activity. The idea behind Health Maintenance Organizations also belonged to Penn professor Robert Eilers, who put it into practice during then-President Nixon’s health reform in the 1970s.

    International partnerships

    Students can study abroad for a semester or a year at partner institutions such as the London School of Economics(UK), University of Barcelona [Universitat de Barcelona](ES), Paris Institute of Political Studies [Institut d’études politiques de Paris](FR), University of Queensland(AU), University College London(UK), King’s College London(UK), Hebrew University of Jerusalem(IL) and University of Warwick(UK).

     
  • richardmitnick 10:50 am on October 8, 2021 Permalink | Reply
    Tags: "Astrophysicists explain the origin of unusually heavy neutron star binaries", A giant star expands and engulfs the neutron star companion in a stage referred to as common-envelope evolution., Astronomy, , , Compact astrophysical objects like neutron stars and black holes are challenging to study because when they are stable they tend to be invisible emitting no detectable radiation., , LIGO’s detection of a heavy neutron star merger at a rate similar to the lighter binary system implies that heavy neutron star pairs should be relatively common., Simulations of supernova explosions of massive stars paired with neutron stars can explain puzzling results from gravitational wave observatories., The first detection of gravitational waves by the aLIGO in 2017 was a neutron star merger that mostly conformed to the expectations of astrophysicists., The mass of the helium core of the stripped star is essential in determining the nature of its interactions with its neutron star companion and the ultimate fate of the binary system., , There may well be a large undetected population of heavy neutron star binaries in our galaxy., When the helium core is small it expands and then mass transfer spins up the neutron star to create a pulsar.   

    From The University of California-Santa Cruz (US) : “Astrophysicists explain the origin of unusually heavy neutron star binaries” 

    From The University of California-Santa Cruz (US)

    October 08, 2021
    Tim Stephens
    stephens@ucsc.edu

    Simulations of supernova explosions of massive stars paired with neutron stars can explain puzzling results from gravitational wave observatories.

    1
    In the late stages of binary neutron star formation, the giant star expands and engulfs the neutron star companion in a stage referred to as common-envelope evolution (a). Ejection of the envelope leaves the neutron star in a close orbit with a stripped-envelope star. The evolution of the system depends on the mass ratio. Less-massive stripped stars experience an additional mass transfer phase that further strips the star and recycles the pulsar companion, leading to systems such as the observed binary neutron stars in the Milky Way and GW170817 (b). More massive stripped stars do not expand as much, therefore avoiding further stripping and companion recycling, leading to systems such as GW190425 (c). Finally, even more massive stripped stars will lead to black hole-neutron star binaries such as GW200115 (d). Credit: Vigna-Gomez et al., ApJL 2021.

    A new study showing how the explosion of a stripped massive star in a supernova can lead to the formation of a heavy neutron star or a light black hole resolves one of the most challenging puzzles to emerge from the detection of neutron star mergers by the gravitational wave observatories LIGO and Virgo.

    The first detection of gravitational waves from a neutron star merger, made by the Advanced Laser Interferometer Gravitational-Wave Observatory (LIGO) in 2017, mostly conformed to the expectations of astrophysicists. But the second such detection, in 2019, was a merger of two neutron stars whose combined mass was unexpectedly large.

    Localizations of gravitational-wave signals detected by LIGO in 2015–2017 (GW150914, LVT151012, GW151226, GW170104) and, after Virgo (IT) came online in August 2017, by the LIGO-Virgo network (GW170814, GW170817).

    “It was so shocking that we had to start thinking about how to create a heavy neutron star without making it a pulsar,” said Enrico Ramirez-Ruiz, professor of Astronomy and Astrophysics at UC Santa Cruz.

    Compact astrophysical objects like neutron stars and black holes are challenging to study because, when they are stable, they tend to be invisible, emitting no detectable radiation.

    Merging neutron stars. Image Credit: Mark Garlick, University of Warwick (UK).

    “That means we are biased in what we can observe,” Ramirez-Ruiz explained. “We have detected neutron star binaries in our galaxy when one of them is a pulsar, and the masses of those pulsars are almost all identical—we don’t see any heavy neutron stars.”

    LIGO’s detection of a heavy neutron star merger at a rate similar to the lighter binary system implies that heavy neutron star pairs should be relatively common. So why don’t they show up in the pulsar population?

    In the new study, Ramirez-Ruiz and his colleagues focused on the supernovae of stripped stars in binary systems that can form “double compact objects” consisting of either two neutron stars or a neutron star and a black hole. A stripped star, also called a helium star, is a star that has had its hydrogen envelope removed by its interactions with a companion star.

    The study, published October 8 in The Astrophysical Journal Letters, was led by Alejandro Vigna-Gomez, an astrophysicist at The University of Copenhagen [Københavns Universitet](DK)’s Niels Bohr Institute [Niels Bohr Institutet] (DK), where Ramirez-Ruiz holds a Niels Bohr Professorship.

    “We used detailed stellar models to follow the evolution of a stripped star until the moment it explodes in a supernova,” Vigna-Gomez said. “Once we reach the time of the supernova, we do a hydrodynamical study, where we are interested in following the evolution of the exploding gas.”

    The stripped star, in a binary system with a neutron star companion, starts out ten times more massive than our sun, but so dense it is smaller than the sun in diameter. The final stage in its evolution is a core-collapse supernova, which leaves behind either a neutron star or a black hole, depending on the final mass of the core.
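    As a quick sanity check on that description, the quoted figures already imply a mean density at least ten times the Sun’s. A minimal sketch, using standard solar values; treating the solar radius as an upper bound on the star’s size is an assumption taken directly from the “smaller than the sun in diameter” wording:

    ```python
    # Rough density check for the stripped star described above: ten solar
    # masses packed into at most a solar radius. Solar values are standard;
    # the radius is only an upper bound, so the density is a lower bound.
    import math

    M_SUN_KG = 1.989e30   # solar mass
    R_SUN_M = 6.957e8     # solar radius

    def mean_density(mass_kg, radius_m):
        """Mean density in g/cm^3 for a sphere of the given mass and radius."""
        volume_m3 = (4.0 / 3.0) * math.pi * radius_m ** 3
        return (mass_kg / volume_m3) / 1000.0  # kg/m^3 -> g/cm^3

    sun = mean_density(M_SUN_KG, R_SUN_M)
    stripped = mean_density(10 * M_SUN_KG, R_SUN_M)  # radius is an upper bound

    print(f"Sun mean density:            {sun:.2f} g/cm^3")      # ~1.41
    print(f"Stripped star, at minimum:   {stripped:.1f} g/cm^3")  # ~14.1
    ```

    So even at its largest allowed size, the stripped star is roughly an order of magnitude denser than the Sun on average.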

    The team’s results showed that when the massive stripped star explodes, some of its outer layers are rapidly ejected from the binary system. Some of the inner layers, however, are not ejected and eventually fall back onto the newly formed compact object.

    “The amount of material accreted depends on the explosion energy—the higher the energy, the less mass you can keep,” Vigna-Gomez said. “For our ten-solar-mass stripped star, if the explosion energy is low, it will form a black hole; if the energy is large, it will keep less mass and form a neutron star.”
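    The fallback logic in that quote can be sketched as a toy model: more explosion energy unbinds more mass, so less falls back onto the remnant. The linear energy-ejection relation, the core mass, and the maximum neutron star mass below are all illustrative assumptions, not values calibrated by the paper:

    ```python
    # Toy sketch of the fallback picture described above. Every number and
    # the linear relation are illustrative assumptions, not the paper's
    # detailed hydrodynamical results.

    CORE_MASS = 3.0    # assumed pre-collapse core mass (solar masses)
    MAX_NS_MASS = 2.3  # assumed maximum neutron star mass (solar masses)

    def remnant(explosion_energy_foe):
        """Return (mass, kind) for an explosion energy in units of 10^51 erg
        ('foe'). Ejected mass grows linearly with energy in this toy model,
        capped so the remnant never drops below a typical neutron star mass."""
        ejected = min(CORE_MASS - 1.2, 0.9 * explosion_energy_foe)
        mass = CORE_MASS - ejected
        kind = "neutron star" if mass <= MAX_NS_MASS else "black hole"
        return mass, kind

    for energy in (0.5, 1.0, 2.0):
        mass, kind = remnant(energy)
        print(f"E = {energy:.1f} foe -> remnant {mass:.2f} Msun ({kind})")
    ```

    The trend is the point: a weak explosion lets enough mass fall back to tip the remnant over the neutron star limit, while an energetic one leaves a lighter neutron star.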

    These results not only explain the formation of heavy neutron star binary systems, such as the one revealed by the gravitational wave event GW190425, but also predict the formation of neutron star and light black hole binaries, such as the one that merged in the 2020 gravitational wave event GW200115.

    Another important finding is that the mass of the helium core of the stripped star is essential in determining the nature of its interactions with its neutron star companion and the ultimate fate of the binary system. A sufficiently massive helium star can avoid transferring mass onto the neutron star. With a less massive helium star, however, the mass transfer process can transform the neutron star into a rapidly spinning pulsar.

    “When the helium core is small it expands and then mass transfer spins up the neutron star to create a pulsar,” Ramirez-Ruiz explained. “Massive helium cores, however, are more gravitationally bound and don’t expand, so there is no mass transfer. And if they don’t spin up into a pulsar, we don’t see them.”

    In other words, there may well be a large undetected population of heavy neutron star binaries in our galaxy.

    “Transferring mass onto a neutron star is an effective mechanism to create rapidly spinning (millisecond) pulsars,” Vigna-Gomez said. “Avoiding this mass transfer episode as we suggest hints that there is a radio-quiet population of such systems in the Milky Way.”

    In addition to Vigna-Gomez and Ramirez-Ruiz, the coauthors of the paper include Sophie Schroder at the Niels Bohr Institute; David Aguilera-Dena at The University of Crete [Πανεπιστήμιο Κρήτης](GR); Aldo Batta at The National Institute for Astrophysics, Optics and Electronics(MX); Norbert Langer at The Rhenish Friedrich Wilhelm University of Bonn [Rheinische Friedrich-Wilhelms-Universität Bonn](DE); and Reinhold Willcox at Monash University (AU). This work was supported by The Heising-Simons Foundation (US), The Danish National Research Foundation [Danmarks Grundforskningsfond](DK), and The National Science Foundation (US).

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    UC Santa Cruz (US) campus.

    The University of California-Santa Cruz (US), opened in 1965 and grew, one college at a time, to its current (2008-09) enrollment of more than 16,000 students. Undergraduates pursue more than 60 majors supervised by divisional deans of humanities, physical & biological sciences, social sciences, and arts. Graduate students work toward graduate certificates, master’s degrees, or doctoral degrees in more than 30 academic fields under the supervision of the divisional and graduate deans. The dean of the Jack Baskin School of Engineering oversees the campus’s undergraduate and graduate engineering programs.

    UCSC is the home base for the Lick Observatory.

    UCO Lick Observatory’s 36-inch Great Refractor telescope housed in the South (large) Dome of main building.

    UC Santa Cruz (US) Lick Observatory Since 1888 Mt Hamilton, in San Jose, California, Altitude 1,283 m (4,209 ft)

    UC Observatories Lick Automated Planet Finder fully robotic 2.4-meter optical telescope at Lick Observatory, situated on the summit of Mount Hamilton, east of San Jose, California, USA.

    The UCO Lick C. Donald Shane telescope is a 120-inch (3.0-meter) reflecting telescope located at the Lick Observatory, Mt Hamilton, in San Jose, California, Altitude 1,283 m (4,209 ft).

    Search for extraterrestrial intelligence expands at Lick Observatory
    New instrument scans the sky for pulses of infrared light
    March 23, 2015
    By Hilary Lebow


    Astronomers are expanding the search for extraterrestrial intelligence into a new realm with detectors tuned to infrared light at UC’s Lick Observatory. A new instrument, called NIROSETI, will soon scour the sky for messages from other worlds.

    “Infrared light would be an excellent means of interstellar communication,” said Shelley Wright, an assistant professor of physics at UC San Diego (US) who led the development of the new instrument while at the U Toronto Dunlap Institute for Astronomy and Astrophysics (CA).

    Shelley Wright of UC San Diego (US) with NIROSETI, developed at the U Toronto Dunlap Institute for Astronomy and Astrophysics (CA), at the 1-meter Nickel Telescope at Lick Observatory at UC Santa Cruz.

    Wright worked on an earlier SETI project at Lick Observatory as a UC Santa Cruz undergraduate, when she built an optical instrument designed by University of California-Berkeley (US) researchers. The infrared project takes advantage of new technology not available for that first optical search.

    Infrared light would be a good way for extraterrestrials to get our attention here on Earth, since pulses from a powerful infrared laser could outshine a star, if only for a billionth of a second. Interstellar gas and dust are almost transparent to near infrared, so these signals can be seen from great distances. It also takes less energy to send information using infrared signals than with visible light.
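    The "outshine a star" claim can be checked with a back-of-the-envelope calculation: a nanosecond pulse has enormous peak power, and a diffraction-limited beam concentrates it into a tiny solid angle. Every number below (pulse energy, aperture, wavelength) is an illustrative assumption, not from the article:

    ```python
    # Order-of-magnitude sketch of the claim above: a nanosecond infrared
    # laser pulse, beamed through a large telescope, briefly outshines a
    # Sun-like star for a receiver looking back down the beam. All input
    # numbers are illustrative assumptions.
    import math

    PULSE_ENERGY_J = 1e6   # assumed 1 MJ pulse
    PULSE_LENGTH_S = 1e-9  # one nanosecond
    WAVELENGTH_M = 1.5e-6  # near-infrared
    APERTURE_M = 10.0      # assumed transmitting telescope diameter

    L_SUN_W = 3.8e26       # solar luminosity, radiated isotropically

    peak_power = PULSE_ENERGY_J / PULSE_LENGTH_S  # 1e15 W during the pulse

    # A diffraction-limited beam fills a solid angle of roughly
    # (wavelength / aperture)^2 steradians instead of the full 4*pi.
    beam_solid_angle = (WAVELENGTH_M / APERTURE_M) ** 2
    gain = 4.0 * math.pi / beam_solid_angle

    # Equivalent isotropic power inside the beam, compared to the star.
    ratio = (peak_power * gain) / L_SUN_W
    print(f"Peak power:       {peak_power:.1e} W")
    print(f"Directivity gain: {gain:.1e}")
    print(f"In-beam brightness vs. a Sun-like star: {ratio:.1e}x")
    ```

    With these assumed numbers the beam appears a few thousand times brighter than the star during the pulse, which is the essence of the argument in the text.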

    Frank Drake, professor emeritus of astronomy and astrophysics at UC Santa Cruz and director emeritus of the SETI Institute, said there are several additional advantages to a search in the infrared realm.

    Frank Drake with his Drake Equation. Credit Frank Drake.

    “The signals are so strong that we only need a small telescope to receive them. Smaller telescopes can offer more observational time, and that is good because we need to search many stars for a chance of success,” said Drake.

    The only downside is that extraterrestrials would need to be transmitting their signals in our direction, Drake said, though he sees this as a positive side to that limitation. “If we get a signal from someone who’s aiming for us, it could mean there’s altruism in the universe. I like that idea. If they want to be friendly, that’s who we will find.”

    Scientists have searched the skies for radio signals for more than 50 years and expanded their search into the optical realm more than a decade ago. The idea of searching in the infrared is not a new one, but instruments capable of capturing pulses of infrared light only recently became available.

    “We had to wait,” Wright said. “I spent eight years waiting and watching as new technology emerged.”

    Now that technology has caught up, the search will extend to stars thousands of light years away, rather than just hundreds. NIROSETI, or Near-Infrared Optical Search for Extraterrestrial Intelligence, could also uncover new information about the physical universe.

    “This is the first time Earthlings have looked at the universe at infrared wavelengths with nanosecond time scales,” said Dan Werthimer, UC Berkeley SETI Project Director. “The instrument could discover new astrophysical phenomena, or perhaps answer the question of whether we are alone.”

    NIROSETI will also gather more information than previous optical detectors by recording levels of light over time so that patterns can be analyzed for potential signs of other civilizations.

    “Searching for intelligent life in the universe is both thrilling and somewhat unorthodox,” said Claire Max, director of UC Observatories and professor of astronomy and astrophysics at UC Santa Cruz. “Lick Observatory has already been the site of several previous SETI searches, so this is a very exciting addition to the current research taking place.”

    NIROSETI will scan the skies several times a week on the Nickel 1-meter telescope at Lick Observatory, located on Mt. Hamilton east of San Jose.

     
  • richardmitnick 8:59 am on October 8, 2021 Permalink | Reply
    Tags: "Is New Finding an Asteroid or a Comet? It's Both", Astronomy, , , Cometary/Asteroid object (248370) 2005 QN173, ,   

    From Planetary Science Institute (US) : “Is New Finding an Asteroid or a Comet? It’s Both” 

    From Planetary Science Institute (US)

    Oct. 4, 2021
    MEDIA CONTACT:
    Alan Fischer
    Public Information Officer
    520-382-0411

    SCIENCE CONTACT:
    Henry Hsieh
    Senior Scientist
    808-729-4208

    1
    Composite image of (248370) 2005 QN173 taken with Palomar Observatory’s Hale Telescope in California on July 12, 2021.

    The head, or nucleus, of the comet is in the upper left corner, with the tail stretching down and to the right, getting progressively fainter farther from the nucleus. Stars in the field of view appear as short dotted lines due to the apparent motion of Solar System objects against background stars and the process of adding together multiple images to increase the visibility of the tail.

    Credit: Henry H. Hsieh (PSI), Jana Pittichová (NASA/JPL-Caltech).

    The newest known example of a rare type of object in the Solar System – a comet hidden among the main-belt asteroids – has been found and studied, according to a new paper by Planetary Science Institute Senior Scientist Henry Hsieh.

    Discovered to be active on July 7, 2021, by the Asteroid Terrestrial-Impact Last Alert System (ATLAS) survey, asteroid (248370) 2005 QN173 is just the eighth main-belt asteroid, out of more than half a million known main-belt asteroids, confirmed not only to be active, but to have been active on more than one occasion. “This behavior strongly indicates that its activity is due to the sublimation of icy material,” said Hsieh, lead author of the paper in The Astrophysical Journal Letters that he presented at a press conference today at the 53rd annual meeting of The American Astronomical Society (US)’s Division for Planetary Sciences. “As such, it is considered a so-called main-belt comet, and is one of just about 20 objects that have currently been confirmed or are suspected to be main-belt comets, including some that have only been observed to be active once so far.

    “248370 can be thought of as both an asteroid and a comet, or more specifically, a main-belt asteroid that has just recently been recognized to also be a comet. It fits the physical definitions of a comet, in that it is likely icy and is ejecting dust into space, even though it also has the orbit of an asteroid,” Hsieh said. “This duality and blurring of the boundary between what were previously thought to be two completely separate types of objects – asteroids and comets – is a key part of what makes these objects so interesting.”

    Hsieh found that the nucleus, the solid object at the “head” of the comet that is surrounded by a dust cloud, is 3.2 kilometers (2 miles) across; that the tail in July 2021 was more than 720,000 kilometers (450,000 miles) long, a little less than twice the distance from the Earth to the Moon; and that the tail at that time was just 1,400 kilometers (900 miles) wide. These dimensions mean that if the length of the tail were scaled to the length of a football field, the tail would be just 7 inches wide and the nucleus would be half a millimeter across.
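    The football-field comparison follows directly from the quoted dimensions. A short check, assuming a 100-yard (91.4 m) field; the astronomical figures are the ones stated in the article:

    ```python
    # Reproducing the football-field scaling from the paragraph above.
    # Assumes a 100-yard field; astronomical figures are as quoted.
    FIELD_M = 91.44                  # 100-yard football field, in meters

    tail_length_m = 720_000 * 1000   # 720,000 km
    tail_width_m = 1_400 * 1000      # 1,400 km
    nucleus_m = 3_200                # 3.2 km

    scale = FIELD_M / tail_length_m  # shrink the tail to one field length

    width_in = tail_width_m * scale / 0.0254  # meters -> inches
    nucleus_mm = nucleus_m * scale * 1000.0   # meters -> millimeters

    print(f"Scaled tail width: {width_in:.1f} inches")  # ~7 inches
    print(f"Scaled nucleus:    {nucleus_mm:.2f} mm")    # ~0.4 mm
    ```

    The arithmetic lands on about 7 inches and about 0.4 millimeters, matching the comparison in the text.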

    “This extremely narrow tail tells us that dust particles are barely floating off of the nucleus at extremely slow speeds and that the flow of escaping gas that normally lifts a comet’s dust into space is extremely weak. Such slow speeds would normally make it difficult for dust to escape from the gravity of the nucleus itself, so this suggests that something else might be helping the dust to escape. For example, the nucleus might be spinning fast enough that it’s helping to fling off into space dust that has been partially lifted by escaping gas. Further observations will be needed to confirm the rotation speed of the nucleus, though,” Hsieh said.

    “Cometary activity is generally thought to be caused by sublimation – the transformation from ice to gas – of icy material in a Solar System object, which means that most comets are found to come from the cold outer Solar System, beyond the orbit of Neptune, and spend most of their time there, with their highly elongated orbits only bringing them close to the Sun and the Earth for short periods at a time,” Hsieh said. “During those times when they are close enough to the Sun, they heat up and release gas and dust as a result of ice sublimation, producing the fuzzy appearance and often spectacular tails associated with comets.”

    By contrast, main-belt asteroids, which orbit between the orbits of Mars and Jupiter, are thought to have been in the warm inner Solar System where we see them today (inside the orbit of Jupiter) for the last 4.6 billion years. Any ice in these objects was expected to be long gone after being so close to the Sun for so long, meaning that cometary activity was not expected from any of them. However, a few rare objects that challenge this expectation, called main-belt comets, first identified as a new class of comets by Hsieh and David Jewitt in 2006, have been found over the last several years. These objects are interesting because a substantial part of Earth’s water is thought to have been delivered via impacts by asteroids from the main asteroid belt when the Earth was being formed. Given that the activity observed for these objects means they are likely to still contain ice, they offer a potential way to test that hypothesis, and to learn more about the origin of life on Earth through the abundance, distribution, and physical properties of icy objects in the inner Solar System.

    Hsieh’s work was funded by a grant to PSI from NASA’s Solar System Observations program (Grant 80NSSC19K0869). This work also made use of observations carried out under the Las Cumbres Observatory Outbursting Objects Key Project (LOOK) and the Faulkes Telescope Project’s Comet Chasers program, and from Lowell Observatory’s Lowell Discovery Telescope and Palomar Observatory’s Hale Telescope [above].

    See the full article here.


    The Planetary Science Institute (PSI) is a 501(c)(3) non-profit research institute based in Tucson, Arizona, focusing on planetary science.

    Founded in 1972, PSI is involved in many NASA missions, the study of Mars, asteroids, comets, interplanetary dust, the formation of the Solar System, extrasolar planets, the origin of life, and other scientific topics. It actively participated in the Dawn mission, which explored Vesta between 2011 and 2012, and Ceres between 2015 and 2018. It managed the spacecraft’s Gamma Ray and Neutron Detector, which mapped the surfaces of the two minor planets to determine how they were formed and evolved.

    PSI’s orbit@home was a distributed computing project through which the public can help in the search for near-Earth objects. The Institute is also involved in science education through school programs, popular science books and art.

     