Tagged: Astronomy

  • richardmitnick 7:28 pm on January 23, 2022 Permalink | Reply
    Tags: "A New Map of the Sun’s Local Bubble", Astronomy

    From The New York Times : “A New Map of the Sun’s Local Bubble” 

    From The New York Times

    Jan. 20, 2022
    Dennis Overbye

    A view of the center of the Milky Way from 2011. Scientists believe a series of supernova explosions 14 million years ago led to the creation of a 1,000-light-year-wide region bereft of the gas and dust needed to form new stars. Credit: The National Aeronautics and Space Administration(US).

    Just a bit too late for New Year celebrations, astronomers have discovered that the Milky Way galaxy, our home, is, like champagne, full of bubbles.

    As it happens, our solar system is passing through the center of one of these bubbles. Fourteen million years ago, according to the astronomers, a firecracker chain of supernova explosions drove off all the gas and dust from a region roughly 1,000 light-years wide, leaving it bereft of the material needed to produce new generations of stars.

    As a result, all the baby stars in our neighborhood can be found stuck on the edges of this bubble. There, the staccato force of a previous generation of exploding stars has pushed gas clouds together into forms dense enough to collapse under their own ponderous if diffuse gravity and condense enough to ignite as baby stars. Our sun, 4.5 billion years old, drifts through the middle of this space in a coterie of aged stars.

    “This is really an origin story,” Catherine Zucker said in a news release from The Harvard-Smithsonian Center for Astrophysics. “For the first time, we can explain how all nearby star formation began.”

    Dr. Zucker, now at The Space Telescope Science Institute (US), led a team that mapped what they call the Local Bubble in remarkable detail. They used data from a number of sources, particularly Gaia, a European spacecraft that has mapped and measured more than a billion stars, to pinpoint the locations of gas and dust clouds.

    European Space Agency [Agence spatiale européenne][Europäische Weltraumorganisation](EU) GAIA satellite.

    Last year, a group of scientists led by João Alves, an astrophysicist at The University of Vienna [Universität Wien](AT) announced the discovery of the Radcliffe Wave, an undulating string of dust and gas clouds 9,000 light-years long that might be the spine of our local arm of the galaxy. One section of the wave now appears to be part of our Local Bubble.

    An artist’s illustration of the Local Bubble with star formation occurring on the bubble’s surface. Credit: Leah Hustak (STScI)/CfA.

    The same group of scientists published their latest findings in Nature, along with an elaborate animated map of the Local Bubble and its highlights.

    New Local Bubble Map. Credit: CfA

    The results, the astronomers write, provide “robust observational support” for a long-held theory that supernova explosions are important in triggering star formation, perhaps by jostling gas and dust clouds into collapsing and starting on the long road to thermonuclear luminosity.

    Astronomers have long recognized the Local Bubble. What is new, said Alyssa Goodman, a member of the team also from the Harvard-Smithsonian Center for Astrophysics, is the observation that all local star-forming regions lie on the Local Bubble’s surface. Researchers previously lacked the tools to map gas and dust clouds in three dimensions. “Thanks to 3-D dust-mapping, now we do,” Dr. Goodman said.

    According to the team’s calculations, the Local Bubble began 14 million years ago with a massive supernova, the first of about 15, as massive stars died and blew up. Their blast waves cleared out the region. As a result, there are now no stars younger than 14 million years in the bubble, Dr. Goodman said.

    The bubble continues to grow at about 4 miles a second. “Still, more supernovae are expected to take place in the near future, like Antares, a red supergiant star near the edge of the bubble that could go any century now,” Dr. Alves said. “So the Local Bubble is not ‘done.’”
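    Those two figures, a roughly 1,000-light-year-wide bubble carved out over 14 million years, imply the shell has slowed considerably since the first blasts. A back-of-envelope sketch (the ~500-light-year radius and 14-million-year age are the article's round numbers, not precise measurements):

    ```python
    # Compare the Local Bubble's average expansion speed since formation
    # with the quoted present-day speed of 4 miles per second.
    LY_M = 9.461e15      # meters per light-year
    YR_S = 3.156e7       # seconds per year
    MILE_M = 1609.34     # meters per mile

    radius_m = 500 * LY_M       # half of the ~1,000-light-year diameter
    age_s = 14e6 * YR_S         # 14 million years in seconds

    avg_speed_kms = radius_m / age_s / 1000   # mean speed since the first blast
    now_speed_kms = 4 * MILE_M / 1000         # today's quoted speed

    print(round(avg_speed_kms, 1), round(now_speed_kms, 1))  # ~10.7 vs ~6.4 km/s
    ```

    The drop from roughly 10.7 km/s on average to about 6.4 km/s today is what you would expect as the blast-driven shell decelerates while sweeping up surrounding material.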

    With a score of well-known star-forming regions sitting on the surface of the bubble, the next generation of stars is securely on tap.

    The team plans to go on and map more bubbles in our Milky Way flute of champagne. There must be more, Dr. Goodman said, because it would be too much of a coincidence for the sun to be smack in the middle of the only one.

    The sun’s presence in this one is nonetheless coincidental, Dr. Alves said. Our star wandered into the region only 5 million years ago, long after most of the action, and will exit about 5 million years from now.

    The motions of the stars are more irregular than commonly portrayed, as they are bumped gravitationally by other stars, clouds and the like, Dr. Alves said.

    “The sun is moving at a significantly different velocity than the average of the stars and gas in the solar neighborhood,” he noted. This would enable it to catch up and pass — or be passed by — the bubble.

    “It was a revelation,” Dr. Goodman said, “how kooky the sun’s path really is compared with a simple circle.”

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

  • richardmitnick 10:18 am on January 21, 2022 Permalink | Reply
    Tags: "Snapshot-Sagittarius A* gives its compliments to the chef", Astronomy

    From Astronomy Magazine : “Snapshot-Sagittarius A* gives its compliments to the chef” 

    From Astronomy Magazine

    January 7, 2022
    Caitlyn Buongiorno

    Credit: Gerald Cecil/The University of North Carolina-Chapel Hill (US)/The National Aeronautics and Space Agency(US)/The European Space Agency [La Agencia Espacial Europea] [Agence spatiale européenne][Europäische Weltraumorganisation](EU); Image Processing: Joseph DePasquale (The Space Telescope Science Institute (US)).

    Like any happy eater, our Milky Way’s supermassive black hole, Sagittarius A* (Sgr A*), belches every time it consumes a particularly hefty meal. The resulting small outbursts, or mini-jets, can be difficult to spot outright, but may leave traces in the surrounding gas.

    SCIENCE: Gerald Cecil (UNC-Chapel Hill)/NASA, ESA; IMAGE PROCESSING: Joseph DePasquale (STScI).

    Such evidence of a blowtorch-like jet released just a few thousand years ago was outlined in a paper published Dec. 6 in The Astrophysical Journal. Though the jet wasn’t spotted directly, the Hubble Space Telescope saw indirect evidence of the jet’s material pushing on a nearby hydrogen cloud.

    National Aeronautics and Space Administration(US)/European Space Agency [Agence spatiale européenne] [Europäische Weltraumorganisation](EU) Hubble Space Telescope.

    Another lingering jet was previously spotted in 2013 by NASA’s Chandra X-ray Observatory and the Karl G. Jansky Very Large Array.

    The National Aeronautics and Space Administration Chandra X-ray telescope(US).

    National Radio Astronomy Observatory(US)Karl G Jansky Very Large Array located in central New Mexico on the Plains of San Agustin, between the towns of Magdalena and Datil, ~50 miles (80 km) west of Socorro. The VLA comprises twenty-eight 25-meter radio telescopes.

    Both jets clearly indicate that the 4.1-million-solar-mass Sgr A* is far from a sleeping giant.
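    To put that mass in spatial terms, a minimal sketch computing the Schwarzschild radius r_s = 2GM/c² for a 4.1-million-solar-mass black hole (the constants below are standard values, not figures from the article):

    ```python
    # Schwarzschild radius of Sgr A*: r_s = 2GM/c^2.
    G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
    c = 2.998e8        # speed of light, m/s
    M_SUN = 1.989e30   # solar mass, kg
    AU = 1.496e11      # astronomical unit, m

    M = 4.1e6 * M_SUN              # Sgr A*'s mass
    r_s = 2 * G * M / c**2         # event-horizon radius, meters

    print(r_s / AU)  # ~0.08 AU: the horizon would fit well inside Mercury's orbit
    ```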

    See the full article here.



    Astronomy is a magazine about the science and hobby of astronomy. Based near Milwaukee in Waukesha, Wisconsin, it is produced by Kalmbach Publishing. Astronomy’s readers include those interested in astronomy and those who want to know about sky events, observing techniques, astrophotography, and amateur astronomy in general.

    Astronomy was founded in 1973 by Stephen A. Walther, a graduate of The University of Wisconsin–Stevens Point (US) and amateur astronomer. The first issue, August 1973, consisted of 48 pages with five feature articles and information about what to see in the sky that month. Issues contained astrophotos and illustrations created by astronomical artists. Walther had worked part time as a planetarium lecturer at The University of Wisconsin–Milwaukee (US) and had developed an interest in photographing constellations at an early age. Although he had been obsessed with astronomy since childhood, he did so poorly in mathematics that his mother despaired that he would ever be able to earn a living. He nevertheless graduated in journalism from The University of Wisconsin–Stevens Point, and as a senior class project he created a business plan for a magazine for amateur astronomers. With the help of his brother David, he was able to bring the magazine to fruition. He died in 1977.

  • richardmitnick 9:34 am on January 21, 2022 Permalink | Reply
    Tags: "Spend some time observing in Auriga - Photo Essay", Astronomy

    From Astronomy Magazine : “Spend some time observing in Auriga – Photo Essay” 

    From Astronomy Magazine

    January 11, 2022
    Michael E. Bakich

    With three Messier objects and loads of other bright targets, the Charioteer has a lot to offer.

    Credit: Richard Talcott and Roen Kelly/Astronomy.

    The constellation Auriga (pronounced or-EYE-guh), the Charioteer, is a star pattern that has been known by this name for several thousand years. It is easy to recognize primarily because of its brightest star, Capella (Alpha [α] Aurigae). This luminary is the sixth-brightest nighttime star and shines with an intense yellow light. The constellation’s Beta star, magnitude 1.9 Menkalinan, is 40th brightest.

    The Charioteer is visible in the evening from mid-autumn through winter in the Northern Hemisphere. Its center lies at R.A. 6h01m and Dec. 42° north. Auriga ranks 21st in size out of the 88 constellations, covering 657.44 square degrees (1.59 percent) of the sky. Its size is no guarantee of prominence, however: it sits mid-ladder (43rd) among the constellations in terms of overall brightness.
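    The quoted percentage follows directly from the total area of the celestial sphere, about 41,253 square degrees. A quick check:

    ```python
    import math

    # Total sky area: 4*pi steradians converted to square degrees (~41,253).
    TOTAL_SKY_SQ_DEG = 4 * math.pi * (180 / math.pi) ** 2

    auriga_sq_deg = 657.44
    fraction = auriga_sq_deg / TOTAL_SKY_SQ_DEG
    print(f"{fraction:.2%}")  # 1.59%
    ```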

    The best date each year to see Auriga is December 21, when it stands opposite the Sun in the sky and reaches its highest point at local midnight. With respect to visibility, anyone living north of latitude 34° south can see the entire figure at some time during the year. And it’s completely invisible only to those who live at latitudes south of 62° south.
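    The latitude limits above follow from a simple rule of thumb: an object at declination dec rises, at least briefly, for any observer at latitude greater than dec − 90°. The declination boundaries used below (roughly +28° to +56°) are my estimate of Auriga's extent, not figures from the article:

    ```python
    # Reproduce the article's visibility limits from Auriga's declination span.
    # Assumed (approximate) boundaries: Dec +28 to +56 degrees.
    dec_north, dec_south = 56.0, 28.0

    # The whole figure is visible at some time of year only where its
    # northernmost edge still rises: latitude > dec_north - 90.
    fully_visible_limit = dec_north - 90.0    # -34 -> north of latitude 34 deg S

    # The figure is completely invisible where even its southernmost edge
    # never rises: latitude < dec_south - 90.
    completely_invisible_limit = dec_south - 90.0   # -62 -> south of 62 deg S

    print(fully_visible_limit, completely_invisible_limit)
    ```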

    Auriga contains three Messier objects (all open clusters) and several other open clusters and emission nebulae. Because it lies along the Milky Way, it doesn’t contain any galaxies. As you can see, however, lots of targets lie within its borders for you to point a telescope at. Good luck!

    Auriga targets

    Credit: Anthony Ayiomamitis.

    Open cluster NGC 2281 glows at magnitude 5.4 and measures 14′ across. It lies 0.8° south-southwest of magnitude 5.0 Psi⁷ (ψ⁷) Aurigae. Through a 4-inch scope at 100x, you’ll spot two dozen stars. Four stars forming a parallelogram sit at the center of the cluster.

    Credit: Jaspal Chadha.

    NGC 1664 is an attractive open cluster 2° west of magnitude 3.0 Epsilon (ε) Aurigae. It glows at magnitude 7.6 and spans 18′. A 4-inch scope at 100x reveals three dozen stars. The background star field is rich, but you’ll have no trouble picking out the cluster.

    Credit: Martin C. Germano.

    NGC 1778 is a magnitude 7.7 open cluster with a diameter of 8′. You’ll find it 2° east-southeast of magnitude 5.1 Omega (ω) Aurigae. Through a 4-inch scope, you’ll see two dozen stars unevenly spread across this cluster’s face. Double the aperture to 8 inches, and you’ll raise that star count to 50.

    Credit: Martin C. Germano.

    Barnard 29 is a dark nebula that lies 2.4° southeast of magnitude 2.7 Iota (ι) Aurigae. Through a 12-inch scope, B29 appears as a gray, mottled region that blends gradually into its starry surroundings. The darkest area appears 15′ across.

    Credit: Alistair Symon.

    The Flaming Star Nebula (IC 405) appears as a dim 30′ by 20′ wisp of light. To observe it, first find AE Aurigae, which lies 4.2° east-northeast of Iota. Through a 6-inch scope, the nebula appears triangular.

    Credit: Martin C. Germano.

    Open cluster NGC 1857 sits 0.8° south-southeast of magnitude 4.7 Lambda (λ) Aurigae. It glows at magnitude 7.0 and measures 5′. Through an 8-inch scope, you’ll see 25 stars around 13th magnitude. The exception is SAO 57903, a magnitude 7.4 yellow star at the center.

    Credit: Mark Hanson.

    IC 410 is a large (40′ by 30′) emission nebula 2.4° west-northwest of magnitude 4.7 Chi (χ) Aurigae. The nebulosity glows brightest in an area 5′ in diameter on the northwestern edge. Use a 12-inch scope with an Oxygen-III filter and this object will knock your socks off.

    Credit: Martin C. Germano.

    NGC 1907 is a magnitude 8.2 open cluster that spans 6′. A 4-inch scope at 100x shows about a dozen stars. Use a low-power eyepiece and you’ll sweep up an even-brighter open cluster: M38, 0.5° to the north-northeast.

    Credit: Anthony Ayiomamitis.

    The Starfish Cluster (Messier 38) is the westernmost and faintest (magnitude 6.4) of the three Messier open clusters in this constellation. A 4-inch scope will reveal three dozen stars in an area 20′ across.

    Al and Andy Ferayomi/Adam Block/The National Optical Astronomy Observatory (US)/The Association of Universities for Research in Astronomy (AURA)(US)/The National Science Foundation (US).

    Emission nebula NGC 1931 sits 0.8° east-southeast of magnitude 5.1 Phi (ϕ) Aurigae. An 8-inch scope at 200x shows the nebula, which spans 4′. It orients northeast to southwest and shows non-uniform brightness across its face.

    Credit: Anthony Ayiomamitis.

    The Pinwheel Cluster (Messier 36) is the least spectacular of the Messier trio in Auriga. At magnitude 6.0, however, it still outshines 99.99 percent of the sky’s star clusters. Through a 4-inch scope, you’ll see several dozen stars strewn across an area 12′ wide.

    Credit: Anthony Ayiomamitis.

    The Salt and Pepper Cluster (Messier 37) displays an even distribution of stars — a rarity in open clusters. A 3-inch scope reveals 50 stars. Through a 10-inch scope, you’ll count 200, and a 16-inch will reveal 500. M37 glows at magnitude 5.6 and is 20′ across.

    Credit: Martin C. Germano.

    NGC 2126 lies midway between magnitude 1.9 Menkalinan (Beta [β] Aurigae) and magnitude 3.7 Delta (δ) Aurigae. It glows at magnitude 10.2 and spans 6′. Through a 6-inch telescope, you’ll see about 20 stars. The magnitude 6.0 star SAO 40801 lies 3′ northeast of the cluster.

    See the full article here.




  • richardmitnick 9:17 am on January 19, 2022 Permalink | Reply
    Tags: "Is life possible on rogue planets and moons?", A hydrogen-rich atmosphere can not only prevent free-floating planets from losing their internal radioactive heat to space but could also keep surface temperatures warm., Astronomy, Earth-like planets are not the only places where life could form., Just like Saturn’s moon Titan has a thick atmosphere a sufficiently massive moon of a free-floating planet could have one too., Microorganisms can hypothetically survive on ocean floors of Enceladus-like icy moons around free-floating planets., Oceans on worlds with no Suns but moons, Planets may have been ejected out of our solar system too over 4 billion years ago and now orbit our galaxy as dark worlds., Scientists think planets that don’t orbit any star-called free-floating planets or rogue planets-can harbor life too., Simulated hydrogen-rich environments in labs show that certain terrestrial microorganisms can thrive under such conditions., Starless free-floating worlds might represent the most common habitable real estate of the universe.

    From The Planetary Society (US): “Is life possible on rogue planets and moons?” 


    From The Planetary Society (US)

    Jan 18, 2022
    Jatan Mehta

    An artist’s illustration of a Jupiter-like planet floating freely in space without a star. Image: The National Aeronautics and Space Administration(US).

    Starless free-floating worlds might represent the most common habitable real estate of the universe.

    Our search for planets around other stars in our galaxy has yielded more than 4,500 worlds. Quite a few of these exoplanets seem to be Earth-like, with surface conditions that could sustain liquid water and life as we know it.

    But even as next generation telescopes aim to detect gases on such planets indicative of life, our search for such habitable worlds remains somewhat limited. Simply put: Earth-like planets are not the only places where life could form.

    We know from our own solar system that icy moons orbiting giant planets far away from the Sun — such as Europa, Ganymede and Enceladus — can have underground, habitable oceans too. Their liquid water isn’t due to the Sun’s heat but rather warmed by friction between parts of their interiors being tugged by their planets’ gravity. If sunlight, a surface and an atmosphere aren’t necessary to make a world habitable, then why confine our search for life to Earth-like worlds that orbit stars?

    Scientists think planets that don’t orbit any star, called free-floating planets or rogue planets, can harbor life too. These planets originally form around stars like any other but get kicked out of their system at some point due to gravitational effects of giant planets within.

    Planets may have been ejected out of our solar system too over 4 billion years ago and now orbit our galaxy as dark worlds. Without a star, how can these dark worlds conceivably host life as we know it? Our exploration of the solar system combined with two decades of exoplanet research tells us there are several possibilities.

    Oceans on worlds with no Suns but moons

    Getting kicked out of a star system early on does have at least one advantage: strong ultraviolet light from young stars can’t strip away hydrogen atmospheres of these planets, which helps retain heat.

    A 1999 research paper [Nature] suggests that a hydrogen-rich atmosphere can not only prevent free-floating planets from losing their internal radioactive heat to space but could also keep surface temperatures warm enough to sustain Earth-like oceans. Simulated hydrogen-rich environments in labs show that certain terrestrial microorganisms can thrive under such conditions. That said, life on free-floating worlds would still have to miraculously emerge using the planet’s minuscule internal energy, compared to the more than 99% of Earth’s energy that comes from sunlight.

    Hypothetically, if a free-floating planet has a large enough moon, it could further heat the planet using tidal mechanisms, similar to our Moon and Earth. When the Moon formed more than 4.4 billion years ago, it was about 15 times closer to us than it is today. It induced such a strong tidal heating that scientists think the Moon may have played a key role in making the early Earth habitable. Even if such heating lasts only a few hundred million years, it could provide a richer source of energy than the free-floating planet’s own heat to keep an ocean warm, initiate complex geology and possibly develop microbial life.

    But how likely is it for free-floating planets to have moons in the first place?

    “There’s nothing theoretically stopping us from having a Moon-sized satellite around a free-floating planet,” said Nick Oberg, a researcher at The Kapteyn Astronomical Institute – University of Groningen [Rijksuniversiteit Groningen] (NL) and The Delft University of Technology [Technische Universiteit Delft](NL) studying formation of Jupiter’s moons. “Orbital simulations show that more than 47% of moons can remain bound to exiled gas giant planets.” Likewise, simulations with ejected Earth-mass planets show that more than 4% of them retain their Moon-sized satellite.

    Habitable moons around starless worlds

    In addition to exiled free-floating planets being able to retain their moons, it’s also possible for free-floating planets and their satellites to coalesce directly from clouds of gas and dust in interstellar space just like stars do. We have already discovered a free-floating planet candidate surrounded by a disk from which moons like those around Jupiter could form.

    “Planets with multiple satellites, such as the Galilean moons of Jupiter, have even better chances of retaining those moons after being ejected,” said Patricio Javier Ávila, a Chilean researcher of free-floating planets at The University of Concepción [Universidad de Concepción](CL). Just as tidal heating from Jupiter and Saturn creates underground oceans on some of their icy moons, such satellites around free-floating planets could have subsurface oceans too [Astronomy and Astrophysics].

    “If a free-floating planet retains multiple moons and their elliptical orbits, tidal heating could be sustained and with it the subsurface oceans,” Oberg said.

    Europa’s subsurface ocean cutaway An artist’s illustration of an underground liquid water ocean beneath the thick icy crust of Jupiter’s moon Europa. A similar ocean exists on Saturn’s moon Enceladus too.Image: NASA.

    When NASA’s Cassini spacecraft flew through water plumes erupting from Saturn’s icy moon Enceladus — sourced from its underground ocean — it found a variety of organic molecules, which are building blocks of life.

    National Aeronautics and Space Administration(US)/European Space Agency [Agence spatiale européenne][Europäische Weltraumorganisation](EU)/ASI Italian Space Agency [Agenzia Spaziale Italiana](IT) Cassini Spacecraft.

    Cassini’s observations suggest that Enceladus’ ocean seems to have potentially habitable hydrothermal vents similar to those found in the deepest, darkest parts of Earth’s oceans. Not only do various microorganisms like methanogens thrive near such terrestrial vents, scientists think this is how life on Earth could’ve started in the first place [Nature Reviews Microbiology].

    Microorganisms can hypothetically survive [Nature Communications] on ocean floors of Enceladus-like icy moons around free-floating planets too, well protected from asteroid impacts and harmful radiation by thick icy crusts above.

    This graphic illustrates hydrothermal vents on Enceladus’ ocean floor that could provide habitable environments for microbial life to form and thrive.Image: NASA-JPL/Caltech (US).

    There’s another possibility, though. Just as Saturn’s moon Titan has a thick atmosphere, a sufficiently massive moon of a free-floating planet could have one too. Coupled with tidal heating, Earth-mass exomoons of such planets could have temperatures high enough to sustain oceans on their surfaces for hundreds of millions of years, and be favorable to microbial life.

    Okay, but can we even detect starless worlds?

    For all their potential to host life, it’s incredibly difficult to detect dark, free-floating worlds in our galaxy using traditional exoplanet-catching methods. It’s hard enough already to find minuscule planets even when they have stars!

    Even though free-floating planets should be common, and at least one of them might lie within a mere 10 light-years of us (astronomically speaking), we haven’t found any yet.

    “It’s challenging to verify these objects as true free-floating planets because their mass can be so difficult to accurately estimate,” said Oberg.

    In 2013, scientists directly imaged [The Astrophysical Journal] a Jupiter-like free-floating planet candidate 80 light years away, but it’s hard to tell it apart from a class of objects called brown dwarfs.

    Artist’s concept of a brown dwarf [not quite a star]. NASA/JPL-Caltech.

    Example of direct imaging-This false-color composite image traces the motion of the planet Fomalhaut b, a world captured by direct imaging. Credit: The National Aeronautics and Space Administration(US), The European Space Agency [Agence spatiale européenne][Europäische Weltraumorganisation](EU), and P. Kalas, The University of California-Berkeley (US) and The SETI Institute (US).

    These are more massive than Jupiter but are called “failed stars” because they aren’t massive enough to fuse hydrogen in their cores.

    Direct image of the free-floating planet candidate PSO J318.5-22, visible as the dot with the reddish hue. Image: N. Metcalfe / Pan-STARRS 1.

    U Hawaii (US) Pan-STARRS1 (PS1) Panoramic Survey Telescope and Rapid Response System is a 1.8-meter diameter telescope situated at Haleakala Observatories near the summit of Haleakala, altitude 10,023 ft (3,055 m) on the Island of Maui, Hawaii, USA. It is equipped with the world’s largest digital camera, with almost 1.4 billion pixels.

    Fourteen more free-floating candidates have been detected using a technique called “gravitational microlensing”, wherein a planet’s gravitational field bends light from objects behind them and magnifies their view like a fish bowl. These are difficult to confirm too.
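    The "fish bowl" magnification operates on a tiny angular scale set by the Einstein radius, θ_E = sqrt((4GM/c²)·(D_s − D_l)/(D_l·D_s)). A minimal sketch; the Jupiter-mass lens at 4 kpc with a source at 8 kpc is an illustrative assumption, not a case from the article:

    ```python
    import math

    # Einstein radius of a point lens: theta_E = sqrt(4GM/c^2 * (Ds-Dl)/(Dl*Ds)).
    G, c = 6.674e-11, 2.998e8        # SI units
    M_JUP = 1.898e27                 # Jupiter mass, kg
    KPC = 3.0857e19                  # meters per kiloparsec
    RAD_TO_UAS = 206264.8 * 1e6      # radians -> microarcseconds

    M = 1.0 * M_JUP                  # assumed lens: a Jupiter-mass rogue planet
    D_l, D_s = 4 * KPC, 8 * KPC      # assumed lens and source distances

    theta_E = math.sqrt(4 * G * M / c**2 * (D_s - D_l) / (D_l * D_s))
    print(theta_E * RAD_TO_UAS)      # ~30 microarcseconds
    ```

    Deflections this small are why microlensing surveys detect the temporary brightening of the background star rather than resolving the lens itself.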

    Gravitational microlensing: S. Liebes, Physical Review 133 (1964): B835.

    “Gravitational microlensing detections are one-time events, making them harder to follow up on,” Ávila said. “It’s also difficult to distinguish a light brown dwarf from a free-floating planet as the technique favors more massive objects.”

    Nevertheless, brown dwarfs could host habitable moons in the same way free-floating planets do. Brown dwarfs have been observed too so there’s some hope.

    Interestingly, moons of free-floating worlds may be relatively easier to detect than their parent objects. Although we haven’t yet found an exomoon around a typical exoplanet with a host star, we might spot a free-floating object’s moon first, because there would be no noise from a glaring star when the moon passes in front of the planet from our view.

    Next generation space telescopes, such as NASA’s recently launched JWST and ESA’s upcoming PLATO telescope, could detect Moon- and Titan-sized satellites orbiting free-floating planets and brown dwarfs.

    National Aeronautics Space Agency(US)/European Space Agency [Agence spatiale européenne][Europäische Weltraumorganisation](EU)/Canadian Space Agency [Agence Spatiale Canadienne](CA) James Webb Infrared Space Telescope(US), annotated. Originally scheduled for launch in 2011, delayed to October 2021, and finally launched December 25, 2021.

    ESA PLATO spacecraft depiction

    Wide-field surveys by NASA’s upcoming Nancy Grace Roman Space Telescope should increase our chances even more, as should better gravitational microlensing surveys in the future.

    National Aeronautics and Space Administration(US) Nancy Grace Roman Space Telescope [WFIRST] depiction.

    Detecting exomoons and the nature of their orbits will allow scientists to determine properties of their parent objects.

    Even if we discover no or few exomoons around free-floating worlds, next generation telescopes will still advance our understanding of moons in general.

    “JWST and future telescopes will vastly increase our understanding of moon-forming disks around regular exoplanets, which are not only easier to spot and study than exomoons but have already been detected,” said Jesper Tjoa, a researcher at the University of Heidelberg. An example of such a system is the moon-forming disk around the young Jupiter-like planet PDS 70c nearly 400 light years away.

    A moon-forming disk: wide and close-up views of the moon-forming disk surrounding PDS 70c, a young Jupiter-like planet nearly 400 light-years away, as seen with the ALMA telescope on Earth. Image: The Atacama Large Millimeter/submillimeter Array (CL) / The European Southern Observatory [Observatoire européen austral][Europäische Südsternwarte] (EU)(CL).

    European Southern Observatory/National Radio Astronomy Observatory(US)/National Astronomical Observatory of Japan(JP) ALMA Observatory (CL).

    European Southern Observatory(EU) Very Large Telescope at Cerro Paranal in the Atacama Desert: ANTU (UT1; The Sun), KUEYEN (UT2; The Moon), MELIPAL (UT3; The Southern Cross), and YEPUN (UT4; Venus, as evening star). Elevation 2,635 m (8,645 ft). Credit: J.L. Dauvergne & G. Hüdepohl/atacamaphoto.

    Finding exomoons across the galaxy and understanding how they form and evolve would provide us insights into how moons in our solar system formed, and how common habitable moons are.

    The habitable worlds next door

    The possibility that icy moons of free-floating planets, or of exoplanets with host stars, could harbor life is tantalizing, and ties back to our solar system. Even if, with great difficulty, we do find habitable exomoons, there’s no way for us to be sure whether they host life. The only place for us to definitively confirm alien exomoon life is our solar system, where we can send spacecraft to measure things with precision and even fetch samples. In fact, studying the icy moons of our solar system with spacecraft is what helps us model the possibilities of habitable exomoons.

    This is precisely why some of the biggest planetary science missions launching this decade, like JUICE and Europa Clipper, are dedicated to finding if underground oceans of Jupiter’s icy moons are habitable.

    European Space Agency [Agence spatiale européenne](EU) Juice spacecraft depiction.

    European Space Agency [Agence spatiale européenne][Europäische Weltraumorganisation](EU)Juice Schematic.

    NASA Europa Clipper depiction.
    NASA/Europa Clipper annotated.

    Future mission concepts such as the Enceladus Life Finder would look for direct signs of life in Enceladus’ water plumes. NASA is launching the Dragonfly mission later in the decade to explore Titan’s surface to understand possible starting ingredients for life on early Earth and elsewhere.

    See the full article here.




    In 1980, Carl Sagan, Louis Friedman, and Bruce Murray founded The Planetary Society (US) . They saw that there was enormous public interest in space, but that this was not reflected in government, as NASA’s budget was cut again and again.

    Today, The Planetary Society (US) continues this work, under the leadership of CEO Bill Nye, as the world’s largest and most influential non-profit space organization. The organization is supported by over 50,000 members in over 100 countries, and by hundreds of volunteers around the world.

    Our mission is to empower the world’s citizens to advance space science and exploration. We advocate for space and planetary science funding in government, inspire and educate people around the world, and develop and fund groundbreaking space science and technology.

    We introduce people to the wonders of the cosmos, bridging the gap between the scientific community and the general public to inspire and educate people from all walks of life.

    We give every citizen of the planet the opportunity to make their voices heard in government and effect real change in support of space exploration.

    And we bring ordinary people directly to the frontier of exploration as we crowdfund innovative and exciting space technologies.

  • richardmitnick 1:54 pm on January 18, 2022 Permalink | Reply
    Tags: "There are 40 billion billions of Black Holes in the Universe!", A remarkable amount (around 1% of the overall ordinary (baryonic) matter of the Universe) is locked up in stellar mass black holes., Astronomy, How many black holes are out there in the Universe? This is one of the most relevant and pressing questions in modern astrophysics and cosmology., The International School for Advanced Studies [Scuola Internazionale Superiore di Studi Avanzati](IT), With a new computational approach SISSA researchers have been able to make the fascinating calculation.

    From The International School for Advanced Studies [Scuola Internazionale Superiore di Studi Avanzati](IT): “There are 40 billion billions of Black Holes in the Universe!” 


    From The International School for Advanced Studies [Scuola Internazionale Superiore di Studi Avanzati](IT)


    Nico Pitrelli
    T +39 040 3787462
    M +39 339 1337950

    Donato Ramani
    T +39 040 3787513
    M +39 342 8022237

    There are 40 billion billions of Black Holes in the Universe!

    Image by Pixabay.

    With a new computational approach, SISSA researchers have been able to make the fascinating calculation. Moreover, according to their work, around 1% of the overall ordinary (baryonic) matter is locked up in stellar mass black holes. Their results have just been published in The Astrophysical Journal.

    How many black holes are out there in the Universe? This is one of the most relevant and pressing questions in modern astrophysics and cosmology. The intriguing issue has recently been addressed by the SISSA Ph.D. student Alex Sicilia, supervised by Prof. Andrea Lapi and Dr. Lumen Boco, together with other collaborators from SISSA and from other national and international institutions. In the first paper of a series, just published in The Astrophysical Journal, the authors investigate the demographics of stellar mass black holes: black holes with masses between a few and several hundred solar masses, which originate at the end of the lives of massive stars. According to the new research, a remarkable amount, around 1%, of the overall ordinary (baryonic) matter of the Universe is locked up in stellar mass black holes. Astonishingly, the researchers have found that the number of black holes within the observable Universe (a sphere of diameter around 90 billion light-years) at the present time is about 40 billion billion (i.e., about 40 × 10^18, or a 4 followed by 19 zeros!).
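As a rough consistency check (a back-of-envelope sketch of my own, not the SISSA computation, assuming standard cosmological values H0 ≈ 70 km/s/Mpc and Ω_b ≈ 0.049), dividing 1% of the observable Universe's baryonic mass by 40 × 10^18 black holes should land on a typical stellar-mass black hole:

```python
import math

# Assumed parameters (illustrative textbook values, not from the paper)
H0 = 70e3 / 3.086e22        # Hubble constant in s^-1 (70 km/s/Mpc)
G = 6.674e-11               # gravitational constant, m^3 kg^-1 s^-2
OMEGA_B = 0.049             # baryon density parameter
LY = 9.461e15               # one light-year in meters
M_SUN = 1.989e30            # solar mass in kg

# Critical density, then baryonic mass of the observable Universe
# (a sphere of diameter ~90 billion light-years, as quoted above)
rho_crit = 3 * H0**2 / (8 * math.pi * G)
radius = 45e9 * LY
volume = 4 / 3 * math.pi * radius**3
baryonic_mass = OMEGA_B * rho_crit * volume

# The article's two headline numbers: N ~ 40 x 10^18 black holes
# locking up ~1% of baryonic matter. Their ratio should be a
# plausible stellar-mass black hole.
n_bh = 40e18
mean_bh_mass = 0.01 * baryonic_mass / n_bh / M_SUN
print(f"~{mean_bh_mass:.0f} solar masses per black hole")
```

The result comes out at a few tens of solar masses, squarely in the stellar-mass range, so the two quoted figures are mutually consistent.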

    A new method to calculate the number of black holes

    As the authors of the research explain: “This important result has been obtained thanks to an original approach which combines the state-of-the-art stellar and binary evolution code SEVN, developed by SISSA researcher Dr. Mario Spera, with empirical prescriptions for relevant physical properties of galaxies, especially the rate of star formation, the amount of stellar mass and the metallicity of the interstellar medium (which are all important elements to define the number and the masses of stellar black holes).” Exploiting these crucial ingredients in a self-consistent way, the researchers have then derived the number of stellar black holes and their mass distribution across the whole history of the Universe. Alex Sicilia, first author of the study, comments: “The innovative character of this work is in the coupling of a detailed model of stellar and binary evolution with advanced recipes for star formation and metal enrichment in individual galaxies. This is one of the first, and one of the most robust, ab initio computations of the stellar black hole mass function across cosmic history.”

    What’s the origin of the most massive stellar black holes?

    The estimate of the number of black holes in the observable Universe is not the only issue investigated by the scientists in this piece of research. In collaboration with Dr. Ugo Di Carlo and Prof. Michela Mapelli from The University of Padua [Università degli Studi di Padova](IT), they have also explored the various formation channels for black holes of different masses, such as isolated stars, binary systems and stellar clusters. According to their work, the most massive stellar black holes originate mainly from dynamical events in stellar clusters. Specifically, the researchers have shown that such events are required to explain the mass function of coalescing black holes as estimated from gravitational wave observations by the LIGO/Virgo collaboration.

    Caltech/MIT Advanced LIGO at Hanford, WA (US) and Livingston, LA (US), and the VIRGO gravitational wave interferometer near Pisa (IT).

    Lumen Boco, co-author of the paper, comments: “Our work provides a robust theory for the generation of light seeds for (super)massive black holes at high redshift, and can constitute a starting point to investigate the origin of ‘heavy seeds’, which we will pursue in a forthcoming paper.”

    A multidisciplinary work carried out in the context of “BiD4BESt – Big Data Application for Black Hole Evolution Studies”

    Prof. Andrea Lapi, Sicilia’s supervisor and coordinator of the Ph.D. in Astrophysics and Cosmology at SISSA, adds: “This research is really multidisciplinary, covering aspects of, and requiring expertise in, stellar astrophysics, galaxy formation and evolution, and gravitational wave and multi-messenger astrophysics; as such it needs collaborative efforts from various members of the SISSA Astrophysics and Cosmology group, and strong networking with external collaborators.”

    Alex Sicilia’s work takes place in the context of the prestigious Innovative Training Network Project “BiD4BESt – Big Data Application for Black Hole Evolution Studies”, co-PIed by Prof. Andrea Lapi from SISSA (H2020-MSCA-ITN-2019 Project 860744), which has been funded by the European Union with about 3.5 million Euros overall. It involves several academic and industrial partners, providing Ph.D. training to 13 early-stage researchers in the area of black hole formation and evolution, exploiting advanced data science techniques.

    See the full article here.



    International School for Advanced Studies, Trieste. Credit: Mike Peel (http://www.mikepeel.net)

    The International School for Advanced Studies [Scuola Internazionale Superiore di Studi Avanzati] (IT) (SISSA) is an international, state-supported, post-graduate-education and research institute, located in Trieste, Italy.

    SISSA is active in the fields of mathematics, physics, and neuroscience, offering both undergraduate and post-graduate courses. Each year, about 70 PhD students are admitted to SISSA based on their scientific qualifications. SISSA also runs master’s programs in the same areas, in collaboration with both Italian and other European universities.


    SISSA was founded in 1978, as a part of the reconstruction following the Friuli earthquake of 1976. Although the city of Trieste itself did not suffer any damage, physicist Paolo Budinich asked the Italian government to include among the interventions the founding of a new post-graduate teaching and research institute, modeled on the Scuola Normale Superiore di Pisa (IT), and obtained its approval. The school became operative with a PhD course in theoretical physics, and Budinich himself was appointed as general director. In 1986, Budinich handed the directorship to Daniele Amati, who at the time was at the head of the theoretical division at The European Organization for Nuclear Research [Organisation européenne pour la recherche nucléaire](CH)[CERN]. Under his leadership, SISSA expanded its teaching and research activity towards the field of neuroscience, and instituted a new interdisciplinary laboratory aiming at connecting humanities and scientific studies. From 2001 to 2004, the director was the Italian geneticist Edoardo Boncinelli, who fostered the development of the existing research areas. Other directors were appointed in the following years, which saw the strengthening of SISSA’s collaboration with other Italian and European universities in offering master’s degree programs in the three areas of the School (mathematics, physics and neuroscience). The physicist Stefano Ruffo, the current director, was appointed in 2015. He signed a partnership with the International Centre for Genetic Engineering and Biotechnology to set up a new PhD program in Molecular Biology, with teaching activity organized by both institutions.


    SISSA houses the following research groups:

    Astroparticle Physics
    Condensed Matter
    Molecular and Statistical Biophysics
    Statistical Physics
    Theoretical Particle Physics
    Cognitive Neuroscience
    Molecular Biology
    Applied Mathematics
    Mathematical Analysis
    Mathematical Physics

    In addition, there is the Interdisciplinary Laboratory for Natural and Humanistic Sciences (now LISNU – Laboratorio Interdisciplinare Scienze Naturali e Umanistiche), which is endowed with the task of making connections between science, the humanities, and the public. It currently offers a course in scientific communication and scientific journalism.

    SISSA also enjoys special teaching and scientific links with the International Centre for Theoretical Physics, the International Centre for Genetic Engineering and Biotechnology and the Elettra Synchrotron Light Laboratory.

  • richardmitnick 10:46 am on January 16, 2022 Permalink | Reply
    Tags: "Too much heavy metal stops stars producing more", Astronomy, Many stars in the center of the Milky Way have high heavy metal content., The ARC Centres of Excellence for All Sky Astrophysics in 3D (AU)

    From The ARC Centres of Excellence for All Sky Astrophysics in 3D (AU) via phys.org: “Too much heavy metal stops stars producing more”


    From The ARC Centres of Excellence for All Sky Astrophysics in 3D (AU)



    January 11, 2022

    Many stars in the center of the Milky Way have high heavy metal content. Credit: Michael Franklin.

    Stars are giant factories that produce most of the elements in the universe, including the elements in us and in Earth’s metal deposits. But what stars produce changes over time.

    Two new papers published in MNRAS here and here shed light on how the youngest generation of stars will eventually stop contributing metals back to the universe.

    The authors are all members of ASTRO 3D, the ARC Centre of Excellence for All Sky Astrophysics in 3 Dimensions. They are based at Monash University (AU), The Australian National University (AU), and The Space Telescope Science Institute (US).

    “We know the first two elements of the periodic table—hydrogen and helium—were created in the Big Bang,” says Amanda Karakas, first author of a paper studying metal-rich stars.

    “Over time, the stars that came after the Big Bang produce heavier elements.”

    These “metal-rich” stars, like our sun, spew out their products into space, enriching the composition of the galaxy over time.

    These objects affect us directly as around half of the carbon and all elements heavier than iron are synthesized by stars like our sun.

    About 90 percent of all the lead on Earth, for example, was made in low-mass stars that also produce elements such as strontium and barium.

    But this ability to produce more metals changes depending on the composition of a star at its birth. “Introducing just a tiny bit more metal into the stars’ gas has really large implications on their evolution,” says Giulia Cinquegrana. Her paper uses modeling from the earlier paper to study the chemical output of metal-rich stars.

    “We discovered that at a certain threshold of initial metal content in the gas, stars will stop sending more metals into the universe over their lifetime,” Cinquegrana says.

    The sun, born about 4.5 billion years ago, is a typical “middle-aged” star. It is “metal-rich” compared to the first stellar generations and has a heavy element content similar to many other stars in the center of the Milky Way.

    “Our papers predict the evolution of younger stars (the most recent generations), which are up to seven times more metal-rich than the sun,” says Karakas.

    “My simulations show that this really high level of chemical enrichment causes these stars to act quite weirdly, compared to what we believe is happening in the sun,” says Cinquegrana.

    “Our models of super metal-rich stars show that they still expand to become red giants and go on to end their lives as white dwarfs, but by that time they are not expelling any heavy elements. The metals get locked up in the white dwarf remnant,” she says.

    “But the process of stars constantly adding elements to the universe means that the make-up of the universe is always changing. In the far distant future, the distribution of elements will look very different to what we see now in our solar system,” says Karakas.

    The papers are published in the January 2022 and February 2022 issues of MNRAS.

    See the full article here.



    The ARC Centre of Excellence in All Sky Astrophysics in 3 Dimensions (AU)

    ASTRO 3D unifies over 200 world-leading astronomers to understand the evolution of matter, light, and the elements from the Big Bang to the present day.

    We are combining Australian innovative 3D optical and radio technology with new theoretical supercomputer simulations on a massive scale, requiring new big data techniques.

    Through our nationwide training and education programs, we are training young scientific leaders and inspiring high-school students into STEM sciences to prepare Australia for the next generation of telescopes: the Square Kilometre Array and the Extremely Large Optical telescopes.

    The objectives for the ARC Centres of Excellence (AU) are to:

    Undertake highly innovative and potentially transformational research that aims to achieve international standing in the fields of research envisaged and leads to a significant advancement of capabilities and knowledge.

    Link existing Australian research strengths and build critical mass with new capacity for interdisciplinary, collaborative approaches to address the most challenging and significant research problems.

    Develop relationships and build new networks with major national and international centres and research programs to help strengthen research, achieve global competitiveness and gain recognition for Australian research

    Build Australia’s human capacity in a range of research areas by attracting and retaining, from within Australia and abroad, researchers of high international standing as well as the most promising research students.

    Provide high-quality postgraduate and postdoctoral training environments for the next generation of researchers.

    Offer Australian researchers opportunities to work on large-scale problems over long periods of time.

    Establish Centres that have an impact on the wider community through interaction with higher education institutes, governments, industry and the private and non-profit sector.

    SKA Murchison Widefield Array (AU), Boolardy station in outback Western Australia, at the Murchison Radio-astronomy Observatory (MRO), on the traditional lands of the Wajarri peoples.

    The Murchison Radio-astronomy Observatory, on the traditional lands of the Wajarri peoples, in outback Western Australia will house up to 130,000 antennas like these and the associated advanced technologies.

    EDGES telescope in a radio quiet zone at the Murchison Radio-astronomy Observatory in Western Australia, on the traditional lands of the Wajarri peoples.

    SKA ASKAP Pathfinder Radio Telescope.

  • richardmitnick 11:02 am on January 14, 2022 Permalink | Reply
    Tags: "Which spiral arm of the Milky Way holds our sun?", Astronomy, Our spiral arm is the Orion-Cygnus Arm

    From EarthSky : “Which spiral arm of the Milky Way holds our sun?” 


    From EarthSky

    January 14, 2022

    Artist’s concept of our Milky Way galaxy. See our sun’s orbit in its spiral arm, in yellow, between the galaxy’s center and its outer edge? In its 3rd data release, Gaia focused in the direction opposite the galaxy’s center, toward its nearest outer edge. Credit: R. Hurt/ JPL/Caltech-NASA(US).

    Which spiral arm of the Milky Way is home to the sun and Earth?

    Our Milky Way galaxy is the island of stars we call home. If you imagine it as a disk with spiral arms emanating from the center, our sun is approximately halfway from the center to the visible edge. Our solar system lies between two prominent spiral arms, in what astronomers once thought was a mere bridge of stars, gas, and dust clouds. In recent decades, research advances have revealed that we live in our very own spiral arm of the galaxy, albeit a relatively minor one. Our spiral arm is the Orion-Cygnus Arm, or simply, the Orion Arm or Local Arm. You sometimes still hear the names Orion Bridge or Orion Spur.

    The structure of the Milky Way

    The Milky Way is a barred spiral galaxy, which means it has a central bar [the bar is easily seen in the image]. There’s still a lot we don’t know about the structure of our galaxy. According to the best current knowledge, the Milky Way is about 100,000 light-years across, about 2,000 light-years deep, and has 100 to 400 billion stars. There may be four primary spiral arms emanating from its central bar, with an unknown number of smaller offshoot arms.

    Where, within this vast spiral structure, do our sun and its planets reside? We’re about 26,000 light-years from the center of the galaxy, on the inner edge of the Orion-Cygnus Arm. It’s sandwiched between two primary spiral arms, the Sagittarius and Perseus Arms. The artists’ concepts above and below show the various spiral arms, along with the location of our sun on the Orion-Cygnus Arm.

    In this diagram, you can more clearly see the 4 major spiral arms of the Milky Way. The Perseus Arm is blue-green, and the Carina-Sagittarius Arm is pink. Toward the top, the sun’s location is in the orange-yellow Orion-Cygnus Arm. The center of the galaxy reads GB, for “galactic bar.” Image via Rursus/ Wikimedia Commons.

    The Orion Arm

    The Orion Arm of the Milky Way is probably some 3,500 light-years wide. Initially, astronomers thought it was about 10,000 light-years in length. A new study – published in 2016 [Science Advances] – suggests it’s more than 20,000 light-years long.

    Astronomers continue to piece together the structure of the Milky Way by painstakingly measuring the positions and distances to many stars and gas clouds. Telescopes on the ground and in space determine distances from parallax measurements.

    Parallax method via The European Space Agency [Agence spatiale européenne][Europäische Weltraumorganisation](EU).
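The parallax relation behind these measurements is simple: a star's distance in parsecs is the reciprocal of its parallax angle in arcseconds. A minimal sketch (the Proxima Centauri figure of ~0.768 arcseconds is a standard textbook value, not taken from this article):

```python
# Parallax to distance: a star's annual parallax angle p, in arcseconds,
# gives its distance directly as d = 1/p parsecs. One parsec is the
# distance at which the parallax is exactly one arcsecond.

PC_IN_LY = 3.2616  # light-years per parsec

def distance_from_parallax(parallax_arcsec: float) -> float:
    """Distance in parsecs from a parallax angle in arcseconds."""
    if parallax_arcsec <= 0:
        raise ValueError("parallax must be positive")
    return 1.0 / parallax_arcsec

# Example: Proxima Centauri's parallax is about 0.768 arcseconds
d_pc = distance_from_parallax(0.768)
print(f"{d_pc:.2f} pc = {d_pc * PC_IN_LY:.2f} light-years")  # ~1.30 pc, ~4.25 ly
```

This is why parallax surveys like Gaia, which measure ever-smaller angles, directly extend the distance scale: a parallax of one milliarcsecond corresponds to a star 1,000 parsecs away.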

    One currently operational space telescope, Gaia, is providing a wealth of new information that will allow astronomers to better characterize the Milky Way’s structure and size.

    European Space Agency [Agence spatiale européenne][Europäische Weltraumorganisation](EU) GAIA satellite.

    In fact, Gaia’s stated goal is to provide a 3-dimensional map of our Milky Way.

    How our local spiral arm got its name

    The Orion Arm gets its name from the constellation Orion the Hunter, which is one of the most prominent constellations of Northern Hemisphere winter (Southern Hemisphere summer).

    In one of the most detailed astronomical images ever produced, the NASA/ESA Hubble Space Telescope captured an unprecedented look at the Orion Nebula. This extensive study took 105 Hubble orbits to complete and used all of the imaging instruments aboard the telescope.

    Some of the brightest stars and most famous celestial objects of this constellation (Betelgeuse, Rigel, the stars of Orion’s Belt, the Orion Nebula [just above]) are neighbors of sorts to our sun, located within the Orion Arm. That’s why we see so many bright objects within the constellation Orion: When we look at it, we’re looking into our own local spiral arm.

    Betelgeuse, a red supergiant star 650 light-years away, seen in the infrared by the European Space Agency [Agence spatiale européenne][Europäische Weltraumorganisation](EU) Herschel Space Observatory (EU). Stars like Betelgeuse end their lives as supernovae. Credit: Decin et al.

    European Space Agency Herschel spacecraft active from 2009 to 2013.

    Photo taken by Rogelio Bernal Andreo in October 2010 of the Orion constellation showing the surrounding nebulas of the Orion Molecular Cloud complex. Also captured is the red supergiant Betelgeuse (top left) and the famous belt of Orion composed of the OB stars Alnitak, Alnilam and Mintaka. To the bottom right can be found the star Rigel. The red crescent shape is Barnard’s Loop.

    Orion Molecular Cloud Complex showing the distinctive three stars of Orion’s belt. Credit: Rogelio Bernal Andreo / Wikimedia Commons.

    Artist’s concept of our galactic neighborhood. Some of the best-known astronomical objects in our sky lie in the Orion Arm, including our sun. Image via R. Hurt/JPL/Caltech-NASA(US).

    See the full article here.


    Deborah Byrd created the EarthSky radio series in 1991 and founded EarthSky.org in 1994. Today, she serves as Editor-in-Chief of this website. She has won a galaxy of awards from the broadcasting and science communities, including having an asteroid named 3505 Byrd in her honor. A science communicator and educator since 1976, Byrd believes in science as a force for good in the world and a vital tool for the 21st century. “Being an EarthSky editor is like hosting a big global party for cool nature-lovers,” she says.

  • richardmitnick 12:27 pm on January 12, 2022 Permalink | Reply
    Tags: "1000-Light-Year-Wide Bubble Surrounding Earth Is Source of All Nearby Young Stars", Astronomy, Star formation occurs on the bubble's surface., The Space Telescope Science Institute (US)

    From Hubblesite (US) and ESA Hubble (EU): “1000-Light-Year-Wide Bubble Surrounding Earth Is Source of All Nearby Young Stars” 

    National Aeronautics and Space Administration(US)/European Space Agency [Agence spatiale européenne] [Europäische Weltraumorganisation](EU) Hubble Space Telescope.

    From Hubblesite (US) and ESA Hubble (EU)

    January 12, 2022


    Christine Pulliam
    The Space Telescope Science Institute (US)

    Nadia Whitehead
    The Harvard Smithsonian Center for Astrophysics (US)

    About This Image. Artist’s illustration of the Local Bubble with star formation occurring on the bubble’s surface. Scientists have now shown how a chain of events beginning 14 million years ago with a set of powerful supernovae led to the creation of the vast bubble, responsible for the formation of all young stars within 500 light-years of the Sun and Earth.
    ILLUSTRATION: The Harvard Smithsonian Center for Astrophysics (US), Leah Hustak (The Space Telescope Science Institute (US))

    For the first time, astronomers have retraced the history of our galactic neighborhood, showing exactly how the young stars nearest to our solar system formed.

    Astronomers at The Harvard Smithsonian Center for Astrophysics (US) and The Space Telescope Science Institute (US) have reconstructed the evolutionary history of our galactic neighborhood, showing how a chain of events beginning 14 million years ago led to the creation of a vast bubble that’s responsible for the formation of all nearby, young stars.

    The Earth sits in a 1,000-light-year-wide void surrounded by thousands of young stars — but how did those stars form?

    In a paper appearing today in Nature, astronomers at the Center for Astrophysics | Harvard & Smithsonian (CfA) and the Space Telescope Science Institute (STScI) reconstruct the evolutionary history of our galactic neighborhood, showing how a chain of events beginning 14 million years ago led to the creation of a vast bubble that’s responsible for the formation of all nearby, young stars.

    “This is really an origin story; for the first time we can explain how all nearby star formation began,” said astronomer and data visualization expert Catherine Zucker, who completed the work during a fellowship at the CfA.

    The paper’s central figure, a 3D spacetime animation, reveals that all young stars and star-forming regions — within 500 light-years of Earth — sit on the surface of a giant bubble known as the Local Bubble.

    The Local Interstellar Cloud: An Overview. https://www.thoughtco.com/clouds-in-space-3073644

    While astronomers have known of its existence for decades, scientists can now see and understand the Local Bubble’s beginnings and its impact on the gas around it.

    The Source of Our Stars: The Local Bubble

    Using a trove of new data and data science techniques, the spacetime animation shows how a series of supernovae that first went off 14 million years ago pushed interstellar gas outwards, creating a bubble-like structure with a surface that’s ripe for star formation.

    Today, seven well-known star-forming regions or molecular clouds — dense regions in space where stars can form — sit on the surface of the bubble.

    “We’ve calculated that about 15 supernovae have gone off over millions of years to form the Local Bubble that we see today,” said Zucker, who is now a NASA Hubble Fellow at STScI.

    The oddly-shaped bubble is not dormant and continues to slowly grow, the astronomers note.

    “It’s coasting along at about 4 miles per second,” Zucker said. “It has lost most of its oomph though and has pretty much plateaued in terms of speed.”

    The expansion speed of the bubble, as well as the past and present trajectories of the young stars forming on its surface, were derived using data obtained by Gaia, a space-based observatory launched by the European Space Agency.
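As a back-of-envelope check (my own sketch, not the study's model), even the bubble's slowed present-day speed of about 4 miles per second, if sustained over its 14-million-year age, covers a few hundred light-years, the same order of magnitude as the bubble's ~500-light-year radius:

```python
# Order-of-magnitude check on the Local Bubble's expansion. The bubble
# was faster in the past, so today's speed times its age gives only a
# rough lower bound on the distance the front has traveled.

MILE_KM = 1.609344
YEAR_S = 3.156e7          # seconds per year
LY_KM = 9.461e12          # kilometers per light-year

speed_kms = 4 * MILE_KM                 # ~6.4 km/s, the quoted coasting speed
age_s = 14e6 * YEAR_S                   # 14 million years, in seconds
distance_ly = speed_kms * age_s / LY_KM
print(f"{speed_kms:.1f} km/s over 14 Myr covers ~{distance_ly:.0f} light-years")
```

The answer lands around 300 light-years, comfortably consistent with a 500-light-year bubble radius once the faster early expansion is accounted for.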

    European Space Agency [Agence spatiale européenne][Europäische Weltraumorganisation](EU) GAIA satellite.

    “This is an incredible detective story, driven by both data and theory,” said Harvard professor and Center for Astrophysics astronomer Alyssa Goodman, a study co-author and founder of glue, data visualization software that enabled the discovery. “We can piece together the history of star formation around us using a wide variety of independent clues: supernova models, stellar motions and exquisite new 3D maps of the material surrounding the Local Bubble.”

    Bubbles Everywhere?

    “When the first supernovae that created the Local Bubble went off, our Sun was far away from the action,” said co-author João Alves, a professor at The University of Vienna [Universität Wien](AT). “But about five million years ago, the Sun’s path through the galaxy took it right into the bubble, and now the Sun sits — just by luck — almost right in the bubble’s center.”

    Today, as humans peer out into space from near the Sun, they have a front row seat to the process of star formation occurring all around on the bubble’s surface.

    Astronomers first theorized that superbubbles were pervasive in the Milky Way nearly 50 years ago [The Astrophysical Journal]. “Now, we have proof — and what are the chances that we are right smack in the middle of one of these things?” asks Goodman. Statistically, it is very unlikely that the Sun would be centered in a giant bubble if such bubbles were rare in our Milky Way Galaxy, she explained.

    Goodman likens the discovery to a Milky Way that resembles very hole-y swiss cheese, where holes in the cheese are blasted out by supernovae, and new stars can form in the cheese around the holes created by dying stars.

    Next the team, including co-author and Harvard doctoral student Michael Foley, plans to map out more interstellar bubbles to get a full 3D view of their locations, shapes and sizes. Charting out bubbles, and their relationship to each other, will ultimately allow astronomers to understand the role played by dying stars in giving birth to new ones, and in the structure and evolution of galaxies like the Milky Way.

    Zucker wonders, “Where do these bubbles touch? How do they interact with each other? How do superbubbles drive the birth of stars like our Sun in the Milky Way?”

    Additional co-authors on the paper are Douglas Finkbeiner and Diana Khimey of the CfA; Josefa Großschedl and Cameren Swiggum of the University of Vienna; Shmuel Bialy of The University of Maryland (US); Joshua Speagle of The University of Toronto (CA); and Andreas Burkert of The University Observatory Munich [Universitätssternwarte München](DE).

    The articles, analyzed data (on the Harvard Dataverse) and interactive figures and videos are all freely available to everyone through a dedicated website.

    The results were presented at a press conference of The American Astronomical Society (US) on Wednesday, January 12, 2022. The public can watch a recording of the conference here.

    See the full article here.


    The NASA/ESA Hubble Space Telescope is a space telescope that was launched into low Earth orbit in 1990 and remains in operation. It was not the first space telescope, but it is one of the largest and most versatile, renowned both as a vital research tool and as a public relations boon for astronomy. The Hubble telescope is named after astronomer Edwin Hubble and is one of NASA’s Great Observatories, along with the NASA Compton Gamma Ray Observatory, the Chandra X-ray Observatory, and the NASA Spitzer Infrared Space Telescope.

    National Aeronautics Space Agency (USA) Compton Gamma Ray Observatory.
    National Aeronautics and Space Administration (US) Chandra X-ray telescope (US).
    National Aeronautics and Space Administration (US) Spitzer Infrared Space Telescope, no longer in service; launched in 2003 and retired on 30 January 2020.

    Edwin Hubble at Caltech Palomar Samuel Oschin 48 inch Telescope(US) Credit: Emilio Segre Visual Archives/AIP/SPL.

    Edwin Hubble looking through the 100-inch Hooker telescope at Mount Wilson in Southern California (US) in 1929, when he discovered that the Universe is expanding. Credit: Margaret Bourke-White/Time & Life Pictures/Getty Images.

    Hubble features a 2.4-meter (7.9 ft) mirror, and its four main instruments observe in the ultraviolet, visible, and near-infrared regions of the electromagnetic spectrum. Hubble’s orbit outside the distortion of Earth’s atmosphere allows it to capture extremely high-resolution images with substantially lower background light than ground-based telescopes. It has recorded some of the most detailed visible light images, allowing a deep view into space. Many Hubble observations have led to breakthroughs in astrophysics, such as determining the rate of expansion of the universe.

    The Hubble telescope was built by the United States space agency National Aeronautics Space Agency(US) with contributions from the European Space Agency [Agence spatiale européenne](EU). The Space Telescope Science Institute (STScI) selects Hubble’s targets and processes the resulting data, while the NASA Goddard Space Flight Center(US) controls the spacecraft. Space telescopes were proposed as early as 1923. Hubble was funded in the 1970s with a proposed launch in 1983, but the project was beset by technical delays, budget problems, and the 1986 Challenger disaster. It was finally launched by Space Shuttle Discovery in 1990, but its main mirror had been ground incorrectly, resulting in spherical aberration that compromised the telescope’s capabilities. The optics were corrected to their intended quality by a servicing mission in 1993.

    Hubble is the only telescope designed to be maintained in space by astronauts. Five Space Shuttle missions have repaired, upgraded, and replaced systems on the telescope, including all five of the main instruments. The fifth mission was initially canceled on safety grounds following the Columbia disaster (2003), but NASA administrator Michael D. Griffin approved the fifth servicing mission, which was completed in 2009. The telescope was still operating as of April 24, 2020, its 30th anniversary, and could last until 2030–2040. One successor to the Hubble telescope is the National Aeronautics Space Agency(USA)/European Space Agency [Agence spatiale européenne](EU)/Canadian Space Agency(CA) Webb Infrared Space Telescope, launched in December 2021.

    National Aeronautics Space Agency(USA)/European Space Agency [Agence spatiale européenne][Europäische Weltraumorganisation](EU)/Canadian Space Agency [Agence Spatiale Canadienne](CA) James Webb Space Telescope, annotated. Launch, originally scheduled for October 2021, was delayed to December 2021.

    Proposals and precursors

    In 1923, Hermann Oberth—considered a father of modern rocketry, along with Robert H. Goddard and Konstantin Tsiolkovsky—published Die Rakete zu den Planetenräumen (“The Rocket into Planetary Space“), which mentioned how a telescope could be propelled into Earth orbit by a rocket.

    The history of the Hubble Space Telescope can be traced back as far as 1946, to astronomer Lyman Spitzer’s paper entitled Astronomical advantages of an extraterrestrial observatory. In it, he discussed the two main advantages that a space-based observatory would have over ground-based telescopes. First, the angular resolution (the smallest separation at which objects can be clearly distinguished) would be limited only by diffraction, rather than by the turbulence in the atmosphere, which causes stars to twinkle, known to astronomers as seeing. At that time ground-based telescopes were limited to resolutions of 0.5–1.0 arcseconds, compared to a theoretical diffraction-limited resolution of about 0.05 arcsec for an optical telescope with a mirror 2.5 m (8.2 ft) in diameter. Second, a space-based telescope could observe infrared and ultraviolet light, which are strongly absorbed by the atmosphere.
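The diffraction-limited figure quoted above follows from the standard Rayleigh criterion, θ ≈ 1.22 λ/D. A minimal sketch, assuming mid-visible light at 550 nm (a wavelength the text does not specify):

```python
import math

# Rayleigh criterion: smallest resolvable angle for a circular aperture.
wavelength = 550e-9   # m, mid-visible light (assumed value)
diameter = 2.5        # m, mirror diameter from the text

theta_rad = 1.22 * wavelength / diameter
theta_arcsec = math.degrees(theta_rad) * 3600

print(f"{theta_arcsec:.3f} arcsec")  # ~0.055 arcsec, consistent with the ~0.05 quoted
```

The atmospheric seeing limit of 0.5–1.0 arcseconds cited in the same paragraph is thus ten to twenty times worse than what diffraction alone allows for a 2.5 m mirror.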

    Spitzer devoted much of his career to pushing for the development of a space telescope. In 1962, a report by the U.S. National Academy of Sciences recommended development of a space telescope as part of the space program, and in 1965 Spitzer was appointed as head of a committee given the task of defining scientific objectives for a large space telescope.

    Space-based astronomy had begun on a very small scale following World War II, as scientists made use of developments that had taken place in rocket technology. The first ultraviolet spectrum of the Sun was obtained in 1946, and the National Aeronautics and Space Administration (US) launched the Orbiting Solar Observatory (OSO) to obtain UV, X-ray, and gamma-ray spectra in 1962.
    National Aeronautics Space Agency(USA) Orbiting Solar Observatory

    An orbiting solar telescope was launched in 1962 by the United Kingdom as part of the Ariel space program, and in 1966 NASA launched the first Orbiting Astronomical Observatory (OAO) mission. OAO-1’s battery failed after three days, terminating the mission. It was followed by OAO-2, which carried out ultraviolet observations of stars and galaxies from its launch in 1968 until 1972, well beyond its original planned lifetime of one year.

    The OSO and OAO missions demonstrated the important role space-based observations could play in astronomy. In 1968, NASA developed firm plans for a space-based reflecting telescope with a mirror 3 m (9.8 ft) in diameter, known provisionally as the Large Orbiting Telescope or Large Space Telescope (LST), with a launch slated for 1979. These plans emphasized the need for crewed maintenance missions to the telescope to ensure such a costly program had a lengthy working life, and the concurrent development of plans for the reusable Space Shuttle indicated that the technology to allow this was soon to become available.

    Quest for funding

    The continuing success of the OAO program encouraged increasingly strong consensus within the astronomical community that the LST should be a major goal. In 1970, NASA established two committees, one to plan the engineering side of the space telescope project, and the other to determine the scientific goals of the mission. Once these had been established, the next hurdle for NASA was to obtain funding for the instrument, which would be far more costly than any Earth-based telescope. The U.S. Congress questioned many aspects of the proposed budget for the telescope and forced cuts in the budget for the planning stages, which at the time consisted of very detailed studies of potential instruments and hardware for the telescope. In 1974, public spending cuts led to Congress deleting all funding for the telescope project.
    In response, a nationwide lobbying effort was coordinated among astronomers. Many astronomers met congressmen and senators in person, and large-scale letter-writing campaigns were organized. The National Academy of Sciences published a report emphasizing the need for a space telescope, and eventually the Senate agreed to half the budget that had originally been approved by Congress.

    The funding issues led to something of a reduction in the scale of the project, with the proposed mirror diameter reduced from 3 m to 2.4 m, both to cut costs and to allow a more compact and effective configuration for the telescope hardware. A proposed precursor 1.5 m (4.9 ft) space telescope to test the systems to be used on the main satellite was dropped, and budgetary concerns also prompted collaboration with the European Space Agency. ESA agreed to provide funding and supply one of the first generation instruments for the telescope, as well as the solar cells that would power it, and staff to work on the telescope in the United States, in return for European astronomers being guaranteed at least 15% of the observing time on the telescope. Congress eventually approved funding of US$36 million for 1978, and the design of the LST began in earnest, aiming for a launch date of 1983. In 1983 the telescope was named after Edwin Hubble, who confirmed one of the greatest scientific discoveries of the 20th century, made by Georges Lemaître, that the universe is expanding.

    Construction and engineering

    Once the Space Telescope project had been given the go-ahead, work on the program was divided among many institutions. NASA Marshall Space Flight Center (MSFC) was given responsibility for the design, development, and construction of the telescope, while Goddard Space Flight Center was given overall control of the scientific instruments and ground-control center for the mission. MSFC commissioned the optics company Perkin-Elmer to design and build the Optical Telescope Assembly (OTA) and Fine Guidance Sensors for the space telescope. Lockheed was commissioned to construct and integrate the spacecraft in which the telescope would be housed.

    Optical Telescope Assembly

    Optically, the HST is a Cassegrain reflector of Ritchey–Chrétien design, as are most large professional telescopes. This design, with two hyperbolic mirrors, is known for good imaging performance over a wide field of view, with the disadvantage that the mirrors have shapes that are hard to fabricate and test. The mirror and optical systems of the telescope determine the final performance, and they were designed to exacting specifications. Optical telescopes typically have mirrors polished to an accuracy of about a tenth of the wavelength of visible light, but the Space Telescope was to be used for observations from the visible through the ultraviolet (shorter wavelengths) and was specified to be diffraction limited to take full advantage of the space environment. Therefore, its mirror needed to be polished to an accuracy of 10 nanometers, or about 1/65 of the wavelength of red light. On the long wavelength end, the OTA was not designed with optimum IR performance in mind—for example, the mirrors are kept at stable (and warm, about 15 °C) temperatures by heaters. This limits Hubble’s performance as an infrared telescope.

    Perkin-Elmer intended to use custom-built and extremely sophisticated computer-controlled polishing machines to grind the mirror to the required shape. However, in case their cutting-edge technology ran into difficulties, NASA demanded that PE sub-contract to Kodak to construct a back-up mirror using traditional mirror-polishing techniques. (The team of Kodak and Itek also bid on the original mirror polishing work. Their bid called for the two companies to double-check each other’s work, which would have almost certainly caught the polishing error that later caused such problems.) The Kodak mirror is now on permanent display at the National Air and Space Museum. An Itek mirror built as part of the effort is now used in the 2.4 m telescope at the Magdalena Ridge Observatory.

    Construction of the Perkin-Elmer mirror began in 1979, starting with a blank manufactured by Corning from their ultra-low expansion glass. To keep the mirror’s weight to a minimum it consisted of top and bottom plates, each one inch (25 mm) thick, sandwiching a honeycomb lattice. Perkin-Elmer simulated microgravity by supporting the mirror from the back with 130 rods that exerted varying amounts of force. This ensured the mirror’s final shape would be correct and to specification when finally deployed. Mirror polishing continued until May 1981. NASA reports at the time questioned Perkin-Elmer’s managerial structure, and the polishing began to slip behind schedule and over budget. To save money, NASA halted work on the back-up mirror and put the launch date of the telescope back to October 1984. The mirror was completed by the end of 1981; it was washed using 2,400 US gallons (9,100 L) of hot, deionized water and then received a reflective coating of 65 nm-thick aluminum and a protective coating of 25 nm-thick magnesium fluoride.

    Doubts continued to be expressed about Perkin-Elmer’s competence on a project of this importance, as their budget and timescale for producing the rest of the OTA continued to inflate. In response to a schedule described as “unsettled and changing daily”, NASA postponed the launch date of the telescope until April 1985. Perkin-Elmer’s schedules continued to slip at a rate of about one month per quarter, and at times delays reached one day for each day of work. NASA was forced to postpone the launch date until March and then September 1986. By this time, the total project budget had risen to US$1.175 billion.

    Spacecraft systems

    The spacecraft in which the telescope and instruments were to be housed was another major engineering challenge. It would have to withstand frequent passages from direct sunlight into the darkness of Earth’s shadow, which would cause major changes in temperature, while being stable enough to allow extremely accurate pointing of the telescope. A shroud of multi-layer insulation keeps the temperature within the telescope stable and surrounds a light aluminum shell in which the telescope and instruments sit. Within the shell, a graphite-epoxy frame keeps the working parts of the telescope firmly aligned. Because graphite composites are hygroscopic, there was a risk that water vapor absorbed by the truss while in Lockheed’s clean room would later be expressed in the vacuum of space, resulting in the telescope’s instruments being covered by ice. To reduce that risk, a nitrogen gas purge was performed before launching the telescope into space.

    While construction of the spacecraft in which the telescope and instruments would be housed proceeded somewhat more smoothly than the construction of the OTA, Lockheed still experienced some budget and schedule slippage, and by the summer of 1985, construction of the spacecraft was 30% over budget and three months behind schedule. An MSFC report said Lockheed tended to rely on NASA directions rather than take their own initiative in the construction.

    Computer systems and data processing

    The two initial, primary computers on the HST were the 1.25 MHz DF-224 system, built by Rockwell Autonetics, which contained three redundant CPUs, and two redundant NSSC-1 (NASA Standard Spacecraft Computer, Model 1) systems, developed by Westinghouse and GSFC using diode–transistor logic (DTL). A co-processor for the DF-224 was added during Servicing Mission 1 in 1993, which consisted of two redundant strings of an Intel-based 80386 processor with an 80387 math co-processor. The DF-224 and its 386 co-processor were replaced by a 25 MHz Intel-based 80486 processor system during Servicing Mission 3A in 1999. The new computer is 20 times faster, with six times more memory, than the DF-224 it replaced. It increases throughput by moving some computing tasks from the ground to the spacecraft and saves money by allowing the use of modern programming languages.

    Additionally, some of the science instruments and components had their own embedded microprocessor-based control systems. The MATs (Multiple Access Transponder) components, MAT-1 and MAT-2, utilize Hughes Aircraft CDP1802CD microprocessors. The Wide Field and Planetary Camera (WFPC) also utilized an RCA 1802 microprocessor (or possibly the older 1801 version). The WFPC-1 was replaced by the WFPC-2 during Servicing Mission 1 in 1993, which was then replaced by the Wide Field Camera 3 (WFC3) during Servicing Mission 4 in 2009.

    Initial instruments

    When launched, the HST carried five scientific instruments: the Wide Field and Planetary Camera (WF/PC), Goddard High Resolution Spectrograph (GHRS), High Speed Photometer (HSP), Faint Object Camera (FOC) and the Faint Object Spectrograph (FOS). WF/PC was a high-resolution imaging device primarily intended for optical observations. It was built by NASA JPL-Caltech(US), and incorporated a set of 48 filters isolating spectral lines of particular astrophysical interest. The instrument contained eight charge-coupled device (CCD) chips divided between two cameras, each using four CCDs. Each CCD has a resolution of 0.64 megapixels. The wide field camera (WFC) covered a large angular field at the expense of resolution, while the planetary camera (PC) took images at a longer effective focal length than the WF chips, giving it a greater magnification.

    The GHRS was a spectrograph designed to operate in the ultraviolet. It was built by the Goddard Space Flight Center and could achieve a spectral resolution of 90,000. Also optimized for ultraviolet observations were the FOC and FOS, which were capable of the highest spatial resolution of any instruments on Hubble. Rather than CCDs these three instruments used photon-counting digicons as their detectors. The FOC was constructed by ESA, while the University of California, San Diego(US), and Martin Marietta Corporation built the FOS.

    The final instrument was the HSP, designed and built at the University of Wisconsin–Madison(US). It was optimized for visible and ultraviolet light observations of variable stars and other astronomical objects varying in brightness. It could take up to 100,000 measurements per second with a photometric accuracy of about 2% or better.

    HST’s guidance system can also be used as a scientific instrument. Its three Fine Guidance Sensors (FGS) are primarily used to keep the telescope accurately pointed during an observation, but can also be used to carry out extremely accurate astrometry; measurements accurate to within 0.0003 arcseconds have been achieved.

    Ground support

    The Space Telescope Science Institute (STScI) is responsible for the scientific operation of the telescope and the delivery of data products to astronomers. STScI is operated by the Association of Universities for Research in Astronomy (US) (AURA) and is physically located in Baltimore, Maryland on the Homewood campus of Johns Hopkins University (US), one of the 39 U.S. universities and seven international affiliates that make up the AURA consortium. STScI was established in 1981 after something of a power struggle between NASA and the scientific community at large. NASA had wanted to keep this function in-house, but scientists wanted it to be based in an academic establishment. The Space Telescope European Coordinating Facility (ST-ECF), established at Garching bei München near Munich in 1984, provided similar support for European astronomers until 2011, when these activities were moved to the European Space Astronomy Centre.

    One rather complex task that falls to STScI is scheduling observations for the telescope. Hubble is in a low-Earth orbit to enable servicing missions, but this means most astronomical targets are occulted by the Earth for slightly less than half of each orbit. Observations cannot take place when the telescope passes through the South Atlantic Anomaly due to elevated radiation levels, and there are also sizable exclusion zones around the Sun (precluding observations of Mercury), Moon and Earth. The solar avoidance angle is about 50°, to keep sunlight from illuminating any part of the OTA. Earth and Moon avoidance keeps bright light out of the FGSs, and keeps scattered light from entering the instruments. If the FGSs are turned off, the Moon and Earth can be observed. Earth observations were used very early in the program to generate flat-fields for the WFPC1 instrument. There is a so-called continuous viewing zone (CVZ), at roughly 90° to the plane of Hubble’s orbit, in which targets are not occulted for long periods.

    Challenger disaster, delays, and eventual launch

    By January 1986, the planned launch date of October looked feasible, but the Challenger explosion brought the U.S. space program to a halt, grounding the Shuttle fleet and forcing the launch of Hubble to be postponed for several years. The telescope had to be kept in a clean room, powered up and purged with nitrogen, until a launch could be rescheduled. This costly situation (about US$6 million per month) pushed the overall costs of the project even higher. This delay did allow time for engineers to perform extensive tests, swap out a possibly failure-prone battery, and make other improvements. Furthermore, the ground software needed to control Hubble was not ready in 1986, and was barely ready by the 1990 launch.

    Eventually, following the resumption of shuttle flights in 1988, the launch of the telescope was scheduled for 1990. On April 24, 1990, Space Shuttle Discovery successfully launched it during the STS-31 mission.

    From its original total cost estimate of about US$400 million, the telescope cost about US$4.7 billion by the time of its launch. Hubble’s cumulative costs were estimated to be about US$10 billion in 2010, twenty years after launch.

    List of Hubble instruments

    Hubble accommodates five science instruments at a given time, plus the Fine Guidance Sensors, which are mainly used for aiming the telescope but are occasionally used for scientific astrometry measurements. Early instruments were replaced with more advanced ones during the Shuttle servicing missions. COSTAR was a corrective optics device rather than a science instrument, but occupied one of the five instrument bays.
    Since the final servicing mission in 2009, the four active instruments have been ACS, COS, STIS and WFC3. NICMOS is kept in hibernation, but may be revived if WFC3 were to fail in the future.

    Advanced Camera for Surveys (ACS; 2002–present)
    Cosmic Origins Spectrograph (COS; 2009–present)
    Corrective Optics Space Telescope Axial Replacement (COSTAR; 1993–2009)
    Faint Object Camera (FOC; 1990–2002)
    Faint Object Spectrograph (FOS; 1990–1997)
    Fine Guidance Sensor (FGS; 1990–present)
    Goddard High Resolution Spectrograph (GHRS/HRS; 1990–1997)
    High Speed Photometer (HSP; 1990–1993)
    Near Infrared Camera and Multi-Object Spectrometer (NICMOS; 1997–present, hibernating since 2008)
    Space Telescope Imaging Spectrograph (STIS; 1997–present (non-operative 2004–2009))
    Wide Field and Planetary Camera (WFPC; 1990–1993)
    Wide Field and Planetary Camera 2 (WFPC2; 1993–2009)
    Wide Field Camera 3 (WFC3; 2009–present)

    Of the former instruments, three (COSTAR, FOS and WFPC2) are displayed in the Smithsonian National Air and Space Museum. The FOC is in the Dornier museum, Germany. The HSP is in the Space Place at the University of Wisconsin–Madison. The first WFPC was dismantled, and some components were then re-used in WFC3.

    Flawed mirror

    Within weeks of the launch of the telescope, the returned images indicated a serious problem with the optical system. Although the first images appeared to be sharper than those of ground-based telescopes, Hubble failed to achieve a final sharp focus and the best image quality obtained was drastically lower than expected. Images of point sources spread out over a radius of more than one arcsecond, instead of having a point spread function (PSF) concentrated within a circle 0.1 arcseconds (485 nrad) in diameter, as had been specified in the design criteria.

    Analysis of the flawed images revealed that the primary mirror had been polished to the wrong shape. Although it was believed to be one of the most precisely figured optical mirrors ever made, smooth to about 10 nanometers, the outer perimeter was too flat by about 2200 nanometers (about 1⁄450 mm or 1⁄11000 inch). This difference was catastrophic, introducing severe spherical aberration, a flaw in which light reflecting off the edge of a mirror focuses on a different point from the light reflecting off its center.
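The parenthetical unit conversions for the 2200 nm figure are easy to verify; a minimal sketch:

```python
# The outer perimeter of the mirror was too flat by about 2200 nm; convert units.
error_nm = 2200
error_mm = error_nm * 1e-6           # 0.0022 mm
error_inch = error_mm / 25.4         # ~8.7e-5 inch

print(f"~1/{1 / error_mm:.0f} mm")       # ~1/455 mm (text rounds to 1/450)
print(f"~1/{1 / error_inch:.0f} inch")   # ~1/11545 inch (text rounds to 1/11000)
```

For scale, that 2200 nm error is over two hundred times the 10 nm tolerance to which the rest of the surface was figured.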

    The effect of the mirror flaw on scientific observations depended on the particular observation—the core of the aberrated PSF was sharp enough to permit high-resolution observations of bright objects, and spectroscopy of point sources was affected only through a sensitivity loss. However, the loss of light to the large, out-of-focus halo severely reduced the usefulness of the telescope for faint objects or high-contrast imaging. This meant nearly all the cosmological programs were essentially impossible, since they required observation of exceptionally faint objects. This led politicians to question NASA’s competence, scientists to rue the cost which could have gone to more productive endeavors, and comedians to make jokes about NASA and the telescope; in the 1991 comedy The Naked Gun 2½: The Smell of Fear, in a scene where historical disasters are displayed, Hubble is pictured with RMS Titanic and LZ 129 Hindenburg. Nonetheless, during the first three years of the Hubble mission, before the optical corrections, the telescope still carried out a large number of productive observations of less demanding targets. The error was well characterized and stable, enabling astronomers to partially compensate for the defective mirror by using sophisticated image processing techniques such as deconvolution.

    Origin of the problem

    A commission headed by Lew Allen, director of the Jet Propulsion Laboratory, was established to determine how the error could have arisen. The Allen Commission found that a reflective null corrector, a testing device used to achieve a properly shaped non-spherical mirror, had been incorrectly assembled—one lens was out of position by 1.3 mm (0.051 in). During the initial grinding and polishing of the mirror, Perkin-Elmer analyzed its surface with two conventional refractive null correctors. However, for the final manufacturing step (figuring), they switched to the custom-built reflective null corrector, designed explicitly to meet very strict tolerances. The incorrect assembly of this device resulted in the mirror being ground very precisely but to the wrong shape. A few final tests, using the conventional null correctors, correctly reported spherical aberration. But these results were dismissed, thus missing the opportunity to catch the error, because the reflective null corrector was considered more accurate.

    The commission blamed the failings primarily on Perkin-Elmer. Relations between NASA and the optics company had been severely strained during the telescope construction, due to frequent schedule slippage and cost overruns. NASA found that Perkin-Elmer did not review or supervise the mirror construction adequately, did not assign its best optical scientists to the project (as it had for the prototype), and in particular did not involve the optical designers in the construction and verification of the mirror. While the commission heavily criticized Perkin-Elmer for these managerial failings, NASA was also criticized for not picking up on the quality control shortcomings, such as relying totally on test results from a single instrument.

    Design of a solution

    Many feared that Hubble would be abandoned. The design of the telescope had always incorporated servicing missions, and astronomers immediately began to seek potential solutions to the problem that could be applied at the first servicing mission, scheduled for 1993. While Kodak had ground a back-up mirror for Hubble, it would have been impossible to replace the mirror in orbit, and too expensive and time-consuming to bring the telescope back to Earth for a refit. Instead, the fact that the mirror had been ground so precisely to the wrong shape led to the design of new optical components with exactly the same error but in the opposite sense, to be added to the telescope at the servicing mission, effectively acting as “spectacles” to correct the spherical aberration.

    The first step was a precise characterization of the error in the main mirror. Working backwards from images of point sources, astronomers determined that the conic constant of the mirror as built was −1.01390±0.0002, instead of the intended −1.00230. The same number was also derived by analyzing the null corrector used by Perkin-Elmer to figure the mirror, as well as by analyzing interferograms obtained during ground testing of the mirror.
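The size of the figuring error can be read directly off those conic constants; a small sketch of the arithmetic:

```python
# As-built vs. intended conic constant of Hubble's primary mirror (from the text).
as_built = -1.01390
intended = -1.00230

deviation = as_built - intended
relative_pct = abs(deviation) / abs(intended) * 100

print(f"deviation: {deviation:.5f}")   # -0.01160
print(f"relative: {relative_pct:.2f}%")
```

A deviation of only about 1% in the conic constant was enough to produce the severe spherical aberration described above, which is why the mirror had to be specified to such tight tolerances in the first place.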

    Because of the way the HST’s instruments were designed, two different sets of correctors were required. The design of the Wide Field and Planetary Camera 2, already planned to replace the existing WF/PC, included relay mirrors to direct light onto the four separate charge-coupled device (CCD) chips making up its two cameras. An inverse error built into their surfaces could completely cancel the aberration of the primary. However, the other instruments lacked any intermediate surfaces that could be figured in this way, and so required an external correction device.

    The Corrective Optics Space Telescope Axial Replacement (COSTAR) system was designed to correct the spherical aberration for light focused at the FOC, FOS, and GHRS. It consists of two mirrors in the light path with one ground to correct the aberration. To fit the COSTAR system onto the telescope, one of the other instruments had to be removed, and astronomers selected the High Speed Photometer to be sacrificed. By 2002, all the original instruments requiring COSTAR had been replaced by instruments with their own corrective optics. COSTAR was removed and returned to Earth in 2009 where it is exhibited at the National Air and Space Museum. The area previously used by COSTAR is now occupied by the Cosmic Origins Spectrograph.


    NASA COSTAR installation

    Servicing missions and new instruments

    Servicing Mission 1

    The first Hubble servicing mission was scheduled for 1993 before the mirror problem was discovered. It assumed greater importance, as the astronauts would need to do extensive work to install corrective optics; failure would have resulted in either abandoning Hubble or accepting its permanent disability. Other components failed before the mission, causing the repair cost to rise to $500 million (not including the cost of the shuttle flight). A successful repair would, however, help demonstrate the viability of building Space Station Alpha.

    STS-49 in 1992 demonstrated the difficulty of space work. While its rescue of Intelsat 603 received praise, the astronauts had taken possibly reckless risks in doing so. Neither the rescue nor the unrelated assembly of prototype space station components occurred as the astronauts had trained, causing NASA to reassess planning and training, including for the Hubble repair. The agency assigned to the mission Story Musgrave—who had worked on satellite repair procedures since 1976—and six other experienced astronauts, including two from STS-49. The first mission director since Project Apollo would coordinate a crew with 16 previous shuttle flights. The astronauts were trained to use about a hundred specialized tools.

    Heat had been the problem on prior spacewalks, which occurred in sunlight. Hubble needed to be repaired out of sunlight. Musgrave discovered during vacuum training, seven months before the mission, that spacesuit gloves did not sufficiently protect against the cold of space. After STS-57 confirmed the issue in orbit, NASA quickly changed equipment, procedures, and flight plan. Seven total mission simulations occurred before launch, the most thorough preparation in shuttle history. No complete Hubble mockup existed, so the astronauts studied many separate models (including one at the Smithsonian) and mentally combined their varying and contradictory details. Servicing Mission 1 flew aboard Endeavour in December 1993, and involved installation of several instruments and other equipment over ten days.

    Most importantly, the High Speed Photometer was replaced with the COSTAR corrective optics package, and WFPC was replaced with the Wide Field and Planetary Camera 2 (WFPC2) with an internal optical correction system. The solar arrays and their drive electronics were also replaced, as well as four gyroscopes in the telescope pointing system, two electrical control units and other electrical components, and two magnetometers. The onboard computers were upgraded with added coprocessors, and Hubble’s orbit was boosted.

    On January 13, 1994, NASA declared the mission a complete success and showed the first sharper images. The mission was one of the most complex performed up until that date, involving five long extra-vehicular activity periods. Its success was a boon for NASA, as well as for the astronomers who now had a more capable space telescope.

    Servicing Mission 2

    Servicing Mission 2, flown by Discovery in February 1997, replaced the GHRS and the FOS with the Space Telescope Imaging Spectrograph (STIS) and the Near Infrared Camera and Multi-Object Spectrometer (NICMOS), replaced an Engineering and Science Tape Recorder with a new Solid State Recorder, and repaired thermal insulation. NICMOS contained a heat sink of solid nitrogen to reduce the thermal noise from the instrument, but shortly after it was installed, an unexpected thermal expansion resulted in part of the heat sink coming into contact with an optical baffle. This led to an increased warming rate for the instrument and reduced its original expected lifetime of 4.5 years to about two years.

    Servicing Mission 3A

    Servicing Mission 3A, flown by Discovery, took place in December 1999, and was a split-off from Servicing Mission 3 after three of the six onboard gyroscopes had failed. The fourth failed a few weeks before the mission, rendering the telescope incapable of performing scientific observations. The mission replaced all six gyroscopes, replaced a Fine Guidance Sensor and the computer, installed a Voltage/temperature Improvement Kit (VIK) to prevent battery overcharging, and replaced thermal insulation blankets.

    Servicing Mission 3B

    Servicing Mission 3B, flown by Columbia in March 2002, saw the installation of a new instrument, with the FOC (which, except for the Fine Guidance Sensors when used for astrometry, was the last of the original instruments) being replaced by the Advanced Camera for Surveys (ACS). This meant COSTAR was no longer required, since all new instruments had built-in correction for the main mirror aberration. The mission also revived NICMOS by installing a closed-cycle cooler and replaced the solar arrays for the second time, providing 30 percent more power.

    Servicing Mission 4

    Plans called for Hubble to be serviced in February 2005, but the Columbia disaster in 2003, in which the orbiter disintegrated on re-entry into the atmosphere, had wide-ranging effects on the Hubble program. NASA Administrator Sean O’Keefe decided all future shuttle missions had to be able to reach the safe haven of the International Space Station should in-flight problems develop. As no shuttles were capable of reaching both HST and the space station during the same mission, future crewed service missions were canceled. This decision was criticized by numerous astronomers who felt Hubble was valuable enough to merit the human risk. HST’s planned successor, the James Webb Space Telescope (JWST), as of 2004 was not expected to launch until at least 2011. A gap in space-observing capabilities between a decommissioning of Hubble and the commissioning of a successor was of major concern to many astronomers, given the significant scientific impact of HST. The consideration that JWST will not be located in low Earth orbit, and therefore cannot be easily upgraded or repaired in the event of an early failure, only made concerns more acute. On the other hand, many astronomers felt strongly that servicing Hubble should not take place if the expense were to come from the JWST budget.

    In January 2004, O’Keefe said he would review his decision to cancel the final servicing mission to HST, due to public outcry and requests from Congress for NASA to look for a way to save it. The National Academy of Sciences convened an official panel, which recommended in July 2004 that the HST should be preserved despite the apparent risks. Their report urged “NASA should take no actions that would preclude a space shuttle servicing mission to the Hubble Space Telescope”. In August 2004, O’Keefe asked Goddard Space Flight Center to prepare a detailed proposal for a robotic service mission. These plans were later canceled, the robotic mission being described as “not feasible”. In late 2004, several Congressional members, led by Senator Barbara Mikulski, held public hearings and carried on a fight with much public support (including thousands of letters from school children across the U.S.) to get the Bush Administration and NASA to reconsider the decision to drop plans for a Hubble rescue mission.

    The nomination in April 2005 of a new NASA Administrator, Michael D. Griffin, changed the situation, as Griffin stated he would consider a crewed servicing mission. Soon after his appointment Griffin authorized Goddard to proceed with preparations for a crewed Hubble maintenance flight, saying he would make the final decision after the next two shuttle missions. In October 2006 Griffin gave the final go-ahead, and the 11-day mission by Atlantis was scheduled for October 2008. Hubble’s main data-handling unit failed in September 2008, halting all reporting of scientific data until its back-up was brought online on October 25, 2008. Since a failure of the backup unit would leave the HST helpless, the service mission was postponed to incorporate a replacement for the primary unit.

    Servicing Mission 4 (SM4), flown by Atlantis in May 2009, was the last scheduled shuttle mission for HST. SM4 installed the replacement data-handling unit, repaired the ACS and STIS systems, installed improved nickel hydrogen batteries, and replaced other components including all six gyroscopes. SM4 also installed two new observation instruments—Wide Field Camera 3 (WFC3) and the Cosmic Origins Spectrograph (COS)—and the Soft Capture and Rendezvous System, which will enable the future rendezvous, capture, and safe disposal of Hubble by either a crewed or robotic mission. Except for the ACS’s High Resolution Channel, which could not be repaired and was disabled, the work accomplished during SM4 rendered the telescope fully functional.

    Major projects

    Cosmic Assembly Near-infrared Deep Extragalactic Legacy Survey [CANDELS]

    The survey “aims to explore galactic evolution in the early Universe, and the very first seeds of cosmic structure at less than one billion years after the Big Bang.” The CANDELS project site describes the survey’s goals as the following:

    The Cosmic Assembly Near-IR Deep Extragalactic Legacy Survey is designed to document the first third of galactic evolution from z = 8 to 1.5 via deep imaging of more than 250,000 galaxies with WFC3/IR and ACS. It will also find the first Type Ia SNe beyond z > 1.5 and establish their accuracy as standard candles for cosmology. Five premier multi-wavelength sky regions are selected; each has multi-wavelength data from Spitzer and other facilities, and has extensive spectroscopy of the brighter galaxies. The use of five widely separated fields mitigates cosmic variance and yields statistically robust and complete samples of galaxies down to 10^9 solar masses out to z ~ 8.
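    To put the survey’s redshift range on a timeline, a lookback-time integral for a flat Lambda-CDM universe can be sketched in a few lines. This is an illustrative calculation only: the Hubble constant and density parameters below (H0 = 70 km/s/Mpc, Ωm = 0.3, ΩΛ = 0.7) are assumed round values, not figures from the survey.

    ```python
    import math

    # Rough lookback-time calculator for a flat Lambda-CDM universe, to place
    # the CANDELS range (z = 8 down to 1.5) on a timeline. Parameters below
    # are illustrative assumptions, not survey values.
    H0 = 70.0                                # Hubble constant, km/s/Mpc
    OMEGA_M, OMEGA_L = 0.3, 0.7              # matter and dark-energy densities
    H0_PER_GYR = H0 / 3.0857e19 * 3.156e16   # convert to inverse gigayears

    def lookback_gyr(z, steps=100_000):
        """Lookback time in Gyr: integral of dz' / ((1+z') H(z')) from 0 to z."""
        dz = z / steps
        total = 0.0
        for i in range(steps):
            zp = (i + 0.5) * dz  # midpoint rule
            hz = H0_PER_GYR * math.sqrt(OMEGA_M * (1 + zp) ** 3 + OMEGA_L)
            total += dz / ((1 + zp) * hz)
        return total

    print(f"z = 1.5 -> lookback ~ {lookback_gyr(1.5):.1f} Gyr")
    print(f"z = 8   -> lookback ~ {lookback_gyr(8):.1f} Gyr")
    ```

    With these assumed parameters, z = 1.5 corresponds to a lookback time of roughly 9 billion years and z = 8 to nearly 13 billion years, which is why the quoted range covers "the first third of galactic evolution".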

    Frontier Fields program

    The program, officially named Hubble Deep Fields Initiative 2012, aims to advance the knowledge of early galaxy formation by studying high-redshift galaxies in blank fields with the help of gravitational lensing to see the “faintest galaxies in the distant universe”. The Frontier Fields web page describes the goals of the program as:

    To reveal hitherto inaccessible populations of z = 5–10 galaxies that are ten to fifty times fainter intrinsically than any presently known
    To solidify our understanding of the stellar masses and star formation histories of sub-L* galaxies at the earliest times
    To provide the first statistically meaningful morphological characterization of star forming galaxies at z > 5
    To find z > 8 galaxies stretched out enough by cluster lensing to discern internal structure and/or magnified enough by cluster lensing for spectroscopic follow-up.

    Cosmic Evolution Survey (COSMOS)

    The Cosmic Evolution Survey (COSMOS) is an astronomical survey designed to probe the formation and evolution of galaxies as a function of both cosmic time (redshift) and the local galaxy environment. The survey covers a two square degree equatorial field with spectroscopy and X-ray to radio imaging by most of the major space-based telescopes and a number of large ground-based telescopes, making it a key focus region of extragalactic astrophysics. COSMOS was launched in 2006 as the largest project pursued by the Hubble Space Telescope at the time, and remains the largest continuous area of sky covered for the purposes of mapping deep space in blank fields, 2.5 times the area of the moon on the sky and 17 times larger than the largest of the CANDELS regions. The COSMOS scientific collaboration that was forged from the initial COSMOS survey is the largest and longest-running extragalactic collaboration, known for its collegiality and openness. The study of galaxies in their environment can be done only with large areas of the sky, larger than a half square degree. More than two million galaxies are detected, spanning 90% of the age of the Universe. The COSMOS collaboration is led by Caitlin Casey, Jeyhan Kartaltepe, and Vernesa Smolcic and involves more than 200 scientists in a dozen countries.

    Important discoveries

    Hubble has helped resolve some long-standing problems in astronomy, while also raising new questions. Some results have required new theories to explain them.

    Age of the universe

    One of Hubble’s primary mission goals was to measure distances to Cepheid variable stars more accurately than ever before, and thus to constrain the value of the Hubble constant, the measure of the rate at which the universe is expanding, which is also related to its age. Before the launch of HST, estimates of the Hubble constant typically had errors of up to 50%, but Hubble measurements of Cepheid variables in the Virgo Cluster and other distant galaxy clusters provided a measured value with an accuracy of ±10%, which is consistent with other more accurate measurements made since Hubble’s launch using other techniques. The estimated age is now about 13.7 billion years, whereas before the Hubble Telescope, scientists predicted an age ranging from 10 to 20 billion years.
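    The link between the Hubble constant and the age of the universe can be illustrated with the crudest possible estimate, the “Hubble time” 1/H0, which assumes a constant expansion rate. As a sketch (the conversion constants are standard values, not from the article):

    ```python
    # The simplest age estimate is the "Hubble time", 1/H0, which ignores how
    # the expansion rate has changed. Illustrative only: the true age depends
    # on the universe's matter and energy content.
    KM_PER_MPC = 3.0857e19    # kilometers in one megaparsec
    SEC_PER_YEAR = 3.156e7

    def hubble_time_gyr(h0_km_s_mpc):
        """Hubble time 1/H0 in billions of years."""
        seconds = KM_PER_MPC / h0_km_s_mpc
        return seconds / SEC_PER_YEAR / 1e9

    # A ~50% spread in H0 (roughly 50-100 km/s/Mpc, as before HST) maps onto
    # the old 10-20-billion-year age range; H0 near 70 gives about 14 Gyr.
    for h0 in (50, 70, 100):
        print(f"H0 = {h0:3d} km/s/Mpc -> Hubble time ~ {hubble_time_gyr(h0):.1f} Gyr")
    ```

    This shows directly why the pre-HST factor-of-two uncertainty in H0 translated into a factor-of-two uncertainty in the universe’s estimated age.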

    Expansion of the universe

    While Hubble helped to refine estimates of the age of the universe, it also cast doubt on theories about its future. Astronomers from the High-z Supernova Search Team and the Supernova Cosmology Project used ground-based telescopes and HST to observe distant supernovae and uncovered evidence that, far from decelerating under the influence of gravity, the expansion of the universe may in fact be accelerating. Three members of these two groups have subsequently been awarded Nobel Prizes for their discovery.

    Saul Perlmutter [The Supernova Cosmology Project] shared the 2006 Shaw Prize in Astronomy, the 2011 Nobel Prize in Physics, and the 2015 Breakthrough Prize in Fundamental Physics with Brian P. Schmidt and Adam Riess [The High-z Supernova Search Team] for providing evidence that the expansion of the universe is accelerating.

    The cause of this acceleration remains poorly understood; the most commonly attributed cause is dark energy.

    Black holes

    The high-resolution spectra and images provided by the HST have been especially well-suited to establishing the prevalence of black holes in the center of nearby galaxies. While it had been hypothesized in the early 1960s that black holes would be found at the centers of some galaxies, and astronomers in the 1980s identified a number of good black hole candidates, work conducted with Hubble shows that black holes are probably common to the centers of all galaxies. The Hubble programs further established that the masses of the nuclear black holes and properties of the galaxies are closely related. The legacy of the Hubble programs on black holes in galaxies is thus to demonstrate a deep connection between galaxies and their central black holes.

    Extending visible wavelength images

    Among the unique windows on the Universe enabled by Hubble are the Hubble Deep Field, Hubble Ultra-Deep Field, and Hubble Extreme Deep Field images, which used Hubble’s unmatched sensitivity at visible wavelengths to create images of small patches of sky that are the deepest ever obtained at optical wavelengths. The images reveal galaxies billions of light-years away, and have generated a wealth of scientific papers, providing a new window on the early Universe. The Wide Field Camera 3 improved the view of these fields in the infrared and ultraviolet, supporting the discovery of some of the most distant objects yet found, such as MACS0647-JD.

    The non-standard object SCP 06F6 was discovered by the Hubble Space Telescope in February 2006.

    On March 3, 2016, researchers using Hubble data announced the discovery of the farthest known galaxy to date: GN-z11. The Hubble observations occurred on February 11, 2015, and April 3, 2015, as part of the CANDELS/GOODS-North surveys.

    Solar System discoveries

    HST has also been used to study objects in the outer reaches of the Solar System, including the dwarf planets Pluto and Eris.

    The collision of Comet Shoemaker-Levy 9 with Jupiter in 1994 was fortuitously timed for astronomers, coming just a few months after Servicing Mission 1 had restored Hubble’s optical performance. Hubble images of the planet were sharper than any taken since the passage of Voyager 2 in 1979, and were crucial in studying the dynamics of the collision of a comet with Jupiter, an event believed to occur once every few centuries.

    During June and July 2012, U.S. astronomers using Hubble discovered Styx, a tiny fifth moon orbiting Pluto.

    In March 2015, researchers announced that measurements of aurorae around Ganymede, one of Jupiter’s moons, revealed that it has a subsurface ocean. Using Hubble to study the motion of its aurorae, the researchers determined that a large saltwater ocean was helping to suppress the interaction between Jupiter’s magnetic field and that of Ganymede. The ocean is estimated to be 100 km (60 mi) deep, trapped beneath a 150 km (90 mi) ice crust.
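    The reported layer thicknesses imply an enormous volume of water. A back-of-envelope check, treating the ocean as a spherical shell: note that Ganymede’s radius (about 2,634 km) and Earth’s ocean volume (about 1.335 × 10^9 km^3) are outside values assumed here, not figures from the article.

    ```python
    import math

    # Rough volume of Ganymede's inferred ocean: a 100 km thick spherical
    # shell buried under 150 km of ice. Ganymede's radius and Earth's ocean
    # volume are assumed reference values, not from the article.
    R_GANYMEDE = 2634.0    # km (assumed)
    ICE, OCEAN = 150.0, 100.0

    r_top = R_GANYMEDE - ICE       # top of the ocean layer
    r_bottom = r_top - OCEAN       # bottom of the ocean layer
    sphere = lambda r: 4.0 / 3.0 * math.pi * r**3
    ocean_km3 = sphere(r_top) - sphere(r_bottom)

    EARTH_OCEANS_KM3 = 1.335e9     # assumed reference value
    ratio = ocean_km3 / EARTH_OCEANS_KM3
    print(f"Ocean volume ~ {ocean_km3:.2e} km^3, ~{ratio:.1f}x Earth's oceans")
    ```

    Under these assumptions the shell holds several times the volume of all of Earth’s oceans, consistent with Ganymede’s reputation as a water-rich world.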

    From June to August 2015, Hubble was used to search for a Kuiper belt object (KBO) target for the New Horizons Kuiper Belt Extended Mission (KEM) when similar searches with ground telescopes failed to find a suitable target.

    National Aeronautics Space Agency(USA)/New Horizons(US) spacecraft.

    This resulted in the discovery of at least five new KBOs, including the eventual KEM target, 486958 Arrokoth, which New Horizons flew past on January 1, 2019.

    In August 2020, taking advantage of a total lunar eclipse, astronomers using NASA’s Hubble Space Telescope detected Earth’s own brand of sunscreen – ozone – in our atmosphere. This method simulates how astronomers and astrobiology researchers will search for evidence of life beyond Earth by observing potential “biosignatures” on exoplanets (planets around other stars).
    Hubble and ALMA image of MACS J1149.5+2223.

    Supernova reappearance

    On December 11, 2015, Hubble captured an image of the first-ever predicted reappearance of a supernova, dubbed “Refsdal”, which was calculated using different mass models of a galaxy cluster whose gravity is warping the supernova’s light. The supernova was previously seen in November 2014 behind galaxy cluster MACS J1149.5+2223 as part of Hubble’s Frontier Fields program. Astronomers spotted four separate images of the supernova in an arrangement known as an “Einstein Cross”.

    The light from the cluster has taken about five billion years to reach Earth, though the supernova exploded some 10 billion years ago. Based on early lens models, a fifth image was predicted to reappear by the end of 2015. The detection of Refsdal’s reappearance in December 2015 served as a unique opportunity for astronomers to test their models of how mass, especially dark matter, is distributed within this galaxy cluster.

    Impact on astronomy

    Many objective measures show the positive impact of Hubble data on astronomy. Over 15,000 papers based on Hubble data have been published in peer-reviewed journals, and countless more have appeared in conference proceedings. Looking at papers several years after their publication, about one-third of all astronomy papers have no citations, while only two percent of papers based on Hubble data have no citations. On average, a paper based on Hubble data receives about twice as many citations as papers based on non-Hubble data. Of the 200 papers published each year that receive the most citations, about 10% are based on Hubble data.

    Although the HST has clearly helped astronomical research, its financial cost has been large. A study on the relative astronomical benefits of different sizes of telescopes found that while papers based on HST data generate 15 times as many citations as a 4 m (13 ft) ground-based telescope such as the William Herschel Telescope, the HST costs about 100 times as much to build and maintain.

    Isaac Newton Group 4.2 meter William Herschel Telescope at Roque de los Muchachos Observatory | Instituto de Astrofísica de Canarias • IAC(ES) on La Palma in the Canary Islands(ES), 2,396 m (7,861 ft)

    Deciding between building ground- versus space-based telescopes is complex. Even before Hubble was launched, specialized ground-based techniques such as aperture masking interferometry had obtained higher-resolution optical and infrared images than Hubble would achieve, though restricted to targets about 10^8 times brighter than the faintest targets observed by Hubble. Since then, advances in “adaptive optics” have extended the high-resolution imaging capabilities of ground-based telescopes to the infrared imaging of faint objects.

    Glistening against the awesome backdrop of the night sky above ESO’s Paranal Observatory, four laser beams project out into the darkness from Unit Telescope 4 UT4 of the VLT, a major asset of the Adaptive Optics system.

    UCO Keck Laser Guide Star Adaptive Optics on two 10-meter Keck Observatory telescopes, Maunakea, Hawaii, USA, altitude 4,207 m (13,802 ft).

    The usefulness of adaptive optics versus HST observations depends strongly on the particular details of the research questions being asked. In the visible bands, adaptive optics can correct only a relatively small field of view, whereas HST can conduct high-resolution optical imaging over a wide field. Only a small fraction of astronomical objects are accessible to high-resolution ground-based imaging; in contrast Hubble can perform high-resolution observations of any part of the night sky, and on objects that are extremely faint.

    Impact on aerospace engineering

    In addition to its scientific results, Hubble has also made significant contributions to aerospace engineering, in particular the performance of systems in low Earth orbit. These insights result from Hubble’s long lifetime on orbit, extensive instrumentation, and return of assemblies to the Earth where they can be studied in detail. In particular, Hubble has contributed to studies of the behavior of graphite composite structures in vacuum, optical contamination from residual gas and human servicing, radiation damage to electronics and sensors, and the long-term behavior of multi-layer insulation. One lesson learned was that gyroscopes assembled using pressurized oxygen to deliver suspension fluid were prone to failure due to electric wire corrosion. Gyroscopes are now assembled using pressurized nitrogen. Another is that optical surfaces in LEO can have surprisingly long lifetimes; Hubble was only expected to last 15 years before the mirror became unusable, but after 14 years there was no measurable degradation. Finally, Hubble servicing missions, particularly those that serviced components not designed for in-space maintenance, have contributed towards the development of new tools and techniques for on-orbit repair.


    All Hubble data is eventually made available via the Mikulski Archive for Space Telescopes at STScI, CADC and ESA/ESAC. Data is usually proprietary—available only to the principal investigator (PI) and astronomers designated by the PI—for twelve months after being taken. The PI can apply to the director of the STScI to extend or reduce the proprietary period in some circumstances.

    Observations made on Director’s Discretionary Time are exempt from the proprietary period, and are released to the public immediately. Calibration data such as flat fields and dark frames are also publicly available straight away. All data in the archive is in the FITS format, which is suitable for astronomical analysis but not for public use. The Hubble Heritage Project processes and releases to the public a small selection of the most striking images in JPEG and TIFF formats.
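    The FITS format mentioned above stores metadata as a header made of fixed 80-character “cards”, each of the form `KEYWORD = value / comment`. The minimal sketch below parses a few hand-written cards to illustrate that layout; in practice astronomers use a full reader such as `astropy.io.fits`, and this toy parser handles only the simplest card types.

    ```python
    # Minimal illustration of the FITS header layout: a header is a sequence
    # of 80-character cards, "KEYWORD = value / comment". A sketch only, not
    # a substitute for a real FITS reader.
    def parse_cards(header_bytes):
        cards = {}
        for i in range(0, len(header_bytes), 80):
            card = header_bytes[i:i + 80].decode("ascii")
            keyword = card[:8].strip()            # keyword lives in bytes 1-8
            if keyword == "END":                  # END card closes the header
                break
            if keyword in ("", "COMMENT", "HISTORY"):
                continue                          # commentary cards carry no value
            value = card[10:].split("/")[0].strip()  # value after "= ", before comment
            cards[keyword] = value
        return cards

    # Three hand-written cards, each padded to the mandatory 80 characters.
    header = b"".join(s.ljust(80).encode("ascii") for s in [
        "SIMPLE  =                    T / conforms to FITS standard",
        "BITPIX  =                  -32 / 32-bit floating point",
        "END",
    ])
    print(parse_cards(header))  # {'SIMPLE': 'T', 'BITPIX': '-32'}
    ```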

    Outreach activities

    It has always been important for the Space Telescope to capture the public’s imagination, given the considerable contribution of taxpayers to its construction and operational costs. After the difficult early years when the faulty mirror severely dented Hubble’s reputation with the public, the first servicing mission allowed its rehabilitation as the corrected optics produced numerous remarkable images.

    Several initiatives have helped to keep the public informed about Hubble activities. In the United States, outreach efforts are coordinated by the Space Telescope Science Institute (STScI) Office for Public Outreach, which was established in 2000 to ensure that U.S. taxpayers saw the benefits of their investment in the space telescope program. To that end, STScI operates the HubbleSite.org website. The Hubble Heritage Project, operating out of the STScI, provides the public with high-quality images of the most interesting and striking objects observed. The Heritage team is composed of amateur and professional astronomers, as well as people with backgrounds outside astronomy, and emphasizes the aesthetic nature of Hubble images. The Heritage Project is granted a small amount of time to observe objects which, for scientific reasons, may not have images taken at enough wavelengths to construct a full-color image.

    Since 1999, the leading Hubble outreach group in Europe has been the Hubble European Space Agency Information Centre (HEIC). This office was established at the Space Telescope European Coordinating Facility in Munich, Germany. HEIC’s mission is to fulfill HST outreach and education tasks for the European Space Agency. The work is centered on the production of news and photo releases that highlight interesting Hubble results and images. These are often European in origin, and so increase awareness of both ESA’s Hubble share (15%) and the contribution of European scientists to the observatory. ESA produces educational material, including a videocast series called Hubblecast designed to share world-class scientific news with the public.

    The Hubble Space Telescope has won two Space Achievement Awards from the Space Foundation, for its outreach activities, in 2001 and 2010.

    A replica of the Hubble Space Telescope is on the courthouse lawn in Marshfield, Missouri, the hometown of namesake Edwin P. Hubble.

    Major Instrumentation

    Hubble WFPC2 no longer in service.

    Wide Field Camera 3 [WFC3]

    National Aeronautics Space Agency(USA)/European Space Agency [Agence spatiale européenne](EU) Hubble Wide Field Camera 3

    Advanced Camera for Surveys [ACS]

    National Aeronautics and Space Administration (US)/European Space Agency [Agence spatiale européenne] (EU) Hubble Space Telescope Advanced Camera for Surveys.

    Cosmic Origins Spectrograph [COS]

    National Aeronautics Space Agency (US) Cosmic Origins Spectrograph.

    The Hubble Space Telescope is a project of international cooperation between NASA and the European Space Agency. NASA’s Goddard Space Flight Center manages the telescope. The Space Telescope Science Institute (STScI), a free-standing science center located on the campus of The Johns Hopkins University and operated by the Association of Universities for Research in Astronomy (AURA) for NASA, conducts Hubble science operations.

    ESA50 Logo large

  • richardmitnick 10:17 pm on January 5, 2022 Permalink | Reply
    Tags: Astronomy, , , , Magnetars: neutron stars with exceptionally strong magnetic fields, Only about 30 magnetars have been identified from approximately 3000 known neutron stars.

    From Scientific American: “Astronomers Report a Monstrous Eruption from a Supermagnetic Star” 

    From Scientific American

    January 5, 2022
    Mindy Weisberger

    A powerful x-ray burst erupts from an intensely magnetic neutron star – a magnetar – in this illustration. Credit: Chris Smith/ NASA’s Goddard Space Flight Center (US)/Universities Space Research Association (US).

    A dense, magnetic star violently erupted and spat out as much energy as a billion suns — and it happened in a fraction of a second, scientists recently reported.

    This type of star, known as a magnetar, is a neutron star with an exceptionally strong magnetic field, and magnetars often flare spectacularly and without warning. But even though magnetars can be thousands of times brighter than our sun, their eruptions are so brief and unpredictable that they’re challenging for astrophysicists to find and study.

    However, researchers recently managed to catch one of these flares and calculate oscillations in the brightness of a magnetar as it erupted. The scientists found that the distant magnetar released as much energy as our sun produces in 100,000 years, and it did so in just 1/10 of a second, according to a statement translated from Spanish.

    A neutron star forms when a massive star collapses at the end of its life. As the star dies in a supernova, protons and electrons in its core are crushed together into an ultradense mass that combines intense gravity with high-speed rotation and powerful magnetic forces, according to NASA. The result, a neutron star, packs approximately 1.3 to 2.5 solar masses — one solar mass is the mass of our sun, or about 330,000 Earths — into a sphere measuring just 12 miles (20 kilometers) in diameter.

    Matter in neutron stars is so densely packed that an amount the size of a sugar cube would weigh more than 1 billion tons (900 million metric tons), and a neutron star’s gravitational pull is so intense that a passing marshmallow would hit the star’s surface with the force of 1,000 hydrogen bombs, according to NASA.
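    The sugar-cube figure follows directly from the mass and size quoted above. A quick check, taking 2 solar masses (within the article’s 1.3 to 2.5 range) in a 20-km-diameter sphere; the numerical value of the solar mass is an assumed standard constant:

    ```python
    import math

    # Checking the "sugar cube" claim: the mass of 1 cm^3 of matter at
    # neutron-star density, using 2 solar masses in a 20 km diameter sphere
    # (mass range and diameter from the article; solar mass value assumed).
    M_SUN_KG = 1.989e30
    mass_kg = 2.0 * M_SUN_KG
    radius_m = 10_000.0                        # half of the 20 km diameter
    volume_m3 = 4.0 / 3.0 * math.pi * radius_m**3
    density = mass_kg / volume_m3              # kg per cubic meter

    sugar_cube_kg = density * 1e-6             # 1 cm^3 = 1e-6 m^3
    sugar_cube_tonnes = sugar_cube_kg / 1000.0
    print(f"Density ~ {density:.1e} kg/m^3; "
          f"sugar cube ~ {sugar_cube_tonnes:.1e} metric tons")
    ```

    The result comes out near 10^9 metric tons per cubic centimeter, in line with the article’s “more than 1 billion tons (900 million metric tons)”.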

    Magnetars are neutron stars with magnetic fields that are 1,000 times stronger than those of other neutron stars, and they are more powerful than any other magnetic object in the universe. Our sun pales in comparison to these bright, dense stars even when they aren’t erupting, study lead author Alberto J. Castro-Tirado, a research professor with The Instituto de Astrofísica de Andalucía – CSIC (ES), said in the statement.

    “Even in an inactive state, magnetars can be 100,000 times more luminous than our sun,” Castro-Tirado said. “But in the case of the flash that we have studied — GRB2001415 — the energy that was released is equivalent to that which our sun radiates in 100,000 years.”
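    The quoted figures imply a staggering peak luminosity. A rough order-of-magnitude check, using the roughly 0.16-second flare duration reported for this event; the solar luminosity value is an assumed standard constant, not from the article:

    ```python
    # Order-of-magnitude check of the flare figures: the Sun's output over
    # 100,000 years, released in the flare's ~0.16 s duration. The solar
    # luminosity value is an assumed standard constant.
    L_SUN_W = 3.828e26      # solar luminosity, watts (assumed)
    SEC_PER_YEAR = 3.156e7

    energy_j = L_SUN_W * 100_000 * SEC_PER_YEAR   # total energy released
    flare_luminosity_w = energy_j / 0.16          # spread over 0.16 seconds
    ratio_to_sun = flare_luminosity_w / L_SUN_W
    print(f"Energy ~ {energy_j:.1e} J; "
          f"peak luminosity ~ {ratio_to_sun:.0e}x the Sun")
    ```

    Compressing 100,000 years of solar output into a fraction of a second makes the flare, for that instant, roughly ten trillion times more luminous than the sun.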

    A “giant flare”

    The magnetar that produced the brief eruption is located in the Sculptor Galaxy, a spiral galaxy about 13 million light-years from Earth, and is “a true cosmic monster,” study co-author Victor Reglero, director of UV’s Image Processing Laboratory, said in the statement. The giant flare was detected on April 15, 2020 by the Atmosphere–Space Interactions Monitor (ASIM) instrument on the International Space Station, researchers reported Dec. 22 in the journal Nature.

    ASIM [Atmosphere-Space Interactions Monitor] on ISS

    Artificial intelligence (AI) in the ASIM pipeline detected the flare, enabling the researchers to analyze that brief, violent energy surge; the flare lasted just 0.16 seconds and then the signal decayed so rapidly that it was nearly indistinguishable from background noise in the data. The study authors spent more than a year analyzing ASIM’s two seconds of data collection, dividing the event into four phases based on the magnetar’s energy output, and then measuring variations in the star’s magnetic field caused by the energy pulse when it was at its peak.

    It’s almost as if the magnetar decided to broadcast its existence “from its cosmic solitude” by shouting into the void of space with the force “of a billion suns,” Reglero said.

    Only about 30 magnetars have been identified from approximately 3,000 known neutron stars, and this is the most distant magnetar flare detected to date. Scientists suspect that eruptions such as this one may be caused by so-called starquakes that disrupt magnetars’ elastic outer layers, and this rare observation could help researchers unravel the stresses that produce magnetars’ energy burps, according to the study.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

  • richardmitnick 9:24 pm on January 4, 2022 Permalink | Reply
    Tags: "How could the Big Bang arise from nothing?", Albert Einstein's Theory of General Relativity proposes that a gravitational singularity may have existed., Astronomy, , , , , In a gravitational singularity even the laws of quantum physics break down and the four fundamental forces (strong nuclear; weak nuclear; electromagnetic & gravity) could be unified as one!, Many-worlds quantum theory gives a new twist on conformal cyclic cosmology: Our Big Bang might be the rebirth of one single quantum multiverse containing infinitely different universes., , No matter how small the chance of something occurring if it has a non-zero chance then it occurs in some quantum parallel world., Other measurement results all play out in other universes in a multiverse effectively cut off from our own., , , Some people believe parallel universes may also be observable in cosmological data as imprints caused by another universe colliding with ours., The "Grand Unification Epoch", The "Planck Epoch", , , The measurement result we see is just one possibility-the one that plays out in our own universe., Three options to the deeper question of how the cycles began: no physical explanation at all; endlessly repeating cycles each a universe in its own right; one single cycle.

    From The Conversation : “How could the Big Bang arise from nothing?” 

    From The Conversation

    January 3, 2022
    Alastair Wilson
    Professor of Philosophy, The University of Birmingham (UK)

    The evolution of the cosmos after the Big Bang. Into what is the universe expanding? Credit: Dana Berry/NASA Goddard.

    READER QUESTION: My understanding is that nothing comes from nothing. For something to exist, there must be material or a component available, and for them to be available, there must be something else available. Now my question: Where did the material come from that created the Big Bang, and what happened in the first instance to create that material? Peter, 80, Australia.

    “The last star will slowly cool and fade away. With its passing, the universe will become once more a void, without light or life or meaning.” So warned the physicist Brian Cox in the recent BBC series Universe. The fading of that last star will only be the beginning of an infinitely long, dark epoch. All matter will eventually be consumed by monstrous black holes, which in their turn will evaporate away into the dimmest glimmers of light. Space will expand ever outwards until even that dim light becomes too spread out to interact. Activity will cease.

    Or will it? Strangely enough, some cosmologists believe a previous, cold dark empty universe like the one which lies in our far future could have been the source of our very own Big Bang.

    The first matter

    But before we get to that, let’s take a look at how “material” – physical matter – first came about. If we are aiming to explain the origins of stable matter made of atoms or molecules, there was certainly none of that around at the Big Bang – nor for hundreds of thousands of years afterwards. We do in fact have a pretty detailed understanding of how the first atoms formed out of simpler particles once conditions cooled down enough for complex matter to be stable, and how these atoms were later fused into heavier elements inside stars. But that understanding doesn’t address the question of whether something came from nothing.

    So let’s think further back. The first long-lived matter particles of any kind were protons and neutrons, which together make up the atomic nucleus.

    The quark structure of the proton. 16 March 2006 Arpad Horvath.

    The quark structure of the neutron. 15 January 2018 Jacek Rybak.

    These came into existence around one ten-thousandth of a second after the Big Bang. Before that point, there was really no material in any familiar sense of the word. But physics lets us keep on tracing the timeline backwards – to physical processes which predate any stable matter.

    This takes us to the so-called “grand unified epoch”.

    The Beginning of the Modern Universe

    The “Grand Unification Epoch” took place from 10^-43 seconds to 10^-36 seconds after our universe was born. Quantum theory allows us to form a clearer picture of this epoch than of the mysterious “Planck Epoch” that preceded it.

    During the “Grand Unification Epoch”, the universe was still extremely hot and incomprehensibly small. However, it had cooled down enough for the force of gravity to separate from the other three fundamental forces. The unified strong nuclear, weak nuclear, and electromagnetic force that existed during this period is referred to as the electronuclear force. The splitting off of gravity from the electronuclear force wasn’t the only milestone of this epoch – this is also when the first elementary particles began to form.

    What Are Elementary Particles?
    Elementary particles are particles which have no substructure – i.e. they are the simplest form of matter possible. Elementary particles are the building blocks of protons, neutrons and all other composite matter, and some, like the electron, exist on their own. Currently, 17 elementary particles have been confirmed; the unconfirmed “graviton” is still in the theoretical category. There are 12 “matter” elementary particles and 5 “force carrier” particles.

    Standard Model of Particle Physics, Quantum Diaries.

    The matter elementary particles make up the physical part of subatomic particles and are referred to as fermions. The two categories of elementary fermions are quarks and leptons. Quarks combine to form particles known as hadrons (more on that later), which include the famous neutrons and protons; the leptons include the electron and other fundamental particles.

    The 5 force carrier particles mediate the strong nuclear, weak nuclear, and electromagnetic interactions. These bosons are the fundamental reason for the attractions and repulsions we experience as forces.
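    As a quick sanity check on the counts above, the confirmed particles can be tallied in a short sketch. The particle names are the standard ones from the Standard Model (not taken from the article), and the 12-plus-5 split follows the article's grouping, which counts the Higgs boson among the five bosons:

```python
# Tally of the 17 confirmed elementary particles of the Standard Model,
# grouped into 12 matter particles (fermions) and 5 bosons as in the text.
fermions = {
    "quarks":  ["up", "down", "charm", "strange", "top", "bottom"],
    "leptons": ["electron", "muon", "tau",
                "electron neutrino", "muon neutrino", "tau neutrino"],
}
bosons = ["photon", "gluon", "W boson", "Z boson", "Higgs boson"]

matter = sum(len(group) for group in fermions.values())  # 12 fermions
total = matter + len(bosons)                             # 17 in all
print(matter, len(bosons), total)  # 12 5 17
```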

    The “Planck Epoch”
    The “Planck Epoch” encompasses the time period from 0 to 10^-43 seconds.
    This extremely small unit of time is aptly referred to as the “Planck time”.
    Not much is truly known about this period of time, but some very interesting hypotheses have been made.

    Albert Einstein’s Theory of General Relativity proposes that a gravitational singularity may have existed. In a gravitational singularity, even the laws of quantum physics break down and the four fundamental forces (strong nuclear, weak nuclear, electromagnetic, & gravity) could be unified as one! This is an extremely odd concept to consider. It also ties into the so-called “Theory of Everything” which states that at high enough energy levels, even gravity will combine back into one unified force with the other three.

    During the “Planck Epoch”, our universe was only about 10^-35 meters wide (VERY small) and about 10^32 degrees Celsius (VERY hot)!
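    These Planck-scale figures can be recovered, to order of magnitude, from the fundamental constants alone. A minimal sketch, using standard CODATA values and the usual definitions of the Planck time, length, and temperature:

```python
import math

# Fundamental constants in SI units (CODATA values)
hbar = 1.054571817e-34  # reduced Planck constant, J*s
G = 6.67430e-11         # Newton's gravitational constant, m^3 kg^-1 s^-2
c = 2.99792458e8        # speed of light, m/s
k_B = 1.380649e-23      # Boltzmann constant, J/K

# Planck units: the scales at which quantum gravity effects dominate
t_planck = math.sqrt(hbar * G / c**5)             # ~5.4e-44 s (order 10^-43)
l_planck = math.sqrt(hbar * G / c**3)             # ~1.6e-35 m
T_planck = math.sqrt(hbar * c**5 / (G * k_B**2))  # ~1.4e32 K

print(f"Planck time:        {t_planck:.2e} s")
print(f"Planck length:      {l_planck:.2e} m")
print(f"Planck temperature: {T_planck:.2e} K")
```

    The results match the sizes and temperatures quoted in the text: the universe's width during the Planck Epoch corresponds to the Planck length and its temperature to the Planck temperature.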


    By now, we are well into the realm of speculative physics, as we can’t produce enough energy in our experiments to probe the sort of processes that were going on at the time. But a plausible hypothesis is that the physical world was made up of a soup of short-lived elementary particles – including quarks, the building blocks of protons and neutrons. There was both matter and “antimatter” in roughly equal quantities: each type of matter particle, such as the quark, has an antimatter “mirror image” companion, which is near identical to itself, differing only in one aspect. However, matter and antimatter annihilate in a flash of energy when they meet, meaning these particles were constantly created and destroyed.

    But how did these particles come to exist in the first place? Quantum field theory tells us that even a vacuum, supposedly corresponding to empty spacetime, is full of physical activity in the form of energy fluctuations. These fluctuations can give rise to particles popping out, only to disappear shortly after. This may sound like a mathematical quirk rather than real physics, but such particles have been spotted in countless experiments.
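    The fleeting lifetimes of these vacuum fluctuations can be estimated from the energy–time uncertainty relation. A rough order-of-magnitude sketch (not a rigorous derivation), assuming the textbook relation ΔE·Δt ≈ ħ/2 and a virtual electron–positron pair as the example:

```python
# Rough lifetime of a virtual electron-positron pair, estimated from
# the energy-time uncertainty relation dE * dt ~ hbar / 2.
hbar = 1.054571817e-34  # reduced Planck constant, J*s
m_e = 9.1093837015e-31  # electron mass, kg
c = 2.99792458e8        # speed of light, m/s

dE = 2 * m_e * c**2   # energy "borrowed" to create the pair, ~1.6e-13 J
dt = hbar / (2 * dE)  # allowed lifetime of the fluctuation, ~3e-22 s
print(f"virtual pair lifetime ~ {dt:.1e} s")
```

    A lifetime of a few times 10^-22 seconds is far too short to observe the pair directly, which is why such particles only reveal themselves indirectly in experiments.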

    The spacetime vacuum state is seething with particles constantly being created and destroyed, apparently “out of nothing”. But perhaps all this really tells us is that the quantum vacuum is (despite its name) a something rather than a nothing. The philosopher David Albert has memorably criticized accounts of the Big Bang which promise to get something from nothing in this way.

    Simulation of quantum vacuum fluctuations in quantum chromodynamics. Credit: Ahmed Neutron/Wikimedia.

    Suppose we ask: where did spacetime itself arise from? Then we can go on turning the clock yet further back, into the truly ancient “Planck epoch” – a period so early in the universe’s history that our best theories of physics break down [above]. This era occurred only one ten-millionth of a trillionth of a trillionth of a trillionth of a second after the Big Bang. At this point, space and time themselves became subject to quantum fluctuations. Physicists ordinarily work separately with quantum mechanics, which rules the microworld of particles, and with general relativity, which applies on large, cosmic scales. But to truly understand the Planck epoch, we need a complete theory of quantum gravity, merging the two.
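    That tongue-twisting figure unpacks cleanly in powers of ten: a ten-millionth contributes 10^-7 and each of the three trillionths contributes 10^-12, recovering the Planck-epoch timescale quoted earlier:

```python
# "one ten-millionth of a trillionth of a trillionth of a trillionth"
# = 10^-7 * (10^-12)^3; working in exponents avoids floating-point noise.
exponent = -7 + 3 * (-12)
print(f"10^{exponent} seconds")  # 10^-43 seconds
```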

    We still don’t have a perfect theory of quantum gravity, but there are attempts – like string theory and loop quantum gravity. In these attempts, ordinary space and time are typically seen as emergent, like the waves on the surface of a deep ocean. What we experience as space and time are the product of quantum processes operating at a deeper, microscopic level – processes that don’t make much sense to us as creatures rooted in the macroscopic world.

    In the “Planck epoch”, our ordinary understanding of space and time breaks down, so we can’t any longer rely on our ordinary understanding of cause and effect either. Despite this, all candidate theories of quantum gravity describe something physical that was going on in the Planck epoch – some quantum precursor of ordinary space and time. But where did that come from?

    Even if causality no longer applies in any ordinary fashion, it might still be possible to explain one component of the “Planck epoch” universe in terms of another. Unfortunately, by now even our best physics fails completely to provide answers. Until we make further progress towards a “theory of everything”, we won’t be able to give any definitive answer. The most we can say with confidence at this stage is that physics has so far found no confirmed instances of something arising from nothing.

    Cycles from almost nothing

    To truly answer the question of how something could arise from nothing, we would need to explain the quantum state of the entire universe at the beginning of the Planck epoch. All attempts to do this remain highly speculative. Some of them appeal to supernatural forces like a “designer”. But other candidate explanations remain within the realm of physics – such as a multiverse, which contains an infinite number of parallel universes, or cyclical models of the universe, being born and reborn again.

    The 2020 Nobel Prize-winning physicist Roger Penrose has proposed one intriguing but controversial model for a cyclical universe dubbed “conformal cyclic cosmology”. Penrose was inspired by an interesting mathematical connection between a very hot, dense, small state of the universe – as it was at the Big Bang – and an extremely cold, empty, expanded state of the universe – as it will be in the far future. His radical theory to explain this correspondence is that those states become mathematically identical when taken to their limits. Paradoxical though it might seem, a total absence of matter might have managed to give rise to all the matter we see around us in our universe.

    Nobel Lecture: Roger Penrose, Nobel Prize in Physics 2020
    34 minutes

    In this view, the Big Bang arises from an almost nothing. That’s what’s left over when all the matter in a universe has been consumed into black holes, which have in turn boiled away into photons – lost in a void. The whole universe thus arises from something that – viewed from another physical perspective – is as close as one can get to nothing at all. But that nothing is still a kind of something. It is still a physical universe, however empty.

    How can the very same state be a cold, empty universe from one perspective and a hot dense universe from another? The answer lies in a complex mathematical procedure called “conformal rescaling”, a geometrical transformation which in effect alters the size of an object but leaves its shape unchanged.
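    In standard notation (the symbols here are the conventional ones, not drawn from the article), a conformal rescaling replaces the spacetime metric by a pointwise multiple of itself:

```latex
\tilde{g}_{ab} = \Omega^2(x)\, g_{ab}, \qquad \Omega(x) > 0
```

    Angles – and with them the “shapes” and light-cone structure of spacetime – are preserved, while all distances are stretched by the factor Ω. In Penrose's scheme, roughly speaking, Ω is taken to zero at the infinitely expanded far future of one aeon so that it can be matched onto the Big Bang of the next.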

    Penrose showed how the cold empty state and the hot dense state could be related by such rescaling so that they match with respect to the shapes of their spacetimes – although not to their sizes. It is, admittedly, difficult to grasp how two objects can be identical in this way when they have different sizes – but Penrose argues that size as a concept ceases to make sense in such extreme physical environments.

    In conformal cyclic cosmology, the direction of explanation goes from old and cold to young and hot: the hot dense state exists because of the cold empty state. But this “because” is not the familiar one – of a cause followed in time by its effect. It is not only size that ceases to be relevant in these extreme states: time does too. The cold empty state and the hot dense state are in effect located on different timelines. The cold empty state would continue on forever from the perspective of an observer in its own temporal geometry, but the hot dense state it gives rise to effectively inhabits a new timeline all its own.

    It may help to understand the hot dense state as produced from the cold empty state in some non-causal way. Perhaps we should say that the hot dense state emerges from, or is grounded in, or realised by the cold, empty state. These are distinctively metaphysical ideas which have been explored by philosophers of science extensively, especially in the context of quantum gravity where ordinary cause and effect seem to break down. At the limits of our knowledge, physics and philosophy become hard to disentangle.

    Experimental evidence?

    Conformal cyclic cosmology offers some detailed, albeit speculative, answers to the question of where our Big Bang came from. But even if Penrose’s vision is vindicated by the future progress of cosmology, we might think that we still wouldn’t have answered a deeper philosophical question – a question about where physical reality itself came from. How did the whole system of cycles come about? Then we finally end up with the pure question of why there is something rather than nothing – one of the biggest questions of metaphysics.

    But our focus here is on explanations which remain within the realm of physics. There are three broad answers to the deeper question of how the cycles began. It could have no physical explanation at all. Or there could be endlessly repeating cycles, each a universe in its own right, with the initial quantum state of each universe explained by some feature of the universe before. Or there could be one single cycle and one single repeating universe, with the beginning of that cycle explained by some feature of its own end. The latter two approaches avoid the need for any uncaused events – and this gives them a distinctive appeal. Nothing would be left unexplained by physics.

    Ongoing cycles of distinct universes in conformal cyclic cosmology. Roger Penrose.

    Penrose envisages a sequence of endless new cycles for reasons partly linked to his own preferred interpretation of quantum theory. In quantum mechanics, a physical system exists in a superposition of many different states at the same time, and only “picks one” randomly, when we measure it. For Penrose, each cycle involves random quantum events turning out a different way – meaning each cycle will differ from those before and after it. This is actually good news for experimental physicists, because it might allow us to glimpse the old universe that gave rise to ours through faint traces, or anomalies, in the leftover radiation from the Big Bang seen by the Planck satellite.

    Penrose and his collaborators believe they may have spotted these traces already [MNRAS], attributing patterns in the Planck data [CMB] to radiation from supermassive black holes in the previous universe. However, their claimed observations have been challenged by other physicists [Journal of Cosmology and Astroparticle Physics] and the jury remains out.

    CMB per European Space Agency(EU) Planck.

    Endless new cycles are key to Penrose’s own vision. But there is a natural way to convert conformal cyclic cosmology from a multi-cycle to a one-cycle form. Then physical reality consists in a single cycling around through the Big Bang to a maximally empty state in the far future – and then around again to the very same Big Bang, giving rise to the very same universe all over again.

    This latter possibility is consistent with another interpretation of quantum mechanics, dubbed the many-worlds interpretation. The many-worlds interpretation tells us that each time we measure a system that is in superposition, this measurement doesn’t randomly select a state. Instead, the measurement result we see is just one possibility – the one that plays out in our own universe. The other measurement results all play out in other universes in a multiverse effectively cut off from our own. So no matter how small the chance of something occurring, if that chance is non-zero then it occurs in some quantum parallel world. There are people just like you out there in other worlds who have won the lottery, or have been swept up into the clouds by a freak typhoon, or have spontaneously ignited, or have done all three simultaneously.

    Some people believe such parallel universes may also be observable [MNRAS] in cosmological data as imprints caused by another universe colliding with ours.

    Many-worlds quantum theory gives a new twist on conformal cyclic cosmology, though not one that Penrose agrees with. Our Big Bang might be the rebirth of one single quantum multiverse containing infinitely many different universes all occurring together. Everything possible happens – then it happens again and again and again.

    An ancient myth

    For a philosopher of science, Penrose’s vision is fascinating. It opens up new possibilities for explaining the Big Bang, taking our explanations beyond ordinary cause and effect. It is therefore a great test case for exploring the different ways physics can explain our world. It deserves more attention from philosophers.

    For a lover of myth, Penrose’s vision is beautiful. In Penrose’s preferred multi-cycle form, it promises endless new worlds born from the ashes of their ancestors. In its one-cycle form, it is a striking modern re-invocation of the ancient idea of the ouroboros, or world-serpent. In Norse mythology, the serpent Jörmungandr is a child of Loki, a clever trickster, and the giant Angrboda. Jörmungandr consumes its own tail, and the circle created sustains the balance of the world. But the ouroboros myth has been documented all over the world – including as far back as ancient Egypt.

    Ouroboros on the tomb of Tutankhamun. Credit: Djehouty/Wikimedia.

    The ouroboros of the one cyclic universe is majestic indeed. It contains within its belly our own universe, as well as every one of the weird and wonderful alternative possible universes allowed by quantum physics – and at the point where its head meets its tail, it is completely empty yet also coursing with energy at temperatures of a hundred thousand million billion trillion degrees Celsius. Even Loki, the shapeshifter, would be impressed.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The Conversation launched as a pilot project in October 2014. It is an independent source of news and views from the academic and research community, delivered direct to the public.
    Our team of professional editors work with university and research institute experts to unlock their knowledge for use by the wider public.
    Access to independent, high quality, authenticated, explanatory journalism underpins a functioning democracy. Our aim is to promote better understanding of current affairs and complex issues. And hopefully allow for a better quality of public discourse and conversation.
