Tagged: Live Science (US)

  • richardmitnick 2:41 pm on March 8, 2022 Permalink | Reply
    Tags: "What is the Higgs boson?", Live Science (US)   

    From Live Science (US): “What is the Higgs boson?” 

    From Live Science (US)

    Andrew May

    The elusive particle that physicists knew had to exist, but took half a century to find.

    Conceptual illustration of the Higgs particle (orange, top and bottom) being produced by colliding two protons. The protons are each composed of three quarks (green and blue) held together by the strong nuclear force carried by gluons (white squiggly lines). The Higgs boson, long expected to exist according to theory, was finally revealed in proton-proton collisions conducted using the Large Hadron Collider (LHC) at CERN in Switzerland, 2012. (Image credit: Mark Garlick/Science Photo Library via Getty Images)

    The Higgs boson is one of the 17 elementary particles that make up the Standard Model of particle physics, which is scientists’ best theory about the behaviors of the universe’s most basic building blocks.

    The Higgs boson particle was the last of these to be discovered, after a search lasting five decades, and it plays such a fundamental role in subatomic physics that it is sometimes referred to as the “God particle.” Here, we take a closer look at the Higgs boson from its theoretical origins, through its high-profile discovery in 2012, to its continuing significance today.

    Higgs field theory

    One of the most basic properties of matter is “mass” — a quantity that determines how much resistance an object offers when a force is applied to it, according to the U.S. Department of Energy. It’s the m in Einstein’s famous equation E = mc^2, where E is energy. Since c is just a constant — the speed of light — then what that equation tells us is that, except for a change of measurement units, energy and mass are the same thing. Some 99% of the mass of any real-world object, such as a human body, comes from the binding energy holding elementary particles together inside atoms. The remaining 1% of the mass, however, is intrinsic to those elementary particles. The question is: How do they get their mass?
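    To make the equivalence concrete, here is a quick back-of-the-envelope calculation (an illustrative sketch, not from the article; the constants are standard SI/CODATA values) of what E = mc^2 implies:

    ```python
    # Mass-energy equivalence: E = m * c**2.
    c = 299_792_458.0  # speed of light in m/s (exact SI value)

    # Energy locked up in one gram of mass:
    m = 1e-3                      # kg
    E = m * c**2                  # joules
    print(f"{E:.3e} J")           # ≈ 8.988e+13 J, roughly 21 kilotons of TNT

    # The proton's rest mass expressed as energy:
    m_proton = 1.67262192e-27     # kg
    eV = 1.602176634e-19          # joules per electron volt
    print(m_proton * c**2 / eV / 1e6)  # ≈ 938 MeV
    ```

    The same arithmetic run in reverse is how physicists can quote particle masses in energy units.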

    In the 1960s, theoretical physicists, including Peter Higgs of the University of Edinburgh, came up with a possible answer, according to CERN, the European Organization for Nuclear Research.

    The mechanism they proposed involves an invisible but all-pervading field, later dubbed the “Higgs field.” It is through interactions with this field that elementary particles acquire their mass.

    Different particles have different masses because they’re not all affected in the same way by the Higgs field. CERN scientist Stefano Meroli explains this with the analogy of a person (the elementary particle) moving through a group of journalists (the Higgs field). If the person is a celebrity they will have to battle their way through, like a high-mass particle, but if they’re unknown to the journalists they will pass through easily — like a low-mass particle.

    The Higgs boson explained

    Higgs boson diagram. (Image credit: Nasky via Shutterstock.)

    Peter Higgs submitted his original paper about the Higgs field (at the time unnamed) to the journal Physical Review Letters on Aug. 31, 1964, according to the University of Edinburgh. On the same day, another paper by Belgian physicists Francois Englert and Robert Brout was published describing essentially the same theory. When this was brought to his attention, Higgs modified his own paper to add another prediction — that there should be a new elementary particle associated with the Higgs field. It belonged to a class of particles called bosons and would itself have an extremely high mass. This was the particle that came to be known as the Higgs boson.

    Higgs’ theory was an elegant explanation for the mass of elementary particles, but was it correct? The most obvious way to verify it was to observe a Higgs boson, but that was never going to be easy. For one thing, the Higgs boson was expected to be highly unstable, disintegrating into other particles in a tiny fraction of a second, according to physicist Brian Greene writing for Smithsonian Magazine. And its huge mass — by subatomic standards — meant that it could be created only in super-high energy collisions. When CERN built the world’s most powerful particle accelerator, the Large Hadron Collider (LHC) [above], one of its primary motivations was to find the Higgs boson.

    [Part of this story is missing. I will do my best to explain.

    The hunt for the Higgs originally began at The DOE's Fermi National Accelerator Laboratory in Batavia, Illinois. The accelerator was the Tevatron. The construction of the Laboratory was guided by Robert Rathbun Wilson, an American physicist known for his work on the Manhattan Project during World War II. The Tevatron was completed in 1983 at a cost of $120 million, and significant upgrade investments were made during its active years of 1983–2011. The main achievement of the Tevatron was the discovery in 1995 of the top quark—the last fundamental fermion predicted by the Standard Model of particle physics. On July 2, 2012, scientists of the CDF and DØ collider experiment teams at Fermilab announced the findings from the analysis of around 500 trillion collisions produced by the Tevatron since 2001, and found that the existence of the suspected Higgs boson was highly likely with a confidence of 99.8%, later improved to over 99.9%.

    The Tevatron ceased operations on 30 September 2011, due to budget cuts and because of the completion of the LHC, which began operations in early 2010 and is far more powerful (planned energies were two 7 TeV beams at the LHC compared to 1 TeV at the Tevatron). The main ring of the Tevatron will probably be reused in future experiments, and its components may be transferred to other particle accelerators.

    The Tevatron was a synchrotron that accelerated protons and antiprotons in a 6.28 km (3.90 mi) ring to energies of up to 1 TeV, hence its name. Although its collisions were in principle energetic enough to produce a Higgs boson, the Tevatron never delivered enough collisions for a definitive discovery. The Laboratory for a large part of its history was led by Nobel Laureate Leon Lederman.

    Leon M. Lederman, Nobel laureate, Director of FNAL after R.R. Wilson.

    Fermilab has gone on to become the world center for neutrino science with the LBNF/DUNE experiment.

    With the understanding that the Tevatron was unlikely to find the Higgs boson, the U.S. Department of Energy began construction of the Superconducting Super Collider [SSC] in the vicinity of Waxahachie, Texas.

    Its planned ring circumference was 87.1 kilometers (54.1 mi), with an energy of 20 TeV per proton, far higher than the 7 TeV projected for the LHC, and it was set to be the world's largest and most energetic particle accelerator. Congress, however, canceled the project in 1993 after construction had begun. Had the SSC been built, the Higgs would likely have been detected in the United States, and high-energy physics might still be centered in the United States rather than in Europe. To be sure, high-energy physics is very much alive in the U.S. There are 600 ATLAS scientists working at The DOE's Brookhaven National Laboratory on Long Island, New York, and 1,000 CMS scientists at Fermilab. Also, superconducting magnets for the LHC were built at three locations in the U.S.: Fermilab, Brookhaven, and The DOE's Lawrence Berkeley National Laboratory in California.

    Higgs boson discovery

    Physicists measure the mass of particles in units called electron volts (eV). For example, the mass of a proton — the nucleus of a hydrogen atom — is 938 million eV. When the LHC started operation in 2008, the only thing scientists knew for certain about the Higgs was that its mass had to be greater than 114 billion eV, according to CERN — otherwise it would have been found by the previous generation of particle accelerators. Fortunately, the LHC proved equal to the task, churning out an increasing number of measurements indicating something tantalizingly Higgs-like around 125 billion eV. By July 4, 2012, there was no longer any doubt, and a formal announcement was made to great media fanfare. Almost 50 years after it was first proposed, the Higgs boson had finally been found.
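    Those eV figures can be translated into everyday mass units by running E = mc^2 in reverse; a minimal sketch (illustrative, with standard CODATA constants; the function name is my own):

    ```python
    c = 299_792_458.0       # speed of light, m/s
    eV = 1.602176634e-19    # joules per electron volt

    def ev_to_kg(rest_energy_ev):
        """Mass in kilograms of a particle whose rest energy is given in eV."""
        return rest_energy_ev * eV / c**2

    print(ev_to_kg(938e6))   # proton, 938 million eV  -> ≈ 1.67e-27 kg
    print(ev_to_kg(125e9))   # Higgs, 125 billion eV   -> ≈ 2.23e-25 kg
    ```

    Even at 125 billion eV, the Higgs is astonishingly light in absolute terms — the point of the eV convention is that these numbers are far easier to compare than strings of 25 leading zeros.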

    Sadly, one of the three scientists behind the original prediction, Robert Brout, had died just over a year earlier. However, the two surviving physicists, Francois Englert and Peter Higgs, were awarded the 2013 Nobel Prize in physics “for the theoretical discovery of a mechanism that contributes to our understanding of the origin of mass of subatomic particles, and which recently was confirmed through the discovery of the predicted fundamental particle,” according to the Nobel Foundation.

    The LHC has four main detectors: ATLAS, CMS, ALICE, and LHCb.

    The Higgs boson was finally observed by the ATLAS and CMS detectors.

    The God particle?

    Outside the world of high-energy physics, the Higgs boson is often referred to by the evocative and catchy name of the “God particle.” This was the title of a 1993 book on the subject by Leon Lederman and Dick Teresi — chosen, the authors say, because the publisher wouldn’t let them call it the “Goddamn Particle.” Much as it’s loved by the media, the “God particle” moniker is disliked by many scientists, according to CERN.

    “God particle” or not, the discovery of the Higgs boson was enormously significant. It was the final piece of the Standard Model jigsaw, and it may lead scientists to an understanding of further mysteries — such as the nature of dark matter — that lie beyond it, according to Pete Wilton of The University of Oxford (UK).

    Higgs boson today

    In its own right, too, the Higgs boson is continuing to reveal more of its mysteries to scientists at CERN and elsewhere. One way to learn more about the way it works — and whether it truly is responsible for the mass of all the other elementary particles — is by observing the different ways the Higgs boson decays into other particles. It typically decays into quarks, but it’s also been found to decay into a completely different class of particle called muons. This is a strong indication that muons, like quarks, really do get their mass via the Higgs mechanism.

    The Higgs boson may have even more surprises in store for us. For example, the particle that’s been discovered — which was close to the lower end of the expected mass range — may not be the only Higgs out there. There may be a whole family of Higgs bosons, some much more massive than the one we currently know about. On the other hand, recent research suggests that, if the Higgs had a significantly greater mass than it does, the universe might have undergone catastrophic collapse before it had a chance to get going. This may indeed have been the fate of other parts of the multiverse, but thankfully not our own. If that theory is correct, we can thank the Higgs boson for our very existence.


    The Higgs boson. CERN. https://home.cern/science/physics/higgs-boson

    CERN answers queries from social media. CERN. https://home.cern/resources/faqs/cern-answers-queries-social-media

    DOE Explains…the Higgs Boson. U.S. Department of Energy. https://www.energy.gov/science/doe-explainsthe-higgs-boson

    Wilton, Pete. (2015, July) Exploring the Higgs boson’s dark side. University of Oxford. https://www.ox.ac.uk/news/science-blog/exploring-higgs-bosons-dark-side

    The Nobel Prize in Physics. (2013) The Nobel Foundation. https://www.nobelprize.org/prizes/physics/2013/summary/

    Peter Higgs and the Higgs Boson. (2014, March) The University of Edinburgh. https://www.ph.ed.ac.uk/higgs/brief-history

    Greene, Brian. How the Higgs Boson Was Found. (2013, July) https://www.smithsonianmag.com/science-nature/how-the-higgs-boson-was-found-4723520/

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

  • richardmitnick 9:54 am on February 20, 2022 Permalink | Reply
    Tags: "Three galaxies are tearing each other apart in stunning new Hubble telescope image", Live Science (US), The galaxy cluster IC 2431   

    From Hubblesite and From ESA/Hubble via Live Science (US): “Three galaxies are tearing each other apart in stunning new Hubble telescope image” 

    National Aeronautics and Space Administration(US)/European Space Agency [La Agencia Espacial Europea] [Agence spatiale européenne] [Europäische Weltraumorganisation](EU) Hubble Space Telescope.

    From Hubblesite and From ESA/Hubble


    Live Science (US)

    Brandon Specktor

    This twisty-turny collision is a preview of what awaits our galaxy.

    Three galaxies collide in this stunning new Hubble image. (Image credit:
    Hubble Space Telescope – NASA/ESA.)

    Corkscrewing through the cosmos, three distant galaxies collide in a stunning new image captured by NASA’s Hubble Space Telescope.

    This cosmic crash is known as a triple galaxy merger, in which three galaxies slowly draw each other nearer and tear each other apart with their competing gravitational forces. Mergers like these are common throughout the universe, and all large galaxies — including our own, the Milky Way — owe their size to violent mergers like this one.

    As chaotic as they seem, mergers like these are more about creation than destruction. As gas from the three neighboring galaxies collides and condenses, a vast sea of material from which new stars will emerge is assembled at the center of the newly unified galaxy.

    Existing stars, meanwhile, will survive the crash mostly unscathed: while the gravitational tug-of-war among the three galaxies will warp the orbital paths of many existing stars, so much space exists between those stars that relatively few of them are likely to collide, Live Science previously reported.

    The galaxy cluster seen above is called IC 2431, located about 681 million light-years from Earth in the constellation Cancer, according to NASA. Astronomers detected the merger thanks to a citizen science project called Galaxy Zoo, which invited more than 100,000 volunteers to classify images of 900,000 galaxies captured by the Hubble telescope that had never been thoroughly examined. The crowdsourced project accomplished in 175 days what would have taken astronomers years to achieve, according to NASA, and the initiative has already resulted in a number of strange and exciting discoveries, like this one.

    Studying galactic mergers can help astronomers better understand the Milky Way’s past and future. The Milky Way is thought to have gobbled up more than a dozen galaxies over the past 12 billion years, including during the exceptionally named Gaia sausage merger, Live Science previously reported.

    Meanwhile, our galaxy appears on track to combine with the nearby Andromeda galaxy about 4.5 billion years from now. The merger will totally alter the night sky over Earth but will likely leave the solar system unharmed, according to NASA.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition
    The NASA/ESA Hubble Space Telescope is a space telescope that was launched into low Earth orbit in 1990 and remains in operation. It was not the first space telescope, but it is one of the largest and most versatile, renowned both as a vital research tool and as a public relations boon for astronomy. The Hubble telescope is named after astronomer Edwin Hubble and is one of NASA’s Great Observatories, along with the NASA Compton Gamma Ray Observatory, the Chandra X-ray Observatory, and the NASA Spitzer Infrared Space Telescope.

    National Aeronautics Space Agency (USA) Compton Gamma Ray Observatory
    National Aeronautics and Space Administration(US) Chandra X-ray telescope(US).
    National Aeronautics and Space Administration(US) Spitzer Infrared Space Telescope, no longer in service; launched in 2003 and retired on 30 January 2020.

    Edwin Hubble at Caltech Palomar Samuel Oschin 48 inch Telescope(US) Credit: Emilio Segre Visual Archives/AIP/SPL.

    Edwin Hubble looking through the 100-inch Hooker telescope at Mount Wilson in Southern California(US) in 1929, the year he discovered the universe is expanding. Credit: Margaret Bourke-White/Time & Life Pictures/Getty Images.

    Hubble features a 2.4-meter (7.9 ft) mirror, and its four main instruments observe in the ultraviolet, visible, and near-infrared regions of the electromagnetic spectrum. Hubble’s orbit outside the distortion of Earth’s atmosphere allows it to capture extremely high-resolution images with substantially lower background light than ground-based telescopes. It has recorded some of the most detailed visible light images, allowing a deep view into space. Many Hubble observations have led to breakthroughs in astrophysics, such as determining the rate of expansion of the universe.

    The Hubble telescope was built by the United States space agency National Aeronautics Space Agency (US) with contributions from the European Space Agency [La Agencia Espacial Europea] [Agence spatiale européenne][Europäische Weltraumorganisation](EU). The Space Telescope Science Institute (STScI) selects Hubble's targets and processes the resulting data, while the NASA Goddard Space Flight Center(US) controls the spacecraft. Space telescopes were proposed as early as 1923. Hubble was funded in the 1970s with a proposed launch in 1983, but the project was beset by technical delays, budget problems, and the 1986 Challenger disaster. It was finally launched by Space Shuttle Discovery in 1990, but its main mirror had been ground incorrectly, resulting in spherical aberration that compromised the telescope's capabilities. The optics were corrected to their intended quality by a servicing mission in 1993.

    Hubble is the only telescope designed to be maintained in space by astronauts. Five Space Shuttle missions have repaired, upgraded, and replaced systems on the telescope, including all five of the main instruments. The fifth mission was initially canceled on safety grounds following the Columbia disaster (2003), but NASA administrator Michael D. Griffin approved it, and it was completed in 2009. The telescope was still operating as of April 24, 2020, its 30th anniversary, and could last until 2030–2040. One successor to the Hubble telescope is the National Aeronautics Space Agency(USA)/European Space Agency [La Agencia Espacial Europea] [Agence spatiale européenne](EU)/Canadian Space Agency(CA) Webb Infrared Space Telescope, launched December 25, 2021, ten years late.
    National Aeronautics Space Agency(USA)/European Space Agency [La Agencia Espacial Europea] [Agence spatiale européenne][Europäische Weltraumorganisation](EU)/Canadian Space Agency [Agence Spatiale Canadienne](CA) James Webb Space Telescope, annotated.

    Proposals and precursors

    In 1923, Hermann Oberth—considered a father of modern rocketry, along with Robert H. Goddard and Konstantin Tsiolkovsky—published Die Rakete zu den Planetenräumen (“The Rocket into Planetary Space“), which mentioned how a telescope could be propelled into Earth orbit by a rocket.

    The history of the Hubble Space Telescope can be traced back as far as 1946, to astronomer Lyman Spitzer’s paper entitled Astronomical advantages of an extraterrestrial observatory. In it, he discussed the two main advantages that a space-based observatory would have over ground-based telescopes. First, the angular resolution (the smallest separation at which objects can be clearly distinguished) would be limited only by diffraction, rather than by the turbulence in the atmosphere, which causes stars to twinkle, known to astronomers as seeing. At that time ground-based telescopes were limited to resolutions of 0.5–1.0 arcseconds, compared to a theoretical diffraction-limited resolution of about 0.05 arcsec for an optical telescope with a mirror 2.5 m (8.2 ft) in diameter. Second, a space-based telescope could observe infrared and ultraviolet light, which are strongly absorbed by the atmosphere.
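    The 0.05-arcsecond figure follows from the Rayleigh criterion for a circular aperture, θ ≈ 1.22 λ/D; here is a quick illustrative check (the 500 nm wavelength is my assumption for visible light, and the function name is my own):

    ```python
    import math

    ARCSEC_PER_RADIAN = math.degrees(1) * 3600  # ≈ 206265

    def diffraction_limit_arcsec(wavelength_m, aperture_m):
        """Rayleigh-criterion angular resolution of a circular aperture."""
        return 1.22 * wavelength_m / aperture_m * ARCSEC_PER_RADIAN

    # 500 nm visible light on a 2.5 m mirror:
    print(round(diffraction_limit_arcsec(500e-9, 2.5), 3))  # ≈ 0.05 arcsec
    ```

    The same formula shows why ground-based seeing of 0.5–1.0 arcseconds was the real bottleneck: the atmosphere, not the optics, limited resolution by a factor of ten or more.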

    Spitzer devoted much of his career to pushing for the development of a space telescope. In 1962, a report by the U.S. National Academy of Sciences recommended development of a space telescope as part of the space program, and in 1965 Spitzer was appointed as head of a committee given the task of defining scientific objectives for a large space telescope.

    Space-based astronomy had begun on a very small scale following World War II, as scientists made use of developments that had taken place in rocket technology. The first ultraviolet spectrum of the Sun was obtained in 1946, and the National Aeronautics and Space Administration (US) launched the Orbiting Solar Observatory (OSO) to obtain UV, X-ray, and gamma-ray spectra in 1962.
    National Aeronautics Space Agency(USA) Orbiting Solar Observatory

    An orbiting solar telescope was launched in 1962 by the United Kingdom as part of the Ariel space program, and in 1966 NASA launched the first Orbiting Astronomical Observatory (OAO) mission. OAO-1’s battery failed after three days, terminating the mission. It was followed by OAO-2, which carried out ultraviolet observations of stars and galaxies from its launch in 1968 until 1972, well beyond its original planned lifetime of one year.

    The OSO and OAO missions demonstrated the important role space-based observations could play in astronomy. In 1968, NASA developed firm plans for a space-based reflecting telescope with a mirror 3 m (9.8 ft) in diameter, known provisionally as the Large Orbiting Telescope or Large Space Telescope (LST), with a launch slated for 1979. These plans emphasized the need for crewed maintenance missions to the telescope to ensure such a costly program had a lengthy working life, and the concurrent development of plans for the reusable Space Shuttle indicated that the technology to allow this was soon to become available.

    Quest for funding

    The continuing success of the OAO program encouraged increasingly strong consensus within the astronomical community that the LST should be a major goal. In 1970, NASA established two committees, one to plan the engineering side of the space telescope project, and the other to determine the scientific goals of the mission. Once these had been established, the next hurdle for NASA was to obtain funding for the instrument, which would be far more costly than any Earth-based telescope. The U.S. Congress questioned many aspects of the proposed budget for the telescope and forced cuts in the budget for the planning stages, which at the time consisted of very detailed studies of potential instruments and hardware for the telescope. In 1974, public spending cuts led to Congress deleting all funding for the telescope project.
    In response, a nationwide lobbying effort was coordinated among astronomers. Many astronomers met congressmen and senators in person, and large-scale letter-writing campaigns were organized. The National Academy of Sciences published a report emphasizing the need for a space telescope, and eventually the Senate agreed to half the budget that had originally been approved by Congress.

    The funding issues led to something of a reduction in the scale of the project, with the proposed mirror diameter reduced from 3 m to 2.4 m, both to cut costs and to allow a more compact and effective configuration for the telescope hardware. A proposed precursor 1.5 m (4.9 ft) space telescope to test the systems to be used on the main satellite was dropped, and budgetary concerns also prompted collaboration with the European Space Agency. ESA agreed to provide funding and supply one of the first-generation instruments for the telescope, as well as the solar cells that would power it, and staff to work on the telescope in the United States, in return for European astronomers being guaranteed at least 15% of the observing time on the telescope. Congress eventually approved funding of US$36 million for 1978, and the design of the LST began in earnest, aiming for a launch date of 1983. In 1983 the telescope was named after Edwin Hubble, who confirmed one of the greatest scientific discoveries of the 20th century, made by Georges Lemaitre, that the universe is expanding.

    Construction and engineering

    Once the Space Telescope project had been given the go-ahead, work on the program was divided among many institutions. NASA Marshall Space Flight Center (MSFC) was given responsibility for the design, development, and construction of the telescope, while Goddard Space Flight Center was given overall control of the scientific instruments and ground-control center for the mission. MSFC commissioned the optics company Perkin-Elmer to design and build the Optical Telescope Assembly (OTA) and Fine Guidance Sensors for the space telescope. Lockheed was commissioned to construct and integrate the spacecraft in which the telescope would be housed.

    Optical Telescope Assembly

    Optically, the HST is a Cassegrain reflector of Ritchey–Chrétien design, as are most large professional telescopes. This design, with two hyperbolic mirrors, is known for good imaging performance over a wide field of view, with the disadvantage that the mirrors have shapes that are hard to fabricate and test. The mirror and optical systems of the telescope determine the final performance, and they were designed to exacting specifications. Optical telescopes typically have mirrors polished to an accuracy of about a tenth of the wavelength of visible light, but the Space Telescope was to be used for observations from the visible through the ultraviolet (shorter wavelengths) and was specified to be diffraction limited to take full advantage of the space environment. Therefore, its mirror needed to be polished to an accuracy of 10 nanometers, or about 1/65 of the wavelength of red light. On the long wavelength end, the OTA was not designed with optimum IR performance in mind—for example, the mirrors are kept at stable (and warm, about 15 °C) temperatures by heaters. This limits Hubble’s performance as an infrared telescope.

    Perkin-Elmer intended to use custom-built and extremely sophisticated computer-controlled polishing machines to grind the mirror to the required shape. However, in case their cutting-edge technology ran into difficulties, NASA demanded that PE sub-contract to Kodak to construct a back-up mirror using traditional mirror-polishing techniques. (The team of Kodak and Itek also bid on the original mirror polishing work. Their bid called for the two companies to double-check each other’s work, which would have almost certainly caught the polishing error that later caused such problems.) The Kodak mirror is now on permanent display at the National Air and Space Museum. An Itek mirror built as part of the effort is now used in the 2.4 m telescope at the Magdalena Ridge Observatory.

    Construction of the Perkin-Elmer mirror began in 1979, starting with a blank manufactured by Corning from their ultra-low expansion glass. To keep the mirror’s weight to a minimum it consisted of top and bottom plates, each one inch (25 mm) thick, sandwiching a honeycomb lattice. Perkin-Elmer simulated microgravity by supporting the mirror from the back with 130 rods that exerted varying amounts of force. This ensured the mirror’s final shape would be correct and to specification when finally deployed. Mirror polishing continued until May 1981. NASA reports at the time questioned Perkin-Elmer’s managerial structure, and the polishing began to slip behind schedule and over budget. To save money, NASA halted work on the back-up mirror and put the launch date of the telescope back to October 1984. The mirror was completed by the end of 1981; it was washed using 2,400 US gallons (9,100 L) of hot, deionized water and then received a reflective coating of 65 nm-thick aluminum and a protective coating of 25 nm-thick magnesium fluoride.

    Doubts continued to be expressed about Perkin-Elmer’s competence on a project of this importance, as their budget and timescale for producing the rest of the OTA continued to inflate. In response to a schedule described as “unsettled and changing daily”, NASA postponed the launch date of the telescope until April 1985. Perkin-Elmer’s schedules continued to slip at a rate of about one month per quarter, and at times delays reached one day for each day of work. NASA was forced to postpone the launch date until March and then September 1986. By this time, the total project budget had risen to US$1.175 billion.

    Spacecraft systems

    The spacecraft in which the telescope and instruments were to be housed was another major engineering challenge. It would have to withstand frequent passages from direct sunlight into the darkness of Earth’s shadow, which would cause major changes in temperature, while being stable enough to allow extremely accurate pointing of the telescope. A shroud of multi-layer insulation keeps the temperature within the telescope stable and surrounds a light aluminum shell in which the telescope and instruments sit. Within the shell, a graphite-epoxy frame keeps the working parts of the telescope firmly aligned. Because graphite composites are hygroscopic, there was a risk that water vapor absorbed by the truss while in Lockheed’s clean room would later be expressed in the vacuum of space, resulting in the telescope’s instruments being covered by ice. To reduce that risk, a nitrogen gas purge was performed before launching the telescope into space.

    While construction of the spacecraft in which the telescope and instruments would be housed proceeded somewhat more smoothly than the construction of the OTA, Lockheed still experienced some budget and schedule slippage, and by the summer of 1985, construction of the spacecraft was 30% over budget and three months behind schedule. An MSFC report said Lockheed tended to rely on NASA directions rather than take their own initiative in the construction.

    Computer systems and data processing

    The two initial, primary computers on the HST were the 1.25 MHz DF-224 system, built by Rockwell Autonetics, which contained three redundant CPUs, and two redundant NSSC-1 (NASA Standard Spacecraft Computer, Model 1) systems, developed by Westinghouse and GSFC using diode–transistor logic (DTL). A co-processor for the DF-224 was added during Servicing Mission 1 in 1993, which consisted of two redundant strings of an Intel-based 80386 processor with an 80387 math co-processor. The DF-224 and its 386 co-processor were replaced by a 25 MHz Intel-based 80486 processor system during Servicing Mission 3A in 1999. The new computer is 20 times faster, with six times more memory, than the DF-224 it replaced. It increases throughput by moving some computing tasks from the ground to the spacecraft and saves money by allowing the use of modern programming languages.

    Additionally, some of the science instruments and components had their own embedded microprocessor-based control systems. The MATs (Multiple Access Transponder) components, MAT-1 and MAT-2, utilize Hughes Aircraft CDP1802CD microprocessors. The Wide Field and Planetary Camera (WFPC) also utilized an RCA 1802 microprocessor (or possibly the older 1801 version). The WFPC-1 was replaced by the WFPC-2 [below] during Servicing Mission 1 in 1993, which was then replaced by the Wide Field Camera 3 (WFC3) [below] during Servicing Mission 4 in 2009.

    Initial instruments

    When launched, the HST carried five scientific instruments: the Wide Field and Planetary Camera (WF/PC), Goddard High Resolution Spectrograph (GHRS), High Speed Photometer (HSP), Faint Object Camera (FOC) and the Faint Object Spectrograph (FOS). WF/PC was a high-resolution imaging device primarily intended for optical observations. It was built by NASA JPL-Caltech (US), and incorporated a set of 48 filters isolating spectral lines of particular astrophysical interest. The instrument contained eight charge-coupled device (CCD) chips divided between two cameras, each using four CCDs. Each CCD has a resolution of 0.64 megapixels. The wide field camera (WFC) covered a large angular field at the expense of resolution, while the planetary camera (PC) took images at a longer effective focal length than the WF chips, giving it a greater magnification.

    The GHRS was a spectrograph designed to operate in the ultraviolet. It was built by the Goddard Space Flight Center and could achieve a spectral resolution of 90,000. Also optimized for ultraviolet observations were the FOC and FOS, which were capable of the highest spatial resolution of any instruments on Hubble. Rather than CCDs, these three instruments used photon-counting digicons as their detectors. The FOC was constructed by ESA, while the University of California, San Diego (US), and Martin Marietta Corporation built the FOS.

    The final instrument was the HSP, designed and built at the University of Wisconsin–Madison (US). It was optimized for visible and ultraviolet light observations of variable stars and other astronomical objects varying in brightness. It could take up to 100,000 measurements per second with a photometric accuracy of about 2% or better.

    HST’s guidance system can also be used as a scientific instrument. Its three Fine Guidance Sensors (FGS) are primarily used to keep the telescope accurately pointed during an observation, but can also be used to carry out extremely accurate astrometry; measurements accurate to within 0.0003 arcseconds have been achieved.
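
    To give a sense of scale for that astrometric accuracy, here is a short sketch of how parallax precision translates into distance: d [pc] = 1/p [arcsec], with fractional distance error roughly σ_p/p. The 3-milliarcsecond example star is hypothetical, chosen only for illustration.

    ```python
    # Illustrative only: what 0.0003-arcsecond astrometric accuracy (the FGS
    # figure quoted above) implies for parallax distances.
    SIGMA_P = 0.0003  # FGS astrometric accuracy, arcseconds

    def parallax_distance_pc(parallax_arcsec):
        """Distance in parsecs from a trigonometric parallax: d = 1/p."""
        return 1.0 / parallax_arcsec

    def fractional_distance_error(parallax_arcsec, sigma=SIGMA_P):
        """First-order fractional distance uncertainty, sigma_d/d ~ sigma_p/p."""
        return sigma / parallax_arcsec

    # A hypothetical star with a 3-milliarcsecond parallax:
    p = 0.003
    print(parallax_distance_pc(p))       # ~333 pc
    print(fractional_distance_error(p))  # ~10% distance uncertainty
    ```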

    Ground support

    The Space Telescope Science Institute (STScI) is responsible for the scientific operation of the telescope and the delivery of data products to astronomers. STScI is operated by the Association of Universities for Research in Astronomy (US) (AURA) and is physically located in Baltimore, Maryland on the Homewood campus of Johns Hopkins University (US), one of the 39 U.S. universities and seven international affiliates that make up the AURA consortium. STScI was established in 1981 after something of a power struggle between NASA and the scientific community at large. NASA had wanted to keep this function in-house, but scientists wanted it to be based in an academic establishment. The Space Telescope European Coordinating Facility (ST-ECF), established at Garching bei München, Germany, in 1984, provided similar support for European astronomers until 2011, when these activities were moved to the European Space Astronomy Centre.

    One rather complex task that falls to STScI is scheduling observations for the telescope. Hubble is in a low-Earth orbit to enable servicing missions, but this means most astronomical targets are occulted by the Earth for slightly less than half of each orbit. Observations cannot take place when the telescope passes through the South Atlantic Anomaly due to elevated radiation levels, and there are also sizable exclusion zones around the Sun (precluding observations of Mercury), Moon and Earth. The solar avoidance angle is about 50°, to keep sunlight from illuminating any part of the OTA. Earth and Moon avoidance keeps bright light out of the FGSs, and keeps scattered light from entering the instruments. If the FGSs are turned off, the Moon and Earth can be observed. Earth observations were used very early in the program to generate flat-fields for the WFPC1 instrument. There is a so-called continuous viewing zone (CVZ), at roughly 90° to the plane of Hubble’s orbit, in which targets are not occulted for long periods.
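
    The avoidance-zone bookkeeping above amounts to angular-separation tests against each bright body. Here is a toy sketch of that check, not flight scheduling software, using the roughly 50° solar avoidance angle quoted above; Moon and Earth-limb avoidance would be tested the same way with different thresholds.

    ```python
    import math

    # Coordinates are (RA, Dec) in degrees. The 50-degree threshold is the
    # solar avoidance angle described in the text; Mercury never strays more
    # than about 28 degrees from the Sun, which is why it cannot be observed.
    SOLAR_AVOIDANCE_DEG = 50.0

    def angular_separation_deg(ra1, dec1, ra2, dec2):
        """Great-circle separation between two sky positions, in degrees."""
        ra1, dec1, ra2, dec2 = map(math.radians, (ra1, dec1, ra2, dec2))
        cos_sep = (math.sin(dec1) * math.sin(dec2)
                   + math.cos(dec1) * math.cos(dec2) * math.cos(ra1 - ra2))
        return math.degrees(math.acos(max(-1.0, min(1.0, cos_sep))))

    def target_observable(target_ra, target_dec, sun_ra, sun_dec):
        """True if the target lies outside the solar avoidance zone."""
        return angular_separation_deg(target_ra, target_dec,
                                      sun_ra, sun_dec) > SOLAR_AVOIDANCE_DEG

    # With the Sun at (0, 0), a target 30 degrees away is blocked,
    # while one 120 degrees away can be observed:
    print(target_observable(30.0, 0.0, 0.0, 0.0))   # False
    print(target_observable(120.0, 0.0, 0.0, 0.0))  # True
    ```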

    Challenger disaster, delays, and eventual launch

    By January 1986, the planned launch date of October looked feasible, but the Challenger explosion brought the U.S. space program to a halt, grounding the Shuttle fleet and forcing the launch of Hubble to be postponed for several years. The telescope had to be kept in a clean room, powered up and purged with nitrogen, until a launch could be rescheduled. This costly situation (about US$6 million per month) pushed the overall costs of the project even higher. This delay did allow time for engineers to perform extensive tests, swap out a possibly failure-prone battery, and make other improvements. Furthermore, the ground software needed to control Hubble was not ready in 1986, and was barely ready by the 1990 launch.

    Eventually, following the resumption of shuttle flights in 1988, the launch of the telescope was scheduled for 1990. On April 24, 1990, Space Shuttle Discovery successfully launched it during the STS-31 mission.

    From its original total cost estimate of about US$400 million, the telescope cost about US$4.7 billion by the time of its launch. Hubble’s cumulative costs were estimated to be about US$10 billion in 2010, twenty years after launch.

    List of Hubble instruments

    Hubble accommodates five science instruments at a given time, plus the Fine Guidance Sensors, which are mainly used for aiming the telescope but are occasionally used for scientific astrometry measurements. Early instruments were replaced with more advanced ones during the Shuttle servicing missions. COSTAR was a corrective optics device rather than a science instrument, but occupied one of the five instrument bays.
    Since the final servicing mission in 2009, the four active instruments have been ACS, COS, STIS and WFC3. NICMOS is kept in hibernation, but may be revived if WFC3 were to fail in the future.

    Advanced Camera for Surveys (ACS; 2002–present)
    Cosmic Origins Spectrograph (COS; 2009–present)
    Corrective Optics Space Telescope Axial Replacement (COSTAR; 1993–2009)
    Faint Object Camera (FOC; 1990–2002)
    Faint Object Spectrograph (FOS; 1990–1997)
    Fine Guidance Sensor (FGS; 1990–present)
    Goddard High Resolution Spectrograph (GHRS/HRS; 1990–1997)
    High Speed Photometer (HSP; 1990–1993)
    Near Infrared Camera and Multi-Object Spectrometer (NICMOS; 1997–present, hibernating since 2008)
    Space Telescope Imaging Spectrograph (STIS; 1997–present (non-operative 2004–2009))
    Wide Field and Planetary Camera (WFPC; 1990–1993)
    Wide Field and Planetary Camera 2 (WFPC2; 1993–2009)
    Wide Field Camera 3 (WFC3; 2009–present)

    Of the former instruments, three (COSTAR, FOS and WFPC2) are displayed in the Smithsonian National Air and Space Museum. The FOC is in the Dornier Museum, Germany. The HSP is in the Space Place at the University of Wisconsin–Madison. The first WFPC was dismantled, and some components were then re-used in WFC3.

    Flawed mirror

    Within weeks of the launch of the telescope, the returned images indicated a serious problem with the optical system. Although the first images appeared to be sharper than those of ground-based telescopes, Hubble failed to achieve a final sharp focus and the best image quality obtained was drastically lower than expected. Images of point sources spread out over a radius of more than one arcsecond, instead of having a point spread function (PSF) concentrated within a circle 0.1 arcseconds (485 nrad) in diameter, as had been specified in the design criteria.

    Analysis of the flawed images revealed that the primary mirror had been polished to the wrong shape. Although it was believed to be one of the most precisely figured optical mirrors ever made, smooth to about 10 nanometers, the outer perimeter was too flat by about 2200 nanometers (about 1⁄450 mm or 1⁄11000 inch). This difference was catastrophic, introducing severe spherical aberration, a flaw in which light reflecting off the edge of a mirror focuses on a different point from the light reflecting off its center.

    The effect of the mirror flaw on scientific observations depended on the particular observation—the core of the aberrated PSF was sharp enough to permit high-resolution observations of bright objects, and spectroscopy of point sources was affected only through a sensitivity loss. However, the loss of light to the large, out-of-focus halo severely reduced the usefulness of the telescope for faint objects or high-contrast imaging. This meant nearly all the cosmological programs were essentially impossible, since they required observation of exceptionally faint objects. This led politicians to question NASA’s competence, scientists to rue the cost which could have gone to more productive endeavors, and comedians to make jokes about NASA and the telescope: in the 1991 comedy The Naked Gun 2½: The Smell of Fear, in a scene where historical disasters are displayed, Hubble is pictured with RMS Titanic and LZ 129 Hindenburg. Nonetheless, during the first three years of the Hubble mission, before the optical corrections, the telescope still carried out a large number of productive observations of less demanding targets. The error was well characterized and stable, enabling astronomers to partially compensate for the defective mirror by using sophisticated image processing techniques such as deconvolution.
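
    The deconvolution mentioned above exploits the fact that the blur was known and stable. A minimal one-dimensional Richardson-Lucy sketch in plain Python follows; it is an illustration of the principle only, not the STScI pipeline, and the 3-tap PSF is invented.

    ```python
    def convolve_same(signal, kernel):
        """Direct 'same'-size convolution with an odd-length kernel."""
        half = len(kernel) // 2
        out = []
        for i in range(len(signal)):
            acc = 0.0
            for j, k in enumerate(kernel):
                idx = i + j - half
                if 0 <= idx < len(signal):
                    acc += signal[idx] * k
            out.append(acc)
        return out

    def richardson_lucy(observed, psf, iterations=50):
        """Iteratively sharpen `observed`, given the known blur `psf`."""
        estimate = list(observed)  # start from the blurred data
        mirror = psf[::-1]
        for _ in range(iterations):
            blur = convolve_same(estimate, psf)
            ratio = [o / (b + 1e-12) for o, b in zip(observed, blur)]
            correction = convolve_same(ratio, mirror)
            estimate = [e * c for e, c in zip(estimate, correction)]
        return estimate

    # A point source blurred by a known, symmetric PSF ...
    psf = [0.25, 0.5, 0.25]
    point = [0.0] * 15
    point[7] = 1.0
    observed = convolve_same(point, psf)
    # ... is progressively re-concentrated by the iterations:
    restored = richardson_lucy(observed, psf)
    print(max(observed), max(restored))  # the restored peak is much sharper
    ```

    Note that the multiplicative update conserves total flux, which is one reason Richardson-Lucy is favored for photon-counting astronomical data.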

    Origin of the problem

    A commission headed by Lew Allen, director of the Jet Propulsion Laboratory, was established to determine how the error could have arisen. The Allen Commission found that a reflective null corrector, a testing device used to achieve a properly shaped non-spherical mirror, had been incorrectly assembled—one lens was out of position by 1.3 mm (0.051 in). During the initial grinding and polishing of the mirror, Perkin-Elmer analyzed its surface with two conventional refractive null correctors. However, for the final manufacturing step (figuring), they switched to the custom-built reflective null corrector, designed explicitly to meet very strict tolerances. The incorrect assembly of this device resulted in the mirror being ground very precisely but to the wrong shape. A few final tests, using the conventional null correctors, correctly reported spherical aberration. But these results were dismissed, thus missing the opportunity to catch the error, because the reflective null corrector was considered more accurate.

    The commission blamed the failings primarily on Perkin-Elmer. Relations between NASA and the optics company had been severely strained during the telescope construction, due to frequent schedule slippage and cost overruns. NASA found that Perkin-Elmer did not review or supervise the mirror construction adequately, did not assign its best optical scientists to the project (as it had for the prototype), and in particular did not involve the optical designers in the construction and verification of the mirror. While the commission heavily criticized Perkin-Elmer for these managerial failings, NASA was also criticized for not picking up on the quality control shortcomings, such as relying totally on test results from a single instrument.

    Design of a solution

    Many feared that Hubble would be abandoned. The design of the telescope had always incorporated servicing missions, and astronomers immediately began to seek potential solutions to the problem that could be applied at the first servicing mission, scheduled for 1993. While Kodak had ground a back-up mirror for Hubble, it would have been impossible to replace the mirror in orbit, and too expensive and time-consuming to bring the telescope back to Earth for a refit. Instead, the fact that the mirror had been ground so precisely to the wrong shape led to the design of new optical components with exactly the same error but in the opposite sense, to be added to the telescope at the servicing mission, effectively acting as “spectacles” to correct the spherical aberration.

    The first step was a precise characterization of the error in the main mirror. Working backwards from images of point sources, astronomers determined that the conic constant of the mirror as built was −1.01390±0.0002, instead of the intended −1.00230. The same number was also derived by analyzing the null corrector used by Perkin-Elmer to figure the mirror, as well as by analyzing interferograms obtained during ground testing of the mirror.
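
    The two conic constants above are enough to reconstruct the scale of the flaw. The sag of a conic mirror with vertex radius of curvature R and conic constant K is z(r) = r²/(R(1 + √(1 − (1 + K)r²/R²))). R is not quoted in the text; a value of about 11.04 m (an f/2.3 primary of 2.4 m aperture) is assumed here for illustration.

    ```python
    import math

    R = 11.04           # vertex radius of curvature, m (assumed, f/2.3 primary)
    K_BUILT = -1.01390  # conic constant as manufactured
    K_INTENDED = -1.00230

    def sag(r, K, R=R):
        """Surface height of a conic mirror at radial distance r from the axis."""
        return r * r / (R * (1.0 + math.sqrt(1.0 - (1.0 + K) * r * r / (R * R))))

    edge = 1.2  # mirror edge radius, m (2.4 m aperture)
    error_m = abs(sag(edge, K_BUILT) - sag(edge, K_INTENDED))
    print(error_m * 1e9)  # roughly the ~2200 nm edge error quoted earlier
    ```

    That the assumed R reproduces the roughly 2200 nm figure from the "Flawed mirror" section is a useful consistency check on the numbers.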

    Because of the way the HST’s instruments were designed, two different sets of correctors were required. The design of the Wide Field and Planetary Camera 2, already planned to replace the existing WF/PC, included relay mirrors to direct light onto the four separate charge-coupled device (CCD) chips making up its two cameras. An inverse error built into their surfaces could completely cancel the aberration of the primary. However, the other instruments lacked any intermediate surfaces that could be figured in this way, and so required an external correction device.

    The Corrective Optics Space Telescope Axial Replacement (COSTAR) system was designed to correct the spherical aberration for light focused at the FOC, FOS, and GHRS. It consists of two mirrors in the light path with one ground to correct the aberration. To fit the COSTAR system onto the telescope, one of the other instruments had to be removed, and astronomers selected the High Speed Photometer to be sacrificed. By 2002, all the original instruments requiring COSTAR had been replaced by instruments with their own corrective optics. COSTAR was removed and returned to Earth in 2009 where it is exhibited at the National Air and Space Museum. The area previously used by COSTAR is now occupied by the Cosmic Origins Spectrograph.


    NASA COSTAR installation

    Servicing missions and new instruments

    Servicing Mission 1

    The first Hubble servicing mission was scheduled for 1993 before the mirror problem was discovered. It assumed greater importance, as the astronauts would need to do extensive work to install corrective optics; failure would have resulted in either abandoning Hubble or accepting its permanent disability. Other components failed before the mission, causing the repair cost to rise to $500 million (not including the cost of the shuttle flight). A successful repair would also help demonstrate the viability of building Space Station Alpha.

    STS-49 in 1992 demonstrated the difficulty of space work. While its rescue of Intelsat 603 received praise, the astronauts had taken possibly reckless risks in doing so. Neither the rescue nor the unrelated assembly of prototype space station components occurred as the astronauts had trained, causing NASA to reassess its planning and training, including for the Hubble repair. The agency assigned Story Musgrave, who had worked on satellite repair procedures since 1976, and six other experienced astronauts to the mission, including two from STS-49. The first mission director since Project Apollo would coordinate a crew with 16 previous shuttle flights among them. The astronauts were trained to use about a hundred specialized tools.

    Heat had been a problem on prior spacewalks, which occurred in sunlight; Hubble needed to be repaired out of sunlight. Musgrave discovered during vacuum training, seven months before the mission, that spacesuit gloves did not sufficiently protect against the cold of space. After STS-57 confirmed the issue in orbit, NASA quickly changed equipment, procedures, and flight plan. Seven total mission simulations occurred before launch, the most thorough preparation in shuttle history. No complete Hubble mockup existed, so the astronauts studied many separate models (including one at the Smithsonian) and mentally combined their varying and contradictory details. Servicing Mission 1 flew aboard Endeavour in December 1993, and involved the installation of several instruments and other equipment over ten days.

    Most importantly, the High-Speed Photometer was replaced with the COSTAR corrective optics package, and WFPC was replaced with the Wide Field and Planetary Camera 2 (WFPC2) with an internal optical correction system. The solar arrays and their drive electronics were also replaced, as well as four gyroscopes in the telescope pointing system, two electrical control units and other electrical components, and two magnetometers. The onboard computers were upgraded with added coprocessors, and Hubble’s orbit was boosted.

    On January 13, 1994, NASA declared the mission a complete success and showed the first sharper images. The mission was one of the most complex performed up until that date, involving five long extra-vehicular activity periods. Its success was a boon for NASA, as well as for the astronomers who now had a more capable space telescope.

    Servicing Mission 2

    Servicing Mission 2, flown by Discovery in February 1997, replaced the GHRS and the FOS with the Space Telescope Imaging Spectrograph (STIS) and the Near Infrared Camera and Multi-Object Spectrometer (NICMOS), replaced an Engineering and Science Tape Recorder with a new Solid State Recorder, and repaired thermal insulation. NICMOS contained a heat sink of solid nitrogen to reduce the thermal noise from the instrument, but shortly after it was installed, an unexpected thermal expansion resulted in part of the heat sink coming into contact with an optical baffle. This led to an increased warming rate for the instrument and reduced its original expected lifetime of 4.5 years to about two years.

    Servicing Mission 3A

    Servicing Mission 3A, flown by Discovery, took place in December 1999, and was a split-off from Servicing Mission 3 after three of the six onboard gyroscopes had failed. The fourth failed a few weeks before the mission, rendering the telescope incapable of performing scientific observations. The mission replaced all six gyroscopes, replaced a Fine Guidance Sensor and the computer, installed a Voltage/temperature Improvement Kit (VIK) to prevent battery overcharging, and replaced thermal insulation blankets.

    Servicing Mission 3B

    Servicing Mission 3B flown by Columbia in March 2002 saw the installation of a new instrument, with the FOC (which, except for the Fine Guidance Sensors when used for astrometry, was the last of the original instruments) being replaced by the Advanced Camera for Surveys (ACS). This meant COSTAR was no longer required, since all new instruments had built-in correction for the main mirror aberration. The mission also revived NICMOS by installing a closed-cycle cooler and replaced the solar arrays for the second time, providing 30 percent more power.

    Servicing Mission 4

    Plans called for Hubble to be serviced in February 2005, but the Columbia disaster in 2003, in which the orbiter disintegrated on re-entry into the atmosphere, had wide-ranging effects on the Hubble program. NASA Administrator Sean O’Keefe decided all future shuttle missions had to be able to reach the safe haven of the International Space Station should in-flight problems develop. As no shuttles were capable of reaching both HST and the space station during the same mission, future crewed service missions were canceled. This decision was criticized by numerous astronomers who felt Hubble was valuable enough to merit the human risk. HST’s planned successor, the James Webb Space Telescope (JWST), as of 2004 was not expected to launch until at least 2011. A gap in space-observing capabilities between a decommissioning of Hubble and the commissioning of a successor was of major concern to many astronomers, given the significant scientific impact of HST. The consideration that JWST will not be located in low Earth orbit, and therefore cannot be easily upgraded or repaired in the event of an early failure, only made concerns more acute. On the other hand, many astronomers felt strongly that servicing Hubble should not take place if the expense were to come from the JWST budget.

    In January 2004, O’Keefe said he would review his decision to cancel the final servicing mission to HST, due to public outcry and requests from Congress for NASA to look for a way to save it. The National Academy of Sciences convened an official panel, which recommended in July 2004 that the HST should be preserved despite the apparent risks. Their report urged “NASA should take no actions that would preclude a space shuttle servicing mission to the Hubble Space Telescope”. In August 2004, O’Keefe asked Goddard Space Flight Center to prepare a detailed proposal for a robotic service mission. These plans were later canceled, the robotic mission being described as “not feasible”. In late 2004, several Congressional members, led by Senator Barbara Mikulski, held public hearings and carried on a fight with much public support (including thousands of letters from school children across the U.S.) to get the Bush Administration and NASA to reconsider the decision to drop plans for a Hubble rescue mission.

    The nomination in April 2005 of a new NASA Administrator, Michael D. Griffin, changed the situation, as Griffin stated he would consider a crewed servicing mission. Soon after his appointment Griffin authorized Goddard to proceed with preparations for a crewed Hubble maintenance flight, saying he would make the final decision after the next two shuttle missions. In October 2006 Griffin gave the final go-ahead, and the 11-day mission by Atlantis was scheduled for October 2008. Hubble’s main data-handling unit failed in September 2008, halting all reporting of scientific data until its back-up was brought online on October 25, 2008. Since a failure of the backup unit would leave the HST helpless, the service mission was postponed to incorporate a replacement for the primary unit.

    Servicing Mission 4 (SM4), flown by Atlantis in May 2009, was the last scheduled shuttle mission for HST. SM4 installed the replacement data-handling unit, repaired the ACS and STIS systems, installed improved nickel hydrogen batteries, and replaced other components including all six gyroscopes. SM4 also installed two new observation instruments—Wide Field Camera 3 (WFC3) and the Cosmic Origins Spectrograph (COS)—and the Soft Capture and Rendezvous System, which will enable the future rendezvous, capture, and safe disposal of Hubble by either a crewed or robotic mission. Except for the ACS’s High Resolution Channel, which could not be repaired and was disabled, the work accomplished during SM4 rendered the telescope fully functional.

    Major projects

    Cosmic Assembly Near-infrared Deep Extragalactic Legacy Survey [CANDELS]

    The survey “aims to explore galactic evolution in the early Universe, and the very first seeds of cosmic structure at less than one billion years after the Big Bang.” The CANDELS project site describes the survey’s goals as the following:

    The Cosmic Assembly Near-IR Deep Extragalactic Legacy Survey is designed to document the first third of galactic evolution from z = 8 to 1.5 via deep imaging of more than 250,000 galaxies with WFC3/IR and ACS. It will also find the first Type Ia SNe beyond z > 1.5 and establish their accuracy as standard candles for cosmology. Five premier multi-wavelength sky regions are selected; each has multi-wavelength data from Spitzer and other facilities, and has extensive spectroscopy of the brighter galaxies. The use of five widely separated fields mitigates cosmic variance and yields statistically robust and complete samples of galaxies down to 10⁹ solar masses out to z ~ 8.

    Frontier Fields program

    The program, officially named Hubble Deep Fields Initiative 2012, aims to advance the knowledge of early galaxy formation by studying high-redshift galaxies in blank fields with the help of gravitational lensing to see the “faintest galaxies in the distant universe”. The Frontier Fields web page describes the goals of the program as being:

    To reveal hitherto inaccessible populations of z = 5–10 galaxies that are ten to fifty times fainter intrinsically than any presently known
    To solidify our understanding of the stellar masses and star formation histories of sub-L* galaxies at the earliest times
    To provide the first statistically meaningful morphological characterization of star forming galaxies at z > 5
    To find z > 8 galaxies stretched out enough by cluster lensing to discern internal structure and/or magnified enough by cluster lensing for spectroscopic follow-up.

    Cosmic Evolution Survey (COSMOS)

    The Cosmic Evolution Survey (COSMOS) is an astronomical survey designed to probe the formation and evolution of galaxies as a function of both cosmic time (redshift) and the local galaxy environment. The survey covers a two-square-degree equatorial field with spectroscopy and with X-ray to radio imaging by most of the major space-based telescopes and a number of large ground-based telescopes, making it a key focus region of extragalactic astrophysics. COSMOS was launched in 2006 as the largest project pursued by the Hubble Space Telescope at the time, and it remains the largest continuous area of sky covered for the purposes of mapping deep space in blank fields: 2.5 times the area of the Moon on the sky and 17 times larger than the largest of the CANDELS regions. The COSMOS scientific collaboration that grew out of the initial survey is the largest and longest-running extragalactic collaboration, known for its collegiality and openness. Studying galaxies in their environment requires large areas of sky, larger than half a square degree. More than two million galaxies have been detected, spanning 90% of the age of the Universe. The COSMOS collaboration is led by Caitlin Casey, Jeyhan Kartaltepe, and Vernesa Smolcic and involves more than 200 scientists in a dozen countries.

    Important discoveries

    Hubble has helped resolve some long-standing problems in astronomy, while also raising new questions. Some results have required new theories to explain them.

    Age of the universe

    Among its primary mission goals was measuring distances to Cepheid variable stars more accurately than ever before, and thus constraining the value of the Hubble constant, the measure of the rate at which the universe is expanding, which is also related to its age. Before the launch of HST, estimates of the Hubble constant typically had errors of up to 50%, but Hubble measurements of Cepheid variables in the Virgo Cluster and other distant galaxy clusters provided a measured value with an accuracy of ±10%, which is consistent with other more accurate measurements made since Hubble’s launch using other techniques. The estimated age is now about 13.7 billion years; before the Hubble Telescope, scientists’ estimates ranged from 10 to 20 billion years.
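
    The link between the Hubble constant and the age of the universe can be made concrete with the "Hubble time", 1/H₀, which sets the rough age scale (the true age depends on the expansion history). A value of H₀ = 70 km/s/Mpc, close to what HST measurements converged on, is assumed here for illustration.

    ```python
    # Back-of-the-envelope: convert 1/H0 from (km/s/Mpc)^-1 into years.
    KM_PER_MPC = 3.0857e19     # kilometers in one megaparsec
    SECONDS_PER_YEAR = 3.156e7

    def hubble_time_years(h0_km_s_mpc):
        """The Hubble time 1/H0 in years, a rough age scale for the universe."""
        h0_per_second = h0_km_s_mpc / KM_PER_MPC  # H0 in s^-1
        return 1.0 / h0_per_second / SECONDS_PER_YEAR

    print(hubble_time_years(70.0) / 1e9)  # ~14 billion years
    ```

    The result lands near the quoted 13.7-billion-year age, which shows why a ±10% Hubble constant pinned the age down so much better than the earlier factor-of-two spread.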

    Expansion of the universe

    While Hubble helped to refine estimates of the age of the universe, it also cast doubt on theories about its future. Astronomers from the High-z Supernova Search Team and the Supernova Cosmology Project used ground-based telescopes and HST to observe distant supernovae and uncovered evidence that, far from decelerating under the influence of gravity, the expansion of the universe may in fact be accelerating. Three members of these two groups have subsequently been awarded Nobel Prizes for their discovery.

    Saul Perlmutter [The Supernova Cosmology Project] shared the 2006 Shaw Prize in Astronomy, the 2011 Nobel Prize in Physics, and the 2015 Breakthrough Prize in Fundamental Physics with Brian P. Schmidt and Adam Riess [The High-z Supernova Search Team] for providing evidence that the expansion of the universe is accelerating.

    The cause of this acceleration remains poorly understood; the most commonly proposed explanation is dark energy.

    Black holes

    The high-resolution spectra and images provided by the HST have been especially well-suited to establishing the prevalence of black holes in the center of nearby galaxies. While it had been hypothesized in the early 1960s that black holes would be found at the centers of some galaxies, and astronomers in the 1980s identified a number of good black hole candidates, work conducted with Hubble shows that black holes are probably common to the centers of all galaxies. The Hubble programs further established that the masses of the nuclear black holes and properties of the galaxies are closely related. The legacy of the Hubble programs on black holes in galaxies is thus to demonstrate a deep connection between galaxies and their central black holes.

    Extending visible wavelength images

    The Hubble Deep Field, Hubble Ultra-Deep Field, and Hubble Extreme Deep Field images offer a unique window on the Universe: they used Hubble’s unmatched sensitivity at visible wavelengths to image small patches of sky to the greatest depths ever obtained at optical wavelengths. The images reveal galaxies billions of light-years away and have generated a wealth of scientific papers, providing a new view of the early Universe. The Wide Field Camera 3 improved the view of these fields in the infrared and ultraviolet, supporting the discovery of some of the most distant objects yet observed, such as MACS0647-JD.

    The non-standard object SCP 06F6 was discovered by the Hubble Space Telescope in February 2006.

    On March 3, 2016, researchers using Hubble data announced the discovery of the farthest known galaxy to date: GN-z11. The Hubble observations occurred on February 11, 2015, and April 3, 2015, as part of the CANDELS/GOODS-North surveys.

    Solar System discoveries

    HST has also been used to study objects in the outer reaches of the Solar System, including the dwarf planets Pluto and Eris.

    The collision of Comet Shoemaker-Levy 9 with Jupiter in 1994 was fortuitously timed for astronomers, coming just a few months after Servicing Mission 1 had restored Hubble’s optical performance. Hubble images of the planet were sharper than any taken since the passage of Voyager 2 in 1979, and were crucial in studying the dynamics of the collision of a comet with Jupiter, an event believed to occur once every few centuries.

    During June and July 2012, U.S. astronomers using Hubble discovered Styx, a tiny fifth moon orbiting Pluto.

    In March 2015, researchers announced that measurements of aurorae around Ganymede, one of Jupiter’s moons, revealed that it has a subsurface ocean. Using Hubble to study the motion of its aurorae, the researchers determined that a large saltwater ocean was helping to suppress the interaction between Jupiter’s magnetic field and that of Ganymede. The ocean is estimated to be 100 km (60 mi) deep, trapped beneath a 150 km (90 mi) ice crust.

    From June to August 2015, Hubble was used to search for a Kuiper belt object (KBO) target for the New Horizons Kuiper Belt Extended Mission (KEM) when similar searches with ground telescopes failed to find a suitable target.

    NASA New Horizons spacecraft.

    This resulted in the discovery of at least five new KBOs, including the eventual KEM target, 486958 Arrokoth, which New Horizons flew past on January 1, 2019.

    In August 2020, taking advantage of a total lunar eclipse, astronomers using NASA’s Hubble Space Telescope detected ozone, Earth’s own brand of sunscreen, in our atmosphere. This method simulates how astronomers and astrobiology researchers will search for evidence of life beyond Earth by observing potential “biosignatures” on exoplanets (planets around other stars).

    Hubble and ALMA image of MACS J1149.5+2223.

    Supernova reappearance

    On December 11, 2015, Hubble captured an image of the first-ever predicted reappearance of a supernova, dubbed “Refsdal”, which was calculated using different mass models of a galaxy cluster whose gravity is warping the supernova’s light. The supernova was previously seen in November 2014 behind galaxy cluster MACS J1149.5+2223 as part of Hubble’s Frontier Fields program. Astronomers spotted four separate images of the supernova in an arrangement known as an “Einstein Cross”.

    The light from the cluster has taken about five billion years to reach Earth, though the supernova exploded some 10 billion years ago. Based on early lens models, a fifth image was predicted to reappear by the end of 2015. The detection of Refsdal’s reappearance in December 2015 served as a unique opportunity for astronomers to test their models of how mass, especially dark matter, is distributed within this galaxy cluster.

    Impact on astronomy

    Many objective measures show the positive impact of Hubble data on astronomy. Over 15,000 papers based on Hubble data have been published in peer-reviewed journals, and countless more have appeared in conference proceedings. Looking at papers several years after their publication, about one-third of all astronomy papers have no citations, while only two percent of papers based on Hubble data have no citations. On average, a paper based on Hubble data receives about twice as many citations as papers based on non-Hubble data. Of the 200 papers published each year that receive the most citations, about 10% are based on Hubble data.

    Although the HST has clearly helped astronomical research, its financial cost has been large. A study on the relative astronomical benefits of different sizes of telescopes found that while papers based on HST data generate 15 times as many citations as a 4 m (13 ft) ground-based telescope such as the William Herschel Telescope, the HST costs about 100 times as much to build and maintain.
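    The comparison above reduces to simple arithmetic: per unit of money spent on construction and maintenance, Hubble yields roughly 15/100 of the citations of a comparable 4 m ground-based telescope. A minimal sketch in Python, using only the relative figures quoted above (illustrative ratios, not the cited study’s raw data):

```python
def citations_per_cost(relative_citations: float, relative_cost: float) -> float:
    """Citations generated per unit of cost, relative to a baseline telescope."""
    return relative_citations / relative_cost

# Baseline: a 4 m ground-based telescope such as the William Herschel Telescope.
ground = citations_per_cost(1.0, 1.0)

# HST: ~15x the citations at ~100x the cost to build and maintain.
hubble = citations_per_cost(15.0, 100.0)

print(ground, hubble)  # 1.0 0.15
```

    By this crude measure a ground-based telescope is several times more cost-effective per citation, though citations are only one axis along which the two kinds of facility differ.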

    Isaac Newton Group 4.2 meter William Herschel Telescope at Roque de los Muchachos Observatory | Instituto de Astrofísica de Canarias • IAC(ES) on La Palma in the Canary Islands(ES), 2,396 m (7,861 ft)

    Deciding between building ground- versus space-based telescopes is complex. Even before Hubble was launched, specialized ground-based techniques such as aperture masking interferometry had obtained higher-resolution optical and infrared images than Hubble would achieve, though restricted to targets about 10^8 times brighter than the faintest targets observed by Hubble. Since then, advances in “adaptive optics” have extended the high-resolution imaging capabilities of ground-based telescopes to the infrared imaging of faint objects.

    Glistening against the awesome backdrop of the night sky above ESO’s Paranal Observatory, four laser beams project out into the darkness from Unit Telescope 4 UT4 of the VLT, a major asset of the Adaptive Optics system.

    UCO Keck Laser Guide Star Adaptive Optics on the two 10-meter Keck Observatory telescopes, Maunakea, Hawaii, USA, altitude 4,207 m (13,802 ft).

    The usefulness of adaptive optics versus HST observations depends strongly on the particular details of the research questions being asked. In the visible bands, adaptive optics can correct only a relatively small field of view, whereas HST can conduct high-resolution optical imaging over a wide field. Only a small fraction of astronomical objects are accessible to high-resolution ground-based imaging; in contrast, Hubble can perform high-resolution observations of any part of the night sky, and on objects that are extremely faint.
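    The resolution comparison in these paragraphs comes down to one formula: a telescope’s diffraction-limited resolution scales as the observing wavelength divided by the mirror diameter (the Rayleigh criterion, θ ≈ 1.22 λ/D). A short sketch in Python (the 550 nm wavelength is an assumed representative example; the 2.4 m Hubble and 10 m Keck mirror diameters are as given above):

```python
import math

ARCSEC_PER_RADIAN = 180.0 / math.pi * 3600.0  # ≈ 206,265

def diffraction_limit_arcsec(wavelength_m: float, aperture_m: float) -> float:
    """Rayleigh-criterion angular resolution, 1.22 * lambda / D, in arcseconds."""
    return 1.22 * wavelength_m / aperture_m * ARCSEC_PER_RADIAN

# Visible light at 550 nm (an assumed wavelength for the comparison).
hst = diffraction_limit_arcsec(550e-9, 2.4)    # Hubble's 2.4 m mirror
keck = diffraction_limit_arcsec(550e-9, 10.0)  # a 10 m Keck mirror

print(f"HST: {hst:.3f} arcsec, Keck: {keck:.3f} arcsec")  # HST: 0.058, Keck: 0.014
```

    In principle a 10 m mirror resolves about four times finer detail than Hubble; adaptive optics is what lets it approach that limit through the atmosphere, but only over a small field and for sufficiently bright targets, as the paragraph above notes.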

    Impact on aerospace engineering

    In addition to its scientific results, Hubble has also made significant contributions to aerospace engineering, in particular the performance of systems in low Earth orbit. These insights result from Hubble’s long lifetime on orbit, extensive instrumentation, and return of assemblies to the Earth where they can be studied in detail. In particular, Hubble has contributed to studies of the behavior of graphite composite structures in vacuum, optical contamination from residual gas and human servicing, radiation damage to electronics and sensors, and the long-term behavior of multi-layer insulation. One lesson learned was that gyroscopes assembled using pressurized oxygen to deliver suspension fluid were prone to failure due to electric wire corrosion. Gyroscopes are now assembled using pressurized nitrogen. Another is that optical surfaces in LEO can have surprisingly long lifetimes; Hubble was only expected to last 15 years before the mirror became unusable, but after 14 years there was no measurable degradation. Finally, Hubble servicing missions, particularly those that serviced components not designed for in-space maintenance, have contributed towards the development of new tools and techniques for on-orbit repair.


    All Hubble data is eventually made available via the Mikulski Archive for Space Telescopes at STScI, CADC and ESA/ESAC. Data is usually proprietary—available only to the principal investigator (PI) and astronomers designated by the PI—for twelve months after being taken. The PI can apply to the director of the STScI to extend or reduce the proprietary period in some circumstances.

    Observations made on Director’s Discretionary Time are exempt from the proprietary period, and are released to the public immediately. Calibration data such as flat fields and dark frames are also publicly available straight away. All data in the archive is in the FITS format, which is suitable for astronomical analysis but not for public use. The Hubble Heritage Project processes and releases to the public a small selection of the most striking images in JPEG and TIFF formats.
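    FITS is, at heart, a simple format: a header made of 80-character ASCII “cards” of the form KEYWORD = value / comment, followed by binary data, which is why it suits analysis pipelines but not casual viewing. A toy card parser in Python illustrates the idea (the example cards are invented for illustration; real work would use a library such as astropy):

```python
def parse_fits_card(card: str) -> tuple:
    """Parse one 80-character FITS header card into (keyword, value).

    A card is an 8-character keyword, then '= ', then the value, optionally
    followed by ' / comment'. This toy parser ignores continuation cards and
    the string-quoting subtleties that a real FITS reader must handle.
    """
    keyword = card[:8].strip()
    rest = card[8:]
    if not rest.startswith("= "):
        return keyword, ""  # e.g. COMMENT, HISTORY and END cards carry no value
    value = rest[2:].split(" / ")[0].strip()
    return keyword, value

# Invented example cards in the standard 80-column layout.
cards = [
    "SIMPLE  =                    T / conforms to FITS standard".ljust(80),
    "BITPIX  =                   16 / bits per data pixel".ljust(80),
    "TELESCOP= 'HST     '           / telescope used".ljust(80),
]
header = dict(parse_fits_card(c) for c in cards)
print(header["BITPIX"])  # 16
```

    Everything in the header is human-readable text, so the format survives decades of software change; it is the raw binary pixel data that needs processing before it becomes the JPEG and TIFF images released to the public.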

    Outreach activities

    It has always been important for the Space Telescope to capture the public’s imagination, given the considerable contribution of taxpayers to its construction and operational costs. After the difficult early years when the faulty mirror severely dented Hubble’s reputation with the public, the first servicing mission allowed its rehabilitation as the corrected optics produced numerous remarkable images.

    Several initiatives have helped to keep the public informed about Hubble activities. In the United States, outreach efforts are coordinated by the Space Telescope Science Institute (STScI) Office for Public Outreach, which was established in 2000 to ensure that U.S. taxpayers saw the benefits of their investment in the space telescope program. To that end, STScI operates the HubbleSite.org website. The Hubble Heritage Project, operating out of the STScI, provides the public with high-quality images of the most interesting and striking objects observed. The Heritage team is composed of amateur and professional astronomers, as well as people with backgrounds outside astronomy, and emphasizes the aesthetic nature of Hubble images. The Heritage Project is granted a small amount of time to observe objects which, for scientific reasons, may not have images taken at enough wavelengths to construct a full-color image.

    Since 1999, the leading Hubble outreach group in Europe has been the Hubble European Space Agency Information Centre (HEIC). This office was established at the Space Telescope European Coordinating Facility in Munich, Germany. HEIC’s mission is to fulfill HST outreach and education tasks for the European Space Agency. The work is centered on the production of news and photo releases that highlight interesting Hubble results and images. These are often European in origin, and so increase awareness of both ESA’s Hubble share (15%) and the contribution of European scientists to the observatory. ESA produces educational material, including a videocast series called Hubblecast designed to share world-class scientific news with the public.

    The Hubble Space Telescope has won two Space Achievement Awards from the Space Foundation, for its outreach activities, in 2001 and 2010.

    A replica of the Hubble Space Telescope is on the courthouse lawn in Marshfield, Missouri, the hometown of namesake Edwin P. Hubble.

    Major Instrumentation

    Hubble Wide Field and Planetary Camera 2 (WFPC2), no longer in service.

    Wide Field Camera 3 [WFC3]

    NASA/ESA Hubble Wide Field Camera 3.

    Advanced Camera for Surveys [ACS]

    NASA/ESA Hubble Space Telescope Advanced Camera for Surveys.

    Cosmic Origins Spectrograph [COS]

    NASA Cosmic Origins Spectrograph.

    The Hubble Space Telescope is a project of international cooperation between NASA and the European Space Agency. NASA’s Goddard Space Flight Center manages the telescope. The Space Telescope Science Institute (STScI), a free-standing science center located on the campus of The Johns Hopkins University and operated by the Association of Universities for Research in Astronomy (AURA) for NASA, conducts Hubble science operations.


  • richardmitnick 10:58 am on February 14, 2022 Permalink | Reply
    Tags: "Mysteries of Stephen Hawking's Doodle-Filled Blackboard May Finally Be Deciphered" - "Stephen Hawking exhibition hopes to unravel the mysteries of his blackboard", Among the more unusual objects is a small glass apple – a gift from researchers at Intel and a nod to Hawking’s position as Lucasian professor-a post formerly held by Sir Isaac Newton., Having devoted his life to the conundrums of the cosmos Professor Stephen Hawking has left behind a mystery of his own amid the eclectic contents of his former office., Hawking’s academic work is portrayed through a handful of key works all written before his ascent to stardom in 1988., Live Science (US), Science Museum hopes fellow travellers of the late cosmologist will join visitors to display of his office miscellany., The blackboard dates from 1980 when Hawking joined fellow physicists at a conference on superspace and supergravity at the University of Cambridge (UK)., The Cambridge cosmologist who died in 2018 at the age of 76 treasured a blackboard that became smothered with cartoons; doodles and equations at a conference he arranged in 1980., The display contains only a fraction of the 700-plus items that staff at the Science Museum have sorted through since they arrived last summer., The display does not attempt to exaggerate Hawking’s standing-according to Leon-most physicists would say he was “very good”; a “contender for a Nobel prize”., The doodles; in-jokes and coded messages on a blackboard that legendary physicist Stephen Hawking kept untouched for more than 35 years., The Guardian (UK), The hope for Juan-Andres-Leon-the curator is that attenders of the conference on superspace and supergravity held in Cambridge more than 40 years ago swing by and explain some of the sketches., , What for example does "stupor symmetry" mean?, What is "Exxon gravity" and why was it on the legendary physicist's chalkboard?, What is hiding inside the tin can labeled "Exxon supergravity?", Who is the shaggy-bearded 
Martian drawn large at the blackboard's center?, Why is there a floppy-nosed squid climbing over a brick wall?   

    From The University of Cambridge (UK) via Live Science (US) and The Guardian (UK): “Mysteries of Stephen Hawking’s Doodle-Filled Blackboard May Finally Be Deciphered” – “Stephen Hawking exhibition hopes to unravel the mysteries of his blackboard”



    Live Science (US)



    The Guardian (UK)

    13 FEBRUARY 2022

    What is “Exxon gravity” and why was it on the legendary physicist’s chalkboard?

    Hawking’s blackboard. Credit: Isidora Bojovic/Science Museum Group.

    A new museum exhibit hopes to uncover the secrets behind the doodles, in-jokes and coded messages on a blackboard that legendary physicist Stephen Hawking kept untouched for more than 35 years.

    The blackboard dates from 1980 when Hawking joined fellow physicists at a conference on superspace and supergravity at the University of Cambridge (UK), according to The Guardian.

    9 Feb 2022
    Ian Sample – Science editor

    Stephen Hawking exhibition hopes to unravel the mysteries of his blackboard

    Science Museum hopes fellow travellers of the late cosmologist will join visitors to display of his office miscellany.

    Having devoted his life to the conundrums of the cosmos Professor Stephen Hawking has left behind a mystery of his own amid the eclectic contents of his former office.

    The Cambridge cosmologist, who died in 2018 at the age of 76, treasured a blackboard that became smothered with cartoons, doodles and equations at a conference he arranged in 1980. But what all the graffiti and in-jokes mean is taking some time to unravel.

    The blackboard – as much a perplexing work of art as a memento from the history of physics – goes on display for the first time on Thursday as part of a collection of office items acquired by the Science Museum in London.

    Hawking in his office at the University of Cambridge. Photograph: Sarah Lee.

    The hope for Juan-Andres Leon, the curator of Stephen Hawking’s office, is that surviving attenders of the conference on superspace and supergravity held in Cambridge more than 40 years ago swing by and explain what some of the sketches and comments mean. “We’ll certainly try and extract their interpretations,” Leon said.

    Joining the blackboard in a temporary display called Stephen Hawking at Work is a rare copy of the physicist’s 1966 PhD thesis; his wheelchair; a formal bet that information swallowed by a black hole is lost for ever; and a host of celebrity memorabilia, including a personalised jacket given to him by the creators of the Simpsons for his many appearances on the show.

    “People don’t have much of a glimpse into what everyday life was like for Stephen Hawking, and because he was a theoretical physicist, it’s hard to convey what he might do on a random Monday,” said Leon.

    “The office provides a lot of material and I think people knew that this was more than the collection of its parts, that it really reflects what made Stephen Hawking unique,” he added.

    The display contains only a fraction of the 700-plus items that staff at the Science Museum have sorted through since they arrived last summer. In time, all will be photographed and described online for all to see.

    Hawking’s academic work is portrayed through a handful of key works all written before his ascent to stardom in 1988. For his PhD thesis, Hawking took the process that forms a black hole – the cataclysmic collapse of a star – and ran it backwards, showing that the universe must explode into being from a single point in space and time.

    “Hawking used black holes as instruments, using them to understand even bigger things,” said Leon. In further work, Hawking showed that black holes are not so black after all, shedding radiation into space until they evaporate into nothing.

    Among the more unusual objects is a small glass apple – a gift from researchers at Intel and a nod to Hawking’s position as Lucasian professor, a post formerly held by Sir Isaac Newton. The ornament has been painted to evoke the cosmic microwave background, the afterglow of the big bang.

    Hawking described the tiny fluctuations in the cosmic microwave background – irregularities in the early universe that later formed the stars and galaxies – as the “fingerprints of creation”. His gift for a grand metaphor delighted publishers and led more than a few readers to see proof of divine intervention.

    The display does not attempt to exaggerate Hawking’s standing: according to Leon, most physicists would say he was “very good”, a “contender for a Nobel prize” but not “the most revolutionary genius since Einstein”. What it does attempt to do is portray a sense of Hawking as a physicist, a wit and a celebrity.

    “We didn’t want the display to be solemn, all heavy with trombones and swirling galaxies and things like that. We wanted it to be playful,” said Leon. “He didn’t take things too seriously and I don’t think he’d have been such a celebrity if he didn’t have that spark of fun about him.”

    The Stephen Hawking at Work display is free and will run until 12 June at the Science Museum in London before moving to the Science and Industry Museum in Manchester, with further stops at the National Science and Media Museum in Bradford, the National Railway Museum in York and Locomotion in Shildon, County Durham.


    While attempting to come up with a cosmological “theory of everything” — a set of equations that would combine the rules of general relativity and quantum mechanics — Hawking’s colleagues used the blackboard as a welcome distraction, filling it with a mishmash of half-finished equations, perplexing puns and inscrutable doodles.

    Still preserved more than 40 years later, the befuddling blackboard has just gone on public display for the first time ever as the centerpiece of a new exhibition on Hawking’s office, which opened Feb. 10 at the Science Museum in London. The museum will welcome physicists and friends of Hawking — who died in 2018 at the age of 76 — from around the world in hopes that they may be able to decipher some of the hand-scrawled doodles.

    What, for example, does “stupor symmetry” mean? Who is the shaggy-bearded Martian drawn large at the blackboard’s center? Why is there a floppy-nosed squid climbing over a brick wall? What is hiding inside the tin can labeled “Exxon supergravity”? Hopefully, the world’s great minds of math and physics can rise to the occasion with answers.

    The blackboard joins dozens of other Hawking artifacts on display, including a copy of the physicist’s 1966 Ph.D. thesis on the expansion of the universe, his wheelchair and a personalized jacket given to him by the creators of “The Simpsons” to honor his multiple appearances on the show. The exhibit will run until June 12 at the Science Museum in London, before hitting the road with stops at several other museums in the U.K., according to The Guardian.

    Hawking was born in England on Jan. 8, 1942. While studying cosmology at the University of Cambridge in 1963, he was diagnosed with motor neuron disease, more commonly known as Lou Gehrig’s disease or amyotrophic lateral sclerosis (ALS). Then just 21, Hawking was expected to live just two more years. He continued to live and work for more than five decades, publishing pioneering work on black holes, the Big Bang theory and general relativity.

    See the full Science Alert article here.

    See the full Guardian article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition


    The University of Cambridge (UK) [legally The Chancellor, Masters, and Scholars of the University of Cambridge] is a collegiate public research university in Cambridge, England. Founded in 1209, Cambridge is the second-oldest university in the English-speaking world and the world’s fourth-oldest surviving university. It grew out of an association of scholars who left the University of Oxford (UK) after a dispute with townsfolk. The two ancient universities share many common features and are often jointly referred to as “Oxbridge”.

    Cambridge is formed from a variety of institutions which include 31 semi-autonomous constituent colleges and over 150 academic departments, faculties and other institutions organised into six schools. All the colleges are self-governing institutions within the university, each controlling its own membership and with its own internal structure and activities. All students are members of a college. Cambridge does not have a main campus and its colleges and central facilities are scattered throughout the city. Undergraduate teaching at Cambridge is organised around weekly small-group supervisions in the colleges – a feature unique to the Oxbridge system. These are complemented by classes, lectures, seminars, laboratory work and occasionally further supervisions provided by the central university faculties and departments. Postgraduate teaching is provided predominantly centrally.

    Cambridge University Press, a department of the university, is the oldest university press in the world and currently the second-largest university press in the world. Cambridge Assessment, also a department of the university, is one of the world’s leading examining bodies and provides assessment to over eight million learners globally every year. The university also operates eight cultural and scientific museums, including the Fitzwilliam Museum, as well as a botanic garden. Cambridge’s libraries – of which there are 116 – hold a total of around 16 million books, around nine million of which are in Cambridge University Library, a legal deposit library. The university is home to – but independent of – the Cambridge Union, the world’s oldest debating society. The university is closely linked to the development of the high-tech business cluster known as “Silicon Fen”. It is the central member of Cambridge University Health Partners, an academic health science centre based around the Cambridge Biomedical Campus.

    By both endowment size and consolidated assets Cambridge is the wealthiest university in the United Kingdom. In the fiscal year ending 31 July 2019, the central university – excluding colleges – had a total income of £2.192 billion of which £592.4 million was from research grants and contracts. At the end of the same financial year the central university and colleges together possessed a combined endowment of over £7.1 billion and overall consolidated net assets (excluding “immaterial” historical assets) of over £12.5 billion. It is a member of numerous associations and forms part of the ‘golden triangle’ of English universities.

    Cambridge has educated many notable alumni, including eminent mathematicians, scientists, politicians, lawyers, philosophers, writers, actors, monarchs and other heads of state. As of October 2020, 121 Nobel laureates, 11 Fields Medalists, 7 Turing Award winners, and 14 British prime ministers have been affiliated with Cambridge as students, alumni, faculty or research staff. University alumni have won 194 Olympic medals.


    By the late 12th century the Cambridge area already had a scholarly and ecclesiastical reputation, due to monks from the nearby bishopric church of Ely. However, it was an incident at Oxford which most likely led to the establishment of the university: three Oxford scholars were hanged by the town authorities for the death of a woman, without consulting the ecclesiastical authorities, who would normally take precedence (and pardon the scholars) in such a case but were at that time in conflict with King John. Fearing more violence from the townsfolk, scholars from the University of Oxford started to move away to cities such as Paris, Reading, and Cambridge. Subsequently, enough scholars remained in Cambridge to form the nucleus of a new university when it had become safe enough for academia to resume at Oxford. In order to claim precedence, it is common for Cambridge to trace its founding to the 1231 charter from Henry III granting it the right to discipline its own members (ius non-trahi extra) and an exemption from some taxes; Oxford was not granted similar rights until 1248.

    A bull in 1233 from Pope Gregory IX gave graduates from Cambridge the right to teach “everywhere in Christendom”. After Cambridge was described as a studium generale in a letter from Pope Nicholas IV in 1290 and confirmed as such in a bull by Pope John XXII in 1318 it became common for researchers from other European medieval universities to visit Cambridge to study or to give lecture courses.

    Foundation of the colleges

    The colleges at the University of Cambridge were originally an incidental feature of the system. No college is as old as the university itself. The colleges were endowed fellowships of scholars. There were also institutions without endowments called hostels. The hostels were gradually absorbed by the colleges over the centuries; but they have left some traces, such as the name of Garret Hostel Lane.

    Hugh Balsham, Bishop of Ely, founded Peterhouse, Cambridge’s first college, in 1284. Many colleges were founded during the 14th and 15th centuries, but colleges continued to be established until modern times. There was a gap of 204 years between the founding of Sidney Sussex in 1596 and that of Downing in 1800. The most recently established college is Robinson, built in the late 1970s. However, Homerton College only achieved full university college status in March 2010, making it the newest full college (it was previously an “Approved Society” affiliated with the university).

    In medieval times many colleges were founded so that their members would pray for the souls of the founders and were often associated with chapels or abbeys. The colleges’ focus changed in 1536 with the Dissolution of the Monasteries. Henry VIII ordered the university to disband its Faculty of Canon Law and to stop teaching “scholastic philosophy”. In response, colleges changed their curricula away from canon law and towards the classics; the Bible; and mathematics.

    Nearly a century later the university was at the centre of a Protestant schism. Many nobles, intellectuals and even commoners saw the ways of the Church of England as too similar to the Catholic Church and felt that it was used by the Crown to usurp the rightful powers of the counties. East Anglia was the centre of what became the Puritan movement. In Cambridge the movement was particularly strong at Emmanuel, St Catharine’s Hall, Sidney Sussex and Christ’s College. They produced many “non-conformist” graduates who, greatly influenced by social position or preaching, left for New England and especially the Massachusetts Bay Colony during the Great Migration decade of the 1630s. Oliver Cromwell, Parliamentary commander during the English Civil War and head of the English Commonwealth (1649–1660), attended Sidney Sussex.

    Modern period

    After the Cambridge University Act formalised the organisational structure of the university, the study of many new subjects was introduced, e.g. theology, history and modern languages. Resources necessary for new courses in the arts, architecture and archaeology were donated by Viscount Fitzwilliam of Trinity College, who also founded the Fitzwilliam Museum. In 1847, Prince Albert was elected Chancellor of the University of Cambridge after a close contest with the Earl of Powis. Albert used his position as Chancellor to campaign successfully for reformed and more modern university curricula, expanding the subjects taught beyond the traditional mathematics and classics to include modern history and the natural sciences. Between 1896 and 1902, Downing College sold part of its land to build the Downing Site, with new scientific laboratories for anatomy, genetics, and Earth sciences. During the same period the New Museums Site was erected, including the Cavendish Laboratory (which has since moved to the West Cambridge Site) and other departments for chemistry and medicine.

    The University of Cambridge began to award PhD degrees in the first third of the 20th century. The first Cambridge PhD in mathematics was awarded in 1924.

    In the First World War 13,878 members of the university served and 2,470 were killed. Teaching and the fees it earned came almost to a stop and severe financial difficulties followed. As a consequence the university first received systematic state support in 1919 and a Royal Commission appointed in 1920 recommended that the university (but not the colleges) should receive an annual grant. Following the Second World War the university saw a rapid expansion of student numbers and available places; this was partly due to the success and popularity gained by many Cambridge scientists.

  • richardmitnick 11:10 am on February 6, 2022 Permalink | Reply
    Tags: "What's the oldest mountain range in the world? (How about the youngest?)", , , , , Live Science (US),   

    From Live Science (US): “What’s the oldest mountain range in the world? (How about the youngest?)” 

    From Live Science (US)

    Brittney J. Miller

    Not all mountain ranges are ancient, geologically speaking.

    Sunrise over Great Smoky Mountains National Park in Gatlinburg, Tennessee, which is part of the Appalachian Mountains. Image credit: WerksMedia via Getty Images.

    Mountains may look ancient — but some are mere toddlers, while others are great-grandaddies, geologically speaking. So, what is the oldest mountain range? And what about the youngest?

    In general, tall mountain ranges, such as the Himalayas, tend to be young, whereas ranges with shorter peaks from millennia of erosion, like the Appalachians, are often older, according to The American Museum of Natural History (US) in New York City. But due to Earth’s ever-changing topography, this superlative is hard to assign — and it demands an understanding of how these peaks rise and fall over time.

    Today’s landscapes feature actively growing and dormant mountain ranges subjected to billions of years of transformations. That’s why pinpointing an age for these peaks gets tricky, said Jim Van Orman, a geochemist at Case Western Reserve University (US).

    Most mountain ranges form due to tectonic plates, the giant, puzzle-like slabs that glide over Earth’s mantle.

    The Tectonic Plates of the world were mapped in 1996, Geological Survey (US).

    As different tectonic plates interact over millions of years, entire mountain ranges can surge skyward.

    There are two main types of tectonic boundaries. At convergent boundaries, tectonic plates collide. The impact often causes the denser plate to subduct, or slide under the other plate and into the underlying mantle. That sinking crust can lift the land above and result in massive mountain ranges, like the Himalayas that house Mount Everest, Van Orman said. Divergent boundaries, on the other hand, occur where tectonic plates separate. As the plates pull away from each other, the crust stretches thin like taffy. Hot magma rises to fill the created gaps, forging mountains and valleys like those in the “Basin and Range Province” in the western U.S. and northwestern Mexico.

    There’s a lot of nuance when it comes to dating mountain ranges. Take the Appalachian Mountains, for example.

    The range began rising from a convergent boundary around 470 million years ago and grew even taller starting about 270 million years ago, when the continents that eventually became North America and Africa collided, according to The Geological Survey (US). Throughout the following millions of years, erosion wore away its original altitude. The mountains we know today are thanks to a later uplift that rejuvenated their elevations. This rise and fall of heights — a trademark characteristic of mountains — makes it difficult and subjective to label a range’s actual age.

    The Appalachians have “a complicated history,” Van Orman told Live Science. “There’s the age of the original rocks, but it wasn’t a mountain range when it was planed off [or eroded] for a large part of its history. So, how old is it, really?”

    While tracing a range’s timeline is tricky, geologists do have tools to measure the age of mountains’ compositions, depending on the type of rock. As igneous and metamorphic rocks form, they generate minerals containing radioactive isotopes, or variations of elements that have differing numbers of neutrons in their nuclei, which can be dated. For sedimentary rocks, researchers use clues trapped in the rock layers, such as fossils or volcanic ash, to gauge the rocks’ life spans. Eroded mountainous sediments that end up in nearby basins can also be traced back to their peak of origin and dated appropriately, Van Orman said.
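    The isotope clock Van Orman describes can be written in one line: if a mineral crystallizes with no daughter isotope, its age is t = ln(1 + D/P)/λ, where D/P is the measured daughter-to-parent ratio and λ is ln 2 divided by the half-life. A minimal sketch (this assumes a single decay chain and no initial daughter; real systems such as potassium–argon need corrections for branching and inherited daughter atoms):

```python
import math

def radiometric_age(daughter_to_parent: float, half_life_years: float) -> float:
    """Age in years from the daughter/parent ratio: t = ln(1 + D/P) / lambda."""
    decay_constant = math.log(2) / half_life_years
    return math.log(1 + daughter_to_parent) / decay_constant

K40_HALF_LIFE = 1.25e9  # approximate half-life of potassium-40, in years

# A mineral in which daughter equals parent (D/P = 1) is exactly one half-life old.
age1 = radiometric_age(1.0, K40_HALF_LIFE)  # ≈ 1.25e9 years
age2 = radiometric_age(3.0, K40_HALF_LIFE)  # D/P = 3 means two half-lives, ≈ 2.5e9

print(f"{age1:.3e}, {age2:.3e}")  # 1.250e+09, 2.500e+09
```

    Note the clock is logarithmic: tripling the daughter-to-parent ratio adds only one more half-life, which is why such systems stay usable across billions of years.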

    From these measurements, geologists can attribute a spectrum of relative ages for some of Earth’s mountainous topography. On the older side, the Makhonjwa Mountains in southern Africa, which are just 2,000 to 5,900 feet (600 to 1,800 meters) tall, contain 3.6 billion-year-old rocks, according to NASA’s Earth Observatory (US). Other ancient slabs that make up the cores of continents, called “cratons,” may have once been part of mountain ranges and can be found in Greenland, Canada, Australia and beyond.

    Other mountain ranges date to more recent geologic history; for example, those in the Basin and Range Province, such as Snake Range, began to appear around 30 million years ago. Individual volcanic mountains have sprouted within the past million years — some even within the past century, like the volcano Parícutin, which unexpectedly arose from a cornfield during an eruption in 1943, according to The Smithsonian National Museum of Natural History (US).

    Geologists are still researching when and how Earth’s various mountain ranges formed. Exploring these elusive timelines could impart insights about past global climate and biodiversity, as these enormous peaks influence air circulation and genetic exchange.

    “It helps reconstruct the whole history of Earth,” Van Orman said. “Going back deep in time, the only real evidence we have for [plate movement] is looking at these old mountain belts.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

  • richardmitnick 4:21 pm on January 24, 2022 Permalink | Reply
    Tags: "The Higgs boson could have kept our universe from collapsing", , , , , , , Live Science (US), , ,   

    From Live Science: “The Higgs boson could have kept our universe from collapsing” 

    From Live Science

    Paul Sutter

    Other patches in the multiverse would have, instead, met their ends.

    Physicists have proposed our universe might be a tiny patch of a much larger cosmos that is constantly and rapidly inflating and popping off new universes. In our corner of this multiverse, the mass of the Higgs boson was low enough that this patch did not collapse like others may have. Image credit: MARK GARLICK/SCIENCE PHOTO LIBRARY via Getty Images.

    The Higgs boson, the mysterious particle that lends other particles their mass, could have kept our universe from collapsing. And its properties might be a clue that we live in a multiverse of parallel worlds, a wild new theory suggests.

    That theory, in which different regions of the universe have different sets of physical laws, suggests that only worlds in which the Higgs boson’s mass is tiny would survive.

    If true, the new model would entail the creation of new particles, which in turn would explain why the strong interaction — which ultimately keeps atoms from collapsing — seems to obey certain symmetries. And along the way, it could help reveal the nature of Dark Matter, the elusive substance that makes up most of the matter in the universe.

    A tale of two Higgs

    In 2012, the Large Hadron Collider achieved a truly monumental feat; this underground particle accelerator along the French-Swiss border detected for the first time the Higgs boson, a particle that had eluded physicists for decades.

    The European Organization for Nuclear Research (CERN) (CH).

    CERN map.

    CERN LHC tube in the tunnel. Credit: Maximilien Brice and Julien Marius Ordan.

    SixTrack CERN LHC particles.

    The Higgs boson is a cornerstone of the Standard Model.

    CERN ATLAS Higgs event, June 18, 2012.

    CERN CMS Higgs event, May 27, 2012.

    This particle gives other particles their mass and creates the distinction between the weak interaction and the electromagnetic interaction.

    But with the good news came some bad. The Higgs had a mass of 125 gigaelectronvolts (GeV), which was orders of magnitude smaller than what physicists had thought it should be.

    To be perfectly clear, the framework physicists use to describe the zoo of subatomic particles, known as the Standard Model, doesn’t actually predict the value of the Higgs mass.

    Standard Model of Particle Physics, Quantum Diaries.

    For that theory to work, the number has to be measured experimentally. But back-of-the-envelope calculations led physicists to expect that the Higgs would have an incredibly large mass. So once the champagne was opened and the Nobel Prizes were handed out, the question loomed: Why does the Higgs have such a low mass?

    In another, initially unrelated problem, the strong interaction isn’t behaving exactly as the Standard Model predicts it should. In the mathematics that physicists use to describe high-energy interactions, there are certain symmetries. For example, there is the symmetry of charge (change all the electric charges in an interaction and everything operates the same), the symmetry of time (run a reaction backward and it’s the same), and the symmetry of parity (flip an interaction around to its mirror image and it’s the same).

    In all experiments performed to date, the strong interaction appears to obey the combined symmetry of both charge reversal and parity reversal. But the mathematics of the strong interaction do not show that same symmetry. No known natural phenomena should enforce that symmetry, and yet nature seems to be obeying it.

    What gives?

    A matter of multiverses

    A pair of theorists, Raffaele Tito D’Agnolo of the French Alternative Energies and Atomic Energy Commission (CEA) and Daniele Teresi of CERN, thought that these two problems might be related. In a paper published in January in the journal Physical Review Letters, they outlined their solution to the twin conundrums.

    Their solution: The universe was just born that way.

    They invoked an idea called the multiverse, which grows out of the theory of cosmic inflation. Inflation is the idea that in the earliest moments of the Big Bang, our cosmos underwent a period of extremely rapid expansion, doubling in size every billionth of a second.
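    The quoted doubling rate implies staggering exponential growth; a toy arithmetic check of that statement:

```python
# Toy arithmetic for the doubling rate quoted above: if space doubles every
# billionth of a second (1 nanosecond), the expansion factor after t
# nanoseconds is 2**t.  Illustrative numbers only.
def expansion_factor(elapsed_nanoseconds):
    return 2.0 ** elapsed_nanoseconds

# After a mere 100 nanoseconds, space has doubled 100 times:
print(f"{expansion_factor(100):.3e}")  # 2**100, about 1.268e30
```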


    Alan Guth, from M.I.T., who first proposed cosmic inflation.

    Lambda Cold Dark Matter accelerated expansion of the universe. Credit: Alex Mittelmann.

    Alan Guth’s original notes on inflation.

    Physicists aren’t exactly sure what powered inflation or how it worked, but one outgrowth of the basic idea is that our universe has never stopped inflating. Instead, what we call “our universe” is just one tiny patch of a much larger cosmos that is constantly and rapidly inflating and constantly popping off new universes, like foamy suds in your bathtub.

    Different regions of this “multiverse” will have different values of the Higgs mass. The researchers found that universes with a large Higgs mass find themselves catastrophically collapsing before they get a chance to grow. Only the regions of the multiverse that have low Higgs masses survive and have stable expansion rates, leading to the development of galaxies, stars, planets and eventually high-energy particle colliders.
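    The selection effect the researchers describe can be mimicked with a toy Monte Carlo (my own illustration with arbitrary, hypothetical scales; not the authors’ calculation): sample candidate Higgs masses up to a "natural" scale, then keep only the patches light enough to survive.

```python
import random

# Toy selection-effect model: candidate Higgs masses drawn uniformly up to
# a hypothetical natural scale; only patches below a hypothetical collapse
# threshold survive long enough to host observers.
NATURAL_SCALE = 1e5     # hypothetical natural mass scale (arbitrary units)
SURVIVAL_CUTOFF = 1e3   # hypothetical collapse threshold (arbitrary units)

random.seed(0)
patches = [random.uniform(0, NATURAL_SCALE) for _ in range(100_000)]
survivors = [m for m in patches if m < SURVIVAL_CUTOFF]

# Only about 1% of patches survive, and every surviving patch has a Higgs
# mass that looks absurdly small compared with the natural scale: exactly
# the kind of observation a resident of such a patch would puzzle over.
print(f"{len(survivors)} of {len(patches)} patches survive")
```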

    To make a multiverse with varying Higgs masses, the team had to introduce two more particles into the mix. These particles would be new additions to the Standard Model. The interactions of these two new particles set the mass of the Higgs in different regions of the multiverse.

    And those two new particles are also capable of doing other things.

    Time for a test

    The newly proposed particles modify the strong interaction, leading to the charge-parity symmetry that exists in nature. They would act a lot like an axion, another hypothetical particle that has been introduced in an attempt to explain the nature of the strong interaction.

    The new particles don’t have a role limited to the early universe, either. They might still be inhabiting the present-day cosmos. If one of their masses is small enough, it could have evaded detection in our accelerator experiments, but would still be floating around in space.

    In other words, one of these new particles could be responsible for the Dark Matter, the invisible stuff that makes up over 85% of all the matter in the universe.

    It’s a bold suggestion: solving two of the greatest challenges to particle physics and also explaining the nature of Dark Matter.

    Could a solution really be this simple? As elegant as it is, the theory still needs to be tested. The model predicts a certain mass range for the Dark Matter, something that future experiments on the hunt for dark matter, like the underground Super Cryogenic Dark Matter Search, could determine. The theory also predicts that the neutron should have a small but potentially measurable asymmetry in its internal distribution of electric charge, a departure from the predictions of the Standard Model.

    Unfortunately, we’re going to have to wait awhile. Each of these measurements will take years, if not decades, to effectively rule out or support the new idea.

    Dark Matter Background
    Fritz Zwicky discovered Dark Matter in the 1930s while observing the movement of the Coma Cluster. Vera Rubin, a woman in STEM who was denied the Nobel Prize, did most of the work on Dark Matter some 30 years later.

    Fritz Zwicky.
    Coma cluster via NASA/ESA Hubble, the original example of Dark Matter discovered during observations by Fritz Zwicky and confirmed 30 years later by Vera Rubin.
    In modern times, it was astronomer Fritz Zwicky, in the 1930s, who made the first observations of what we now call dark matter. His 1933 observations of the Coma Cluster of galaxies seemed to indicate it had a mass 500 times greater than that previously calculated by Edwin Hubble. Furthermore, this extra mass seemed to be completely invisible. Although Zwicky’s observations were initially met with much skepticism, they were later confirmed by other groups of astronomers.

    Thirty years later, astronomer Vera Rubin provided a huge piece of evidence for the existence of dark matter. She discovered that the outer regions of galaxies rotate just as fast as their inner regions, whereas, if the visible matter were all there is, the outskirts should rotate more slowly. Think of the solar system: planets farther from the sun orbit more slowly, just as Kepler’s laws dictate, because gravity weakens with distance from the central mass. Galaxies should behave the same way, but they do not. The only way to explain this is if each visible galaxy sits inside some much larger, invisible structure whose mass keeps the rotation speed roughly constant from center to edge.
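    The expected slowdown can be sketched with Newtonian gravity alone. Assuming a rough, illustrative visible mass for a large spiral galaxy (a hypothetical figure, not a measurement), the Keplerian orbital speed v = sqrt(GM/r) falls with radius, which is exactly what observed rotation curves fail to show:

```python
import math

G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
M_VISIBLE = 2e41  # kg, rough visible mass of a large spiral galaxy (illustrative)

def keplerian_speed(radius_m):
    """Orbital speed if the visible mass alone, treated as central, sets gravity."""
    return math.sqrt(G * M_VISIBLE / radius_m)

KPC = 3.086e19  # meters per kiloparsec
for r_kpc in (5, 10, 20, 40):
    v = keplerian_speed(r_kpc * KPC) / 1000  # km/s
    print(f"{r_kpc:>3} kpc: {v:6.0f} km/s")
# The predicted speed falls as 1/sqrt(r); observed curves instead stay
# roughly flat out to large radii, implying unseen mass.
```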

    Vera Rubin, following Zwicky, postulated that the missing structure in galaxies is dark matter. Her ideas were met with much resistance from the astronomical community, but her observations have been confirmed and are seen today as pivotal proof of the existence of dark matter.
    Astronomer Vera Rubin at the Lowell Observatory in 1965. Credit: The Carnegie Institution for Science.

    Vera Rubin, with Department of Terrestrial Magnetism (DTM) image tube spectrograph attached to the Kitt Peak 84-inch telescope, 1970.

    Vera Rubin measuring spectra. Credit: Emilio Segrè Visual Archives, AIP/SPL.
    Dark Matter Research

    Super Cryogenic Dark Matter Search from DOE’s SLAC National Accelerator Laboratory (US) at Stanford University (US) at SNOLAB (Vale Inco Mine, Sudbury, Canada).

    LBNL LZ Dark Matter Experiment (US) xenon detector at Sanford Underground Research Facility(US) Credit: Matt Kapust.

    DAMA at Gran Sasso uses sodium iodide housed in copper to hunt for dark matter LNGS-INFN.

    Yale HAYSTAC axion dark matter experiment at Yale’s Wright Lab.

    DEAP Dark Matter detector, The DEAP-3600, suspended in the SNOLAB (CA) deep in Sudbury’s Creighton Mine.

    The LBNL LZ Dark Matter Experiment (US) Dark Matter project at SURF, Lead, SD, USA.

    DAMA-LIBRA Dark Matter experiment at the Italian National Institute for Nuclear Physics’ (INFN’s) Gran Sasso National Laboratories (LNGS) located in the Abruzzo region of central Italy.

    DARWIN Dark Matter experiment. A design study for a next-generation, multi-ton dark matter detector in Europe at The University of Zurich [Universität Zürich](CH).

    PandaX II Dark Matter experiment at Jin-ping Underground Laboratory (CJPL) in Sichuan, China.

    Inside the Axion Dark Matter eXperiment (ADMX) at the University of Washington (US). Credit: Mark Stone, University of Washington.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

  • richardmitnick 5:37 pm on January 5, 2022 Permalink | Reply
    Tags: "Physicists crack unsolvable three-body problem using drunkard's walk", , Live Science (US), , Predicting the motion of two massive objects like a pair of stars is a piece of cake. But when a third object enters the picture the problem becomes unsolvable.,   

    From The Technion-Israel Institute of Technology [הטכניון – מכון טכנולוגי לישראל] (IL) via Live Science: “Physicists crack unsolvable three-body problem using drunkard’s walk” 

    From The Technion-Israel Institute of Technology [הטכניון – מכון טכנולוגי לישראל] (IL)


    Live Science

    Ashley Hamer

    The unsolvable three-body problem: It has plagued scientists since the days of Isaac Newton.

    Image credit: Adrienne Bresnahan/Getty Images.

    A physics problem that has plagued science since the days of Isaac Newton is closer to being solved, say a pair of Israeli researchers. The duo used “the drunkard’s walk” to calculate the outcome of a cosmic dance between three massive objects, or the so-called three-body problem.

    For physicists, predicting the motion of two massive objects, like a pair of stars, is a piece of cake. But when a third object enters the picture, the problem becomes unsolvable. That’s because when two massive objects get close to each other, their gravitational attraction influences the paths they take in a way that can be described by a simple mathematical formula.

    But adding a third object isn’t so simple: Suddenly, the interactions between the three objects become chaotic. Instead of following a predictable path defined by a mathematical formula, the behavior of the three objects becomes sensitive to what scientists call “initial conditions” — that is, whatever speed and position they were in previously.

    Any slight difference in those initial conditions changes their future behavior drastically, and because there’s always some uncertainty in what we know about those conditions, their behavior is impossible to calculate far out into the future.

    In one scenario, two of the objects might orbit each other closely while the third is flung into a wide orbit; in another, the third object might be ejected from the other two, never to return, and so on.
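    A minimal planar simulation makes that sensitivity concrete. The sketch below (my own illustration, not the study’s code) uses unit masses, G = 1, and a small softening length as a numerical convenience near close encounters; it integrates the same three bodies twice, nudging one starting coordinate by one part in a billion, and the two runs visibly part ways:

```python
import math

# Velocity-Verlet integrator for three equal-mass bodies in a plane.
G, SOFTENING, DT = 1.0, 0.05, 5e-4

def accelerations(pos):
    acc = [[0.0, 0.0] for _ in pos]
    for i in range(3):
        for j in range(3):
            if i == j:
                continue
            dx = pos[j][0] - pos[i][0]
            dy = pos[j][1] - pos[i][1]
            r2 = dx * dx + dy * dy + SOFTENING ** 2  # softened distance squared
            inv_r3 = G / (r2 * math.sqrt(r2))
            acc[i][0] += dx * inv_r3
            acc[i][1] += dy * inv_r3
    return acc

def evolve(pos, vel, steps):
    pos = [p[:] for p in pos]
    vel = [v[:] for v in vel]
    acc = accelerations(pos)
    for _ in range(steps):
        for i in range(3):
            pos[i][0] += vel[i][0] * DT + 0.5 * acc[i][0] * DT * DT
            pos[i][1] += vel[i][1] * DT + 0.5 * acc[i][1] * DT * DT
        new_acc = accelerations(pos)
        for i in range(3):
            vel[i][0] += 0.5 * (acc[i][0] + new_acc[i][0]) * DT
            vel[i][1] += 0.5 * (acc[i][1] + new_acc[i][1]) * DT
        acc = new_acc
    return pos

triangle = [[1.0, 0.0], [-0.5, 0.866], [-0.5, -0.866]]  # near-equilateral, at rest
rest = [[0.0, 0.0] for _ in range(3)]
nudged = [p[:] for p in triangle]
nudged[0][0] += 1e-9  # a one-part-per-billion nudge

a = evolve(triangle, rest, 40_000)  # t = 20 time units
b = evolve(nudged, rest, 40_000)
drift = max(math.hypot(pa[0] - pb[0], pa[1] - pb[1]) for pa, pb in zip(a, b))
print(f"final divergence from a 1e-9 nudge: {drift:.3e}")
```

The repeated near-collisions during the collapse of the triangle amplify the nudge by many orders of magnitude, which is the chaos the researchers sidestep by working with probabilities instead of trajectories.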

    In a paper published in the journal Physical Review X, scientists used the frustrating unpredictability of the three-body problem to their advantage.

    “[The three-body problem] depends very, very sensitively on initial conditions, so essentially it means that the outcome is basically random,” said Yonadav Barry Ginat, a doctoral student at Technion-Israel Institute of Technology who co-authored the paper with Hagai Perets, a physicist at the same university. “But that doesn’t mean that we cannot calculate what probability each outcome has.”

    To do that, they relied on the theory of random walks — also known as “the drunkard’s walk.” The idea is that a drunkard walks in random directions, with the same chance of taking a step to the right as taking a step to the left. If you know those chances, you can calculate the probability of the drunkard ending up in any given spot at some later point in time.
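    The bookkeeping behind a random walk is easy to demonstrate. A minimal sketch (illustrative only; the actual study tracks the third body’s velocity rather than steps on a line):

```python
import random

# A minimal "drunkard's walk": each step is +1 or -1 with equal chance.
# Simulating many walks recovers the probability of ending at any spot,
# the same composition of step probabilities the researchers apply.
random.seed(42)

def final_position(steps):
    return sum(random.choice((-1, 1)) for _ in range(steps))

walks = [final_position(100) for _ in range(10_000)]
prob_near_start = sum(abs(w) <= 10 for w in walks) / len(walks)
print(f"P(|endpoint| <= 10 after 100 steps) ~ {prob_near_start:.2f}")
# The endpoint spread of a 100-step walk is sqrt(100) = 10 steps, so
# roughly 70% of walks end within 10 steps of the start.
```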

    Credit: Technion – Israel Institute of Technology.

    So in the new study, Ginat and Perets looked at systems of three bodies, where the third object approaches a pair of objects in orbit. In their solution, each of the drunkard’s “steps” corresponds to the velocity of the third object relative to the other two.

    “One can calculate what the probabilities for each of those possible speeds of the third body is, and then you can compose all those steps and all those probabilities to find the final probability of what’s going to happen to the three-body system in a long time from now,” meaning whether the third object will be flung out for good, or whether it might come back, for instance, Ginat said.

    But the scientists’ solution goes further than that. In most simulations of the three-body problem, the three objects are treated as so-called ideal particles, with no internal properties at play. But stars and planets interact in more complicated ways: Just think about the way the moon’s gravity tugs on the Earth to produce the tides. Those tidal forces steal some energy from the interaction between the two bodies, and that changes the way each body moves.

    Because this solution calculates the probability of each “step” of the three-body interaction, it can account for these additional forces to more precisely calculate the outcome.

    This is a big step forward for the three-body problem, but Ginat says it’s certainly not the end. The researchers now hope to figure out what happens when the three bodies are in special configurations — for example, all three on a flat plane. Another challenge is to see if they can generalize these ideas to four bodies.

    “There are quite a few open questions remaining,” Ginat said.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Technion Campus

    A science and technology research university, among the world’s top ten, The Technion-Israel Institute of Technology [הטכניון – מכון טכנולוגי לישראל](IL) is dedicated to the creation of knowledge and the development of human capital and leadership, for the advancement of the State of Israel and all humanity.

    The Technion-Israel Institute of Technology [הטכניון – מכון טכנולוגי לישראל](IL) is a public research university in Haifa, Israel. Established in 1912 under the dominion of the Ottoman Empire (and more than 35 years before the establishment of State of Israel), the Technion is the oldest university in the country.

    The Technion is ranked as the top university in both Israel and the Middle East, and in the top 100 universities in the world in the Academic Ranking of World Universities of 2019. The university offers degrees in science and engineering, and related fields such as architecture, medicine, industrial management, and education. It has 19 academic departments, 60 research centers, and 12 affiliated teaching hospitals. Since its founding, it has awarded more than 100,000 degrees and its graduates are cited for providing the skills and education behind the creation and protection of the State of Israel.

    Technion’s 565 faculty members currently include three Nobel Laureates in chemistry. Four Nobel Laureates have been associated with the university.

    The Technion has played a major role in the history of modern Israel. The selection of Hebrew as the language of instruction, defeating German in the War of the Languages, was an important milestone in Hebrew’s consolidation as Israel’s official language. The Technion is also a major factor behind the growth of Israel’s high-tech industry and innovation, including the country’s technical cluster in Silicon Wadi.

    The Technikum was conceived in the early 1900s by the German-Jewish fund Ezrah as a school of engineering and sciences. It was to be the only institution of higher learning in the then Ottoman Palestine, other than the Bezalel Academy of Arts and Design [בצלאל, אקדמיה לאמנות ועיצוב‎] (IL) in Jerusalem (founded in 1907). In October 1913, the board of trustees selected German as the language of instruction, provoking a major controversy known as the War of the Languages. After opposition from American and Russian Jews to the use of German, the board of trustees reversed itself in February 1914 and selected Hebrew as the language of instruction. The German name Technikum was also replaced by the Hebrew name Technion.

    Technion’s cornerstone was laid in 1912, and studies began 12 years later in 1924. In 1923 Albert Einstein visited and planted the now-famous first palm tree, as an initiative of Nobel tradition. The first palm tree still stands today in front of the old Technion building, which is now the MadaTech museum, in the Hadar neighborhood. Einstein founded the first Technion Society, and served as its president upon his return to Germany.

    Research highlights

    In 1982, Dan Shechtman discovered quasicrystals: structures with fivefold symmetry, a phenomenon considered impossible until then by the prevailing theories of crystallography. In 2011 he won the Nobel Prize in Chemistry for this discovery.

    In 2004, two Technion professors, Avram Hershko and Aaron Ciechanover, won the Nobel Prize for the discovery of the biological system responsible for disassembling protein in the cell.

    Shulamit Levenberg, 37, was chosen by Scientific American magazine as one of the leading scientists in 2006 for the discovery of a method to transplant skin in a way the body does not reject.

    Moussa B.H. Youdim developed Rasagiline, a drug marketed by Teva Pharmaceuticals as Azilect (TM) for the treatment of neurodegenerative disease, especially Parkinson’s disease.

    In 1998, Technion successfully launched the “Gurwin TechSat II” microsatellite, making Technion one of five universities with a student program that designs, builds, and launches its own satellite. The satellite stayed in orbit until 2010.

    In the 1970s, computer scientists Abraham Lempel and Jacob Ziv developed the Lempel-Ziv algorithms for data compression, the foundation of the widely used Lempel-Ziv-Welch (LZW) variant. For this pioneering work, Ziv received the IEEE Richard W. Hamming Medal in 1995, and Lempel received it in 2007.
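    The core of the Lempel-Ziv idea fits in a few lines. Below is a bare-bones LZ78-style encoder and decoder, a sketch of the dictionary-building principle rather than a production codec:

```python
def lz78_compress(text):
    """Emit (dictionary index, next char) pairs.

    Each pair references the longest previously seen phrase plus one new
    character, so repetitive input compresses well.
    """
    dictionary = {"": 0}
    phrase, output = "", []
    for ch in text:
        if phrase + ch in dictionary:
            phrase += ch
        else:
            output.append((dictionary[phrase], ch))
            dictionary[phrase + ch] = len(dictionary)
            phrase = ""
    if phrase:  # input ended mid-phrase: emit it via its prefix
        output.append((dictionary[phrase[:-1]], phrase[-1]))
    return output

def lz78_decompress(pairs):
    phrases = [""]
    out = []
    for index, ch in pairs:
        phrases.append(phrases[index] + ch)
        out.append(phrases[-1])
    return "".join(out)

data = "abababababab"
packed = lz78_compress(data)
print(len(data), "chars ->", len(packed), "pairs")  # 12 chars -> 6 pairs
assert lz78_decompress(packed) == data
```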

    In 2019, a team of 12 students won a gold medal at iGEM for developing bee-free honey.

  • richardmitnick 12:46 pm on January 2, 2022 Permalink | Reply
    Tags: "Is Earth expanding or shrinking?", Earth is losing about 66100 tons (60000 metric tons) per year., It will take more than 3000 times that long — roughly 15.4 trillion years — before Earth will lose its atmosphere., It would take 5 billion years for Earth to lose its atmosphere if the planet had no way to replenish it., Live Science (US), , The ocean and other processes like volcanic eruptions do help to replenish Earth's atmosphere., While that loss sounds like a lot in the context of the whole planet it's very very very small.   

    From Live Science: “Is Earth expanding or shrinking?” 

    From Live Science

    Donavyn Coffey

    A 3D rendering of Earth. Is Earth growing or shrinking? Image credit: Frank Lee via Getty Images.

    Like any good gift giver, Earth is constantly giving and receiving materials with the surrounding solar system. For instance, dust speeding through space regularly bombards our planet in the form of shooting stars, and gases from Earth’s atmosphere regularly seep out into space.

    So, if Earth is continuously giving away matter, as well as acquiring new material, is it expanding or shrinking?

    Because of Earth’s gaseous gifts to space, our planet — or, to be specific, the atmosphere — is shrinking, according to Guillaume Gronoff, a senior research scientist who studies atmospheric escape at The NASA Langley Research Center (US). However, we’re not shrinking by much, he said.

    Planets are formed by accretion, or when space dust collides and increasingly builds up into a larger mass. After Earth formed about 4.5 billion years ago, a small amount of accretion continued to happen in the form of meteors and meteorites adding to Earth’s mass, Gronoff said.

    But once a planet forms, another process begins: atmospheric escape. It works similarly to evaporation but on a different scale, Gronoff said. In the upper atmosphere, oxygen, hydrogen and helium atoms absorb enough energy from the sun to escape into space.
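    The energy argument can be made concrete with a standard back-of-the-envelope comparison (my own sketch, with an assumed exosphere temperature; not Gronoff’s calculation): light atoms have thermal speeds much closer to Earth’s escape velocity, so they leak away far faster than heavy ones.

```python
import math

G = 6.674e-11          # m^3 kg^-1 s^-2
M_EARTH = 5.972e24     # kg
R_EXOSPHERE = 6.871e6  # m, Earth's radius plus roughly 500 km
K_B = 1.381e-23        # J/K
T_EXO = 1000.0         # K, assumed exosphere temperature

v_escape = math.sqrt(2 * G * M_EARTH / R_EXOSPHERE)

def thermal_speed(mass_kg):
    """Most-probable Maxwell-Boltzmann speed, sqrt(2 k T / m)."""
    return math.sqrt(2 * K_B * T_EXO / mass_kg)

m_h, m_o = 1.67e-27, 2.66e-26  # hydrogen and oxygen atomic masses, kg
print(f"escape: {v_escape / 1e3:.1f} km/s, "
      f"H: {thermal_speed(m_h) / 1e3:.1f} km/s, "
      f"O: {thermal_speed(m_o) / 1e3:.1f} km/s")
# Hydrogen's thermal speed is a far larger fraction of escape velocity,
# so the fast tail of its speed distribution leaks to space much sooner.
```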

    So how do these processes affect Earth’s overall mass? Scientists can only estimate.

    “Of course, it’s still research, because it’s difficult to measure the mass of the Earth in real time,” Gronoff told Live Science. “We don’t have the weight of the Earth at the precision needed to see if the Earth is losing or gaining.”

    But by observing the rate of meteors, scientists estimate that about 16,500 tons (15,000 metric tons) of material, about one and a half Eiffel Towers’ worth, impacts the planet every year, adding to its mass, Gronoff said.

    Meanwhile, using satellite data, scientists have estimated the rate of atmospheric escape. “It’s something like 82,700 tons (75,000 metric tons), or 7.5 Eiffel Towers,” Gronoff said. That means Earth is losing about 66,100 tons (60,000 metric tons) per year. “While that sounds like a lot, in the context of the whole planet it’s very, very, very small,” he said.

    Using estimates for atmospheric escape established over the past hundred years, Gronoff calculated that, at a rate of 60,000 tons of atmosphere lost per year, it would take 5 billion years for Earth to lose its atmosphere if the planet had no way to replenish it.

    However, the ocean and other processes, like volcanic eruptions, do help to replenish Earth’s atmosphere. So, it will take more than 3,000 times that long — roughly 15.4 trillion years, more than 1,000 times the current age of the universe — before Earth loses its atmosphere, he said. But long before that happens, Earth will likely be uninhabitable anyway because of the evolution of the sun, which is expected to turn into a red giant in about 5 billion years. “So the escape of the atmosphere is not the problem in the very long run,” Gronoff said.
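    A quick arithmetic check of the figures quoted above, using only the article’s own numbers:

```python
# Checking the article's arithmetic (all figures quoted from Gronoff):
gain = 15_000  # metric tons of meteoric material gained per year
loss = 75_000  # metric tons of atmosphere escaping per year
net_loss = loss - gain
print(net_loss)  # 60000 metric tons net per year, as stated

# With no replenishment the article quotes ~5 billion years; replenishment
# stretches that by a factor of more than 3,000:
print(5e9 * 3_000 / 1e12)  # 15 trillion years, in line with "roughly 15.4 trillion"
```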

    So, while we can all applaud Earth for being a good philanthropist, graciously giving its atmospheric gases to space, we can also rest assured that Earth’s shrinking size is not imperiling life on Earth.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

  • richardmitnick 12:30 pm on January 1, 2022 Permalink | Reply
    Tags: "We may finally be able to test one of Stephen Hawking's most far-out ideas", , , , , Live Science (US), ,   

    From Live Science: “We may finally be able to test one of Stephen Hawking’s most far-out ideas” 

    From Live Science

    Paul Sutter

    The NASA/ESA/CSA James Webb Space Telescope, annotated. Originally scheduled for launch in 2011, it was delayed repeatedly and finally launched December 25, 2021.

    An artist’s impression of Dark Matter in the beginning of the universe. Image credit: Shutterstock.

    We may soon be able to test one of Stephen Hawking’s most controversial theories, new research suggests.

    In the 1970s, Hawking proposed that Dark Matter, the invisible substance that makes up most matter in the cosmos, may be made of black holes formed in the earliest moments of the Big Bang.

    Now, three astronomers have developed a theory that explains not only the existence of Dark Matter, but also the appearance of the largest black holes in the universe.

    “What I find personally super exciting about this idea is how it elegantly unifies the two really challenging problems that I work on — that of probing the nature of dark matter and the formation and growth of black holes — and resolves them in one fell swoop,” study co-author Priyamvada Natarajan, an astrophysicist at Yale University (US), said in a statement. What’s more, several new instruments — including the James Webb Space Telescope that just launched — could produce data needed to finally assess Hawking’s famous notion.

    Black holes from the beginning

    Dark Matter makes up over 80% of all the matter in the universe, but it doesn’t directly interact with light in any way. It just floats around being massive, affecting the gravity within galaxies.

    It’s tempting to think that black holes might be responsible for this elusive stuff. After all, black holes are famously dark, so filling a galaxy with black holes could theoretically explain all the observations of Dark Matter.

    Unfortunately, in the modern universe, black holes form only after massive stars die, then collapse under the weight of their own gravity. So making black holes requires many stars — which requires a bunch of normal matter. Scientists know how much normal matter is in the universe from calculations of the early universe, where the first hydrogen and helium formed. And there simply isn’t enough normal matter to make all the Dark Matter astronomers have observed.

    Sleeping giants

    That’s where Hawking came in. In 1971, he suggested that black holes formed in the chaotic environment of the earliest moments of the Big Bang. There, pockets of matter could spontaneously reach the densities needed to make black holes, flooding the cosmos with them well before the first stars twinkled. Hawking suggested that these “primordial” black holes might be responsible for Dark Matter. While the idea was interesting, most astrophysicists focused instead on finding a new subatomic particle to explain Dark Matter.

    What’s more, models of primordial black hole formation ran into observational issues. If too many formed in the early universe, they changed the picture of the leftover radiation from the early universe, known as the cosmic microwave background [CMB].

    CMB, per the European Space Agency (EU) Planck mission.

    That meant the theory worked only if the number and size of ancient black holes were fairly limited; otherwise, it would conflict with measurements of the CMB.

    The idea was revived in 2015 when the Laser Interferometer Gravitational-Wave Observatory found its first pair of colliding black holes.

    Caltech/MIT Advanced LIGO.

    The two black holes were much larger than expected, and one way to explain their large mass was to say they formed in the early universe, not in the hearts of dying stars.

    A simple solution

    In the latest research, Natarajan, Nico Cappelluti at The University of Miami (FL)(US) and Günther Hasinger at The European Space Agency [Agence spatiale européenne][Europäische Weltraumorganisation](EU) took a deep dive into the theory of primordial black holes, exploring how they might explain the Dark Matter and possibly resolve other cosmological challenges.

    To pass current observational tests, primordial black holes have to be within a certain mass range. In the new work, the researchers assumed that the primordial black holes had a mass of around 1.4 times the mass of the sun. They constructed a model of the universe that replaced all the Dark Matter with these fairly light black holes, and then they looked for observational clues that could validate (or rule out) the model.
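    For a sense of scale, a rough back-of-the-envelope count (my own illustration, with an assumed halo mass; not a figure from the paper) shows what such a population would look like:

```python
# How many 1.4-solar-mass primordial black holes would a Milky Way-size
# dark matter halo contain, and how big is each one?
M_SUN_KG = 1.989e30
G = 6.674e-11   # m^3 kg^-1 s^-2
C = 2.998e8     # m/s

HALO_DM_MSUN = 1e12   # assumed Milky Way-scale dark matter halo, solar masses
PBH_MASS_MSUN = 1.4   # the mass adopted in the study

n_pbh = HALO_DM_MSUN / PBH_MASS_MSUN
r_schwarzschild = 2 * G * PBH_MASS_MSUN * M_SUN_KG / C**2  # event-horizon radius

print(f"~{n_pbh:.1e} black holes per halo, "
      f"each with a ~{r_schwarzschild / 1000:.1f} km radius")
```

Hundreds of billions of kilometer-scale objects spread through a halo hundreds of thousands of light-years across would be extraordinarily hard to spot directly, which is part of why the idea is difficult to rule out.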

    The team found that primordial black holes could play a major role in the universe by seeding the first stars, the first galaxies and the first supermassive black holes (SMBHs). Observations indicate that stars, galaxies and SMBHs appear very quickly in cosmological history, perhaps too quickly to be accounted for by the processes of formation and growth that we observe in the present-day universe.

    “Primordial black holes, if they do exist, could well be the seeds from which all supermassive black holes form, including the one at the center of the Milky Way,” Natarajan said.

    And the theory is simple and doesn’t require a zoo of new particles to explain Dark Matter.

    “Our study shows that without introducing new particles or new physics, we can solve mysteries of modern cosmology from the nature of Dark Matter itself to the origin of supermassive black holes,” Cappelluti said in the statement.

    So far this idea is only a model, but it’s one that could be tested relatively soon. The James Webb Space Telescope, which launched Christmas Day after years of delays, is specifically designed to answer questions about the origins of stars and galaxies. And the next generation of gravitational wave detectors, especially the Laser Interferometer Space Antenna (LISA), is poised to reveal much more about black holes, including primordial ones if they exist.

    Gravity is talking. LISA will listen. Dialogos of Eide.

    Together, the two observatories should give astronomers enough information to piece together the story of the first stars and potentially the origins of Dark Matter.

    “It was irresistible to explore this idea deeply, knowing it had the potential to be validated fairly soon,” Natarajan said.

    Dark Matter Background
    Fritz Zwicky discovered Dark Matter in the 1930s while observing the motion of the Coma Cluster. Some 30 years later Vera Rubin, a woman in STEM who was denied the Nobel Prize, did much of the work establishing Dark Matter.

    Fritz Zwicky.
    Coma cluster via NASA/ESA Hubble, the original example of Dark Matter discovered during observations by Fritz Zwicky and confirmed 30 years later by Vera Rubin.
    In modern times, it was astronomer Fritz Zwicky, in the 1930s, who made the first observations of what we now call dark matter. His 1933 observations of the Coma Cluster of galaxies seemed to indicate that it had a mass some 500 times greater than that previously calculated by Edwin Hubble. Furthermore, this extra mass seemed to be completely invisible. Although Zwicky’s observations were initially met with much skepticism, they were later confirmed by other groups of astronomers.

    Thirty years later, astronomer Vera Rubin provided a huge piece of evidence for the existence of dark matter. She discovered that stars at the outer edges of galaxies orbit just as fast as stars near the center, whereas by Keplerian dynamics the outer stars should move more slowly, just as distant planets orbit the sun more slowly than nearby ones. The only way to explain these flat rotation curves is if each visible galaxy sits inside some much larger unseen structure, as if the visible disk were only the label on a vinyl LP, with the gravity of the whole structure keeping the rotation speed nearly constant from center to edge.
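Rubin’s rotation-curve argument can be sketched numerically. The toy Python model below compares the Keplerian fall-off expected if only a galaxy’s visible, centrally concentrated mass were present against the roughly flat curves she measured; the mass and radii are illustrative assumptions, not fits to any real galaxy.

```python
import math

# Sketch of the rotation-curve puzzle: Keplerian orbital speed around the
# visible mass falls as 1/sqrt(r), but Rubin measured roughly flat curves.
# The values below are illustrative, not fit to any real galaxy.

G = 4.30e-6         # gravitational constant in kpc * (km/s)^2 / M_sun
M_VISIBLE = 1.0e11  # assumed visible mass in solar masses, concentrated centrally

def keplerian_speed(r_kpc):
    """Orbital speed (km/s) at radius r if only the central visible mass acts."""
    return math.sqrt(G * M_VISIBLE / r_kpc)

for r in (2, 5, 10, 20):
    print(f"r = {r:2d} kpc: Keplerian v = {keplerian_speed(r):5.1f} km/s "
          f"(observed curves instead stay roughly flat)")
```

The predicted speed halves between 5 and 20 kiloparsecs; observed curves do not, and the discrepancy is the classic signature of a dark matter halo.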

    Vera Rubin, following Zwicky, postulated that the missing structure in galaxies is dark matter. Her ideas were met with much resistance from the astronomical community, but her observations have been confirmed and are seen today as pivotal proof of the existence of dark matter.
    Astronomer Vera Rubin at the Lowell Observatory in 1965. Credit: The Carnegie Institution for Science.

    Vera Rubin, with Department of Terrestrial Magnetism (DTM) image tube spectrograph attached to the Kitt Peak 84-inch telescope, 1970.

    Vera Rubin measuring spectra. Credit: Emilio Segrè Visual Archives/AIP/SPL.
    Dark Matter Research

    LBNL LZ Dark Matter Experiment (US) xenon detector at Sanford Underground Research Facility(US) Credit: Matt Kapust.

    Lambda Cold Dark Matter: accelerated expansion of the universe. Credit: Alex Mittelmann.

    DAMA at Gran Sasso uses sodium iodide housed in copper to hunt for dark matter LNGS-INFN.

    Yale HAYSTAC axion dark matter experiment at Yale’s Wright Lab.

    DEAP Dark Matter detector, The DEAP-3600, suspended in the SNOLAB (CA) deep in Sudbury’s Creighton Mine.

    The LBNL LZ Dark Matter Experiment (US) Dark Matter project at SURF, Lead, SD, USA.

    DAMA-LIBRA Dark Matter experiment at the Italian National Institute for Nuclear Physics’ (INFN’s) Gran Sasso National Laboratories (LNGS) located in the Abruzzo region of central Italy.

    DARWIN Dark Matter experiment. A design study for a next-generation, multi-ton dark matter detector in Europe at The University of Zurich [Universität Zürich](CH).

    PandaX II Dark Matter experiment at Jin-ping Underground Laboratory (CJPL) in Sichuan, China.

    Inside the Axion Dark Matter eXperiment U Washington (US) Credit : Mark Stone U. of Washington. Axion Dark Matter Experiment.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

  • richardmitnick 1:07 pm on November 16, 2021 Permalink | Reply
    Tags: "Black holes may be growing as the universe expands", , , , , Live Science (US)   

    From Live Science (US) : “Black holes may be growing as the universe expands” 

    From Live Science (US)

    Ben Turner

    A new hypothesis suggests the universe’s expansion could be causing all material objects to grow in mass.

    An artist’s depiction of the IC 10 X-1 system; a black hole lurks in the upper left corner. (Image credit: Universal History Archive/Universal Images Group via Getty Images)

    The universe’s black holes are bigger than astrophysicists expected them to be. Now, a new study suggests why: Every single black hole may be growing as the universe expands.

    The new hypothesis, called “cosmological coupling,” argues that as the universe expands outward after the Big Bang, all objects with mass grow with it too. And black holes, as some of the most massive objects to exist, grow the most.

    This hypothesis stems from the gravitational ripples in space-time that occur when two massive black holes get locked in orbit, spiral inward and collide. Since 2015, scientists at the Laser Interferometer Gravitational-Wave Observatory (LIGO) and Virgo interferometer, which are designed to detect these gravitational waves, have observed many of these black hole mergers.

    Caltech/MIT Advanced LIGO.

    But the waves contain a mystery. Based on the estimated size distribution of stars in the universe, black holes should have masses less than roughly 40 times the mass of the sun. But data taken from these gravitational waves show that many black holes are more than 50 solar masses, and some approach 100 solar masses.

    A common explanation for this mismatch is that black holes grow over time by gorging on gas, dust, stars and even other black holes. But because black holes often form after giant stellar explosions called supernovas, many black holes emerge in regions of space without any of this material. Astronomers have suggested alternative explanations, but all propose unseen changes to scientists’ current understanding of star life cycles. And none can explain the staggering size diversity of merged black holes that gravitational wave observatories have detected.

    The new paper, published Nov. 3 in The Astrophysical Journal Letters [no working links], proposes an explanation of both the large and small merged black hole masses: The ballooning masses of the black holes aren’t a result of anything they’re eating but are instead somehow tethered to the expansion of the universe itself.

    This would mean that all of the universe’s black holes — including the merging black holes detected in gravitational wave experiments, the wandering black holes at the outskirts of our galaxy and even the enormous supermassive black holes at the centers of most galaxies — are growing over time.

    To investigate their hypothesis, the researchers chose to model two merging black holes in a growing universe, rather than the static universes other research teams build for the sake of simplifying the complex equations (derived from Einstein’s theory of general relativity) that provide the foundations for black hole merger models.

    It takes just a few seconds for two spiraling black holes to merge, so assuming a static universe over that short time frame, as past work has done, seems sensible.

    Artist’s by-now-iconic conception of two merging black holes similar to those detected by LIGO. Credit: Aurore Simonnet/Caltech-MIT Advanced LIGO (US)/Sonoma State University (US).

    But the researchers disagree: they say that if scientists assume a static universe in their models, they could be ruling out potential changes to the two black holes over the billions of years they existed before reaching the point of collision.

    “It’s an assumption that simplifies Einstein’s equations, because a universe that doesn’t grow has much less to keep track of,” study first author Kevin S. Croker, a professor in The University of Hawaii at Mānoa (US) Department of Physics and Astronomy, said in a statement. “There is a trade-off, though: Predictions may only be reasonable for a limited amount of time.”

    By simulating millions of pairs of stars — from their births to their deaths — the researchers were able to study those that died to form paired black holes and to track how much the resulting black holes grew in proportion to the universe’s expansion. After comparing predictions from their expanding model universe with LIGO-Virgo data, the researchers were surprised to see how well the two matched.

    “I have to say, I didn’t know what to think at first,” co-author Gregory Tarlé, a professor of physics at The University of Michigan (US), said in a statement. “It was such a simple idea, I was surprised it worked so well.”

    The hypothesis may sound outlandish, but cosmological coupling exists elsewhere in astrophysics. The most famous example of this is probably “red shift,” in which objects moving away have their light stretched to longer (and, therefore, redder) wavelengths.

    This means that as the universe expands and stars move away from each other — like dots drawn on an inflating balloon — the light particles, or photons, that the stars emit become redder over time, losing energy as they do so. The energy of light is said to be cosmologically coupled with the universe’s expansion.
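The photon case can be made concrete with a short sketch: stretching a photon’s wavelength by a factor (1 + z) lowers its energy by the same factor. The emitted wavelength below is just an illustrative choice.

```python
# Cosmological coupling of photons: as the universe expands, a photon's
# wavelength stretches by (1 + z) and its energy falls by the same factor.
# Illustrative numbers only.

HC = 1239.84  # h * c in eV * nm, handy for photon energy conversions

def redshifted(wavelength_nm, z):
    """Observed wavelength (nm) and photon energy (eV) after redshift z."""
    obs_wavelength = wavelength_nm * (1 + z)
    obs_energy = HC / obs_wavelength
    return obs_wavelength, obs_energy

emitted = 500.0  # nm, green light at emission (illustrative)
for z in (0.0, 0.5, 1.0):
    wl, e = redshifted(emitted, z)
    print(f"z = {z}: observed {wl:.0f} nm, photon energy {e:.2f} eV")
```

At z = 1 the photon arrives with half the energy it left with; that lost energy is exactly what “cosmologically coupled” means here.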

    If the researchers are correct, it means everything with mass is getting bigger — suns, neutron stars, planets and even humans. Of course, this coupling would be much weaker for us than for black holes.

    “Cosmological coupling does apply to other objects and material in the universe, but the strength of the coupling is so weak that you cannot see its effects,” Croker told Live Science. “For the types of black hole we have hypothesized, the coupling can be a million times larger than what you’d expect from the core of the sun. And even for these sorts of black holes, you might have to wait hundreds of millions of years to just double your mass.”
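A minimal toy model of the proposed coupling treats an object’s mass as scaling with some power k of the universe’s scale factor a. The exponent values below are hypothetical illustrations, not figures from the paper.

```python
# Toy model of cosmological coupling: an object's mass scales with the
# universe's scale factor a as m = m0 * a**k. The coupling strength k is
# a free parameter here; the values tried below are hypothetical.

def mass_growth(m0, a_ratio, k):
    """Mass after the scale factor grows by a_ratio, for coupling strength k."""
    return m0 * a_ratio ** k

def expansion_to_double(k):
    """Factor by which the scale factor must grow to double a coupled mass."""
    return 2.0 ** (1.0 / k)

for k in (0.5, 1.0, 3.0):
    print(f"k = {k}: doubling a coupled mass needs the universe to expand "
          f"{expansion_to_double(k):.2f}x")
```

The stronger the coupling, the less expansion is needed before a mass doubles, which is why the effect would show up first in the most strongly coupled objects.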

    It may just be an interesting idea for now, but as gravitational wave detectors become more sensitive over time, more and more data will become available to test the hypothesis, Croker said.

    “Planned upgrades to LIGO-Virgo, plus the data they will collect over the next decade, will describe many more black hole mergers,” Croker said. “The more data that is collected, the more powerfully we can test our hypothesis. Space-based gravity wave experiments, like LISA [the Laser Interferometer Space Antenna], may allow us to see the mass gain directly in single systems.”


    European Space Agency (EU)/National Aeronautics and Space Administration (US) eLISA, space-based: the future of gravitational-wave research.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

  • richardmitnick 12:55 pm on October 13, 2021 Permalink | Reply
    Tags: "Scientists capture image of bizarre 'electron ice' for the first time", , Live Science (US), , Wigner crystal — a strange honeycomb-pattern material inside another material made entirely out of electrons.   

    From Live Science (US) : “Scientists capture image of bizarre ‘electron ice’ for the first time” 

    From Live Science (US)

    Ben Turner

    This scanning tunneling microscope image of the graphene sheet shows the honeycomb imprint of the ‘electron ice’ underneath it. (Image credit: H. Li et al./Nature)

    Physicists have taken the first-ever image of a Wigner crystal — a strange honeycomb-pattern material, made entirely out of electrons, that forms inside another material.

    Hungarian physicist Eugene Wigner first theorized this crystal in 1934, but it’s taken more than eight decades for scientists to finally get a direct look at the “electron ice.” The fascinating first image shows electrons squished together into a tight, repeating pattern — like tiny blue butterfly wings, or pressings of an alien clover.

    The researchers behind the study, published on Sept. 29 in the journal Nature, say that while this isn’t the first time that a Wigner crystal has been plausibly created or even had its properties studied, the visual evidence they collected is the most emphatic proof of the material’s existence yet.

    “If you say you have an electron crystal, show me the crystal,” study co-author Feng Wang, a physicist at The University of California (US), told Nature News.

    Inside ordinary conductors like silver or copper, or semiconductors like silicon, electrons zip around so fast that they are barely able to interact with each other. But at very low temperatures, they slow down to a crawl, and the repulsion between the negatively charged electrons begins to dominate. The once highly mobile particles grind to a halt, arranging themselves into a repeating, honeycomb-like pattern to minimize their total energy use.

    To see this in action, the researchers trapped electrons in the gap between atom-thick layers of two tungsten semiconductors — one tungsten disulfide and the other tungsten diselenide. Then, after applying an electric field across the gap to remove any potentially disruptive excess electrons, the researchers chilled their electron sandwich down to 5 degrees above absolute zero. Sure enough, the once-speedy electrons stopped, settling into the repeating structure of a Wigner crystal.
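A rough energy-scale estimate shows why the electrons freeze at 5 kelvin: the unscreened Coulomb repulsion between neighboring trapped electrons is hundreds of times their thermal energy, so arranging themselves to minimize repulsion wins. The electron spacing used below, and the neglect of dielectric screening, are illustrative assumptions rather than values from the paper.

```python
# Why the electrons freeze: at 5 K the Coulomb repulsion between neighboring
# trapped electrons dwarfs their thermal energy, so the repulsion-minimizing
# honeycomb lattice wins. Spacing is assumed; screening is neglected.

COULOMB_EV_NM = 1.44      # e^2 / (4*pi*eps0) in eV * nm
BOLTZMANN_EV = 8.617e-5   # Boltzmann constant in eV per kelvin

spacing_nm = 20.0         # assumed electron-electron spacing (illustrative)
temperature_k = 5.0       # temperature reported in the article

coulomb_energy = COULOMB_EV_NM / spacing_nm    # eV per neighboring pair, unscreened
thermal_energy = BOLTZMANN_EV * temperature_k  # eV

print(f"Coulomb repulsion ~ {coulomb_energy * 1000:.1f} meV per pair")
print(f"Thermal energy    ~ {thermal_energy * 1000:.2f} meV")
print(f"Ratio ~ {coulomb_energy / thermal_energy:.0f}: repulsion dominates")
```

Real dielectric screening in the tungsten layers would shrink the Coulomb term, but even a large reduction leaves repulsion comfortably ahead of thermal motion at this temperature.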

    The researchers then used a device called a scanning tunneling microscope (STM) to view this new crystal. STMs work by applying a tiny voltage across a very sharp metal tip before running it just above a material, causing electrons to leap down to the material’s surface from the tip. The rate at which electrons jump from the tip depends on what is underneath it, so researchers can build up a picture of the Braille-like contours of a 2D surface by measuring the current flowing into the surface at each point.
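The STM can resolve such fine contours because the tunneling current depends exponentially on tip-surface distance, roughly I ∝ V·exp(−2κd). The decay constant used below is a typical order-of-magnitude value for metal work functions, assumed here for illustration.

```python
import math

# Exponential distance sensitivity of an STM: tunneling current goes as
# I ~ V * exp(-2 * kappa * d). A decay constant kappa of about 1 per
# angstrom is typical for metal work functions (assumed, illustrative).

KAPPA = 1.0  # inverse angstroms, assumed decay constant

def relative_current(d_angstrom):
    """Tunneling current relative to d = 0, ignoring prefactors."""
    return math.exp(-2.0 * KAPPA * d_angstrom)

# Pulling the tip back by a single angstrom cuts the current by e^2 ~ 7.4x,
# which is why an STM can resolve atomic-scale height differences.
drop_per_angstrom = relative_current(1.0) / relative_current(2.0)
print(f"current ratio per angstrom of retraction: {drop_per_angstrom:.2f}")
```

That factor-of-seven change per angstrom is what let the graphene overlayer’s faint Wigner-crystal imprint register as a measurable current variation.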

    But the current provided by the STM was at first too much for the delicate electron ice, “melting” it upon contact. To stop this, the researchers inserted a single-atom layer of graphene just above the Wigner crystal, enabling the crystal to interact with the graphene and leave an impression on it that the STM could safely read — much like a photocopier. By tracing the image imprinted on the graphene sheet completely, the STM captured the first snapshot of the Wigner crystal, proving its existence beyond all doubt.

    Now that they have conclusive proof that Wigner crystals exist, scientists can use the crystals to answer deeper questions about how multiple electrons interact with each other, such as why the crystals arrange themselves in honeycomb orderings, and how they “melt.” The answers will offer a rare glimpse into some of the most elusive properties of the tiny particles.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition
