Tagged: Basic Research

  • richardmitnick 3:46 pm on October 15, 2021 Permalink | Reply
    Tags: "Two Impacts-Not Just One-May Have Formed The Moon", , Basic Research, , ,   

    From Sky & Telescope : “Two Impacts-Not Just One-May Have Formed The Moon” 

    From Sky & Telescope

    October 14, 2021
    Asa Stahl

    In this image, the proposed hit-and-run collision is simulated in 3D, shown about an hour after impact. Theia, the impactor, barely escapes the collision. A. Emsenhuber / The University of Bern [Universität Bern](CH) / The Ludwig Maximilians University of Munich [Ludwig-Maximilians-Universität München](DE).

    Scientists have long thought that the Moon formed with a bang, when a protoplanet the size of Mars hit the newborn Earth. Evidence from Moon rocks and simulations back up this idea.

    But a new study suggests that the protoplanet most likely hit Earth twice. The first time, the impactor (dubbed “Theia”) only glanced off Earth. Then, some hundreds of thousands of years later, it came back to deliver the final blow.

    The study, which simulated the literally Earth-shattering impact thousands of times, found that such a “hit-and-run return” scenario could help answer two longstanding questions surrounding the creation of the Moon. At the same time, it might explain how Earth and Venus ended up so different.

    The One-Two Punch

    “The key issue here is planetary diversity,” says Erik Asphaug (The University of Arizona (US)), who led the study. Venus and Earth have similar sizes, masses, and distances from the Sun. If Venus is a “crushing hot-house,” he asks, “why is Earth so amazingly blue and rich?”

    The Moon might hold the secret. Its creation was the last major episode in Earth’s formation, a catastrophic event that set the stage for the rest of our planet’s evolution. “You can’t understand how Earth formed without understanding how the Moon formed,” Asphaug explains. “They are part of the same puzzle.”

    The new simulations, which were published in October in The Planetary Science Journal, put a few more pieces of that puzzle into place.

    The first has to do with the speed of Theia’s impact. If Theia had hit our planet too fast, it would have exploded into an interplanetary plume of debris and eroded much of Earth. Yet if it had come in too slowly, the result would be a Moon whose orbit looks nothing like what we see today. The original impact theory doesn’t explain why Theia traveled at a just-right speed between these extremes.

    “[This] new scenario fixes that,” says Matthias Meier (Natural History Museum, Switzerland), who was not involved in the study. Initially, Theia could have been going much faster, but the first impact would have slowed it down to the perfect speed for the second one.
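
    For a rough sense of the speed scale involved (a back-of-the-envelope aside, not a calculation from the paper): the natural benchmark for a giant impact is the mutual escape velocity of the two bodies. Collisions near that speed tend to graze and merge, while substantially faster ones hit and run or erode the target. A minimal sketch in Python, assuming a Mars-sized impactor striking the proto-Earth:

```python
import math

G = 6.674e-11                          # gravitational constant, m^3 kg^-1 s^-2

# Rough proto-Earth and Mars-like (Theia-sized) values -- illustrative only
m_earth, r_earth = 5.97e24, 6.371e6    # kg, m
m_theia, r_theia = 6.42e23, 3.39e6     # kg, m

# Mutual escape velocity: the characteristic speed separating slow,
# "graze and merge" collisions from faster hit-and-run ones
v_esc = math.sqrt(2 * G * (m_earth + m_theia) / (r_earth + r_theia))
print(f"mutual escape velocity ~ {v_esc / 1e3:.1f} km/s")   # roughly 9-10 km/s
```

    An impactor arriving well above this speed can glance off and escape; losing energy in that first encounter is what sets up the slower, Moon-forming return collision.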

    The other problem with the original impact theory is that our Moon ought to be mostly made of primordial Theia. But Moon rocks from the Apollo missions show that Earth and the Moon have nearly identical compositions when it comes to certain kinds of elements. How could they have formed from two different building blocks?

    “The canonical giant-impact scenario is really bad at solving [this issue],” Meier says (though others have tried).

    A hit-and-run return, on the other hand, would enable Earth’s and Theia’s materials to mix more than in a single impact, ultimately forming a Moon chemically more similar to Earth. Though Asphaug and colleagues don’t quite fix the mismatch, they argue that more advanced simulations would yield even better results.

    Earth vs. Venus

    Resolving this aspect of the giant-impact theory would be no mean feat. But Asphaug’s real surprise came when he saw how hit-and-run impacts would have affected Venus compared to Earth.

    “I first thought maybe there was a mistake,” he recalls.

    The new simulations showed that the young Earth tended to pass on half of its hit-and-run impactors to Venus, while Venus accreted almost everything that came its way. This dynamic could help explain the drastic differences between the two planets: if more runners ended up at Venus, they would have enriched it in outer solar system material relative to Earth. And since the impactors that escaped Earth and went on to Venus would have been the faster ones, each planet would have experienced a generally different set of collisions.

    This finding flips the original purpose of the study on its head. If Venus suffered more giant impacts than Earth, the question would no longer be “why does Earth have a moon?” but “why doesn’t Venus?”

    Perhaps there was only one hit-and-run event, the one that made our Moon. Perhaps there were many, but for the same reason that Venus collected more impacts than Earth, it also accreted more destructive debris, obliterating any moon it already had. Or perhaps the last of Venus’ impacts was just particularly violent.

    Finding out means taking a trip to Venus. That would provide “the next leap in understanding,” Meier says. If Earth and Venus both had hit-and-runs, for example, then the surface of Venus ought to be more like Earth’s than previously expected. If Venus has the same chemical similarities as the Moon and Earth, that would throw out the giant-impact theory’s last remaining problem.

    “Getting samples from Venus,” Asphaug concludes, “is the key to answering all these questions.”

    See the full article here.


    Please help promote STEM in your local schools.


    Stem Education Coalition

    Sky & Telescope, founded in 1941 by Charles A. Federer Jr. and Helen Spence Federer, has the largest, most experienced staff of any astronomy magazine in the world. Its editors are virtually all amateur or professional astronomers, and every one has built a telescope, written a book, done original research, developed a new product, or otherwise distinguished him or herself.

    Sky & Telescope magazine, now in its eighth decade, came about because of some happy accidents. Its earliest known ancestor was a four-page bulletin called The Amateur Astronomer, which was begun in 1929 by the Amateur Astronomers Association in New York City. Then, in 1935, the American Museum of Natural History opened its Hayden Planetarium and began to issue a monthly bulletin that became a full-size magazine called The Sky within a year. Under the editorship of Hans Christian Adamson, The Sky featured large illustrations and articles from astronomers all over the globe. It immediately absorbed The Amateur Astronomer.

    Despite initial success, by 1939 the planetarium found itself unable to continue financial support of The Sky. Charles A. Federer, who would become the dominant force behind Sky & Telescope, was then working as a lecturer at the planetarium. He was asked to take over publishing The Sky. Federer agreed and started an independent publishing corporation in New York.

    “Our first issue came out in January 1940,” he noted. “We dropped from 32 to 24 pages, used cheaper quality paper…but editorially we further defined the departments and tried to squeeze as much information as possible between the covers.” Federer was The Sky’s editor, and his wife, Helen, served as managing editor. In that January 1940 issue, they stated their goal: “We shall try to make the magazine meet the needs of amateur astronomy, so that amateur astronomers will come to regard it as essential to their pursuit, and professionals to consider it a worthwhile medium in which to bring their work before the public.”

     
  • richardmitnick 9:16 pm on October 14, 2021 Permalink | Reply
    Tags: "Rocky exoplanets and their host stars may have similar composition", , , Basic Research, ,   

    From IAC Institute of Astrophysics of the Canary Islands [Instituto de Astrofísica de Canarias] (ES) : “Rocky exoplanets and their host stars may have similar composition” 


    From IAC Institute of Astrophysics of the Canary Islands [Instituto de Astrofísica de Canarias] (ES)

    14/10/2021

    Garik Israelian
    gil@iac.es

    Illustration of the formation of a planet round a star similar to the Sun, with rocks and iron molecules in the foreground. Credit: Tania Cunha (Harbor Planetarium [Planetário do Porto](PT) – Centro Ciência Viva & Instituto de Astrofísica e Ciências do Espaço).

    Newly formed stars have protoplanetary discs around them. A fraction of the material in the disc condenses into planet-forming chunks, and the rest finally falls into the star. Because of their common origin, researchers have assumed that the composition of these chunks and that of the rocky planets with low masses should be similar to that of their host stars. However, until now the Solar System was the only available reference for the astronomers.

    In a new research article, published today in the journal Science, an international team of astronomers led by the researcher Vardan Adibekyan, of The Instituto de Astrofísica e Ciências do Espaço (IA), with participation by the Instituto de Astrofísica de Canarias (IAC), has established for the first time a correlation between the composition of rocky exoplanets and that of their host stars. The study also shows that this relation does not correspond exactly to the relation previously assumed.

    “The team found that the composition of rocky planets is closely related to the composition of their host stars, which could help us to identify planets which may be similar to ours”, explains Vardan Adibekyan, the first author on the paper. “In addition, the iron content of these planets is higher than that predicted from the composition of the protoplanetary discs from which they formed, which is due to the specific characteristics of the formation processes of planets, and the chemistry of the discs. Our work supports models of planet formation and a level of certainty and detail without precedent”, he added.

    For Garik Israelian, an IAC researcher and co-author of the article, this result could not have been imagined in the year 2000. “At that time we tried to find a correlation between the chemical composition of certain solar type stars and the presence of planets orbiting them (or of their orbital characteristics). It was hard to believe that twenty years later these studies would grow to include the metal abundances of planets similar to the Earth”, he emphasises.

    “For us this would have seemed to be science fiction. Planets similar to the Earth were not yet known, and we concentrated only on the planets we could find, and on the parameters of their orbits around their host stars. And today, we are studying the chemical composition of the interiors and of the atmospheres of extrasolar planets. It is a great leap forward”, he added.

    To establish the relation, the team selected the twenty-one rocky planets that had been most accurately characterized, using their measured masses and radii to determine their densities and iron content. They also used high-resolution spectra from the latest generation of spectrographs at the major world observatories: at Mauna Kea (Hawaii), at La Silla and Paranal (Chile), and at the Roque de los Muchachos (Garafía, La Palma, Canary Islands), to determine the compositions of the host stars and of the components most critical for the formation of rocks in the protoplanetary discs.
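
    The density step in that procedure is simple enough to sketch. The following is a minimal illustration (with made-up planet values, not the paper's sample) of how a measured mass and radius become a bulk density, which is then compared with interior models to estimate iron content:

```python
import math

M_EARTH, R_EARTH = 5.972e24, 6.371e6                  # kg, m
RHO_EARTH = M_EARTH / (4 / 3 * math.pi * R_EARTH**3)  # ~5513 kg/m^3

def bulk_density(mass_earths, radius_earths):
    """Bulk density in kg/m^3 from a mass in Earth masses and a radius in Earth radii."""
    mass = mass_earths * M_EARTH
    radius = radius_earths * R_EARTH
    return mass / (4 / 3 * math.pi * radius**3)

# Hypothetical rocky exoplanet: 5 Earth masses, 1.5 Earth radii (illustrative numbers)
rho = bulk_density(5.0, 1.5)
print(f"bulk density ~ {rho:.0f} kg/m^3 ({rho / RHO_EARTH:.2f} x Earth)")

# A density well above that of an Earth-like rock/iron mixture of the same mass points
# to a larger iron fraction; the study derives that fraction from full interior-structure
# models rather than from this simple ratio.
```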

    “Understanding the link in the composition between the stars and their planets has been a basic aspect of research in our centre for over a decade. Using the best high-resolution spectrographs, such as HARPS and ESPRESSO at the European Southern Observatory (ESO), our team has collected spectra of the host stars of exoplanets for several years.

    These spectra were used to determine the stellar parameters and abundances of the host stars, and the results have been put together in the published catalogue SWEET-Cat”, explained Nuno Santos, a researcher at the IA and a co-author of the article.

    The team also found an intriguing result. There are differences in the iron fraction between the super-Earths and the super-Mercuries, which implies that these planets constitute different populations in terms of composition, with further implications for their formation. This finding will need more study, because simulations of planet formation that incorporate collisions cannot by themselves reproduce the high-density super-Mercuries. “Understanding the formation of the super-Mercuries will help us to understand the especially high density of Mercury”, Adibekyan assures us.

    This research was carried out in the framework of the project “Observational Tests of the Processes of Nucleosynthesis in the Universe” started in the year 2000 by the IAC researcher Garik Israelian; Michel Mayor, Nobel Laureate in Physics, 2019; and Nuno Santos, researcher at the IA.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    IAC Institute of Astrophysics of the Canary Islands [Instituto de Astrofísica de Canarias] (ES) operates two astronomical observatories in the Canary Islands:

    Roque de los Muchachos Observatory on La Palma
    Teide Observatory on Tenerife.

    The seeing statistics at ORM make it the second-best location for optical and infrared astronomy in the Northern Hemisphere, after Mauna Kea Observatory Hawaii (US).

    Maunakea Observatories Hawai’i (US) altitude 4,213 m (13,822 ft)

    The site also has some of the most extensive astronomical facilities in the Northern Hemisphere; its fleet of telescopes includes the 10.4 m Gran Telescopio Canarias, the world’s largest single-aperture optical telescope as of July 2009, the William Herschel Telescope (second largest in Europe), and the adaptive optics corrected Swedish 1-m Solar Telescope.

    Gran Telescopio Canarias [Instituto de Astrofísica de Canarias](ES), sited on a volcanic peak 2,267 metres (7,438 ft) above sea level.

    The observatory was established in 1985, after 15 years of international work and cooperation among several countries, with the Spanish island hosting many telescopes from Britain, The Netherlands, Spain, and other countries. The island provided better seeing conditions than Herstmonceux for the telescopes of the Royal Greenwich Observatory, including the 98-inch aperture Isaac Newton Telescope (the largest reflector in Europe at that time). When it was moved to the island it was upgraded with a 100-inch (2.54-metre) mirror, and many even larger telescopes from various nations would be hosted there.

    Teide Observatory [Observatorio del Teide], IAU code 954, is an astronomical observatory on Mount Teide at 2,390 metres (7,840 ft), located on Tenerife, Spain. It has been operated by the Instituto de Astrofísica de Canarias since its inauguration in 1964. It became one of the first major international observatories, attracting telescopes from different countries around the world because of the good astronomical seeing conditions. Later the emphasis for optical telescopes shifted more towards Roque de los Muchachos Observatory on La Palma.

     
  • richardmitnick 4:40 pm on October 14, 2021 Permalink | Reply
    Tags: "Department of Energy gives green light for a flagship petawatt laser facility at SLAC", , Basic Research, , , Locating high-energy high-power lasers next to an XFEL can now be realized., , , Two state-of-the-art laser systems ­– a high-power petawatt laser and a high-energy kilojoule laser., University of Rochester’s Laboratory for Laser Energetics (LLE)   

    From DOE’s SLAC National Accelerator Laboratory (US) : “Department of Energy gives green light for a flagship petawatt laser facility at SLAC” 

    From DOE’s SLAC National Accelerator Laboratory (US)

    October 7, 2021
    Ali Sundermier
    Glennda Chui

    High-power lasers will work in concert with the lab’s X-ray laser to dramatically improve our understanding of matter in extreme conditions.

    Petawatt lasers are the most powerful on the planet, generating a million billion watts to produce some of the most extreme conditions seen on Earth. But today’s petawatt lasers are standalone facilities, with limited ability to fully diagnose the conditions they produce.

    A new facility at the Department of Energy’s SLAC National Accelerator Laboratory will change that. It will be the first to combine these powerful lasers with an X-ray free-electron laser (XFEL) that can probe the extreme conditions they create as never before. Coupled to the lab’s Linac Coherent Light Source (LCLS), the Matter in Extreme Conditions Upgrade, or MEC-U, promises to dramatically improve our understanding of the conditions needed to produce fusion energy and to replicate a wide range of astrophysical phenomena here on Earth.
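
    For context on the “million billion watts” figure: a petawatt (10^15 W) describes peak power, reached by squeezing a modest pulse energy into an extremely short time, which is also why a “high-power” petawatt laser and a “high-energy” kilojoule laser are different machines. A quick illustration with generic numbers (not the specifications of the planned MEC-U lasers):

```python
def peak_power_watts(pulse_energy_joules, pulse_duration_seconds):
    """Approximate peak power as pulse energy divided by pulse duration."""
    return pulse_energy_joules / pulse_duration_seconds

# ~30 J compressed into ~30 femtoseconds reaches a petawatt
print(f"short pulse: {peak_power_watts(30.0, 30e-15):.1e} W")   # 1.0e+15 W

# A kilojoule delivered over a few nanoseconds carries far more energy
# but far less peak power
print(f"long pulse:  {peak_power_watts(1000.0, 3e-9):.1e} W")   # ~3.3e+11 W
```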

    In a new underground experimental facility coupled to SLAC’s Linac Coherent Light Source (LCLS), two state-of-the-art laser systems – a high-power petawatt laser and a high-energy kilojoule laser – will feed into two new experimental areas dedicated to the study of hot dense plasmas, astrophysics, and planetary science. (Gilliss Dyer/SLAC National Accelerator Laboratory)

    The project got approval from the DOE Office of Science (SC) on Monday to move from its conceptual design phase to preliminary design and execution, having passed what is known as Critical Decision 1.

    “It’s been gratifying to see the community rally together to support this project, and I think this achievement really validates those efforts. It shows that this notion of locating high-energy high-power lasers next to an XFEL can now be realized,” said SLAC scientist Arianna Gleason.

    “Working in concert, they’ll allow us to look behind the curtain of physics at extreme conditions to see how it’s all stitched together, opening a new frontier.”

    A national opportunity

    SLAC will work in partnership with The DOE’S Lawrence Livermore National Laboratory (US) and University of Rochester’s Laboratory for Laser Energetics (LLE) to design and construct the facility in a new underground cavern.

    University of Rochester(US) Laboratory for Laser Energetics.

    There, two state-of-the-art laser systems – a high-power petawatt laser and a high-energy kilojoule laser – will feed into two new experimental areas dedicated to the study of hot dense plasmas, astrophysics, and planetary science.

    “Not only are we working with some of the leading laser laboratories in the world, but we’re also working with world experts in experimental science, high energy density science and the operation of DOE Office of Science user facilities, where scientists from all over the world can come to do experiments,” said Alan Fry, MEC-U Project Director.

    Scientists started discussing what would be needed to make a quantum leap in this field in 2014 at a series of high-power laser workshops at SLAC. Three years later, a National Academies report called “Opportunities in intense ultrafast lasers: Reaching for the brightest light” highlighted the importance of this field of science. It recommended that DOE secure a key global advantage for the U.S. by locating high-intensity lasers “with existing infrastructure, such as particle accelerators.”

    Building on success

    This project builds on the success achieved at the existing Matter in Extreme Conditions (MEC) instrument at LCLS. Funded by DOE SC’s Fusion Energy Sciences program (FES), MEC uses short-pulse lasers coupled to X-ray laser pulses from LCLS to probe the characteristics of matter with unprecedented precision. These experiments have delivered a wealth of outstanding science and attracted worldwide media attention, with examples ranging from the study of “diamond rain” thought to exist on Neptune, to investigating the signatures of asteroid impacts on the Earth, to studying potential failure mechanisms of satellites due to solar flares.

    The Matter in Extreme Conditions instrument at SLAC serves hundreds of scientists from across the community, providing the tools necessary to investigate extremely hot, dense matter similar to that found in the centers of stars and giant planets. Credit: Matt Beardsley/SLAC National Accelerator Laboratory.

    The existing MEC instrument is, however, limited in the regimes it can access. Its modest laser capabilities don’t allow it to reach the conditions of highest interest to researchers. The community therefore called for investment in a petawatt laser that can produce unprecedented light pressures and generate plasmas at the even higher temperatures found in cosmic collisions, the cores of stars and planets, and fusion devices, giving scientists access to the more extreme forms of matter needed to address the most important scientific challenges identified by the broad community of scientific users.

    “The new high-power lasers being designed by Livermore and Rochester are world-leading in their own right,” Fry said. “The fact that they’re coupled to LCLS then really puts it over the top in terms of capabilities.”

    MEC-U will also take advantage of the LCLS-II upgrade to the LCLS facility, which will provide X-ray laser beams of unsurpassed brilliance for probing those plasmas, doubling the X-ray energy that has been attainable to date.

    SLAC/LCLS II projected view.

    Magnets called undulators stretch roughly 100 meters down a tunnel at SLAC National Accelerator Laboratory, with one side (right) producing hard X-rays and the other soft X-rays. Credit: SLAC National Accelerator Laboratory.

    New scientific frontiers

    Access to the facility will be open to researchers from across the country and around the world, facilitated in part by LaserNetUS, a research network that is boosting access to high-intensity laser facilities at labs and universities across the country. This will allow more MEC users in a broader range of fields to use the facility, while also helping train new staff and develop new techniques.

    “This new facility will lead to a greater understanding of everything from fusion energy to the most extreme phenomena in the universe, shedding light on cosmic rays, planetary physics and stellar conditions,” said Siegfried Glenzer, director of the High Energy Density Division at SLAC. “It really shows the DOE’s dedication to continue to tackle the most important and exciting problems in plasma physics.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    SLAC National Accelerator Laboratory (US) originally named Stanford Linear Accelerator Center, is a Department of Energy (US) National Laboratory operated by Stanford University (US) under the programmatic direction of the Department of Energy (US) Office of Science and located in Menlo Park, California. It is the site of the Stanford Linear Accelerator, a 3.2 kilometer (2-mile) linear accelerator constructed in 1966 and shut down in the 2000s, which could accelerate electrons to energies of 50 GeV.
    Today SLAC research centers on a broad program in atomic and solid-state physics, chemistry, biology, and medicine using X-rays from synchrotron radiation and a free-electron laser as well as experimental and theoretical research in elementary particle physics, astroparticle physics, and cosmology.

    Founded in 1962 as the Stanford Linear Accelerator Center, the facility is located on 172 hectares (426 acres) of Stanford University-owned land on Sand Hill Road in Menlo Park, California—just west of the University’s main campus. The main accelerator is 3.2 kilometers (2 mi) long—the longest linear accelerator in the world—and has been operational since 1966.

    Research at SLAC has produced three Nobel Prizes in Physics

    1976: The charm quark—see J/ψ meson
    1990: Quark structure inside protons and neutrons
    1995: The tau lepton

    SLAC’s meeting facilities also provided a venue for the Homebrew Computer Club and other pioneers of the home computer revolution of the late 1970s and early 1980s.

    In 1984 the laboratory was named an ASME National Historic Engineering Landmark and an IEEE Milestone.

    SLAC developed and, in December 1991, began hosting the first World Wide Web server outside of Europe.

    In the early-to-mid 1990s, the Stanford Linear Collider (SLC) investigated the properties of the Z boson using the Stanford Large Detector.

    As of 2005, SLAC employed over 1,000 people, some 150 of whom were physicists with doctorate degrees, and served over 3,000 visiting researchers yearly, operating particle accelerators for high-energy physics and the Stanford Synchrotron Radiation Laboratory (SSRL) for synchrotron light radiation research, which was “indispensable” in the research leading to the 2006 Nobel Prize in Chemistry awarded to Stanford Professor Roger D. Kornberg.

    In October 2008, the Department of Energy announced that the center’s name would be changed to SLAC National Accelerator Laboratory. The reasons given include a better representation of the new direction of the lab and the ability to trademark the laboratory’s name. Stanford University had legally opposed the Department of Energy’s attempt to trademark “Stanford Linear Accelerator Center”.

    In March 2009, it was announced that the SLAC National Accelerator Laboratory was to receive $68.3 million in Recovery Act Funding to be disbursed by Department of Energy’s Office of Science.

    In October 2016, Bits and Watts launched as a collaboration between SLAC and Stanford University to design “better, greener electric grids”. SLAC later pulled out over concerns about an industry partner, the state-owned Chinese electric utility.

    Accelerator

    The main accelerator was an RF linear accelerator that accelerated electrons and positrons up to 50 GeV. At 3.2 km (2.0 mi) long, the accelerator was the longest linear accelerator in the world and was claimed to be “the world’s most straight object” until 2017, when the European X-ray free-electron laser opened. The main accelerator is buried 9 m (30 ft) below ground and passes underneath Interstate Highway 280. The above-ground klystron gallery atop the beamline was the longest building in the United States until the LIGO project’s twin interferometers were completed in 1999. It is easily distinguishable from the air and is marked as a visual waypoint on aeronautical charts.

    A portion of the original linear accelerator is now part of the Linac Coherent Light Source [below].

    Stanford Linear Collider

    The Stanford Linear Collider was a linear accelerator that collided electrons and positrons at SLAC. The center of mass energy was about 90 GeV, equal to the mass of the Z boson, which the accelerator was designed to study. Grad student Barrett D. Milliken discovered the first Z event on 12 April 1989 while poring over the previous day’s computer data from the Mark II detector. The bulk of the data was collected by the SLAC Large Detector, which came online in 1991. Although largely overshadowed by the Large Electron–Positron Collider at CERN, which began running in 1989, the highly polarized electron beam at SLC (close to 80%) made certain unique measurements possible, such as parity violation in Z Boson-b quark coupling.

    At present no beam enters the south and north arcs of the machine, which lead to the Final Focus; this section is mothballed, and beam is instead run from the beam switchyard into the PEP-II section.

    The SLAC Large Detector (SLD) was the main detector for the Stanford Linear Collider. It was designed primarily to detect Z bosons produced by the accelerator’s electron-positron collisions. Built in 1991, the SLD operated from 1992 to 1998.

    SLAC National Accelerator Laboratory (US) Large Detector

    PEP

    PEP (Positron-Electron Project) began operation in 1980, with center-of-mass energies up to 29 GeV. At its apex, PEP had five large particle detectors in operation, as well as a sixth smaller detector. About 300 researchers made use of PEP. PEP stopped operating in 1990, and PEP-II began construction in 1994.

    PEP-II

    From 1999 to 2008, the main purpose of the linear accelerator was to inject electrons and positrons into the PEP-II accelerator, an electron-positron collider with a pair of storage rings 2.2 km (1.4 mi) in circumference. PEP-II was host to the BaBar experiment, one of the so-called B-Factory experiments studying charge-parity symmetry.

    SLAC National Accelerator Laboratory(US) BaBar

    SLAC National Accelerator Laboratory(US)/SSRL

    Fermi Gamma-ray Space Telescope

    SLAC plays a primary role in the mission and operation of the Fermi Gamma-ray Space Telescope, launched in August 2008. The principal scientific objectives of this mission are:

    To understand the mechanisms of particle acceleration in AGNs, pulsars, and SNRs.
    To resolve the gamma-ray sky: unidentified sources and diffuse emission.
    To determine the high-energy behavior of gamma-ray bursts and transients.
    To probe dark matter and fundamental physics.


    KIPAC

    The Stanford PULSE Institute (PULSE) is a Stanford Independent Laboratory located in the Central Laboratory at SLAC. PULSE was created by Stanford in 2005 to help Stanford faculty and SLAC scientists develop ultrafast x-ray research at LCLS.

    The Linac Coherent Light Source (LCLS)[below] is a free electron laser facility located at SLAC. The LCLS is partially a reconstruction of the last 1/3 of the original linear accelerator at SLAC, and can deliver extremely intense x-ray radiation for research in a number of areas. It achieved first lasing in April 2009.

    The laser produces hard X-rays, 10^9 times the relative brightness of traditional synchrotron sources, and is the most powerful X-ray source in the world. LCLS enables a variety of new experiments and provides enhancements for existing experimental methods. Often, X-rays are used to take “snapshots” of objects at the atomic level before obliterating samples. The laser’s wavelength, ranging from 6.2 to 0.13 nm (200 to 9500 electron volts (eV)), is similar to the width of an atom, providing extremely detailed information that was previously unattainable. Additionally, the laser is capable of capturing images with a “shutter speed” measured in femtoseconds, or million-billionths of a second, necessary because the intensity of the beam is often high enough that the sample explodes on the femtosecond timescale.
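
    The quoted wavelength and photon-energy ranges are two views of the same numbers, related by E = hc/λ (about 1239.84 eV·nm). A quick check using only that standard conversion:

```python
HC_EV_NM = 1239.84  # h*c in eV*nm

def photon_energy_ev(wavelength_nm):
    """Photon energy in electron volts for a wavelength given in nanometres."""
    return HC_EV_NM / wavelength_nm

for wavelength in (6.2, 0.13):
    print(f"{wavelength} nm -> {photon_energy_ev(wavelength):.0f} eV")
# 6.2 nm -> 200 eV and 0.13 nm -> ~9500 eV, matching the range quoted above
```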

    The LCLS-II [below] project will provide a major upgrade to LCLS by adding two new X-ray laser beams. The new system will utilize the 500 m (1,600 ft) of existing tunnel to add a new superconducting accelerator at 4 GeV and two new sets of undulators that will increase the available energy range of LCLS. The discoveries enabled by these new capabilities may include new drugs, next-generation computers, and new materials.

    FACET

    In 2012, the first two-thirds (~2 km) of the original SLAC LINAC were recommissioned for a new user facility, the Facility for Advanced Accelerator Experimental Tests (FACET). This facility was capable of delivering 20 GeV, 3 nC electron (and positron) beams with short bunch lengths and small spot sizes, ideal for beam-driven plasma acceleration studies. The facility ended operations in 2016 for the construction of LCLS-II, which will occupy the first third of the SLAC LINAC. The FACET-II project will re-establish electron and positron beams in the middle third of the LINAC for the continuation of beam-driven plasma acceleration studies in 2019.

    The Next Linear Collider Test Accelerator (NLCTA) is a 60-120 MeV high-brightness electron beam linear accelerator used for experiments on advanced beam manipulation and acceleration techniques. It is located at SLAC’s end station B.

    SSRL and LCLS are DOE Office of Science user facilities.

    Stanford University (US)

    Leland and Jane Stanford founded the University to “promote the public welfare by exercising an influence on behalf of humanity and civilization.” Stanford opened its doors in 1891, and more than a century later, it remains dedicated to finding solutions to the great challenges of the day and to preparing our students for leadership in today’s complex world. Stanford is an American private research university located in Stanford, California, on an 8,180-acre (3,310 ha) campus near Palo Alto. Since 1952, more than 54 Stanford faculty, staff, and alumni have won the Nobel Prize, including 19 current faculty members.

    Stanford University, officially Leland Stanford Junior University, is a private research university located in Stanford, California. Stanford was founded in 1885 by Leland and Jane Stanford in memory of their only child, Leland Stanford Jr., who had died of typhoid fever at age 15 the previous year. Stanford is consistently ranked as among the most prestigious and top universities in the world by major education publications. It is also one of the top fundraising institutions in the country, becoming the first school to raise more than a billion dollars in a year.

    Leland Stanford was a U.S. senator and former governor of California who made his fortune as a railroad tycoon. The school admitted its first students on October 1, 1891, as a coeducational and non-denominational institution. Stanford University struggled financially after the death of Leland Stanford in 1893 and again after much of the campus was damaged by the 1906 San Francisco earthquake. Following World War II, provost Frederick Terman supported faculty and graduates’ entrepreneurialism to build self-sufficient local industry in what would later be known as Silicon Valley.

    The university is organized around seven schools: three schools consisting of 40 academic departments at the undergraduate level as well as four professional schools that focus on graduate programs in law, medicine, education, and business. All schools are on the same campus. Students compete in 36 varsity sports, and the university is one of two private institutions in the Division I FBS Pac-12 Conference. It has gained 126 NCAA team championships, and Stanford has won the NACDA Directors’ Cup for 24 consecutive years, beginning in 1994–1995. In addition, Stanford students and alumni have won 270 Olympic medals including 139 gold medals.

    As of October 2020, 84 Nobel laureates, 28 Turing Award laureates, and eight Fields Medalists have been affiliated with Stanford as students, alumni, faculty, or staff. In addition, Stanford is particularly noted for its entrepreneurship and is one of the most successful universities in attracting funding for start-ups. Stanford alumni have founded numerous companies, which combined produce more than $2.7 trillion in annual revenue, roughly equivalent to the 7th largest economy in the world (as of 2020). Stanford is the alma mater of one president of the United States (Herbert Hoover), 74 living billionaires, and 17 astronauts. It is also one of the leading producers of Fulbright Scholars, Marshall Scholars, Rhodes Scholars, and members of the United States Congress.

    Stanford University was founded in 1885 by Leland and Jane Stanford, dedicated to Leland Stanford Jr, their only child. The institution opened in 1891 on Stanford’s previous Palo Alto farm.

    Jane and Leland Stanford modeled their university after the great eastern universities, most specifically Cornell University. When Stanford opened in 1891 it was called the “Cornell of the West”, largely because so many of its faculty were former Cornell affiliates (professors, alumni, or both), including its first president, David Starr Jordan, and second president, John Casper Branner. Both Cornell and Stanford were among the first to make higher education accessible, nonsectarian, and open to women as well as to men. Cornell is credited as one of the first American universities to adopt this radical departure from traditional education, and Stanford became an early adopter as well.

    Despite being impacted by earthquakes in both 1906 and 1989, the campus was rebuilt each time. In 1919, The Hoover Institution on War, Revolution and Peace was started by Herbert Hoover to preserve artifacts related to World War I. The Stanford Medical Center, completed in 1959, is a teaching hospital with over 800 beds. The DOE’s SLAC National Accelerator Laboratory(US)(originally named the Stanford Linear Accelerator Center), established in 1962, performs research in particle physics.

    Land

    Most of Stanford is on an 8,180-acre (12.8 sq mi; 33.1 km^2) campus, one of the largest in the United States. It is located on the San Francisco Peninsula, in the northwest part of the Santa Clara Valley (Silicon Valley) approximately 37 miles (60 km) southeast of San Francisco and approximately 20 miles (30 km) northwest of San Jose. In 2008, 60% of this land remained undeveloped.

    Stanford’s main campus includes a census-designated place within unincorporated Santa Clara County, although some of the university land (such as the Stanford Shopping Center and the Stanford Research Park) is within the city limits of Palo Alto. The campus also includes much land in unincorporated San Mateo County (including the SLAC National Accelerator Laboratory and the Jasper Ridge Biological Preserve), as well as in the city limits of Menlo Park (Stanford Hills neighborhood), Woodside, and Portola Valley.

    Non-central campus

    Stanford currently operates in various locations outside of its central campus.

    On the founding grant:

    Jasper Ridge Biological Preserve is a 1,200-acre (490 ha) natural reserve south of the central campus owned by the university and used by wildlife biologists for research.
    SLAC National Accelerator Laboratory is a facility west of the central campus operated by the university for the Department of Energy. It contains the longest linear particle accelerator in the world, 2 miles (3.2 km) on 426 acres (172 ha) of land.
    Golf course and a seasonal lake: The university also has its own golf course and a seasonal lake (Lake Lagunita, actually an irrigation reservoir), both home to the vulnerable California tiger salamander. As of 2012 Lake Lagunita was often dry and the university had no plans to artificially fill it.

    Off the founding grant:

    Hopkins Marine Station, in Pacific Grove, California, is a marine biology research center owned by the university since 1892.
    Study abroad locations: unlike typical study abroad programs, Stanford itself operates in several locations around the world; thus, each location has Stanford faculty-in-residence and staff in addition to students, creating a “mini-Stanford”.

    Redwood City campus for many of the university’s administrative offices located in Redwood City, California, a few miles north of the main campus. In 2005, the university purchased a small, 35-acre (14 ha) campus in Midpoint Technology Park intended for staff offices; development was delayed by The Great Recession. In 2015 the university announced a development plan and the Redwood City campus opened in March 2019.

    The Bass Center in Washington, DC provides a base, including housing, for the Stanford in Washington program for undergraduates. It includes a small art gallery open to the public.

    China: Stanford Center at Peking University, housed in the Lee Jung Sen Building, is a small center for researchers and students in collaboration with Peking University [北京大学](CN) and its Kavli Institute for Astronomy and Astrophysics (KIAA-PKU).

    Administration and organization

    Stanford is a private, non-profit university that is administered as a corporate trust governed by a privately appointed board of trustees with a maximum membership of 38. Trustees serve five-year terms (not more than two consecutive terms) and meet five times annually. A new trustee is chosen by the current trustees by ballot. The Stanford trustees also oversee the Stanford Research Park, the Stanford Shopping Center, the Cantor Center for Visual Arts, Stanford University Medical Center, and many associated medical facilities (including the Lucile Packard Children’s Hospital).

    The board appoints a president to serve as the chief executive officer of the university, to prescribe the duties of professors and course of study, to manage financial and business affairs, and to appoint nine vice presidents. The provost is the chief academic and budget officer, to whom the deans of each of the seven schools report. Persis Drell became the 13th provost in February 2017.

    As of 2018, the university was organized into seven academic schools. The schools of Humanities and Sciences (27 departments), Engineering (nine departments), and Earth, Energy & Environmental Sciences (four departments) have both graduate and undergraduate programs while the Schools of Law, Medicine, Education and Business have graduate programs only. The powers and authority of the faculty are vested in the Academic Council, which is made up of tenure and non-tenure line faculty, research faculty, senior fellows in some policy centers and institutes, the president of the university, and some other academic administrators, but most matters are handled by the Faculty Senate, made up of 55 elected representatives of the faculty.

    The Associated Students of Stanford University (ASSU) is the student government for Stanford and all registered students are members. Its elected leadership consists of the Undergraduate Senate elected by the undergraduate students, the Graduate Student Council elected by the graduate students, and the President and Vice President elected as a ticket by the entire student body.

    Stanford is the beneficiary of a special clause in the California Constitution, which explicitly exempts Stanford property from taxation so long as the property is used for educational purposes.

    Endowment and donations

    The university’s endowment, managed by the Stanford Management Company, was valued at $27.7 billion as of August 31, 2019. Payouts from the Stanford endowment covered approximately 21.8% of university expenses in the 2019 fiscal year. In the 2018 NACUBO-TIAA survey of colleges and universities in the United States and Canada, only Harvard University(US), the University of Texas System(US), and Yale University(US) had larger endowments than Stanford.

    In 2006, President John L. Hennessy launched a five-year campaign called the Stanford Challenge, which reached its $4.3 billion fundraising goal in 2009, two years ahead of time, but continued fundraising for the duration of the campaign. It concluded on December 31, 2011, having raised a total of $6.23 billion and breaking the previous campaign fundraising record of $3.88 billion held by Yale. Specifically, the campaign raised $253.7 million for undergraduate financial aid, as well as $2.33 billion for its initiative in “Seeking Solutions” to global problems, $1.61 billion for “Educating Leaders” by improving K-12 education, and $2.11 billion for “Foundation of Excellence” aimed at providing academic support for Stanford students and faculty. Funds supported 366 new fellowships for graduate students, 139 new endowed chairs for faculty, and 38 new or renovated buildings. The new funding also enabled the construction of a facility for stem cell research; a new campus for the business school; an expansion of the law school; a new Engineering Quad; a new art and art history building; an on-campus concert hall; a new art museum; and a planned expansion of the medical school, among other things. In 2012, the university raised $1.035 billion, becoming the first school to raise more than a billion dollars in a year.

    Research centers and institutes

    DOE’s SLAC National Accelerator Laboratory(US)
    Stanford Research Institute, a center of innovation to support economic development in the region.
    Hoover Institution, a conservative American public policy institution and research institution that promotes personal and economic liberty, free enterprise, and limited government.
    Hasso Plattner Institute of Design, a multidisciplinary design school in cooperation with the Hasso Plattner Institute of University of Potsdam [Universität Potsdam](DE) that integrates product design, engineering, and business management education.
    Martin Luther King Jr. Research and Education Institute, which grew out of and still contains the Martin Luther King Jr. Papers Project.
    John S. Knight Fellowship for Professional Journalists
    Center for Ocean Solutions
    Together with UC Berkeley(US) and UC San Francisco(US), Stanford is part of the Biohub, a new medical science research center founded in 2016 by a $600 million commitment from Facebook CEO and founder Mark Zuckerberg and pediatrician Priscilla Chan.

    Discoveries and innovation

    Natural sciences

    Biological synthesis of deoxyribonucleic acid (DNA) – Arthur Kornberg synthesized DNA material and won the Nobel Prize in Physiology or Medicine 1959 for his work at Stanford.
    First Transgenic organism – Stanley Cohen and Herbert Boyer were the first scientists to transplant genes from one living organism to another, a fundamental discovery for genetic engineering. Thousands of products have been developed on the basis of their work, including human growth hormone and hepatitis B vaccine.
    Laser – Arthur Leonard Schawlow shared the 1981 Nobel Prize in Physics with Nicolaas Bloembergen and Kai Siegbahn for his work on lasers.
    Nuclear magnetic resonance – Felix Bloch developed new methods for nuclear magnetic precision measurements, which are the underlying principles of the MRI.

    Computer and applied sciences

    ARPANET – Stanford Research Institute, formerly part of Stanford but on a separate campus, was the site of one of the four original ARPANET nodes.

    Internet—Stanford was the site where the original design of the Internet was undertaken. Vint Cerf led a research group to elaborate the design of the Transmission Control Protocol (TCP/IP) that he originally co-created with Robert E. Kahn (Bob Kahn) in 1973 and which formed the basis for the architecture of the Internet.

    Frequency modulation synthesis – John Chowning of the Music department invented the FM music synthesis algorithm in 1967, and Stanford later licensed it to Yamaha Corporation.

    Google – Google began in January 1996 as a research project by Larry Page and Sergey Brin when they were both PhD students at Stanford. They were working on the Stanford Digital Library Project (SDLP). The SDLP’s goal was “to develop the enabling technologies for a single, integrated and universal digital library” and it was funded through the National Science Foundation, among other federal agencies.

    Klystron tube – invented by the brothers Russell and Sigurd Varian at Stanford. Their prototype was completed and demonstrated successfully on August 30, 1937. Upon publication in 1939, news of the klystron immediately influenced the work of U.S. and UK researchers working on radar equipment.

    RISC – ARPA-funded VLSI project of microprocessor design. Stanford and UC Berkeley are most associated with the popularization of this concept. The Stanford MIPS would go on to be commercialized as the successful MIPS architecture, while Berkeley RISC gave its name to the entire concept, commercialized as the SPARC. Another success from this era was IBM’s effort that eventually led to the IBM POWER instruction set architecture, PowerPC, and Power ISA. As these projects matured, a wide variety of similar designs flourished in the late 1980s and especially the early 1990s, representing a major force in the Unix workstation market as well as embedded processors in laser printers, routers and similar products.
    SUN workstation – Andy Bechtolsheim designed the SUN workstation for the Stanford University Network communications project as a personal CAD workstation, which led to Sun Microsystems.

    Businesses and entrepreneurship

    Stanford is one of the most successful universities in creating companies and licensing its inventions to existing companies; it is often held up as a model for technology transfer. Stanford’s Office of Technology Licensing is responsible for commercializing university research, intellectual property, and university-developed projects.

    The university is described as having a strong venture culture in which students are encouraged, and often funded, to launch their own companies.

    Companies founded by Stanford alumni generate more than $2.7 trillion in annual revenue, equivalent to the 10th-largest economy in the world.

    Some companies closely associated with Stanford and their connections include:

    Hewlett-Packard, 1939, co-founders William R. Hewlett (B.S, PhD) and David Packard (M.S).
    Silicon Graphics, 1981, co-founders James H. Clark (Associate Professor) and several of his grad students.
    Sun Microsystems, 1982, co-founders Vinod Khosla (M.B.A), Andy Bechtolsheim (PhD) and Scott McNealy (M.B.A).
    Cisco, 1984, founders Leonard Bosack (M.S) and Sandy Lerner (M.S) who were in charge of Stanford Computer Science and Graduate School of Business computer operations groups respectively when the hardware was developed.
    Yahoo!, 1994, co-founders Jerry Yang (B.S, M.S) and David Filo (M.S).
    Google, 1998, co-founders Larry Page (M.S) and Sergey Brin (M.S).
    LinkedIn, 2002, co-founders Reid Hoffman (B.S), Konstantin Guericke (B.S, M.S), Eric Lee (B.S), and Alan Liu (B.S).
    Instagram, 2010, co-founders Kevin Systrom (B.S) and Mike Krieger (B.S).
    Snapchat, 2011, co-founders Evan Spiegel and Bobby Murphy (B.S).
    Coursera, 2012, co-founders Andrew Ng (Associate Professor) and Daphne Koller (Professor, PhD).

    Student body

    Stanford enrolled 6,996 undergraduate and 10,253 graduate students as of the 2019–2020 school year. Women comprised 50.4% of undergraduates and 41.5% of graduate students. In the same academic year, the freshman retention rate was 99%.

    Stanford awarded 1,819 undergraduate degrees, 2,393 master’s degrees, 770 doctoral degrees, and 3270 professional degrees in the 2018–2019 school year. The four-year graduation rate for the class of 2017 cohort was 72.9%, and the six-year rate was 94.4%. The relatively low four-year graduation rate is a function of the university’s coterminal degree (or “coterm”) program, which allows students to earn a master’s degree as a 1-to-2-year extension of their undergraduate program.

    As of 2010, fifteen percent of undergraduates were first-generation students.

    Athletics

    As of 2016 Stanford had 16 male varsity sports and 20 female varsity sports, 19 club sports and about 27 intramural sports. In 1930, following a unanimous vote by the Executive Committee for the Associated Students, the athletic department adopted the mascot “Indian.” The Indian symbol and name were dropped by President Richard Lyman in 1972, after objections from Native American students and a vote by the student senate. The sports teams are now officially referred to as the “Stanford Cardinal,” referring to the deep red color, not the cardinal bird. Stanford is a member of the Pac-12 Conference in most sports, the Mountain Pacific Sports Federation in several other sports, and the America East Conference in field hockey with the participation in the inter-collegiate NCAA’s Division I FBS.

    Its traditional sports rival is the University of California, Berkeley, the neighbor to the north in the East Bay. The winner of the annual “Big Game” between the Cal and Cardinal football teams gains custody of the Stanford Axe.

    Stanford has had at least one NCAA team champion every year since the 1976–77 school year and has earned 126 NCAA national team titles since its establishment, the most among universities, and Stanford has won 522 individual national championships, the most by any university. Stanford has won the award for the top-ranked Division 1 athletic program—the NACDA Directors’ Cup, formerly known as the Sears Cup—annually for the past twenty-four straight years. Stanford athletes have won medals in every Olympic Games since 1912, winning 270 Olympic medals total, 139 of them gold. In the 2008 Summer Olympics, and 2016 Summer Olympics, Stanford won more Olympic medals than any other university in the United States. Stanford athletes won 16 medals at the 2012 Summer Olympics (12 gold, two silver and two bronze), and 27 medals at the 2016 Summer Olympics.

    Traditions

    The unofficial motto of Stanford, selected by President Jordan, is Die Luft der Freiheit weht. Translated from the German language, this quotation from Ulrich von Hutten means, “The wind of freedom blows.” The motto was controversial during World War I, when anything in German was suspect; at that time the university disavowed that this motto was official.
    Hail, Stanford, Hail! is the Stanford Hymn sometimes sung at ceremonies or adapted by the various University singing groups. It was written in 1892 by mechanical engineering professor Albert W. Smith and his wife, Mary Roberts Smith (in 1896 she earned the first Stanford doctorate in Economics and later became associate professor of Sociology), but was not officially adopted until after a performance on campus in March 1902 by the Mormon Tabernacle Choir.
    “Uncommon Man/Uncommon Woman”: Stanford does not award honorary degrees, but in 1953 the degree of “Uncommon Man/Uncommon Woman” was created to recognize individuals who give rare and extraordinary service to the University. Technically, this degree is awarded by the Stanford Associates, a voluntary group that is part of the university’s alumni association. As Stanford’s highest honor, it is not conferred at prescribed intervals, but only when appropriate to recognize extraordinary service. Recipients include Herbert Hoover, Bill Hewlett, Dave Packard, Lucile Packard, and John Gardner.
    Big Game events: The events in the week leading up to the Big Game vs. UC Berkeley, including Gaieties (a musical written, composed, produced, and performed by the students of Ram’s Head Theatrical Society).
    “Viennese Ball”: a formal ball with waltzes that was initially started in the 1970s by students returning from the now-closed Stanford in Vienna overseas program. It is now open to all students.
    “Full Moon on the Quad”: An annual event at Main Quad, where students gather to kiss one another starting at midnight. Typically organized by the Junior class cabinet, the festivities include live entertainment, such as music and dance performances.
    “Band Run”: An annual festivity at the beginning of the school year, where the band picks up freshmen from dorms across campus while stopping to perform at each location, culminating in a finale performance at Main Quad.
    “Mausoleum Party”: An annual Halloween Party at the Stanford Mausoleum, the final resting place of Leland Stanford Jr. and his parents. A 20-year tradition, the “Mausoleum Party” was on hiatus from 2002 to 2005 due to a lack of funding, but was revived in 2006. In 2008, it was hosted in Old Union rather than at the actual Mausoleum, because rain prohibited generators from being rented. In 2009, after fundraising efforts by the Junior Class Presidents and the ASSU Executive, the event was able to return to the Mausoleum despite facing budget cuts earlier in the year.
    Former campus traditions include the “Big Game bonfire” on Lake Lagunita (a seasonal lake usually dry in the fall), which was formally ended in 1997 because of the presence of endangered salamanders in the lake bed.

    Award laureates and scholars

    Stanford’s current community of scholars includes:

    19 Nobel Prize laureates (as of October 2020, 85 affiliates in total)
    171 members of the National Academy of Sciences
    109 members of National Academy of Engineering
    76 members of National Academy of Medicine
    288 members of the American Academy of Arts and Sciences
    19 recipients of the National Medal of Science
    1 recipient of the National Medal of Technology
    4 recipients of the National Humanities Medal
    49 members of American Philosophical Society
    56 fellows of the American Physical Society (since 1995)
    4 Pulitzer Prize winners
    31 MacArthur Fellows
    4 Wolf Foundation Prize winners
    2 ACL Lifetime Achievement Award winners
    14 AAAI fellows
    2 Presidential Medal of Freedom winners

    Stanford University Seal

     
  • richardmitnick 3:54 pm on October 14, 2021 Permalink | Reply
    Tags: "To Find Sterile Neutrinos Think Small", , Basic Research, BeEST experimental program, ,   

    From American Physical Society (US) : “To Find Sterile Neutrinos Think Small” 

    AmericanPhysicalSociety

    From American Physical Society (US)

    10.14.21

    Two small-scale experiments may beat the massive machines pursuing evidence of new physics—and could improve cancer treatment.

    Experiments have spotted anomalies hinting at a new type of neutrino, one that would go beyond the standard model of particle physics and perhaps open a portal to the dark sector. But no one has ever directly observed this hypothetical particle.

    1
    The BeEST experimental program, short for “Beryllium Electron-capture with Superconducting Tunnel junctions,” is utilizing complete momentum reconstruction of nuclear electron-capture decay in radioactive beryllium-7 atoms to search for these elusive new “ghost particles.” Credit: Spencer Fretwell, The Colorado School of Mines(US).

    Now a quantum dark matter detector and a proposed particle accelerator dreamt up by machine learning are poised to prove whether the sterile neutrino exists.

    The IsoDAR cyclotron would deliver ten times more beam current than any existing machine, according to the team at The Massachusetts Institute of Technology (US) that designed it.

    2
    A picture of the ion source used by the IsoDAR cyclotron team, which shows the ion beam glowing inside their device. Credit: IsoDAR collaboration.

    Taking up only a small underground footprint, the cyclotron may give definitive signs of sterile neutrinos within five years.

    At the same time, that intense beam could solve a major problem in cancer treatment: producing enough radioactive isotopes for killing cancerous cells and scanning tumors. The beam could produce high quantities of medical isotopes and even let hospitals and smaller laboratories make their own.

    “There is a direct connection between the technology that can be used to understand our universe, and the technology which can be used to save people’s lives,” said Loyd Waites, an MIT PhD candidate who will discuss the plans at the 2021 Fall Meeting of the APS Division of Nuclear Physics.

    Of the existing sterile neutrino hunters, one of the most powerful in the world possesses a single detector. The BeEST (pronounced “beast”) may sound like a behemoth, but the experiment uses one quantum sensor to measure nuclear recoils from the “kick” of a neutrino.

    This clean method searches for the mysterious particle without the added hurdle of looking for its interactions with normal matter. Just one month of testing yielded a new benchmark that covers a wide mass range—applicable to much bigger sterile neutrino experiments like KATRIN.

    The KATRIN experiment aims to measure the mass of the neutrino using a huge device called a spectrometer (interior shown). Credit: KIT Karlsruhe Institute of Technology [Karlsruher Institut für Technologie] (DE).

    The KArlsruhe TRItium Neutrino (KATRIN) experiment, which is presently being performed at the Tritium Laboratory Karlsruhe on the KIT Karlsruhe Institute of Technology [Karlsruher Institut für Technologie] (DE) Campus North site, will investigate the most important open issue in neutrino physics.

    “This initial work already excludes the existence of this type of sterile neutrino up to 10 times better than all previous decay experiments,” said Kyle Leach, an associate professor at the Colorado School of Mines, who presents the first round of results (recently reported in Physical Review Letters) at the meeting.

    The BeEST, a collaboration of 30 scientists from 10 institutions in North America and Europe, is also the first project to successfully use beryllium-7, regarded as the ideal atomic nucleus for the sterile neutrino hunt. Next up: scaling the BeEST setup to many more sensors, using new superconducting materials.
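    For a rough sense of the energy scale involved (a simplified two-body estimate using standard nuclear data, not the collaboration’s full analysis): in electron capture, ⁷Be + e⁻ → ⁷Li + ν, the ⁷Li nucleus recoils against the emitted neutrino alone, so for decay to the ⁷Li ground state (Q ≈ 862 keV) the recoil kinetic energy is

        T_{\mathrm{Li}} \approx \frac{Q^2}{2 M_{\mathrm{Li}} c^2} \approx \frac{(862\ \mathrm{keV})^2}{2 \times 6.5\ \mathrm{GeV}} \approx 57\ \mathrm{eV}.

    If a heavy sterile neutrino were emitted instead, it would carry less momentum for the same decay energy, shifting part of the measured recoil spectrum; that shift is the signature the superconducting sensors look for.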

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition
    American Physical Society (US)
    Physicists are drowning in a flood of research papers in their own fields and coping with an even larger deluge in other areas of physics. How can an active researcher stay informed about the most important developments in physics? Physics highlights a selection of papers from the Physical Review journals. In consultation with expert scientists, the editors choose these papers for their importance and/or intrinsic interest. To highlight these papers, Physics features three kinds of articles: Viewpoints are commentaries written by active researchers, who are asked to explain the results to physicists in other subfields. Focus stories are written by professional science writers in a journalistic style and are intended to be accessible to students and non-experts. Synopses are brief editor-written summaries.

     
  • richardmitnick 3:23 pm on October 14, 2021 Permalink | Reply
    Tags: "Quarks and Antiquarks at High Momentum Shake the Foundations of Visible Matter", , Basic Research, , , EMC effect: longstanding nuclear paradox,   

    From American Physical Society (US) : “Quarks and Antiquarks at High Momentum Shake the Foundations of Visible Matter” 

    AmericanPhysicalSociety

    From American Physical Society (US)

    10.14.21

    DOE’s Thomas Jefferson National Accelerator Facility (US) and DOE’s Fermi National Accelerator Laboratory (US) experiments present new results on nucleon structure

    Two independent studies have illuminated unexpected substructures in the fundamental components of all matter. Preliminary results using a novel tagging method could explain the origin of the longstanding nuclear paradox known as the EMC effect. Meanwhile, authors will share next steps after the recent observation of asymmetrical antimatter in the proton [Nature].

    1
    Artistic rendering of quarks in deuterium. Credit: Ran Shneor.

    Both groups will discuss their experiments at DOE’s Thomas Jefferson National Accelerator Facility and Fermilab during the 2021 Fall Meeting of the APS Division of Nuclear Physics. They will present the results and take questions from the press at a live virtual news briefing on October 12 at 2:15 p.m. EDT.

    One study presents new evidence on the EMC effect, identified nearly 40 years ago when researchers at European Organization for Nuclear Research [Organisation européenne pour la recherche nucléaire] [Europäische Organisation für Kernforschung](CH) [CERN] discovered something surprising: Protons and neutrons bound in an atomic nucleus can change their internal makeup of quarks and gluons. But why such modifications arise, and how to predict them, remains unknown.
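    For reference, the EMC effect is conventionally quantified through the per-nucleon structure-function ratio (a textbook definition, not something specific to this new measurement),

        R_{\mathrm{EMC}}(x) = \frac{F_2^A(x)/A}{F_2^d(x)/2},

    which unexpectedly falls below 1 for roughly 0.3 < x < 0.7, where x is the fraction of a nucleon’s momentum carried by the struck quark, A is the mass number of the nucleus, and F_2^d is the deuteron structure function.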

    For the first time, scientists have measured the EMC effect by tagging spectator neutrons, taking a major step toward solving the mystery.

    “We present initial and preliminary results from a new transformative measurement of a novel observable that provides direct insight into the origin of the EMC effect,” said Tyler T. Kutz, a postdoctoral researcher at The Massachusetts Institute of Technology (US) and Zuckerman Postdoctoral Scholar at The Tel Aviv University אוּנִיבֶרְסִיטַת תֵּל אָבִיב (IL), who will reveal the findings at the meeting.

    Inside the Backward Angle Neutron Detector (BAND) at Jefferson Lab, tagged spectator neutrons “split” the nuclear wave function into different sections. This process maps how momentum and density affect the structure of bound nucleons.

    The team’s initial results point to potential sizable, unpredicted effects. Preliminary observations suggest direct evidence that the EMC effect is connected with nucleon fluctuations of high local density and high momentum.

    “The results can have major implications for our understanding of the QCD structure of visible matter,” said Efrain Segarra, a graduate student at MIT working on the experiment. The research could shed light on the nature of confinement, strong interactions, and the fundamental composition of matter.

    A team from Fermilab found evidence that antimatter asymmetry also plays a crucial role in nucleon properties—a landmark observation published earlier this year in Nature. New analysis indicates that in the most extreme case, a single antiquark can be responsible for almost half the momentum of a proton.

    “This surprising result clearly shows that even at high momentum fractions, antimatter is an important part of the proton,” said Shivangi Prasad, a researcher at DOE’s Argonne National Laboratory (US). “It demonstrates the importance of nonperturbative approaches to the structure of the basic building block of matter, the proton.”

    Prasad will discuss the SeaQuest experiment that found more “down” antiquarks than “up” antiquarks within the proton. She will also share preliminary research on sea-quark and gluon distributions.

    “The SeaQuest Collaboration looked inside the proton by slamming a high-energy beam of protons into targets made of hydrogen (essentially protons) and deuterium (nuclei containing single protons and neutrons),” said Prasad.

    “Within the proton, quarks and antiquarks are held together by extremely strong nuclear forces—so great that they can create antimatter-matter quark pairs out of empty space!” she explained. But the subatomic pairings only exist for a fleeting moment before they annihilate.
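    At leading order, and in the kinematic limit where a valence quark from the beam proton annihilates a sea antiquark in the target, the Drell-Yan cross-section ratio measured this way reduces to a simple expression (a standard approximation used in such analyses, quoted here for orientation rather than taken from the talk):

        \frac{\sigma^{pd}}{2\,\sigma^{pp}} \approx \frac{1}{2}\left[1 + \frac{\bar{d}(x)}{\bar{u}(x)}\right],

    so any departure of this ratio from 1 directly signals an asymmetry between the down and up antiquark distributions in the proton.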

    The antiquark results have renewed interest in several earlier explanations for antimatter asymmetry in the proton. Prasad plans to discuss future measurements that could test the proposed mechanisms.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition
    American Physical Society (US)
    Physicists are drowning in a flood of research papers in their own fields and coping with an even larger deluge in other areas of physics. How can an active researcher stay informed about the most important developments in physics? Physics highlights a selection of papers from the Physical Review journals. In consultation with expert scientists, the editors choose these papers for their importance and/or intrinsic interest. To highlight these papers, Physics features three kinds of articles: Viewpoints are commentaries written by active researchers, who are asked to explain the results to physicists in other subfields. Focus stories are written by professional science writers in a journalistic style and are intended to be accessible to students and non-experts. Synopses are brief editor-written summaries.

     
  • richardmitnick 11:26 am on October 14, 2021 Permalink | Reply
    Tags: "3 things learned from IceCube's first 10 years", Basic Research, , ,   

    From The National Science Foundation (US) : “3 things learned from IceCube’s first 10 years” 

    From The National Science Foundation (US)

    October 14, 2021
    Lauren Lipuma

    Neutrinos are tiny, nearly massless elementary particles that rarely interact with normal matter. They were first made during the Big Bang and are continuously produced today by stars, black holes and other cosmic structures. Neutrinos are everywhere – billions pass through a square centimeter of Earth every second – but are difficult to detect and study.

    The largest neutrino observatory in the world, the IceCube Neutrino Observatory, consists of thousands of sensors draped through a cubic kilometer of ice at the geographic South Pole. It was built to study cosmic neutrinos – those that come from outside the solar system and are made in powerful cosmic objects like black holes and pulsars.

    Studying neutrinos is important for understanding the makeup of the universe, but IceCube, operated by The University of Wisconsin–Madison (US) and supported by The National Science Foundation (US), was designed to use neutrinos as an astronomical messenger: to tell researchers about the violent, chaotic environments in which they were created.

    In its first decade of operations, the ice-encased detector has given researchers new ways of looking at the cosmos. “Whenever we look at the universe with a new messenger, a particle we hadn’t had the capability to exploit before, we always learn new things,” said Dawn Williams, a physicist at the University of Alabama and member of the IceCube collaboration. The IceCube Observatory was “built to exploit this messenger – to use neutrinos to explore the universe, and we have succeeded … beyond our wildest dreams.”

    Here are three things scientists have learned from IceCube’s first decade of science and a peek at what physicists hope to learn in the future.

    1. High-energy neutrinos are being made outside the solar system.

    One of the first things physicists learned from IceCube is that there is indeed a flux of high-energy cosmic neutrinos detectable on Earth. Before IceCube was built, physicists had observed cosmic neutrinos directly only once before, when light and particles from a supernova reached Earth in 1987. Observatories around the world picked up 25 neutrinos from the explosion of a star in the Large Magellanic Cloud, a small companion galaxy of the Milky Way. But those neutrinos were low in energy. High-energy neutrinos from cosmic accelerators like black holes are much rarer and harder to detect.

    3
    Graphic: Lauren Lipuma

    In 2013, IceCube scientists announced they had detected 28 high-energy neutrinos, which was the first solid evidence for neutrinos coming from cosmic accelerators outside the solar system. These neutrinos were a million times more energetic than those from the 1987 supernova.

    2. Neutrino astronomy is a real thing.

    A few years after discovering a flux of cosmic neutrinos, IceCube accomplished its second major goal: identifying a candidate source of high-energy neutrinos. Physicists knew neutrinos are made in chaotic environments like black holes, but they had never pinpointed a specific object as being a high-energy neutrino “factory.”

    3
    Graphic: Lauren Lipuma.

    In 2017, IceCube scientists picked up a high-energy neutrino they traced to a flaring blazar, a giant elliptical galaxy with a supermassive black hole at its center. Black holes at the center of blazars have twin jets that spew light and elementary particles from their poles.

    That high-energy neutrino triggered IceCube’s automated alert system, which directed telescopes around the world to home in on the area of sky from which the neutrino originated. Several telescopes noticed a flare of gamma rays coming from a blazar about 4 billion light-years away. Astrophysicists concluded that this was the source of both the gamma rays and the high-energy neutrino they observed.

    Physicists then looked at past IceCube observations and found a bigger flux of neutrinos from three years earlier that originated from the same area of the sky – and presumably from the same blazar.

    This discovery was significant not only because it was the first time a high-energy neutrino source had been confirmed, but also because it ushered in the new era of neutrino astronomy: the idea of using neutrinos, rather than light, to study the universe.

    4
    Graphic: Lauren Lipuma.

    “Ten years ago, if I were giving a neutrino astronomy talk, I would have put neutrino astronomy in air quotes,” said Naoko Kurahashi Neilson, a physicist at Drexel University and member of the IceCube collaboration. “Ten years ago, we hadn’t even seen a neutrino from outside our solar system. Now I don’t put air quotes because everybody agrees you can do astronomy with neutrinos.”

    Since then, the IceCube team has identified one more potential cosmic neutrino source – the galaxy Messier 77, a starburst galaxy with a supermassive black hole at its center.

    3. IceCube can do fundamental physics.

    Two recent discoveries showed IceCube can help physicists understand the intrinsic properties and behaviors of neutrinos, even though it was not designed to do so. Neutrinos come in three “flavors,” a particle physics term for the species of elementary particles: electron, muon and tau neutrinos. Researchers have so far identified two candidate tau neutrinos.

    Physicists know neutrinos can change their flavor but not fully how or why this happens. IceCube’s observation of the two tau neutrinos means cosmic neutrinos are changing flavor somewhere on their journey across the universe, a process predicted by physics but difficult to observe.
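    As a point of reference (standard neutrino physics rather than an IceCube-specific result), the probability that a neutrino produced in flavor α is detected as flavor β in the simple two-flavor vacuum case is

        P(\nu_\alpha \rightarrow \nu_\beta) = \sin^2(2\theta)\,\sin^2\!\left(\frac{1.27\,\Delta m^2[\mathrm{eV}^2]\,L[\mathrm{km}]}{E[\mathrm{GeV}]}\right).

    Over astrophysical distances the oscillating term averages out, so a source flavor mix of roughly 1:2:0 (electron:muon:tau) arrives at Earth as an approximately 1:1:1 mix, which is why tau neutrinos are expected at IceCube even though cosmic accelerators produce essentially none directly.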

    4
    A simulation of the photon burst detected during the Glashow resonance event. Each photon travels in a straight line until it is deflected by dust or other impurities in the ice surrounding IceCube’s sensors. Photo Credit: Lu Lu, IceCube Collaboration.

    Additionally, researchers detected an electron antineutrino indicative of a Glashow resonance event. This is an extremely rare type of interaction between an electron antineutrino and an atomic electron – a type of particle interaction never observed before. Physicist Sheldon Glashow first theorized the interaction in 1960, but only IceCube’s detection of an electron antineutrino in 2016 proved it happens in reality.
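    A quick back-of-the-envelope check shows why this took a detector the size of IceCube: the resonance occurs when the antineutrino-electron center-of-mass energy equals the W boson mass, which for an electron at rest requires

        E_\nu = \frac{m_W^2}{2 m_e} \approx \frac{(80.4\ \mathrm{GeV})^2}{2 \times 0.511\ \mathrm{MeV}} \approx 6.3\ \mathrm{PeV},

    an energy far beyond the reach of any particle accelerator.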

    “It’s incredible that we could actually achieve this,” said Francis Halzen, a physicist at the University of Wisconsin-Madison and principal investigator of the IceCube collaboration. “I’m a particle physicist, and this to me is just mind-blowing.”

    What’s next for IceCube?

    There are still many unanswered questions about cosmic neutrinos, but scientists suspect some will be answered in the next 10 years.

    5
    The server room at the IceCube Neutrino Observatory. Photo Credit: Benjamin Eberhardt; ICECUBE/National Science Foundation.

    Halzen hopes IceCube can help physicists understand where cosmic rays – high-energy charged particles that transfer their energy to neutrinos – come from. Unlike neutrinos, cosmic rays are charged, so their paths through the universe are warped by magnetic fields, making it nearly impossible for physicists to know where they came from without other information.

    Kurahashi Neilson hopes researchers can learn more about cosmic particle accelerators and when and how often they spew out neutrinos. “We’re at the tip of an iceberg, right? And we don’t know how big or deep or what shape the iceberg is. We know there are neutrino sources. We’ve maybe seen one or two, so what are the rest? When do they come out? How often? How are they distributed? What does the universe look like in neutrinos?” she said.

    ___________________________________________________________
    U Wisconsin IceCube neutrino observatory

    IceCube employs more than 5000 detectors lowered on 86 strings into almost 100 holes in the Antarctic ice. Credit: NSF/B. Gudbjartsson, IceCube Collaboration.

    Related image captions: Lunar IceCube; IceCube Gen-2 DeepCore PINGU (annotated); IceCube neutrino detector interior; IceCube DeepCore (annotated); DM-Ice II at IceCube (annotated).
    ___________________________________________________________

    See the full article here .



    Please help promote STEM in your local schools.

    Stem Education Coalition

    The National Science Foundation (NSF) (US) is an independent federal agency created by Congress in 1950 “to promote the progress of science; to advance the national health, prosperity, and welfare; to secure the national defense.” We are the funding source for approximately 24 percent of all federally supported basic research conducted by America’s colleges and universities. In many fields such as mathematics, computer science and the social sciences, NSF is the major source of federal backing.

    We fulfill our mission chiefly by issuing limited-term grants — currently about 12,000 new awards per year, with an average duration of three years — to fund specific research proposals that have been judged the most promising by a rigorous and objective merit-review system. Most of these awards go to individuals or small groups of investigators. Others provide funding for research centers, instruments and facilities that allow scientists, engineers and students to work at the outermost frontiers of knowledge.

    NSF’s goals — discovery, learning, research infrastructure and stewardship — provide an integrated strategy to advance the frontiers of knowledge, cultivate a world-class, broadly inclusive science and engineering workforce and expand the scientific literacy of all citizens, build the nation’s research capability through investments in advanced instrumentation and facilities, and support excellence in science and engineering research and education through a capable and responsive organization. We like to say that NSF is “where discoveries begin.”

    Many of the discoveries and technological advances have been truly revolutionary. In the past few decades, NSF-funded researchers have won some 236 Nobel Prizes as well as other honors too numerous to list. These pioneers have included the scientists or teams that discovered many of the fundamental particles of matter, analyzed the cosmic microwaves left over from the earliest epoch of the universe, developed carbon-14 dating of ancient artifacts, decoded the genetics of viruses, and created an entirely new state of matter called a Bose-Einstein condensate.

    NSF also funds equipment that is needed by scientists and engineers but is often too expensive for any one group or researcher to afford. Examples of such major research equipment include giant optical and radio telescopes, Antarctic research sites, high-end computer facilities and ultra-high-speed connections, ships for ocean research, sensitive detectors of very subtle physical phenomena and gravitational wave observatories.

    Another essential element in NSF’s mission is support for science and engineering education, from pre-K through graduate school and beyond. The research we fund is thoroughly integrated with education to help ensure that there will always be plenty of skilled people available to work in new and emerging scientific, engineering and technological fields, and plenty of capable teachers to educate the next generation.

    No single factor is more important to the intellectual and economic progress of society, and to the enhanced well-being of its citizens, than the continuous acquisition of new knowledge. NSF is proud to be a major part of that process.

    Specifically, the Foundation’s organic legislation authorizes us to engage in the following activities:

    Initiate and support, through grants and contracts, scientific and engineering research and programs to strengthen scientific and engineering research potential, and education programs at all levels, and appraise the impact of research upon industrial development and the general welfare.
    Award graduate fellowships in the sciences and in engineering.
    Foster the interchange of scientific information among scientists and engineers in the United States and foreign countries.
    Foster and support the development and use of computers and other scientific methods and technologies, primarily for research and education in the sciences.
    Evaluate the status and needs of the various sciences and engineering and take into consideration the results of this evaluation in correlating our research and educational programs with other federal and non-federal programs.
    Provide a central clearinghouse for the collection, interpretation and analysis of data on scientific and technical resources in the United States, and provide a source of information for policy formulation by other federal agencies.
    Determine the total amount of federal money received by universities and appropriate organizations for the conduct of scientific and engineering research, including both basic and applied, and construction of facilities where such research is conducted, but excluding development, and report annually thereon to the President and the Congress.
    Initiate and support specific scientific and engineering activities in connection with matters relating to international cooperation, national security and the effects of scientific and technological applications upon society.
    Initiate and support scientific and engineering research, including applied research, at academic and other nonprofit institutions and, at the direction of the President, support applied research at other organizations.
    Recommend and encourage the pursuit of national policies for the promotion of basic research and education in the sciences and engineering. Strengthen research and education innovation in the sciences and engineering, including independent research by individuals, throughout the United States.
    Support activities designed to increase the participation of women and minorities and others underrepresented in science and technology.

    At present, NSF has a total workforce of about 2,100 at its Alexandria, VA, headquarters, including approximately 1,400 career employees, 200 scientists from research institutions on temporary duty, 450 contract workers and the staff of the NSB office and the Office of the Inspector General.

    NSF is divided into the following seven directorates that support science and engineering research and education: Biological Sciences, Computer and Information Science and Engineering, Engineering, Geosciences, Mathematical and Physical Sciences, Social, Behavioral and Economic Sciences, and Education and Human Resources. Each is headed by an assistant director and each is further subdivided into divisions like materials research, ocean sciences and behavioral and cognitive sciences.

    Within NSF’s Office of the Director, the Office of Integrative Activities also supports research and researchers. Other sections of NSF are devoted to financial management, award processing and monitoring, legal affairs, outreach and other functions. The Office of the Inspector General examines the foundation’s work and reports to the NSB and Congress.

    Each year, NSF supports an average of about 200,000 scientists, engineers, educators and students at universities, laboratories and field sites all over the United States and throughout the world, from Alaska to Alabama to Africa to Antarctica. You could say that NSF support goes “to the ends of the earth” to learn more about the planet and its inhabitants, and to produce fundamental discoveries that further the progress of research and lead to products and services that boost the economy and improve general health and well-being.

    As described in our strategic plan, NSF is the only federal agency whose mission includes support for all fields of fundamental science and engineering, except for medical sciences. NSF is tasked with keeping the United States at the leading edge of discovery in a wide range of scientific areas, from astronomy to geology to zoology. So, in addition to funding research in the traditional academic areas, the agency also supports “high risk, high pay off” ideas, novel collaborations and numerous projects that may seem like science fiction today, but which the public will take for granted tomorrow. And in every case, we ensure that research is fully integrated with education so that today’s revolutionary work will also be training tomorrow’s top scientists and engineers.

    Unlike many other federal agencies, NSF does not hire researchers or directly operate our own laboratories or similar facilities. Instead, we support scientists, engineers and educators directly through their own home institutions (typically universities and colleges). Similarly, we fund facilities and equipment such as telescopes, through cooperative agreements with research consortia that have competed successfully for limited-term management contracts.

    NSF’s job is to determine where the frontiers are, identify the leading U.S. pioneers in these fields and provide money and equipment to help them continue. The results can be transformative. For example, years before most people had heard of “nanotechnology,” NSF was supporting scientists and engineers who were learning how to detect, record and manipulate activity at the scale of individual atoms — the nanoscale. Today, scientists are adept at moving atoms around to create devices and materials with properties that are often more useful than those found in nature.

    Dozens of companies are gearing up to produce nanoscale products. NSF is funding the research projects, state-of-the-art facilities and educational opportunities that will teach new skills to the science and engineering students who will make up the nanotechnology workforce of tomorrow.

    At the same time, we are looking for the next frontier.

    NSF’s task of identifying and funding work at the frontiers of science and engineering is not a “top-down” process. NSF operates from the “bottom up,” keeping close track of research around the United States and the world, maintaining constant contact with the research community to identify ever-moving horizons of inquiry, monitoring which areas are most likely to result in spectacular progress and choosing the most promising people to conduct the research.

    NSF funds research and education in most fields of science and engineering. We do this through grants and cooperative agreements to more than 2,000 colleges, universities, K-12 school systems, businesses, informal science organizations and other research organizations throughout the U.S. The Foundation considers proposals submitted by organizations on behalf of individuals or groups for support in most fields of research. Interdisciplinary proposals also are eligible for consideration. Awardees are chosen from those who send us proposals asking for a specific amount of support for a specific project.

    Proposals may be submitted in response to the various funding opportunities that are announced on the NSF website. These funding opportunities fall into three categories — program descriptions, program announcements and program solicitations — and are the mechanisms NSF uses to generate funding requests. At any time, scientists and engineers are also welcome to send in unsolicited proposals for research and education projects, in any existing or emerging field. The Proposal and Award Policies and Procedures Guide (PAPPG) provides guidance on proposal preparation and submission and award management. At present, NSF receives more than 42,000 proposals per year.

    To ensure that proposals are evaluated in a fair, competitive, transparent and in-depth manner, we use a rigorous system of merit review. Nearly every proposal is evaluated by a minimum of three independent reviewers consisting of scientists, engineers and educators who do not work at NSF or for the institution that employs the proposing researchers. NSF selects the reviewers from among the national pool of experts in each field and their evaluations are confidential. On average, approximately 40,000 experts, knowledgeable about the current state of their field, give their time to serve as reviewers each year.

    The reviewer’s job is to decide which projects are of the very highest caliber. NSF’s merit review process, considered by some to be the “gold standard” of scientific review, ensures that many voices are heard and that only the best projects make it to the funding stage. An enormous amount of research, deliberation, thought and discussion goes into award decisions.

    The NSF program officer reviews the proposal and analyzes the input received from the external reviewers. After scientific, technical and programmatic review and consideration of appropriate factors, the program officer makes an “award” or “decline” recommendation to the division director. Final programmatic approval for a proposal is generally completed at NSF’s division level. A principal investigator (PI) whose proposal for NSF support has been declined will receive information and an explanation of the reason(s) for declination, along with copies of the reviews considered in making the decision. If that explanation does not satisfy the PI, he/she may request additional information from the cognizant NSF program officer or division director.

    If the program officer makes an award recommendation and the division director concurs, the recommendation is submitted to NSF’s Division of Grants and Agreements (DGA) for award processing. A DGA officer reviews the recommendation from the program division/office for business, financial and policy implications, and the processing and issuance of a grant or cooperative agreement. DGA generally makes awards to academic institutions within 30 days after the program division/office makes its recommendation.

     
  • richardmitnick 10:35 am on October 14, 2021 Permalink | Reply
    Tags: "Hubble Finds Evidence of Persistent Water Vapor in One Hemisphere of Europa", , Basic Research, , ,   

    From Hubblesite (US)(EU) and NASA/ESA Hubble: “Hubble Finds Evidence of Persistent Water Vapor in One Hemisphere of Europa” 

    From Hubblesite (US)(EU) and NASA/ESA Hubble

    October 14, 2021

    MEDIA CONTACT:

    Ray Villard
    Space Telescope Science Institute (US), Baltimore, Maryland

    Bethany Downer
    ESA/Hubble.org

    SCIENCE CONTACT:

    Lorenz Roth
    KTH Royal Institute of Technology [Kungliga Tekniska högskolan](SE)

    1
    Europa

    Summary

    Ice Sublimating Off the Surface Replenishes a Tenuous Envelope

    You would think that living half-a-billion miles from the Sun would be no place to call home. But planetary astronomers are very interested in exploring the moon Europa in search of life. Slightly smaller than Earth’s moon, Europa orbits monstrous Jupiter. Surface temperatures on the icy moon never rise above a frigid minus 260 degrees Fahrenheit, a temperature so cold that water ice is as hard as rock.

    Yet, beneath the solid ice crust there may be a global ocean with more water than found on Earth. And, where there is water, there could be life. Like a leaky garden hose, the ocean vents water vapor into space from geysers poking through cracks in the surface, as first photographed by the Hubble Space Telescope in 2013.

    The latest twist comes from archival Hubble observations, spanning 1999 to 2015, which find that water vapor is constantly being replenished throughout one hemisphere of the moon. That’s a bit mysterious. Nevertheless, the atmosphere is only one-billionth the surface pressure of Earth’s atmosphere.

    The water vapor wasn’t seen directly, but rather oxygen’s ultraviolet spectral fingerprint was measured by Hubble. Oxygen is one of the constituents of water. Unlike the geysers, this water vapor is not coming from Europa’s interior, but rather sunlight is causing the surface ice to sublimate. A similar water vapor atmosphere was recently found on the Jovian moon Ganymede.

    Europa is so exciting as a potential abode of life that it is a target of NASA’s Europa Clipper and the Jupiter Icy Moons Explorer (JUICE) of the European Space Agency – both planned for launch within a decade.


    _____________________________________________________________________________________

    NASA’s Hubble Space Telescope observations of Jupiter’s icy moon Europa have revealed the presence of persistent water vapor — but, mysteriously, only in one hemisphere.

    Europa harbors a vast ocean underneath its icy surface, which might offer conditions hospitable for life. This result advances astronomers’ understanding of the atmospheric structure of icy moons, and helps lay the groundwork for planned science missions to the Jovian system to, in part, explore whether an environment half-a-billion miles from the Sun could support life.

    Previous observations of water vapor on Europa have been associated with plumes erupting through the ice, as photographed by Hubble in 2013. They are analogous to geysers on Earth, but extend more than 60 miles high. They produce transient blobs of water vapor in the moon’s atmosphere, which is only one-billionth the surface pressure of Earth’s atmosphere.

    The new results, however, show similar amounts of water vapor spread over a larger area of Europa in Hubble observations spanning from 1999 to 2015. This suggests a long-term presence of a water vapor atmosphere only in Europa’s trailing hemisphere — that portion of the moon that is always opposite its direction of motion along its orbit. The cause of this asymmetry between the leading and trailing hemisphere is not fully understood.

    This discovery is gleaned from a new analysis of Hubble archival images and spectra, using a technique that recently resulted in the discovery of water vapor in the atmosphere of Jupiter’s moon Ganymede, by Lorenz Roth of the KTH Royal Institute of Technology, Space and Plasma Physics, Sweden.

    “The observation of water vapor on Ganymede, and on the trailing side of Europa, advances our understanding of the atmospheres of icy moons,” said Roth. “However, the detection of a stable water abundance on Europa is a bit more surprising than on Ganymede because Europa’s surface temperatures are lower than Ganymede’s.”

    Europa reflects more sunlight than Ganymede, keeping the surface 60 degrees Fahrenheit cooler than Ganymede. The daytime high on Europa is a frigid minus 260 degrees Fahrenheit. Yet, even at the lower temperature, the new observations suggest water ice is sublimating — that is, transforming directly from solid to vapor without a liquid phase — off Europa’s surface, just like on Ganymede.

    To make this discovery, Roth delved into archival Hubble datasets, selecting ultraviolet observations of Europa from 1999, 2012, 2014 and 2015 while the moon was at various orbital positions. These observations were all taken with Hubble’s Space Telescope Imaging Spectrograph (STIS). The ultraviolet STIS observations allowed Roth to determine the abundance of oxygen — one of the constituents of water — in Europa’s atmosphere, and by interpreting the strength of emission at different wavelengths he was able to infer the presence of water vapor.

    This detection paves the way for in-depth studies of Europa by future probes including NASA’s Europa Clipper and the Jupiter Icy Moons Explorer (JUICE) mission from the European Space Agency (ESA). Understanding the formation and evolution of Jupiter and its moons also helps astronomers gain insights into Jupiter-like planets around other stars.

    These results have been published in the journal Geophysical Research Letters.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition
    The NASA/ESA Hubble Space Telescope is a space telescope that was launched into low Earth orbit in 1990 and remains in operation. It was not the first space telescope, but it is one of the largest and most versatile, renowned both as a vital research tool and as a public relations boon for astronomy. The Hubble telescope is named after astronomer Edwin Hubble and is one of NASA’s Great Observatories, along with the NASA Compton Gamma Ray Observatory, the Chandra X-ray Observatory, and the NASA Spitzer Infrared Space Telescope.



    Edwin Hubble at Caltech Palomar Samuel Oschin 48 inch Telescope (US). Credit: Emilio Segre Visual Archives/AIP/SPL.

    Hubble features a 2.4-meter (7.9 ft) mirror, and its four main instruments observe in the ultraviolet, visible, and near-infrared regions of the electromagnetic spectrum. Hubble’s orbit outside the distortion of Earth’s atmosphere allows it to capture extremely high-resolution images with substantially lower background light than ground-based telescopes. It has recorded some of the most detailed visible light images, allowing a deep view into space. Many Hubble observations have led to breakthroughs in astrophysics, such as determining the rate of expansion of the universe.

    The Hubble telescope was built by the United States space agency National Aeronautics Space Agency(US) with contributions from the European Space Agency [Agence spatiale européenne](EU). The Space Telescope Science Institute (STScI) selects Hubble’s targets and processes the resulting data, while the NASA Goddard Space Flight Center(US) controls the spacecraft. Space telescopes were proposed as early as 1923. Hubble was funded in the 1970s with a proposed launch in 1983, but the project was beset by technical delays, budget problems, and the 1986 Challenger disaster. It was finally launched by Space Shuttle Discovery in 1990, but its main mirror had been ground incorrectly, resulting in spherical aberration that compromised the telescope’s capabilities. The optics were corrected to their intended quality by a servicing mission in 1993.

    Hubble is the only telescope designed to be maintained in space by astronauts. Five Space Shuttle missions have repaired, upgraded, and replaced systems on the telescope, including all five of the main instruments. The fifth mission was initially canceled on safety grounds following the Columbia disaster (2003), but NASA administrator Michael D. Griffin approved the fifth servicing mission which was completed in 2009. The telescope was still operating as of April 24, 2020, its 30th anniversary, and could last until 2030–2040. One successor to the Hubble telescope is the National Aeronautics Space Agency(USA)/European Space Agency [Agence spatiale européenne](EU)/Canadian Space Agency(CA) Webb Infrared Space Telescope scheduled for launch in October 2021.

    Proposals and precursors

    In 1923, Hermann Oberth—considered a father of modern rocketry, along with Robert H. Goddard and Konstantin Tsiolkovsky—published Die Rakete zu den Planetenräumen (“The Rocket into Planetary Space“), which mentioned how a telescope could be propelled into Earth orbit by a rocket.

    The history of the Hubble Space Telescope can be traced back as far as 1946, to astronomer Lyman Spitzer’s paper entitled Astronomical advantages of an extraterrestrial observatory. In it, he discussed the two main advantages that a space-based observatory would have over ground-based telescopes. First, the angular resolution (the smallest separation at which objects can be clearly distinguished) would be limited only by diffraction, rather than by the turbulence in the atmosphere, which causes stars to twinkle, known to astronomers as seeing. At that time ground-based telescopes were limited to resolutions of 0.5–1.0 arcseconds, compared to a theoretical diffraction-limited resolution of about 0.05 arcsec for an optical telescope with a mirror 2.5 m (8.2 ft) in diameter. Second, a space-based telescope could observe infrared and ultraviolet light, which are strongly absorbed by the atmosphere.
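    The quoted 0.05 arcsecond figure follows from the standard diffraction limit for a circular aperture; taking visible light of about 500 nm as an illustrative wavelength,

        \theta \approx 1.22\,\frac{\lambda}{D} = 1.22 \times \frac{500\ \mathrm{nm}}{2.5\ \mathrm{m}} \approx 2.4 \times 10^{-7}\ \mathrm{rad} \approx 0.05\ \mathrm{arcsec}.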

    Spitzer devoted much of his career to pushing for the development of a space telescope. In 1962, a report by the U.S. National Academy of Sciences recommended development of a space telescope as part of the space program, and in 1965 Spitzer was appointed as head of a committee given the task of defining scientific objectives for a large space telescope.

    Space-based astronomy had begun on a very small scale following World War II, as scientists made use of developments that had taken place in rocket technology. The first ultraviolet spectrum of the Sun was obtained in 1946, and the National Aeronautics and Space Administration (US) launched the Orbiting Solar Observatory (OSO) to obtain UV, X-ray, and gamma-ray spectra in 1962.

    An orbiting solar telescope was launched in 1962 by the United Kingdom as part of the Ariel space program, and in 1966 NASA launched the first Orbiting Astronomical Observatory (OAO) mission. OAO-1’s battery failed after three days, terminating the mission. It was followed by OAO-2, which carried out ultraviolet observations of stars and galaxies from its launch in 1968 until 1972, well beyond its original planned lifetime of one year.

    The OSO and OAO missions demonstrated the important role space-based observations could play in astronomy. In 1968, NASA developed firm plans for a space-based reflecting telescope with a mirror 3 m (9.8 ft) in diameter, known provisionally as the Large Orbiting Telescope or Large Space Telescope (LST), with a launch slated for 1979. These plans emphasized the need for crewed maintenance missions to the telescope to ensure such a costly program had a lengthy working life, and the concurrent development of plans for the reusable Space Shuttle indicated that the technology to allow this was soon to become available.

    Quest for funding

    The continuing success of the OAO program encouraged increasingly strong consensus within the astronomical community that the LST should be a major goal. In 1970, NASA established two committees, one to plan the engineering side of the space telescope project, and the other to determine the scientific goals of the mission. Once these had been established, the next hurdle for NASA was to obtain funding for the instrument, which would be far more costly than any Earth-based telescope. The U.S. Congress questioned many aspects of the proposed budget for the telescope and forced cuts in the budget for the planning stages, which at the time consisted of very detailed studies of potential instruments and hardware for the telescope. In 1974, public spending cuts led to Congress deleting all funding for the telescope project.
    In response, a nationwide lobbying effort was coordinated among astronomers. Many astronomers met congressmen and senators in person, and large-scale letter-writing campaigns were organized. The National Academy of Sciences published a report emphasizing the need for a space telescope, and eventually the Senate agreed to half the budget that had originally been approved by Congress.

    The funding issues led to something of a reduction in the scale of the project, with the proposed mirror diameter reduced from 3 m to 2.4 m, both to cut costs and to allow a more compact and effective configuration for the telescope hardware. A proposed precursor 1.5 m (4.9 ft) space telescope to test the systems to be used on the main satellite was dropped, and budgetary concerns also prompted collaboration with the European Space Agency. ESA agreed to provide funding and supply one of the first generation instruments for the telescope, as well as the solar cells that would power it, and staff to work on the telescope in the United States, in return for European astronomers being guaranteed at least 15% of the observing time on the telescope. Congress eventually approved funding of US$36 million for 1978, and the design of the LST began in earnest, aiming for a launch date of 1983. In 1983 the telescope was named after Edwin Hubble, who confirmed one of the greatest scientific discoveries of the 20th century, made by Georges Lemaître, that the universe is expanding.

    Construction and engineering

    Once the Space Telescope project had been given the go-ahead, work on the program was divided among many institutions. NASA Marshall Space Flight Center (MSFC) was given responsibility for the design, development, and construction of the telescope, while Goddard Space Flight Center was given overall control of the scientific instruments and ground-control center for the mission. MSFC commissioned the optics company Perkin-Elmer to design and build the Optical Telescope Assembly (OTA) and Fine Guidance Sensors for the space telescope. Lockheed was commissioned to construct and integrate the spacecraft in which the telescope would be housed.

    Optical Telescope Assembly

    Optically, the HST is a Cassegrain reflector of Ritchey–Chrétien design, as are most large professional telescopes. This design, with two hyperbolic mirrors, is known for good imaging performance over a wide field of view, with the disadvantage that the mirrors have shapes that are hard to fabricate and test. The mirror and optical systems of the telescope determine the final performance, and they were designed to exacting specifications. Optical telescopes typically have mirrors polished to an accuracy of about a tenth of the wavelength of visible light, but the Space Telescope was to be used for observations from the visible through the ultraviolet (shorter wavelengths) and was specified to be diffraction limited to take full advantage of the space environment. Therefore, its mirror needed to be polished to an accuracy of 10 nanometers, or about 1/65 of the wavelength of red light. On the long wavelength end, the OTA was not designed with optimum IR performance in mind—for example, the mirrors are kept at stable (and warm, about 15 °C) temperatures by heaters. This limits Hubble’s performance as an infrared telescope.
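    As a quick consistency check on those numbers, red light has a wavelength of roughly 650 nm, so

        \frac{650\ \mathrm{nm}}{65} = 10\ \mathrm{nm},

    a tolerance roughly five to six times tighter than the usual tenth-of-a-wavelength polishing standard for ground-based visible-light mirrors.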

    Perkin-Elmer intended to use custom-built and extremely sophisticated computer-controlled polishing machines to grind the mirror to the required shape. However, in case their cutting-edge technology ran into difficulties, NASA demanded that PE sub-contract to Kodak to construct a back-up mirror using traditional mirror-polishing techniques. (The team of Kodak and Itek also bid on the original mirror polishing work. Their bid called for the two companies to double-check each other’s work, which would have almost certainly caught the polishing error that later caused such problems.) The Kodak mirror is now on permanent display at the National Air and Space Museum. An Itek mirror built as part of the effort is now used in the 2.4 m telescope at the Magdalena Ridge Observatory.

    Construction of the Perkin-Elmer mirror began in 1979, starting with a blank manufactured by Corning from their ultra-low expansion glass. To keep the mirror’s weight to a minimum it consisted of top and bottom plates, each one inch (25 mm) thick, sandwiching a honeycomb lattice. Perkin-Elmer simulated microgravity by supporting the mirror from the back with 130 rods that exerted varying amounts of force. This ensured the mirror’s final shape would be correct and to specification when finally deployed. Mirror polishing continued until May 1981. NASA reports at the time questioned Perkin-Elmer’s managerial structure, and the polishing began to slip behind schedule and over budget. To save money, NASA halted work on the back-up mirror and put the launch date of the telescope back to October 1984. The mirror was completed by the end of 1981; it was washed using 2,400 US gallons (9,100 L) of hot, deionized water and then received a reflective coating of 65 nm-thick aluminum and a protective coating of 25 nm-thick magnesium fluoride.

    Doubts continued to be expressed about Perkin-Elmer’s competence on a project of this importance, as their budget and timescale for producing the rest of the OTA continued to inflate. In response to a schedule described as “unsettled and changing daily”, NASA postponed the launch date of the telescope until April 1985. Perkin-Elmer’s schedules continued to slip at a rate of about one month per quarter, and at times delays reached one day for each day of work. NASA was forced to postpone the launch date until March and then September 1986. By this time, the total project budget had risen to US$1.175 billion.

    Spacecraft systems

    The spacecraft in which the telescope and instruments were to be housed was another major engineering challenge. It would have to withstand frequent passages from direct sunlight into the darkness of Earth’s shadow, which would cause major changes in temperature, while being stable enough to allow extremely accurate pointing of the telescope. A shroud of multi-layer insulation keeps the temperature within the telescope stable and surrounds a light aluminum shell in which the telescope and instruments sit. Within the shell, a graphite-epoxy frame keeps the working parts of the telescope firmly aligned. Because graphite composites are hygroscopic, there was a risk that water vapor absorbed by the truss while in Lockheed’s clean room would later be released in the vacuum of space, resulting in the telescope’s instruments being covered by ice. To reduce that risk, a nitrogen gas purge was performed before launching the telescope into space.

    While construction of the spacecraft in which the telescope and instruments would be housed proceeded somewhat more smoothly than the construction of the OTA, Lockheed still experienced some budget and schedule slippage, and by the summer of 1985, construction of the spacecraft was 30% over budget and three months behind schedule. An MSFC report said Lockheed tended to rely on NASA directions rather than take their own initiative in the construction.

    Computer systems and data processing

    The two initial, primary computers on the HST were the 1.25 MHz DF-224 system, built by Rockwell Autonetics, which contained three redundant CPUs, and two redundant NSSC-1 (NASA Standard Spacecraft Computer, Model 1) systems, developed by Westinghouse and GSFC using diode–transistor logic (DTL). A co-processor for the DF-224 was added during Servicing Mission 1 in 1993, which consisted of two redundant strings of an Intel-based 80386 processor with an 80387 math co-processor. The DF-224 and its 386 co-processor were replaced by a 25 MHz Intel-based 80486 processor system during Servicing Mission 3A in 1999. The new computer is 20 times faster, with six times more memory, than the DF-224 it replaced. It increases throughput by moving some computing tasks from the ground to the spacecraft and saves money by allowing the use of modern programming languages.

    Additionally, some of the science instruments and components had their own embedded microprocessor-based control systems. The MATs (Multiple Access Transponder) components, MAT-1 and MAT-2, utilize Hughes Aircraft CDP1802CD microprocessors. The Wide Field and Planetary Camera (WFPC) also utilized an RCA 1802 microprocessor (or possibly the older 1801 version). The WFPC-1 was replaced by the WFPC-2 [below] during Servicing Mission 1 in 1993, which was then replaced by the Wide Field Camera 3 (WFC3) [below] during Servicing Mission 4 in 2009.

    Initial instruments

    When launched, the HST carried five scientific instruments: the Wide Field and Planetary Camera (WF/PC), Goddard High Resolution Spectrograph (GHRS), High Speed Photometer (HSP), Faint Object Camera (FOC) and the Faint Object Spectrograph (FOS). WF/PC was a high-resolution imaging device primarily intended for optical observations. It was built by NASA JPL-Caltech(US), and incorporated a set of 48 filters isolating spectral lines of particular astrophysical interest. The instrument contained eight charge-coupled device (CCD) chips divided between two cameras, each using four CCDs. Each CCD has a resolution of 0.64 megapixels. The wide field camera (WFC) covered a large angular field at the expense of resolution, while the planetary camera (PC) took images at a longer effective focal length than the WF chips, giving it a greater magnification.

    The GHRS was a spectrograph designed to operate in the ultraviolet. It was built by the Goddard Space Flight Center and could achieve a spectral resolution of 90,000. Also optimized for ultraviolet observations were the FOC and FOS, which were capable of the highest spatial resolution of any instruments on Hubble. Rather than CCDs these three instruments used photon-counting digicons as their detectors. The FOC was constructed by ESA, while the University of California, San Diego(US), and Martin Marietta Corporation built the FOS.

    The final instrument was the HSP, designed and built at the University of Wisconsin–Madison(US). It was optimized for visible and ultraviolet light observations of variable stars and other astronomical objects varying in brightness. It could take up to 100,000 measurements per second with a photometric accuracy of about 2% or better.

    HST’s guidance system can also be used as a scientific instrument. Its three Fine Guidance Sensors (FGS) are primarily used to keep the telescope accurately pointed during an observation, but can also be used to carry out extremely accurate astrometry; measurements accurate to within 0.0003 arcseconds have been achieved.

    Ground support

    The Space Telescope Science Institute (STScI) is responsible for the scientific operation of the telescope and the delivery of data products to astronomers. STScI is operated by the Association of Universities for Research in Astronomy(US) (AURA) and is physically located in Baltimore, Maryland on the Homewood campus of Johns Hopkins University(US), one of the 39 U.S. universities and seven international affiliates that make up the AURA consortium. STScI was established in 1981 after something of a power struggle between NASA and the scientific community at large. NASA had wanted to keep this function in-house, but scientists wanted it to be based in an academic establishment. The Space Telescope European Coordinating Facility (ST-ECF), established at Garching bei München near Munich in 1984, provided similar support for European astronomers until 2011, when these activities were moved to the European Space Astronomy Centre.

    One rather complex task that falls to STScI is scheduling observations for the telescope. Hubble is in a low-Earth orbit to enable servicing missions, but this means most astronomical targets are occulted by the Earth for slightly less than half of each orbit. Observations cannot take place when the telescope passes through the South Atlantic Anomaly due to elevated radiation levels, and there are also sizable exclusion zones around the Sun (precluding observations of Mercury), Moon and Earth. The solar avoidance angle is about 50°, to keep sunlight from illuminating any part of the OTA. Earth and Moon avoidance keeps bright light out of the FGSs, and keeps scattered light from entering the instruments. If the FGSs are turned off, the Moon and Earth can be observed. Earth observations were used very early in the program to generate flat-fields for the WFPC1 instrument. There is a so-called continuous viewing zone (CVZ), at roughly 90° to the plane of Hubble’s orbit, in which targets are not occulted for long periods.
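    As a rough illustration of the kind of geometric check involved (this is not STScI’s actual scheduling software), the solar-avoidance constraint can be sketched in a few lines of Python with the astropy library. The 50° limit is the figure quoted above; the target coordinates and the date are arbitrary examples.

        import astropy.units as u
        from astropy.coordinates import SkyCoord, get_sun
        from astropy.time import Time

        SOLAR_AVOIDANCE = 50 * u.deg   # approximate solar-avoidance angle quoted for the OTA

        def passes_solar_avoidance(target, when):
            """Check only the Sun constraint; Earth, Moon and SAA passages are ignored here."""
            sun = get_sun(when)                      # apparent position of the Sun at this time
            return target.separation(sun) > SOLAR_AVOIDANCE

        target = SkyCoord(ra=202.47 * u.deg, dec=47.20 * u.deg)   # arbitrary field near M51
        print(passes_solar_avoidance(target, Time("2021-10-15T00:00:00")))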

    Challenger disaster, delays, and eventual launch

    By January 1986, the planned launch date of October looked feasible, but the Challenger explosion brought the U.S. space program to a halt, grounding the Shuttle fleet and forcing the launch of Hubble to be postponed for several years. The telescope had to be kept in a clean room, powered up and purged with nitrogen, until a launch could be rescheduled. This costly situation (about US$6 million per month) pushed the overall costs of the project even higher. This delay did allow time for engineers to perform extensive tests, swap out a possibly failure-prone battery, and make other improvements. Furthermore, the ground software needed to control Hubble was not ready in 1986, and was barely ready by the 1990 launch.

    Eventually, following the resumption of shuttle flights in 1988, the launch of the telescope was scheduled for 1990. On April 24, 1990, Space Shuttle Discovery successfully launched it during the STS-31 mission.

    From its original total cost estimate of about US$400 million, the telescope cost about US$4.7 billion by the time of its launch. Hubble’s cumulative costs were estimated to be about US$10 billion in 2010, twenty years after launch.

    List of Hubble instruments

    Hubble accommodates five science instruments at a given time, plus the Fine Guidance Sensors, which are mainly used for aiming the telescope but are occasionally used for scientific astrometry measurements. Early instruments were replaced with more advanced ones during the Shuttle servicing missions. COSTAR was a corrective optics device rather than a science instrument, but occupied one of the five instrument bays.
    Since the final servicing mission in 2009, the four active instruments have been ACS, COS, STIS and WFC3. NICMOS is kept in hibernation, but may be revived if WFC3 were to fail in the future.

    Advanced Camera for Surveys (ACS; 2002–present)
    Cosmic Origins Spectrograph (COS; 2009–present)
    Corrective Optics Space Telescope Axial Replacement (COSTAR; 1993–2009)
    Faint Object Camera (FOC; 1990–2002)
    Faint Object Spectrograph (FOS; 1990–1997)
    Fine Guidance Sensor (FGS; 1990–present)
    Goddard High Resolution Spectrograph (GHRS/HRS; 1990–1997)
    High Speed Photometer (HSP; 1990–1993)
    Near Infrared Camera and Multi-Object Spectrometer (NICMOS; 1997–present, hibernating since 2008)
    Space Telescope Imaging Spectrograph (STIS; 1997–present (non-operative 2004–2009))
    Wide Field and Planetary Camera (WFPC; 1990–1993)
    Wide Field and Planetary Camera 2 (WFPC2; 1993–2009)
    Wide Field Camera 3 (WFC3; 2009–present)

    Of the former instruments, three (COSTAR, FOS and WFPC2) are displayed in the Smithsonian National Air and Space Museum. The FOC is in the Dornier museum, Germany. The HSP is in the Space Place at the University of Wisconsin–Madison. The first WFPC was dismantled, and some components were then re-used in WFC3.

    Flawed mirror

    Within weeks of the launch of the telescope, the returned images indicated a serious problem with the optical system. Although the first images appeared to be sharper than those of ground-based telescopes, Hubble failed to achieve a final sharp focus and the best image quality obtained was drastically lower than expected. Images of point sources spread out over a radius of more than one arcsecond, instead of having a point spread function (PSF) concentrated within a circle 0.1 arcseconds (485 nrad) in diameter, as had been specified in the design criteria.

    Analysis of the flawed images revealed that the primary mirror had been polished to the wrong shape. Although it was believed to be one of the most precisely figured optical mirrors ever made, smooth to about 10 nanometers, the outer perimeter was too flat by about 2200 nanometers (about 1⁄450 mm or 1⁄11000 inch). This difference was catastrophic, introducing severe spherical aberration, a flaw in which light reflecting off the edge of a mirror focuses on a different point from the light reflecting off its center.

    The effect of the mirror flaw on scientific observations depended on the particular observation: the core of the aberrated PSF was sharp enough to permit high-resolution observations of bright objects, and spectroscopy of point sources was affected only through a sensitivity loss. However, the loss of light to the large, out-of-focus halo severely reduced the usefulness of the telescope for faint objects or high-contrast imaging. This meant nearly all the cosmological programs were essentially impossible, since they required observation of exceptionally faint objects. This led politicians to question NASA’s competence, scientists to rue the cost that could have gone to more productive endeavors, and comedians to make jokes about NASA and the telescope; in the 1991 comedy The Naked Gun 2½: The Smell of Fear, in a scene where historical disasters are displayed, Hubble is pictured with RMS Titanic and LZ 129 Hindenburg. Nonetheless, during the first three years of the Hubble mission, before the optical corrections, the telescope still carried out a large number of productive observations of less demanding targets. The error was well characterized and stable, enabling astronomers to partially compensate for the defective mirror by using sophisticated image processing techniques such as deconvolution.
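    As a toy illustration of the deconvolution idea (not the pipelines actually used on Hubble data), the Richardson–Lucy algorithm available in the scikit-image library can partially recover point sources from an image blurred by a known, halo-like point spread function. All of the numbers below are invented for the demonstration.

        import numpy as np
        from scipy.signal import convolve2d
        from skimage.restoration import richardson_lucy

        # Build a synthetic "star field": a dark frame with a handful of point sources.
        rng = np.random.default_rng(0)
        image = np.zeros((128, 128))
        image[rng.integers(0, 128, 30), rng.integers(0, 128, 30)] = 1.0

        # A broad Gaussian stands in for the out-of-focus halo of an aberrated PSF.
        x = np.arange(-10, 11)
        psf = np.exp(-(x[:, None] ** 2 + x[None, :] ** 2) / (2 * 4.0 ** 2))
        psf /= psf.sum()

        blurred = convolve2d(image, psf, mode="same")
        # Richardson-Lucy iteratively reassigns the smeared light back toward the sources,
        # provided the PSF is well characterized, which, for Hubble, it was.
        restored = richardson_lucy(blurred, psf)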

    Origin of the problem

    A commission headed by Lew Allen, director of the Jet Propulsion Laboratory, was established to determine how the error could have arisen. The Allen Commission found that a reflective null corrector, a testing device used to achieve a properly shaped non-spherical mirror, had been incorrectly assembled—one lens was out of position by 1.3 mm (0.051 in). During the initial grinding and polishing of the mirror, Perkin-Elmer analyzed its surface with two conventional refractive null correctors. However, for the final manufacturing step (figuring), they switched to the custom-built reflective null corrector, designed explicitly to meet very strict tolerances. The incorrect assembly of this device resulted in the mirror being ground very precisely but to the wrong shape. A few final tests, using the conventional null correctors, correctly reported spherical aberration. But these results were dismissed, thus missing the opportunity to catch the error, because the reflective null corrector was considered more accurate.

    The commission blamed the failings primarily on Perkin-Elmer. Relations between NASA and the optics company had been severely strained during the telescope construction, due to frequent schedule slippage and cost overruns. NASA found that Perkin-Elmer did not review or supervise the mirror construction adequately, did not assign its best optical scientists to the project (as it had for the prototype), and in particular did not involve the optical designers in the construction and verification of the mirror. While the commission heavily criticized Perkin-Elmer for these managerial failings, NASA was also criticized for not picking up on the quality control shortcomings, such as relying totally on test results from a single instrument.

    Design of a solution

    Many feared that Hubble would be abandoned. The design of the telescope had always incorporated servicing missions, and astronomers immediately began to seek potential solutions to the problem that could be applied at the first servicing mission, scheduled for 1993. While Kodak had ground a back-up mirror for Hubble, it would have been impossible to replace the mirror in orbit, and too expensive and time-consuming to bring the telescope back to Earth for a refit. Instead, the fact that the mirror had been ground so precisely to the wrong shape led to the design of new optical components with exactly the same error but in the opposite sense, to be added to the telescope at the servicing mission, effectively acting as “spectacles” to correct the spherical aberration.

    The first step was a precise characterization of the error in the main mirror. Working backwards from images of point sources, astronomers determined that the conic constant of the mirror as built was −1.01390±0.0002, instead of the intended −1.00230. The same number was also derived by analyzing the null corrector used by Perkin-Elmer to figure the mirror, as well as by analyzing interferograms obtained during ground testing of the mirror.
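    For readers who want to see how such a small change in the conic constant becomes the roughly 2,200-nanometer edge error quoted above, the sag (surface height) of a conic mirror can be evaluated directly. This is only a consistency check: the radius of curvature used below (about 11.04 m, appropriate for a 2.4 m f/2.3 primary) is an assumed round value, not a number taken from this article.

        import math

        R = 11.04   # assumed radius of curvature of the primary mirror, meters
        r = 1.2     # semi-diameter of the 2.4 m primary, meters

        def sag(r, R, K):
            """Surface height of a conic of revolution at radial distance r."""
            return r ** 2 / (R * (1 + math.sqrt(1 - (1 + K) * r ** 2 / R ** 2)))

        intended = sag(r, R, -1.00230)   # design conic constant
        as_built = sag(r, R, -1.01390)   # conic constant inferred from the flawed images

        # Difference at the edge of the mirror, in nanometers: roughly 2,200 nm.
        print((intended - as_built) * 1e9)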

    Because of the way the HST’s instruments were designed, two different sets of correctors were required. The design of the Wide Field and Planetary Camera 2, already planned to replace the existing WF/PC, included relay mirrors to direct light onto the four separate charge-coupled device (CCD) chips making up its two cameras. An inverse error built into their surfaces could completely cancel the aberration of the primary. However, the other instruments lacked any intermediate surfaces that could be figured in this way, and so required an external correction device.

    The Corrective Optics Space Telescope Axial Replacement (COSTAR) system was designed to correct the spherical aberration for light focused at the FOC, FOS, and GHRS. It consists of two mirrors in the light path with one ground to correct the aberration. To fit the COSTAR system onto the telescope, one of the other instruments had to be removed, and astronomers selected the High Speed Photometer to be sacrificed. By 2002, all the original instruments requiring COSTAR had been replaced by instruments with their own corrective optics. COSTAR was removed and returned to Earth in 2009 where it is exhibited at the National Air and Space Museum. The area previously used by COSTAR is now occupied by the Cosmic Origins Spectrograph.

    Servicing missions and new instruments

    Servicing Mission 1

    The first Hubble servicing mission was scheduled for 1993 before the mirror problem was discovered. It assumed greater importance, as the astronauts would need to do extensive work to install corrective optics; failure would have resulted in either abandoning Hubble or accepting its permanent disability. Other components failed before the mission, causing the repair cost to rise to $500 million (not including the cost of the shuttle flight). Moreover, a successful repair would help demonstrate the viability of building Space Station Alpha.

    STS-49 in 1992 demonstrated the difficulty of space work. While its rescue of Intelsat 603 received praise, the astronauts had taken possibly reckless risks in doing so. Neither the rescue nor the unrelated assembly of prototype space station components occurred as the astronauts had trained, causing NASA to reassess planning and training, including for the Hubble repair. The agency assigned Story Musgrave, who had worked on satellite repair procedures since 1976, and six other experienced astronauts to the mission, including two from STS-49. The first mission director since Project Apollo would coordinate a crew with 16 previous shuttle flights among them. The astronauts were trained to use about a hundred specialized tools.

    Heat had been a problem on prior spacewalks, which had occurred in sunlight, but Hubble needed to be repaired out of sunlight. Musgrave discovered during vacuum training, seven months before the mission, that spacesuit gloves did not sufficiently protect against the cold of space. After STS-57 confirmed the issue in orbit, NASA quickly changed equipment, procedures, and flight plan. Seven total mission simulations occurred before launch, the most thorough preparation in shuttle history. No complete Hubble mockup existed, so the astronauts studied many separate models (including one at the Smithsonian) and mentally combined their varying and contradictory details. Servicing Mission 1 flew aboard Endeavour in December 1993 and involved the installation of several instruments and other equipment over ten days.

    Most importantly, the High Speed Photometer was replaced with the COSTAR corrective optics package, and WFPC was replaced with the Wide Field and Planetary Camera 2 (WFPC2) with an internal optical correction system. The solar arrays and their drive electronics were also replaced, as well as four gyroscopes in the telescope pointing system, two electrical control units and other electrical components, and two magnetometers. The onboard computers were upgraded with added coprocessors, and Hubble’s orbit was boosted.

    On January 13, 1994, NASA declared the mission a complete success and showed the first sharper images. The mission was one of the most complex performed up until that date, involving five long extra-vehicular activity periods. Its success was a boon for NASA, as well as for the astronomers who now had a more capable space telescope.

    Servicing Mission 2

    Servicing Mission 2, flown by Discovery in February 1997, replaced the GHRS and the FOS with the Space Telescope Imaging Spectrograph (STIS) and the Near Infrared Camera and Multi-Object Spectrometer (NICMOS), replaced an Engineering and Science Tape Recorder with a new Solid State Recorder, and repaired thermal insulation. NICMOS contained a heat sink of solid nitrogen to reduce the thermal noise from the instrument, but shortly after it was installed, an unexpected thermal expansion resulted in part of the heat sink coming into contact with an optical baffle. This led to an increased warming rate for the instrument and reduced its original expected lifetime of 4.5 years to about two years.

    Servicing Mission 3A

    Servicing Mission 3A, flown by Discovery, took place in December 1999, and was a split-off from Servicing Mission 3 after three of the six onboard gyroscopes had failed. The fourth failed a few weeks before the mission, rendering the telescope incapable of performing scientific observations. The mission replaced all six gyroscopes, replaced a Fine Guidance Sensor and the computer, installed a Voltage/temperature Improvement Kit (VIK) to prevent battery overcharging, and replaced thermal insulation blankets.

    Servicing Mission 3B

    Servicing Mission 3B, flown by Columbia in March 2002, saw the installation of a new instrument, with the FOC (which, except for the Fine Guidance Sensors when used for astrometry, was the last of the original instruments) being replaced by the Advanced Camera for Surveys (ACS). This meant COSTAR was no longer required, since all new instruments had built-in correction for the main mirror aberration. The mission also revived NICMOS by installing a closed-cycle cooler and replaced the solar arrays for the second time, providing 30 percent more power.

    Servicing Mission 4

    Plans called for Hubble to be serviced in February 2005, but the Columbia disaster in 2003, in which the orbiter disintegrated on re-entry into the atmosphere, had wide-ranging effects on the Hubble program. NASA Administrator Sean O’Keefe decided all future shuttle missions had to be able to reach the safe haven of the International Space Station should in-flight problems develop. As no shuttles were capable of reaching both HST and the space station during the same mission, future crewed service missions were canceled. This decision was criticised by numerous astronomers who felt Hubble was valuable enough to merit the human risk. HST’s planned successor, the James Webb Space Telescope (JWST), as of 2004 was not expected to launch until at least 2011. A gap in space-observing capabilities between a decommissioning of Hubble and the commissioning of a successor was of major concern to many astronomers, given the significant scientific impact of HST. The consideration that JWST will not be located in low Earth orbit, and therefore cannot be easily upgraded or repaired in the event of an early failure, only made concerns more acute. On the other hand, many astronomers felt strongly that servicing Hubble should not take place if the expense were to come from the JWST budget.

    In January 2004, O’Keefe said he would review his decision to cancel the final servicing mission to HST, due to public outcry and requests from Congress for NASA to look for a way to save it. The National Academy of Sciences convened an official panel, which recommended in July 2004 that the HST should be preserved despite the apparent risks. Their report urged “NASA should take no actions that would preclude a space shuttle servicing mission to the Hubble Space Telescope”. In August 2004, O’Keefe asked Goddard Space Flight Center to prepare a detailed proposal for a robotic service mission. These plans were later canceled, the robotic mission being described as “not feasible”. In late 2004, several Congressional members, led by Senator Barbara Mikulski, held public hearings and carried on a fight with much public support (including thousands of letters from school children across the U.S.) to get the Bush Administration and NASA to reconsider the decision to drop plans for a Hubble rescue mission.

    The nomination in April 2005 of a new NASA Administrator, Michael D. Griffin, changed the situation, as Griffin stated he would consider a crewed servicing mission. Soon after his appointment Griffin authorized Goddard to proceed with preparations for a crewed Hubble maintenance flight, saying he would make the final decision after the next two shuttle missions. In October 2006 Griffin gave the final go-ahead, and the 11-day mission by Atlantis was scheduled for October 2008. Hubble’s main data-handling unit failed in September 2008, halting all reporting of scientific data until its back-up was brought online on October 25, 2008. Since a failure of the backup unit would leave the HST helpless, the service mission was postponed to incorporate a replacement for the primary unit.

    Servicing Mission 4 (SM4), flown by Atlantis in May 2009, was the last scheduled shuttle mission for HST. SM4 installed the replacement data-handling unit, repaired the ACS and STIS systems, installed improved nickel hydrogen batteries, and replaced other components including all six gyroscopes. SM4 also installed two new observation instruments—Wide Field Camera 3 (WFC3) and the Cosmic Origins Spectrograph (COS)—and the Soft Capture and Rendezvous System, which will enable the future rendezvous, capture, and safe disposal of Hubble by either a crewed or robotic mission. Except for the ACS’s High Resolution Channel, which could not be repaired and was disabled, the work accomplished during SM4 rendered the telescope fully functional.

    Major projects

    Cosmic Assembly Near-infrared Deep Extragalactic Legacy Survey [CANDELS]

    The survey “aims to explore galactic evolution in the early Universe, and the very first seeds of cosmic structure at less than one billion years after the Big Bang.” The CANDELS project site describes the survey’s goals as the following:

    The Cosmic Assembly Near-IR Deep Extragalactic Legacy Survey is designed to document the first third of galactic evolution from z = 8 to 1.5 via deep imaging of more than 250,000 galaxies with WFC3/IR and ACS. It will also find the first Type Ia SNe beyond z > 1.5 and establish their accuracy as standard candles for cosmology. Five premier multi-wavelength sky regions are selected; each has multi-wavelength data from Spitzer and other facilities, and has extensive spectroscopy of the brighter galaxies. The use of five widely separated fields mitigates cosmic variance and yields statistically robust and complete samples of galaxies down to 10⁹ solar masses out to z ~ 8.

    Frontier Fields program

    The program, officially named Hubble Deep Fields Initiative 2012, aims to advance the knowledge of early galaxy formation by studying high-redshift galaxies in blank fields with the help of gravitational lensing to see the “faintest galaxies in the distant universe”. The Frontier Fields web page describes the goals of the program as being:

    To reveal hitherto inaccessible populations of z = 5–10 galaxies that are ten to fifty times fainter intrinsically than any presently known
    To solidify our understanding of the stellar masses and star formation histories of sub-L* galaxies at the earliest times
    To provide the first statistically meaningful morphological characterization of star forming galaxies at z > 5
    To find z > 8 galaxies stretched out enough by cluster lensing to discern internal structure and/or magnified enough by cluster lensing for spectroscopic follow-up.

    Cosmic Evolution Survey (COSMOS)

    The Cosmic Evolution Survey (COSMOS) is an astronomical survey designed to probe the formation and evolution of galaxies as a function of both cosmic time (redshift) and the local galaxy environment. The survey covers a two square degree equatorial field with spectroscopy and X-ray to radio imaging by most of the major space-based telescopes and a number of large ground based telescopes, making it a key focus region of extragalactic astrophysics. COSMOS was launched in 2006 as the largest project pursued by the Hubble Space Telescope at the time, and still is the largest continuous area of sky covered for the purposes of mapping deep space in blank fields, 2.5 times the area of the moon on the sky and 17 times larger than the largest of the CANDELS regions. The COSMOS scientific collaboration that was forged from the initial COSMOS survey is the largest and longest-running extragalactic collaboration, known for its collegiality and openness. The study of galaxies in their environment can be done only with large areas of the sky, larger than a half square degree. More than two million galaxies are detected, spanning 90% of the age of the Universe. The COSMOS collaboration is led by Caitlin Casey, Jeyhan Kartaltepe, and Vernesa Smolcic and involves more than 200 scientists in a dozen countries.

    Important discoveries

    Hubble has helped resolve some long-standing problems in astronomy, while also raising new questions. Some results have required new theories to explain them.

    Age of the universe

    Among its primary mission goals was measuring distances to Cepheid variable stars more accurately than ever before, and thus constraining the value of the Hubble constant, the measure of the rate at which the universe is expanding, which is also related to its age. Before the launch of HST, estimates of the Hubble constant typically had errors of up to 50%, but Hubble measurements of Cepheid variables in the Virgo Cluster and other distant galaxy clusters provided a measured value with an accuracy of ±10%, which is consistent with other more accurate measurements made since Hubble’s launch using other techniques. The estimated age is now about 13.7 billion years, but before the Hubble Telescope, scientists predicted an age ranging from 10 to 20 billion years.
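    A quick back-of-the-envelope check shows how a Hubble-constant measurement translates into an age: the “Hubble time” 1/H0 sets the characteristic age scale (the precise figure also depends on the universe’s matter and dark-energy content). The value of H0 below is an illustrative round number, not a Hubble measurement.

        H0 = 70.0                  # assumed Hubble constant, km/s per megaparsec (illustrative)
        km_per_Mpc = 3.0857e19     # kilometers in one megaparsec
        seconds_per_year = 3.156e7

        hubble_time_years = km_per_Mpc / H0 / seconds_per_year
        print(hubble_time_years / 1e9)   # about 14 billion years, close to the quoted 13.7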

    Expansion of the universe

    While Hubble helped to refine estimates of the age of the universe, it also cast doubt on theories about its future. Astronomers from the High-z Supernova Search Team and the Supernova Cosmology Project used ground-based telescopes and HST to observe distant supernovae and uncovered evidence that, far from decelerating under the influence of gravity, the expansion of the universe may in fact be accelerating. Three members of these two groups have subsequently been awarded Nobel Prizes for their discovery.

    Saul Perlmutter [The Supernova Cosmology Project] shared the 2006 Shaw Prize in Astronomy, the 2011 Nobel Prize in Physics, and the 2015 Breakthrough Prize in Fundamental Physics with Brian P. Schmidt and Adam Riess [The High-z Supernova Search Team] for providing evidence that the expansion of the universe is accelerating.

    The cause of this acceleration remains poorly understood; the most commonly invoked explanation is dark energy.

    Black holes

    The high-resolution spectra and images provided by the HST have been especially well-suited to establishing the prevalence of black holes in the center of nearby galaxies. While it had been hypothesized in the early 1960s that black holes would be found at the centers of some galaxies, and astronomers in the 1980s identified a number of good black hole candidates, work conducted with Hubble shows that black holes are probably common to the centers of all galaxies. The Hubble programs further established that the masses of the nuclear black holes and properties of the galaxies are closely related. The legacy of the Hubble programs on black holes in galaxies is thus to demonstrate a deep connection between galaxies and their central black holes.

    Extending visible wavelength images

    A unique window on the Universe enabled by Hubble comprises the Hubble Deep Field, Hubble Ultra-Deep Field, and Hubble Extreme Deep Field images, which used Hubble’s unmatched sensitivity at visible wavelengths to create images of small patches of sky that are the deepest ever obtained at optical wavelengths. The images reveal galaxies billions of light years away, and have generated a wealth of scientific papers, providing a new window on the early Universe. The Wide Field Camera 3 improved the view of these fields in the infrared and ultraviolet, supporting the discovery of some of the most distant objects yet discovered, such as MACS0647-JD.

    The non-standard object SCP 06F6 was discovered by the Hubble Space Telescope in February 2006.

    On March 3, 2016, researchers using Hubble data announced the discovery of the farthest known galaxy to date: GN-z11. The Hubble observations occurred on February 11, 2015, and April 3, 2015, as part of the CANDELS/GOODS-North surveys.

    Solar System discoveries

    HST has also been used to study objects in the outer reaches of the Solar System, including the dwarf planets Pluto and Eris.

    The collision of Comet Shoemaker-Levy 9 with Jupiter in 1994 was fortuitously timed for astronomers, coming just a few months after Servicing Mission 1 had restored Hubble’s optical performance. Hubble images of the planet were sharper than any taken since the passage of Voyager 2 in 1979, and were crucial in studying the dynamics of the collision of a comet with Jupiter, an event believed to occur once every few centuries.

    During June and July 2012, U.S. astronomers using Hubble discovered Styx, a tiny fifth moon orbiting Pluto.

    In March 2015, researchers announced that measurements of aurorae around Ganymede, one of Jupiter’s moons, revealed that it has a subsurface ocean. Using Hubble to study the motion of its aurorae, the researchers determined that a large saltwater ocean was helping to suppress the interaction between Jupiter’s magnetic field and that of Ganymede. The ocean is estimated to be 100 km (60 mi) deep, trapped beneath a 150 km (90 mi) ice crust.

    From June to August 2015, Hubble was used to search for a Kuiper belt object (KBO) target for the New Horizons Kuiper Belt Extended Mission (KEM) when similar searches with ground telescopes failed to find a suitable target.

    This resulted in the discovery of at least five new KBOs, including the eventual KEM target, 486958 Arrokoth, which New Horizons flew past on January 1, 2019.

    In August 2020, taking advantage of a total lunar eclipse, astronomers using NASA’s Hubble Space Telescope detected Earth’s own brand of sunscreen – ozone – in our atmosphere. This method simulates how astronomers and astrobiology researchers will search for evidence of life beyond Earth by observing potential “biosignatures” on exoplanets (planets around other stars).
    Hubble and ALMA image of MACS J1149.5+2223.

    Supernova reappearance

    On December 11, 2015, Hubble captured an image of the first-ever predicted reappearance of a supernova, dubbed “Refsdal”, which was calculated using different mass models of a galaxy cluster whose gravity is warping the supernova’s light. The supernova was previously seen in November 2014 behind galaxy cluster MACS J1149.5+2223 as part of Hubble’s Frontier Fields program. Astronomers spotted four separate images of the supernova in an arrangement known as an “Einstein Cross”.

    The light from the cluster has taken about five billion years to reach Earth, though the supernova exploded some 10 billion years ago. Based on early lens models, a fifth image was predicted to reappear by the end of 2015. The detection of Refsdal’s reappearance in December 2015 served as a unique opportunity for astronomers to test their models of how mass, especially dark matter, is distributed within this galaxy cluster.

    Impact on astronomy

    Many objective measures show the positive impact of Hubble data on astronomy. Over 15,000 papers based on Hubble data have been published in peer-reviewed journals, and countless more have appeared in conference proceedings. Looking at papers several years after their publication, about one-third of all astronomy papers have no citations, while only two percent of papers based on Hubble data have no citations. On average, a paper based on Hubble data receives about twice as many citations as papers based on non-Hubble data. Of the 200 papers published each year that receive the most citations, about 10% are based on Hubble data.

    Although the HST has clearly helped astronomical research, its financial cost has been large. A study on the relative astronomical benefits of different sizes of telescopes found that while papers based on HST data generate 15 times as many citations as a 4 m (13 ft) ground-based telescope such as the William Herschel Telescope, the HST costs about 100 times as much to build and maintain.

    Deciding between building ground- versus space-based telescopes is complex. Even before Hubble was launched, specialized ground-based techniques such as aperture masking interferometry had obtained higher-resolution optical and infrared images than Hubble would achieve, though restricted to targets about 10⁸ times brighter than the faintest targets observed by Hubble. Since then, advances in “adaptive optics” have extended the high-resolution imaging capabilities of ground-based telescopes to the infrared imaging of faint objects.

    The usefulness of adaptive optics versus HST observations depends strongly on the particular details of the research questions being asked. In the visible bands, adaptive optics can correct only a relatively small field of view, whereas HST can conduct high-resolution optical imaging over a wide field. Only a small fraction of astronomical objects are accessible to high-resolution ground-based imaging; in contrast Hubble can perform high-resolution observations of any part of the night sky, and on objects that are extremely faint.

    Impact on aerospace engineering

    In addition to its scientific results, Hubble has also made significant contributions to aerospace engineering, in particular the performance of systems in low Earth orbit. These insights result from Hubble’s long lifetime on orbit, extensive instrumentation, and return of assemblies to the Earth where they can be studied in detail. In particular, Hubble has contributed to studies of the behavior of graphite composite structures in vacuum, optical contamination from residual gas and human servicing, radiation damage to electronics and sensors, and the long-term behavior of multi-layer insulation. One lesson learned was that gyroscopes assembled using pressurized oxygen to deliver suspension fluid were prone to failure due to electric wire corrosion. Gyroscopes are now assembled using pressurized nitrogen. Another is that optical surfaces in LEO can have surprisingly long lifetimes; Hubble was only expected to last 15 years before the mirror became unusable, but after 14 years there was no measurable degradation. Finally, Hubble servicing missions, particularly those that serviced components not designed for in-space maintenance, have contributed towards the development of new tools and techniques for on-orbit repair.

    Archives

    All Hubble data is eventually made available via the Mikulski Archive for Space Telescopes at STScI, CADC and ESA/ESAC. Data is usually proprietary—available only to the principal investigator (PI) and astronomers designated by the PI—for twelve months after being taken. The PI can apply to the director of the STScI to extend or reduce the proprietary period in some circumstances.

    Observations made on Director’s Discretionary Time are exempt from the proprietary period, and are released to the public immediately. Calibration data such as flat fields and dark frames are also publicly available straight away. All data in the archive is in the FITS format, which is suitable for astronomical analysis but not for public use. The Hubble Heritage Project processes and releases to the public a small selection of the most striking images in JPEG and TIFF formats.
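    As a minimal sketch of how an astronomer might inspect one of these archival files once it is public, the astropy library can read FITS data directly; the filename and the exact extension layout below are illustrative of a typical calibrated HST exposure rather than any specific dataset.

        from astropy.io import fits

        # Hypothetical filename standing in for any exposure downloaded from the archive.
        with fits.open("hst_example_flt.fits") as hdul:
            hdul.info()                      # list the extensions in the file
            header = hdul[0].header          # primary header with observation keywords
            print(header.get("INSTRUME"), header.get("TARGNAME"))
            science = hdul[1].data           # the science image usually sits in the first extension
            print(science.shape)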

    Outreach activities

    It has always been important for the Space Telescope to capture the public’s imagination, given the considerable contribution of taxpayers to its construction and operational costs. After the difficult early years when the faulty mirror severely dented Hubble’s reputation with the public, the first servicing mission allowed its rehabilitation as the corrected optics produced numerous remarkable images.

    Several initiatives have helped to keep the public informed about Hubble activities. In the United States, outreach efforts are coordinated by the Space Telescope Science Institute (STScI) Office for Public Outreach, which was established in 2000 to ensure that U.S. taxpayers saw the benefits of their investment in the space telescope program. To that end, STScI operates the HubbleSite.org website. The Hubble Heritage Project, operating out of the STScI, provides the public with high-quality images of the most interesting and striking objects observed. The Heritage team is composed of amateur and professional astronomers, as well as people with backgrounds outside astronomy, and emphasizes the aesthetic nature of Hubble images. The Heritage Project is granted a small amount of time to observe objects which, for scientific reasons, may not have images taken at enough wavelengths to construct a full-color image.

    Since 1999, the leading Hubble outreach group in Europe has been the Hubble European Space Agency Information Centre (HEIC). This office was established at the Space Telescope European Coordinating Facility in Munich, Germany. HEIC’s mission is to fulfill HST outreach and education tasks for the European Space Agency. The work is centered on the production of news and photo releases that highlight interesting Hubble results and images. These are often European in origin, and so increase awareness of both ESA’s Hubble share (15%) and the contribution of European scientists to the observatory. ESA produces educational material, including a videocast series called Hubblecast designed to share world-class scientific news with the public.

    The Hubble Space Telescope has won two Space Achievement Awards from the Space Foundation, for its outreach activities, in 2001 and 2010.

    A replica of the Hubble Space Telescope is on the courthouse lawn in Marshfield, Missouri, the hometown of namesake Edwin P. Hubble.

    Major Instrumentation

    Hubble WFPC2 no longer in service.

    Wide Field Camera 3 [WFC3]

    Advanced Camera for Surveys [ACS]

    Cosmic Origins Spectrograph [COS]

    The Hubble Space Telescope is a project of international cooperation between NASA and the European Space Agency. NASA’s Goddard Space Flight Center manages the telescope. The Space Telescope Science Institute (STScI), a free-standing science center located on the campus of The Johns Hopkins University and operated by the Association of Universities for Research in Astronomy (AURA) for NASA, conducts Hubble science operations.


     
  • richardmitnick 9:43 pm on October 13, 2021 Permalink | Reply
    Tags: "Einstein’s Principle of Equivalence verified in quasars for the first time", , , Basic Research, , ,   

    From IAC Institute of Astrophysics of the Canary Islands [Instituto de Astrofísica de Canarias] (ES) : “Einstein’s Principle of Equivalence verified in quasars for the first time” 

    Instituto de Astrofísica de Andalucía

    From IAC Institute of Astrophysics of the Canary Islands [Instituto de Astrofísica de Canarias] (ES)

    13/10/2021

    Evencio Mediavilla
    emg@iac.es

    1
    Artist impression of a quasar. Credit: M. Kornmesser/ European Southern Observatory [Observatoire européen austral][Europäische Südsternwarte](EU)(CL)

    According to Einstein’s theory of general relativity, gravity affects light as well as matter. One consequence of this theory, based on the Principle of Equivalence, is that the light which escapes from a region with a strong gravitational field loses energy on its way, so that it becomes redder, a phenomenon known as the gravitational redshift. Quantifying this gives a fundamental test of Einstein’s theory of gravitation. Until now this test had been performed only on bodies in the nearby universe, but thanks to the use of a new experimental procedure, scientists at the Instituto de Astrofísica de Canarias (IAC) and The University of Granada [Universidad de Granada] (ES) have been able to measure the gravitational redshift in quasars, and thus extend the test to very distant regions from which the light was emitted when our universe was young.

    Einstein’s Principle of Equivalence is the cornerstone of the General Theory of Relativity, which is our best current description of gravity and one of the basic theories of modern physics. The principle states that it is experimentally impossible to distinguish between a gravitational field and an accelerated motion of the observer, and one of its predictions is that light emitted from within an intense gravitational field should undergo a measurable shift to lower energy, which for light means a shift to the red, termed a “redshift”.

    This prediction has been well and very frequently confirmed close to the Earth, from the first measurements by R.V. Pound and G.A. Rebka at Harvard in 1959 until the most recent measurements with satellites. It has also been confirmed using observations of the Sun, and of some stars, such as our neighbour Sirius B, and the star S2 close to the supermassive black hole at the centre of the Galaxy. But to confirm it with measurements beyond the Galaxy has proved difficult, and there have been only a few tests with complicated measurements and low precision in clusters of galaxies relatively near to us in cosmological terms.

    The reason for this lack of testing in the more distant universe is the difficulty of measuring the redshift: in the majority of situations the effect of gravity on the light is very small. Massive black holes, with their very strong gravitational fields, therefore offer promising scenarios for measuring gravitational redshifts. In particular, the supermassive black holes found at the centres of galaxies have huge gravitational fields and sit at the hearts of the extraordinarily luminous and distant quasars.
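    As a rough illustration of why the neighbourhood of a supermassive black hole is so promising, the gravitational redshift of light escaping from a given radius in a Schwarzschild field can be written down directly. The black-hole mass and the emission radius below are illustrative assumptions, not values from the study.

        import math

        G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
        c = 2.998e8        # speed of light, m/s
        M_sun = 1.989e30   # solar mass, kg

        M = 1e9 * M_sun            # assume a black hole of a billion solar masses
        r = 20 * G * M / c ** 2    # assume the emission arises about 20 gravitational radii out

        # Gravitational redshift of light escaping from radius r (Schwarzschild geometry).
        z_grav = 1.0 / math.sqrt(1.0 - 2.0 * G * M / (r * c ** 2)) - 1.0
        print(z_grav)   # about 0.05, a shift of a few percent, large enough to measure in spectra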

    A quasar is an object in the sky which looks like a star but is situated at a great distance from us, so that the light we receive from it was emitted when the universe was much younger than now. To be detectable across such distances, quasars must be extremely bright. The origin of this huge power output is a disc of hot material which is being swallowed by the supermassive black hole at its centre. This energy is generated in a very small region, barely a few light days in size.

    In the neighbourhood of the black hole there is a very intense gravitational field and so by studying the light emitted by the chemical elements in this region (mainly hydrogen, carbon, and magnesium) we would expect to measure very large gravitational redshifts. Unfortunately the majority of the elements in quasar accretion discs are also present in regions further out from the central black hole where the gravitational effects are much smaller, so the light we receive from those elements is a mixture in which it is not easy to pick out clearly the gravitational redshifts.

    The measurements cover 80% of the history of the universe

    Now a team of researchers at the Instituto de Astrofísica de Canarias (IAC) and the University of Granada (UGR) have found a well-defined portion of the ultraviolet light emitted by iron atoms from a region confined to the neighbourhood of the black hole. “Through our research related to gravitational lensing, another of the predictions of Einstein’s theory of General Relativity, we found that a characteristic spectral feature of iron in quasars seemed to be coming from a region very close to the black hole. Our measurements of the redshift confirmed this finding,” explains Evencio Mediavilla, an IAC researcher, Professor at the University of La Laguna (ULL) and first author of the article.

    Using this feature the researchers have been able to measure clearly and precisely the gravitational redshifts of many quasars and, using them, estimate the masses of the black holes. “This technique marks an extraordinary advance, because it allows us to measure precisely the gravitational redshifts of individual objects at great distances, which opens up important possibilities for the future” says Mediavilla.

    Jorge Jiménez Vicente, a researcher at the UGR and co-author of the article, stresses the implications of this new experimental procedure, as it allows comparison of the measured redshift with the theoretically predicted value: “this technique allows us for the first time to test Einstein’s Principle of Equivalence, and with it the basis of our understanding of gravity on cosmological scales.”

    This test of the Principle of Equivalence performed by these researchers is based on measurements which include active galaxies in our neighbourhood (some 13,800 million years after the Big Bang) out to individual quasars at large distances, whose light was emitted when the age of the universe was only some 2,200 million years, thus covering around 80% of the history of the universe. “The results, with a precision comparable to those of experiments carried out within our Galaxy, validate the Principle of Equivalence over this vast period of time,” notes Jiménez-Vicente.

    The article has been published in The Astrophysical Journal and has recently been selected by The American Astronomical Society (US), which has published an interview with the researchers in the “AAS Journal Author Series” section of its YouTube channel, whose aim is to link the authors with their article, their personal histories, and the astronomical community in general.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    IAC Institute of Astrophysics of the Canary Islands [Instituto de Astrofísica de Canarias] (ES) operates two astronomical observatories in the Canary Islands:

    Roque de los Muchachos Observatory on La Palma
    Teide Observatory on Tenerife.

    The seeing statistics at ORM make it the second-best location for optical and infrared astronomy in the Northern Hemisphere, after Mauna Kea Observatory Hawaii (US).

    Maunakea Observatories Hawai’i (US) altitude 4,213 m (13,822 ft)

    The site also has some of the most extensive astronomical facilities in the Northern Hemisphere; its fleet of telescopes includes the 10.4 m Gran Telescopio Canarias, the world’s largest single-aperture optical telescope as of July 2009, the William Herschel Telescope (second largest in Europe), and the adaptive optics corrected Swedish 1-m Solar Telescope.

    Gran Telescopio Canarias [Instituto de Astrofísica de Canarias ](ES) sited on a volcanic peak 2,267 metres (7,438 ft) above sea level.

    The observatory was established in 1985, after 15 years of international work and cooperation among several countries, with the Spanish island hosting many telescopes from Britain, The Netherlands, Spain, and other countries. The island provided better seeing conditions than Herstmonceux, the Royal Greenwich Observatory site from which telescopes such as the 98-inch-aperture Isaac Newton Telescope (the largest reflector in Europe at that time) were relocated. When the Isaac Newton Telescope was moved to the island it was upgraded to a 100-inch (2.54 meter) mirror, and many even larger telescopes from various nations would be hosted there.

    Teide Observatory [Observatorio del Teide], IAU code 954, is an astronomical observatory on Mount Teide at 2,390 metres (7,840 ft), located on Tenerife, Spain. It has been operated by the Instituto de Astrofísica de Canarias since its inauguration in 1964. It became one of the first major international observatories, attracting telescopes from different countries around the world because of the good astronomical seeing conditions. Later the emphasis for optical telescopes shifted more towards Roque de los Muchachos Observatory on La Palma.

     
  • richardmitnick 1:40 pm on October 13, 2021 Permalink | Reply
    Tags: "Eerie Discovery of 2 'Identical' Galaxies in Deep Space Is Finally Explained", , , Basic Research, , ,   

    From Science Alert (US) : “Eerie Discovery of 2 ‘Identical’ Galaxies in Deep Space Is Finally Explained” 

    ScienceAlert

    From Science Alert (US)

    13 OCTOBER 2021
    MICHELLE STARR

    1
    Hamilton’s Object. (Joseph DePasquale/Space Telescope Science Institute (US))

    Galaxies are a bit like fingerprints, or snowflakes. There are many of them out there, and they can have a lot of characteristics in common, but no two are exactly alike.

    So, back in 2013, when two galaxies were spotted side-by-side in the distant reaches of the Universe, and which looked to be startlingly similar, astronomers were flummoxed.

    Now, they’ve finally solved the mystery of these strange “identical objects” – and the answer could have implications for understanding dark matter.

    The object, now named Hamilton’s Object, was discovered by astronomer Timothy Hamilton of Shawnee State University (US) by accident, in data obtained by the Hubble Space Telescope nearly a decade ago.

    The two galaxies appeared to be the same shape, and had the same nearly parallel dark streaks across the galactic bulge – the central region of the galaxy where most of the stars live.

    “We were really stumped,” Hamilton said. “My first thought was that maybe they were interacting galaxies with tidally stretched-out arms. It didn’t really fit well, but I didn’t know what else to think.”

    It wasn’t until 2015 that a more plausible answer would emerge. Astronomer Richard Griffiths of The University of Hawaii (US), on seeing Hamilton present his object at a meeting, suggested that the culprit might be a rare phenomenon: gravitational lensing.

    This is a phenomenon that results purely from a chance alignment of massive objects in space. If a massive object sits directly between us and a more distant object, a magnification effect occurs due to the gravitational curvature of space-time around the closer object.

    Any light that then travels through this space-time follows this curvature and enters our telescopes smeared and distorted to varying degrees – but also often magnified and duplicated.
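    A minimal sketch of the scale of the effect, assuming a cluster-sized lens mass and illustrative redshifts (none of these numbers come from the study itself), uses the standard Einstein-radius formula together with the astropy cosmology utilities.

        import numpy as np
        from astropy import constants as const
        from astropy import units as u
        from astropy.cosmology import Planck18 as cosmo

        M = 1e14 * u.Msun            # assumed cluster-scale lens mass
        z_lens, z_source = 0.7, 2.0  # assumed redshifts, chosen only to be of the right order

        D_l = cosmo.angular_diameter_distance(z_lens)
        D_s = cosmo.angular_diameter_distance(z_source)
        D_ls = cosmo.angular_diameter_distance_z1z2(z_lens, z_source)

        # Einstein radius: theta_E = sqrt(4 G M / c^2 * D_ls / (D_l * D_s))
        theta_E_sq = (4 * const.G * M / const.c ** 2 * D_ls / (D_l * D_s)).decompose()
        theta_E = np.sqrt(theta_E_sq) * u.rad
        print(theta_E.to(u.arcsec))   # of order tens of arcseconds for a rich cluster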

    This made a lot more sense than two identical galaxies, especially when Griffiths found yet another duplication of the galaxy (as can be seen in the picture below).

    A huge problem, however, remained: What was causing the gravitational curvature? So Griffiths and his team set about searching sky survey data for an object massive enough to produce the lensing effect.

    And they found it. Between us and Hamilton’s Object lurks a cluster of galaxies that had only been poorly documented. Usually, these discoveries go the other way – first the cluster is identified, and then astronomers go looking for lensed galaxies behind them.

    The team’s work revealed that Hamilton’s Object is around 11 billion light-years away, and a different team’s work revealed that the cluster is about 7 billion light-years away.

    The galaxy itself is a barred spiral galaxy with its edge facing us, undergoing clumpy and uneven star formation, the researchers determined. Computer simulations then helped determine that the three duplicated images could only be created if the distribution of dark matter is smooth at small scales.

    3
    (Joseph DePasquale/STScI)

    “It’s great that we only need two mirror images in order to get the scale of how clumpy or not dark matter can be at these positions,” said astronomer Jenny Wagner of The Ruprecht Karl University of Heidelberg [Ruprecht-Karls-Universität Heidelberg] (DE).

    “Here, we don’t use any lens models. We just take the observables of the multiple images and the fact they can be transformed into one another. They can be folded into one another by our method. This already gives us an idea of how smooth the dark matter needs to be at these two positions.”

    The two identical side-by-side images were created because they straddle a “ripple” in space-time – an area of greatest magnification created by the gravity of a filament of dark matter. Such filaments are thought to connect the Universe in a vast, invisible cosmic web, joining galaxies and galaxy clusters and feeding them with hydrogen gas.

    But we don’t actually know what dark matter is, so any new discovery that lets us map where it is, how it’s distributed, and how it affects the space around it is another drop of evidence that will ultimately help us solve the mystery.

    “We know it’s some form of matter, but we have no idea what the constituent particle is,” Griffiths explained.

    “So we don’t know how it behaves at all. We just know that it has mass and is subject to gravity. The significance of the limits of size on the clumping or smoothness is that it gives us some clues as to what the particle might be. The smaller the dark matter clumps, the more massive the particles must be.”

    The research has been published in the Monthly Notices of the Royal Astronomical Society (MNRAS).

    See the full article here .



    Please help promote STEM in your local schools.


    Stem Education Coalition

     