Tagged: Dark Energy

  • richardmitnick 12:56 pm on February 4, 2023
    Tags: "3 new studies indicate a conflict at the heart of cosmology", "The Big Think", , , , , Dark Energy, ,   

    From “The Big Think” : “3 new studies indicate a conflict at the heart of cosmology” 

    From “The Big Think”

    2.1.23
    Don Lincoln

    The Universe isn’t as “clumpy” as we think it should be.

    Credit: NASA.

    Key Takeaways

    Telescopes are essentially time machines: as we examine galaxies at greater and greater distances from the Earth, we are looking further and further back in time.

    A new series of studies examining the “clumpiness” of the Universe indicates that there might be a conflict at the heart of cosmology.

    The Big Bang theory is still sound, but it may need to be tweaked.

    A series of three scientific papers describing the expansion history of the Universe is telling a confusing tale, with predictions and measurements slightly disagreeing.

    While this disagreement isn’t considered a fatal disproof of modern cosmology, it could be a hint that our theories need to be revised.

    PRD “Joint analysis of DES Year 3 data and CMB lensing from SPT and Planck I: Construction of CMB Lensing Maps and Modeling Choices”
    PRD “Joint analysis of DES Year 3 data and CMB lensing from SPT and Planck II: Cross-correlation measurements and cosmological constraints”
    PRD “Joint analysis of DES Year 3 data and CMB lensing from SPT and Planck III: Combined cosmological constraints”

    Creation stories, both ancient and modern

    Understanding exactly how the world around us came into existence is a question that has bothered humanity for millennia. All around the world, people have devised stories — from the ancient Greek legend of the creation of the Earth and other primordial entities from Chaos (as first written down by Hesiod) to the Hopi creation myth (which describes a series of different kinds of creatures being created, eventually ending up as humans).

    In modern times, there are still competing creation stories, but there is one that is grounded in empiricism and the scientific method: the idea that about 13.8 billion years ago, the Universe began in a much smaller and hotter compressed state, and it has been expanding ever since then. This idea is colloquially called the “Big Bang,” although different writers use the term to mean slightly different things. Some use it to refer to the exact moment at which the Universe came into existence and began to expand, while others use it to refer to all moments after the beginning. For those writers, the Big Bang is still ongoing, as the expansion of the Universe continues.

    The beauty of this scientific explanation is that it can be tested. Astronomers rely on the fact that light has a finite speed, which means that it takes time for light to cross the cosmos. For example, the light we see as the Sun shining was emitted eight minutes before we see it. Light from the nearest star took about four years to get to Earth, and light from elsewhere in the cosmos can take billions of years to arrive.
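
    Just to make those light-travel times concrete, here is a quick back-of-the-envelope check in Python (my own illustration; the distances are standard reference values, not figures from the article):

```python
# Light-travel times computed from the speed of light and standard distances.
C = 299_792_458.0      # speed of light, m/s
AU = 1.496e11          # mean Earth-Sun distance, m
D_PROXIMA = 4.02e16    # approximate distance to Proxima Centauri, m

sun_minutes = AU / C / 60
proxima_years = D_PROXIMA / C / (365.25 * 86_400)

print(f"Sunlight reaches Earth after about {sun_minutes:.1f} minutes")
print(f"Light from the nearest star takes about {proxima_years:.1f} years")
```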

    The telescope as a time machine

    Effectively, this means that telescopes are time machines. By looking at more and more distant galaxies, astronomers are able to see what the Universe looked like in the distant past. By stitching together observations of galaxies at different distances from the Earth, astronomers can unravel the evolution of the cosmos.

    The recent measurements use two different telescopes to study the structure of the Universe at different cosmic epochs. One facility, called the South Pole Telescope (SPT), looks at the earliest possible light, emitted a mere 380,000 years after the Universe began.

    At that time, the Universe was 0.003% of its current age. If we consider the current cosmos to be equivalent to a 50-year-old person, the SPT looks at the Universe when it was a mere 12 hours old.
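
    The analogy is a simple proportion; as a quick check (my own sketch, not part of the article):

```python
# Scale the Universe's timeline onto a 50-year-old person.
AGE_UNIVERSE_YR = 13.8e9   # current age of the Universe, years
CMB_EPOCH_YR = 380_000     # age when the light seen by the SPT was emitted

fraction = CMB_EPOCH_YR / AGE_UNIVERSE_YR       # ~0.003% of the current age
person_hours = 50 * fraction * 365.25 * 24      # map onto 50 years, in hours

print(f"{fraction:.5%} of the current age -> about {person_hours:.0f} hours old")
```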

    The second facility is called the Dark Energy Survey (DES).
    ___________________________________________________________________
    The Dark Energy Survey

    Dark Energy Camera [DECam] built at The DOE’s Fermi National Accelerator Laboratory.

    NSF NOIRLab Cerro Tololo Inter-American Observatory (CL) Víctor M. Blanco 4-meter Telescope, which houses the Dark Energy Camera (DECam), at Cerro Tololo, Chile, at an altitude of 7200 feet.

    NSF NOIRLab Cerro Tololo Inter-American Observatory (CL), approximately 80 km to the east of La Serena, Chile, at an altitude of 2200 meters.

    The Dark Energy Survey is an international, collaborative effort to map hundreds of millions of galaxies, detect thousands of supernovae, and find patterns of cosmic structure that will reveal the nature of the mysterious dark energy that is accelerating the expansion of our Universe. The Dark Energy Survey began searching the Southern skies on August 31, 2013.

    According to Albert Einstein’s Theory of General Relativity, gravity should lead to a slowing of the cosmic expansion. Yet, in 1998, two teams of astronomers studying distant supernovae made the remarkable discovery that the expansion of the universe is speeding up.

    The Nobel Prize in Physics 2011: Expansion of the Universe

    4 October 2011

    The Royal Swedish Academy of Sciences has decided to award the Nobel Prize in Physics for 2011

    with one half to

    Saul Perlmutter
    The Supernova Cosmology Project
    The DOE’s Lawrence Berkeley National Laboratory and The University of California-Berkeley,

    and the other half jointly to

    Brian P. Schmidt
    The High-z Supernova Search Team,
    The Australian National University, Weston Creek, Australia.

    and

    Adam G. Riess
    The High-z Supernova Search Team, The Johns Hopkins University and
    The Space Telescope Science Institute, Baltimore, MD.

    Written in the stars

    “Some say the world will end in fire, some say in ice…” *

    What will be the final destiny of the Universe? Probably it will end in ice, if we are to believe this year’s Nobel Laureates in Physics. They have studied several dozen exploding stars, called supernovae, and discovered that the Universe is expanding at an ever-accelerating rate. The discovery came as a complete surprise even to the Laureates themselves.

    In 1998, cosmology was shaken at its foundations as two research teams presented their findings. Headed by Saul Perlmutter, one of the teams had set to work in 1988. Brian Schmidt headed another team, launched at the end of 1994, where Adam Riess was to play a crucial role.

    The research teams raced to map the Universe by locating the most distant supernovae. More sophisticated telescopes on the ground and in space, as well as more powerful computers and new digital imaging sensors (CCD, Nobel Prize in Physics in 2009), opened the possibility in the 1990s to add more pieces to the cosmological puzzle.

    The teams used a particular kind of supernova, called the Type Ia supernova. It is an explosion of an old compact star that is as heavy as the Sun but as small as the Earth. A single such supernova can emit as much light as a whole galaxy. All in all, the two research teams found over 50 distant supernovae whose light was weaker than expected – this was a sign that the expansion of the Universe was accelerating. The potential pitfalls had been numerous, and the scientists found reassurance in the fact that both groups had reached the same astonishing conclusion.

    For almost a century, the Universe has been known to be expanding as a consequence of the Big Bang about 14 billion years ago. However, the discovery that this expansion is accelerating is astounding. If the expansion continues to speed up, the Universe will end in ice.

    The acceleration is thought to be driven by dark energy, but what that dark energy is remains an enigma – perhaps the greatest in physics today. What is known is that dark energy constitutes about three quarters of the Universe. Therefore the findings of the 2011 Nobel Laureates in Physics have helped to unveil a Universe that to a large extent is unknown to science. And everything is possible again.

    *Robert Frost, Fire and Ice, 1920
    ___________________________________________________________________
    To explain cosmic acceleration, cosmologists are faced with two possibilities: either 70% of the universe exists in an exotic form, now called Dark Energy, that exhibits a gravitational force opposite to the attractive gravity of ordinary matter, or Albert Einstein’s Theory of General Relativity must be replaced by a new theory of gravity on cosmic scales.

    The Dark Energy Survey is designed to probe the origin of the accelerating universe and help uncover the nature of Dark Energy by measuring the 14-billion-year history of cosmic expansion with high precision. More than 400 scientists from over 25 institutions in the United States, Spain, the United Kingdom, Brazil, Germany, Switzerland, and Australia are working on the project. The collaboration built and is using an extremely sensitive 570-Megapixel digital camera, DECam, mounted on the Blanco 4-meter telescope at Cerro Tololo Inter-American Observatory, high in the Chilean Andes, to carry out the project.

    Over six years (2013-2019), the Dark Energy Survey collaboration used 758 nights of observation to carry out a deep, wide-area survey to record information from 300 million galaxies that are billions of light-years from Earth. The survey imaged 5000 square degrees of the southern sky in five optical filters to obtain detailed information about each galaxy. A fraction of the survey time was used to observe smaller patches of sky roughly once a week to discover and study thousands of supernovae and other astrophysical transients.
    ___________________________________________________________________
    This is a very powerful telescope located on a mountain top in Chile. Over the years, it has surveyed about 1/8 of the sky and photographed over 300 million galaxies, many of which are so dim that they are about one-millionth as bright as the dimmest stars visible to the human eye. This telescope can image galaxies from the current day to as far back as eight billion years ago. Continuing with the analogy of a 50-year-old individual, DES can take pictures of the Universe starting when it was 21 years old up until the present. (Full disclosure: Researchers at Fermilab, where I also work, carried out this study — but I did not participate in this research.)

    As light from distant galaxies travels to Earth, it can be distorted by galaxies that are closer to us. By using these tiny distortions, astronomers have developed a very precise map of the distribution of matter in the cosmos. This map includes both ordinary matter, of which stars and galaxies are the most familiar examples, and dark matter, which is a hypothesized form of matter that neither absorbs nor emits light. Dark matter is only observed through its gravitational effect on other objects and is thought to be five times more prevalent than ordinary matter.
    __________________________________
    Dark Matter Background
    Fritz Zwicky discovered dark matter in the 1930s while observing the motion of the Coma Cluster. Some 30 years later, Vera Rubin, a woman in STEM denied the Nobel Prize, did much of the work that established dark matter.

    Fritz Zwicky.

    Coma cluster via NASA/ESA Hubble, the original example of Dark Matter discovered during observations by Fritz Zwicky and confirmed 30 years later by Vera Rubin.

    In modern times, it was astronomer Fritz Zwicky, in the 1930s, who made the first observations of what we now call dark matter. His 1933 observations of the Coma Cluster of galaxies seemed to indicate that it had a mass 500 times greater than that previously calculated by Edwin Hubble. Furthermore, this extra mass seemed to be completely invisible. Although Zwicky’s observations were initially met with much skepticism, they were later confirmed by other groups of astronomers.

    Thirty years later, astronomer Vera Rubin provided a huge piece of evidence for the existence of dark matter. She discovered that stars in the outskirts of galaxies orbit just as fast as stars near the center, whereas, if the visible matter were all there is, the outer stars should move more slowly, the way the outer planets of the Solar System do. The only way to explain these flat rotation curves is if the visible galaxy sits inside some much larger structure of unseen mass, as if the luminous disk were only the label at the center of a vinyl LP, with the surrounding dark matter keeping the rotation speed consistent from center to edge.

    Vera Rubin, following Zwicky, postulated that the missing structure in galaxies is dark matter. Her ideas were met with much resistance from the astronomical community, but her observations have been confirmed and are seen today as pivotal proof of the existence of dark matter.

    Astronomer Vera Rubin at the Lowell Observatory in 1965. Credit: The Carnegie Institution for Science.

    Vera Rubin, with Department of Terrestrial Magnetism (DTM) image tube spectrograph attached to the Kitt Peak 84-inch telescope, 1970.

    Vera Rubin measuring spectra. Credit: Emilio Segrè Visual Archives/AIP/SPL.

    Dark Matter Research

    Super Cryogenic Dark Matter Search from DOE’s SLAC National Accelerator Laboratory at Stanford University at SNOLAB (Vale Inco Mine, Sudbury, Canada).

    LBNL LZ Dark Matter Experiment xenon detector at Sanford Underground Research Facility Credit: Matt Kapust.


    DAMA at Gran Sasso uses sodium iodide housed in copper to hunt for dark matter LNGS-INFN.

    Yale HAYSTAC axion dark matter experiment at Yale’s Wright Lab.

    DEAP Dark Matter detector, The DEAP-3600, suspended in the SNOLAB (CA) deep in Sudbury’s Creighton Mine.

    The LBNL LZ Dark Matter Experiment at SURF, Lead, SD.

    DAMA-LIBRA Dark Matter experiment at the Italian National Institute for Nuclear Physics’ (INFN’s) Gran Sasso National Laboratories (LNGS) located in the Abruzzo region of central Italy.

    DARWIN Dark Matter experiment. A design study for a next-generation, multi-ton dark matter detector in Europe at The University of Zurich [Universität Zürich](CH).

    PandaX II Dark Matter experiment at Jin-ping Underground Laboratory (CJPL) in Sichuan, China.

    Inside the Axion Dark Matter eXperiment at the University of Washington. Credit: Mark Stone/University of Washington.

    The University of Western Australia ORGAN Experiment’s main detector. A small copper cylinder called a “resonant cavity” traps photons generated during dark matter conversion. The cylinder is bolted to a “dilution refrigerator” which cools the experiment to very low temperatures.
    __________________________________

    Is the Big Bang incomplete?

    In order to test the Big Bang, astronomers can use measurements taken by the South Pole Telescope and use the theory to project forward to the present day. They can then take measurements from the Dark Energy Survey and compare them. If the measurements are accurate and the theory describes the cosmos, they should agree.

    And, by and large, they do — but not completely. When astronomers look at how “clumpy” the matter of the current Universe should be, purely from SPT measurements and extrapolations of theory, they find that the predictions are “clumpier” than current measurements by DES.

    This disagreement is potentially significant and could signal that the theory of the Big Bang is incomplete. Furthermore, this isn’t the first discrepancy that astronomers have encountered when they project measurements of the same primordial light imaged by the SPT to the modern day. Different research groups, using different telescopes, have found that the current Universe is expanding faster than expected from observations of the ancient light seen by the SPT, combined with Big Bang theory. This other discrepancy is called the Hubble Tension, named after American astronomer Edwin Hubble, who first realized that the Universe was expanding.

    __________________________________________________________________________________

    Edwin Hubble


    __________________________________________________________________________________


    Have astronomers disproved the Big Bang?

    While the new discrepancy in predictions and measurements of the clumpiness of the Universe is preliminary, it could be that both this measurement and the Hubble Tension imply that the Big Bang theory might need some tweaking. Mind you, the discrepancies do not rise to the level of scrapping the theory entirely; however, it is the nature of the scientific method to adjust theories to account for new observations.

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 8:58 am on October 20, 2022
    Tags: "Pantheon+", "Pantheon+" also cements a major disagreement over the pace of that expansion that has yet to be solved., "Pantheon+" further closes the door on alternative frameworks accounting for dark energy and dark matter., "The Most Precise Accounting Yet of Dark Energy and Dark Matter", , , , , Dark Energy, , G299 was left over by a particular class of supernovas called a Type Ia., , , , The current best theories for dark energy and dark matter hold strong., , The most distant supernovae in the dataset gleam forth from 10.7 billion light years away., The new "Pantheon+" analysis holds that 66.2 percent of the universe manifests as dark energy with the remaining 33.8 percent being a combination of dark matter and matter.,   

    From The Harvard-Smithsonian Center for Astrophysics: “The Most Precise Accounting Yet of Dark Energy and Dark Matter” 

    From The Harvard-Smithsonian Center for Astrophysics

    10.19.22
    Media Contact:
    Nadia Whitehead
    Public Affairs Officer
    Center for Astrophysics | Harvard & Smithsonian
    nadia.whitehead@cfa.harvard.edu
    617-721-7371

    Analyzing more than two decades’ worth of supernova explosions convincingly bolsters modern cosmological theories and reinvigorates efforts to answer fundamental questions.

    G299 was left over by a particular class of supernovas called Type Ia. Credit: NASA/CXC/University of Texas.

    Astrophysicists have performed a powerful new analysis that places the most precise limits yet on the composition and evolution of the universe. With this analysis, dubbed “Pantheon+”, cosmologists find themselves at a crossroads.

    “Pantheon+” convincingly finds that the cosmos is composed of about two-thirds dark energy and one-third matter — mostly in the form of dark matter — and has been expanding at an accelerating pace over the last several billion years. However, “Pantheon+” also cements a major disagreement over the pace of that expansion that has yet to be solved.

    By putting prevailing modern cosmological theories, known as the Standard Model of Cosmology, on even firmer evidentiary and statistical footing, “Pantheon+” further closes the door on alternative frameworks accounting for dark energy and dark matter. Both are bedrocks of the Standard Model of Cosmology but have yet to be directly detected and rank among the model’s biggest mysteries. Following through on the results of “Pantheon+”, researchers can now pursue more precise observational tests and hone explanations for the ostensible cosmos.

    “With these “Pantheon+” results, we are able to put the most precise constraints on the dynamics and history of the universe to date,” says Dillon Brout, an Einstein Fellow at the Center for Astrophysics | Harvard & Smithsonian. “We’ve combed over the data and can now say with more confidence than ever before how the universe has evolved over the eons and that the current best theories for dark energy and dark matter hold strong.”

    Brout is the lead author of a series of papers describing the new “Pantheon+” analysis, published jointly today in a special issue of The Astrophysical Journal [below].

    “Pantheon+” is based on the largest dataset of its kind, comprising more than 1,500 stellar explosions called “Type Ia supernovae”. These bright blasts occur when white dwarf stars — remnants of stars like our Sun — accumulate too much mass and undergo a runaway thermonuclear reaction. Because “Type Ia supernovae” outshine entire galaxies, the stellar detonations can be glimpsed at distances exceeding 10 billion light years, or back through about three-quarters of the universe’s total age. Given that the supernovae blaze with nearly uniform intrinsic brightnesses, scientists can use the explosions’ apparent brightness, which diminishes with distance, along with redshift measurements as markers of time and space.
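
    To make the “known wattage” idea concrete, here is a minimal sketch (my own, not from the papers) of how a roughly uniform peak brightness turns an observed magnitude into a distance. It assumes the commonly quoted peak absolute magnitude of about -19.3 for a Type Ia and uses the standard distance-modulus relation:

```python
M_TYPE_IA = -19.3   # approximate peak absolute magnitude of a Type Ia (assumed)

def distance_parsecs(apparent_mag: float, absolute_mag: float = M_TYPE_IA) -> float:
    """Distance from the distance modulus m - M = 5*log10(d / 10 pc)."""
    return 10 ** ((apparent_mag - absolute_mag + 5) / 5)

# A supernova peaking at apparent magnitude 24 lies at (luminosity distance,
# ignoring cosmological corrections):
d_pc = distance_parsecs(24.0)
print(f"{d_pc:.2e} pc = {d_pc * 3.26156 / 1e9:.1f} billion light-years")
```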

    That information, in turn, reveals how fast the universe expands during different epochs, which is then used to test theories of the fundamental components of the universe.

    The breakthrough discovery in 1998 of the universe’s accelerating growth was thanks to a study of “Type Ia supernovae” in this manner.

    Scientists attribute the expansion to an invisible energy, therefore monikered dark energy, inherent to the fabric of the universe itself. Subsequent decades of work have continued to compile ever-larger datasets, revealing supernovae across an even wider range of space and time, and Pantheon+ has now brought them together into the most statistically robust analysis to date.

    “In many ways, this latest “Pantheon+” analysis is a culmination of more than two decades’ worth of diligent efforts by observers and theorists worldwide in deciphering the essence of the cosmos,” says Adam Riess, one of the winners of the 2011 Nobel Prize in Physics for the discovery of the accelerating expansion of the universe and the Bloomberg Distinguished Professor at Johns Hopkins University (JHU) and the Space Telescope Science Institute in Baltimore, Maryland. Riess is also an alum of Harvard University, holding a PhD in astrophysics.

    Brout’s own career in cosmology traces back to his undergraduate years at JHU, where he was taught and advised by Riess. There Brout worked with then-PhD-student and Riess-advisee Dan Scolnic, who is now an assistant professor of physics at Duke University and another co-author on the new series of papers.

    Several years ago, Scolnic developed the original Pantheon analysis of approximately 1,000 supernovae.

    Now, Brout and Scolnic and their new “Pantheon+” team have added some 50 percent more supernovae data points in “Pantheon+”, coupled with improvements in analysis techniques and addressing potential sources of error, which ultimately has yielded twice the precision of the original Pantheon.

    “This leap in both the dataset quality and in our understanding of the physics that underpin it would not have been possible without a stellar team of students and collaborators working diligently to improve every facet of the analysis,” says Brout.

    Taking the data as a whole, the new analysis holds that 66.2 percent of the universe manifests as dark energy, with the remaining 33.8 percent being a combination of dark matter and matter. To arrive at even more comprehensive understanding of the constituent components of the universe at different epochs, Brout and colleagues combined “Pantheon+” with other strongly evidenced, independent and complementary measures of the large-scale structure of the universe and with measurements from the earliest light in the universe, the cosmic microwave background [CMB].

    Another key “Pantheon+” result relates to one of the paramount goals of modern cosmology: nailing down the current expansion rate of the universe, known as the “Hubble constant”. Pooling the “Pantheon+” sample with data from the “SH0ES” (Supernova H0 for the Equation of State) collaboration, led by Riess, results in the most stringent local measurement of the current expansion rate of the universe.

    “Pantheon+” and “SH0ES” together find a “Hubble constant” of 73.4 kilometers per second per megaparsec with only 1.3% uncertainty. Stated another way, for every megaparsec, or 3.26 million light years, the analysis estimates that in the nearby universe, space itself is expanding at more than 160,000 miles per hour.
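
    The miles-per-hour figure is just a unit conversion of the quoted “Hubble constant”; a quick check (my own arithmetic):

```python
H0 = 73.4                 # km/s per megaparsec (Pantheon+ and SH0ES)
KM_PER_MILE = 1.609344

mph_per_mpc = H0 / KM_PER_MILE * 3600   # miles per hour, per megaparsec
print(f"{mph_per_mpc:,.0f} mph per megaparsec")   # ~164,000 mph
```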

    However, observations from an entirely different epoch of the universe’s history predict a different story. Measurements of the universe’s earliest light, the cosmic microwave background [CMB], when combined with the current Standard Model of Cosmology, consistently peg the “Hubble constant” at a rate that is significantly less than observations taken via “Type Ia supernovae” and other astrophysical markers. This sizable discrepancy between the two methodologies has been termed the “Hubble tension”.

    The new “Pantheon+” and “SH0ES” datasets heighten this “Hubble tension”. In fact, the tension has now passed the important 5σ threshold (about one-in-a-million odds of arising due to random chance) that physicists use to distinguish between possible statistical flukes and something that must accordingly be understood. Reaching this new statistical level highlights the challenge for both theorists and astrophysicists to try to explain the “Hubble constant” discrepancy.
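
    For reference, the odds attached to 5σ follow from the Gaussian tail probability; “about one-in-a-million” is the usual round figure, as this short calculation (my own, using only the standard normal distribution) shows:

```python
import math

def one_sided_tail(sigma: float) -> float:
    """P(X > sigma) for a standard normal variable, via the complementary error function."""
    return 0.5 * math.erfc(sigma / math.sqrt(2))

p = one_sided_tail(5.0)
print(f"5-sigma one-sided p = {p:.2e}, about 1 in {1 / p:,.0f}")
# ~2.9e-7, i.e. roughly 1 in 3.5 million.
```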

    “We thought it would be possible to find clues to a novel solution to these problems in our dataset, but instead we’re finding that our data rules out many of these options and that the profound discrepancies remain as stubborn as ever,” says Brout.

    The “Pantheon+” results could help point to where the solution to the “Hubble tension” lies. “Many recent theories have begun pointing to exotic new physics in the very early universe, however such unverified theories must withstand the scientific process and the “Hubble tension” continues to be a major challenge,” says Brout.

    Overall, “Pantheon+” offers scientists a comprehensive lookback through much of cosmic history. The earliest, most distant supernovae in the dataset gleam forth from 10.7 billion light years away, meaning from when the universe was roughly a quarter of its current age. In that earlier era, dark matter and its associated gravity held the universe’s expansion rate in check. That state of affairs changed dramatically over the next several billion years as the influence of dark energy overwhelmed that of dark matter. Dark energy has since flung the contents of the cosmos ever-farther apart and at an ever-increasing rate.
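
    A small flat-ΛCDM calculation (my own sketch, using the article’s 66.2/33.8 split and the standard assumption that matter density scales as (1+z)³ while dark energy stays constant) locates that changeover:

```python
OMEGA_L, OMEGA_M = 0.662, 0.338   # Pantheon+ dark energy / matter fractions

# Matter density grows into the past as (1+z)^3; dark energy stays constant.
z_equal = (OMEGA_L / OMEGA_M) ** (1 / 3) - 1       # densities equal
z_accel = (2 * OMEGA_L / OMEGA_M) ** (1 / 3) - 1   # deceleration turns to acceleration

print(f"densities equal at z ~ {z_equal:.2f}; acceleration begins near z ~ {z_accel:.2f}")
# Both redshifts correspond to lookback times of several billion years.
```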

    “With this combined “Pantheon+” dataset, we get a precise view of the universe from the time when it was dominated by dark matter to when the universe became dominated by dark energy,” says Brout. “This dataset is a unique opportunity to see dark energy turn on and drive the evolution of the cosmos on the grandest scales up through present time.”

    Studying this changeover now with even stronger statistical evidence will hopefully lead to new insights into dark energy’s enigmatic nature.

    “‘Pantheon+’ is giving us our best chance to date of constraining dark energy, its origins, and its evolution,” says Brout.

    Science paper compilation:
    The Astrophysical Journal

    See the full article here.


    Please help promote STEM in your local schools.


    Stem Education Coalition

    The Harvard-Smithsonian Center for Astrophysics combines the resources and research facilities of the Harvard College Observatory and the Smithsonian Astrophysical Observatory under a single director to pursue studies of those basic physical processes that determine the nature and evolution of the universe. The Smithsonian Astrophysical Observatory is a bureau of the Smithsonian Institution, founded in 1890. The Harvard College Observatory, founded in 1839, is a research institution of the Faculty of Arts and Sciences, Harvard University, and provides facilities and substantial other support for teaching activities of the Department of Astronomy.

    Founded in 1973 and headquartered in Cambridge, Massachusetts, the CfA leads a broad program of research in astronomy, astrophysics, Earth and space sciences, as well as science education. The CfA either leads or participates in the development and operations of more than fifteen ground- and space-based astronomical research observatories across the electromagnetic spectrum, including the forthcoming Giant Magellan Telescope(CL) and the Chandra X-ray Observatory, one of NASA’s Great Observatories.

    Giant Magellan Telescope (CL), 21 meters, to be located at the Carnegie Institution for Science’s Las Campanas Observatory (CL), some 115 km (71 mi) north-northeast of La Serena, Chile, over 2,500 m (8,200 ft) high.

    National Aeronautics and Space Administration Chandra X-ray telescope.

    Hosting more than 850 scientists, engineers, and support staff, the CfA is among the largest astronomical research institutes in the world. Its projects have included Nobel Prize-winning advances in cosmology and high energy astrophysics, the discovery of many exoplanets, and the first image of a black hole. The CfA also serves a major role in the global astrophysics research community: the CfA’s Astrophysics Data System, for example, has been universally adopted as the world’s online database of astronomy and physics papers. Known for most of its history as the “Harvard-Smithsonian Center for Astrophysics”, the CfA rebranded in 2018 to its current name in an effort to reflect its unique status as a joint collaboration between Harvard University and the Smithsonian Institution. The CfA’s current Director (since 2004) is Charles R. Alcock, who succeeds Irwin I. Shapiro (Director from 1982 to 2004) and George B. Field (Director from 1973 to 1982).

    The Center for Astrophysics | Harvard & Smithsonian is not formally an independent legal organization, but rather an institutional entity operated under a Memorandum of Understanding between Harvard University and the Smithsonian Institution. This collaboration was formalized on July 1, 1973, with the goal of coordinating the related research activities of the Harvard College Observatory (HCO) and the Smithsonian Astrophysical Observatory (SAO) under the leadership of a single Director, and housed within the same complex of buildings on the Harvard campus in Cambridge, Massachusetts. The CfA’s history is therefore also that of the two fully independent organizations that comprise it. With a combined lifetime of more than 300 years, HCO and SAO have been host to major milestones in astronomical history that predate the CfA’s founding.

    History of the Smithsonian Astrophysical Observatory (SAO)

    Samuel Pierpont Langley, the third Secretary of the Smithsonian, founded the Smithsonian Astrophysical Observatory on the south yard of the Smithsonian Castle (on the U.S. National Mall) on March 1, 1890. The Astrophysical Observatory’s initial, primary purpose was to “record the amount and character of the Sun’s heat”. Charles Greeley Abbot was named SAO’s first director, and the observatory operated solar telescopes to take daily measurements of the Sun’s intensity in different regions of the optical electromagnetic spectrum. In doing so, the observatory enabled Abbot to make critical refinements to the Solar constant, as well as to serendipitously discover Solar variability. It is likely that SAO’s early history as a solar observatory was part of the inspiration behind the Smithsonian’s “sunburst” logo, designed in 1965 by Crimilda Pontes.

    In 1955, the scientific headquarters of SAO moved from Washington, D.C. to Cambridge, Massachusetts to affiliate with the Harvard College Observatory (HCO). Fred Lawrence Whipple, then the chairman of the Harvard Astronomy Department, was named the new director of SAO. The collaborative relationship between SAO and HCO therefore predates the official creation of the CfA by 18 years. SAO’s move to Harvard’s campus also resulted in a rapid expansion of its research program. Following the launch of Sputnik (the world’s first human-made satellite) in 1957, SAO accepted a national challenge to create a worldwide satellite-tracking network, collaborating with the United States Air Force on Project Space Track.

    With the creation of the National Aeronautics and Space Administration the following year and throughout the space race, SAO led major efforts in the development of orbiting observatories and large ground-based telescopes, laboratory and theoretical astrophysics, as well as the application of computers to astrophysical problems.

    History of Harvard College Observatory (HCO)

    Partly in response to renewed public interest in astronomy following the 1835 return of Halley’s Comet, the Harvard College Observatory was founded in 1839, when the Harvard Corporation appointed William Cranch Bond as an “Astronomical Observer to the University”. For its first four years of operation, the observatory was situated at the Dana-Palmer House (where Bond also resided) near Harvard Yard, and consisted of little more than three small telescopes and an astronomical clock. In his 1840 book recounting the history of the college, then Harvard President Josiah Quincy III noted that “…there is wanted a reflecting telescope equatorially mounted…”. This telescope, the 15-inch “Great Refractor”, opened seven years later (in 1847) at the top of Observatory Hill in Cambridge (where it still exists today, housed in the oldest of the CfA’s complex of buildings). The telescope was the largest in the United States from 1847 until 1867. William Bond and pioneer photographer John Adams Whipple used the Great Refractor to produce the first clear daguerreotypes of the Moon (winning them an award at the 1851 Great Exhibition in London). Bond and his son, George Phillips Bond (the second Director of HCO), used it to discover Saturn’s 8th moon, Hyperion (which was also independently discovered by William Lassell).

    Under the directorship of Edward Charles Pickering from 1877 to 1919, the observatory became the world’s major producer of stellar spectra and magnitudes, established an observing station in Peru, and applied mass-production methods to the analysis of data. It was during this time that HCO became host to a series of major discoveries in astronomical history, powered by the Observatory’s so-called “Computers” (women hired by Pickering as skilled workers to process astronomical data). These “Computers” included Williamina Fleming; Annie Jump Cannon; Henrietta Swan Leavitt; Florence Cushman; and Antonia Maury, all widely recognized today as major figures in scientific history. Henrietta Swan Leavitt, for example, discovered the so-called period-luminosity relation for Classical Cepheid variable stars, establishing the first major “standard candle” with which to measure the distance to galaxies. Now called “Leavitt’s Law”, the discovery is regarded as one of the most foundational and important in the history of astronomy; astronomers like Edwin Hubble, for example, would later use Leavitt’s Law to establish that the Universe is expanding, the primary piece of evidence for the Big Bang model.

    Upon Pickering’s retirement in 1921, the Directorship of HCO fell to Harlow Shapley (a major participant in the so-called “Great Debate” of 1920). This era of the observatory was made famous by the work of Cecilia Payne-Gaposchkin, who became the first woman to earn a Ph.D. in astronomy from Radcliffe College (a short walk from the Observatory). Payne-Gaposchkin’s 1925 thesis proposed that stars were composed primarily of hydrogen and helium, an idea thought ridiculous at the time. Between Shapley’s tenure and the formation of the CfA, the observatory was directed by Donald H. Menzel and then Leo Goldberg, both of whom maintained widely recognized programs in solar and stellar astrophysics. Menzel played a major role in encouraging the Smithsonian Astrophysical Observatory to move to Cambridge and collaborate more closely with HCO.

    Joint history as the Center for Astrophysics (CfA)

    The collaborative foundation for what would ultimately give rise to the Center for Astrophysics began with SAO’s move to Cambridge in 1955. Fred Whipple, who was already chair of the Harvard Astronomy Department (housed within HCO since 1931), was named SAO’s new director at the start of this new era; an early test of the model for a unified Directorship across HCO and SAO. The following 18 years would see the two independent entities merge ever closer together, operating effectively (but informally) as one large research center.

    This joint relationship was formalized as the new Harvard–Smithsonian Center for Astrophysics on July 1, 1973. George B. Field, then affiliated with the University of California-Berkeley, was appointed as its first Director. That same year, a new astronomical journal, the CfA Preprint Series, was created, and a CfA/SAO instrument flying aboard Skylab discovered coronal holes on the Sun. The founding of the CfA also coincided with the birth of X-ray astronomy as a new, major field that was largely dominated by CfA scientists in its early years. Riccardo Giacconi, regarded as the “father of X-ray astronomy”, founded the High Energy Astrophysics Division within the new CfA by moving most of his research group (then at American Science and Engineering) to SAO in 1973. That group would later go on to launch the Einstein Observatory (the first imaging X-ray telescope) in 1976, and ultimately lead the proposals and development of what would become the Chandra X-ray Observatory. Chandra, the second of NASA’s Great Observatories and still the most powerful X-ray telescope in history, continues operations today as part of the CfA’s Chandra X-ray Center. Giacconi would later win the 2002 Nobel Prize in Physics for his foundational work in X-ray astronomy.

    Shortly after the launch of the Einstein Observatory, the CfA’s Steven Weinberg won the 1979 Nobel Prize in Physics for his work on electroweak unification. The following decade saw the start of the landmark CfA Redshift Survey (the first attempt to map the large scale structure of the Universe), as well as the release of the Field Report, a highly influential Astronomy & Astrophysics Decadal Survey chaired by the outgoing CfA Director George Field. He would be replaced in 1982 by Irwin Shapiro, who during his tenure as Director (1982 to 2004) oversaw the expansion of the CfA’s observing facilities around the world.

    Harvard Smithsonian Center for Astrophysics Fred Lawrence Whipple Observatory located near Amado, Arizona on the slopes of Mount Hopkins, Altitude 2,606 m (8,550 ft)

    European Space Agency [La Agencia Espacial Europea] [Agence spatiale européenne] [Europäische Weltraumorganization] (EU)/National Aeronautics and Space Administration SOHO satellite. Launched in 1995.

    National Aeronautics and Space Administration Kepler Space Telescope.

    CfA-led discoveries throughout this period include canonical work on Supernova 1987A, the “CfA2 Great Wall” (then the largest known coherent structure in the Universe), the best-yet evidence for supermassive black holes, and the first convincing evidence for an extrasolar planet.

    The 1990s also saw the CfA unwittingly play a major role in the history of computer science and the internet: in 1990, SAO developed SAOImage, one of the world’s first X11-based applications made publicly available (its successor, DS9, remains the most widely used astronomical FITS image viewer worldwide). During this time, scientists at the CfA also began work on what would become the Astrophysics Data System (ADS), one of the world’s first online databases of research papers. By 1993, the ADS was running the first routine transatlantic queries between databases, a foundational aspect of the internet today.

    The CfA Today

    Research at the CfA

    Charles Alcock, known for a number of major works related to massive compact halo objects, was named the third director of the CfA in 2004. Today Alcock oversees one of the largest and most productive astronomical institutes in the world, with more than 850 staff and an annual budget in excess of $100M. The Harvard Department of Astronomy, housed within the CfA, maintains a continual complement of approximately 60 Ph.D. students, more than 100 postdoctoral researchers, and roughly 25 undergraduate majors in astronomy and astrophysics from Harvard College. SAO, meanwhile, hosts a long-running and highly rated REU Summer Intern program as well as many visiting graduate students. The CfA estimates that roughly 10% of the professional astrophysics community in the United States spent at least a portion of their career or education there.

    The CfA is either a lead or major partner in the operations of the Fred Lawrence Whipple Observatory, the Submillimeter Array, MMT Observatory, the South Pole Telescope, VERITAS, and a number of other smaller ground-based telescopes. The CfA’s 2019-2024 Strategic Plan includes the construction of the Giant Magellan Telescope as a driving priority for the Center.

    CFA Harvard Smithsonian Submillimeter Array on Mauna Kea, Hawai’i, Altitude 4,205 m (13,796 ft).

    South Pole Telescope SPTPOL. The SPT collaboration is made up of over a dozen (mostly North American) institutions, including The University of Chicago ; The University of California-Berkeley ; Case Western Reserve University; Harvard/Smithsonian Astrophysical Observatory; The University of Colorado- Boulder; McGill (CA) University, The University of Illinois, Urbana-Champaign; The University of California- Davis; Ludwig Maximilians Universität München(DE); DOE’s Argonne National Laboratory; and The National Institute for Standards and Technology.

    Along with the Chandra X-ray Observatory, the CfA plays a central role in a number of space-based observing facilities, including the recently launched Parker Solar Probe, Kepler Space Telescope, the Solar Dynamics Observatory (SDO), and HINODE. The CfA, via the Smithsonian Astrophysical Observatory, recently played a major role in the Lynx X-ray Observatory, a NASA-Funded Large Mission Concept Study commissioned as part of the 2020 Decadal Survey on Astronomy and Astrophysics (“Astro2020”). If launched, Lynx would be the most powerful X-ray observatory constructed to date, enabling order-of-magnitude advances in capability over Chandra.

    NASA Parker Solar Probe, named to honor pioneering physicist Eugene Parker. Credit: The Johns Hopkins University Applied Physics Lab.

    National Aeronautics and Space Administration Solar Dynamics Observatory.

    Japan Aerospace Exploration Agency (JAXA) [国立研究開発法人宇宙航空研究開発機構] (JP)/National Aeronautics and Space Administration HINODE spacecraft.

    SAO is one of the 13 stakeholder institutes for the Event Horizon Telescope Board, and the CfA hosts its Array Operations Center. In 2019, the project revealed the first direct image of a black hole.

    Messier 87*, the first image of the event horizon of a black hole. This is the supermassive black hole at the center of the galaxy Messier 87. Image via The Event Horizon Telescope Collaboration, released on 10 April 2019 via the National Science Foundation.

    The result is widely regarded as a triumph not only of observational radio astronomy, but of its intersection with theoretical astrophysics. Union of the observational and theoretical subfields of astrophysics has been a major focus of the CfA since its founding.

    In 2018, the CfA rebranded, changing its official name to the “Center for Astrophysics | Harvard & Smithsonian” in an effort to reflect its unique status as a joint collaboration between Harvard University and the Smithsonian Institution. Today, the CfA receives roughly 70% of its funding from NASA, 22% from Smithsonian federal funds, and 4% from the National Science Foundation. The remaining 4% comes from contributors including the United States Department of Energy, the Annenberg Foundation, as well as other gifts and endowments.

     
  • richardmitnick 9:13 am on September 24, 2022
    Tags: "Star Light Star Bright … But Exactly How Bright?", , , , , Dark Energy, , , Type 1A supernovae   

    From The National Institute of Standards and Technology: “Star Light Star Bright … But Exactly How Bright?” 

    From The National Institute of Standards and Technology

    9.22.22

    Technical Contacts

    Susana Deustua
    susana.deustua@nist.gov
    (301) 975-3763

    John T. Woodward IV
    john.woodward@nist.gov
    (301) 975-5495

    NIST researcher John Woodward with the four-inch telescope used to calibrate the luminosity of nearby stars.
    Credit: C. Suplee/NIST.

    Astronomers use the brightness of a type of exploding star known as a Type Ia supernova (seen here as a bright blue dot to the left of a remote spiral galaxy) to determine the age and expansion rate of the universe. New calibrations of the luminosity of nearby stars, observed by NIST researchers, could help astronomers refine their measurements.
    Credit: J. DePasquale (STScI), M. Kornmesser and M. Zamani (ESA/Hubble), A. Riess (STScI/JHU), NASA, ESA, the SH0ES team, and the Digitized Sky Survey.

    The four-inch telescope on Mt. Hopkins in Arizona. Credit: J. Woodward/NIST.

    Side view of the telescope undergoing testing in the laboratory. Credit: C. Suplee/NIST.

    A picture may be worth a thousand words, but for astronomers, simply recording images of stars and galaxies isn’t enough. To measure the true size and absolute brightness (luminosity) of heavenly bodies, astronomers need to accurately gauge the distance to these objects. To do so, the researchers rely on “standard candles” – stars whose luminosities are so well known that they act like light bulbs of known wattage.

    One way to determine a star’s distance from Earth is to compare how bright the star appears in the sky to its luminosity.
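
    That comparison is the inverse-square law: the flux F received at distance d from a source of luminosity L is F = L / (4πd²). A minimal sketch (my own; the Sun’s luminosity and the flux at Earth are standard values, not from the article) inverts it for distance:

```python
import math

def distance_m(luminosity_w: float, flux_w_m2: float) -> float:
    """Invert the inverse-square law F = L / (4*pi*d^2) for the distance d."""
    return math.sqrt(luminosity_w / (4 * math.pi * flux_w_m2))

# Sanity check with the Sun: L ~ 3.828e26 W, measured flux at Earth ~ 1361 W/m^2.
d = distance_m(3.828e26, 1361.0)
print(f"{d:.3e} m (about 1 astronomical unit)")
```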

    But even standard candles need to be calibrated. For more than a decade, scientists at the National Institute of Standards and Technology (NIST) have been working to improve the methods for calibrating standard stars. They observed two nearby bright stars, Vega and Sirius, in order to calibrate their luminosity over a range of visible-light wavelengths. The researchers are now completing their analysis and plan to release the calibration data to astronomers within the next 12 months.

    The calibration data could aid astronomers who use more distant standard candles – exploded stars known as type Ia supernovas – to determine the age and expansion rate of the universe. (Comparing the brightness of remote type Ia supernovas to nearby ones led to the Nobel-Prize-winning discovery that the expansion of the universe is not slowing down, as expected, but is actually speeding up.)


    Astronomers may be able to use the NIST calibrations of Vega and Sirius to better compare the brightness of nearby and faraway type Ia supernovas, leading to more accurate measurements of the expansion of the universe and its age.

    In the ongoing NIST study, scientists observe the two nearby stars with a four-inch telescope they designed and placed atop Mount Hopkins in the desert of southern Arizona.

    John Woodward, Susana Deustua, and their colleagues have repeatedly observed the spectra, or colors, of light emitted by Vega (25 light-years away) and Sirius (8.6 light-years). One light-year, the distance that light travels through a vacuum in one year, is 9.46 trillion kilometers.
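
    The 9.46-trillion-kilometer figure follows directly from that definition (a one-line check of my own):

```python
C_KM_S = 299_792.458        # speed of light, km/s
YEAR_S = 365.25 * 86_400    # Julian year, seconds

print(f"1 light-year = {C_KM_S * YEAR_S:.3e} km")   # ~9.461e12 km
```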

    At the beginning and end of each observing night, the researchers tilt the telescope downwards so that they can compare the stellar spectra to that of an artificial star: a quartz lamp, placed 100 meters from the telescope, whose luminosity has been exactly measured.

    Before the scientists can directly make the comparisons, they must account for the effect of Earth’s atmosphere, which scatters and absorbs some of the starlight before it can reach the telescope. Although light from the ground-based lamp does not travel through the full depth of the atmosphere, some of it is scattered by air during its short, horizontal journey to the telescope.

    To assess how much of the ground-based light is scattered, the NIST team measures the ratio of the power emitted by a helium-neon laser at its output to the power received 100 m away, at the site of the lamp.

    To determine how much starlight is lost to the Earth’s atmosphere, the researchers record the amount of starlight reaching the telescope as it points in different directions, peering through different thicknesses of the atmosphere during the night. Changes in the amount of light recorded by the telescope as the night progresses allow astronomers to correct for the atmospheric absorption.
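
    The procedure described in the last two paragraphs is, in essence, the classic Langley (Bouguer) extrapolation: observed magnitudes dim roughly linearly with airmass X = sec(zenith angle), so fitting a line and reading off its X = 0 intercept recovers the above-atmosphere magnitude. Here is a sketch with invented numbers (the article gives none), folding the lamp’s horizontal-path loss into a single measured transmission factor:

```python
import numpy as np

# Hypothetical instrumental magnitudes of one star over a night, taken at
# increasing airmass X = sec(zenith angle). All values are invented.
X = np.array([1.0, 1.3, 1.7, 2.2, 2.9])
m_obs = np.array([5.02, 5.08, 5.16, 5.26, 5.40])

# Langley/Bouguer model: m_obs = m0 + k*X, where k is the extinction
# coefficient (magnitudes per airmass) and m0 is the magnitude the
# telescope would record above the atmosphere (X = 0).
k, m0 = np.polyfit(X, m_obs, 1)
print(f"extinction k = {k:.3f} mag/airmass; top-of-atmosphere m0 = {m0:.3f}")

# Lamp comparison: the helium-neon laser measurement gives the fraction of
# light surviving the 100 m horizontal path, used to correct the lamp flux.
T_HORIZONTAL = 0.97                       # invented transmission factor
lamp_flux_true = 1.0e-9 / T_HORIZONTAL    # W/m^2, corrected (invented flux)
```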

    Once Vega and Sirius are calibrated, astronomers can use those stars as steppingstones to calibrate the light from other stars. For instance, by using the same telescope, researchers can observe a set of slightly fainter stars—call them Set 2. The luminosity of those fainter stars can then be calibrated using Vega and Sirius as reference standards.

    Switching to a telescope large enough to observe both the newly calibrated Set 2, and a group of even fainter stars (call them Set 3), astronomers can calibrate the light from Set 3 in terms of Set 2. Astronomers can repeat the process as needed to calibrate light from extremely remote stars. In this way, astronomers will be able to transfer the NIST calibration of Vega and Sirius to stars that lie thousands to millions of light-years away.
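
    The stepping-stone transfer amounts to multiplying measured flux ratios down the chain, which is also why each extra step inflates the final uncertainty. A schematic with invented numbers:

```python
# Calibration propagates as a product of measured ratios (all values invented).
vega_flux = 3.5e-8   # assumed calibrated flux of the reference star, W/m^2
r2 = 1.2e-3          # Set 2 / Vega, measured with the small telescope
r3 = 8.0e-4          # Set 3 / Set 2, measured with a larger telescope

set3_flux = vega_flux * r2 * r3
print(f"inferred Set 3 reference flux: {set3_flux:.2e} W/m^2")
# The relative uncertainty of each ratio accumulates (in quadrature) at every
# step, so fewer stepping-stones yield a tighter final calibration.
```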

    Next year, Deustua and Woodward will move their small telescope, now back at NIST, to the European Southern Observatory’s (ESO’s) Paranal Observatory in the high-altitude desert of northern Chile.

    With a drier climate than Mt. Hopkins, the Chilean site promises more clear nights to observe Sirius and Vega and less moisture to absorb or scatter the light. The telescope will reside on a mountaintop away from ESO’s Very Large Telescope, a suite of four 8.2-m telescopes and four 1.8-m auxiliary telescopes, so that the light from NIST’s quartz lamp won’t interfere with observations of distant galaxies.

    The team also plans to expand its repertoire of bright nearby stars to include Arcturus (37 light-years), Gamma Crucis (89 light-years), and Gamma Trianguli Australis (184 light-years) and to observe stars at longer, infrared wavelengths. The recently launched James Webb Space Telescope and the Roman Space Telescope, set for launch by the end of the decade, are designed to examine the universe at these wavelengths.

    The NIST researchers recently received seed money to build a larger telescope that could observe and calibrate fainter, more distant stars. That would allow astronomers to transfer the NIST calibration to remote standard candles more directly. Reducing the number of steppingstones between the stars observed by NIST and the stars astronomers are studying reduces calibration errors.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    NIST Campus, Gaithersburg, MD.

    The National Institute of Standards and Technology‘s Mission, Vision, Core Competencies, and Core Values

    Mission

    To promote U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology in ways that enhance economic security and improve our quality of life.

    NIST’s vision

    NIST will be the world’s leader in creating critical measurement solutions and promoting equitable standards. Our efforts stimulate innovation, foster industrial competitiveness, and improve the quality of life.

    NIST’s core competencies

    Measurement science
    Rigorous traceability
    Development and use of standards

    NIST’s core values

    NIST is an organization with strong values, reflected both in our history and our current work. NIST leadership and staff will uphold these values to ensure a high performing environment that is safe and respectful of all.

    Perseverance: We take the long view, planning the future with scientific knowledge and imagination to ensure continued impact and relevance for our stakeholders.
    Integrity: We are ethical, honest, independent, and provide an objective perspective.
    Inclusivity: We work collaboratively to harness the diversity of people and ideas, both inside and outside of NIST, to attain the best solutions to multidisciplinary challenges.
    Excellence: We apply rigor and critical thinking to achieve world-class results and continuous improvement in everything we do.

    Background

    The Articles of Confederation, ratified by the colonies in 1781, contained the clause, “The United States in Congress assembled shall also have the sole and exclusive right and power of regulating the alloy and value of coin struck by their own authority, or by that of the respective states—fixing the standards of weights and measures throughout the United States”. Article 1, section 8, of the Constitution of the United States (1789), transferred this power to Congress; “The Congress shall have power…To coin money, regulate the value thereof, and of foreign coin, and fix the standard of weights and measures”.

    In January 1790, President George Washington, in his first annual message to Congress, stated that “Uniformity in the currency, weights, and measures of the United States is an object of great importance, and will, I am persuaded, be duly attended to”, and ordered Secretary of State Thomas Jefferson to prepare a plan for Establishing Uniformity in the Coinage, Weights, and Measures of the United States, afterwards referred to as the Jefferson report. On October 25, 1791, Washington appealed a third time to Congress: “A uniformity of the weights and measures of the country is among the important objects submitted to you by the Constitution and if it can be derived from a standard at once invariable and universal, must be no less honorable to the public council than conducive to the public convenience”, but it was not until 1838 that a uniform set of standards was worked out. In 1821, John Quincy Adams had declared, “Weights and measures may be ranked among the necessities of life to every individual of human society”.

    From 1830 until 1901, the role of overseeing weights and measures was carried out by the Office of Standard Weights and Measures, which was part of the U.S. Coast and Geodetic Survey in the Department of the Treasury.

    Bureau of Standards

    In 1901, in response to a bill proposed by Congressman James H. Southard (R-Ohio), the National Bureau of Standards was founded with the mandate to provide standard weights and measures and to serve as the national physical laboratory for the United States. (Southard had previously sponsored a bill for metric conversion of the United States.)

    President Theodore Roosevelt appointed Samuel W. Stratton as the first director. The budget for the first year of operation was $40,000. The Bureau took custody of the copies of the kilogram and meter bars that were the standards for US measures, and set up a program to provide metrology services for United States scientific and commercial users. A laboratory site was constructed in Washington, DC (US), and instruments were acquired from the national physical laboratories of Europe. In addition to weights and measures, the Bureau developed instruments for electrical units and for measurement of light. In 1905 a meeting was called that would become the first National Conference on Weights and Measures.

    Initially conceived as purely a metrology agency, the Bureau of Standards was directed by Herbert Hoover to set up divisions to develop commercial standards for materials and products. Some of these standards were for products intended for government use, but product standards also affected private-sector consumption. Quality standards were developed for products including some types of clothing, automobile brake systems and headlamps, antifreeze, and electrical safety. During World War I, the Bureau worked on multiple problems related to war production, even operating its own facility to produce optical glass when European supplies were cut off. Between the wars, Harry Diamond of the Bureau developed a blind-approach radio aircraft landing system. During World War II, military research and development was carried out, including development of radio propagation forecast methods; the proximity fuze and the standardized airframe used originally for Project Pigeon; and, shortly afterwards, the autonomously radar-guided Bat anti-ship guided bomb and the Kingfisher family of torpedo-carrying missiles.

    In 1948, financed by the United States Air Force, the Bureau began design and construction of SEAC: the Standards Eastern Automatic Computer. The computer went into operation in May 1950 using a combination of vacuum tubes and solid-state diode logic. About the same time, the Standards Western Automatic Computer (SWAC) was built at the Los Angeles office of the NBS by Harry Huskey and used for research there. A mobile version, DYSEAC, was built for the Signal Corps in 1954.

    Due to a changing mission, the National Bureau of Standards became the National Institute of Standards and Technology in 1988.

    Following September 11, 2001, NIST conducted the official investigation into the collapse of the World Trade Center buildings.

    Organization

    NIST is headquartered in Gaithersburg, Maryland, and operates a facility in Boulder, Colorado, which was dedicated by President Eisenhower in 1954. NIST’s activities are organized into laboratory programs and extramural programs. Effective October 1, 2010, NIST was realigned by reducing the number of NIST laboratory units from ten to six. NIST Laboratories include:

    Communications Technology Laboratory (CTL)
    Engineering Laboratory (EL)
    Information Technology Laboratory (ITL)
    Center for Neutron Research (NCNR)
    Material Measurement Laboratory (MML)
    Physical Measurement Laboratory (PML)

    Extramural programs include:

    Hollings Manufacturing Extension Partnership (MEP), a nationwide network of centers to assist small and mid-sized manufacturers to create and retain jobs, improve efficiencies, and minimize waste through process improvements and to increase market penetration with innovation and growth strategies;
    Technology Innovation Program (TIP), a grant program where NIST and industry partners cost share the early-stage development of innovative but high-risk technologies;
    Baldrige Performance Excellence Program, which administers the Malcolm Baldrige National Quality Award, the nation’s highest award for performance and business excellence.

    NIST’s Boulder laboratories are best known for NIST-F1, a cesium fountain atomic clock that serves as the source of the nation’s official time.

    From its measurement of the natural resonance frequency of cesium—which defines the second—NIST broadcasts time signals via the longwave radio station WWVB near Fort Collins, Colorado, and the shortwave radio stations WWV and WWVH, located near Fort Collins and at Kekaha, Hawai’i, respectively.
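    For a sense of scale (a back-of-the-envelope sketch of ours, not NIST documentation): the SI second is defined as exactly 9,192,631,770 cycles of that cesium transition, so a cycle count converts directly into elapsed time:

```python
# The SI second is defined as exactly 9 192 631 770 cycles of the
# cesium-133 hyperfine transition that NIST-F1 measures.
CS_HZ = 9_192_631_770

def elapsed_seconds(cycle_count: int) -> float:
    """Elapsed time implied by a count of cesium transition cycles."""
    return cycle_count / CS_HZ

# One full day's worth of cycles:
print(elapsed_seconds(CS_HZ * 86_400))  # 86400.0 seconds
```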

    NIST also operates a neutron science user facility: the NIST Center for Neutron Research (NCNR).

    The NCNR provides scientists access to a variety of neutron scattering instruments which they use in many research fields (materials science; fuel cells; biotechnology etc.).

    The SURF III Synchrotron Ultraviolet Radiation Facility is a source of synchrotron radiation in continuous operation since 1961.

    SURF III now serves as the US national standard for source-based radiometry throughout the generalized optical spectrum. All NASA-borne extreme-ultraviolet observation instruments have been calibrated at SURF since the 1970s, and SURF is used for measurement and characterization of systems for extreme ultraviolet lithography.

    The Center for Nanoscale Science and Technology performs research in nanotechnology, both through internal research efforts and by running a user-accessible cleanroom nanomanufacturing facility.

    This “NanoFab” is equipped with tools for lithographic patterning and imaging (e.g., electron microscopes and atomic force microscopes).

    Committees

    NIST has seven standing committees:

    Technical Guidelines Development Committee (TGDC)
    Advisory Committee on Earthquake Hazards Reduction (ACEHR)
    National Construction Safety Team Advisory Committee (NCST Advisory Committee)
    Information Security and Privacy Advisory Board (ISPAB)
    Visiting Committee on Advanced Technology (VCAT)
    Board of Overseers for the Malcolm Baldrige National Quality Award (MBNQA Board of Overseers)
    Manufacturing Extension Partnership National Advisory Board (MEPNAB)

    Measurements and standards

    As part of its mission, NIST supplies industry, academia, government, and other users with over 1,300 Standard Reference Materials (SRMs). These artifacts are certified as having specific characteristics or component content and are used as calibration standards for measuring equipment and procedures, as quality-control benchmarks for industrial processes, and as experimental control samples.

    Handbook 44

    NIST publishes Handbook 44 each year after the annual meeting of the National Conference on Weights and Measures (NCWM). Each edition is developed through cooperation of the Committee on Specifications and Tolerances of the NCWM and the Weights and Measures Division (WMD) of NIST. The purpose of the book is partial fulfillment of the statutory responsibility for “cooperation with the states in securing uniformity of weights and measures laws and methods of inspection”.

    NIST has been publishing various forms of what is now the Handbook 44 since 1918 and began publication under the current name in 1949. The 2010 edition conforms to the concept of the primary use of the SI (metric) measurements recommended by the Omnibus Foreign Trade and Competitiveness Act of 1988.

     
  • richardmitnick 10:28 am on September 7, 2022 Permalink | Reply
    Tags: "'Lopsided' Universe could mean revision of standard cosmological model", , , , , Dark Energy, , ,   

    From The University of Oxford (UK): “‘Lopsided’ Universe could mean revision of Standard Cosmological Model – ΛCDM Model of Cosmology” 

    U Oxford bloc

    From The University of Oxford (UK)

    9.7.22

    1

    Dr Sebastian von Hausegger and Professor Subir Sarkar from the Rudolf Peierls Centre for Theoretical Physics at Oxford, together with their collaborators Dr Nathan Secrest (US Naval Observatory, Washington), Dr Roya Mohayaee (Institut d’Astrophysique, Paris) and Dr Mohamed Rameez (Tata Institute of Fundamental Research, Mumbai), have made a surprising discovery about the Universe. Their paper is in press in The Astrophysical Journal Letters [below].

    The researchers used observations of over a million quasars and half a million radio sources to test the ‘cosmological principle’, which underlies modern cosmology. It says that, when averaged on large scales, the Universe is isotropic and homogeneous. This allows a simple mathematical description of space-time – the Friedmann-Lemaître-Robertson-Walker (FLRW) metric – which enormously simplifies the application of Albert Einstein’s General Theory of Relativity to the Universe as a whole, thus yielding the “standard cosmological model”. Interpretation of observational data in the framework of this model has, however, led to the astounding conclusion that about 70% of the Universe is in the form of a mysterious “dark energy” which is causing its expansion rate to accelerate.

    ___________________________________________________________________
    The Dark Energy Survey

    Dark Energy Camera [DECam] built at The DOE’s Fermi National Accelerator Laboratory.

    NOIRLab National Optical Astronomy Observatory Cerro Tololo Inter-American Observatory (CL) Victor M Blanco 4m Telescope which houses the Dark-Energy-Camera – DECam at Cerro Tololo, Chile at an altitude of 7200 feet.

    NOIRLabNSF NOIRLab NOAO Cerro Tololo Inter-American Observatory(CL) approximately 80 km to the East of La Serena, Chile, at an altitude of 2200 meters.

    The Dark Energy Survey is an international, collaborative effort to map hundreds of millions of galaxies, detect thousands of supernovae, and find patterns of cosmic structure that will reveal the nature of the mysterious dark energy that is accelerating the expansion of our Universe. The Dark Energy Survey began searching the Southern skies on August 31, 2013.

    According to Albert Einstein’s Theory of General Relativity, gravity should lead to a slowing of the cosmic expansion. Yet, in 1998, two teams of astronomers studying distant supernovae made the remarkable discovery that the expansion of the universe is speeding up.
    Saul Perlmutter (center) [The Supernova Cosmology Project] shared the 2006 Shaw Prize in Astronomy, the 2011 Nobel Prize in Physics, and the 2015 Breakthrough Prize in Fundamental Physics with Brian P. Schmidt (right) and Adam Riess (left) [The High-z Supernova Search Team] for providing evidence that the expansion of the universe is accelerating.

    To explain cosmic acceleration, cosmologists are faced with two possibilities: either 70% of the universe exists in an exotic form, now called Dark Energy, that exhibits a gravitational force opposite to the attractive gravity of ordinary matter, or General Relativity must be replaced by a new theory of gravity on cosmic scales.

    The Dark Energy Survey is designed to probe the origin of the accelerating universe and help uncover the nature of Dark Energy by measuring the 14-billion-year history of cosmic expansion with high precision. More than 400 scientists from over 25 institutions in the United States, Spain, the United Kingdom, Brazil, Germany, Switzerland, and Australia are working on the project. The collaboration built and is using an extremely sensitive 570-Megapixel digital camera, DECam, mounted on the Blanco 4-meter telescope at Cerro Tololo Inter-American Observatory, high in the Chilean Andes, to carry out the project.

    Over six years (2013-2019), the Dark Energy Survey collaboration used 758 nights of observation to carry out a deep, wide-area survey to record information from 300 million galaxies that are billions of light-years from Earth. The survey imaged 5000 square degrees of the southern sky in five optical filters to obtain detailed information about each galaxy. A fraction of the survey time is used to observe smaller patches of sky roughly once a week to discover and study thousands of supernovae and other astrophysical transients.
    ___________________________________________________________________
    This has been interpreted as arising from the zero-point fluctuations of the quantum vacuum, with the associated energy scale set by H₀, the present rate of expansion of the universe. However, this is quite inexplicable in the successful Standard Model (quantum field theory) of fundamental interactions, the characteristic energy scale of which is higher by a factor of 10^44. So, while the standard cosmological model (called ΛCDM) describes the observational data well, its main component, dark energy, has no physical basis.

    Testing foundational assumptions

    This is what motivated the researchers to re-examine its underlying assumptions. Professor Sarkar says: “When the foundations of today’s standard cosmological model were laid a hundred years ago, there was no data. We didn’t even know then that we live in a galaxy – just one among a hundred billion others. Now that we do have data, we can, and should, test these foundational assumptions since a lot rests on them – in particular the inference that dark energy dominates the Universe.”

    In fact, the Universe today is manifestly not homogeneous and isotropic. Astronomical surveys reveal a filamentary structure of galaxies, clusters of galaxies, and superclusters of clusters, and this ‘cosmic web’ extends to the largest scales currently probed, about 2 billion light-years.

    The conventional wisdom is that, while clumpy on small scales, the distribution of matter becomes homogeneous when averaged on scales larger than about 300 million light-years. The Hubble expansion is smooth and isotropic on large scales, while on small scales the gravitational effect of inhomogeneities gives rise to ‘peculiar’ velocities; e.g., our nearest neighbor, the Andromeda galaxy, is not receding in the Hubble flow – rather, it is falling towards us.

    Back in 1966, the cosmologist Dennis Sciama noted that because of this, the cosmic microwave background (CMB) radiation from the Big Bang could not be uniform on the sky.

    It must exhibit a ‘dipole anisotropy’, i.e., appear hotter in the direction of our local motion and colder in the opposite direction. This was indeed found soon afterwards and is attributed to our motion at about 370 km/s towards a particular direction (in the constellation of Crater). Accordingly, a special relativistic ‘boost’ is applied to all cosmological data (redshifts, apparent magnitudes, etc.) to transform them to the reference frame in which the universe is isotropic, since it is in this ‘cosmic rest frame’ that the Friedmann-Lemaître equations of the standard cosmological model hold. Application of these equations to the corrected data then indicates that the Hubble expansion rate is accelerating, as if driven by Einstein’s Cosmological Constant Λ, aka dark energy.
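    As a quick sanity check (our arithmetic, not from the article), the kinematic dipole has amplitude ΔT ≈ (v/c) × T₀, which for v ≈ 370 km/s and the mean CMB temperature T₀ ≈ 2.725 K reproduces the few-millikelvin dipole that is actually measured:

```python
# Kinematic CMB dipole: to first order in v/c, the sky temperature
# is modulated by dT = (v/c) * T0 along the direction of motion.
C = 299_792_458.0   # speed of light, m/s
T0 = 2.725          # mean CMB temperature, K
v = 370e3           # our speed relative to the CMB rest frame, m/s

dT = (v / C) * T0
print(f"dipole amplitude ~ {dT*1e3:.2f} mK")  # ~3.36 mK
```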

    The cosmological principle

    How can we check if this is true? If the dipole anisotropy in the CMB is due to our motion, then there must be a similar dipole in the sky distribution of all cosmologically distant sources. This is due to ‘aberration’ because of the finite speed of light – as was recognized by Oxford astronomer James Bradley in 1727, long before Albert Einstein’s formulation of the Special Theory of Relativity which predicts this effect. Such sources were first identified with radio telescopes; the relativist George Ellis and radio astronomer John Baldwin noted in 1984 that with a uniform sky map of at least a few hundred thousand such sources, this dipole could be measured and compared with the standard expectation. It was not however until this millennium that the first such data became available – the NRAO VLA Sky Survey (NVSS) catalogue of radio sources.
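    For concreteness, the Ellis–Baldwin expectation for the source-count dipole can be written as D = [2 + x(1 + α)] v/c, where x is the power-law slope of the integral source counts and α the mean spectral index of the sources. A sketch with values often assumed for radio sources (the numbers here are illustrative, not taken from the paper):

```python
# Ellis & Baldwin (1984) kinematic dipole expected in source counts:
#   D = [2 + x * (1 + alpha)] * v / c
# x     : slope of the integral source counts, N(>S) ~ S^-x
# alpha : mean spectral index of the sources, S ~ nu^-alpha
C = 299_792.458  # speed of light, km/s

def kinematic_dipole(v_kms, x, alpha):
    """Expected dipole amplitude in the sky density of distant sources."""
    return (2 + x * (1 + alpha)) * v_kms / C

# Commonly assumed values for radio sources (illustrative only):
print(kinematic_dipole(370.0, x=1.0, alpha=0.75))  # ~0.0046, i.e. ~0.46%
```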

    The dipole amplitude turned out to be higher than expected, although its direction was consistent with that of the CMB. However, the uncertainties were large, so the significance of the discrepancy was not compelling. Two years ago, the present team of researchers upped the stakes by analyzing a bigger catalogue of 1.4 million quasars mapped by NASA’s Wide-field Infrared Survey Explorer (WISE).

    They found a similar discrepancy but at much higher significance. Dr von Hausegger comments: “If distant sources are not isotropic in the rest frame in which the CMB is isotropic, it implies a violation of the cosmological principle … which means going back to square one! So, we must now seek corroborating evidence to understand what causes this unexpected result.”

    In their recent paper, the researchers have addressed this by performing a joint analysis of the NVSS and WISE catalogues, after carrying out various detailed checks to demonstrate their suitability for the purpose. These catalogues are systematically independent and have almost no shared objects, so this is equivalent to performing two independent experiments. The dipoles in the two catalogues, made at widely different wavelengths, are found to be consistent with each other. The consistency of the two dipoles improves upon boosting to the frame in which the CMB is isotropic (assuming its dipole to be kinematic in origin), which suggests that cosmologically distant radio galaxies and quasars may have an intrinsic anisotropy in this frame. The joint significance of the discrepancy between the rest frames of radiation and matter now exceeds 5σ (i.e., a probability of less than 1 in 3.5 million of being a fluke). “This issue can no longer be ignored,” comments Professor Sarkar. “The validity of the FLRW metric itself is now in question!”
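    The quoted odds are just the Gaussian tail probability of a 5σ fluctuation; a quick check (our arithmetic, not the paper's code):

```python
import math

def one_sided_p(sigma):
    """One-sided Gaussian tail probability for a given significance."""
    return 0.5 * math.erfc(sigma / math.sqrt(2.0))

p = one_sided_p(5.0)
print(p)        # ~2.87e-07
print(1.0 / p)  # ~3.5 million to one
```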

    Potential paradigm-changing finding

    New data with which to check this potentially paradigm-changing finding will soon come from the Legacy Survey of Space and Time (LSST) to be carried out at the Vera C Rubin Observatory in Chile.

    Oxford Physics is closely involved in this project, along with many other institutions in the UK and all over the world. Professor Ian Shipsey, who has been a member of LSST since 2008, is excited about the prospect of carrying out fundamental cosmological tests. ‘As a particle physicist, I am acutely aware that the foundations of the Standard Model of particle physics are constantly under scrutiny.

    One of the reasons I joined LSST, and have worked for so long on it, is precisely to enable powerful tests of the foundations of the standard cosmological model,’ he says. To this end, Dr von Hausegger and Professor Sarkar are leading projects in the LSST Dark Energy Science Collaboration to use the forthcoming data to test the homogeneity and isotropy of the Universe. ‘We will soon know if the standard cosmological model and the inference of dark energy are indeed valid,’ concludes Professor Sarkar.

    Science paper:
    The Astrophysical Journal Letters

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    U Oxford campus

    The University of Oxford

    1
    Universitas Oxoniensis

    The University of Oxford [a.k.a. The Chancellor, Masters and Scholars of the University of Oxford] is a collegiate research university in Oxford, England. There is evidence of teaching as early as 1096, making it the oldest university in the English-speaking world and the world’s second-oldest university in continuous operation. It grew rapidly from 1167 when Henry II banned English students from attending the University of Paris [Université de Paris] (FR). After disputes between students and Oxford townsfolk in 1209, some academics fled north-east to Cambridge where they established what became the University of Cambridge (UK). The two English ancient universities share many common features and are jointly referred to as Oxbridge.

    The university is made up of thirty-nine semi-autonomous constituent colleges, six permanent private halls, and a range of academic departments which are organized into four divisions. All the colleges are self-governing institutions within the university, each controlling its own membership and with its own internal structure and activities. All students are members of a college. It does not have a main campus, and its buildings and facilities are scattered throughout the city centre. Undergraduate teaching at Oxford consists of lectures, small-group tutorials at the colleges and halls, seminars, laboratory work and occasionally further tutorials provided by the central university faculties and departments. Postgraduate teaching is provided predominantly centrally.

    Oxford operates the world’s oldest university museum, as well as the largest university press in the world and the largest academic library system nationwide. In the fiscal year ending 31 July 2019, the university had a total income of £2.45 billion, of which £624.8 million was from research grants and contracts.

    Oxford has educated a wide range of notable alumni, including 28 prime ministers of the United Kingdom and many heads of state and government around the world. As of October 2020, 72 Nobel Prize laureates, 3 Fields Medalists, and 6 Turing Award winners have studied, worked, or held visiting fellowships at the University of Oxford, while its alumni have won 160 Olympic medals. Oxford is the home of numerous scholarships, including the Rhodes Scholarship, one of the oldest international graduate scholarship programmes.

    The University of Oxford’s foundation date is unknown. It is known that teaching at Oxford existed in some form as early as 1096, but it is unclear when a university came into being.

    It grew quickly from 1167 when English students returned from The University of Paris-Sorbonne [Université de Paris-Sorbonne](FR). The historian Gerald of Wales lectured to such scholars in 1188, and the first known foreign scholar, Emo of Friesland, arrived in 1190. The head of the university had the title of chancellor from at least 1201, and the masters were recognized as a universitas or corporation in 1231. The university was granted a royal charter in 1248 during the reign of King Henry III.

    The students associated together on the basis of geographical origins, into two ‘nations’, representing the North (northerners or Boreales, who included the English people from north of the River Trent and the Scots) and the South (southerners or Australes, who included English people from south of the Trent, the Irish and the Welsh). In later centuries, geographical origins continued to influence many students’ affiliations when membership of a college or hall became customary in Oxford. In addition, members of many religious orders, including Dominicans, Franciscans, Carmelites and Augustinians, settled in Oxford in the mid-13th century, gained influence and maintained houses or halls for students. At about the same time, private benefactors established colleges as self-contained scholarly communities. Among the earliest such founders were William of Durham, who in 1249 endowed University College, and John Balliol, father of a future King of Scots; Balliol College bears his name. Another founder, Walter de Merton, a Lord Chancellor of England and afterwards Bishop of Rochester, devised a series of regulations for college life. Merton College thereby became the model for such establishments at Oxford, as well as at the University of Cambridge. Thereafter, an increasing number of students lived in colleges rather than in halls and religious houses.

    In 1333–1334, an attempt by some dissatisfied Oxford scholars to found a new university at Stamford, Lincolnshire, was blocked by the universities of Oxford and Cambridge petitioning King Edward III. Thereafter, until the 1820s, no new universities were allowed to be founded in England, even in London; thus, Oxford and Cambridge had a duopoly, which was unusual in large western European countries.

    The new learning of the Renaissance greatly influenced Oxford from the late 15th century onwards. Among university scholars of the period were William Grocyn, who contributed to the revival of Greek language studies, and John Colet, the noted biblical scholar.

    With the English Reformation and the breaking of communion with the Roman Catholic Church, recusant scholars from Oxford fled to continental Europe, settling especially at the University of Douai. The method of teaching at Oxford was transformed from the medieval scholastic method to Renaissance education, although institutions associated with the university suffered losses of land and revenues. As a centre of learning and scholarship, Oxford’s reputation declined in the Age of Enlightenment; enrollments fell and teaching was neglected.

    In 1636, William Laud, the chancellor and Archbishop of Canterbury, codified the university’s statutes. These, to a large extent, remained its governing regulations until the mid-19th century. Laud was also responsible for the granting of a charter securing privileges for The University Press, and he made significant contributions to the Bodleian Library, the main library of the university. From the beginnings of the Church of England as the established church until 1866, membership of the church was a requirement to receive the BA degree from the university and “dissenters” were only permitted to receive the MA in 1871.

    The university was a centre of the Royalist party during the English Civil War (1642–1649), while the town favored the opposing Parliamentarian cause. From the mid-18th century onwards, however, the university took little part in political conflicts.

    Wadham College, founded in 1610, was the undergraduate college of Sir Christopher Wren. Wren was part of a brilliant group of experimental scientists at Oxford in the 1650s, the Oxford Philosophical Club, which included Robert Boyle and Robert Hooke. This group held regular meetings at Wadham under the guidance of the college’s Warden, John Wilkins, and the group formed the nucleus that went on to found the Royal Society.

    Before reforms in the early 19th century, the curriculum at Oxford was notoriously narrow and impractical. Sir Spencer Walpole, a historian of contemporary Britain and a senior government official, had not attended any university. He said, “Few medical men, few solicitors, few persons intended for commerce or trade, ever dreamed of passing through a university career.” He quoted the Oxford University Commissioners in 1852 stating: “The education imparted at Oxford was not such as to conduce to the advancement in life of many persons, except those intended for the ministry.” Nevertheless, Walpole argued:

    “Among the many deficiencies attending a university education there was, however, one good thing about it, and that was the education which the undergraduates gave themselves. It was impossible to collect some thousand or twelve hundred of the best young men in England, to give them the opportunity of making acquaintance with one another, and full liberty to live their lives in their own way, without evolving in the best among them, some admirable qualities of loyalty, independence, and self-control. If the average undergraduate carried from university little or no learning, which was of any service to him, he carried from it a knowledge of men and respect for his fellows and himself, a reverence for the past, a code of honor for the present, which could not but be serviceable. He had enjoyed opportunities… of intercourse with men, some of whom were certain to rise to the highest places in the Senate, in the Church, or at the Bar. He might have mixed with them in his sports, in his studies, and perhaps in his debating society; and any associations which he had thus formed had been useful to him at the time, and might be a source of satisfaction to him in after life.”

    Out of the students who matriculated in 1840, 65% were sons of professionals (34% were Anglican ministers). After graduation, 87% became professionals (59% as Anglican clergy). Out of the students who matriculated in 1870, 59% were sons of professionals (25% were Anglican ministers). After graduation, 87% became professionals (42% as Anglican clergy).

    M. C. Curthoys and H. S. Jones argue that the rise of organized sport was one of the most remarkable and distinctive features of the history of the universities of Oxford and Cambridge in the late 19th and early 20th centuries. It was carried over from the athleticism prevalent at the public schools such as Eton, Winchester, Shrewsbury, and Harrow.

    All students, regardless of their chosen area of study, were required to spend (at least) their first year preparing for a first-year examination that was heavily focused on classical languages. Science students found this particularly burdensome and supported a separate science degree with Greek language study removed from their required courses. This concept of a Bachelor of Science had been adopted at other European universities (The University of London (UK) had implemented it in 1860) but an 1880 proposal at Oxford to replace the classical requirement with a modern language (like German or French) was unsuccessful. After considerable internal wrangling over the structure of the arts curriculum, in 1886 the “natural science preliminary” was recognized as a qualifying part of the first-year examination.

    At the start of 1914, the university housed about 3,000 undergraduates and about 100 postgraduate students. During the First World War, many undergraduates and fellows joined the armed forces. By 1918 virtually all fellows were in uniform, and the student population in residence was reduced to 12 per cent of the pre-war total. The University Roll of Service records that, in total, 14,792 members of the university served in the war, with 2,716 (18.36%) killed. Not all the members of the university who served in the Great War were on the Allied side; there is a remarkable memorial to members of New College who served in the German armed forces, bearing the inscription, ‘In memory of the men of this college who coming from a foreign land entered into the inheritance of this place and returning fought and died for their country in the war 1914–1918’. During the war years the university buildings became hospitals, cadet schools and military training camps.

    Reforms

    Two parliamentary commissions in 1852 issued recommendations for Oxford and Cambridge. Archibald Campbell Tait, former headmaster of Rugby School, was a key member of the Oxford Commission; he wanted Oxford to follow the German and Scottish model in which the professorship was paramount. The commission’s report envisioned a centralized university run predominantly by professors and faculties, with a much stronger emphasis on research. The professional staff should be strengthened and better paid. For students, restrictions on entry should be dropped, and more opportunities given to poorer families. It called for an enlargement of the curriculum, with honors to be awarded in many new fields. Undergraduate scholarships should be open to all Britons. Graduate fellowships should be opened up to all members of the university. It recommended that fellows be released from an obligation for ordination. Students were to be allowed to save money by boarding in the city, instead of in a college.

    The system of separate honor schools for different subjects began in 1802, with Mathematics and Literae Humaniores. Schools of “Natural Sciences” and “Law, and Modern History” were added in 1853. By 1872, the last of these had split into “Jurisprudence” and “Modern History”. Theology became the sixth honor school. In addition to these B.A. Honors degrees, the postgraduate Bachelor of Civil Law (B.C.L.) was, and still is, offered.

    The mid-19th century saw the impact of the Oxford Movement (1833–1845), led among others by the future Cardinal John Henry Newman. The influence of the reformed model of German universities reached Oxford via key scholars such as Edward Bouverie Pusey, Benjamin Jowett and Max Müller.

    Administrative reforms during the 19th century included the replacement of oral examinations with written entrance tests, greater tolerance for religious dissent, and the establishment of four women’s colleges. Privy Council decisions in the 20th century (e.g. the abolition of compulsory daily worship, dissociation of the Regius Professorship of Hebrew from clerical status, diversion of colleges’ theological bequests to other purposes) loosened the link with traditional belief and practice. Furthermore, although the university’s emphasis had historically been on classical knowledge, its curriculum expanded during the 19th century to include scientific and medical studies. Knowledge of Ancient Greek was required for admission until 1920, and Latin until 1960.

    The University of Oxford began to award doctorates for research in the first third of the 20th century. The first Oxford D.Phil. in mathematics was awarded in 1921.

    The mid-20th century saw many distinguished continental scholars, displaced by Nazism and communism, relocating to Oxford.

    The list of distinguished scholars at the University of Oxford is long and includes many who have made major contributions to politics, the sciences, medicine, and literature. As of October 2020, 72 Nobel laureates and more than 50 world leaders have been affiliated with the University of Oxford.

    To be a member of the university, all students, and most academic staff, must also be a member of a college or hall. There are thirty-nine colleges of the University of Oxford (including Reuben College, planned to admit students in 2021) and six permanent private halls (PPHs), each controlling its membership and with its own internal structure and activities. Not all colleges offer all courses, but they generally cover a broad range of subjects.

    The colleges are:

    All Souls College
    Balliol College
    Brasenose College
    Christ Church
    Corpus Christi College
    Exeter College
    Green Templeton College
    Harris Manchester College
    Hertford College
    Jesus College
    Keble College
    Kellogg College
    Lady Margaret Hall
    Linacre College
    Lincoln College
    Magdalen College
    Mansfield College
    Merton College
    New College
    Nuffield College
    Oriel College
    Pembroke College
    The Queen's College
    Reuben College
    St Anne's College
    St Antony's College
    St Catherine's College
    St Cross College
    St Edmund Hall
    St Hilda's College
    St Hugh's College
    St John's College
    St Peter's College
    Somerville College
    Trinity College
    University College
    Wadham College
    Wolfson College
    Worcester College

    The permanent private halls were founded by different Christian denominations. One difference between a college and a PPH is that whereas colleges are governed by the fellows of the college, the governance of a PPH resides, at least in part, with the corresponding Christian denomination. The six current PPHs are:

    Blackfriars
    Campion Hall
    Regent’s Park College
    St Benet’s Hall
    St Stephen's House
    Wycliffe Hall

    The PPHs and colleges join as the Conference of Colleges, which represents the common concerns of the several colleges of the university, to discuss matters of shared interest and to act collectively when necessary, such as in dealings with the central university. The Conference of Colleges was established as a recommendation of the Franks Commission in 1965.

    Teaching members of the colleges (i.e., fellows and tutors) are collectively and familiarly known as dons, although the term is rarely used by the university itself. In addition to residential and dining facilities, the colleges provide social, cultural, and recreational activities for their members. Colleges have responsibility for admitting undergraduates and organizing their tuition; for graduates, this responsibility falls upon the departments. There is no common title for the heads of colleges: the titles used include Warden, Provost, Principal, President, Rector, Master and Dean.

    Oxford is regularly ranked within the top 5 universities in the world and is currently ranked first in the world in the Times Higher Education World University Rankings, as well as in Forbes’s World University Rankings. It held the number one position in The Times Good University Guide for eleven consecutive years, and the medical school has also maintained first place in the “Clinical, Pre-Clinical & Health” table of The Times Higher Education World University Rankings for the past seven consecutive years. In 2021, it ranked sixth among the universities around the world by SCImago Institutions Rankings. The Times Higher Education has also recognised Oxford as one of the world’s “six super brands” in its World Reputation Rankings, along with The University of California-Berkeley, The University of Cambridge (UK), Harvard University, The Massachusetts Institute of Technology, and Stanford University. The university is fifth worldwide on the US News ranking. Its Saïd Business School came 13th in the world in The Financial Times Global MBA Ranking.
    Oxford was ranked ninth in the world in 2015 by The Nature Index, which measures the largest contributors to papers published in 82 leading journals. It is ranked fifth best university worldwide and first in Britain for forming CEOs according to The Professional Ranking World Universities, and first in the UK for the quality of its graduates as chosen by the recruiters of the UK’s major companies.

    In the 2018 Complete University Guide, all 38 subjects offered by Oxford ranked within the top 10 nationally, meaning Oxford was one of only two multi-faculty universities (along with Cambridge) in the UK to have 100% of its subjects in the top 10. Computer Science, Medicine, Philosophy, Politics and Psychology were ranked first in the UK by the guide.

    According to The QS World University Rankings by Subject, the University of Oxford also ranks as number one in the world for four Humanities disciplines: English Language and Literature, Modern Languages, Geography, and History. It also ranks second globally for Anthropology, Archaeology, Law, Medicine, Politics & International Studies, and Psychology.

     
  • richardmitnick 11:22 am on July 10, 2022 Permalink | Reply
    Tags: , "Do you see new physics in my CMB?", "ΛCDM": Lamda Cold Dark Matter Accerated Expansion of The universe, , , Can You See Dark Matter and Dark Energy?, cosmic birefringence, , Dark Energy, , , ,   

    From astrobites : “Do you see new physics in my CMB?” 

    Astrobites bloc

    From astrobites

    Jul 9, 2022
    Kayla Kornoelje

    Title: New physics from the polarised light of the cosmic microwave background
    Authors: Eiichiro Komatsu
    First Author’s Institution: Max-Planck-Institut für Astrophysik, Karl-Schwarzschild Str. 1, 85741 Garching, Germany
    Status: Submitted to ArXiv [28 Feb 2022]

    Astronomers have painted an extraordinary picture of our Universe with the standard cosmological model, ΛCDM.

    The only problem is that astronomers don’t exactly know what ΛCDM really is. What is Dark Energy and Dark Matter? What is the physics behind Inflation? The answers to these fundamental questions in cosmology could be hidden right inside your T.V.

    ___________________________________________________________________

    Dark Matter Background
    Fritz Zwicky discovered dark matter in the 1930s when observing the movement of the Coma Cluster. Vera Rubin, a woman in STEM denied the Nobel Prize, did most of the work on dark matter some 30 years later.

    Fritz Zwicky.
    Coma cluster via NASA/ESA Hubble, the original example of Dark Matter discovered during observations by Fritz Zwicky and confirmed 30 years later by Vera Rubin.
    In modern times, it was astronomer Fritz Zwicky, in the 1930s, who made the first observations of what we now call dark matter. His 1933 observations of the Coma Cluster of galaxies seemed to indicate that it had a mass 500 times greater than that previously calculated by Edwin Hubble. Furthermore, this extra mass seemed to be completely invisible. Although Zwicky’s observations were initially met with much skepticism, they were later confirmed by other groups of astronomers.

    Thirty years later, astronomer Vera Rubin provided a huge piece of evidence for the existence of dark matter. She discovered that the outer regions of galaxies rotate just as fast as their centers, whereas, if the visible stars supplied all the gravity, the outskirts should orbit more slowly, with speeds falling off with distance as they do for planets around the Sun. Instead, galaxies rotate more like a rigid disc, a vinyl LP on a record deck, with a consistent rotation from center to edge. The only way to explain this is if the visible galaxy is merely the central part of some much larger structure, as if it were only the label on the LP, so to speak.
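    A minimal numerical sketch of the argument (ours, with made-up numbers): if essentially all of a galaxy's mass were the visible matter concentrated near its center, Newtonian gravity would give orbital speeds v(r) = √(GM/r), falling with radius, whereas real rotation curves stay roughly flat:

```python
import math

G = 6.674e-11      # gravitational constant, SI units
M_SUN = 1.989e30   # kg
KPC = 3.086e19     # meters per kiloparsec

def keplerian_speed(r_kpc, m_visible_suns):
    """Circular speed (km/s) if all the mass sits inside radius r."""
    r = r_kpc * KPC
    return math.sqrt(G * m_visible_suns * M_SUN / r) / 1e3

# Illustrative only: a galaxy with 1e11 solar masses of visible matter.
for r in (5, 10, 20, 40):
    print(r, "kpc:", round(keplerian_speed(r, 1e11)), "km/s")
# Predicted speeds fall like 1/sqrt(r); observed curves instead stay
# roughly flat, pointing to unseen (dark) mass at large radii.
```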

    Vera Rubin, following Zwicky, postulated that the missing structure in galaxies is dark matter. Her ideas were met with much resistance from the astronomical community, but her observations have been confirmed and are seen today as pivotal proof of the existence of dark matter.
    Astronomer Vera Rubin at the Lowell Observatory in 1965, worked on Dark Matter (The Carnegie Institution for Science).

    Vera Rubin, with Department of Terrestrial Magnetism (DTM) image tube spectrograph attached to the Kitt Peak 84-inch telescope, 1970.

    Vera Rubin measuring spectra, worked on Dark Matter(Emilio Segre Visual Archives AIP SPL).
    Dark Matter Research

    Super Cryogenic Dark Matter Search from DOE’s SLAC National Accelerator Laboratory (US) at Stanford University (US) at SNOLAB (Vale Inco Mine, Sudbury, Canada).

    LBNL LZ Dark Matter Experiment (US) xenon detector at Sanford Underground Research Facility(US) Credit: Matt Kapust.

    Lambda Cold Dark Matter Accelerated Expansion of the Universe. http://www.scinotions.com. Credit: Alex Mittelmann.

    DAMA at Gran Sasso uses sodium iodide housed in copper to hunt for dark matter LNGS-INFN.

    Yale HAYSTAC axion dark matter experiment at Yale’s Wright Lab.

    DEAP Dark Matter detector, The DEAP-3600, suspended in the SNOLAB (CA) deep in Sudbury’s Creighton Mine.

    The LBNL LZ Dark Matter Experiment (US) Dark Matter project at SURF, Lead, SD, USA.

    DAMA-LIBRA Dark Matter experiment at the Italian National Institute for Nuclear Physics’ (INFN’s) Gran Sasso National Laboratories (LNGS) located in the Abruzzo region of central Italy.

    DARWIN Dark Matter experiment. A design study for a next-generation, multi-ton dark matter detector in Europe at The University of Zurich [Universität Zürich](CH).

    PandaX II Dark Matter experiment at Jin-ping Underground Laboratory (CJPL) in Sichuan, China.

    Inside the Axion Dark Matter eXperiment U Washington (US) Credit : Mark Stone U. of Washington. Axion Dark Matter Experiment.
    ___________________________________________________________________
    Cosmic Inflation Theory

    In physical cosmology, cosmic inflation (cosmological inflation) is a theory of exponential expansion of space in the early universe. The inflationary epoch lasted from 10^−36 seconds after the conjectured Big Bang singularity to some time between 10^−33 and 10^−32 seconds after the singularity. Following the inflationary period, the universe continued to expand, but at a slower rate. The acceleration of this expansion due to dark energy began after the universe was already over 7.7 billion years old (5.4 billion years ago).
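    To give a feel for what "exponential expansion" means (a back-of-the-envelope sketch of ours, not from the text above): during inflation the scale factor grows as a ∝ e^(Ht), and the roughly 60 e-folds commonly invoked to solve the horizon and flatness problems correspond to an enormous growth factor:

```python
import math

def expansion_factor(n_efolds):
    """Growth of the scale factor after n e-folds: a_end/a_start = e^n."""
    return math.exp(n_efolds)

# ~60 e-folds is the figure commonly quoted as needed to solve the
# horizon and flatness problems (an assumption, for illustration).
print(f"{expansion_factor(60):.2e}")  # ~1.14e+26
```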

    Inflation theory was developed in the late 1970s and early 80s, with notable contributions by several theoretical physicists, including Alexei Starobinsky at Landau Institute for Theoretical Physics, Alan Guth at Cornell University, and Andrei Linde at Lebedev Physical Institute. Alexei Starobinsky, Alan Guth, and Andrei Linde won the 2014 Kavli Prize “for pioneering the theory of cosmic inflation.” It was developed further in the early 1980s. It explains the origin of the large-scale structure of the cosmos. Quantum fluctuations in the microscopic inflationary region, magnified to cosmic size, become the seeds for the growth of structure in the Universe. Many physicists also believe that inflation explains why the universe appears to be the same in all directions (isotropic), why the cosmic microwave background radiation is distributed evenly, why the universe is flat, and why no magnetic monopoles have been observed.

    The detailed particle physics mechanism responsible for inflation is unknown. The basic inflationary paradigm is accepted by most physicists, as a number of inflation model predictions have been confirmed by observation; however, a substantial minority of scientists dissent from this position. The hypothetical field thought to be responsible for inflation is called the inflaton.

    In 2002 three of the original architects of the theory were recognized for their major contributions; physicists Alan Guth of M.I.T., Andrei Linde of Stanford, and Paul Steinhardt of Princeton shared the prestigious Dirac Prize “for development of the concept of inflation in cosmology”. In 2012 Guth and Linde were awarded the Breakthrough Prize in Fundamental Physics for their invention and development of inflationary cosmology.

    4
    Alan Guth, from M.I.T., who first proposed Cosmic Inflation.

    Lambda Cold Dark Matter Accelerated Expansion of the Universe. http://www.scinotions.com. Credit: Alex Mittelmann.

    Alan Guth’s notes:
    Alan Guth’s original notes on inflation.
    ___________________________________________________________________

    The cosmic microwave background (CMB) is leftover radiation from the Big Bang.

    It’s some of the oldest light in the Universe, and yes, you can see that light in T.V. static!

    The CMB is rich with data that carries profound information about cosmology just waiting to be understood, but most important for our discussion today are the properties of the CMB’s polarization.

    Can You See Dark Matter and Dark Energy?

    3
    Figure 1: An illustration of cosmic birefringence. The left and right images are representations of the CMB before (left) and after (right) photons begin to travel towards us. Notice that the CMB photon’s plane of polarization is rotated by an angle β, which represents the rotation due to cosmic birefringence. This changes the polarization pattern (black lines in the image) of the CMB. Figure 3 in the paper.

    First, let’s try to answer our first question: what is the nature of dark matter and dark energy? When the CMB was formed, around 380,000 years after the Big Bang, the Universe was hot, dense, and filled with electrons. As photons from the CMB made their long journey towards us, they scattered off of these electrons. From these scattering interactions at the appropriately named surface of last scattering, CMB photons naturally became linearly polarized at some specific angle, and some astronomers are on the hunt for a rotation of this initial polarization angle, called cosmic birefringence. This is much like the birefringence of a crystal, in which light passing through the crystal is also deflected relative to its initial path. The biggest difference is that CMB photons would be rotated by an energy field rather than by a crystal. Some astronomers theorize that this energy field could be related to dark matter and dark energy, so a detection of this cosmic birefringence could tell us a lot about the ‘dark side’ of cosmology. Not only would a detection rule out Einstein’s cosmological constant as the origin of dark energy, but it would also tell us about the physics behind it. And since cosmic birefringence isn’t predicted by the standard ΛCDM cosmological model, it would also provide evidence for entirely new physics!

    Through analysis of Planck polarization data, the author of today’s paper has found a tantalizing hint of cosmic birefringence. By using the latest reprocessing of Planck data, the author found a weak signal of cosmic birefringence corresponding to an angle of β = 0.30° ± 0.11° (a roughly 2.7σ indication). However, while this is an exciting result, it is not conclusive enough to call it a true detection of cosmic birefringence just yet, owing to limitations in the precision of the measurement of the initial rotation angle, along with other possible systematic effects.
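    Mechanically, a birefringence angle β rotates the linear-polarization Stokes parameters Q and U through 2β, since polarization is a spin-2 quantity. A sketch (ours, not the paper's code) with the hinted β plugged in:

```python
import math

def rotate_stokes(Q, U, beta_deg):
    """Rotate linear-polarization Stokes parameters by an angle beta.
    Q and U transform as a spin-2 field, i.e. through 2*beta."""
    b = math.radians(2.0 * beta_deg)
    return (Q * math.cos(b) - U * math.sin(b),
            Q * math.sin(b) + U * math.cos(b))

# Toy input polarization, rotated by the hinted beta = 0.30 degrees:
Q, U = 1.0, 0.0
print(rotate_stokes(Q, U, 0.30))  # Q barely changes; a tiny U appears
```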

    Can You See Inflation?

    So, we haven’t detected cosmic birefringence, and we still don’t fully understand the nature of dark matter and dark energy. But what about inflation? While data from the CMB already provide support for inflation, astronomers are still on the lookout for a key piece of evidence in support of inflation: B-modes. Polarization patterns on the CMB sky can be deconstructed into two types of modes: E-modes, which describe parallel or perpendicular angles, and B-modes, which describe 45° angles. B-modes are important evidence for the inflationary model, as the gravitational waves produced by inflation are the dominant contributor to primordial B-modes. A detection of these B-modes would not only provide strong evidence for inflation, but also provide information about the physics behind it through analysis of their shape and properties. Although these modes haven’t been detected yet, by using one potential model of inflation, today’s author has shown that their detection may be possible (see Figure 2).
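    To make the E/B split less abstract: on a small, flat patch of sky the decomposition can be done with Fourier transforms. The sketch below (a standard flat-sky construction; our illustration, not code from the paper) builds a pure-E map and verifies that its B-mode content is negligible:

```python
import numpy as np

def eb_decompose(Q, U, pixel_size=1.0):
    """Flat-sky E/B decomposition of Stokes Q/U maps via FFT:
    E(l) =  Q(l) cos(2*phi_l) + U(l) sin(2*phi_l)
    B(l) = -Q(l) sin(2*phi_l) + U(l) cos(2*phi_l)
    where phi_l is the azimuthal angle of Fourier mode l."""
    ly = np.fft.fftfreq(Q.shape[0], d=pixel_size)[:, None]
    lx = np.fft.fftfreq(Q.shape[1], d=pixel_size)[None, :]
    phi = np.arctan2(ly, lx)
    Qk, Uk = np.fft.fft2(Q), np.fft.fft2(U)
    Ek = Qk * np.cos(2 * phi) + Uk * np.sin(2 * phi)
    Bk = -Qk * np.sin(2 * phi) + Uk * np.cos(2 * phi)
    return np.fft.ifft2(Ek).real, np.fft.ifft2(Bk).real

# Build a pure-E polarization pattern, then check B is ~zero.
rng = np.random.default_rng(0)
n = 64
ly = np.fft.fftfreq(n)[:, None]
lx = np.fft.fftfreq(n)[None, :]
phi = np.arctan2(ly, lx)
Ek = np.fft.fft2(rng.standard_normal((n, n)))  # random E-only field
Q = np.fft.ifft2(Ek * np.cos(2 * phi)).real
U = np.fft.ifft2(Ek * np.sin(2 * phi)).real
E, B = eb_decompose(Q, U)
print(np.std(E), np.std(B))  # B is tiny compared to E
```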

    3
    Figure 2: Plot of the B-mode power spectrum, which describes the power and properties of B-modes, as a function of multipole, which loosely describes (inverse) angular size. The main takeaway is that at low multipoles (around 2–10), the energy from gravitational waves (blue) and the total contribution of new physics (green) is higher than the background energy (gray). So, with access to low-multipole data from missions such as the upcoming LiteBIRD satellite, detection of the B-modes from inflationary gravitational waves should be possible. Figure 5 in the paper.

    The CMB in the Future

    So, have we seen new physics in the CMB yet? Unfortunately, not quite—detecting cosmic birefringence or B-modes, as you have seen, is no easy task. Even small errors due to contamination, miscalibration, and systematic uncertainties can render these signals undetectable. However, the future looks bright. The noise level of CMB experiments has dropped nearly exponentially with time, and new experiments such as SPT-4, CMB Stage-4, the Simons Observatory, and JAXA’s LiteBIRD satellite are set to come online in the next decade. With new high-precision data on the horizon, and a little innovation, we may start to find the answers to these ambitious questions, so keep on the lookout for new results. Who knows, maybe we’ll find new physics along the way too!

    See the full article here.



    Please help promote STEM in your local schools.


    Stem Education Coalition

    What do we do?

    Astrobites is a daily astrophysical literature journal written by graduate students in astronomy. Our goal is to present one interesting paper per day in a brief format that is accessible to undergraduate students in the physical sciences who are interested in active research.

    Why read Astrobites?

    Reading a technical paper from an unfamiliar subfield is intimidating. It may not be obvious how the techniques used by the researchers really work or what role the new research plays in answering the bigger questions motivating that field, not to mention the obscure jargon! For most people, it takes years for scientific papers to become meaningful.

    Our goal is to solve this problem, one paper at a time. In 5 minutes a day reading Astrobites, you should not only learn about one interesting piece of current work, but also get a peek at the broader picture of research in a new area of astronomy.

     
  • richardmitnick 11:07 am on July 7, 2022 Permalink | Reply
    Tags: "BBN": Big Bang nucleosynthesis, , "Cosmic Web": the large scale structure of the universe., , "Predicting the composition of dark matter", A new analysis by a team of physicists offers an innovative means to predict "cosmological signatures" for models of "dark matter"., , , Dark Energy, , , Dark matter detected only by its gravitational pull on ordinary matter., In this study the normal matter and dark matter and dark energy in a region of the universe are followed through to the present day using the equations of gravity and hydrodynamics and cosmology., , , , This research establishes new ways to find these cosmological signatures in more complex models.   

    From New York University via “phys.org” : “Predicting the composition of dark matter” 

    NYU BLOC

    From New York University

    Via

    “phys.org”

    July 6, 2022

    1
    An artist’s rendition of big bang nucleosynthesis, the early universe period in which protons “p” and neutrons “n” combine to form light elements. The presence of dark matter “χ” changes how much of each element will form. Credit: Cara Giovanetti/New York University.

    A new analysis by a team of physicists offers an innovative means to predict “cosmological signatures” for models of “dark matter”.

    A team of physicists has developed a method for predicting the composition of dark matter—invisible matter detected only by its gravitational pull on ordinary matter and whose discovery has been long sought by scientists.

    Its work, which appears in the journal Physical Review Letters, centers on predicting “cosmological signatures” for models of dark matter with a mass between that of the electron and the proton. Previous methods had predicted similar signatures for simpler models of dark matter. This research establishes new ways to find these signatures in the more complex models that experiments continue to search for, the paper’s authors note.

    “Experiments that search for dark matter are not the only way to learn more about this mysterious type of matter,” says Cara Giovanetti, a Ph.D. student in New York University’s Department of Physics and the lead author of the paper.


    Predicting the composition of dark matter.
    This visualization of a computer simulation showcases the ‘cosmic web’: the large-scale structure of the universe. Each bright knot is an entire galaxy, while the purple filaments show where material exists between the galaxies. To the human eye, only the galaxies would be visible; this visualization allows us to see the strands of material connecting the galaxies and forming the cosmic web. It is based on a scientific simulation of the growth of structure in the universe, in which the normal matter, dark matter, and dark energy in a region of the universe are followed from very early times through to the present day using the equations of gravity, hydrodynamics, and cosmology. The normal matter has been clipped to show only the densest regions, which are the galaxies, and is shown in white. The dark matter is shown in purple. The simulation volume is a cube with a side length of 134 megaparsecs (437 million light-years). Credit: Hubblesite; Visualization: Frank Summers, Space Telescope Science Institute; Simulation: Martin White and Lars Hernquist, Harvard University.

    “Precision measurements of different parameters of the universe—for example, the amount of helium in the universe, or the temperatures of different particles in the early universe—can also teach us a lot about dark matter,” adds Giovanetti, outlining the method described in the Physical Review Letters paper.

    In the research, conducted with Hongwan Liu, an NYU postdoctoral fellow, Joshua Ruderman, an associate professor in NYU’s Department of Physics, and Princeton physicist Mariangela Lisanti, Giovanetti and her co-authors focused on big bang nucleosynthesis (BBN)—a process by which light forms of matter, such as helium, hydrogen, and lithium, are created. The presence of invisible dark matter affects how each of these elements will form. Also vital to these phenomena is the cosmic microwave background (CMB)—the electromagnetic radiation, released when electrons and protons first combined into neutral atoms, that remains from the early universe.

    The team sought a means to spot the presence of a specific category of dark matter—that with a mass between that of the electron and the proton—by creating models that took into account both BBN and CMB.

    “Such dark matter can modify the abundances of certain elements produced in the early universe and leave an imprint in the cosmic microwave background by modifying how quickly the universe expands,” Giovanetti explains.
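
    To illustrate the expansion-rate effect Giovanetti describes, consider what a hypothetical extra relativistic species would do at BBN temperatures. A minimal sketch using the standard radiation-era scaling, in which the expansion rate at fixed temperature goes as the square root of the effective number of relativistic degrees of freedom g*; the numbers are illustrative and are not taken from the paper’s analysis:

    import math

    # Effective relativistic degrees of freedom at T ~ 1 MeV in the
    # standard picture: photons, e+/e- pairs, and three neutrino species.
    g_star_std = 10.75

    # A hypothetical extra neutrino-like fermion (2 spin states) would add
    # 7/8 * 2 = 1.75 before neutrino decoupling.
    delta_g = 7.0 / 8.0 * 2.0

    # In the radiation era H = sqrt(8*pi*G*rho/3) with rho ~ g_star * T^4,
    # so at fixed temperature the expansion rate scales as sqrt(g_star).
    boost = math.sqrt((g_star_std + delta_g) / g_star_std)
    print(f"expansion rate boosted by a factor of {boost:.3f}")
    # ~1.08: an ~8% faster expansion freezes out the neutron-to-proton
    # ratio earlier (and higher), raising the predicted helium abundance.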

    In its research, the team made predictions of cosmological signatures linked to the presence of certain forms of dark matter. These signatures are the result of dark matter changing the temperatures of different particles or altering how fast the universe expands.

    Their results showed that dark matter that is too light will lead to abundances of light elements that differ from what astrophysical observations show.

    “Lighter forms of dark matter might make the universe expand so fast that these elements don’t have a chance to form,” says Giovanetti, outlining one scenario.

    “We learn from our analysis that some models of dark matter can’t have a mass that’s too small, otherwise the universe would look different from the one we observe,” she adds.
    __________________________________
    Dark Matter Background
    Fritz Zwicky discovered Dark Matter in the 1930s when observing the movement of the Coma Cluster. Vera Rubin, a woman in STEM denied the Nobel, did most of the work on Dark Matter some 30 years later.

    Fritz Zwicky.
    Coma cluster via NASA/ESA Hubble, the original example of Dark Matter discovered during observations by Fritz Zwicky and confirmed 30 years later by Vera Rubin.
    In modern times, it was astronomer Fritz Zwicky, in the 1930s, who made the first observations of what we now call dark matter. His 1933 observations of the Coma Cluster of galaxies seemed to indicate a mass 500 times greater than that previously calculated by Edwin Hubble. Furthermore, this extra mass seemed to be completely invisible. Although Zwicky’s observations were initially met with much skepticism, they were later confirmed by other groups of astronomers.

    Thirty years later, astronomer Vera Rubin provided a huge piece of evidence for the existence of dark matter. She discovered that the outer regions of galaxies rotate just as fast as their inner regions, whereas, if visible matter were all there is, the outskirts should rotate more slowly, much as the distant planets orbit the Sun more slowly than the nearby ones. The only way to explain this is if each galaxy sits inside some much larger unseen structure whose gravity keeps the rotation speed consistent from center to edge.

    Vera Rubin, following Zwicky, postulated that the missing structure in galaxies is dark matter. Her ideas were met with much resistance from the astronomical community, but her observations have been confirmed and are seen today as pivotal proof of the existence of dark matter.
    Astronomer Vera Rubin at the Lowell Observatory in 1965, worked on Dark Matter (The Carnegie Institution for Science).

    Vera Rubin, with Department of Terrestrial Magnetism (DTM) image tube spectrograph attached to the Kitt Peak 84-inch telescope, 1970.

    Vera Rubin measuring spectra, worked on Dark Matter(Emilio Segre Visual Archives AIP SPL).
    Dark Matter Research

    Super Cryogenic Dark Matter Search from DOE’s SLAC National Accelerator Laboratory (US) at Stanford University (US) at SNOLAB (Vale Inco Mine, Sudbury, Canada).

    LBNL LZ Dark Matter Experiment (US) xenon detector at Sanford Underground Research Facility(US) Credit: Matt Kapust.

    Lambda Cold Dark Matter: accelerated expansion of the universe. Credit: Alex Mittelmann.

    DAMA at Gran Sasso uses sodium iodide housed in copper to hunt for dark matter LNGS-INFN.

    Yale HAYSTAC axion dark matter experiment at Yale’s Wright Lab.

    DEAP Dark Matter detector, The DEAP-3600, suspended in the SNOLAB (CA) deep in Sudbury’s Creighton Mine.

    The LBNL LZ Dark Matter Experiment (US) Dark Matter project at SURF, Lead, SD, USA.

    DAMA-LIBRA Dark Matter experiment at the Italian National Institute for Nuclear Physics’ (INFN’s) Gran Sasso National Laboratories (LNGS) located in the Abruzzo region of central Italy.

    DARWIN Dark Matter experiment. A design study for a next-generation, multi-ton dark matter detector in Europe at The University of Zurich [Universität Zürich](CH).

    PandaX II Dark Matter experiment at Jin-ping Underground Laboratory (CJPL) in Sichuan, China.

    Inside the Axion Dark Matter eXperiment (ADMX) at the University of Washington (US). Credit: Mark Stone, U. of Washington.
    __________________________________

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    NYU Campus

    More than 175 years ago, Albert Gallatin, the distinguished statesman who served as secretary of the treasury under Presidents Thomas Jefferson and James Madison, declared his intention to establish “in this immense and fast-growing city … a system of rational and practical education fitting for all and graciously opened to all.” Founded in 1831, New York University is now one of the largest private universities in the United States. Of the more than 3,000 colleges and universities in America, New York University is one of only 60 member institutions of the distinguished Association of American Universities.

    New York University is a private research university in New York City. Chartered in 1831 by the New York State Legislature, NYU was founded by a group of New Yorkers led by then Secretary of the Treasury Albert Gallatin.

    In 1832, the initial non-denominational all-male institution began its first classes near City Hall based on a curriculum focused on a secular education. The university, in 1833, then moved and has maintained its main campus in Greenwich Village surrounding Washington Square Park. Since then, the university has added an engineering school in Brooklyn’s MetroTech Center and graduate schools throughout Manhattan. NYU has become the largest private university in the United States by enrollment, with a total of 51,848 enrolled students, including 26,733 undergraduate students and 25,115 graduate students, in 2019. NYU also receives the most applications of any private institution in the United States, and admission is considered highly selective.

    NYU is organized into 10 undergraduate schools, including the College of Arts & Science, Gallatin School, Steinhardt School, Stern School of Business, Tandon School of Engineering, and the Tisch School of the Arts. NYU’s 15 graduate schools include the Grossman School of Medicine, School of Law, Wagner Graduate School of Public Service, School of Professional Studies, Rory Meyers School of Nursing, and Silver School of Social Work. The university’s internal academic centers include the Courant Institute of Mathematical Sciences, Center for Data Science, Center for Neural Science, Clive Davis Institute, Institute for the Study of the Ancient World, Institute of Fine Arts, and the NYU Langone Health System. NYU is a global university with degree-granting campuses at NYU Abu Dhabi and NYU Shanghai, and academic centers in Accra, Berlin, Buenos Aires, Florence, London, Los Angeles, Madrid, Paris, Prague, Sydney, Tel Aviv, and Washington, D.C.

    Past and present faculty and alumni include 38 Nobel Laureates, 8 Turing Award winners, 5 Fields Medalists, 31 MacArthur Fellows, 26 Pulitzer Prize winners, 3 heads of state, a U.S. Supreme Court justice, 5 U.S. governors, 4 mayors of New York City, 12 U.S. Senators, 58 members of the U.S. House of Representatives, two Federal Reserve Chairmen, 38 Academy Award winners, 30 Emmy Award winners, 25 Tony Award winners, 12 Grammy Award winners, 17 billionaires, and seven Olympic medalists. The university has also produced six Rhodes Scholars, three Marshall Scholars, 29 Schwarzman Scholars, and one Mitchell Scholar.

    Research

    NYU is classified among “R1: Doctoral Universities – Very high research activity” and research expenditures totaled $917.7 million in 2017. The university was the founding institution of the American Chemical Society. The NYU Grossman School of Medicine received $305 million in external research funding from the National Institutes of Health in 2014. NYU was granted 90 patents in 2014, the 19th most of any institution in the world. NYU owns the fastest supercomputer in New York City. As of 2016, NYU hardware researchers and their collaborators enjoy the largest outside funding level for hardware security of any institution in the United States, including grants from the National Science Foundation, the Office of Naval Research, the Defense Advanced Research Projects Agency, the United States Army Research Laboratory, the Air Force Research Laboratory, the Semiconductor Research Corporation, and companies including Twitter, Boeing, Microsoft, and Google.

    In 2019, four NYU Arts & Science departments ranked in Top 10 of Shanghai Academic Rankings of World Universities by Academic Subjects (Economics, Politics, Psychology, and Sociology).

     
  • richardmitnick 3:23 pm on March 24, 2022 Permalink | Reply
    Tags: "What Can We Learn About the Universe from Just One Galaxy?", , , , CAMELS: Cosmology and Astrophysics with MachinE Learning Simulations, , Dark Energy, , , Omega matter: a cosmological parameter that describes how much dark matter is in the universe, ,   

    From The New Yorker: “What Can We Learn About the Universe from Just One Galaxy?” 


    Rea Irvin

    From The New Yorker

    March 23, 2022
    Rivka Galchen

    1
    Illustration by Nicholas Konrad /The New Yorker

    In new research, begun by an undergraduate, William Blake’s phrase “to see a world in a grain of sand” is suddenly relevant to astrophysics.

    Imagine if you could look at a snowflake at the South Pole and determine the size and the climate of all of Antarctica. Or study a randomly selected tree in the Amazon rain forest and, from that one tree—be it rare or common, narrow or wide, young or old—deduce characteristics of the forest as a whole. Or, what if, by looking at one galaxy among the hundred billion or so in the observable universe, one could say something substantial about the universe as a whole? A recent paper, whose lead authors include a cosmologist, a galaxy-formation expert, and an undergraduate named Jupiter (who did the initial work), suggests that this may be the case. The result at first seemed “crazy” to the paper’s authors. Now, having discussed their work with other astrophysicists and done various “sanity checks,” trying to find errors in their methods, the results are beginning to seem pretty clear. Francisco Villaescusa-Navarro, one of the lead authors of the work, said, “It does look like galaxies somehow retain a memory of the entire universe.”

    The research began as a sort of homework exercise. Jupiter Ding, while a freshman at Princeton University, wrote to the department of astrophysics, hoping to get involved in research. He mentioned that he had some experience with machine learning, a form of artificial intelligence that is adept at picking out patterns in very large data sets. Villaescusa-Navarro, an astrophysicist focused on cosmology, had an idea for what the student might work on. Villaescusa-Navarro had long wanted to look into whether machine learning could be used to help find relationships between galaxies and the universe. “I was thinking, What if you could look at only a thousand galaxies and from that learn properties about the entire universe? I wondered, What is the smallest number we could look at? What if you looked at only one hundred? I thought, O.K., we’ll start with one galaxy.”

    He had no expectation that one galaxy would provide much. But he thought that it would be a good way for Ding to practice using machine learning on a database known as CAMELS (Cosmology and Astrophysics with MachinE Learning Simulations). Shy Genel, an astrophysicist focussed on galaxy formation, who is another lead author on the paper, explained CAMELS this way: “We start with a description of reality shortly after the Big Bang. At that point, the universe is mostly hydrogen gas, and some helium and dark matter. And then, using what we know of the laws of physics, our best guess, we then run the cosmic history for roughly fourteen billion years.” Cosmological simulations have been around for about forty years, but they are increasingly sophisticated—and fast. CAMELS contains some four thousand simulated universes. Working with simulated universes, as opposed to our own, lets researchers ask questions that the gaps in our observational data preclude us from answering. They also let researchers play with different parameters, like the proportions of dark matter and hydrogen gas, to test their impact.

    Ding did the work on CAMELS from his dorm room, on his laptop. He wrote programs to work with the CAMELS data, then sent them to one of the university’s computing clusters, a collection of computers with far more power than his MacBook Air. That computing cluster contained the CAMELS data. Ding’s model trained itself by taking a set of simulated universes and looking at the galaxies within them. Once trained, the model would then be shown a sample galaxy and asked to predict features of the universe from which it was sampled.
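
    In outline, that training loop is ordinary supervised regression: a table of per-galaxy properties in, one cosmological parameter out. Here is a minimal sketch with scikit-learn on made-up stand-in data; the 17-feature count echoes the reporting, but the data, the planted signal, and the network size are all hypothetical rather than taken from CAMELS:

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(42)

    # Stand-in data: 10,000 "galaxies", each with 17 properties, labeled by
    # the Omega_matter of the universe each was drawn from (0.1 to 0.5).
    X = rng.normal(size=(10_000, 17))
    omega_m = 0.1 + 0.4 * rng.random(10_000)
    X[:, 0] += 2.0 * omega_m                    # plant a learnable signal

    X_train, X_test, y_train, y_test = train_test_split(X, omega_m,
                                                        random_state=0)
    model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500,
                         random_state=0)
    model.fit(X_train, y_train)

    pred = model.predict(X_test)
    print(f"mean fractional error: {np.mean(np.abs(pred - y_test) / y_test):.2%}")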

    Ding is very humble about his contribution to the research, but he knows far more about astrophysics than even an exceptional first-year student typically does. Ding, a middle child with two sisters, grew up in State College, Pennsylvania. In high school, he took a series of college-level astronomy courses at Penn State and worked on a couple of research projects that involved machine learning. “My dad was really interested in astronomy as a high schooler,” Ding told me. “He went another direction, though.” His father is a professor of marketing at Penn State’s business school.

    Artificial intelligence is an umbrella concept for various disciplines, including machine learning. A famous early machine-learning task was to get a computer to recognize an image of a cat. This is something that a human can do easily, but, for a computer, there are no simple parameters that define the visual concept of a cat. Machine learning is now used for detecting patterns or relationships that are nearly impossible for humans to see, in part because the data is often in many dimensions. The programmer remains the captain, telling the computer what to learn, and deciding what input it’s trained on. But the computer adapts, iteratively, as it learns, and in that way becomes the author of its own algorithms. It was machine learning, for example, that discovered, through analyzing language patterns, the alleged main authors of the posts by “Q” (the supposed high-ranking government official who sparked the QAnon conspiracy theory). It was also able to identify which of Q’s posts appeared to be written by Paul Furber, a South African software developer, and by Ron Watkins, the son of the former owner of 8chan. Machine-learning programs have also been applied in health care, using data to predict which patients are most at risk of falling. Compared with the intuition of doctors, the machine-learning-based assessments reduced falls by about forty per cent, an enormous margin of improvement for a medical intervention.

    Machine learning has catapulted astrophysics research forward, too. Villaescusa-Navarro said, “As a community, we have been dealing with super-hard problems for many, many years. Problems that the smartest people in the field have been working on for decades. And from one day to the next, these problems are getting solved with machine learning.” Even generating a single simulated universe used to take a very long time. You gave a computer some initial conditions and then had to wait while it worked out what those conditions would produce some fourteen billion years down the line. It took less than fourteen billion years, of course, but there was no way to build up a large database of simulated universes in a timely way. Machine-learning advances have sped up these simulations, making a project like CAMELS possible. An even more ambitious project, Learning the Universe, will use machine learning to create simulated universes millions of times faster than CAMELS can; it will then use what’s called simulation-based inference—along with real observational data from telescopes—to determine which starting parameters lead to a universe that most closely resembles our own.

    Ding told me that one of the reasons he chose astronomy has been the proximity he feels to breakthroughs in the field, even as an undergraduate. “For example, I’m in a cosmology class right now, and when my professor talks about dark matter, she talks about it as something ‘a good friend of mine, Vera Rubin, put on the map,’ ” he said. “And dark energy was discovered by a team at Harvard University about twenty years ago, and I did a summer program there. So here I am, learning about this stuff pretty much in the places where these things were happening.” Ding’s research produced something profoundly unexpected. His model used a single galaxy in a simulated universe to pretty accurately say something about that universe. The specific characteristic it was able to predict is called Omega matter, which relates to the density of a universe. Its value was accurately predicted to within ten per cent.

    Ding was initially unsure how meaningful his results were and was curious to hear Villaescusa-Navarro’s perspective. He was more than skeptical. “My first thought was, This is completely crazy, I don’t believe it, this is the work of an undergraduate, there must be a mistake,” Villaescusa-Navarro said. “I asked him to run the program in a few other ways to see if he would still come up with similar results.” The results held.

    Villaescusa-Navarro began to do his own calculations. His doubt focussed foremost on the way that the machine learning itself worked. “One thing about neural networks is that they are amazing at finding correlations, but they also can pick up on numerical artifacts,” he said. Was a parameter wrong? Was there a bug in the code? Villaescusa-Navarro wrote his own program, to ask the same sort of question that he had assigned to Ding: What could information about one galaxy say about the universe in which it resided? Even when asked by a different program, written from scratch, the answer was still coming out the same. This suggested that the result was catching something real.
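
    One standard way to rule out a whole class of numerical artifacts, though not necessarily a check this team ran, is a shuffled-label null test: destroy the pairing between galaxies and universes, retrain, and confirm the skill collapses. Continuing the stand-in sketch above:

    # Null test: shuffle the Omega_matter labels so that any real
    # galaxy-cosmology link is destroyed, then retrain and re-evaluate.
    y_shuffled = rng.permutation(y_train)

    null_model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500,
                              random_state=0)
    null_model.fit(X_train, y_shuffled)

    null_pred = null_model.predict(X_test)
    null_err = np.mean(np.abs(null_pred - y_test) / y_test)
    print(f"null-test fractional error: {null_err:.2%}")  # should be far worse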

    “But we couldn’t just publish that,” Villaescusa-Navarro said. “We needed to try and understand why this might be working.” It was working for small galaxies, and for large galaxies, and for galaxies with very different features; only for a small handful of eccentric galaxies did the work not hold. Why?

    The recipe for making a universe is to start with a lot of hydrogen, a little helium, some dark matter, and some dark energy. Dark matter has mass, like the matter we’re familiar with, but it doesn’t reflect or emit light, so we can’t see it. We also can’t see dark energy, but we can think of it as working in the opposite direction of gravity. The universe’s matter, via gravity, pushes it to contract; the universe’s dark energy pushes it to expand.

    Omega matter is a cosmological parameter that describes how much matter, most of it dark, is in the universe. Along with other parameters, it controls how fast the universe expands. The higher its value, the more slowly the universe would grow. One of the research group’s hypotheses to explain their results is, roughly, that the amount of dark matter in a universe has a very strong effect on a galaxy’s properties—a stronger effect than other characteristics. For this reason, even one galaxy could have something to say about the Omega matter of its parent universe, since Omega matter is correlated to what can be pictured as the density of matter that makes a galaxy clump together.
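
    The link between Omega matter and the expansion can be read directly off the Friedmann equation. A minimal sketch for a flat universe containing only matter and a cosmological constant; the Hubble constant of 70 km/s/Mpc is an assumed illustrative value:

    import math

    def hubble_rate(a, omega_m, h0=70.0):
        """Expansion rate H(a), in km/s/Mpc, for a flat universe:
        H^2 = H0^2 * (omega_m / a^3 + (1 - omega_m))."""
        return h0 * math.sqrt(omega_m / a**3 + (1.0 - omega_m))

    def deceleration(omega_m):
        """Present-day deceleration parameter q0 = omega_m/2 - omega_lambda."""
        return omega_m / 2.0 - (1.0 - omega_m)

    for om in (0.1, 0.3, 0.5):
        print(f"Omega_m = {om}: H(a=2) = {hubble_rate(2.0, om):5.1f} km/s/Mpc, "
              f"q0 = {deceleration(om):+.2f}")
    # Higher Omega_m means stronger deceleration and a slower expansion in
    # the future (a > 1): the universe grows more slowly, as stated above.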

    In December, Genel, an expert on galaxy formation, presented the preliminary results of the paper to the galaxy-formation group he belongs to at The Flatiron Institute Center for Computational Astrophysics. “This was really one of the most fun things that happened to me,” he said. He told me that any galaxy-formation expert could have no other first reaction than to think, This is impossible. A galaxy is, on the scale of a universe, about as substantial as a grain of sand is, relative to the size of the Earth. To think that all by itself it can say something so substantial is, to the majority of the astrophysics community, extremely surprising, in a way analogous to the discovery that each of our cells—from a fingernail cell to a liver cell—contains coding describing our entire body. (Though maybe to the poetic way of thinking—to see the world in a grain of sand—the surprise is that this is surprising.)

    Rachel Somerville, an astrophysicist who was at the talk, recalled the initial reaction as “skepticism, but respectful skepticism, since we knew these were serious researchers.” She remembers being surprised that the approach had even been tried, since it seemed so tremendously unlikely that it would work. Since that time, the researchers have shared their coding and results with experts in the field; the results are taken to be credible and compelling, though the hesitations that the authors themselves have about the results remain.

    The results are not “robust”—for now, the computer can make valid predictions only on the type of universe that it has been trained on. Even within CAMELS, there are two varieties of simulations, and, if the machine is trained on one variety, it cannot be used to make predictions for galaxies in the other variety. That also means that the results cannot be used to make predictions about the universe we live in—at least not yet.

    Villaescusa-Navarro told me, “It is a very beautiful result—I know I shouldn’t say that about my own work.” But what is beauty to an astrophysicist? “It’s about an unexpected connection between two things that seemed not to be related. In this case, cosmology and galaxy formation. It’s about something hidden being revealed.”

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 2:10 pm on March 12, 2022 Permalink | Reply
    Tags: "Ask Ethan-Did our Universe really arise from nothing?", , , , , Dark Energy, , , ,   

    From Ethan Siegel “Ask Ethan-Did our Universe really arise from nothing?”

    Mar 11, 2022

    The Big Bang was hot, dense, uniform, and filled with matter and energy. Before that? There was nothing. Here’s how that’s possible.

    The more curious we get about the great cosmic unknowns, the more unanswered questions our investigations of the Universe will reveal. Inquiring about the nature of anything — where it is, where it came from, and how it came to be — will inevitably lead you to the same great mysteries: about the ultimate nature and origin of the Universe and everything in it. Yet, no matter how far back we go, those same lingering questions always seem to remain: at some point, the entities that are our “starting point” didn’t necessarily exist, so how did they come to be? Eventually, you wind up at the ultimate question: how did something arise from nothing? As many recent questioners, including Luke Martin, Buzz Morse, Russell Blalack, John Heiss and many others have written:

    “Okay, you surely receive this question endlessly, but I shall ask nonetheless: How did something (the universe/big bang) come from nothing?”

    This is maybe one of the biggest questions of all, because it’s basically asking not only where did everything come from, but how did all of it arise in the first place. Here’s as far as science has gotten us, at least, so far.

    2
    A detailed look at the Universe reveals that it’s made of matter and not antimatter, that dark matter and dark energy are required, and that we don’t know the origin of any of these mysteries. However, the fluctuations in the CMB, the formation and correlations between large-scale structure, and modern observations of gravitational lensing all point towards the same picture. (Credit: Chris Blake and Sam Moorfield)

    Today, when we look out at the Universe, the full suite of observations we’ve collected, even with the known uncertainties taken into account, all point towards a remarkably consistent picture. Our Universe is made of matter (rather than antimatter), obeys the same laws of physics everywhere and at all times, and began — at least, as we know it — with a hot Big Bang some 13.8 billion years ago. It’s governed by General Relativity, it’s expanding and cooling and gravitating, and it’s dominated by dark energy (68%) and dark matter (27%), with normal matter, neutrinos, and radiation making up the rest.

    Today, of course, it’s full of galaxies, stars, planets, heavy elements, and in at least one location, intelligent and technologically advanced life. These structures weren’t always there, but rather arose as a result of cosmic evolution. In a remarkable scientific leap, 20th century scientists were able to reconstruct the timeline for how our Universe went from a mostly uniform Universe, devoid of complex structure and consisting exclusively of hydrogen and helium, to the structure-rich Universe we observe today.

    5
    Supernova remnants (L) and planetary nebulae (R) are both ways for stars to recycle their burned, heavy elements back into the interstellar medium and the next generation of stars and planets. These processes are two ways that the heavy elements necessary for chemical-based life to arise are generated, and it’s difficult (but not impossible) to imagine a Universe without them still giving rise to intelligent observers. (Credits: ESO/VLT/FORS Instrument & Team (L); NASA/ESA/C.R. O’Dell (Vanderbilt) and D. Thompson (LBT) (R))

    If we start from today, we can step backwards in time, and ask where any individual structure or component of that structure came from. For each answer we get, we can then ask, “ok, but where did that come from and how did that arise,” going back until we’re forced to answer, “we don’t know, at least not yet.” Then, at last, we can contemplate what we have, and ask, “how did that arise, and is there a way that it could have arisen from nothing?”

    So, let’s get started.

    The life we have today comes from complex molecules, which must have arisen from the atoms of the periodic table: the raw ingredients that make up all the normal matter we have in the Universe today. The Universe wasn’t born with these atoms; instead, they required multiple generations of stars living-and-dying, with the products of their nuclear reactions recycled into future generations of stars. Without this, planets and complex chemistry would be an impossibility.

    In order to form modern stars and galaxies, we need:

    gravitation to pull small galaxies and star clusters into one another, creating large galaxies and triggering new waves of star formation,
    which required pre-existing collections of mass, created from gravitational growth,
    which require dark matter haloes to form early on, preventing star forming episodes from ejecting that matter back into the intergalactic medium,
    which require the right balance of normal matter, dark matter, and radiation to give rise to the cosmic microwave background, the light elements formed in the hot Big Bang, and the abundances/patterns we see in them,
    which required initial seed fluctuations — density imperfections — to gravitationally grow into these structures,
    which require some way of creating these imperfections, along with some way of creating dark matter and creating the initial amounts of normal matter.

    These (the seed imperfections, the dark matter, and the initial normal matter) are three key ingredients that are required, in the early stages of the hot Big Bang, to give rise to the Universe as we observe it today. Assuming that we also require the laws of physics and spacetime itself to exist — along with matter/energy itself — we probably want to include those as the necessary ingredients that must somehow arise.

    So, in short, when we ask whether we can get a Universe from nothing or not, these are the novel, hitherto unexplained entities that we need to somehow arise.

    5
    An equally-symmetric collection of matter and antimatter (of X and Y, and anti-X and anti-Y) bosons could, with the right GUT properties, give rise to the matter/antimatter asymmetry we find in our Universe today. However, we assume that there is a physical, rather than a divine, explanation for the matter-antimatter asymmetry we observe today, but we do not yet know for certain. (Credit: E. Siegel/Beyond the Galaxy.)

    To get more matter than antimatter, we have to extrapolate back into the very early Universe, to a time when our physics is very much uncertain. The laws of physics as we know them are in some sense symmetric between matter and antimatter: every reaction we’ve ever created or observed can only create-or-destroy matter and antimatter in equal amounts. But the Universe we had, despite beginning in an incredibly hot and dense state where matter and antimatter could both be created in abundant, copious amounts, must have had some way to create a matter/antimatter asymmetry where none existed initially.

    There are many ways to accomplish this. Although we don’t know which scenario actually took place in our young Universe, all ways of doing so involve the following three elements:

    an out-of-equilibrium set of conditions, which naturally arise in an expanding, cooling Universe,
    a way to generate baryon-number-violating interactions, which the Standard Model allows through sphaleron interactions (and beyond-the-Standard-Model scenarios allow in additional ways),
    and a way to generate enough C and CP violation to create a matter/antimatter asymmetry in great enough amounts.

    The Standard Model has all of these ingredients, but not enough.

    If you consider a matter/antimatter symmetric Universe as “a Universe with nothing,” then it’s almost guaranteed that the Universe generated something from nothing, even though we aren’t quite certain exactly how it happened.

    6
    The overdense regions from the early Universe grow and grow over time, but are limited in their growth by both the initial small sizes of the overdensities and also by the presence of radiation that’s still energetic, which prevents structure from growing any faster. It takes tens-to-hundreds of millions of years to form the first stars; clumps of matter exist long before that, however. (Credit: Aaron Smith/TACC/UT-Austin)

    Similarly, there are lots of viable ways to generate dark matter. We know — from extensive testing and searching — that whatever dark matter is, it can’t be composed of any particles that are present in the Standard Model. Whatever its true nature is, it requires new physics beyond what’s presently known. But there are many ways it could have been created, including:

    from being thermally created in the hot, early Universe, and then failing to completely annihilate away, remaining stable thereafter (like the lightest supersymmetric or Kaluza-Klein particle),
    or from a phase transition that spontaneously occurred as the Universe expanded and cooled, ripping massive particles out of the quantum vacuum (e.g., the axion),
    as a new form of a neutrino, which itself can either mix with the known neutrinos (i.e., a sterile neutrino), or as a heavy right-handed neutrino that exists in addition to the conventional neutrinos,
    or as a purely gravitational phenomenon that gives rise to an ultramassive particle (e.g., a WIMPzilla).

    Why is there dark matter, today, when the remainder of the Universe appears to work just fine early on without it? There must have been some way to generate this “thing” where there wasn’t such a thing beforehand, but all of these scenarios require energy. So, then, where did all that energy come from?

    6
    The Universe as we observe it today began with the hot Big Bang: an early hot, dense, uniform, expanding state with specific initial conditions. But if we want to understand where the Big Bang comes from, we must not assume it’s the absolute beginning, and we must not assume that anything we can’t predict doesn’t have a mechanism to explain it. (Credit: C.-A. Faucher-Giguere, A. Lidz, and L. Hernquist, Science, 2008)

    Perhaps, according to cosmic inflation — our leading theory of the Universe’s pre-Big Bang origins — it really did come from nothing. This requires a little bit of an explanation, and is what is most frequently meant by “a Universe from nothing.” (Including, by the way, as it was used in the title of the book of the same name.)

    When you imagine the earliest stages of the hot Big Bang, you have to think of something incredibly hot, dense, high-energy, and almost perfectly uniform. When we ask, “how did this arise,” we typically have two options.

    We can go the Lady Gaga route, and just claim it must’ve been “born this way.” The Universe was born with these properties, which we call initial conditions, and there’s no further explanation. As theoretical physicists, we call this approach “giving up.”
    Or we can do what theoretical physicists do best: try and concoct a theoretical mechanism that could explain the initial conditions, teasing out concrete predictions that differ from the standard, prevailing theory’s predictions and then going out seeking to measure the critical parameters.

    Cosmic inflation came about as a result of taking that second approach, and it literally changed our conception of how our Universe came to be.

    ___________________________________________________________________
    Inflation

    4
    Alan Guth, from M.I.T., who first proposed cosmic inflation

    Lambda Cold Dark Matter: accelerated expansion of the universe. Credit: Alex Mittelmann.

    Alan Guth’s notes:
    Alan Guth’s original notes on inflation
    ________________________________________________________________
    7
    Exponential expansion, which takes place during inflation, is so powerful because it is relentless. With every ~10^-35 seconds (or so) that passes, any particular region of space doubles in size in each direction, causing any particles or radiation to dilute and causing any curvature to quickly become indistinguishable from flat. (Credit: E. Siegel (L); Ned Wright’s Cosmology Tutorial (R))
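
    The arithmetic behind “relentless” is easy to check. A small sketch that takes the caption’s ~10^-35-second doubling time at face value (the actual value is model dependent):

    # Exponential growth: the region doubles in size, in each direction,
    # once per doubling time.
    doubling_time = 1e-35      # seconds (illustrative value from the caption)
    duration = 1e-33           # a mere 10^-33 seconds of inflation

    doublings = duration / doubling_time        # 100 doublings
    linear_stretch = 2.0 ** doublings
    print(f"{doublings:.0f} doublings -> stretch of {linear_stretch:.2e}")
    # ~1.3e30: thirty orders of magnitude of stretching in each direction,
    # in a window of time far shorter than the blink of an eye.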

    Instead of extrapolating “hot and dense” back to an infinitely hot, infinitely dense singularity, inflation basically says, “perhaps the hot Big Bang was preceded by a period where an extremely large energy density was present in the fabric of space itself, causing the Universe to expand at a relentless (inflationary) rate, and then when inflation ended, that energy got transferred into matter-and-antimatter-and-radiation, creating what we see as the hot Big Bang: the aftermath of inflation.”

    In gory detail, this not only creates a Universe with the same temperature everywhere, spatial flatness, and no leftover relics from a hypothetical grand unified epoch, but also predicts a particular type and spectrum of seed (density) fluctuations, which we then went out and saw. From just empty space itself — although it is empty space filled with a large amount of field energy — a natural process has created the entire observable Universe, rich in structure, as we see it today.

    That’s the big idea of getting a Universe from nothing, but it isn’t satisfying to everyone.

    8
    Even in empty space, the quantum fluctuations inherent to the field nature of the fundamental interactions cannot be removed. As the Universe inflates in the earliest stages, those fluctuations get stretched across the Universe, giving rise to seed density and temperature fluctuations that can still be observed today. (Credit: E. Siegel/Beyond the Galaxy)

    To a large fraction of people, a Universe where space-and-time still exist, along with the laws of physics, the fundamental constants, and some non-zero field energy inherent to the fabric of space itself, is very much divorced from the idea of nothingness. We can imagine, after all, a location outside of space; a moment beyond the confines of time; a set of conditions that have no physical reality to constrain them. And those imaginings — if we define these physical realities as things we need to eliminate to obtain true nothingness — are certainly valid, at least philosophically.

    But that’s the difference between philosophical nothingness and a more physical definition of nothingness. As I wrote back in 2018, there are four scientific definitions of nothing, and they’re all valid, depending on your context:

    A time when your “thing” of interest didn’t exist,
    Empty, physical space,
    Empty spacetime in the lowest-energy state possible, and
    Whatever you’re left with when you take away the entire Universe and the laws governing it.

    We can definitely say we obtained “a Universe from nothing” if we use the first two definitions; we cannot if we use the third; and quite unfortunately, we don’t know enough to say what happens if we use the fourth. Without a physical theory to describe what happens outside of the Universe and beyond the realm of physical laws, the concept of true nothingness is physically ill-defined.

    9
    Fluctuations in spacetime itself at the quantum scale get stretched across the Universe during inflation, giving rise to imperfections in both density and gravitational waves. While inflating space can rightfully be called ‘nothing’ in many regards, not everyone agrees. (Credit: E. Siegel; ESA/Planck and the DOE/NASA/NSF Interagency Task Force on CMB research)

    In the context of physics, it’s impossible to make sense of an idea of absolute nothingness. What does it mean to be outside of space and time, and how can space and time sensibly, predictably emerge from a state of non-existence? How can spacetime emerge at a particular location or time, when there’s no definition of location or time without it? Where do the rules governing quanta — the fields and particles both — arise from?

    This line of thought even assumes that space, time, and the laws of physics themselves weren’t eternal, when in fact they may be. Any theorems or proofs to the contrary rely on assumptions whose validity is not soundly established under the conditions which we’d seek to apply them. If you accept a physical definition of “nothing,” then yes, the Universe as we know it very much appears to have arisen from nothing. But if you leave physical constraints behind, then all certainty about our ultimate cosmic origins disappears.

    Unfortunately for us all, inflation, by its very nature, erases any information that might be imprinted from a pre-existing state on our observable Universe. Despite the limitless nature of our imaginations, we can only draw conclusions about matters for which tests involving our physical reality can be constructed. No matter how logically sound any other consideration may be, including a notion of absolute nothingness, it’s merely a construct of our minds.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    “Starts With A Bang! is a blog/video blog about cosmology, physics, astronomy, and anything else I find interesting enough to write about. I am a firm believer that the highest good in life is learning, and the greatest evil is willful ignorance. The goal of everything on this site is to help inform you about our world, how we came to be here, and to understand how it all works. As I write these pages for you, I hope to not only explain to you what we know, think, and believe, but how we know it, and why we draw the conclusions we do. It is my hope that you find this interesting, informative, and accessible,” says Ethan.

     
  • richardmitnick 4:45 pm on January 21, 2022 Permalink | Reply
    Tags: "Any Single Galaxy Reveals the Composition of an Entire Universe", A group of scientists may have stumbled upon a radical new way to do cosmology., , Cosmic density of matter, , Dark Energy, , , , , , The Cosmology and Astrophysics with Machine Learning Simulations (CAMELS) project, Theoretical Astrophysics   

    From Quanta Magazine (US): “Any Single Galaxy Reveals the Composition of an Entire Universe” 

    From Quanta Magazine (US)

    January 20, 2022
    Charlie Wood

    1
    Credit: Kaze Wong / CAMELS collaboration.


    In the CAMELS project, coders simulated thousands of universes with diverse compositions, arrayed at the end of this video as cubes.

    A group of scientists may have stumbled upon a radical new way to do cosmology.

    Cosmologists usually determine the composition of the universe by observing as much of it as possible. But these researchers have found that a machine learning algorithm can scrutinize a single simulated galaxy and predict the overall makeup of the digital universe in which it exists — a feat analogous to analyzing a random grain of sand under a microscope and working out the mass of Eurasia. The machines appear to have found a pattern that might someday allow astronomers to draw sweeping conclusions about the real cosmos merely by studying its elemental building blocks.

    “This is a completely different idea,” said Francisco Villaescusa-Navarro, a theoretical astrophysicist at The Flatiron Institute Center for Computational Astrophysics (US) and lead author of the work. “Instead of measuring these millions of galaxies, you can just take one. It’s really amazing that this works.”

    It wasn’t supposed to. The improbable find grew out of an exercise Villaescusa-Navarro gave to Jupiter Ding, a Princeton University(US) undergraduate: Build a neural network that, knowing a galaxy’s properties, can estimate a couple of cosmological attributes. The assignment was meant merely to familiarize Ding with machine learning. Then they noticed that the computer was nailing the overall density of matter.

    “I thought the student made a mistake,” Villaescusa-Navarro said. “It was a little bit hard for me to believe, to be honest.”

    The results of the investigation that followed appeared on January 6 in a paper submitted for publication. The researchers analyzed 2,000 digital universes generated by The Cosmology and Astrophysics with Machine Learning Simulations (CAMELS) project [The Astrophysical Journal]. These universes had a range of compositions, containing between 10% and 50% matter with the rest made up of Dark Energy, which drives the universe to expand faster and faster. (Our actual cosmos consists of roughly one-third Dark Matter and visible matter and two-thirds Dark Energy.) As the simulations ran, Dark Matter and visible matter swirled together into galaxies. The simulations also included rough treatments of complicated events like supernovas and jets that erupt from supermassive black holes.

    Ding’s neural network studied nearly 1 million simulated galaxies within these diverse digital universes. From its godlike perspective, it knew each galaxy’s size, composition, mass, and more than a dozen other characteristics. It sought to relate this list of numbers to the density of matter in the parent universe.

    It succeeded. When tested on thousands of fresh galaxies from dozens of universes it hadn’t previously examined, the neural network was able to predict the cosmic density of matter to within 10%. “It doesn’t matter which galaxy you are considering,” Villaescusa-Navarro said. “No one imagined this would be possible.”

    “That one galaxy can get [the density to] 10% or so, that was very surprising to me,” said Volker Springel, an expert in simulating galaxy formation at The MPG Institute for Astrophysics [MPG Institut für Astrophysik](DE) who was not involved in the research.

    The algorithm’s performance astonished researchers because galaxies are inherently chaotic objects. Some form all in one go, and others grow by eating their neighbors. Giant galaxies tend to hold onto their matter, while supernovas and black holes in dwarf galaxies might eject most of their visible matter. Still, every galaxy had somehow managed to keep close tabs on the overall density of matter in its universe.

    One interpretation is “that the universe and/or galaxies are in some ways much simpler than we had imagined,” said Pauline Barmby, an astronomer at The Western University (CA). Another is that the simulations have unrecognized flaws.

    The team spent half a year trying to understand how the neural network had gotten so wise. They checked to make sure the algorithm hadn’t just found some way to infer the density from the coding of the simulation rather than the galaxies themselves. “Neural networks are very powerful, but they are super lazy,” Villaescusa-Navarro said.

    Through a series of experiments, the researchers got a sense of how the algorithm was divining the cosmic density. By repeatedly retraining the network while systematically obscuring different galactic properties, they zeroed in on the attributes that mattered most.
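
    That retrain-while-obscuring strategy is a close cousin of permutation feature importance, in which one input column at a time is shuffled to see how much the model’s score suffers. A self-contained minimal sketch with scikit-learn on made-up data, where column 3 is planted to play the role of rotation speed; this is an illustration, not the CAMELS pipeline:

    import numpy as np
    from sklearn.inspection import permutation_importance
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(1)

    # Stand-in data: rows are galaxies, columns are 17 properties, and the
    # target is the matter density of each galaxy's parent universe.
    X = rng.normal(size=(5_000, 17))
    omega_m = 0.1 + 0.4 * rng.random(5_000)
    X[:, 3] += 3.0 * omega_m                    # planted informative feature

    model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=400,
                         random_state=0).fit(X, omega_m)

    # Shuffle one column at a time; the columns whose shuffling hurts the
    # score most are the ones the network actually relies on.
    result = permutation_importance(model, X, omega_m, n_repeats=5,
                                    random_state=0)
    print("most informative column:", int(np.argmax(result.importances_mean)))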

    Near the top of the list was a property related to a galaxy’s rotation speed, which corresponds to how much matter (dark and otherwise) sits in the galaxy’s central zone. The finding matches physical intuition, according to Springel. In a universe overflowing with Dark Matter, you’d expect galaxies to grow heavier and spin faster. So you might guess that rotation speed would correlate with the cosmic matter density, although that relationship alone is too rough to have much predictive power.

    The neural network found a much more precise and complicated relationship between 17 or so galactic properties and the matter density. This relationship persists despite galactic mergers, stellar explosions and black hole eruptions. “Once you get to more than [two properties], you can’t plot it and squint at it by eye and see the trend, but a neural network can,” said Shaun Hotchkiss, a cosmologist at The University of Auckland (NZ).

    While the algorithm’s success raises the question of how many of the universe’s traits might be extracted from a thorough study of just one galaxy, cosmologists suspect that real-world applications will be limited. When Villaescusa-Navarro’s group tested their neural network on a different property — cosmic clumpiness — it found no pattern. And Springel expects that other cosmological attributes, such as the accelerating expansion of the universe due to Dark Energy, have little effect on individual galaxies.

    The research does suggest that, in theory, an exhaustive study of the Milky Way and perhaps a few other nearby galaxies could enable an exquisitely precise measurement of our universe’s matter. Such an experiment, Villaescusa-Navarro said, could give clues to other numbers of cosmic import such as the sum of the unknown masses of the universe’s three types of neutrinos.

    3
    Neutrinos. Credit: Universe Today.

    But in practice, the technique would have to first overcome a major weakness. The CAMELS collaboration cooks up its universes using two different recipes. A neural network trained on one of the recipes makes bad density guesses when given galaxies that were baked according to the other. The cross-prediction failure indicates that the neural network is finding solutions unique to the rules of each recipe. It certainly wouldn’t know what to do with the Milky Way, a galaxy shaped by the real laws of physics. Before applying the technique to the real world, researchers will need to either make the simulations more realistic or adopt more general machine learning techniques — a tall order.

    “I’m very impressed by the possibilities, but one needs to avoid being too carried away,” Springel said.

    But Villaescusa-Navarro takes heart that the neural network was able to find patterns in the messy galaxies of two independent simulations. The digital discovery raises the odds that the real cosmos may be hiding a similar link between the large and the small.

    “It’s a very beautiful thing,” he said. “It establishes a connection between the whole universe and a single galaxy.”

    _____________________________________________________________________________________
    The Dark Energy Survey

    Dark Energy Camera [DECam] built at DOE’s Fermi National Accelerator Laboratory(US).

    NOIRLab National Optical Astronomy Observatory(US) Cerro Tololo Inter-American Observatory(CL) Victor M Blanco 4m Telescope which houses the Dark-Energy-Camera – DECam at Cerro Tololo, Chile at an altitude of 7200 feet.

    NSF NOIRLab NOAO (US) Cerro Tololo Inter-American Observatory (CL), approximately 80 km to the east of La Serena, Chile, at an altitude of 2200 meters.

    Timeline of the Inflationary Universe WMAP.

    The Dark Energy Survey is an international, collaborative effort to map hundreds of millions of galaxies, detect thousands of supernovae, and find patterns of cosmic structure that will reveal the nature of the mysterious dark energy that is accelerating the expansion of our Universe. The Dark Energy Survey began searching the Southern skies on August 31, 2013.

    According to Albert Einstein’s Theory of General Relativity, gravity should lead to a slowing of the cosmic expansion. Yet, in 1998, two teams of astronomers studying distant supernovae made the remarkable discovery that the expansion of the universe is speeding up.

    Saul Perlmutter (center) [The Supernova Cosmology Project] shared the 2006 Shaw Prize in Astronomy, the 2011 Nobel Prize in Physics, and the 2015 Breakthrough Prize in Fundamental Physics with Brian P. Schmidt (right) and Adam Riess (left) [The High-z Supernova Search Team] for providing evidence that the expansion of the universe is accelerating.

    To explain cosmic acceleration, cosmologists are faced with two possibilities: either 70% of the universe exists in an exotic form, now called Dark Energy, that exhibits a gravitational force opposite to the attractive gravity of ordinary matter, or General Relativity must be replaced by a new theory of gravity on cosmic scales.

    The Dark Energy Survey is designed to probe the origin of the accelerating universe and help uncover the nature of Dark Energy by measuring the 14-billion-year history of cosmic expansion with high precision. More than 400 scientists from over 25 institutions in the United States, Spain, the United Kingdom, Brazil, Germany, Switzerland, and Australia are working on the project. The collaboration built and is using an extremely sensitive 570-Megapixel digital camera, DECam, mounted on the Blanco 4-meter telescope at Cerro Tololo Inter-American Observatory, high in the Chilean Andes, to carry out the project.

    Over six years (2013-2019), the Dark Energy Survey collaboration used 758 nights of observation to carry out a deep, wide-area survey to record information from 300 million galaxies that are billions of light-years from Earth. The survey imaged 5000 square degrees of the southern sky in five optical filters to obtain detailed information about each galaxy. A fraction of the survey time is used to observe smaller patches of sky roughly once a week to discover and study thousands of supernovae and other astrophysical transients.
    _____________________________________________________________________________________

    Fritz Zwicky discovered Dark Matter in the 1930s when observing the movement of the Coma Cluster., Vera Rubin a Woman in STEM, denied the Nobel, some 30 years later, did most of the work on Dark Matter.

    Fritz Zwicky.
    Coma cluster via NASA/ESA Hubble, the original example of Dark Matter discovered during observations by Fritz Zwicky and confirmed 30 years later by Vera Rubin.
    In modern times, it was astronomer Fritz Zwicky, in the 1930s, who made the first observations of what we now call dark matter. His 1933 observations of the Coma Cluster of galaxies seemed to indicate that it has a mass some 500 times greater than that previously calculated by Edwin Hubble. Furthermore, this extra mass seemed to be completely invisible. Although Zwicky's observations were initially met with much skepticism, they were later confirmed by other groups of astronomers.

    Thirty years later, astronomer Vera Rubin provided a huge piece of evidence for the existence of dark matter. She discovered that the outer regions of galaxies rotate just as fast as the regions near their centers. If the visible stars and gas were all the mass there is, the outer stars should orbit more slowly, just as the outer planets of the Solar System orbit the Sun more slowly than the inner ones. But they do not. The only way to explain these flat rotation curves is if each visible galaxy sits inside some much larger structure of unseen mass, so that the enclosed mass keeps growing with radius and props up the orbital speeds far from the center.
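
    A minimal numerical sketch of the mismatch (illustrative numbers only, not Rubin's actual data): if the luminous mass is treated as enclosed within the inner galaxy, Newtonian gravity predicts Keplerian speeds falling off as 1/sqrt(r), while measured curves stay roughly flat.

        import numpy as np

        G = 4.30e-6  # Newton's constant in kpc * (km/s)^2 / Msun

        # Toy galaxy: assume the luminous mass is enclosed well inside the
        # radii we sample, so visible matter alone predicts Keplerian orbits.
        M_visible = 1.0e11                  # illustrative luminous mass in solar masses
        radii = np.linspace(5.0, 50.0, 10)  # galactocentric radii in kpc

        v_predicted = np.sqrt(G * M_visible / radii)  # falls off as 1/sqrt(r)
        v_observed = np.full_like(radii, 220.0)       # flat curve, typical of spirals

        for r, vp, vo in zip(radii, v_predicted, v_observed):
            print(f"r = {r:5.1f} kpc  visible-mass prediction {vp:6.1f} km/s  observed ~{vo:5.0f} km/s")

    The widening gap between the two columns at large radii is the missing mass that Rubin's measurements made impossible to ignore.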

    Vera Rubin, following Zwicky, postulated that the missing structure in galaxies is dark matter. Her ideas were met with much resistance from the astronomical community, but her observations have been confirmed and are seen today as pivotal proof of the existence of dark matter.
    Astronomer Vera Rubin, who worked on Dark Matter, at the Lowell Observatory in 1965. Credit: The Carnegie Institution for Science.

    Vera Rubin, with Department of Terrestrial Magnetism (DTM) image tube spectrograph attached to the Kitt Peak 84-inch telescope, 1970.

    Vera Rubin measuring spectra in her work on Dark Matter. Credit: Emilio Segrè Visual Archives/AIP/SPL.
    Dark Matter Research

    LBNL LZ Dark Matter Experiment (US) xenon detector at the Sanford Underground Research Facility (US). Credit: Matt Kapust.

    Lambda Cold Dark Matter accelerated expansion of the universe. Credit: Alex Mittelmann.

    DAMA at Gran Sasso uses sodium iodide housed in copper to hunt for dark matter LNGS-INFN.

    Yale HAYSTAC axion dark matter experiment at Yale’s Wright Lab.

    DEAP Dark Matter detector, The DEAP-3600, suspended in the SNOLAB (CA) deep in Sudbury’s Creighton Mine.

    The LBNL LZ Dark Matter Experiment (US) at SURF, Lead, SD, USA.

    DAMA-LIBRA Dark Matter experiment at the Italian National Institute for Nuclear Physics’ (INFN’s) Gran Sasso National Laboratories (LNGS) located in the Abruzzo region of central Italy.

    DARWIN Dark Matter experiment. A design study for a next-generation, multi-ton dark matter detector in Europe at The University of Zurich [Universität Zürich](CH).

    PandaX II Dark Matter experiment at Jin-ping Underground Laboratory (CJPL) in Sichuan, China.

    Inside the Axion Dark Matter eXperiment (ADMX) at the University of Washington (US). Credit: Mark Stone/University of Washington.
    ______________________________________________________

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    Formerly known as Simons Science News, Quanta Magazine (US) is an editorially independent online publication launched by the Simons Foundation to enhance public understanding of science. Why Quanta? Albert Einstein called photons “quanta of light.” Our goal is to “illuminate science.” At Quanta Magazine, scientific accuracy is every bit as important as telling a good story. All of our articles are meticulously researched, reported, edited, copy-edited and fact-checked.

     
  • richardmitnick 9:28 pm on December 6, 2021 Permalink | Reply
    Tags: "The uneven universe", An uneven distribution of the mass in the universe may have an effect on the speed of cosmic expansion., , , , , Dark Energy, , In reality the universe is not uniform: in some places there are stars and planets and in others there is just a void., It is almost always assumed in cosmological calculations that there is a even distribution of matter in the universe., One of the most important applications of the theory is in describing the cosmic expansion of the universe since the Big Bang., , The scientists starting point was the Mori-Zwanzig formalism-a method for describing systems consisting of a large number of particles with a small number of measurands., The speed of this expansion is determined by the amount of energy in the universe., The University of Münster [Westfälische Wilhelms-Universität Münster] (DE)   

    From The University of Münster [Westfälische Wilhelms-Universität Münster] (DE): “The uneven universe” 

    1

    From The University of Münster [Westfälische Wilhelms-Universität Münster](DE)

    December 3, 2021

    Communication and Public Relations
    Schlossplatz 2
    48149 Münster
    Tel: +49 251 83-22232
    Fax: +49 251 83-22258
    communication@uni-muenster.de

    Timeline of the Inflationary Universe NASA WMAP (US)

    Researchers study cosmic expansion using methods from many-body physics / Article published in Physical Review Letters.

    It is almost always assumed in cosmological calculations that there is an even distribution of matter in the universe. This is because the calculations would be much too complicated if the position of every single star had to be included. In reality, the universe is not uniform: in some places there are stars and planets, and in others there is just a void. Physicists Michael te Vrugt and Prof. Raphael Wittkowski from the Institute of Theoretical Physics and the Center for Soft Nanoscience (SoN) at the University of Münster have, together with physicist Dr. Sabine Hossenfelder from The Frankfurt Institute for Advanced Studies (DE), developed a new model for this problem. Their starting point was the Mori-Zwanzig formalism, a method for describing systems consisting of a large number of particles with a small number of measurands. The results of the study have now been published in the journal Physical Review Letters.

    Background: The theory of general relativity developed by Albert Einstein is one of the most successful theories in modern physics. Two of the last five Nobel Prizes for Physics had associations with it: in 2017 for the measurement of gravitational waves, and in 2020 for the discovery of a black hole at the centre of the Milky Way. One of the most important applications of the theory is in describing the cosmic expansion of the universe since the Big Bang. The speed of this expansion is determined by the amount of energy in the universe. In addition to the visible matter, it is above all the dark matter and dark energy which play a role here – at least, according to the Lambda-CDM model currently used in cosmology.
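
    The statement that the speed of the expansion is determined by the amount of energy in the universe has a compact standard form. In the Lambda-CDM parametrization (textbook notation, not taken from the paper itself), the first Friedmann equation reads

        H^2(a) = H_0^2 \left[ \Omega_m a^{-3} + \Omega_r a^{-4} + \Omega_\Lambda \right],

    where H is the expansion rate, a the cosmic scale factor, H_0 the present-day Hubble constant, and the \Omega terms the present-day energy fractions in matter, radiation, and dark energy.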

    Lambda Cold Dark Matter accelerated expansion of the universe. Credit: Alex Mittelmann.

    “Strictly speaking, it is mathematically wrong to include the mean value of the universe’s energy density in the equations of general relativity”, says Sabine Hossenfelder. The question is now how “bad” this mistake is. Some experts consider it to be irrelevant, others see in it the solution to the enigma of dark energy, whose physical nature is still unknown. An uneven distribution of the mass in the universe may have an effect on the speed of cosmic expansion.
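
    A toy numerical illustration of why averaging first can be "mathematically wrong" (a generic sketch of non-commuting averages, not the calculation in the paper): Einstein's equations are nonlinear, and for any nonlinear function, the function of the mean differs from the mean of the function.

        import numpy as np

        rng = np.random.default_rng(seed=42)

        # A "clumpy" toy density field: lognormal, so most of space is
        # near-empty while a few regions are very dense (illustrative only).
        rho = rng.lognormal(mean=0.0, sigma=1.0, size=1_000_000)

        def f(x):
            """Stand-in for a nonlinear term in the field equations."""
            return x**2

        print("f(<rho>) =", f(rho.mean()))  # evolve the averaged field first
        print("<f(rho)> =", f(rho).mean())  # average the evolved field: noticeably larger

    Because the two results disagree, a universe evolved from averaged initial data need not expand at the same rate as the average of a genuinely clumpy universe, which is exactly the loophole described above.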

    “The Mori-Zwanzig formalism is already being successfully used in many fields of research, from biophysics to particle physics,” says Raphael Wittkowski, “so it also offered a promising approach to this astrophysical problem.” The team generalised this formalism so that it could be applied to general relativity and, in doing so, derived a model for cosmic expansion while taking into consideration the uneven distribution of matter in the universe.
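
    In its generic statistical-mechanics form (standard notation; the paper's generalisation to general relativity is more involved), the Mori-Zwanzig formalism rewrites the exact dynamics of a chosen observable A as

        \frac{dA}{dt} = \Omega\,A(t) + \int_0^t K(t-s)\,A(s)\,\mathrm{d}s + F(t),

    where \Omega captures the direct evolution, the memory kernel K encodes the influence of the eliminated degrees of freedom, and F is a fluctuating term orthogonal to the resolved variables. The appeal for cosmology is that the unresolved details, here the exact positions of stars and voids, enter only through K and F rather than having to be tracked individually.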

    The model makes a concrete prediction for the effect of these so-called inhomogeneities on the speed of the expansion of the universe. This prediction deviates slightly from that given by the Lambda-CDM model and thus provides an opportunity to test the new model experimentally. "At present, the astronomical data are not precise enough to measure this deviation," says Michael te Vrugt, "but the great progress made, for example in the measurement of gravitational waves, gives us reason to hope that this will change. The new variant of the Mori-Zwanzig formalism can also be applied to other astrophysical problems, so the work is relevant not only to cosmology."

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Seat of the WWU (University of Münster). Photo: MünsterView/Tronquet.

    The University of Münster [Westfälische Wilhelms-Universität Münster] (DE) is a public university located in the city of Münster, North Rhine-Westphalia, in Germany.

    With more than 43,000 students and over 120 fields of study in 15 departments, it is Germany's fifth-largest university and one of the foremost centers of German intellectual life. The university offers a wide range of subjects across the sciences, social sciences, and the humanities. Several courses are also taught in English, including PhD programmes as well as postgraduate courses in geoinformatics, geospatial technologies, and information systems.

    Professors and former students have won ten Leibniz Prizes, the most prestigious as well as the best-funded prize in Europe, and one Fields Medal. The WWU has also been successful in the German government’s Excellence Initiative.

     