Tagged: Cosmic Microwave Background – CMB

  • richardmitnick 8:25 pm on June 6, 2022 Permalink | Reply
    Tags: "Cosmological Gravitational Waves- a new approach to reach back to the Big Bang", Cosmic Microwave Background - CMB, Dark Energy and Dark Matter cosmological components, Deep analysis of the data from the POLARBEAR Observatory probes with unprecedented accuracy looking for Cosmological Gravitational Waves., In physical cosmology cosmic inflation cosmological inflation is a theory of exponential expansion of space in the early universe., Operating observatories around the globe target sky regions characterized by low contamination from Galactic radiation looking for the imprint of Cosmological Gravitational Waves (CGWs)., The acceleration of this expansion due to dark energy began after the universe was already over 7.7 billion years old.

    From The International School for Advanced Studies [Scuola Internazionale Superiore di Studi Avanzati](IT) : “Cosmological Gravitational Waves- a new approach to reach back to the Big Bang” 


    02 June 2022

    Nico Pitrelli
    pitrelli@sissa.it
    T +39 040 3787462
    M +39 339 1337950

    Chiara Saviane
    saviane@sissa.it
    T +39 040 3787230
    M +39 333 7675962

    Deep analysis of the data from the POLARBEAR Observatory probes the sky region with unprecedented accuracy, looking for Cosmological Gravitational Waves.

    Operating observatories around the globe target sky regions characterized by low contamination from Galactic radiation, looking for the imprint of Cosmological Gravitational Waves (CGWs) produced during Inflation, the mysterious phase of quasi-exponential expansion of space in the very early Universe.

    ___________________________________________________________________
    Cosmic Inflation Theory

    In physical cosmology, cosmic inflation (or cosmological inflation) is a theory of exponential expansion of space in the early universe. The inflationary epoch lasted from 10^−36 seconds after the conjectured Big Bang singularity to some time between 10^−33 and 10^−32 seconds after the singularity. Following the inflationary period, the universe continued to expand, but at a slower rate. The acceleration of this expansion due to dark energy began after the universe was already over 7.7 billion years old (about 5.4 billion years ago).

    Inflation theory was developed in the late 1970s and early 1980s, with notable contributions by several theoretical physicists, including Alexei Starobinsky at the Landau Institute for Theoretical Physics, Alan Guth at Cornell University, and Andrei Linde at the Lebedev Physical Institute. Starobinsky, Guth, and Linde won the 2014 Kavli Prize “for pioneering the theory of cosmic inflation.” The theory explains the origin of the large-scale structure of the cosmos: quantum fluctuations in the microscopic inflationary region, magnified to cosmic size, become the seeds for the growth of structure in the Universe. Many physicists also believe that inflation explains why the universe appears the same in all directions (isotropic), why the cosmic microwave background radiation is distributed evenly, why the universe is flat, and why no magnetic monopoles have been observed.

    The detailed particle physics mechanism responsible for inflation is unknown. The basic inflationary paradigm is accepted by most physicists, as a number of inflation model predictions have been confirmed by observation;[a] however, a substantial minority of scientists dissent from this position. The hypothetical field thought to be responsible for inflation is called the inflaton.

    In 2002 three of the original architects of the theory were recognized for their major contributions; physicists Alan Guth of M.I.T., Andrei Linde of Stanford, and Paul Steinhardt of Princeton shared the prestigious Dirac Prize “for development of the concept of inflation in cosmology”. In 2012 Guth and Linde were awarded the Breakthrough Prize in Fundamental Physics for their invention and development of inflationary cosmology.

    Alan Guth, from M.I.T., who first proposed Cosmic Inflation.

    Lambda Cold Dark Matter accelerated expansion of the Universe. Credit: Alex Mittelmann.

    Alan Guth’s notes:
    Alan Guth’s original notes on inflation.
    ___________________________________________________________________

    A new study by the POLARBEAR collaboration, with SISSA leading the cosmological interpretation and published in The Astrophysical Journal, provides a new correction algorithm that allows researchers to nearly double the amount of reliable data acquired by such observatories, giving access to previously uncharted territory of the signal produced by CGWs and bringing us closer to the Big Bang.

    “According to the current understanding in Cosmology, just after the Big Bang the Universe was very small, dense and hot. In 10^-35 seconds, it stretched by a factor of 10^30,” Carlo Baccigalupi, coordinator of the Astrophysics & Cosmology group at SISSA, explains. “This process, known as Inflation, produced Cosmological Gravitational Waves (CGWs) that can be detected through the polarization of the Cosmic Microwave Background (CMB), the leftover radiation from the Big Bang.”
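    The figures quoted above translate into the standard measure of inflationary expansion, the number of e-folds N = ln(a_end/a_start). A quick back-of-envelope sketch (illustrative only, using the factor of 10^30 from the quote):

    ```python
    import math

    # Expansion factor quoted above: space stretched by ~10^30 during inflation
    expansion_factor = 1e30

    # Number of e-folds of expansion: N = ln(a_end / a_start)
    n_efolds = math.log(expansion_factor)
    print(f"N ≈ {n_efolds:.1f} e-folds")  # ≈ 69.1
    ```

    About 60-70 e-folds is the range usually invoked to solve the horizon and flatness problems, so the quoted stretch factor is consistent with standard inflationary scenarios.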

    The POLARBEAR experiment, which SISSA is part of, looks for such signals using the Huan Tran Telescope in the Atacama Desert of northern Chile in the Antofagasta Region.

    The analysis of data acquired by the POLARBEAR Observatory is a complex pipeline in which the reliability of measurements is a delicate and key factor. “The CGWs excite only a tiny fraction of the CMB polarization signal, better known as B-modes,” explain Nicoletta Krachmalnicoff, researcher at SISSA, and Davide Poletti, previously at the same institute. “They are very difficult to measure, in particular because of the contamination of the signal due to the emission of the diffuse Galactic gas. This must be removed with exquisite accuracy to isolate the unique contribution of CGWs.”

    Over the past two years, Anto. I. Lonappan, PhD student at SISSA, and Satoru Takakura from the University of Colorado-Boulder have been characterizing the quality of an extended dataset from the POLARBEAR collaboration, tracing all the known instrumental and physical uncertainties and systematics. “We have implemented an algorithm that assigns accuracy to the measurements in the ‘Large Patch’, a region extending for about 670 square degrees in the Southern Celestial Hemisphere, where our instrument reveals data in agreement with other probes looking at the same location, such as the BICEP2/Keck Array at the South Pole,” they explain. The study has now been published in The Astrophysical Journal.
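    To put the “Large Patch” in perspective, its quoted area can be converted into the observed sky fraction f_sky commonly used in CMB analyses. This is a back-of-the-envelope sketch, not part of the published pipeline:

    ```python
    import math

    patch_area_deg2 = 670.0                             # "Large Patch" area from the article
    full_sky_deg2 = 4 * math.pi * (180 / math.pi) ** 2  # whole sky ≈ 41,253 square degrees
    f_sky = patch_area_deg2 / full_sky_deg2
    print(f"f_sky ≈ {f_sky:.4f}")  # about 1.6% of the full sky
    ```

    Even a patch of a few hundred square degrees is a small fraction of the celestial sphere, which is why the reliability of every measurement within it matters so much.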

    “This is a milestone on a long road heading to the observation of CGWs. The new approach allows us to probe the sky with unprecedented accuracy, doubling the amount of reliable data and, thus, of accessible information. This is a crucial step for the whole community, now that new telescopes are being prepared for operations,” the scientists add.

    Great developments are on their way from the experimental point of view. A system of three upgraded POLARBEAR Telescopes, known as the Simons Array, is in preparation.

    The Simons Observatory, a new system of Small and Large Aperture Telescopes funded by the Simons Foundation, will be operational from a nearby location in Atacama, with first light in 2023.

    Later in this decade, the LiteBIRD satellite will fly, and an extended network of ground-based observatories with facilities in the Atacama Desert and at the South Pole, known as “Stage IV”, will complement these observations.

    “All these efforts will lead to the ultimate measurement of CGWs, revealing at the same time most important clues about the Dark Energy and Dark Matter cosmological components,” Baccigalupi concludes. “Through the main mission of SISSA as a PhD school, training students to become young researchers, our Institute is and will be contributing significantly to the main contemporary challenges for Physics, such as the present one, targeting Gravitational Waves from a tiny fraction of a second after the Big Bang.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition


    The International School for Advanced Studies Scuola Internazionale Superiore di Studi Avanzati (SISSA) is an international, state-supported, post-graduate-education and research institute, located in Trieste, Italy.

    SISSA is active in the fields of mathematics, physics, and neuroscience, offering both undergraduate and post-graduate courses. Each year, about 70 PhD students are admitted to SISSA based on their scientific qualifications. SISSA also runs master’s programs in the same areas, in collaboration with both Italian and other European universities.

    SISSA was founded in 1978 as part of the reconstruction following the Friuli earthquake of 1976. Although the city of Trieste itself did not suffer any damage, physicist Paolo Budinich asked for, and obtained from, the Italian government the inclusion among the interventions of a new post-graduate teaching and research institute, modeled on the Scuola Normale Superiore di Pisa. The school became operative with a PhD course in theoretical physics, and Budinich himself was appointed general director.

    In 1986, Budinich handed the position to Daniele Amati, who at the time was at the head of the theoretical division at The European Organization for Nuclear Research [Organisation européenne pour la recherche nucléaire](CH)[CERN]. Under his leadership, SISSA expanded its teaching and research activity towards the field of neuroscience and instituted a new interdisciplinary laboratory aimed at connecting the humanities with scientific studies. From 2001 to 2004, the director was the Italian geneticist Edoardo Boncinelli, who fostered the development of the existing research areas. The following years saw the strengthening of SISSA’s collaboration with other Italian and European universities in offering master’s degree programs in the three areas of the School (mathematics, physics and neuroscience). Physicist Stefano Ruffo served as director from 2015 until 2021, when he was succeeded by Andrea Romanino.

     
  • richardmitnick 10:30 am on January 13, 2022 Permalink | Reply
    Tags: "New theory finds upcoming satellite mission will be able to detect more than expected", A large amount of gravitational waves can be sourced by the quantum vacuum fluctuations of additional fields during inflation., A success story of this hypothesis is that even the simplest inflationary models are able to accurately predict the inhomogeneous distribution of matter in the Universe., Cosmic Microwave Background - CMB, Detecting these gravitational waves is considered determining the energy at which inflation took place., How much the inflation field-or the energy source of inflation-can change during inflation — a relation referred to as the “Lyth bound”., JAXA LiteBIRD, Scientists elegantly decoupled the generation of the two types of fluctuations and solved this problem., These gravitational wave propagating ripples of space and time are important for understanding the physics during the inflationary epoch., Understanding primordial gravitational waves theoretically is gaining interest so any potential detection by LiteBIRD can be interpreted., When you generate gravitational waves from enhanced fluctuations of additional fields you simultaneously generate extra curvature fluctuations.

    From The Kavli Institute for the Physics and Mathematics of the Universe (IPMU) [カブリ数物連携宇宙研](JP) at The University of Tokyo [東京大学](JP): “New theory finds upcoming satellite mission will be able to detect more than expected” 


    The upcoming satellite experiment LiteBIRD is expected to probe the physics of the very early Universe if primordial inflation happened at high energies.

    JAXA LiteBIRD Kavli IPMU

    But now, a new paper in Physical Review Letters shows it can also test inflationary scenarios operating at lower energies.

    The green line is the lowest signal the LiteBIRD can still observe, so any observable signal should be above that line. The red and black lines are the team’s predictions for two different parameter specifications in their model, showing detection is possible. In contrast, the more standard inflationary models operating at the same energy as the team’s mechanism predict the lower gray (dashed) line, which is below the sensitivity limit of LiteBIRD. (Credit: Cai et al.)

    Cosmologists believe that in its very early stages, the Universe underwent a very rapid expansion called “cosmic inflation”.

    _____________________________________________________________________________________
    Inflation

    Alan Guth, from M.I.T., who first proposed cosmic inflation.

    Lambda Cold Dark Matter accelerated expansion of the Universe. Credit: Alex Mittelmann.

    Alan Guth’s notes:
    Alan Guth’s original notes on inflation.
    _____________________________________________________________________________________

    A success story of this hypothesis is that even the simplest inflationary models accurately predict the inhomogeneous distribution of matter in the Universe. During inflation, quantum vacuum fluctuations were stretched to astronomical scales, becoming the source of all the structure in the Universe, including the Cosmic Microwave Background [CMB] anisotropies and the distribution of Dark Matter and galaxies.

    CMB, per the European Space Agency (EU) Planck mission.

    The same mechanism also produced gravitational waves.

    Gravitational waves. Credit: W.Benger-Zib. MPG Institute for Gravitational Physics (DE)

    These gravitational waves, propagating ripples of space and time, are important for understanding the physics of the inflationary epoch. In general, detecting them is considered a way of determining the energy at which inflation took place. It is also linked to how much the inflaton field, the energy source of inflation, can change during inflation, a relation referred to as the “Lyth bound”.
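    The Lyth bound can be sketched numerically. In its common form the inflaton excursion per e-fold is dφ/dN ≈ M_Pl √(r/8), so for a given tensor-to-scalar ratio r and number of e-folds N (both illustrative values below, not taken from the article):

    ```python
    import math

    r = 0.01        # tensor-to-scalar ratio (illustrative value)
    n_efolds = 60   # e-folds of observable inflation (typical assumption)

    # Lyth bound: total field excursion Δφ ≈ N * M_Pl * sqrt(r/8)
    delta_phi = n_efolds * math.sqrt(r / 8)  # in units of the reduced Planck mass
    print(f"Δφ ≈ {delta_phi:.1f} M_Pl")  # super-Planckian for this choice of r and N
    ```

    An observable r thus generically implies a field excursion of order the Planck mass, which is why the bound ties a gravitational-wave detection to the energetics of inflation.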

    An artist’s conception of how gravitational waves distort the shape of space and time in the universe (Credit: Kavli IPMU).

    The primordial gravitational waves generated from vacuum are extremely weak, and are very difficult to detect, but the JAXA-led LiteBIRD mission might be able to detect them via the polarization measurements of the Cosmic Microwave Background. Because of this, understanding primordial gravitational waves theoretically is gaining interest so any potential detection by LiteBIRD can be interpreted. It is expected LiteBIRD will be able to detect primordial gravitational waves if inflation happened at sufficiently high energies.
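    The link between a detection of the tensor-to-scalar ratio r and the energy scale of inflation mentioned above can be sketched with the standard single-field relation V = (3π²/2) A_s r M_Pl⁴, where A_s is the measured scalar fluctuation amplitude. The values below are illustrative, not from the article:

    ```python
    import math

    M_PL = 2.435e18   # reduced Planck mass in GeV
    A_S = 2.1e-9      # scalar fluctuation amplitude (approximate Planck value)
    r = 0.01          # illustrative tensor-to-scalar ratio

    # Potential energy density during inflation and its characteristic scale
    V = (3 * math.pi**2 / 2) * A_S * r * M_PL**4
    energy_scale = V ** 0.25
    print(f"V^(1/4) ≈ {energy_scale:.2e} GeV")  # of order 1e16 GeV, near the GUT scale
    ```

    This is why a vacuum-sourced signal detectable by LiteBIRD corresponds to very high inflationary energies, and why low-energy scenarios need an extra source of gravitational waves, as in the mechanism below.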

    Several inflationary models constructed in the framework of quantum gravity predict a very low energy scale for inflation, and so would seem untestable by LiteBIRD. However, a new study by researchers, including at the Kavli Institute for the Physics and Mathematics of the Universe (Kavli IPMU), has shown the opposite: such scenarios of fundamental importance can be tested by LiteBIRD if they are accompanied by additional fields sourcing gravitational waves.

    The researchers suggest an idea that is logically very different from the usual approach.

    “Within our framework in addition to the gravitational waves originating from vacuum fluctuations, a large amount of gravitational waves can be sourced by the quantum vacuum fluctuations of additional fields during inflation. Due to this we were able to produce an observable amount of gravitational waves even if inflation takes place at lower energies.

    “The quantum fluctuations of scalar fields during inflation are typically small, and such induced gravitational waves are not relevant in standard inflationary scenarios. However, if the fluctuations of the additional fields are enhanced, they can source a significant amount of gravitational waves,” said paper author and Kavli IPMU Project Researcher Valeri Vardanyan.

    Other researchers have been working on related ideas, but so far no successful mechanism based on scalar fields alone has been found.

    “The main problem is that when you generate gravitational waves from enhanced fluctuations of additional fields, you also simultaneously generate extra curvature fluctuations, which would make the Universe appear more clumpy than it is in reality. We elegantly decoupled the generation of the two types of fluctuations and solved this problem,” said Vardanyan.

    In their paper, the researchers proposed a proof-of-concept based on two scalar fields operating during inflation.

    “Imagine a car with two engines, corresponding to the two fields of our model. One of the engines is connected to the wheels of the car, while the other one is not. The first one is responsible for moving the car, and, when on a muddy road, for generating all the traces on the road. These represent the seeds of structure in the Universe. The second engine is only producing sound. This represents the gravitational waves, and does not contribute to the movement of the car, or the generation of traces on the road,” said Vardanyan.

    The team quantitatively demonstrated that their mechanism works, and even calculated the predictions of their model for the upcoming LiteBIRD mission.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The Kavli Institute for the Physics and Mathematics of the Universe (IPMU) [カブリ数物連携宇宙研](JP) at The University of Tokyo [東京大学](JP) is an international research institute with English as its official language. The goal of the institute is to discover the fundamental laws of nature and to understand the Universe from the synergistic perspectives of mathematics, astronomy, and theoretical and experimental physics. The Institute for the Physics and Mathematics of the Universe (IPMU) was established in October 2007 under the World Premier International Research Center Initiative (WPI) of the Ministry of Education, Sports, Science and Technology in Japan with the University of Tokyo as the host institution. IPMU was designated as the first research institute within the University of Tokyo Institutes for Advanced Study (UTIAS) in January 2011. It received an endowment from The Kavli Foundation and was renamed the “Kavli Institute for the Physics and Mathematics of the Universe” in April 2012. Kavli IPMU is located on the Kashiwa campus of the University of Tokyo, and more than half of its full-time scientific members come from outside Japan. http://www.ipmu.jp/

    The Kavli Foundation, based in Oxnard, California, is dedicated to the goals of advancing science for the benefit of humanity and promoting increased public understanding and support for scientists and their work.

    The Foundation’s mission is implemented through an international program of research institutes, professorships, and symposia in the fields of astrophysics, nanoscience, neuroscience, and theoretical physics as well as prizes in the fields of astrophysics, nanoscience, and neuroscience.

     
  • richardmitnick 12:52 pm on February 8, 2021 Permalink | Reply
    Tags: "The Stars Within Us", Cosmic Microwave Background - CMB, Creation of heavier elements requires more extreme environments usually triggered by the end of a star’s life in a supernova., How the Elements Inside You and Everything Were Forged., Intense heat and pressure fused hydrogen atoms to form helium and lithium., Within a few hundred million years after the Big Bang clouds of hydrogen gas condensed into the first stars., Within the first three minutes following the Big Bang the fundamental building blocks of matter formed and merged into the first element–hydrogen.

    From National Science Foundation (US): “The Stars Within Us” 


    Credit: Nicolle R. Fuller/NSF.

    Humans have always looked to the stars and studied them. Over the past century, science has revealed the fundamental role stars play for nearly everything in existence, including the elements on the Periodic Table.

    Periodic Table from the International Union of Pure and Applied Chemistry, 2019.

    The birth, life and death of every star creates and disseminates the elements of the Periodic Table throughout the universe, a cycle that began nearly 14 billion years ago and repeats continuously today.

    Without it, the Earth and everything on it – air, water, soil, plants, wildlife, and human life – would not exist.


    The Stars Within Us: How the Elements Inside You, and Everything, Were Forged.

    Within the first three minutes following the Big Bang, the fundamental building blocks of matter formed and merged into the first element–hydrogen. Within a few hundred million years after the Big Bang, clouds of hydrogen gas condensed into the first stars. In the cores of those stars, intense heat and pressure fused hydrogen atoms to form helium and lithium.

    Recently, astronomers from several U.S.-based universities detected a signal from the birth of those early stars. Since the stars are too distant to be seen with telescopes, the astronomers searched for indirect evidence, such as a tell-tale change in the background electromagnetic radiation that permeates the universe, called the cosmic microwave background [CMB].

    CMB per ESA/Planck.

    Supported for more than a decade by the U.S. National Science Foundation, researchers placed a radio antenna not much larger than a refrigerator in the Australian desert and found clear evidence of these massive blue stars.

    EDGES telescope in a radio quiet zone at the Murchison Radio-astronomy Observatory in Western Australia.

    More chaos, more elements

    The normal functions of a star—those that make it shine brightly and burn at temperatures of thousands of degrees—create the simplest and lightest elements. Creation of heavier elements requires more extreme environments, usually triggered by the end of a star’s life in a supernova.

    After the hydrogen in a star’s core is exhausted, the star fuses helium and then progressively heavier elements, such as carbon and, ultimately, iron. As this fuel runs out, the star either explodes into a supernova, seeding the universe with those elements, or violently collapses, creating neutron stars and black holes. In such violent implosions, star collisions, and the extreme environments around black holes, the heavier elements are forged and then spread far across interstellar space.

    Artist’s now iconic illustration of two merging neutron stars. The beams represent the gamma-ray burst while the rippling space-time grid indicates the isotropic gravitational waves. Credit: A. Simonnet/National Science Foundation/LIGO/Sonoma State University.

    In 2017, for the first time in history, researchers using the twin detectors of NSF’s Laser Interferometer Gravitational-Wave Observatory detected gravitational waves created by the collision of two neutron stars.

    Localizations of gravitational-wave signals detected by LIGO in 2015–2017 (GW150914, LVT151012, GW151226, GW170104) and, more recently, by the LIGO-Virgo network (GW170814, GW170817), after Virgo (IT) came online in August 2017.

    The researchers worked with the Europe-based Virgo gravitational wave detector and some 70 ground- and space-based telescopes across the globe to track and record the gamma radiation, X-rays, light, and radio waves that cascaded from the explosion.

    MIT/Caltech Advanced LIGO at Hanford, WA (US) and Livingston, LA (US), and the VIRGO Gravitational Wave interferometer near Pisa, Italy.

    The observations revealed signatures of recently synthesized elements, including gold and platinum, solving a decades-long mystery of how nearly half of all elements heavier than iron are produced.

    Some of the heaviest elements, such as uranium, are forged near black holes and in the powerful jets that can emanate from them, such as those that surge away from “feeding” black holes in blazars: active galactic nuclei with relativistic jets composed of ionized matter.

    The timeline of the universe, with the first stars emerging by 180 million years after the Big Bang and black holes about 70 million years after that. Photo Credit: N.R. Fuller/National Science Foundation.

    The NSF-supported Event Horizon Telescope presented the first direct visual evidence of a supermassive black hole in 2019, and NSF’s IceCube detector has worked with collaborating observatories to trace a cosmic neutrino to its blazar source.

    EHT map.

    Messier 87*, The first image of the event horizon of a black hole. This is the supermassive black hole at the center of the galaxy Messier 87. Image via JPL/ Event Horizon Telescope Collaboration released on 10 April 2019.

    These extreme environments in space are where the heaviest elements are formed, but because those elements have such short half-lives, scientists have yet to directly witness their formation, and they have not survived to be found on Earth today.

    This is where researchers in the laboratory have built upon what we have learned from studying the cosmos.

    Filling the Periodic Table

    On Earth, ancient cultures were first to isolate a handful of elements, such as copper and mercury, though in recent centuries, scientists have identified and isolated more than 100 more. They are categorized using the Periodic Table—first published in 1869 by Russian chemist Dmitri Mendeleev. The initial Periodic Table contained 28 elements, and Mendeleev predicted the existence of unidentified elements, leaving gaps for future scientists to fill.

    Laboratory experiments have expanded the Periodic Table to 118 known elements. Some, particularly the heaviest, were only discovered when physicists created them by fusing lighter elements. The heaviest known element is oganesson, which holds 118 protons in its nucleus, although only for fractions of a millisecond.

    Like the stars that constantly recycle and distribute elements throughout space, researchers in all disciplines continue their efforts to expand the Periodic Table and deepen the understanding of the atoms from which we are constructed. This is an ongoing process, and future generations of scientists are just now making their initial observations or conducting their first experiments that will expand the knowledge about the universe and ourselves.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition
    The National Science Foundation (NSF) (US) is an independent federal agency created by Congress in 1950 “to promote the progress of science; to advance the national health, prosperity, and welfare; to secure the national defense…” We are the funding source for approximately 24 percent of all federally supported basic research conducted by America’s colleges and universities. In many fields such as mathematics, computer science and the social sciences, NSF is the major source of federal backing.

    We fulfill our mission chiefly by issuing limited-term grants — currently about 12,000 new awards per year, with an average duration of three years — to fund specific research proposals that have been judged the most promising by a rigorous and objective merit-review system. Most of these awards go to individuals or small groups of investigators. Others provide funding for research centers, instruments and facilities that allow scientists, engineers and students to work at the outermost frontiers of knowledge.

    NSF’s goals — discovery, learning, research infrastructure and stewardship — provide an integrated strategy to advance the frontiers of knowledge, cultivate a world-class, broadly inclusive science and engineering workforce and expand the scientific literacy of all citizens, build the nation’s research capability through investments in advanced instrumentation and facilities, and support excellence in science and engineering research and education through a capable and responsive organization. We like to say that NSF is “where discoveries begin.”

    Many of the discoveries and technological advances have been truly revolutionary. In the past few decades, NSF-funded researchers have won some 236 Nobel Prizes as well as other honors too numerous to list. These pioneers have included the scientists or teams that discovered many of the fundamental particles of matter, analyzed the cosmic microwaves left over from the earliest epoch of the universe, developed carbon-14 dating of ancient artifacts, decoded the genetics of viruses, and created an entirely new state of matter called a Bose-Einstein condensate.

    NSF also funds equipment that is needed by scientists and engineers but is often too expensive for any one group or researcher to afford. Examples of such major research equipment include giant optical and radio telescopes, Antarctic research sites, high-end computer facilities and ultra-high-speed connections, ships for ocean research, sensitive detectors of very subtle physical phenomena and gravitational wave observatories.

    Another essential element in NSF’s mission is support for science and engineering education, from pre-K through graduate school and beyond. The research we fund is thoroughly integrated with education to help ensure that there will always be plenty of skilled people available to work in new and emerging scientific, engineering and technological fields, and plenty of capable teachers to educate the next generation.

    No single factor is more important to the intellectual and economic progress of society, and to the enhanced well-being of its citizens, than the continuous acquisition of new knowledge. NSF is proud to be a major part of that process.

    Specifically, the Foundation’s organic legislation authorizes us to engage in the following activities:

    Initiate and support, through grants and contracts, scientific and engineering research and programs to strengthen scientific and engineering research potential, and education programs at all levels, and appraise the impact of research upon industrial development and the general welfare.
    Award graduate fellowships in the sciences and in engineering.
    Foster the interchange of scientific information among scientists and engineers in the United States and foreign countries.
    Foster and support the development and use of computers and other scientific methods and technologies, primarily for research and education in the sciences.
    Evaluate the status and needs of the various sciences and engineering and take into consideration the results of this evaluation in correlating our research and educational programs with other federal and non-federal programs.
    Provide a central clearinghouse for the collection, interpretation and analysis of data on scientific and technical resources in the United States, and provide a source of information for policy formulation by other federal agencies.
    Determine the total amount of federal money received by universities and appropriate organizations for the conduct of scientific and engineering research, including both basic and applied, and construction of facilities where such research is conducted, but excluding development, and report annually thereon to the President and the Congress.
    Initiate and support specific scientific and engineering activities in connection with matters relating to international cooperation, national security and the effects of scientific and technological applications upon society.
    Initiate and support scientific and engineering research, including applied research, at academic and other nonprofit institutions and, at the direction of the President, support applied research at other organizations.
    Recommend and encourage the pursuit of national policies for the promotion of basic research and education in the sciences and engineering. Strengthen research and education innovation in the sciences and engineering, including independent research by individuals, throughout the United States.
    Support activities designed to increase the participation of women and minorities and others underrepresented in science and technology.

    At present, NSF has a total workforce of about 2,100 at its Alexandria, VA, headquarters, including approximately 1,400 career employees, 200 scientists from research institutions on temporary duty, 450 contract workers and the staff of the NSB office and the Office of the Inspector General.

    NSF is divided into the following seven directorates that support science and engineering research and education: Biological Sciences, Computer and Information Science and Engineering, Engineering, Geosciences, Mathematical and Physical Sciences, Social, Behavioral and Economic Sciences, and Education and Human Resources. Each is headed by an assistant director and each is further subdivided into divisions like materials research, ocean sciences and behavioral and cognitive sciences.

    Within NSF’s Office of the Director, the Office of Integrative Activities also supports research and researchers. Other sections of NSF are devoted to financial management, award processing and monitoring, legal affairs, outreach and other functions. The Office of the Inspector General examines the foundation’s work and reports to the NSB and Congress.

    Each year, NSF supports an average of about 200,000 scientists, engineers, educators and students at universities, laboratories and field sites all over the United States and throughout the world, from Alaska to Alabama to Africa to Antarctica. You could say that NSF support goes “to the ends of the earth” to learn more about the planet and its inhabitants, and to produce fundamental discoveries that further the progress of research and lead to products and services that boost the economy and improve general health and well-being.

    As described in our strategic plan, NSF is the only federal agency whose mission includes support for all fields of fundamental science and engineering, except for medical sciences. NSF is tasked with keeping the United States at the leading edge of discovery in a wide range of scientific areas, from astronomy to geology to zoology. So, in addition to funding research in the traditional academic areas, the agency also supports “high risk, high pay off” ideas, novel collaborations and numerous projects that may seem like science fiction today, but which the public will take for granted tomorrow. And in every case, we ensure that research is fully integrated with education so that today’s revolutionary work will also be training tomorrow’s top scientists and engineers.

    Unlike many other federal agencies, NSF does not hire researchers or directly operate our own laboratories or similar facilities. Instead, we support scientists, engineers and educators directly through their own home institutions (typically universities and colleges). Similarly, we fund facilities and equipment such as telescopes, through cooperative agreements with research consortia that have competed successfully for limited-term management contracts.

    NSF’s job is to determine where the frontiers are, identify the leading U.S. pioneers in these fields and provide money and equipment to help them continue. The results can be transformative. For example, years before most people had heard of “nanotechnology,” NSF was supporting scientists and engineers who were learning how to detect, record and manipulate activity at the scale of individual atoms — the nanoscale. Today, scientists are adept at moving atoms around to create devices and materials with properties that are often more useful than those found in nature.

    Dozens of companies are gearing up to produce nanoscale products. NSF is funding the research projects, state-of-the-art facilities and educational opportunities that will teach new skills to the science and engineering students who will make up the nanotechnology workforce of tomorrow.

    At the same time, we are looking for the next frontier.

    NSF’s task of identifying and funding work at the frontiers of science and engineering is not a “top-down” process. NSF operates from the “bottom up,” keeping close track of research around the United States and the world, maintaining constant contact with the research community to identify ever-moving horizons of inquiry, monitoring which areas are most likely to result in spectacular progress and choosing the most promising people to conduct the research.

    NSF funds research and education in most fields of science and engineering. We do this through grants and cooperative agreements to more than 2,000 colleges, universities, K-12 school systems, businesses, informal science organizations and other research organizations throughout the U.S. The Foundation considers proposals submitted by organizations on behalf of individuals or groups for support in most fields of research. Interdisciplinary proposals also are eligible for consideration. Awardees are chosen from those who send us proposals asking for a specific amount of support for a specific project.

    Proposals may be submitted in response to the various funding opportunities that are announced on the NSF website. These funding opportunities fall into three categories — program descriptions, program announcements and program solicitations — and are the mechanisms NSF uses to generate funding requests. At any time, scientists and engineers are also welcome to send in unsolicited proposals for research and education projects, in any existing or emerging field. The Proposal and Award Policies and Procedures Guide (PAPPG) provides guidance on proposal preparation and submission and award management. At present, NSF receives more than 42,000 proposals per year.

    To ensure that proposals are evaluated in a fair, competitive, transparent and in-depth manner, we use a rigorous system of merit review. Nearly every proposal is evaluated by a minimum of three independent reviewers consisting of scientists, engineers and educators who do not work at NSF or for the institution that employs the proposing researchers. NSF selects the reviewers from among the national pool of experts in each field and their evaluations are confidential. On average, approximately 40,000 experts, knowledgeable about the current state of their field, give their time to serve as reviewers each year.

    The reviewer’s job is to decide which projects are of the very highest caliber. NSF’s merit review process, considered by some to be the “gold standard” of scientific review, ensures that many voices are heard and that only the best projects make it to the funding stage. An enormous amount of research, deliberation, thought and discussion goes into award decisions.

    The NSF program officer reviews the proposal and analyzes the input received from the external reviewers. After scientific, technical and programmatic review and consideration of appropriate factors, the program officer makes an “award” or “decline” recommendation to the division director. Final programmatic approval for a proposal is generally completed at NSF’s division level. A principal investigator (PI) whose proposal for NSF support has been declined will receive information and an explanation of the reason(s) for declination, along with copies of the reviews considered in making the decision. If that explanation does not satisfy the PI, he/she may request additional information from the cognizant NSF program officer or division director.

    If the program officer makes an award recommendation and the division director concurs, the recommendation is submitted to NSF’s Division of Grants and Agreements (DGA) for award processing. A DGA officer reviews the recommendation from the program division/office for business, financial and policy implications, and the processing and issuance of a grant or cooperative agreement. DGA generally makes awards to academic institutions within 30 days after the program division/office makes its recommendation.

     
  • richardmitnick 10:40 am on April 25, 2019 Permalink | Reply
    Tags: "Latest Hubble Measurements Suggest Disparity in Hubble Constant Calculations is not a Fluke", , , , , Cosmic Microwave Background - CMB, , Hubble’s measurements of today’s expansion rate do not match the rate that was expected based on how the Universe appeared shortly after the Big Bang over 13 billion years ago., , , To get accurate distances to nearby galaxies the team then looked for galaxies containing both Cepheids and Type Ia supernovae, Using new data from the NASA/ESA Hubble Space Telescope astronomers have significantly lowered the possibility that this discrepancy is a fluke.   

    From NASA/ESA Hubble Telescope: “Latest Hubble Measurements Suggest Disparity in Hubble Constant Calculations is not a Fluke” 

    NASA Hubble Banner

    NASA/ESA Hubble Telescope


    From NASA/ESA Hubble Telescope

    25 April 2019

    Adam Riess
    Space Telescope Science Institute
    Baltimore, USA
    Tel: +1 410 338 6707
    Email: ariess@stsci.edu

    Bethany Downer
    ESA/Hubble, Public Information Officer
    Garching, Germany
    Email: bethany.downer@partner.eso.org

    Hubble’s measurements of today’s expansion rate do not match the rate that was expected based on how the Universe appeared shortly after the Big Bang over 13 billion years ago. Using new data from the NASA/ESA Hubble Space Telescope, astronomers have significantly lowered the possibility that this discrepancy is a fluke.

    This image shows the entire Large Magellanic Cloud, with some of the brightest objects marked. The outline shown corresponds to the overview image from Digitized Sky Survey 2. The field of view is about ten degrees across. Credit: Robert Gendler/ESO

    Three steps to the Hubble constant | ESA/Hubble


    This animation shows the principle of the cosmic distance ladder used by Adam Riess and his team to reduce the uncertainty of the Hubble constant. For the calibration of relatively short distances, the team observed Cepheid variables. These pulsating stars fade and brighten at rates proportional to their true brightness, a property that allows astronomers to determine their distances. The researchers calibrated the distances to the Cepheids using a basic geometrical technique called parallax. With Hubble’s sharp-eyed Wide Field Camera 3 (WFC3), they extended the parallax measurements further than previously possible, across the Milky Way galaxy.
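    The parallax step at the base of the ladder is simple geometry: a star whose position shifts by a parallax angle p (in arcseconds) as the Earth orbits the Sun lies at a distance of 1/p parsecs. A minimal Python sketch of that relation (the function name and the example parallax value are illustrative, not from the article):

```python
def parallax_distance_pc(parallax_arcsec: float) -> float:
    """Distance in parsecs from a parallax angle in arcseconds: d = 1/p."""
    if parallax_arcsec <= 0:
        raise ValueError("parallax must be positive")
    return 1.0 / parallax_arcsec

# Example: a hypothetical Cepheid with a parallax of 0.5 milliarcseconds
# lies at 1 / 0.0005 = 2000 parsecs.
d_pc = parallax_distance_pc(0.5e-3)
```

    The smaller the parallax, the larger the distance, which is why extending parallax measurements across the Milky Way (rather than only to nearby stars) was a significant step.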

    NASA/ESA Hubble WFC3

    To get accurate distances to nearby galaxies, the team then looked for galaxies containing both Cepheids and Type Ia supernovae. Type Ia supernovae always have the same intrinsic brightness and are also bright enough to be seen at relatively large distances. By comparing the observed brightness of both types of stars in those nearby galaxies, the team could then accurately measure the true brightness of the supernovae. Using this calibrated rung on the distance ladder, the team calculated accurate distances to an additional 300 Type Ia supernovae in far-flung galaxies.
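    The standard-candle step above rests on the distance modulus relation, m − M = 5 log₁₀(d / 10 pc), which converts a candle’s known intrinsic (absolute) magnitude M and its observed (apparent) magnitude m into a distance. A minimal Python illustration; the M ≈ −19.3 figure for Type Ia supernovae is a commonly quoted value, not taken from this article:

```python
def distance_ladder_mpc(apparent_mag: float, absolute_mag: float) -> float:
    """Distance in megaparsecs from the distance modulus m - M = 5*log10(d/10 pc)."""
    d_pc = 10 ** ((apparent_mag - absolute_mag + 5) / 5)
    return d_pc / 1e6  # parsecs -> megaparsecs

# A Type Ia supernova (assumed M ~ -19.3) observed at apparent magnitude 16.7
# would lie at 10**((16.7 + 19.3 + 5) / 5) parsecs, i.e. about 158 Mpc.
d_mpc = distance_ladder_mpc(16.7, -19.3)
```

    Once the Cepheids fix M for the supernovae in nearby galaxies, this same formula yields distances to far fainter supernovae in far-flung galaxies.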

    Cosmic Distance Ladder, skynetblogs

    Standard Candles to measure age and distance of the universe from supernovae NASA

    They compare those distance measurements with how the light from the supernovae is stretched to longer wavelengths by the expansion of space. Finally, they use these two values to calculate how fast the Universe expands with time, a value known as the Hubble constant.

    Credit: NASA, ESA, A. Feild (STScI), and A. Riess (STScI/JHU)

    Using new observations from the NASA/ESA Hubble Space Telescope, researchers have improved the foundations of the cosmic distance ladder, which is used to calculate accurate distances to nearby galaxies. This was done by observing pulsating stars called Cepheid variables in a neighbouring satellite galaxy known as the Large Magellanic Cloud, now calculated to be 162,000 light-years away.

    When defining the distances to galaxies that are further and further away, these Cepheid variables are used as milepost markers. Researchers use these measurements to determine how fast the Universe is expanding over time, a value known as the Hubble constant.

    Before Hubble was launched in 1990, estimates of the Hubble constant varied by a factor of two. In the late 1990s the Hubble Space Telescope Key Project on the Extragalactic Distance Scale refined the value of the Hubble constant to within 10 percent, accomplishing one of the telescope’s key goals. In 2016, astronomers using Hubble discovered that the Universe is expanding between five and nine percent faster than previously calculated by refining the measurement of the Hubble constant and further reducing the uncertainty to only 2.4 percent. In 2017, an independent measurement supported these results. This latest research has reduced the uncertainty in their Hubble constant value to an unprecedented 1.9 percent.

    This research also reduces the likelihood that the discrepancy between measurements of today’s expansion rate of the Universe and the value expected from the early Universe is a fluke to just 1 in 100,000, a significant improvement over last year’s estimate of 1 in 3,000.

    “The Hubble tension between the early and late Universe may be the most exciting development in cosmology in decades,” said lead researcher and Nobel Laureate Adam Riess of the Space Telescope Science Institute (STScI) and Johns Hopkins University, in Baltimore, USA. “This mismatch has been growing and has now reached a point that is really impossible to dismiss as a fluke. This disparity could not plausibly occur by chance.”

    As the team’s measurements have become more precise, their calculation of the Hubble constant has remained inconsistent with the expected value derived from observations of the early Universe’s expansion made by the European Space Agency’s Planck satellite. These measurements map a remnant afterglow from the Big Bang known as the Cosmic Microwave Background [CMB], which helps scientists predict how the early Universe would likely have evolved into the expansion rate astronomers can measure today.

    CMB per ESA/Planck

    ESA/Planck 2009 to 2013

    The new estimate of the Hubble constant is 74.03 kilometres per second per megaparsec [1]. The number indicates that the Universe is expanding at a rate about 9 percent faster than that implied by Planck’s observations of the early Universe, which give a value for the Hubble constant of 67.4 kilometres per second per megaparsec.
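    Hubble’s law, v = H₀ × d, turns the two quoted values of the constant into concrete recession velocities, and makes the size of the tension explicit. A short Python sketch using the numbers above (function and variable names are illustrative):

```python
H0_LADDER = 74.03  # km/s/Mpc, from the Cepheid/supernova distance ladder
H0_PLANCK = 67.4   # km/s/Mpc, inferred from Planck's CMB observations

def recession_velocity_km_s(distance_mpc: float, h0: float) -> float:
    """Hubble's law: recession velocity v = H0 * d."""
    return h0 * distance_mpc

# A galaxy 100 Mpc away recedes at ~7403 km/s under the ladder value,
# versus ~6740 km/s under the Planck value.
v_ladder = recession_velocity_km_s(100.0, H0_LADDER)
v_planck = recession_velocity_km_s(100.0, H0_PLANCK)

# The fractional disagreement between the two constants is roughly 9-10%.
tension = (H0_LADDER - H0_PLANCK) / H0_PLANCK
```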

    To reach this conclusion, Riess and his team analysed the light from 70 Cepheid variables in the Large Magellanic Cloud. Because these stars brighten and dim at predictable rates, and the periods of these variations reveal their intrinsic luminosity and hence their distance, astronomers use them as cosmic mileposts. Riess’s team used an efficient observing technique called Drift And Shift (DASH), using Hubble as a “point-and-shoot” camera to snap quick images of the bright stars. This avoids the more time-consuming step of anchoring the telescope with guide stars to observe each star. The results were combined with observations made by the Araucaria Project, a collaboration between astronomers from institutions in Europe, Chile, and the United States, to measure the distance to the Large Magellanic Cloud by observing the dimming of light as one star passes in front of its partner in a binary-star system.

    Because cosmological models suggest that observed values of the expansion of the Universe should be the same as those determined from the Cosmic Microwave Background, new physics may be needed to explain the disparity. “Previously, theorists would say to me, ‘it can’t be. It’s going to break everything.’ Now they are saying, ‘we actually could do this,’” Riess said.

    Various scenarios have been proposed to explain the discrepancy, but there is yet to be a conclusive answer. An invisible form of matter called dark matter may interact more strongly with normal matter than astronomers previously thought. Or perhaps dark energy, an unknown form of energy that pervades space, is responsible for accelerating the expansion of the Universe.

    Although Riess does not have an answer to this perplexing disparity, he and his team intend to continue using Hubble to reduce the uncertainty in their measure of the Hubble constant, which they hope to decrease to 1 percent.

    The team’s results have been accepted for publication in The Astrophysical Journal.
    Notes

    [1] This means that for every 3.3 million light-years further away a galaxy is from us, it appears to be moving about 74 kilometres per second faster, as a result of the expansion of the Universe.

    See the full article here.
    See the full HubbleSite article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    The Hubble Space Telescope is a project of international cooperation between NASA and the European Space Agency. NASA’s Goddard Space Flight Center manages the telescope. The Space Telescope Science Institute (STScI), a free-standing science center located on the campus of The Johns Hopkins University and operated by the Association of Universities for Research in Astronomy (AURA) for NASA, conducts Hubble science operations.

    ESA50 Logo large

    AURA Icon

     