Tagged: Dark Energy

  • richardmitnick 2:08 pm on May 8, 2023
    Tags: Dark Energy, “The Euclid spacecraft will transform how we view the ‘dark universe’”

    From The European Space Agency [La Agencia Espacial Europea] [Agence spatiale européenne] [Europäische Weltraumorganization](EU) Via “phys.org” And “The Conversation (AU)”: “The Euclid spacecraft will transform how we view the ‘dark universe’”


    5.8.23

    Euclid is set to launch this year on a rocket built by SpaceX. Credit: Work performed by ATG under contract for ESA, CC BY-SA

    “The European Space Agency’s (ESA) Euclid satellite completed the first part of its long journey into space on May 1, 2023, when it arrived in Florida on a boat from Italy. It is scheduled to lift off on a Falcon 9 rocket, built by SpaceX, from Cape Canaveral in early July.

    Euclid is designed to provide us with a better understanding of the “mysterious” components of our universe, known as dark matter and dark energy.

    Unlike the normal matter we experience here on Earth, dark matter neither reflects nor emits light. It binds galaxies together and is thought to make up about 80% of all the mass in the universe. We’ve known about it for a century, but its true nature remains an enigma.

    Dark energy is similarly puzzling. Astronomers have shown that the expansion of the universe over the last five billion years has been accelerating, rather than slowing as expected. Many believe this acceleration is driven by an unseen force, which has been dubbed dark energy. This makes up about 70% of the energy in the universe.

    Euclid will map this “dark universe,” using a suite of scientific instruments to shed light on different aspects of dark energy and dark matter.

    A light in the dark

    After launch, Euclid will undertake a month-long journey to a region in space called the second Earth-Sun Lagrangian point (L2), which is about four times further from us than the Moon.

    It’s where the gravitational pull of the Sun and the Earth balance out and provides a stable vantage point for Euclid to observe the universe. Euclid will join the James Webb Space Telescope (JWST) at this point and will be the perfect companion to that amazing space observatory.
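The distance to L2 can be sanity-checked with the standard Hill-radius approximation for the Sun-Earth system; a back-of-envelope sketch in Python, using round textbook values rather than mission figures:

```python
# Back-of-envelope estimate of the Sun-Earth L2 distance using the
# Hill-radius approximation r ~ a * (m / (3 M))**(1/3), where a is the
# Earth-Sun distance, m the Earth's mass and M the Sun's mass.
a = 149.6e6            # mean Earth-Sun distance, km
m_earth = 5.97e24      # kg (Moon's mass ignored for simplicity)
m_sun = 1.989e30       # kg
moon_dist = 384_400    # mean Earth-Moon distance, km

r_l2 = a * (m_earth / (3 * m_sun)) ** (1 / 3)

print(f"L2 is roughly {r_l2:,.0f} km from Earth")
print(f"about {r_l2 / moon_dist:.1f} times the Moon's distance")
```

With these inputs the estimate comes out near 1.5 million kilometres, roughly four times the Earth-Moon distance.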

    My involvement in Euclid began in 2007 when I was invited by ESA to participate in an independent concept advisory team to assess two competing mission proposals called SPACE and DUNE.

    Both used different techniques, and therefore different instruments, to study the dark universe, and ESA was struggling to decide between them. Our team concluded that both concepts had merit, especially as the two methods would provide a vital cross-check on each other. Euclid was thus born from the best of both concepts.

    Euclid is designed to survey a large fraction of the universe, so it needs instruments with wide fields of view. The wider the field of view of the imaging instrument, the more of the universe it can observe. To achieve this, Euclid uses a relatively small telescope compared with JWST: Euclid is roughly the size of a truck, while JWST is the size of an aircraft. But Euclid also carries some of the biggest digital cameras deployed in space, with fields of view hundreds of times greater than JWST’s.

    Shapes and colors

    The Euclid VIS (visible) instrument, built mostly in the UK, is designed to measure the positions and shapes of as many galaxies as possible, looking for subtle correlations in this data caused by the gravitational lensing of light as it travels to us through the intervening dark matter. This lensing effect is weak, only one part in a hundred thousand for most galaxies, so huge numbers of galaxies are needed to detect it clearly. VIS will therefore produce Hubble-telescope-like image quality over a third of the night sky.
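To see why so many galaxies are needed: each galaxy's intrinsic ellipticity (scatter around 0.3) dwarfs the coherent lensing signal, and averaging N shapes only beats the noise down as 1/√N. A toy simulation with illustrative numbers (not Euclid's actual pipeline or signal level):

```python
import random

# Toy weak-lensing illustration: a small coherent shear buried in much
# larger random intrinsic galaxy shapes only emerges after averaging
# huge numbers of measured ellipticities.
random.seed(42)

true_shear = 0.01     # illustrative coherent signal (assumed, for the demo)
shape_noise = 0.3     # typical scatter of intrinsic galaxy ellipticities
n_gal = 1_000_000

measured = (true_shear + random.gauss(0.0, shape_noise) for _ in range(n_gal))
estimate = sum(measured) / n_gal

# The statistical error on the mean falls as shape_noise / sqrt(N).
expected_error = shape_noise / n_gal ** 0.5

print(f"recovered shear = {estimate:.4f} (true value {true_shear})")
print(f"expected 1-sigma error = {expected_error:.1e}")
```

Scaling the same arithmetic to the one-part-in-a-hundred-thousand signal quoted above would demand billions of galaxy shapes, which is why Euclid must image so much of the sky.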

    VIS, however, can’t measure the colors of objects. Color is needed to measure their distance through the redshift effect, where light from those objects is shifted to longer, or redder, wavelengths in a way that relates to their distance from us. Some of this data will need to come from existing and planned ground-based observatories, but Euclid also carries the NISP (Near-Infrared Spectrometer and Photometer) instrument, which is specifically designed to measure the infrared colors and spectra, and therefore redshifts, of the most distant galaxies that Euclid will see.
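The redshift effect described above has a one-line definition, z = (λ_observed − λ_emitted) / λ_emitted; a minimal sketch, with illustrative wavelengths rather than Euclid data:

```python
def redshift(lam_obs_nm: float, lam_emit_nm: float) -> float:
    """Redshift z from observed and rest-frame (emitted) wavelengths."""
    return (lam_obs_nm - lam_emit_nm) / lam_emit_nm

# The H-alpha line is emitted at 656.3 nm; if cosmic expansion stretches
# it to twice that wavelength, the galaxy is at z = 1 and the line lands
# in the near-infrared, where NISP observes.
h_alpha_rest = 656.3  # nm
z = redshift(2 * h_alpha_rest, h_alpha_rest)
print(z)  # 1.0
```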

    To measure dark energy, NISP will exploit a relatively new technique called Baryon Acoustic Oscillations (BAO) that provides an accurate measurement of the expansion history of the universe over its last 10 billion years. That history is vital for testing possible models of dark energy, including suggested modifications to Albert Einstein’s theory of general relativity.
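The BAO idea can be sketched numerically: the roughly 150 Mpc sound-horizon scale acts as a standard ruler, and the angle it subtends at each redshift depends on the integrated expansion history. A sketch assuming flat ΛCDM with illustrative parameters (H0 = 70, Ωm = 0.3), not the survey's actual analysis:

```python
import math

H0 = 70.0                    # Hubble constant, km/s/Mpc (illustrative)
OMEGA_M, OMEGA_L = 0.3, 0.7  # flat LambdaCDM (illustrative)
C = 299_792.458              # speed of light, km/s
R_S = 150.0                  # approximate BAO sound horizon, Mpc (comoving)

def comoving_distance(z: float, steps: int = 10_000) -> float:
    """Comoving distance in Mpc: trapezoidal integral of c/H(z') over [0, z]."""
    dz = z / steps
    total = 0.0
    for i in range(steps + 1):
        e = math.sqrt(OMEGA_M * (1 + i * dz) ** 3 + OMEGA_L)
        total += (0.5 if i in (0, steps) else 1.0) / e
    return (C / H0) * total * dz

# The same 150 Mpc ruler looks smaller on the sky at higher redshift;
# mapping that angle versus z is what traces the expansion history.
for z in (0.5, 1.0, 2.0):
    theta_deg = math.degrees(R_S / comoving_distance(z))  # small-angle approx.
    print(f"z = {z}: BAO scale subtends about {theta_deg:.2f} degrees")
```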

    Treasure trove

    Such an experiment takes an army of scientists and not everyone is solely working on dark matter and dark energy. Like JWST, Euclid will be a treasure-trove of new discoveries in many areas of astronomy. The Euclid consortium needs hundreds of people to help develop the sophisticated software needed to merge the space data with the ground-based data, and extract, to high accuracy, the shapes and colors of billions of galaxies.

    This software has also been checked and verified using some of the largest simulations of the universe that have ever been constructed. After arriving at L2, Euclid will undergo several months of testing, validation and calibration to ensure the instruments and telescope are working as expected. We are all familiar with such nervous waiting after the recent JWST launch.

    Once ready, Euclid will embark on a five-year survey of 15,000 square degrees of the sky with about 2,000 scientists from across the world collecting results along the way. However, the true power of Euclid will only be realized once we have all this data together and analyzed carefully. That could take another five years, taking us well into the next decade before we have our final dark answers. The SpaceX launch therefore only feels like the half-way point in the Euclid story.

    I will travel to Florida this summer to see the launch of Euclid. I will be joined by hundreds of my colleagues who have dedicated their careers to building this amazing telescope and experiment. Seeing the project come together in this way makes me proud to call myself a ‘Euclidian’.”

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.




    Please help promote STEM in your local schools.

    Stem Education Coalition

    The European Space Agency [La Agencia Espacial Europea] [Agence spatiale européenne] [Europäische Weltraumorganization](EU), established in 1975, is an intergovernmental organization dedicated to the exploration of space, currently with 22 member states. Headquartered in Paris, ESA has a staff of more than 2,000.

    ESA’s space flight programme includes human spaceflight (mainly through participation in the International Space Station program); the launch and operation of uncrewed exploration missions to other planets and the Moon; Earth observation, science and telecommunication; designing launch vehicles; and maintaining a major spaceport, the Guiana Space Centre [Centre Spatial Guyanais; CSG, also called Europe’s Spaceport] at Kourou, French Guiana. The main European launch vehicle Ariane 5 is operated through Arianespace, with ESA sharing in the costs of launching and further developing it. The agency is also working with the National Aeronautics and Space Administration to manufacture the Orion spacecraft service module that will fly on the Space Launch System.

    The agency’s facilities are distributed among the following centres:

    ESA European Space Research and Technology Centre (ESTEC) (NL) in Noordwijk, Netherlands;
    ESA Centre for Earth Observation [ESRIN] (IT) in Frascati, Italy;
    ESA European Space Operations Centre [ESOC] (DE), ESA’s mission control, in Darmstadt, Germany;
    ESA European Astronaut Centre [EAC] (DE), which trains astronauts for future missions, in Cologne, Germany;
    European Centre for Space Applications and Telecommunications (ECSAT) (UK), a research institute created in 2009, in Harwell, England;
    ESA European Space Astronomy Centre [ESAC] (ES) in Villanueva de la Cañada, Madrid, Spain.

    The European Space Agency Science Programme is a long-term programme of space science and space exploration missions.

    Foundation

    After World War II, many European scientists left Western Europe to work in the United States. Although the 1950s boom made it possible for Western European countries to invest in research, and specifically in space-related activities, Western European scientists realized that solely national projects would not be able to compete with the two main superpowers. In 1958, only months after the Sputnik shock, Edoardo Amaldi (Italy) and Pierre Auger (France), two prominent members of the Western European scientific community, met to discuss the foundation of a common Western European space agency. The meeting was attended by scientific representatives from eight countries, including Harrie Massey (United Kingdom).

    The Western European nations decided to have two agencies: one concerned with developing a launch system, ELDO (European Launcher Development Organisation), and the other the precursor of the European Space Agency, ESRO (European Space Research Organisation). The latter was established on 20 March 1964 by an agreement signed on 14 June 1962. From 1968 to 1972, ESRO launched seven research satellites.

    ESA in its current form was founded with the ESA Convention in 1975, when ESRO was merged with ELDO. ESA had ten founding member states: Belgium, Denmark, France, West Germany, Italy, the Netherlands, Spain, Sweden, Switzerland, and the United Kingdom. These signed the ESA Convention in 1975 and deposited the instruments of ratification by 1980, when the convention came into force. During this interval the agency functioned in a de facto fashion. ESA launched its first major scientific mission in 1975, Cos-B, a space probe monitoring gamma-ray emissions in the universe, which was first worked on by ESRO.


    Later activities

    The European Space Agency [La Agencia Espacial Europea] [Agence spatiale européenne][Europäische Weltraumorganization](EU) Copernicus mission


    Copernicus science center campus

    ESA collaborated with the National Aeronautics and Space Administration on the International Ultraviolet Explorer (IUE), the world’s first high-orbit telescope, which was launched in 1978 and operated successfully for 18 years.

    ESA Infrared Space Observatory.

    European Space Agency [La Agencia Espacial Europea] [Agence spatiale européenne][Europäische Weltraumorganization](EU)/National Aeronautics and Space Administration Solar Orbiter annotated.

    A number of successful Earth-orbit projects followed, and in 1986 ESA began Giotto, its first deep-space mission, to study the comets Halley and Grigg–Skjellerup. Hipparcos, a star-mapping mission, was launched in 1989 and in the 1990s SOHO, Ulysses and the Hubble Space Telescope were all jointly carried out with NASA. Later scientific missions in cooperation with NASA include the Cassini–Huygens space probe, to which ESA contributed by building the Titan landing module Huygens.

    ESA/Huygens Probe from Cassini landed on Titan.

    As the successor of ELDO, ESA has also constructed rockets for scientific and commercial payloads. Ariane 1, launched in 1979, carried mostly commercial payloads into orbit from 1984 onward. The next two versions of the Ariane rocket were intermediate stages in the development of a more advanced launch system, the Ariane 4, which operated between 1988 and 2003 and established ESA as the world leader in commercial space launches in the 1990s. Although the succeeding Ariane 5 experienced a failure on its first flight, it has since firmly established itself within the heavily competitive commercial space launch market, with 82 successful launches by 2018. The successor launch vehicle of Ariane 5, the Ariane 6, is under development and is envisioned to enter service in the 2020s.

    The beginning of the new millennium saw ESA become, along with agencies like the National Aeronautics and Space Administration, the Japan Aerospace Exploration Agency (JP), the Indian Space Research Organization (IN), the Canadian Space Agency (CA) and Roscosmos (RU), one of the major participants in scientific space research. Although ESA had relied on co-operation with NASA in previous decades, especially the 1990s, changed circumstances (such as tough legal restrictions on information sharing by the United States military) led to decisions to rely more on itself and on co-operation with Russia. A 2011 press release thus stated:

    “Russia is ESA’s first partner in its efforts to ensure long-term access to space. There is a framework agreement between ESA and the government of the Russian Federation on cooperation and partnership in the exploration and use of outer space for peaceful purposes, and cooperation is already underway in two different areas of launcher activity that will bring benefits to both partners.”

    Notable ESA programs include SMART-1, a probe testing cutting-edge space propulsion technology, the Mars Express and Venus Express missions, as well as the development of the Ariane 5 rocket and its role in the ISS partnership. ESA maintains its scientific and research projects mainly for astronomy-space missions such as Corot, launched on 27 December 2006, a milestone in the search for exoplanets.

    On 21 January 2019, ArianeGroup and Arianespace announced a one-year contract with ESA to study and prepare for a mission to mine the Moon for lunar regolith.

    Mission

    The treaty establishing the European Space Agency reads:

    The purpose of the Agency shall be to provide for and to promote, for exclusively peaceful purposes, cooperation among European States in space research and technology and their space applications, with a view to their being used for scientific purposes and for operational space applications systems…

    ESA is responsible for setting a unified space and related industrial policy, recommending space objectives to the member states, and integrating national programs, such as satellite development, into the European program as much as possible.

    Jean-Jacques Dordain – ESA’s Director General (2003–2015) – outlined the European Space Agency’s mission in a 2003 interview:

    “Today space activities have pursued the benefit of citizens, and citizens are asking for a better quality of life on Earth. They want greater security and economic wealth, but they also want to pursue their dreams, to increase their knowledge, and they want younger people to be attracted to the pursuit of science and technology. I think that space can do all of this: it can produce a higher quality of life, better security, more economic wealth, and also fulfill our citizens’ dreams and thirst for knowledge, and attract the young generation. This is the reason space exploration is an integral part of overall space activities. It has always been so, and it will be even more important in the future.”

    Activities

    According to the ESA website, the activities are:

    Observing the Earth
    Human Spaceflight
    Launchers
    Navigation
    Space Science
    Space Engineering & Technology
    Operations
    Telecommunications & Integrated Applications
    Preparing for the Future
    Space for Climate

    Programs

    Copernicus Programme
    Cosmic Vision
    ExoMars
    FAST20XX
    Galileo
    Horizon 2000
    Living Planet Programme
    Mandatory

    Every member country must contribute to these programs:

    Technology Development Element Program
    Science Core Technology Program
    General Study Program
    European Component Initiative

    Optional

    Depending on their individual choices the countries can contribute to the following programs, listed according to:

    Launchers
    Earth Observation
    Human Spaceflight and Exploration
    Telecommunications
    Navigation
    Space Situational Awareness
    Technology

    ESA_LAB@

    ESA has formed partnerships with universities. ESA_LAB@ refers to research laboratories at universities. Current ESA_LAB@ laboratories are located at:

    Technische Universität Darmstadt (DE)
    École des hautes études commerciales de Paris (HEC Paris) (FR)
    Université de recherche Paris Sciences et Lettres (FR)
    The University of Central Lancashire (UK)

    Membership and contribution to ESA

    By 2015, ESA was an intergovernmental organization of 22 member states. Member states participate to varying degrees in the mandatory (25% of total expenditures in 2008) and optional space programs (75% of total expenditures in 2008). The 2008 budget amounted to €3.0 billion whilst the 2009 budget amounted to €3.6 billion. The total budget amounted to about €3.7 billion in 2010, €3.99 billion in 2011, €4.02 billion in 2012, €4.28 billion in 2013, €4.10 billion in 2014 and €4.33 billion in 2015. English is the main language within ESA. Additionally, official documents are also provided in German and documents regarding the Spacelab are also provided in Italian. If found appropriate, the agency may conduct its correspondence in any language of a member state.

    Non-full member states
    Slovenia
    Since 2016, Slovenia has been an associated member of the ESA.

    Latvia
    Latvia became the second current associated member on 30 June 2020, when the Association Agreement was signed by ESA Director Jan Wörner and the Minister of Education and Science of Latvia, Ilga Šuplinska in Riga. The Saeima ratified it on July 27. Previously associated members were Austria, Norway and Finland, all of which later joined ESA as full members.

    Canada
    Since 1 January 1979, Canada has had the special status of a Cooperating State within ESA. By virtue of this accord, The Canadian Space Agency [Agence spatiale canadienne, ASC] (CA) takes part in ESA’s deliberative bodies and decision-making and also in ESA’s programs and activities. Canadian firms can bid for and receive contracts to work on programs. The accord has a provision ensuring a fair industrial return to Canada. The most recent Cooperation Agreement was signed on 15 December 2010 with a term extending to 2020. For 2014, Canada’s annual assessed contribution to the ESA general budget was €6,059,449 (CAD$8,559,050). For 2017, Canada has increased its annual contribution to €21,600,000 (CAD$30,000,000).

    Enlargement

    After the decision of the ESA Council of 21/22 March 2001, the procedure for accession of the European states was detailed in a document titled The Plan for European Co-operating States (PECS). Nations that want to become a full member of ESA do so in three stages. First a Cooperation Agreement is signed between the country and ESA. In this stage, the country has very limited financial responsibilities. If a country wants to co-operate more fully with ESA, it signs a European Cooperating State (ECS) Agreement. The ECS Agreement makes companies based in the country eligible for participation in ESA procurements. The country can also participate in all ESA programs, except for the Basic Technology Research Programme. While the financial contribution of the country concerned increases, it is still much lower than that of a full member state. The agreement is normally followed by a Plan For European Cooperating State (or PECS Charter). This is a 5-year programme of basic research and development activities aimed at improving the nation’s space industry capacity. At the end of the 5-year period, the country can either begin negotiations to become a full member state or an associated state, or sign a new PECS Charter.

    During the Ministerial Meeting in December 2014, ESA ministers approved a resolution calling for discussions to begin with Israel, Australia and South Africa on future association agreements. The ministers noted that “concrete cooperation is at an advanced stage” with these nations and that “prospects for mutual benefits are existing”.

    A separate space exploration strategy resolution calls for further co-operation with the United States, Russia and China on low-Earth-orbit (LEO) exploration, including a continuation of ISS cooperation; the development of a robust plan for the coordinated use of space transportation vehicles and systems for exploration purposes; participation in robotic missions for the exploration of the Moon; the robotic exploration of Mars, leading to a broad Mars Sample Return mission in which Europe should be involved as a full partner; and human missions beyond LEO in the longer term.

    Relationship with the European Union

    The political perspective of the European Union (EU) was to make ESA an agency of the EU by 2014, although this date was not met. The EU member states provide most of ESA’s funding, and they are all either full ESA members or observers.

    History

    At the time ESA was formed, its main goals did not encompass human space flight; rather it considered itself to be primarily a scientific research organization for uncrewed space exploration, in contrast to its American and Soviet counterparts. It is therefore not surprising that the first non-Soviet European in space was not an ESA astronaut on a European spacecraft; it was the Czechoslovak Vladimír Remek who in 1978 became the first non-Soviet or non-American in space (the first man in space being Yuri Gagarin of the Soviet Union), on a Soviet Soyuz spacecraft, followed by the Pole Mirosław Hermaszewski and the East German Sigmund Jähn in the same year. This Soviet co-operation programme, known as Intercosmos, primarily involved the participation of Eastern Bloc countries. In 1982, however, Jean-Loup Chrétien became the first non-Communist Bloc astronaut on a flight to the Soviet Salyut 7 space station.

    Because Chrétien did not officially fly into space as an ESA astronaut, but rather as a member of the French CNES astronaut corps, the German Ulf Merbold is considered the first ESA astronaut to fly into space. He participated in the STS-9 Space Shuttle mission that included the first use of the European-built Spacelab in 1983. STS-9 marked the beginning of an extensive ESA/NASA joint partnership that included dozens of space flights of ESA astronauts in the following years. Some of these missions with Spacelab were fully funded and organizationally and scientifically controlled by ESA (such as two missions by Germany and one by Japan) with European astronauts as full crew members rather than guests on board. Beside paying for Spacelab flights and seats on the shuttles, ESA continued its human space flight co-operation with the Soviet Union and later Russia, including numerous visits to Mir.

    During the latter half of the 1980s, European human space flights changed from being the exception to routine and therefore, in 1990, the European Astronaut Centre in Cologne, Germany was established. It selects and trains prospective astronauts and is responsible for the co-ordination with international partners, especially with regard to the International Space Station. As of 2006, the ESA astronaut corps officially included twelve members, including nationals from most large European countries except the United Kingdom.

    In the summer of 2008, ESA started to recruit new astronauts so that final selection would be due in spring 2009. Almost 10,000 people registered as astronaut candidates before registration ended in June 2008. 8,413 fulfilled the initial application criteria. Of the applicants, 918 were chosen to take part in the first stage of psychological testing, which narrowed down the field to 192. After two-stage psychological tests and medical evaluation in early 2009, as well as formal interviews, six new members of the European Astronaut Corps were selected – five men and one woman.

    Cooperation with other countries and organizations

    ESA has signed co-operation agreements with the following states that currently neither plan to integrate as tightly with ESA institutions as Canada, nor envision future membership of ESA: Argentina, Brazil, China, India (for the Chandrayaan mission), Russia and Turkey.

    Additionally, ESA has joint projects with the European Union, NASA of the United States and is participating in the International Space Station together with the United States (NASA), Russia and Japan (JAXA).

    European Union
    ESA and EU member states
    ESA-only members
    EU-only members

    ESA is not an agency or body of the European Union (EU), and has non-EU countries (Norway, Switzerland, and the United Kingdom) as members. There are however ties between the two, with various agreements in place and being worked on, to define the legal status of ESA with regard to the EU.

    There are common goals between ESA and the EU. ESA has an EU liaison office in Brussels. On certain projects, the EU and ESA co-operate, such as the upcoming Galileo satellite navigation system. Space policy has since December 2009 been an area for voting in the European Council. Under the European Space Policy of 2007, the EU, ESA and its Member States committed themselves to increasing co-ordination of their activities and programs and to organizing their respective roles relating to space.

    The Lisbon Treaty of 2009 reinforces the case for space in Europe and strengthens the role of ESA as an R&D space agency. Article 189 of the Treaty gives the EU a mandate to elaborate a European space policy and take related measures, and provides that the EU should establish appropriate relations with ESA.

    Former Italian astronaut Umberto Guidoni, during his tenure as a Member of the European Parliament from 2004 to 2009, stressed the importance of the European Union as a driving force for space exploration, “…since other players are coming up such as India and China it is becoming ever more important that Europeans can have an independent access to space. We have to invest more into space research and technology in order to have an industry capable of competing with other international players.”

    The first EU-ESA International Conference on Human Space Exploration took place in Prague on 22 and 23 October 2009. A road map which would lead to a common vision and strategic planning in the area of space exploration was discussed. Ministers from all 29 EU and ESA members as well as members of parliament were in attendance.

    National space organizations of member states:

    The Centre National d’Études Spatiales (FR) (CNES) (National Centre for Space Study) is the French government space agency (administratively, a “public establishment of industrial and commercial character”). Its headquarters are in central Paris. CNES is the main participant in the Ariane project. Indeed, CNES designed and tested all Ariane family rockets, mainly from its centre in Évry near Paris.
    The UK Space Agency is a partnership of the UK government departments which are active in space. Through the UK Space Agency, the partners provide delegates to represent the UK on the various ESA governing bodies. Each partner funds its own programme.
    The Italian Space Agency A.S.I. – Agenzia Spaziale Italiana was founded in 1988 to promote, co-ordinate and conduct space activities in Italy. Operating under the Ministry of the Universities and of Scientific and Technological Research, the agency cooperates with numerous entities active in space technology and with the president of the Council of Ministers. Internationally, the ASI provides Italy’s delegation to the Council of the European Space Agency and to its subordinate bodies.
    The German Aerospace Center (DLR) [Deutsches Zentrum für Luft- und Raumfahrt e. V.] is the national research centre for aviation and space flight of the Federal Republic of Germany and of other member states in the Helmholtz Association. Its extensive research and development projects are included in national and international cooperative programs. In addition to its research projects, the centre is the assigned space agency of Germany, hosting the headquarters of German space flight activities and its associates.
    The Instituto Nacional de Técnica Aeroespacial (INTA)(ES) (National Institute for Aerospace Technique) is a Public Research Organization specialized in aerospace research and technology development in Spain. Among other functions, it serves as a platform for space research and acts as a significant testing facility for the aeronautic and space sector in the country.

    National Aeronautics and Space Administration

    ESA has a long history of collaboration with NASA. Since ESA’s astronaut corps was formed, the Space Shuttle was the primary launch vehicle used by ESA’s astronauts to get into space through partnership programs with NASA. In the 1980s and 1990s, the Spacelab programme was an ESA-NASA joint research programme in which ESA developed and manufactured orbital labs for the Space Shuttle, and ESA astronauts participated in experiments on several flights.

    In robotic science and exploration missions, NASA has been ESA’s main partner. Cassini–Huygens was a joint NASA-ESA mission, along with the Infrared Space Observatory, INTEGRAL, SOHO, and others.

    National Aeronautics and Space Administration/European Space Agency [La Agencia Espacial Europea] [Agence spatiale européenne][Europäische Weltraumorganization](EU)/ASI Italian Space Agency [Agenzia Spaziale Italiana](IT) Cassini Spacecraft.

    European Space Agency [La Agencia Espacial Europea] [Agence spatiale européenne][Europäische Weltraumorganization](EU) Integral spacecraft

    European Space Agency [La Agencia Espacial Europea] [Agence spatiale européenne] [Europäische Weltraumorganization] (EU)/National Aeronautics and Space Administration SOHO satellite. Launched in 1995.

    Also, the Hubble Space Telescope is a joint project of NASA and ESA.

    National Aeronautics and Space Administration/European Space Agency[La Agencia Espacial Europea] [Agence spatiale européenne] [Europäische Weltraumorganization](EU) Hubble Space Telescope

    ESA-NASA joint projects include the James Webb Space Telescope and the proposed Laser Interferometer Space Antenna.

    National Aeronautics and Space Administration/European Space Agency [La Agencia Espacial Europea] [Agence spatiale européenne] [Europäische Weltraumorganization]/Canadian Space Agency [Agence Spatiale Canadienne](CA) James Webb Space Telescope annotated. Launched in December 2021.

    Gravity is talking. Lisa will listen. Dialogos of Eide.

    The European Space Agency [La Agencia Espacial Europea] [Agence spatiale européenne] [Europäische Weltraumorganization] (EU)/National Aeronautics and Space Administration eLISA space-based gravitational-wave observatory, the future of gravitational-wave research.

    NASA has committed to providing support for ESA’s proposed MarcoPolo-R mission to return an asteroid sample to Earth for further analysis. NASA and ESA will also likely join together for a Mars Sample Return mission. In October 2020, ESA entered into a memorandum of understanding (MOU) with NASA to work together on the Artemis program, which will provide an orbiting lunar Gateway and also accomplish the first crewed lunar landing in more than 50 years, with a crew that will include the first woman on the Moon.

    NASA ARTEMIS spacecraft depiction.

    Cooperation with other space agencies

    Since China has started to invest more money into space activities, the China National Space Administration [中国国家航天局] (CN) has sought international partnerships. ESA is, besides The Russian Federal Space Agency [Государственная корпорация по космической деятельности «Роскосмос»] (RU), one of its most important partners. The two space agencies cooperated in the development of the Double Star mission. In 2017, ESA sent two astronauts to China for two weeks of sea-survival training with Chinese astronauts in Yantai, Shandong.

    ESA entered into a major joint venture with Russia in the form of the CSTS, the preparation of the French Guiana spaceport for launches of Soyuz-2 rockets, and other projects. With India, ESA agreed to send instruments into space aboard ISRO’s Chandrayaan-1 in 2008. ESA is also cooperating with Japan; the most notable current project in collaboration with JAXA is the BepiColombo mission to Mercury.

    European Space Agency [La Agencia Espacial Europea] [Agence spatiale européenne] [Europäische Weltraumorganization] (EU)/Japan Aerospace Exploration Agency [国立研究開発法人宇宙航空研究開発機構] (JP) BepiColombo in-flight illustration. Artist’s impression of BepiColombo – ESA’s first mission to Mercury. ESA’s Mercury Planetary Orbiter (MPO) will be operated from ESOC in Germany.


    Speaking to reporters at an air show near Moscow in August 2011, ESA head Jean-Jacques Dordain said ESA and Russia’s Roskosmos space agency would “carry out the first flight to Mars together.”

     
  • richardmitnick 12:32 pm on April 20, 2023 Permalink | Reply
    Tags: "Understanding our place in the universe", AI is an incredible scientific asset but it can also be used for more nefarious purposes: facial recognition software; sentencing decisions in criminal court. Many algorithms are biased against people, At Fermilab he spends his days teaching machines how to analyze cosmological data a task for which they are better suited than most human scientists., At MIT Nord has focused his efforts on exploring the potential of AI to design new scientific experiments and instruments., Brian Nord, Dark Energy, In recent years Nord has attempted to develop methods to make the application of AI more ethical., Nord asks "Could we design the next particle collider or the next telescope in less than five years instead of 30?", Nord's efforts to combat racism in STEM have established him as a leader in the movement to address inequities and oppression in academic and research environments.

    From The Massachusetts Institute of Technology: “Understanding our place in the universe” Brian Nord 

    From The Massachusetts Institute of Technology

    4.12.23
    Phie Jacobs | School of Science

    1
    “A touchstone that I often come back to is space,” says Brian Nord. “The mystery of traveling in it and seeing what’s at the edge.”

    Brian Nord first fell in love with physics when he was a teenager growing up in Wisconsin. His high school physics program wasn’t exceptional, and he sometimes struggled to keep up with class material, but those difficulties did nothing to dampen his interest in the subject. In addition to the main curriculum, students were encouraged to independently study topics they found interesting, and Nord quickly developed a fascination with the cosmos. “A touchstone that I often come back to is space,” he says. “The mystery of traveling in it and seeing what’s at the edge.”

    Nord was an avid reader of comic books, and astrophysics appealed to his desire to become a part of something bigger. “There always seemed to be something special about having this kinship with the universe around you,” he recalls. “I always thought it would be cool if I could have that deep connection to physics.”

    Nord began to cultivate that connection as an undergraduate at The Johns Hopkins University. After graduating with a BA in physics, he went on to study at the University of Michigan, where he earned an MS and PhD in the same field. By this point, he was already thinking big, but he wanted to think even bigger. This desire for a more comprehensive understanding of the universe led him away from astrophysics and toward the more expansive field of cosmology. “Cosmology deals with the whole kit and caboodle, the whole shebang,” he explains. “Our biggest questions are about the origin and the fate of the universe.”

    Dark mysteries

    Nord was particularly interested in parts of the universe that can’t be observed through traditional means. Evidence suggests that dark matter makes up the majority of mass in the universe and provides most of its gravity, but its nature largely remains in the realm of hypothesis and speculation. It doesn’t absorb, reflect, or emit any type of electromagnetic radiation, which makes it nearly impossible for scientists to detect. But while dark matter provides gravity to pull the universe together, an equally mysterious force — dark energy — is pulling it apart. “We know even less about dark energy than we do about dark matter,” Nord explains.

    For the past 15 years, Nord has been attempting to close that gap in our knowledge. Part of his work focuses on the statistical modeling of galaxy clusters and their ability to distort and magnify light as it travels through the cosmos. This effect, which is known as strong gravitational lensing, is a useful tool for detecting the influence of dark matter on gravity and for measuring how dark energy affects the expansion rate of the universe.
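    To make the lensing idea concrete, here is the standard textbook relation for the simplest case (a point-mass lens); this is an illustrative aside, not a formula from the article itself:

```latex
% Illustrative sketch: images of a background source appear near the
% Einstein radius of the lens,
%
%   \theta_E = \sqrt{\frac{4GM}{c^2}\,\frac{D_{ls}}{D_l D_s}} ,
%
% where M is the lens mass, D_l and D_s are the angular-diameter distances
% to the lens and the source, and D_ls is the distance between them.
```

    Measuring the observable angle \theta_E therefore weighs the lens, dark matter included, while the distance ratio D_{ls}/(D_l D_s) depends on the universe's expansion history, which is how statistics of many lenses also constrain dark energy.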

    After earning his PhD, Nord remained at the University of Michigan to continue his research as part of a postdoctoral fellowship. He currently holds a position at the Fermi National Accelerator Laboratory and is a senior member of the Kavli Institute for Cosmological Physics at the University of Chicago. He continues to investigate questions about the origin and destiny of the universe, but his more recent work has also focused on improving the ways in which we make scientific discoveries.

    AI powerup

    When it comes to addressing big questions about the nature of the cosmos, Nord has consistently run into one major problem: although his mastery of physics can sometimes make him feel like a superhero, he’s only human, and humans aren’t perfect. They make mistakes, adapt slowly to new information, and take a long time to get things done.

    The solution, Nord argues, is to go beyond the human, into the realm of algorithms and models. As part of Fermilab’s Artificial Intelligence Project, he spends his days teaching machines how to analyze cosmological data, a task for which they are better suited than most human scientists. “Artificial intelligence can give us models that are more flexible than what we can create ourselves with pen and paper,” Nord explains. “In a lot of cases, it does better than humans do.”
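    The learning loop behind such work can be sketched in a few lines. The toy below is not Fermilab's pipeline; every detail (image size, ring profile, learning rate) is invented for illustration. It trains a tiny logistic-regression classifier to separate synthetic "lensed" cutouts, which carry a faint Einstein-ring-like arc, from pure noise:

```python
import math
import random

# Toy sketch only, NOT Fermilab's actual AI pipeline: all names and numbers
# here are invented for illustration. The idea it demonstrates is real,
# though: a flexible model learns from labeled data to spot a pattern that
# would be tedious for humans to score by eye at survey scale.

random.seed(0)
SIZE = 10  # 10x10-pixel toy "sky cutouts"

def make_image(lensed):
    """Gaussian noise, plus a bright ring at radius ~3.5 px if 'lensed'."""
    img = []
    for i in range(SIZE * SIZE):
        y, x = divmod(i, SIZE)
        val = random.gauss(0.0, 1.0)
        if lensed:
            r = math.hypot(y - SIZE / 2, x - SIZE / 2)
            val += 3.0 * math.exp(-((r - 3.5) ** 2) / 2.0)  # ring-like arc
        img.append(val)
    return img

data = [(make_image(k % 2 == 0), 1.0 if k % 2 == 0 else 0.0) for k in range(120)]

# Plain logistic regression, trained by batch gradient descent.
w, b = [0.0] * (SIZE * SIZE), 0.0
for _ in range(100):
    grad_w, grad_b = [0.0] * len(w), 0.0
    for img, label in data:
        z = sum(wi * xi for wi, xi in zip(w, img)) + b
        p = 1.0 / (1.0 + math.exp(-max(-30.0, min(30.0, z))))  # clipped sigmoid
        for j, xj in enumerate(img):
            grad_w[j] += (p - label) * xj
        grad_b += p - label
    w = [wi - 0.05 * g / len(data) for wi, g in zip(w, grad_w)]
    b -= 0.05 * grad_b / len(data)

# Training accuracy: the learned weights should have rediscovered the ring.
correct = sum(
    ((sum(wi * xi for wi, xi in zip(w, img)) + b) > 0) == (label == 1.0)
    for img, label in data
)
accuracy = correct / len(data)
print(f"training accuracy: {accuracy:.2f}")
```

    In real surveys this role is played by far larger models (deep convolutional networks trained on simulated lenses), but the loop has the same shape: labeled examples in, learned pattern-detector out.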

    Nord is continuing this research at MIT as part of the Martin Luther King Jr. (MLK) Visiting Scholars and Professors Program. Earlier this year, he joined the Laboratory for Nuclear Science (LNS), with Jesse Thaler in the Department of Physics and Center for Theoretical Physics (CTP) as his faculty host. Thaler is the director of the National Science Foundation’s Institute for Artificial Intelligence and Fundamental Interactions (IAIFI). Since arriving on campus, Nord has focused his efforts on exploring the potential of AI to design new scientific experiments and instruments. These processes ordinarily take an enormous amount of time, he explains, but AI could rapidly accelerate them. “Could we design the next particle collider or the next telescope in less than five years, instead of 30?” he wonders.

    But if Nord has learned anything from the comics of his youth, it is that with great power comes great responsibility. AI is an incredible scientific asset, but it can also be used for more nefarious purposes. The same computer algorithms that could build the next particle collider also underlie things like facial recognition software and the risk assessment tools that inform sentencing decisions in criminal court. Many of these algorithms are deeply biased against people of color. “It’s a double-edged sword,” Nord explains. “Because if [AI] works better for science, it works better for facial recognition. So, I’m working against myself.”

    Culture change superpowers

    In recent years, Nord has attempted to develop methods to make the application of AI more ethical, and his work has focused on the broad intersections between ethics, justice, and scientific discovery. His efforts to combat racism in STEM have established him as a leader in the movement to address inequities and oppression in academic and research environments. In June of 2020, he collaborated with members of Particles for Justice — a group that boasts MIT professors Daniel Harlow and Tracy Slatyer, as well as former MLK Visiting Scholar and CTP researcher Chanda Prescod-Weinstein — to create the academic Strike for Black Lives. The strike, which emerged as a response to the police killings of George Floyd, Breonna Taylor, and many others, called on the academic community to take a stand against anti-Black racism.

    Nord is also the co-author of Black Light, a curriculum for learning about Black experiences, and the co-founder of Change Now, which produced a list of calls for action to make a more just laboratory environment at Fermilab. As the co-founder of Deep Skies, he also strives to foster justice-oriented research communities free of traditional hierarchies and oppressive power structures. “The basic idea is just humanity over productivity,” he explains.

    This work has led Nord to reconsider what motivated him to pursue a career in physics in the first place. When he first discovered his passion for the subject as a teenager, he knew he wanted to use physics to help people, but he wasn’t sure how. “I was thinking I’d make some technology that will save lives, and I still hope to do that,” he says. “But I think maybe more of my direct impact, at least in this stage of my career, is in trying to change the culture.”

    Physics may not have granted Nord flight or X-ray vision — not yet, at least. But over the course of his long career, he has discovered a more substantial power. “If I can understand the universe,” he says, “maybe that will help me understand myself and my place in the world and our place as humanity.”

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    MIT Seal

    MIT Campus

    The Massachusetts Institute of Technology is a private land-grant research university in Cambridge, Massachusetts. The institute has an urban campus that extends more than a mile (1.6 km) alongside the Charles River. The institute also encompasses a number of major off-campus facilities such as the MIT Lincoln Laboratory, the MIT Bates Research and Engineering Center, and the Haystack Observatory, as well as affiliated laboratories such as the Broad Institute of MIT and Harvard and the Whitehead Institute.


    Massachusetts Institute of Technology-Haystack Observatory, Westford, Massachusetts, USA. Altitude 131 m (430 ft).

    4

    The Computer Science and Artificial Intelligence Laboratory (CSAIL)

    The MIT Kavli Institute for Astrophysics and Space Research

    MIT’s Institute for Medical Engineering and Science is a research institute at the Massachusetts Institute of Technology

    The MIT Laboratory for Nuclear Science

    The MIT Media Lab

    The MIT School of Engineering

    The MIT Sloan School of Management

    Spectrum

    MIT.nano

    Founded in 1861 in response to the increasing industrialization of the United States, Massachusetts Institute of Technology adopted a European polytechnic university model and stressed laboratory instruction in applied science and engineering. It has since played a key role in the development of many aspects of modern science, engineering, mathematics, and technology, and is widely known for its innovation and academic strength. It is frequently regarded as one of the most prestigious universities in the world.

    As of December 2020, 97 Nobel laureates, 26 Turing Award winners, and 8 Fields Medalists have been affiliated with MIT as alumni, faculty members, or researchers. In addition, 58 National Medal of Science recipients, 29 National Medals of Technology and Innovation recipients, 50 MacArthur Fellows, 80 Marshall Scholars, 3 Mitchell Scholars, 22 Schwarzman Scholars, 41 astronauts, and 16 Chief Scientists of the U.S. Air Force have been affiliated with The Massachusetts Institute of Technology. The university also has a strong entrepreneurial culture and MIT alumni have founded or co-founded many notable companies. Massachusetts Institute of Technology is a member of the Association of American Universities.

    Foundation and vision

    In 1859, a proposal was submitted to the Massachusetts General Court to use newly filled lands in Back Bay, Boston for a “Conservatory of Art and Science”, but the proposal failed. A charter for the incorporation of the Massachusetts Institute of Technology, proposed by William Barton Rogers, was signed by John Albion Andrew, the governor of Massachusetts, on April 10, 1861.

    Rogers, a professor from the University of Virginia, wanted to establish an institution to address rapid scientific and technological advances. He did not wish to found a professional school, but rather an institution combining elements of both professional and liberal education, proposing that:

    “The true and only practicable object of a polytechnic school is, as I conceive, the teaching, not of the minute details and manipulations of the arts, which can be done only in the workshop, but the inculcation of those scientific principles which form the basis and explanation of them, and along with this, a full and methodical review of all their leading processes and operations in connection with physical laws.”

    The Rogers Plan reflected the German research university model, emphasizing an independent faculty engaged in research, as well as instruction oriented around seminars and laboratories.

    Early developments

    Two days after The Massachusetts Institute of Technology was chartered, the first battle of the Civil War broke out. After a long delay through the war years, MIT’s first classes were held in the Mercantile Building in Boston in 1865. The new institute was founded as part of the Morrill Land-Grant Colleges Act to fund institutions “to promote the liberal and practical education of the industrial classes” and was a land-grant school. In 1863, under the same act, the Commonwealth of Massachusetts founded the Massachusetts Agricultural College, which developed as the University of Massachusetts Amherst. In 1866, the proceeds from land sales went toward new buildings in the Back Bay.

    The Massachusetts Institute of Technology was informally called “Boston Tech”. The institute adopted the European polytechnic university model and emphasized laboratory instruction from an early date. Despite chronic financial problems, the institute saw growth in the last two decades of the 19th century under President Francis Amasa Walker. Programs in electrical, chemical, marine, and sanitary engineering were introduced, new buildings were built, and the size of the student body increased to more than one thousand.

    The curriculum drifted to a vocational emphasis, with less focus on theoretical science. The fledgling school still suffered from chronic financial shortages which diverted the attention of the MIT leadership. During these “Boston Tech” years, Massachusetts Institute of Technology faculty and alumni rebuffed Harvard University president (and former MIT faculty) Charles W. Eliot’s repeated attempts to merge MIT with Harvard College’s Lawrence Scientific School. There would be at least six attempts to absorb MIT into Harvard. In its cramped Back Bay location, MIT could not afford to expand its overcrowded facilities, driving a desperate search for a new campus and funding. Eventually, the MIT Corporation approved a formal agreement to merge with Harvard, over the vehement objections of MIT faculty, students, and alumni. However, a 1917 decision by the Massachusetts Supreme Judicial Court effectively put an end to the merger scheme.

    In 1916, The Massachusetts Institute of Technology administration and the MIT charter crossed the Charles River on the ceremonial barge Bucentaur built for the occasion, to signify MIT’s move to a spacious new campus largely consisting of filled land on a one-mile-long (1.6 km) tract along the Cambridge side of the Charles River. The neoclassical “New Technology” campus was designed by William W. Bosworth and had been funded largely by anonymous donations from a mysterious “Mr. Smith”, starting in 1912. In January 1920, the donor was revealed to be the industrialist George Eastman of Rochester, New York, who had invented methods of film production and processing, and founded Eastman Kodak. Between 1912 and 1920, Eastman donated $20 million ($236.6 million in 2015 dollars) in cash and Kodak stock to MIT.

    Curricular reforms

    In the 1930s, President Karl Taylor Compton and Vice-President (effectively Provost) Vannevar Bush emphasized the importance of pure sciences like physics and chemistry and reduced the vocational practice required in shops and drafting studios. The Compton reforms “renewed confidence in the ability of the Institute to develop leadership in science as well as in engineering”. Unlike Ivy League schools, Massachusetts Institute of Technology catered more to middle-class families, and depended more on tuition than on endowments or grants for its funding. The school was elected to the Association of American Universities in 1934.

    Still, as late as 1949, the Lewis Committee lamented in its report on the state of education at The Massachusetts Institute of Technology that “the Institute is widely conceived as basically a vocational school”, a “partly unjustified” perception the committee sought to change. The report comprehensively reviewed the undergraduate curriculum, recommended offering a broader education, and warned against letting engineering and government-sponsored research detract from the sciences and humanities. The School of Humanities, Arts, and Social Sciences and the MIT Sloan School of Management were formed in 1950 to compete with the powerful Schools of Science and Engineering. Previously marginalized faculties in the areas of economics, management, political science, and linguistics emerged into cohesive and assertive departments by attracting respected professors and launching competitive graduate programs. The School of Humanities, Arts, and Social Sciences continued to develop under the successive terms of the more humanistically oriented presidents Howard W. Johnson and Jerome Wiesner between 1966 and 1980.

    The Massachusetts Institute of Technology‘s involvement in military science surged during World War II. In 1941, Vannevar Bush was appointed head of the federal Office of Scientific Research and Development and directed funding to only a select group of universities, including MIT. Engineers and scientists from across the country gathered at Massachusetts Institute of Technology ‘s Radiation Laboratory, established in 1940 to assist the British military in developing microwave radar. The work done there significantly affected both the war and subsequent research in the area. Other defense projects included gyroscope-based and other complex control systems for gunsight, bombsight, and inertial navigation under Charles Stark Draper’s Instrumentation Laboratory; the development of a digital computer for flight simulations under Project Whirlwind; and high-speed and high-altitude photography under Harold Edgerton. By the end of the war, The Massachusetts Institute of Technology became the nation’s largest wartime R&D contractor (attracting some criticism of Bush), employing nearly 4000 in the Radiation Laboratory alone and receiving in excess of $100 million ($1.2 billion in 2015 dollars) before 1946. Work on defense projects continued even after then. Post-war government-sponsored research at MIT included SAGE and guidance systems for ballistic missiles and Project Apollo.

    These activities affected The Massachusetts Institute of Technology profoundly. A 1949 report noted the lack of “any great slackening in the pace of life at the Institute” to match the return to peacetime, remembering the “academic tranquility of the prewar years”, though acknowledging the significant contributions of military research to the increased emphasis on graduate education and rapid growth of personnel and facilities. The faculty doubled and the graduate student body quintupled during the terms of Karl Taylor Compton, president of The Massachusetts Institute of Technology between 1930 and 1948; James Rhyne Killian, president from 1948 to 1957; and Julius Adams Stratton, chancellor from 1952 to 1957, whose institution-building strategies shaped the expanding university. By the 1950s, The Massachusetts Institute of Technology no longer simply benefited the industries with which it had worked for three decades, and it had developed closer working relationships with new patrons, philanthropic foundations and the federal government.

    In the late 1960s and early 1970s, student and faculty activists protested against the Vietnam War and The Massachusetts Institute of Technology’s defense research. In this period, Massachusetts Institute of Technology’s various departments were researching helicopters, smart bombs and counterinsurgency techniques for the war in Vietnam as well as guidance systems for nuclear missiles. The Union of Concerned Scientists was founded on March 4, 1969 during a meeting of faculty members and students seeking to shift the emphasis from military research toward environmental and social problems. The Massachusetts Institute of Technology ultimately divested itself from the Instrumentation Laboratory and moved all classified research off-campus to the MIT Lincoln Laboratory facility in 1973 in response to the protests. The student body, faculty, and administration remained comparatively unpolarized during what was a tumultuous time for many other universities. Johnson was seen to be highly successful in leading his institution to “greater strength and unity” after these times of turmoil. However, six Massachusetts Institute of Technology students were sentenced to prison terms at this time and some former student leaders, such as Michael Albert and George Katsiaficas, are still indignant about MIT’s role in military research and its suppression of these protests. (Richard Leacock’s film, November Actions, records some of these tumultuous events.)

    In the 1980s, there was more controversy at The Massachusetts Institute of Technology over its involvement in SDI (space weaponry) and CBW (chemical and biological warfare) research. More recently, The Massachusetts Institute of Technology’s research for the military has included work on robots, drones and ‘battle suits’.

    Recent history

    The Massachusetts Institute of Technology has kept pace with and helped to advance the digital age. In addition to developing the predecessors to modern computing and networking technologies, students, staff, and faculty members at Project MAC, the Artificial Intelligence Laboratory, and the Tech Model Railroad Club wrote some of the earliest interactive computer video games like Spacewar! and created much of modern hacker slang and culture. Several major computer-related organizations have originated at MIT since the 1980s: Richard Stallman’s GNU Project and the subsequent Free Software Foundation were founded in the mid-1980s at the AI Lab; the MIT Media Lab was founded in 1985 by Nicholas Negroponte and Jerome Wiesner to promote research into novel uses of computer technology; the World Wide Web Consortium standards organization was founded at the Laboratory for Computer Science in 1994 by Tim Berners-Lee; the MIT OpenCourseWare project has made course materials for over 2,000 Massachusetts Institute of Technology classes available online free of charge since 2002; and the One Laptop per Child initiative to expand computer education and connectivity to children worldwide was launched in 2005.

    The Massachusetts Institute of Technology was named a sea-grant college in 1976 to support its programs in oceanography and marine sciences and was named a space-grant college in 1989 to support its aeronautics and astronautics programs. Despite diminishing government financial support over the past quarter century, MIT launched several successful development campaigns to significantly expand the campus: new dormitories and athletics buildings on west campus; the Tang Center for Management Education; several buildings in the northeast corner of campus supporting research into biology, brain and cognitive sciences, genomics, biotechnology, and cancer research; and a number of new “backlot” buildings on Vassar Street including the Stata Center. Construction on campus in the 2000s included expansions of the Media Lab, the Sloan School’s eastern campus, and graduate residences in the northwest. In 2006, President Hockfield launched the MIT Energy Research Council to investigate the interdisciplinary challenges posed by increasing global energy consumption.

    In 2001, inspired by the open source and open access movements, The Massachusetts Institute of Technology launched “OpenCourseWare” to make the lecture notes, problem sets, syllabi, exams, and lectures from the great majority of its courses available online for no charge, though without any formal accreditation for coursework completed. While the cost of supporting and hosting the project is high, OCW expanded in 2005 to include other universities as a part of the OpenCourseWare Consortium, which currently includes more than 250 academic institutions with content available in at least six languages. In 2011, The Massachusetts Institute of Technology announced it would offer formal certification (but not credits or degrees) to online participants completing coursework in its “MITx” program, for a modest fee. The “edX” online platform supporting MITx was initially developed in partnership with Harvard and its analogous “Harvardx” initiative. The courseware platform is open source, and other universities have already joined and added their own course content. In March 2009 the Massachusetts Institute of Technology faculty adopted an open-access policy to make its scholarship publicly accessible online.

    The Massachusetts Institute of Technology has its own police force. Three days after the Boston Marathon bombing of April 2013, MIT Police patrol officer Sean Collier was fatally shot by the suspects Dzhokhar and Tamerlan Tsarnaev, setting off a violent manhunt that shut down the campus and much of the Boston metropolitan area for a day. One week later, Collier’s memorial service was attended by more than 10,000 people, in a ceremony hosted by the Massachusetts Institute of Technology community with thousands of police officers from the New England region and Canada. On November 25, 2013, The Massachusetts Institute of Technology announced the creation of the Collier Medal, to be awarded annually to “an individual or group that embodies the character and qualities that Officer Collier exhibited as a member of The Massachusetts Institute of Technology community and in all aspects of his life”. The announcement further stated that “Future recipients of the award will include those whose contributions exceed the boundaries of their profession, those who have contributed to building bridges across the community, and those who consistently and selflessly perform acts of kindness”.

    In September 2017, the school announced the creation of an artificial intelligence research lab called the MIT-IBM Watson AI Lab. IBM will spend $240 million over the next decade, and the lab will be staffed by MIT and IBM scientists. In October 2018 MIT announced that it would open a new Schwarzman College of Computing dedicated to the study of artificial intelligence, named after lead donor and The Blackstone Group CEO Stephen Schwarzman. The focus of the new college is to study not just AI, but interdisciplinary AI education, and how AI can be used in fields as diverse as history and biology. The cost of buildings and new faculty for the new college is expected to be $1 billion upon completion.

    The Caltech/MIT Advanced aLIGO was designed and constructed by a team of scientists from the California Institute of Technology, the Massachusetts Institute of Technology, and industrial contractors, and was funded by the National Science Foundation.

    Caltech/MIT Advanced aLIGO

    It was designed to open the field of gravitational-wave astronomy through the detection of gravitational waves predicted by general relativity. Gravitational waves were detected for the first time by the LIGO detector in 2015. For contributions to the LIGO detector and the observation of gravitational waves, two Caltech physicists, Kip Thorne and Barry Barish, and Massachusetts Institute of Technology physicist Rainer Weiss won the Nobel Prize in physics in 2017. Weiss, who is also a Massachusetts Institute of Technology graduate, designed the laser interferometric technique, which served as the essential blueprint for the LIGO.

    The mission of The Massachusetts Institute of Technology is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of The Massachusetts Institute of Technology community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

     
  • richardmitnick 11:01 am on March 7, 2023 Permalink | Reply
    Tags: "Dark energy from supermassive black holes? Physicists spar over radical idea", Dark Energy

    From “Science Magazine” : “Dark energy from supermassive black holes? Physicists spar over radical idea” 

    From “Science Magazine”

    2.17.23
    Adam Mann

    1
    An artist’s impression of a supermassive black hole in the center of a galaxy. New research links these behemoths to the mysterious phenomenon called dark energy. Credit: NASA/JPL-Caltech.

    Earlier this week, a study made headlines claiming that the mysterious “dark energy” cosmologists believe is accelerating the expansion of the universe could arise from supermassive black holes at the hearts of galaxies. If true, the connection would link two of the most mind-bending concepts in physics—black holes and dark energy—and suggest the source of the latter has been under theorists’ noses for decades. However, some leading theorists are deeply skeptical of the idea.

    “What they are proposing makes no sense to me,” says Robert Wald, a theoretical physicist at the University of Chicago who specializes in Albert Einstein’s General Theory of Relativity, the standard understanding of gravity. Other theorists were more receptive to the radical claim—even if it ends up being wrong. “I’m personally excited about it,” says astrophysicist Niayesh Afshordi of the Perimeter Institute for Theoretical Physics.

At first blush, black holes and dark energy seem to have nothing to do with each other. According to General Relativity, a black hole is a pure gravitational field so strong that its own energy sustains its existence. Such peculiar beasts are thought to emerge when massive stars collapse to an infinitesimal point, leaving just their gravitational fields behind. Supermassive black holes, with millions or billions of times the mass of our Sun, are believed to lurk in the hearts of galaxies.

    In contrast, dark energy is a mysterious phenomenon that literally stretches space and is accelerating the expansion of the universe. Theorists think dark energy could represent some new sort of field in space, a bit like an electric field, or it could be a fundamental property of empty space itself.

    So how could the two be connected? Quantum mechanics suggests the vacuum of empty space should contain a type of energy known as vacuum energy. This is thought to be spread throughout the universe and exert a force opposing gravity, making it a prime candidate for the identity of dark energy. In 1966, Soviet physicist Erast Gliner showed Einstein’s equations could also produce objects that to outside observers look and behave exactly like a black hole—yet are, in fact, giant balls of vacuum energy.

    If such objects were to exist, it would mean that rather than being uniformly spread throughout space, dark energy is actually confined to specific locations: the interiors of black holes. Even bound in these particular knots, dark energy would still exert its space-stretching effect on the universe.

    One consequence of this idea—that supermassive black holes are the source of dark energy—is that they would be linked to the constant stretching of space and their mass should change as the universe expands, says astrophysicist Duncan Farrah of the University of Hawaii, Manoa. “If the volume of the universe doubles, so does the mass of the black hole,” he adds.
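Farrah’s description (“if the volume of the universe doubles, so does the mass of the black hole”) corresponds to a mass that scales with the cube of the cosmic scale factor, M(a) ∝ a³. A minimal sketch of that arithmetic, with illustrative values rather than the paper’s actual data:

```python
# Cosmologically coupled black hole mass growth, M(a) = M0 * (a/a0)**k.
# k = 3 corresponds to vacuum-energy interiors: mass tracks the volume
# of the universe, which grows as the scale factor cubed.
def coupled_mass(m0, z_initial, z_final, k=3.0):
    """Mass after the universe expands from redshift z_initial to z_final."""
    a_initial = 1.0 / (1.0 + z_initial)
    a_final = 1.0 / (1.0 + z_final)
    return m0 * (a_final / a_initial) ** k

# A black hole observed at z ~ 1, evolved to today (z = 0): the scale
# factor doubles, so the volume (and hence the mass) grows eightfold.
growth_factor = coupled_mass(1.0, z_initial=1.0, z_final=0.0)
print(growth_factor)  # 8.0, consistent with the observed factor of 7 to 10
```

With k = 3 the predicted growth between z ∼ 1 and z ∼ 0 lands inside the factor of seven to 10 the team reports; other interior solutions predict different values of k, which is what the observations constrain.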

    To test this possibility, Farrah and his colleagues studied elliptical galaxies, which contain black holes with millions or billions of times the Sun’s mass in their centers. They focused on galaxies with little gas or dust floating around between their stars, which would provide a reservoir of material that the central black hole could feed on. Such black holes wouldn’t be expected to change much over the course of cosmic history.

Yet by analyzing the properties of ellipticals over roughly 9 billion years, the team saw that black holes in the early universe were much smaller relative to their host galaxies than those in the modern universe, indicating they had grown in mass by a factor of seven to 10, Farrah and colleagues reported this month in The Astrophysical Journal [below].

    The fact that the black holes swelled but the galaxies didn’t is the key, Farrah says. If the black holes had grown by feeding on nearby gas and dust, that material should have also generated many new stars in parts of the galaxy far from the black hole. But if black holes were made from dark energy, they would react to changes in the universe’s size in exactly the way that researchers observed in the centers of elliptical galaxies, Farrah’s team additionally reported this week in The Astrophysical Journal Letters [below].

    Wald is unpersuaded. He questions how an orb of pure dark energy could be stable. He also says the numbers don’t seem to add up: Dark energy is known to make up 70% of the mass-energy of the universe, whereas black holes are a mere fraction of the ordinary matter, which constitutes less than 5% of the universe. “I don’t see how it is in any way conceivable that such objects could be relevant to the observed dark energy,” he says.

    Others are taking a wait-and-see attitude. “At the moment, this is an interesting possibility,” says cosmologist Geraint Lewis of the University of Sydney, but “there would have to be a lot more evidence on the table if this is even a remotely plausible source of dark energy.”

    Afshordi agrees. If black holes and dark energy are linked in this way, it would likely have other visible consequences in the universe, he says. At the moment, though, he’s unsure what those would be. Determining exactly how galaxies evolve over time is a tricky business, he adds, and there could be other mechanisms to grow black holes that the team hasn’t considered.

    Nevertheless, Afshordi is supportive of efforts to rethink fundamental assumptions about the universe. “Most new theoretical ideas are dismissed by skepticism,” he says. “But if we dismiss all the new ideas then there won’t be anything left.”

    The Astrophysical Journal
    From the science paper
    Abstract
The assembly of stellar and supermassive black hole (SMBH) mass in elliptical galaxies since z ∼ 1 can help to diagnose the origins of locally observed correlations between SMBH mass and stellar mass. We therefore construct three samples of elliptical galaxies, one at z ∼ 0 and two at 0.7 ≲ z ≲ 2.5, and quantify their relative positions in the MBH−M* plane. Using a Bayesian analysis framework, we find evidence for translational offsets in both stellar mass and SMBH mass between the local sample and both higher-redshift samples. The offsets in stellar mass are small, and consistent with measurement bias, but the offsets in SMBH mass are much larger, reaching a factor of 7 between z ∼ 1 and z ∼ 0. The magnitude of the SMBH offset may also depend on redshift, reaching a factor of ∼20 at z ∼ 2. The result is robust against variation in the high- and low-redshift samples and changes in the analysis approach. The magnitude and redshift evolution of the offset are challenging to explain in terms of selection and measurement biases. We conclude that either there is a physical mechanism that preferentially grows SMBHs in elliptical galaxies at z ≲ 2, or that selection and measurement biases are both underestimated, and depend on redshift.


    For further illustrations see the science paper.

    The Astrophysical Journal Letters
    From the science paper
    Abstract
Observations have found black holes spanning 10 orders of magnitude in mass across most of cosmic history. The Kerr black hole solution is, however, provisional as its behavior at infinity is incompatible with an expanding universe. Black hole models with realistic behavior at infinity predict that the gravitating mass of a black hole can increase with the expansion of the universe independently of accretion or mergers, in a manner that depends on the black hole’s interior solution. We test this prediction by considering the growth of supermassive black holes in elliptical galaxies over 0 < z ≲ 2.5. We find evidence for cosmologically coupled mass growth among these black holes, with zero cosmological coupling excluded at 99.98% confidence. The redshift dependence of the mass growth implies that, at z ≲ 7, black holes contribute an effectively constant cosmological energy density to Friedmann’s equations. The continuity equation then requires that black holes contribute cosmologically as vacuum energy. We further show that black hole production from the cosmic star formation history gives the value of ΩΛ measured by Planck while being consistent with constraints from massive compact halo objects. We thus propose that stellar remnant black holes are the astrophysical origin of dark energy, explaining the onset of accelerating expansion at z ∼ 0.7.


    For further illustrations see the science paper.

See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.



    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 12:56 pm on February 4, 2023 Permalink | Reply
    Tags: , , , , , Dark Energy, ,   

    From “The Big Think” : “3 new studies indicate a conflict at the heart of cosmology” 

    From “The Big Think”

    2.1.23
    Don Lincoln

    The Universe isn’t as “clumpy” as we think it should be.

    Credit: NASA.

    Key Takeaways

• Telescopes are essentially time machines: as we examine galaxies at greater and greater distances from the Earth, we are looking further and further back in time.
• A new series of studies that examine the “clumpiness” of the Universe indicates that there might be a conflict at the heart of cosmology.
• The Big Bang theory is still sound, but it may need to be tweaked.

    A series of three scientific papers describing the expansion history of the Universe is telling a confusing tale, with predictions and measurements slightly disagreeing.

    While this disagreement isn’t considered a fatal disproof of modern cosmology, it could be a hint that our theories need to be revised.

    PRD “Joint analysis of DES Year 3 data and CMB lensing from SPT and Planck I: Construction of CMB Lensing Maps and Modeling Choices”
    PRD “Joint analysis of DES Year 3 data and CMB lensing from SPT and Planck II: Cross-correlation measurements and cosmological constraints”
    PRD “Joint analysis of DES Year 3 data and CMB lensing from SPT and Planck III: Combined cosmological constraints”

    Creation stories, both ancient and modern

    Understanding exactly how the world around us came into existence is a question that has bothered humanity for millennia. All around the world, people have devised stories — from the ancient Greek legend of the creation of the Earth and other primordial entities from Chaos (as first written down by Hesiod) to the Hopi creation myth (which describes a series of different kinds of creatures being created, eventually ending up as humans).

    In modern times, there are still competing creation stories, but there is one that is grounded in empiricism and the scientific method: the idea that about 13.8 billion years ago, the Universe began in a much smaller and hotter compressed state, and it has been expanding ever since then. This idea is colloquially called the “Big Bang,” although different writers use the term to mean slightly different things. Some use it to refer to the exact moment at which the Universe came into existence and began to expand, while others use it to refer to all moments after the beginning. For those writers, the Big Bang is still ongoing, as the expansion of the Universe continues.

    The beauty of this scientific explanation is that it can be tested. Astronomers rely on the fact that light has a finite speed, which means that it takes time for light to cross the cosmos. For example, the light we see as the Sun shining was emitted eight minutes before we see it. Light from the nearest star took about four years to get to Earth, and light from elsewhere in the cosmos can take billions of years to arrive.
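The eight-minute figure is just the Earth-Sun distance divided by the speed of light; a quick sanity check with rounded constants:

```python
AU_M = 1.496e11   # mean Earth-Sun distance in metres
C_M_S = 2.998e8   # speed of light in metres per second

light_travel_minutes = AU_M / C_M_S / 60
print(round(light_travel_minutes, 1))  # 8.3 minutes
```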

    The telescope as a time machine

    Effectively, this means that telescopes are time machines. By looking at more and more distant galaxies, astronomers are able to see what the Universe looked like in the distant past. By stitching together observations of galaxies at different distances from the Earth, astronomers can unravel the evolution of the cosmos.

    The recent measurements use two different telescopes to study the structure of the Universe at different cosmic epochs. One facility, called the South Pole Telescope (SPT), looks at the earliest possible light, emitted a mere 380,000 years after the Universe began.

    At that time, the Universe was 0.003% its current age. If we consider the current cosmos to be equivalent to a 50-year-old person, the SPT looks at the Universe when it was a mere 12 hours old.
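The “12 hours” figure follows from simple proportional scaling of the two ages; a quick check of the arithmetic:

```python
UNIVERSE_AGE_YR = 13.8e9   # present age of the Universe in years
CMB_AGE_YR = 380e3         # age when the light seen by the SPT was emitted
PERSON_AGE_YR = 50         # the "50-year-old person" in the analogy

# Scale the 50-year lifetime by the same fraction, expressed in hours.
scaled_hours = PERSON_AGE_YR * 365.25 * 24 * (CMB_AGE_YR / UNIVERSE_AGE_YR)
print(round(scaled_hours))  # 12 hours
```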

    The second facility is called the Dark Energy Survey (DES).
    ___________________________________________________________________
    The Dark Energy Survey

    Dark Energy Camera [DECam] built at The DOE’s Fermi National Accelerator Laboratory.

    NOIRLab National Optical Astronomy Observatory Cerro Tololo Inter-American Observatory (CL) Victor M Blanco 4m Telescope which houses the Dark-Energy-Camera – DECam at Cerro Tololo, Chile at an altitude of 7200 feet.

    NOIRLabNSF NOIRLab NOAO Cerro Tololo Inter-American Observatory(CL) approximately 80 km to the East of La Serena, Chile, at an altitude of 2200 meters.

    The Dark Energy Survey is an international, collaborative effort to map hundreds of millions of galaxies, detect thousands of supernovae, and find patterns of cosmic structure that will reveal the nature of the mysterious dark energy that is accelerating the expansion of our Universe. The Dark Energy Survey began searching the Southern skies on August 31, 2013.

    According to Albert Einstein’s Theory of General Relativity, gravity should lead to a slowing of the cosmic expansion. Yet, in 1998, two teams of astronomers studying distant supernovae made the remarkable discovery that the expansion of the universe is speeding up.

    Nobel Prize in Physics for 2011 Expansion of the Universe

    4 October 2011

    The Royal Swedish Academy of Sciences has decided to award the Nobel Prize in Physics for 2011

    with one half to

    Saul Perlmutter
    The Supernova Cosmology Project
    The DOE’s Lawrence Berkeley National Laboratory and The University of California-Berkeley,

    and the other half jointly to

    Brian P. Schmidt
    The High-z Supernova Search Team,
    The Australian National University, Weston Creek, Australia.

    and

    Adam G. Riess
    The High-z Supernova Search Team,The Johns Hopkins University and
    The Space Telescope Science Institute, Baltimore, MD.
    Written in the stars

    “Some say the world will end in fire, some say in ice…” *

    What will be the final destiny of the Universe? Probably it will end in ice, if we are to believe this year’s Nobel Laureates in Physics. They have studied several dozen exploding stars, called supernovae, and discovered that the Universe is expanding at an ever-accelerating rate. The discovery came as a complete surprise even to the Laureates themselves.

    In 1998, cosmology was shaken at its foundations as two research teams presented their findings. Headed by Saul Perlmutter, one of the teams had set to work in 1988. Brian Schmidt headed another team, launched at the end of 1994, where Adam Riess was to play a crucial role.

    The research teams raced to map the Universe by locating the most distant supernovae. More sophisticated telescopes on the ground and in space, as well as more powerful computers and new digital imaging sensors (CCD, Nobel Prize in Physics in 2009), opened the possibility in the 1990s to add more pieces to the cosmological puzzle.

The teams used a particular kind of supernova, called a Type Ia supernova. It is an explosion of an old compact star that is as heavy as the Sun but as small as the Earth. A single such supernova can emit as much light as a whole galaxy. All in all, the two research teams found over 50 distant supernovae whose light was weaker than expected – this was a sign that the expansion of the Universe was accelerating. The potential pitfalls had been numerous, and the scientists found reassurance in the fact that both groups had reached the same astonishing conclusion.

For almost a century, the Universe has been known to be expanding as a consequence of the Big Bang about 14 billion years ago. However, the discovery that this expansion is accelerating is astounding. If the expansion continues to speed up, the Universe will end in ice.

    The acceleration is thought to be driven by dark energy, but what that dark energy is remains an enigma – perhaps the greatest in physics today. What is known is that dark energy constitutes about three quarters of the Universe. Therefore the findings of the 2011 Nobel Laureates in Physics have helped to unveil a Universe that to a large extent is unknown to science. And everything is possible again.

    *Robert Frost, Fire and Ice, 1920
    ___________________________________________________________________
    To explain cosmic acceleration, cosmologists are faced with two possibilities: either 70% of the universe exists in an exotic form, now called Dark Energy, that exhibits a gravitational force opposite to the attractive gravity of ordinary matter, or Albert Einstein’s Theory of General Relativity must be replaced by a new theory of gravity on cosmic scales.

    The Dark Energy Survey is designed to probe the origin of the accelerating universe and help uncover the nature of Dark Energy by measuring the 14-billion-year history of cosmic expansion with high precision. More than 400 scientists from over 25 institutions in the United States, Spain, the United Kingdom, Brazil, Germany, Switzerland, and Australia are working on the project. The collaboration built and is using an extremely sensitive 570-Megapixel digital camera, DECam, mounted on the Blanco 4-meter telescope at Cerro Tololo Inter-American Observatory, high in the Chilean Andes, to carry out the project.

    Over six years (2013-2019), the Dark Energy Survey collaboration used 758 nights of observation to carry out a deep, wide-area survey to record information from 300 million galaxies that are billions of light-years from Earth. The survey imaged 5000 square degrees of the southern sky in five optical filters to obtain detailed information about each galaxy. A fraction of the survey time is used to observe smaller patches of sky roughly once a week to discover and study thousands of supernovae and other astrophysical transients.
    ___________________________________________________________________
This is a very powerful telescope located on a mountaintop in Chile. Over the years, it has surveyed about 1/8 of the sky and photographed over 300 million galaxies, many of which are so dim that they are about one-millionth as bright as the dimmest stars visible to the human eye. This telescope can image galaxies from the current day to as far back as eight billion years ago. Continuing with the analogy of a 50-year-old individual, DES can take pictures of the Universe starting when it was 21 years old up until the present. (Full disclosure: Researchers at Fermilab, where I also work, carried out this study — but I did not participate in this research.)

    As light from distant galaxies travels to Earth, it can be distorted by galaxies that are closer to us. By using these tiny distortions, astronomers have developed a very precise map of the distribution of matter in the cosmos. This map includes both ordinary matter, of which stars and galaxies are the most familiar examples, and dark matter, which is a hypothesized form of matter that neither absorbs nor emits light. Dark matter is only observed through its gravitational effect on other objects and is thought to be five times more prevalent than ordinary matter.
    __________________________________
    Dark Matter Background
Fritz Zwicky discovered Dark Matter in the 1930s while observing the movement of the Coma Cluster. Vera Rubin, a woman in STEM who was denied the Nobel Prize, did most of the work on Dark Matter some 30 years later.

    Fritz Zwicky.

    Coma cluster via NASA/ESA Hubble, the original example of Dark Matter discovered during observations by Fritz Zwicky and confirmed 30 years later by Vera Rubin.

In modern times, it was astronomer Fritz Zwicky, in the 1930s, who made the first observations of what we now call dark matter. His 1933 observations of the Coma Cluster of galaxies seemed to indicate it had a mass 500 times greater than that previously calculated by Edwin Hubble. Furthermore, this extra mass seemed to be completely invisible. Although Zwicky’s observations were initially met with much skepticism, they were later confirmed by other groups of astronomers.

Thirty years later, astronomer Vera Rubin provided a huge piece of evidence for the existence of dark matter. She discovered that the outskirts of galaxies orbit just as fast as the regions near the center, whereas, if the visible matter were all there is, the outer stars should move more slowly, much as the outer planets of the Solar System move more slowly than the inner ones. That is what gravity dictates we should see in galaxies too. But we do not. One way to explain this is if the visible galaxy is merely the central region of some much larger structure, like the label at the center of a vinyl LP, so that the whole system rotates together and the orbital speed stays consistent from center to edge.

    Vera Rubin, following Zwicky, postulated that the missing structure in galaxies is dark matter. Her ideas were met with much resistance from the astronomical community, but her observations have been confirmed and are seen today as pivotal proof of the existence of dark matter.
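Rubin’s anomaly is easy to state numerically: if the visible mass were all there is and concentrated toward the center, Newtonian gravity predicts orbital speeds falling as 1/√r, so stars four times farther out should move at half the speed. The mass and radii below are illustrative placeholders, not measured values:

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_VISIBLE = 3e41     # hypothetical central mass in kg (illustrative only)

def keplerian_speed(r_m):
    """Orbital speed at radius r_m if all mass sits well inside that radius."""
    return math.sqrt(G * M_VISIBLE / r_m)

v_inner = keplerian_speed(1.0e20)   # roughly 3 kpc from the center
v_outer = keplerian_speed(4.0e20)   # four times farther out

# Kepler predicts the outer speed is half the inner speed (1/sqrt(4)) ...
print(v_outer / v_inner)  # 0.5
# ... yet Rubin's measured rotation curves give a ratio close to 1,
# implying unseen mass spread far beyond the visible galaxy.
```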

    Astronomer Vera Rubin at the Lowell Observatory in 1965, worked on Dark Matter (The Carnegie Institution for Science).

    Vera Rubin, with Department of Terrestrial Magnetism (DTM) image tube spectrograph attached to the Kitt Peak 84-inch telescope, 1970.

    Vera Rubin measuring spectra, worked on Dark Matter(Emilio Segre Visual Archives AIP SPL).

    Dark Matter Research

    Super Cryogenic Dark Matter Search from DOE’s SLAC National Accelerator Laboratory at Stanford University at SNOLAB (Vale Inco Mine, Sudbury, Canada).

    LBNL LZ Dark Matter Experiment xenon detector at Sanford Underground Research Facility Credit: Matt Kapust.


    DAMA at Gran Sasso uses sodium iodide housed in copper to hunt for dark matter LNGS-INFN.

    Yale HAYSTAC axion dark matter experiment at Yale’s Wright Lab.

    DEAP Dark Matter detector, The DEAP-3600, suspended in the SNOLAB (CA) deep in Sudbury’s Creighton Mine.

    The LBNL LZ Dark Matter Experiment Dark Matter project at SURF, Lead, SD.

    DAMA-LIBRA Dark Matter experiment at the Italian National Institute for Nuclear Physics’ (INFN’s) Gran Sasso National Laboratories (LNGS) located in the Abruzzo region of central Italy.

    DARWIN Dark Matter experiment. A design study for a next-generation, multi-ton dark matter detector in Europe at The University of Zurich [Universität Zürich](CH).

    PandaX II Dark Matter experiment at Jin-ping Underground Laboratory (CJPL) in Sichuan, China.

    Inside the Axion Dark Matter eXperiment U Washington. Credit: Mark Stone U. of Washington. Axion Dark Matter Experiment.

    The University of Western Australia ORGAN Experiment’s main detector. A small copper cylinder called a “resonant cavity” traps photons generated during dark matter conversion. The cylinder is bolted to a “dilution refrigerator” which cools the experiment to very low temperatures.
    __________________________________

    Is the Big Bang incomplete?

    In order to test the Big Bang, astronomers can use measurements taken by the South Pole Telescope and use the theory to project forward to the present day. They can then take measurements from the Dark Energy Survey and compare them. If the measurements are accurate and the theory describes the cosmos, they should agree.

    And, by and large, they do — but not completely. When astronomers look at how “clumpy” the matter of the current Universe should be, purely from SPT measurements and extrapolations of theory, they find that the predictions are “clumpier” than current measurements by DES.

    This disagreement is potentially significant and could signal that the theory of the Big Bang is incomplete. Furthermore, this isn’t the first discrepancy that astronomers have encountered when they project measurements of the same primordial light imaged by the SPT to the modern day. Different research groups, using different telescopes, have found that the current Universe is expanding faster than expected from observations of the ancient light seen by the SPT, combined with Big Bang theory. This other discrepancy is called the Hubble Tension, named after American astronomer Edwin Hubble, who first realized that the Universe was expanding.

    __________________________________________________________________________________

    Edwin Hubble


    __________________________________________________________________________________


    Have astronomers disproved the Big Bang?

While the new discrepancy between predictions and measurements of the clumpiness of the Universe is preliminary, it could be that both this measurement and the Hubble Tension imply that the Big Bang theory needs some tweaking. Mind you, the discrepancies do not rise to the level of scrapping the theory entirely; however, it is the nature of the scientific method to adjust theories to account for new observations.

    See the full article here.


     
  • richardmitnick 8:58 am on October 20, 2022 Permalink | Reply
    Tags: "Pantheon+", "Pantheon+" also cements a major disagreement over the pace of that expansion that has yet to be solved., "Pantheon+" further closes the door on alternative frameworks accounting for dark energy and dark matter., "The Most Precise Accounting Yet of Dark Energy and Dark Matter", , , , , Dark Energy, , G299 was left over by a particular class of supernovas called a Type Ia., , , , The current best theories for dark energy and dark matter hold strong., , The most distant supernovae in the dataset gleam forth from 10.7 billion light years away., The new "Pantheon+" analysis holds that 66.2 percent of the universe manifests as dark energy with the remaining 33.8 percent being a combination of dark matter and matter.,   

    From The Harvard-Smithsonian Center for Astrophysics: “The Most Precise Accounting Yet of Dark Energy and Dark Matter” 

    From The Harvard-Smithsonian Center for Astrophysics

    10.19.22
    Media Contact:
    Nadia Whitehead
    Public Affairs Officer
    Center for Astrophysics | Harvard & Smithsonian
    nadia.whitehead@cfa.harvard.edu
    617-721-7371

    Analyzing more than two decades’ worth of supernova explosions convincingly bolsters modern cosmological theories and reinvigorates efforts to answer fundamental questions.

G299 is a remnant left behind by a particular class of supernova called a Type Ia. Credit: NASA/CXC/University of Texas.

    Astrophysicists have performed a powerful new analysis that places the most precise limits yet on the composition and evolution of the universe. With this analysis, dubbed “Pantheon+”, cosmologists find themselves at a crossroads.

“Pantheon+” convincingly finds that the cosmos is composed of about two-thirds dark energy and one-third matter — mostly in the form of dark matter — and has been expanding at an accelerating pace for the last several billion years. However, “Pantheon+” also cements a major disagreement over the pace of that expansion that has yet to be solved.

    By putting prevailing modern cosmological theories, known as the Standard Model of Cosmology, on even firmer evidentiary and statistical footing, “Pantheon+” further closes the door on alternative frameworks accounting for dark energy and dark matter. Both are bedrocks of the Standard Model of Cosmology but have yet to be directly detected and rank among the model’s biggest mysteries. Following through on the results of “Pantheon+”, researchers can now pursue more precise observational tests and hone explanations for the ostensible cosmos.

    “With these “Pantheon+” results, we are able to put the most precise constraints on the dynamics and history of the universe to date,” says Dillon Brout, an Einstein Fellow at the Center for Astrophysics | Harvard & Smithsonian. “We’ve combed over the data and can now say with more confidence than ever before how the universe has evolved over the eons and that the current best theories for dark energy and dark matter hold strong.”

    Brout is the lead author of a series of papers describing the new “Pantheon+” analysis, published jointly today in a special issue of The Astrophysical Journal [below].

    “Pantheon+” is based on the largest dataset of its kind, comprising more than 1,500 stellar explosions called “Type Ia supernovae”. These bright blasts occur when white dwarf stars — remnants of stars like our Sun — accumulate too much mass and undergo a runaway thermonuclear reaction. Because “Type Ia supernovae” outshine entire galaxies, the stellar detonations can be glimpsed at distances exceeding 10 billion light years, or back through about three-quarters of the universe’s total age. Given that the supernovae blaze with nearly uniform intrinsic brightnesses, scientists can use the explosions’ apparent brightness, which diminishes with distance, along with redshift measurements as markers of time and space.
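The standard-candle logic rests on the distance modulus, m − M = 5 log10(d / 10 pc): because every Type Ia explosion has nearly the same absolute magnitude M (roughly −19.3), the observed apparent magnitude m gives the distance directly. A minimal sketch, with a hypothetical example magnitude:

```python
M_TYPE_IA = -19.3  # approximate absolute magnitude of a Type Ia supernova

def distance_parsecs(apparent_mag, absolute_mag=M_TYPE_IA):
    """Invert the distance modulus m - M = 5 * log10(d / 10 pc)."""
    return 10 ** ((apparent_mag - absolute_mag + 5) / 5)

# A hypothetical supernova observed at apparent magnitude 24:
d_pc = distance_parsecs(24.0)
print(round(d_pc / 1e9, 2))  # 4.57 gigaparsecs
```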

    That information, in turn, reveals how fast the universe expands during different epochs, which is then used to test theories of the fundamental components of the universe.

    The breakthrough discovery in 1998 of the universe’s accelerating growth was thanks to a study of “Type Ia supernovae” in this manner.

Scientists attribute the acceleration to an invisible energy, dubbed dark energy, inherent to the fabric of the universe itself. Subsequent decades of work have continued to compile ever-larger datasets, revealing supernovae across an even wider range of space and time, and “Pantheon+” has now brought them together into the most statistically robust analysis to date.

    “In many ways, this latest “Pantheon+” analysis is a culmination of more than two decades’ worth of diligent efforts by observers and theorists worldwide in deciphering the essence of the cosmos,” says Adam Riess, one of the winners of the 2011 Nobel Prize in Physics for the discovery of the accelerating expansion of the universe and the Bloomberg Distinguished Professor at Johns Hopkins University (JHU) and the Space Telescope Science Institute in Baltimore, Maryland. Riess is also an alum of Harvard University, holding a PhD in astrophysics.

    Brout’s own career in cosmology traces back to his undergraduate years at JHU, where he was taught and advised by Riess. There Brout worked with then-PhD-student and Riess-advisee Dan Scolnic, who is now an assistant professor of physics at Duke University and another co-author on the new series of papers.

    Several years ago, Scolnic developed the original Pantheon analysis of approximately 1,000 supernovae.

Now, Brout and Scolnic and their new “Pantheon+” team have added some 50 percent more supernova data points in “Pantheon+”, coupled with improvements in analysis techniques and the treatment of potential sources of error, which ultimately has yielded twice the precision of the original Pantheon.

    “This leap in both the dataset quality and in our understanding of the physics that underpin it would not have been possible without a stellar team of students and collaborators working diligently to improve every facet of the analysis,” says Brout.

Taking the data as a whole, the new analysis holds that 66.2 percent of the universe manifests as dark energy, with the remaining 33.8 percent being a combination of dark matter and matter. To arrive at an even more comprehensive understanding of the constituent components of the universe at different epochs, Brout and colleagues combined “Pantheon+” with other strongly evidenced, independent and complementary measures of the large-scale structure of the universe and with measurements from the earliest light in the universe, the cosmic microwave background [CMB].

    Another key “Pantheon+” result relates to one of the paramount goals of modern cosmology: nailing down the current expansion rate of the universe, known as the “Hubble constant”. Pooling the “Pantheon+” sample with data from the “SH0ES” (Supernova H0 for the Equation of State) collaboration, led by Riess, results in the most stringent local measurement of the current expansion rate of the universe.

    “Pantheon+” and “SH0ES” together find a “Hubble constant” of 73.4 kilometers per second per megaparsec with only 1.3% uncertainty. Stated another way, for every megaparsec, or 3.26 million light years, the analysis estimates that in the nearby universe, space itself is expanding at more than 160,000 miles per hour.
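
    As a back-of-the-envelope check of that conversion (the figures below are pure unit arithmetic, not new data):

    ```python
    # Convert the Pantheon+/SH0ES Hubble constant (73.4 km/s per megaparsec)
    # into miles per hour per megaparsec.
    H0_KM_S_MPC = 73.4        # km/s per megaparsec
    KM_PER_MILE = 1.609344    # exact, by definition
    SECONDS_PER_HOUR = 3600

    mph_per_mpc = H0_KM_S_MPC / KM_PER_MILE * SECONDS_PER_HOUR
    print(f"{mph_per_mpc:,.0f} mph per megaparsec")  # roughly 164,000
    ```

    which confirms the “more than 160,000 miles per hour” figure.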

    However, observations from an entirely different epoch of the universe’s history tell a different story. Measurements of the universe’s earliest light, the cosmic microwave background [CMB], when combined with the current Standard Model of Cosmology, consistently peg the “Hubble constant” at a value significantly lower than that obtained from “Type Ia supernovae” and other astrophysical markers. This sizable discrepancy between the two methodologies has been termed the “Hubble tension”.

    The new “Pantheon+” and “SH0ES” datasets heighten this “Hubble tension”. In fact, the tension has now passed the important 5σ threshold (about one-in-a-million odds of arising from random chance) that physicists use to distinguish possible statistical flukes from effects that demand explanation. Reaching this new statistical level highlights the challenge for both theorists and astrophysicists trying to explain the “Hubble constant” discrepancy.
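
    For readers curious where odds like “one in a million” come from, the 5σ tail probability of a Gaussian can be computed with the Python standard library alone (a sketch; the exact odds depend on whether one quotes the one-sided or two-sided tail):

    ```python
    import math

    def gaussian_tail(sigma: float) -> float:
        """One-sided probability that a standard normal variable exceeds `sigma`."""
        return 0.5 * math.erfc(sigma / math.sqrt(2))

    p5 = gaussian_tail(5.0)
    print(f"P(> 5 sigma) = {p5:.2e}, i.e. about 1 in {1 / p5:,.0f}")
    ```

    The one-sided tail is about 2.9 × 10⁻⁷, roughly one chance in 3.5 million; the two-sided figure is twice that.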

    “We thought it would be possible to find clues to a novel solution to these problems in our dataset, but instead we’re finding that our data rules out many of these options and that the profound discrepancies remain as stubborn as ever,” says Brout.

    The “Pantheon+” results could help point to where the solution to the “Hubble tension” lies. “Many recent theories have begun pointing to exotic new physics in the very early universe, however such unverified theories must withstand the scientific process and the “Hubble tension” continues to be a major challenge,” says Brout.

    Overall, “Pantheon+” offers scientists a comprehensive lookback through much of cosmic history. The earliest, most distant supernovae in the dataset shine from 10.7 billion light-years away, meaning their light left them when the universe was roughly a quarter of its current age. In that earlier era, dark matter and its associated gravity held the universe’s expansion rate in check. That state of affairs changed dramatically over the next several billion years as the influence of dark energy overwhelmed that of dark matter. Dark energy has since flung the contents of the cosmos ever-farther apart and at an ever-increasing rate.

    “With this combined “Pantheon+” dataset, we get a precise view of the universe from the time when it was dominated by dark matter to when the universe became dominated by dark energy,” says Brout. “This dataset is a unique opportunity to see dark energy turn on and drive the evolution of the cosmos on the grandest scales up through present time.”

    Studying this changeover now with even stronger statistical evidence will hopefully lead to new insights into dark energy’s enigmatic nature.

    “‘Pantheon+’ is giving us our best chance to date of constraining dark energy, its origins, and its evolution,” says Brout.

    Science paper compilation:
    The Astrophysical Journal

    See the full article here.


    Please help promote STEM in your local schools.


    Stem Education Coalition

    The Harvard-Smithsonian Center for Astrophysics combines the resources and research facilities of the Harvard College Observatory and the Smithsonian Astrophysical Observatory under a single director to pursue studies of those basic physical processes that determine the nature and evolution of the universe. The Smithsonian Astrophysical Observatory is a bureau of the Smithsonian Institution, founded in 1890. The Harvard College Observatory, founded in 1839, is a research institution of the Faculty of Arts and Sciences, Harvard University, and provides facilities and substantial other support for teaching activities of the Department of Astronomy.

    Founded in 1973 and headquartered in Cambridge, Massachusetts, the CfA leads a broad program of research in astronomy, astrophysics, Earth and space sciences, as well as science education. The CfA either leads or participates in the development and operations of more than fifteen ground- and space-based astronomical research observatories across the electromagnetic spectrum, including the forthcoming Giant Magellan Telescope(CL) and the Chandra X-ray Observatory, one of NASA’s Great Observatories.

    GMT Giant Magellan Telescope(CL) 21 meters, to be at the Carnegie Institution for Science’s NSF NOIRLab NOAO Las Campanas Observatory(CL) some 115 km (71 mi) north-northeast of La Serena, Chile, over 2,500 m (8,200 ft) high.

    National Aeronautics and Space Administration Chandra X-ray telescope.

    Hosting more than 850 scientists, engineers, and support staff, the CfA is among the largest astronomical research institutes in the world. Its projects have included Nobel Prize-winning advances in cosmology and high energy astrophysics, the discovery of many exoplanets, and the first image of a black hole. The CfA also serves a major role in the global astrophysics research community: the CfA’s Astrophysics Data System, for example, has been universally adopted as the world’s online database of astronomy and physics papers. Known for most of its history as the “Harvard-Smithsonian Center for Astrophysics”, the CfA rebranded in 2018 to its current name in an effort to reflect its unique status as a joint collaboration between Harvard University and the Smithsonian Institution. The CfA’s current Director (since 2004) is Charles R. Alcock, who succeeds Irwin I. Shapiro (Director from 1982 to 2004) and George B. Field (Director from 1973 to 1982).

    The Center for Astrophysics | Harvard & Smithsonian is not formally an independent legal organization, but rather an institutional entity operated under a Memorandum of Understanding between Harvard University and the Smithsonian Institution. This collaboration was formalized on July 1, 1973, with the goal of coordinating the related research activities of the Harvard College Observatory (HCO) and the Smithsonian Astrophysical Observatory (SAO) under the leadership of a single Director, and housed within the same complex of buildings on the Harvard campus in Cambridge, Massachusetts. The CfA’s history is therefore also that of the two fully independent organizations that comprise it. With a combined lifetime of more than 300 years, HCO and SAO have been host to major milestones in astronomical history that predate the CfA’s founding.

    History of the Smithsonian Astrophysical Observatory (SAO)

    Samuel Pierpont Langley, the third Secretary of the Smithsonian, founded the Smithsonian Astrophysical Observatory on the south yard of the Smithsonian Castle (on the U.S. National Mall) on March 1, 1890. The Astrophysical Observatory’s initial, primary purpose was to “record the amount and character of the Sun’s heat”. Charles Greeley Abbot was named SAO’s first director, and the observatory operated solar telescopes to take daily measurements of the Sun’s intensity in different regions of the optical electromagnetic spectrum. In doing so, the observatory enabled Abbot to make critical refinements to the Solar constant, as well as to serendipitously discover Solar variability. It is likely that SAO’s early history as a solar observatory was part of the inspiration behind the Smithsonian’s “sunburst” logo, designed in 1965 by Crimilda Pontes.

    In 1955, the scientific headquarters of SAO moved from Washington, D.C. to Cambridge, Massachusetts to affiliate with the Harvard College Observatory (HCO). Fred Lawrence Whipple, then the chairman of the Harvard Astronomy Department, was named the new director of SAO. The collaborative relationship between SAO and HCO therefore predates the official creation of the CfA by 18 years. SAO’s move to Harvard’s campus also resulted in a rapid expansion of its research program. Following the launch of Sputnik (the world’s first human-made satellite) in 1957, SAO accepted a national challenge to create a worldwide satellite-tracking network, collaborating with the United States Air Force on Project Space Track.

    With the creation of the National Aeronautics and Space Administration the following year and throughout the space race, SAO led major efforts in the development of orbiting observatories and large ground-based telescopes, laboratory and theoretical astrophysics, as well as the application of computers to astrophysical problems.

    History of Harvard College Observatory (HCO)

    Partly in response to renewed public interest in astronomy following the 1835 return of Halley’s Comet, the Harvard College Observatory was founded in 1839, when the Harvard Corporation appointed William Cranch Bond as an “Astronomical Observer to the University”. For its first four years of operation, the observatory was situated at the Dana-Palmer House (where Bond also resided) near Harvard Yard, and consisted of little more than three small telescopes and an astronomical clock. In his 1840 book recounting the history of the college, then Harvard President Josiah Quincy III noted that “…there is wanted a reflecting telescope equatorially mounted…”. This telescope, the 15-inch “Great Refractor”, opened seven years later (in 1847) at the top of Observatory Hill in Cambridge (where it still exists today, housed in the oldest of the CfA’s complex of buildings). The telescope was the largest in the United States from 1847 until 1867. William Bond and pioneer photographer John Adams Whipple used the Great Refractor to produce the first clear daguerreotypes of the Moon (winning them an award at the 1851 Great Exhibition in London). Bond and his son, George Phillips Bond (the second Director of HCO), used it to discover Saturn’s 8th moon, Hyperion (which was also independently discovered by William Lassell).

    Under the directorship of Edward Charles Pickering from 1877 to 1919, the observatory became the world’s major producer of stellar spectra and magnitudes, established an observing station in Peru, and applied mass-production methods to the analysis of data. It was during this time that HCO became host to a series of major discoveries in astronomical history, powered by the Observatory’s so-called “Computers” (women hired by Pickering as skilled workers to process astronomical data). These “Computers” included Williamina Fleming; Annie Jump Cannon; Henrietta Swan Leavitt; Florence Cushman; and Antonia Maury, all widely recognized today as major figures in scientific history. Henrietta Swan Leavitt, for example, discovered the so-called period-luminosity relation for Classical Cepheid variable stars, establishing the first major “standard candle” with which to measure the distance to galaxies. Now called “Leavitt’s Law”, the discovery is regarded as one of the most foundational and important in the history of astronomy; astronomers like Edwin Hubble, for example, would later use Leavitt’s Law to establish that the Universe is expanding, the primary piece of evidence for the Big Bang model.

    Upon Pickering’s retirement in 1921, the Directorship of HCO fell to Harlow Shapley (a major participant in the so-called “Great Debate” of 1920). This era of the observatory was made famous by the work of Cecilia Payne-Gaposchkin, who became the first woman to earn a Ph.D. in astronomy from Radcliffe College (a short walk from the Observatory). Payne-Gaposchkin’s 1925 thesis proposed that stars were composed primarily of hydrogen and helium, an idea thought ridiculous at the time. Between Shapley’s tenure and the formation of the CfA, the observatory was directed by Donald H. Menzel and then Leo Goldberg, both of whom maintained widely recognized programs in solar and stellar astrophysics. Menzel played a major role in encouraging the Smithsonian Astrophysical Observatory to move to Cambridge and collaborate more closely with HCO.

    Joint history as the Center for Astrophysics (CfA)

    The collaborative foundation for what would ultimately give rise to the Center for Astrophysics began with SAO’s move to Cambridge in 1955. Fred Whipple, who was already chair of the Harvard Astronomy Department (housed within HCO since 1931), was named SAO’s new director at the start of this new era; an early test of the model for a unified Directorship across HCO and SAO. The following 18 years would see the two independent entities merge ever closer together, operating effectively (but informally) as one large research center.

    This joint relationship was formalized as the new Harvard–Smithsonian Center for Astrophysics on July 1, 1973. George B. Field, then affiliated with University of California- Berkeley, was appointed as its first Director. That same year, a new astronomical journal, the CfA Preprint Series was created, and a CfA/SAO instrument flying aboard Skylab discovered coronal holes on the Sun. The founding of the CfA also coincided with the birth of X-ray astronomy as a new, major field that was largely dominated by CfA scientists in its early years. Riccardo Giacconi, regarded as the “father of X-ray astronomy”, founded the High Energy Astrophysics Division within the new CfA by moving most of his research group (then at American Sciences and Engineering) to SAO in 1973. That group would later go on to launch the Einstein Observatory (the first imaging X-ray telescope) in 1976, and ultimately lead the proposals and development of what would become the Chandra X-ray Observatory. Chandra, the second of NASA’s Great Observatories and still the most powerful X-ray telescope in history, continues operations today as part of the CfA’s Chandra X-ray Center. Giacconi would later win the 2002 Nobel Prize in Physics for his foundational work in X-ray astronomy.

    Shortly after the launch of the Einstein Observatory, the CfA’s Steven Weinberg won the 1979 Nobel Prize in Physics for his work on electroweak unification. The following decade saw the start of the landmark CfA Redshift Survey (the first attempt to map the large scale structure of the Universe), as well as the release of the Field Report, a highly influential Astronomy & Astrophysics Decadal Survey chaired by the outgoing CfA Director George Field. He would be replaced in 1982 by Irwin Shapiro, who during his tenure as Director (1982 to 2004) oversaw the expansion of the CfA’s observing facilities around the world.

    Harvard Smithsonian Center for Astrophysics Fred Lawrence Whipple Observatory located near Amado, Arizona on the slopes of Mount Hopkins, Altitude 2,606 m (8,550 ft)

    European Space Agency [La Agencia Espacial Europea] [Agence spatiale européenne] [Europäische Weltraumorganization] (EU)/National Aeronautics and Space Administration SOHO satellite. Launched in 1995.

    National Aeronautics Space Agency NASA Kepler Space Telescope

    CfA-led discoveries throughout this period include canonical work on Supernova 1987A, the “CfA2 Great Wall” (then the largest known coherent structure in the Universe), the best-yet evidence for supermassive black holes, and the first convincing evidence for an extrasolar planet.

    The 1990s also saw the CfA unwittingly play a major role in the history of computer science and the internet: in 1990, SAO developed SAOImage, one of the world’s first X11-based applications made publicly available (its successor, DS9, remains the most widely used astronomical FITS image viewer worldwide). During this time, scientists at the CfA also began work on what would become the Astrophysics Data System (ADS), one of the world’s first online databases of research papers. By 1993, the ADS was running the first routine transatlantic queries between databases, a foundational aspect of the internet today.

    The CfA Today

    Research at the CfA

    Charles Alcock, known for a number of major works related to massive compact halo objects, was named the third director of the CfA in 2004. Today Alcock oversees one of the largest and most productive astronomical institutes in the world, with more than 850 staff and an annual budget in excess of $100M. The Harvard Department of Astronomy, housed within the CfA, maintains a continual complement of approximately 60 Ph.D. students, more than 100 postdoctoral researchers, and roughly 25 undergraduate majors in astronomy and astrophysics from Harvard College. SAO, meanwhile, hosts a long-running and highly rated REU Summer Intern program as well as many visiting graduate students. The CfA estimates that roughly 10% of the professional astrophysics community in the United States spent at least a portion of their career or education there.

    The CfA is either a lead or major partner in the operations of the Fred Lawrence Whipple Observatory, the Submillimeter Array, MMT Observatory, the South Pole Telescope, VERITAS, and a number of other smaller ground-based telescopes. The CfA’s 2019-2024 Strategic Plan includes the construction of the Giant Magellan Telescope as a driving priority for the Center.

    CFA Harvard Smithsonian Submillimeter Array on Mauna Kea, Hawai’i, Altitude 4,205 m (13,796 ft).

    South Pole Telescope SPTPOL. The SPT collaboration is made up of over a dozen (mostly North American) institutions, including The University of Chicago ; The University of California-Berkeley ; Case Western Reserve University; Harvard/Smithsonian Astrophysical Observatory; The University of Colorado- Boulder; McGill (CA) University, The University of Illinois, Urbana-Champaign; The University of California- Davis; Ludwig Maximilians Universität München(DE); DOE’s Argonne National Laboratory; and The National Institute for Standards and Technology.

    Along with the Chandra X-ray Observatory, the CfA plays a central role in a number of space-based observing facilities, including the recently launched Parker Solar Probe, Kepler Space Telescope, the Solar Dynamics Observatory (SDO), and HINODE. The CfA, via the Smithsonian Astrophysical Observatory, recently played a major role in the Lynx X-ray Observatory, a NASA-funded Large Mission Concept Study commissioned as part of the 2020 Decadal Survey on Astronomy and Astrophysics (“Astro2020”). If launched, Lynx would be the most powerful X-ray observatory constructed to date, enabling order-of-magnitude advances in capability over Chandra.

    NASA Parker Solar Probe Plus named to honor Pioneering Physicist Eugene Parker. The Johns Hopkins University Applied Physics Lab.

    National Aeronautics and Space Administration Solar Dynamics Observatory.

    Japan Aerospace Exploration Agency (JAXA) (国立研究開発法人宇宙航空研究開発機構] (JP)/National Aeronautics and Space Administration HINODE spacecraft.

    SAO is one of the 13 stakeholder institutes for the Event Horizon Telescope Board, and the CfA hosts its Array Operations Center. In 2019, the project revealed the first direct image of a black hole.

    Messier 87*, The first image of the event horizon of a black hole. This is the supermassive black hole at the center of the galaxy Messier 87. Image via The Event Horizon Telescope Collaboration released on 10 April 2019 via National Science Foundation.

    The result is widely regarded as a triumph not only of observational radio astronomy, but of its intersection with theoretical astrophysics. Union of the observational and theoretical subfields of astrophysics has been a major focus of the CfA since its founding.

    In 2018, the CfA rebranded, changing its official name to the “Center for Astrophysics | Harvard & Smithsonian” in an effort to reflect its unique status as a joint collaboration between Harvard University and the Smithsonian Institution. Today, the CfA receives roughly 70% of its funding from NASA, 22% from Smithsonian federal funds, and 4% from the National Science Foundation. The remaining 4% comes from contributors including the United States Department of Energy, the Annenberg Foundation, as well as other gifts and endowments.

     
  • richardmitnick 9:13 am on September 24, 2022 Permalink | Reply
    Tags: "Star Light Star Bright … But Exactly How Bright?", , , , , Dark Energy, , , Type 1A supernovae   

    From The National Institute of Standards and Technology: “Star Light Star Bright … But Exactly How Bright?” 

    From The National Institute of Standards and Technology

    9.22.22

    Technical Contacts

    Susana Deustua
    susana.deustua@nist.gov
    (301) 975-3763

    John T. Woodward IV
    john.woodward@nist.gov
    (301) 975-5495

    1
    NIST researcher John Woodward with the four-inch telescope used to calibrate the luminosity of nearby stars.
    Credit: C. Suplee/NIST.

    2
    Astronomers use the brightness of a type of exploding star known as a Type 1A supernova (seen here as bright blue dot to the left of a remote spiral galaxy) to determine the age and expansion rate of the universe. New calibrations of the luminosity of nearby stars, observed by NIST researchers, could help astronomers refine their measurements.
    Credit: J. DePasquale (STScI), M. Kornmesser and M. Zamani (ESA/Hubble), A. Riess (STScI/JHU)NASA, ESA, and the SH0ES team, and the Digitized Sky Survey.

    3
    The four-inch telescope on Mt. Hopkins in Arizona. Credit: J. Woodward/NIST.

    4
    Side view of the telescope undergoing testing in the laboratory. Credit: C. Suplee/NIST.

    A picture may be worth a thousand words, but for astronomers, simply recording images of stars and galaxies isn’t enough. To measure the true size and absolute brightness (luminosity) of heavenly bodies, astronomers need to accurately gauge the distance to these objects. To do so, the researchers rely on “standard candles” – stars whose luminosities are so well known that they act like light bulbs of known wattage.

    One way to determine a star’s distance from Earth is to compare how bright the star appears in the sky to its luminosity.
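
    That comparison rests on the inverse-square law: a source of luminosity L produces a flux F = L / (4πd²) at distance d, so d = √(L / 4πF). A minimal sketch, using the Sun as a sanity check (the constants are standard reference values, not measurements from this work):

    ```python
    import math

    def distance_from_flux(luminosity_watts: float, flux_w_m2: float) -> float:
        """Inverse-square law: distance (in meters) to a source of known
        luminosity, given the flux measured at the observer."""
        return math.sqrt(luminosity_watts / (4 * math.pi * flux_w_m2))

    L_SUN = 3.828e26    # W, IAU nominal solar luminosity
    F_EARTH = 1361.0    # W/m^2, the solar "constant" measured at Earth
    AU = 1.496e11       # m, mean Earth-Sun distance

    d = distance_from_flux(L_SUN, F_EARTH)
    print(f"Recovered Earth-Sun distance: {d / AU:.2f} AU")
    ```

    Recovering roughly 1 AU from the Sun’s luminosity and the measured solar flux is the same logic astronomers apply, in reverse, to standard candles.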

    But even standard candles need to be calibrated. For more than a decade, scientists at the National Institute of Standards and Technology (NIST) have been working to improve the methods for calibrating standard stars. They observed two nearby bright stars, Vega and Sirius, in order to calibrate their luminosity over a range of visible-light wavelengths. The researchers are now completing their analysis and plan to release the calibration data to astronomers within the next 12 months.

    The calibration data could aid astronomers who use more distant standard candles – exploded stars known as Type Ia supernovas – to determine the age and expansion rate of the universe. (Comparing the brightness of remote Type Ia supernovas to nearby ones led to the Nobel Prize-winning discovery that the expansion of the universe is not slowing down, as expected, but is actually speeding up.)

    ______________________________________________________________________________

    4 October 2011

    The Royal Swedish Academy of Sciences has decided to award the Nobel Prize in Physics for 2011

    with one half to

    Saul Perlmutter
    The Supernova Cosmology Project
    The DOE’s Lawrence Berkeley National Laboratory and The University of California-Berkeley,

    and the other half jointly to

    Brian P. Schmidt
    The High-z Supernova Search Team, The Australian National University, Weston Creek, Australia.

    and

    Adam G. Riess

    The High-z Supernova Search Team, The Johns Hopkins University and The Space Telescope Science Institute, Baltimore, MD.

    Written in the stars

    “Some say the world will end in fire, some say in ice…” *

    What will be the final destiny of the Universe? Probably it will end in ice, if we are to believe this year’s Nobel Laureates in Physics. They have studied several dozen exploding stars, called supernovae, and discovered that the Universe is expanding at an ever-accelerating rate. The discovery came as a complete surprise even to the Laureates themselves.

    In 1998, cosmology was shaken at its foundations as two research teams presented their findings. Headed by Saul Perlmutter, one of the teams had set to work in 1988. Brian Schmidt headed another team, launched at the end of 1994, where Adam Riess was to play a crucial role.

    The research teams raced to map the Universe by locating the most distant supernovae. More sophisticated telescopes on the ground and in space, as well as more powerful computers and new digital imaging sensors (CCD, Nobel Prize in Physics in 2009), opened the possibility in the 1990s to add more pieces to the cosmological puzzle.

    The teams used a particular kind of supernova, called Type 1a supernova. It is an explosion of an old compact star that is as heavy as the Sun but as small as the Earth. A single such supernova can emit as much light as a whole galaxy. All in all, the two research teams found over 50 distant supernovae whose light was weaker than expected – this was a sign that the expansion of the Universe was accelerating. The potential pitfalls had been numerous, and the scientists found reassurance in the fact that both groups had reached the same astonishing conclusion.

    For almost a century, the Universe has been known to be expanding as a consequence of the Big Bang about 14 billion years ago. However, the discovery that this expansion is accelerating is astounding. If the expansion continues to speed up, the Universe will end in ice.

    The acceleration is thought to be driven by dark energy, but what that dark energy is remains an enigma – perhaps the greatest in physics today. What is known is that dark energy constitutes about three quarters of the Universe. Therefore the findings of the 2011 Nobel Laureates in Physics have helped to unveil a Universe that to a large extent is unknown to science. And everything is possible again.

    *Robert Frost, Fire and Ice, 1920
    ______________________________________________________________________________

    Astronomers may be able to use the NIST calibrations of Vega and Sirius to better compare the brightness of nearby and faraway type Ia supernovas, leading to more accurate measurements of the expansion of the universe and its age.

    In the ongoing NIST study, scientists observe the two nearby stars with a four-inch telescope they designed and placed atop Mount Hopkins in the desert of southern Arizona.

    John Woodward, Susana Deustua, and their colleagues have repeatedly observed the spectra, or colors, of light emitted by Vega (25 light-years away) and Sirius (8.6 light-years). One light-year, the distance light travels through a vacuum in one year, is 9.46 trillion kilometers.

    At the beginning and end of each observing night, the researchers tilt the telescope downward to compare the stellar spectra with that of an artificial star: a quartz lamp of precisely measured luminosity placed 100 meters from the telescope.

    Before the scientists can directly make the comparisons, they must account for the effect of Earth’s atmosphere, which scatters and absorbs some of the starlight before it can reach the telescope. Although light from the ground-based lamp does not travel through the full depth of the atmosphere, some of it is scattered by air during its short, horizontal journey to the telescope.

    To assess how much of the ground-based light is scattered, the NIST team measures the ratio of the power of a helium-neon laser beam at its source to the power received 100 m away, at the site of the lamp.

    To determine how much starlight is lost to the Earth’s atmosphere, the researchers record the amount of starlight reaching the telescope as it points in different directions, peering through different thicknesses of the atmosphere during the night. Changes in the amount of light recorded by the telescope as the night progresses allow astronomers to correct for the atmospheric absorption.
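
    This procedure amounts to fitting a straight line of observed magnitude against airmass (the path length through the atmosphere) and extrapolating to zero atmosphere, sometimes called a Bouguer fit: m(X) = m₀ + kX, where m₀ is the above-atmosphere magnitude and k the extinction coefficient. A minimal sketch with invented numbers (not NIST’s actual pipeline or data):

    ```python
    # Fit m(X) = m0 + k * X by least squares; extrapolating to X = 0 gives
    # the magnitude the star would have with no atmosphere in the way.
    # Airmass and magnitude values below are invented for illustration.
    airmass = [1.0, 1.3, 1.8, 2.4]            # roughly sec(zenith angle)
    magnitude = [0.100, 0.145, 0.220, 0.310]  # instrumental magnitudes

    n = len(airmass)
    mean_x = sum(airmass) / n
    mean_y = sum(magnitude) / n
    k = sum((x - mean_x) * (y - mean_y) for x, y in zip(airmass, magnitude)) \
        / sum((x - mean_x) ** 2 for x in airmass)
    m0 = mean_y - k * mean_x                  # above-atmosphere magnitude
    print(f"extinction: {k:.3f} mag per airmass; m0 = {m0:.3f}")
    ```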

    Once Vega and Sirius are calibrated, astronomers can use those stars as steppingstones to calibrate the light from other stars. For instance, by using the same telescope, researchers can observe a set of slightly fainter stars—call them Set 2. The luminosity of those fainter stars can then be calibrated using Vega and Sirius as reference standards.

    Switching to a telescope large enough to observe both the newly calibrated Set 2, and a group of even fainter stars (call them Set 3), astronomers can calibrate the light from Set 3 in terms of Set 2. Astronomers can repeat the process as needed to calibrate light from extremely remote stars. In this way, astronomers will be able to transfer the NIST calibration of Vega and Sirius to stars that lie thousands to millions of light-years away.

    Next year, Deustua and Woodward will move their small telescope, now back at NIST, to the European Southern Observatory’s (ESO’s) Paranal Observatory in the high-altitude desert of northern Chile.

    With a drier climate than Mt. Hopkins, the Chilean site promises more clear nights for observing Sirius and Vega and less moisture to absorb or scatter their light. The telescope will reside on a mountaintop away from ESO’s Very Large Telescope, a suite of four 8.2-m telescopes and four 1.8-m auxiliary telescopes, so that light from NIST’s quartz lamp won’t interfere with observations of distant galaxies.

    The team also plans to expand its repertoire of bright nearby stars to include Arcturus (37 light-years), Gamma Crucis (89 light-years), and Gamma Trianguli Australis (184 light-years) and to observe stars at longer, infrared wavelengths. The recently launched James Webb Space Telescope and the Roman Space Telescope, set for launch by the end of the decade, are designed to examine the universe at these wavelengths.

    The NIST researchers recently received seed money to build a larger telescope that could observe and calibrate fainter, more distant stars. That would allow astronomers to transfer the NIST calibration to remote standard candles more directly. Reducing the number of stepping stones between the stars observed by NIST and the stars astronomers are studying reduces calibration errors.
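
    The point about stepping stones follows from how independent calibration errors compound: each transfer step contributes its fractional uncertainty, combined in quadrature. A sketch (the 1 percent figures are illustrative, not NIST’s actual error budget):

    ```python
    import math

    def chained_uncertainty(step_errors):
        """Combined fractional uncertainty of a chain of independent
        calibration transfers, added in quadrature."""
        return math.sqrt(sum(e * e for e in step_errors))

    # Three 1% transfer steps vs. a single direct 1% calibration:
    print(f"{chained_uncertainty([0.01, 0.01, 0.01]):.4f}")  # about 0.0173
    print(f"{chained_uncertainty([0.01]):.4f}")              # 0.0100
    ```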

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    NIST Campus, Gaithersburg, MD.

    The National Institute of Standards and Technology‘s Mission, Vision, Core Competencies, and Core Values

    Mission

    To promote U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology in ways that enhance economic security and improve our quality of life.

    NIST’s vision

    NIST will be the world’s leader in creating critical measurement solutions and promoting equitable standards. Our efforts stimulate innovation, foster industrial competitiveness, and improve the quality of life.

    NIST’s core competencies

    Measurement science
    Rigorous traceability
    Development and use of standards

    NIST’s core values

    NIST is an organization with strong values, reflected both in our history and our current work. NIST leadership and staff will uphold these values to ensure a high performing environment that is safe and respectful of all.

    Perseverance: We take the long view, planning the future with scientific knowledge and imagination to ensure continued impact and relevance for our stakeholders.
    Integrity: We are ethical, honest, independent, and provide an objective perspective.
    Inclusivity: We work collaboratively to harness the diversity of people and ideas, both inside and outside of NIST, to attain the best solutions to multidisciplinary challenges.
    Excellence: We apply rigor and critical thinking to achieve world-class results and continuous improvement in everything we do.

    Background

    The Articles of Confederation, ratified by the colonies in 1781, contained the clause, “The United States in Congress assembled shall also have the sole and exclusive right and power of regulating the alloy and value of coin struck by their own authority, or by that of the respective states—fixing the standards of weights and measures throughout the United States”. Article 1, section 8, of the Constitution of the United States (1789), transferred this power to Congress; “The Congress shall have power…To coin money, regulate the value thereof, and of foreign coin, and fix the standard of weights and measures”.

    In January 1790, President George Washington, in his first annual message to Congress stated that, “Uniformity in the currency, weights, and measures of the United States is an object of great importance, and will, I am persuaded, be duly attended to”, and ordered Secretary of State Thomas Jefferson to prepare a plan for Establishing Uniformity in the Coinage, Weights, and Measures of the United States, afterwards referred to as the Jefferson report. On October 25, 1791, Washington appealed a third time to Congress, “A uniformity of the weights and measures of the country is among the important objects submitted to you by the Constitution and if it can be derived from a standard at once invariable and universal, must be no less honorable to the public council than conducive to the public convenience”, but it was not until 1838, that a uniform set of standards was worked out. In 1821, John Quincy Adams had declared “Weights and measures may be ranked among the necessities of life to every individual of human society”.

    From 1830 until 1901, the role of overseeing weights and measures was carried out by the Office of Standard Weights and Measures, which was part of the U.S. Coast and Geodetic Survey in the Department of the Treasury.

    Bureau of Standards

    In 1901, in response to a bill proposed by Congressman James H. Southard (R-Ohio), the National Bureau of Standards was founded with the mandate to provide standard weights and measures and to serve as the national physical laboratory for the United States. (Southard had previously sponsored a bill for metric conversion of the United States.)

    President Theodore Roosevelt appointed Samuel W. Stratton as the first director. The budget for the first year of operation was $40,000. The Bureau took custody of the copies of the kilogram and meter bars that were the standards for US measures, and set up a program to provide metrology services for United States scientific and commercial users. A laboratory site was constructed in Washington DC (US) and instruments were acquired from the national physical laboratories of Europe. In addition to weights and measures the Bureau developed instruments for electrical units and for measurement of light. In 1905 a meeting was called that would be the first National Conference on Weights and Measures.

    Initially conceived as purely a metrology agency, the Bureau of Standards was directed by Herbert Hoover to set up divisions to develop commercial standards for materials and products. Some of these standards were for products intended for government use, but product standards also affected private-sector consumption. Quality standards were developed for products including some types of clothing; automobile brake systems and headlamps; antifreeze; and electrical safety. During World War I, the Bureau worked on multiple problems related to war production, even operating its own facility to produce optical glass when European supplies were cut off. Between the wars, Harry Diamond of the Bureau developed a blind-approach radio aircraft landing system. During World War II, military research and development was carried out, including development of radio propagation forecast methods; the proximity fuze and the standardized airframe used originally for Project Pigeon; and shortly afterwards the autonomously radar-guided Bat anti-ship guided bomb and the Kingfisher family of torpedo-carrying missiles.

    In 1948, financed by the United States Air Force, the Bureau began design and construction of SEAC: the Standards Eastern Automatic Computer. The computer went into operation in May 1950 using a combination of vacuum tubes and solid-state diode logic. About the same time the Standards Western Automatic Computer (SWAC) was built at the Los Angeles office of the NBS by Harry Huskey and used for research there. A mobile version, DYSEAC, was built for the Signal Corps in 1954.

    Due to a changing mission, the National Bureau of Standards became the National Institute of Standards and Technology in 1988.

    Following September 11, 2001, NIST conducted the official investigation into the collapse of the World Trade Center buildings.

    Organization

    NIST is headquartered in Gaithersburg, Maryland, and operates a facility in Boulder, Colorado, which was dedicated by President Eisenhower in 1954. NIST’s activities are organized into laboratory programs and extramural programs. Effective October 1, 2010, NIST was realigned by reducing the number of NIST laboratory units from ten to six. NIST Laboratories include:

    Communications Technology Laboratory (CTL)
    Engineering Laboratory (EL)
    Information Technology Laboratory (ITL)
    Center for Neutron Research (NCNR)
    Material Measurement Laboratory (MML)
    Physical Measurement Laboratory (PML)

    Extramural programs include:

    Hollings Manufacturing Extension Partnership (MEP), a nationwide network of centers to assist small and mid-sized manufacturers to create and retain jobs, improve efficiencies, and minimize waste through process improvements and to increase market penetration with innovation and growth strategies;
    Technology Innovation Program (TIP), a grant program where NIST and industry partners cost share the early-stage development of innovative but high-risk technologies;
    Baldrige Performance Excellence Program, which administers the Malcolm Baldrige National Quality Award, the nation’s highest award for performance and business excellence.

    NIST’s Boulder laboratories are best known for housing NIST‑F1, a cesium fountain atomic clock.

    NIST‑F1 serves as the source of the nation’s official time. From its measurement of the natural resonance frequency of cesium—which defines the second—NIST broadcasts time signals via longwave radio station WWVB near Fort Collins in Colorado, and shortwave radio stations WWV and WWVH, located near Fort Collins and Kekaha in Hawai’i, respectively.
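    The definition behind this is concrete: since 1967 the SI second has been exactly 9,192,631,770 cycles of the cesium-133 hyperfine transition. A minimal sketch of that conversion (the helper function name is ours, for illustration only):

```python
CESIUM_HZ = 9_192_631_770  # cycles of the Cs-133 hyperfine transition per SI second

def cycles_to_seconds(cycle_count):
    """Elapsed time implied by a count of cesium resonance cycles."""
    return cycle_count / CESIUM_HZ

# Counting exactly 9,192,631,770 cycles marks one second.
print(cycles_to_seconds(9_192_631_770))  # → 1.0
```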

    NIST also operates a neutron science user facility: the NIST Center for Neutron Research (NCNR).

    The NCNR provides scientists access to a variety of neutron scattering instruments which they use in many research fields (materials science; fuel cells; biotechnology etc.).

    The SURF III Synchrotron Ultraviolet Radiation Facility is a source of synchrotron radiation in continuous operation since 1961.

    SURF III now serves as the US national standard for source-based radiometry throughout the generalized optical spectrum. All NASA-borne extreme-ultraviolet observation instruments have been calibrated at SURF since the 1970s, and SURF is used for measurement and characterization of systems for extreme ultraviolet lithography.

    The Center for Nanoscale Science and Technology performs research in nanotechnology, both through internal research efforts and by running a user-accessible cleanroom nanomanufacturing facility.

    This “NanoFab” is equipped with tools for lithographic patterning and imaging (e.g., electron microscopes and atomic force microscopes).

    Committees

    NIST has seven standing committees:

    Technical Guidelines Development Committee (TGDC)
    Advisory Committee on Earthquake Hazards Reduction (ACEHR)
    National Construction Safety Team Advisory Committee (NCST Advisory Committee)
    Information Security and Privacy Advisory Board (ISPAB)
    Visiting Committee on Advanced Technology (VCAT)
    Board of Overseers for the Malcolm Baldrige National Quality Award (MBNQA Board of Overseers)
    Manufacturing Extension Partnership National Advisory Board (MEPNAB)

    Measurements and standards

    As part of its mission, NIST supplies industry, academia, government, and other users with over 1,300 Standard Reference Materials (SRMs). These artifacts are certified as having specific characteristics or component content, and are used as calibration standards for measuring equipment and procedures, as quality control benchmarks for industrial processes, and as experimental control samples.

    Handbook 44

    NIST publishes Handbook 44 each year after the annual meeting of the National Conference on Weights and Measures (NCWM). Each edition is developed through cooperation between the Committee on Specifications and Tolerances of the NCWM and the Weights and Measures Division (WMD) of NIST. The book partially fulfills the statutory responsibility for “cooperation with the states in securing uniformity of weights and measures laws and methods of inspection”.

    NIST has been publishing various forms of what is now the Handbook 44 since 1918 and began publication under the current name in 1949. The 2010 edition conforms to the concept of the primary use of the SI (metric) measurements recommended by the Omnibus Foreign Trade and Competitiveness Act of 1988.

     
  • richardmitnick 10:28 am on September 7, 2022 Permalink | Reply
    Tags: Dark Energy

    From The University of Oxford (UK): “‘Lopsided’ Universe could mean revision of Standard Cosmological Model – ΛCDM Model of Cosmology” 

    U Oxford bloc

    From The University of Oxford (UK)

    9.7.22


    Dr Sebastian von Hausegger and Professor Subir Sarkar from the Rudolf Peierls Centre for Theoretical Physics at Oxford, together with their collaborators Dr Nathan Secrest (US Naval Observatory, Washington), Dr Roya Mohayaee (Institut d’Astrophysique, Paris) and Dr Mohamed Rameez (Tata Institute of Fundamental Research, Mumbai), have made a surprising discovery about the Universe. Their paper is in press in The Astrophysical Journal Letters [below].

    The researchers used observations of over a million quasars and half a million radio sources to test the ‘cosmological principle’ which underlies modern cosmology. It says that when averaged on large scales the Universe is isotropic and homogeneous. This allows a simple mathematical description of space-time – the Friedmann-Lemaître-Robertson-Walker (FLRW) metric – which enormously simplifies the application of Albert Einstein’s General Theory of Relativity to the Universe as a whole, thus yielding the “standard cosmological model”. Interpretation of observational data in the framework of this model has however led to the astounding conclusion that about 70% of the Universe is in the form of a mysterious “dark energy” which is causing its expansion rate to accelerate.

    ___________________________________________________________________
    The Dark Energy Survey

    Dark Energy Camera [DECam] built at The DOE’s Fermi National Accelerator Laboratory.

    NOIRLab National Optical Astronomy Observatory Cerro Tololo Inter-American Observatory (CL) Víctor M. Blanco 4m Telescope, which houses the Dark Energy Camera (DECam), at Cerro Tololo, Chile, at an altitude of 7200 feet.

    NSF NOIRLab NOAO Cerro Tololo Inter-American Observatory (CL), approximately 80 km to the east of La Serena, Chile, at an altitude of 2200 meters.

    The Dark Energy Survey is an international, collaborative effort to map hundreds of millions of galaxies, detect thousands of supernovae, and find patterns of cosmic structure that will reveal the nature of the mysterious dark energy that is accelerating the expansion of our Universe. The Dark Energy Survey began searching the Southern skies on August 31, 2013.

    According to Albert Einstein’s Theory of General Relativity, gravity should lead to a slowing of the cosmic expansion. Yet, in 1998, two teams of astronomers studying distant supernovae made the remarkable discovery that the expansion of the universe is speeding up.
    Saul Perlmutter (center) [The Supernova Cosmology Project] shared the 2006 Shaw Prize in Astronomy, the 2011 Nobel Prize in Physics, and the 2015 Breakthrough Prize in Fundamental Physics with Brian P. Schmidt (right) and Adam Riess (left) [The High-z Supernova Search Team] for providing evidence that the expansion of the universe is accelerating.

    To explain cosmic acceleration, cosmologists are faced with two possibilities: either 70% of the universe exists in an exotic form, now called Dark Energy, that exhibits a gravitational force opposite to the attractive gravity of ordinary matter, or General Relativity must be replaced by a new theory of gravity on cosmic scales.

    The Dark Energy Survey is designed to probe the origin of the accelerating universe and help uncover the nature of Dark Energy by measuring the 14-billion-year history of cosmic expansion with high precision. More than 400 scientists from over 25 institutions in the United States, Spain, the United Kingdom, Brazil, Germany, Switzerland, and Australia are working on the project. The collaboration built and is using an extremely sensitive 570-Megapixel digital camera, DECam, mounted on the Blanco 4-meter telescope at Cerro Tololo Inter-American Observatory, high in the Chilean Andes, to carry out the project.

    Over six years (2013-2019), the Dark Energy Survey collaboration used 758 nights of observation to carry out a deep, wide-area survey to record information from 300 million galaxies that are billions of light-years from Earth. The survey imaged 5000 square degrees of the southern sky in five optical filters to obtain detailed information about each galaxy. A fraction of the survey time is used to observe smaller patches of sky roughly once a week to discover and study thousands of supernovae and other astrophysical transients.
    ___________________________________________________________________
    This has been interpreted as arising from the zero-point fluctuations of the quantum vacuum, with the associated energy scale set by H₀, the present rate of expansion of the universe. However, this is quite inexplicable in the successful Standard Model (quantum field theory) of fundamental interactions, the characteristic energy scale of which is higher by a factor of 10⁴⁴. So, while the standard cosmological model (called ΛCDM) describes the observational data well, its main component, dark energy, has no physical basis.
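    The factor of 10⁴⁴ can be checked on the back of an envelope: the energy scale associated with the expansion rate is ħH₀, and comparing it to the electroweak scale (~246 GeV, the characteristic scale of the Standard Model) gives the quoted mismatch. A sketch with assumed standard values:

```python
# Order-of-magnitude check of the 10^44 energy-scale mismatch (assumed values).
H0_km_s_Mpc = 70.0        # present Hubble rate, km/s/Mpc
Mpc_in_km = 3.0857e19     # kilometres per megaparsec
hbar_eV_s = 6.582e-16     # reduced Planck constant, eV·s

H0_per_s = H0_km_s_Mpc / Mpc_in_km   # H0 in s^-1
E_H0 = hbar_eV_s * H0_per_s          # energy scale hbar*H0, in eV
E_EW = 246e9                         # electroweak scale, ~246 GeV in eV

ratio = E_EW / E_H0
print(f"hbar*H0 ~ {E_H0:.2e} eV; Standard Model scale higher by ~{ratio:.1e}")
```

    With these inputs ħH₀ comes out around 10⁻³³ eV, some 44 orders of magnitude below the electroweak scale, consistent with the figure in the text.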

    Testing foundational assumptions

    This is what motivated the researchers to re-examine its underlying assumptions. Professor Sarkar says: “When the foundations of today’s standard cosmological model were laid a hundred years ago, there was no data. We didn’t even know then that we live in a galaxy – just one among a hundred billion others. Now that we do have data, we can, and should, test these foundational assumptions since a lot rests on them – in particular the inference that dark energy dominates the Universe.”

    In fact, the Universe today is manifestly not homogeneous and isotropic. Astronomical surveys reveal a filamentary structure of galaxies, clusters of galaxies, and superclusters of clusters … and this ‘cosmic web’ extends to the deepest scales currently probed of about 2 billion light years.

    The conventional wisdom is that, while clumpy on small scales, the distribution of matter becomes homogeneous when averaged on scales larger than about 300 million light years. The Hubble expansion is smooth and isotropic on large scales, while on small scales the gravitational effect of inhomogeneities gives rise to ‘peculiar’ velocities; e.g. our nearest neighbor, the Andromeda galaxy, is not receding in the Hubble flow – rather, it is falling towards us.

    Back in 1966, the cosmologist Dennis Sciama noted that because of this, the cosmic microwave background (CMB) radiation from the Big Bang could not be uniform on the sky.

    It must exhibit a ‘dipole anisotropy’, i.e. appear hotter in the direction of our local motion and colder in the opposite direction. This was indeed found soon afterwards and is attributed to our motion at about 370 km/s towards a particular direction (in the constellation of Crater). Accordingly, a special relativistic ‘boost’ is applied to all cosmological data (redshifts, apparent magnitudes etc.) to transform them to the reference frame in which the universe is isotropic, since it is in this ‘cosmic rest frame’ that the Friedmann-Lemaître equations of the standard cosmological model hold. Application of these equations to the corrected data then indicates that the Hubble expansion rate is accelerating, as if driven by Einstein’s Cosmological Constant ‘Λ’, aka dark energy.
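    The size of this dipole follows directly from the quoted velocity: to first order, ΔT/T = v/c. A quick check, using the standard mean CMB temperature of 2.725 K (an assumed value, not stated in the article):

```python
v = 370.0            # our velocity through the CMB frame, km/s (quoted above)
c = 299_792.458      # speed of light, km/s
T0 = 2.725           # mean CMB temperature, K (assumed standard value)

# First-order Doppler dipole: hotter/colder by (v/c) * T0.
dipole_mK = (v / c) * T0 * 1000.0
print(f"expected dipole amplitude ~ {dipole_mK:.2f} mK")
```

    This gives a few millikelvin, matching the amplitude of the measured CMB dipole.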

    The cosmological principle

    How can we check if this is true? If the dipole anisotropy in the CMB is due to our motion, then there must be a similar dipole in the sky distribution of all cosmologically distant sources. This is due to ‘aberration’ because of the finite speed of light – as was recognized by Oxford astronomer James Bradley in 1727, long before Albert Einstein’s formulation of the Special Theory of Relativity which predicts this effect. Such sources were first identified with radio telescopes; the relativist George Ellis and radio astronomer John Baldwin noted in 1984 that with a uniform sky map of at least a few hundred thousand such sources, this dipole could be measured and compared with the standard expectation. It was not however until this millennium that the first such data became available – the NRAO VLA Sky Survey (NVSS) catalogue of radio sources.
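    The Ellis & Baldwin expectation mentioned here has a simple closed form: for sources whose integral counts scale as N(>S) ∝ S^(−x) with spectral index α, the kinematic dipole amplitude is D = [2 + x(1 + α)] v/c. A sketch with commonly assumed values x ≈ 1 and α ≈ 0.75 (illustrative defaults, not the paper's fitted numbers):

```python
def ellis_baldwin_dipole(beta, x=1.0, alpha=0.75):
    """Expected kinematic dipole in source counts, D = [2 + x*(1 + alpha)] * beta,
    where beta = v/c. Defaults for x and alpha are typical assumed values."""
    return (2.0 + x * (1.0 + alpha)) * beta

beta = 370.0 / 299_792.458   # v/c for the CMB-inferred velocity
print(f"expected source-count dipole D ~ {ellis_baldwin_dipole(beta):.2e}")
```

    The expected amplitude is thus only about half a percent, which is why a uniform map of at least a few hundred thousand sources is needed to measure it.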

    The dipole amplitude turned out to be higher than expected, although its direction was consistent with that of the CMB. However, the uncertainties were large, so the significance of the discrepancy was not compelling. Two years ago, the present team of researchers upped the stakes by analyzing a bigger catalogue of 1.4 million quasars mapped by NASA’s Wide-field Infrared Survey Explorer (WISE).

    They found a similar discrepancy but at much higher significance. Dr von Hausegger comments: “If distant sources are not isotropic in the rest frame in which the CMB is isotropic, it implies a violation of the cosmological principle … which means going back to square one! So, we must now seek corroborating evidence to understand what causes this unexpected result.”

    In their recent paper, the researchers have addressed this by jointly analyzing the NVSS and WISE catalogues, after various detailed checks to demonstrate their suitability for the purpose. These catalogues are systematically independent and have almost no shared objects, so this is equivalent to performing two independent experiments. The dipoles in the two catalogues, made at widely different wavelengths, are found to be consistent with each other. The consistency of the two dipoles improves upon boosting to the frame in which the CMB is isotropic (assuming its dipole to be kinematic in origin), which suggests that cosmologically distant radio galaxies and quasars may have an intrinsic anisotropy in this frame. The joint significance of the discrepancy between the rest frames of radiation and matter now exceeds 5σ (i.e. a probability of less than 1 in 3.5 million of being a fluke). “This issue can no longer be ignored,” comments Professor Sarkar. “The validity of the FLRW metric itself is now in question!”
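    The quoted ‘less than 1 in 3.5 million’ is simply the one-sided Gaussian tail probability at 5σ, which can be verified with the complementary error function:

```python
import math

def one_sided_p(sigma):
    """One-sided Gaussian tail probability beyond `sigma` standard deviations."""
    return 0.5 * math.erfc(sigma / math.sqrt(2.0))

p = one_sided_p(5.0)
print(f"p(5 sigma) ~ {p:.2e}, i.e. about 1 in {1 / p:,.0f}")
```

    This evaluates to roughly 2.9 × 10⁻⁷, or about 1 in 3.5 million, matching the figure in the text.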

    Potential paradigm-changing finding

    New data with which to check this potentially paradigm-changing finding will soon come from the Legacy Survey of Space and Time (LSST) to be carried out at the Vera C Rubin Observatory in Chile.

    Oxford Physics is closely involved in this project, along with many other institutions in the UK and all over the world. Professor Ian Shipsey who has been a member of LSST since 2008, is excited about the prospect of carrying out fundamental cosmological tests. ‘As a particle physicist, I am acutely aware that the foundations of the Standard Model of particle physics are constantly under scrutiny.

    One of the reasons I joined LSST, and have worked for so long on it, is precisely to enable powerful tests of the foundations of the standard cosmological model,’ he says. To this end, Dr Hausegger and Professor Sarkar are leading projects in the LSST Dark Energy Science Collaboration to use the forthcoming data to test the homogeneity and isotropy of the Universe. ‘We will soon know if the standard cosmological model and the inference of dark energy are indeed valid,’ concludes Professor Sarkar.

    Science paper:
    The Astrophysical Journal Letters

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    U Oxford campus

    The University of Oxford

    Universitas Oxoniensis

    The University of Oxford [a.k.a. The Chancellor, Masters and Scholars of the University of Oxford] is a collegiate research university in Oxford, England. There is evidence of teaching as early as 1096, making it the oldest university in the English-speaking world and the world’s second-oldest university in continuous operation. It grew rapidly from 1167 when Henry II banned English students from attending the University of Paris [Université de Paris] (FR). After disputes between students and Oxford townsfolk in 1209, some academics fled north-east to Cambridge where they established what became the University of Cambridge (UK). The two English ancient universities share many common features and are jointly referred to as Oxbridge.

    The university is made up of thirty-nine semi-autonomous constituent colleges, six permanent private halls, and a range of academic departments which are organized into four divisions. All the colleges are self-governing institutions within the university, each controlling its own membership and with its own internal structure and activities. All students are members of a college. It does not have a main campus, and its buildings and facilities are scattered throughout the city centre. Undergraduate teaching at Oxford consists of lectures, small-group tutorials at the colleges and halls, seminars, laboratory work and occasionally further tutorials provided by the central university faculties and departments. Postgraduate teaching is provided predominantly centrally.

    Oxford operates the world’s oldest university museum, as well as the largest university press in the world and the largest academic library system nationwide. In the fiscal year ending 31 July 2019, the university had a total income of £2.45 billion, of which £624.8 million was from research grants and contracts.

    Oxford has educated a wide range of notable alumni, including 28 prime ministers of the United Kingdom and many heads of state and government around the world. As of October 2020, 72 Nobel Prize laureates, 3 Fields Medalists, and 6 Turing Award winners have studied, worked, or held visiting fellowships at the University of Oxford, while its alumni have won 160 Olympic medals. Oxford is the home of numerous scholarships, including the Rhodes Scholarship, one of the oldest international graduate scholarship programmes.

    The University of Oxford’s foundation date is unknown. It is known that teaching at Oxford existed in some form as early as 1096, but it is unclear when a university came into being.

    It grew quickly from 1167 when English students returned from The University of Paris-Sorbonne [Université de Paris-Sorbonne](FR). The historian Gerald of Wales lectured to such scholars in 1188, and the first known foreign scholar, Emo of Friesland, arrived in 1190. The head of the university had the title of chancellor from at least 1201, and the masters were recognized as a universitas or corporation in 1231. The university was granted a royal charter in 1248 during the reign of King Henry III.

    The students associated together on the basis of geographical origins, into two ‘nations’, representing the North (northerners or Boreales, who included the English people from north of the River Trent and the Scots) and the South (southerners or Australes, who included English people from south of the Trent, the Irish and the Welsh). In later centuries, geographical origins continued to influence many students’ affiliations when membership of a college or hall became customary in Oxford. In addition, members of many religious orders, including Dominicans, Franciscans, Carmelites and Augustinians, settled in Oxford in the mid-13th century, gained influence and maintained houses or halls for students. At about the same time, private benefactors established colleges as self-contained scholarly communities. Among the earliest such founders were William of Durham, who in 1249 endowed University College, and John Balliol, father of a future King of Scots; Balliol College bears his name. Another founder, Walter de Merton, a Lord Chancellor of England and afterwards Bishop of Rochester, devised a series of regulations for college life. Merton College thereby became the model for such establishments at Oxford, as well as at the University of Cambridge. Thereafter, an increasing number of students lived in colleges rather than in halls and religious houses.

    In 1333–1334, an attempt by some dissatisfied Oxford scholars to found a new university at Stamford, Lincolnshire, was blocked by the universities of Oxford and Cambridge petitioning King Edward III. Thereafter, until the 1820s, no new universities were allowed to be founded in England, even in London; thus, Oxford and Cambridge had a duopoly, which was unusual in large western European countries.

    The new learning of the Renaissance greatly influenced Oxford from the late 15th century onwards. Among university scholars of the period were William Grocyn, who contributed to the revival of Greek language studies, and John Colet, the noted biblical scholar.

    With the English Reformation and the breaking of communion with the Roman Catholic Church, recusant scholars from Oxford fled to continental Europe, settling especially at the University of Douai. The method of teaching at Oxford was transformed from the medieval scholastic method to Renaissance education, although institutions associated with the university suffered losses of land and revenues. As a centre of learning and scholarship, Oxford’s reputation declined in the Age of Enlightenment; enrollments fell and teaching was neglected.

    In 1636, William Laud, the chancellor and Archbishop of Canterbury, codified the university’s statutes. These, to a large extent, remained its governing regulations until the mid-19th century. Laud was also responsible for the granting of a charter securing privileges for The University Press, and he made significant contributions to the Bodleian Library, the main library of the university. From the beginnings of the Church of England as the established church until 1866, membership of the church was a requirement to receive the BA degree from the university and “dissenters” were only permitted to receive the MA in 1871.

    The university was a centre of the Royalist party during the English Civil War (1642–1649), while the town favored the opposing Parliamentarian cause. From the mid-18th century onwards, however, the university took little part in political conflicts.

    Wadham College, founded in 1610, was the undergraduate college of Sir Christopher Wren. Wren was part of a brilliant group of experimental scientists at Oxford in the 1650s, the Oxford Philosophical Club, which included Robert Boyle and Robert Hooke. This group held regular meetings at Wadham under the guidance of the college’s Warden, John Wilkins, and the group formed the nucleus that went on to found the Royal Society.

    Before reforms in the early 19th century, the curriculum at Oxford was notoriously narrow and impractical. Sir Spencer Walpole, a historian of contemporary Britain and a senior government official, had not attended any university. He said, “Few medical men, few solicitors, few persons intended for commerce or trade, ever dreamed of passing through a university career.” He quoted the Oxford University Commissioners in 1852 stating: “The education imparted at Oxford was not such as to conduce to the advancement in life of many persons, except those intended for the ministry.” Nevertheless, Walpole argued:

    “Among the many deficiencies attending a university education there was, however, one good thing about it, and that was the education which the undergraduates gave themselves. It was impossible to collect some thousand or twelve hundred of the best young men in England, to give them the opportunity of making acquaintance with one another, and full liberty to live their lives in their own way, without evolving in the best among them, some admirable qualities of loyalty, independence, and self-control. If the average undergraduate carried from university little or no learning, which was of any service to him, he carried from it a knowledge of men and respect for his fellows and himself, a reverence for the past, a code of honor for the present, which could not but be serviceable. He had enjoyed opportunities… of intercourse with men, some of whom were certain to rise to the highest places in the Senate, in the Church, or at the Bar. He might have mixed with them in his sports, in his studies, and perhaps in his debating society; and any associations which he had thus formed had been useful to him at the time, and might be a source of satisfaction to him in after life.”

    Out of the students who matriculated in 1840, 65% were sons of professionals (34% were Anglican ministers). After graduation, 87% became professionals (59% as Anglican clergy). Out of the students who matriculated in 1870, 59% were sons of professionals (25% were Anglican ministers). After graduation, 87% became professionals (42% as Anglican clergy).

    M. C. Curthoys and H. S. Jones argue that the rise of organized sport was one of the most remarkable and distinctive features of the history of the universities of Oxford and Cambridge in the late 19th and early 20th centuries. It was carried over from the athleticism prevalent at the public schools such as Eton, Winchester, Shrewsbury, and Harrow.

    All students, regardless of their chosen area of study, were required to spend (at least) their first year preparing for a first-year examination that was heavily focused on classical languages. Science students found this particularly burdensome and supported a separate science degree with Greek language study removed from their required courses. This concept of a Bachelor of Science had been adopted at other European universities (The University of London (UK) had implemented it in 1860) but an 1880 proposal at Oxford to replace the classical requirement with a modern language (like German or French) was unsuccessful. After considerable internal wrangling over the structure of the arts curriculum, in 1886 the “natural science preliminary” was recognized as a qualifying part of the first-year examination.

    At the start of 1914, the university housed about 3,000 undergraduates and about 100 postgraduate students. During the First World War, many undergraduates and fellows joined the armed forces. By 1918 virtually all fellows were in uniform, and the student population in residence was reduced to 12 per cent of the pre-war total. The University Roll of Service records that, in total, 14,792 members of the university served in the war, with 2,716 (18.36%) killed. Not all the members of the university who served in the Great War were on the Allied side; there is a remarkable memorial to members of New College who served in the German armed forces, bearing the inscription, ‘In memory of the men of this college who coming from a foreign land entered into the inheritance of this place and returning fought and died for their country in the war 1914–1918’. During the war years the university buildings became hospitals, cadet schools and military training camps.

    Reforms

    Two parliamentary commissions in 1852 issued recommendations for Oxford and Cambridge. Archibald Campbell Tait, former headmaster of Rugby School, was a key member of the Oxford Commission; he wanted Oxford to follow the German and Scottish model in which the professorship was paramount. The commission’s report envisioned a centralized university run predominantly by professors and faculties, with a much stronger emphasis on research. The professional staff should be strengthened and better paid. For students, restrictions on entry should be dropped, and more opportunities given to poorer families. It called for an enlargement of the curriculum, with honors to be awarded in many new fields. Undergraduate scholarships should be open to all Britons. Graduate fellowships should be opened up to all members of the university. It recommended that fellows be released from an obligation for ordination. Students were to be allowed to save money by boarding in the city, instead of in a college.

    The system of separate honor schools for different subjects began in 1802, with Mathematics and Literae Humaniores. Schools of “Natural Sciences” and “Law, and Modern History” were added in 1853. By 1872, the last of these had split into “Jurisprudence” and “Modern History”. Theology became the sixth honor school. In addition to these B.A. Honors degrees, the postgraduate Bachelor of Civil Law (B.C.L.) was, and still is, offered.

    The mid-19th century saw the impact of the Oxford Movement (1833–1845), led among others by the future Cardinal John Henry Newman. The influence of the reformed model of German universities reached Oxford via key scholars such as Edward Bouverie Pusey, Benjamin Jowett and Max Müller.

    Administrative reforms during the 19th century included the replacement of oral examinations with written entrance tests, greater tolerance for religious dissent, and the establishment of four women’s colleges. Privy Council decisions in the 20th century (e.g. the abolition of compulsory daily worship, dissociation of the Regius Professorship of Hebrew from clerical status, diversion of colleges’ theological bequests to other purposes) loosened the link with traditional belief and practice. Furthermore, although the university’s emphasis had historically been on classical knowledge, its curriculum expanded during the 19th century to include scientific and medical studies. Knowledge of Ancient Greek was required for admission until 1920, and Latin until 1960.

    The University of Oxford began to award doctorates for research in the first third of the 20th century. The first Oxford D.Phil. in mathematics was awarded in 1921.

    The mid-20th century saw many distinguished continental scholars, displaced by Nazism and communism, relocating to Oxford.

    The list of distinguished scholars at the University of Oxford is long and includes many who have made major contributions to politics, the sciences, medicine, and literature. As of October 2020, 72 Nobel laureates and more than 50 world leaders have been affiliated with the University of Oxford.

    To be a member of the university, all students, and most academic staff, must also be a member of a college or hall. There are thirty-nine colleges of the University of Oxford (including Reuben College, planned to admit students in 2021) and six permanent private halls (PPHs), each controlling its membership and with its own internal structure and activities. Not all colleges offer all courses, but they generally cover a broad range of subjects.

    The colleges are:

    All Souls College
    Balliol College
    Brasenose College
    Christ Church
    Corpus Christi College
    Exeter College
    Green Templeton College
    Harris Manchester College
    Hertford College
    Jesus College
    Keble College
    Kellogg College
    Lady Margaret Hall
    Linacre College
    Lincoln College
    Magdalen College
    Mansfield College
    Merton College
    New College
    Nuffield College
    Oriel College
    Pembroke College
    The Queen’s College
    Reuben College
    St Anne’s College
    St Antony’s College
    St Catherine’s College
    St Cross College
    St Edmund Hall
    St Hilda’s College
    St Hugh’s College
    St John’s College
    St Peter’s College
    Somerville College
    Trinity College
    University College
    Wadham College
    Wolfson College
    Worcester College

    The permanent private halls were founded by different Christian denominations. One difference between a college and a PPH is that whereas colleges are governed by the fellows of the college, the governance of a PPH resides, at least in part, with the corresponding Christian denomination. The six current PPHs are:

    Blackfriars
    Campion Hall
    Regent’s Park College
    St Benet’s Hall
    St Stephen’s House
    Wycliffe Hall

    The PPHs and colleges join as the Conference of Colleges, which represents the common concerns of the several colleges of the university, to discuss matters of shared interest and to act collectively when necessary, such as in dealings with the central university. The Conference of Colleges was established as a recommendation of the Franks Commission in 1965.

    Teaching members of the colleges (i.e., fellows and tutors) are collectively and familiarly known as dons, although the term is rarely used by the university itself. In addition to residential and dining facilities, the colleges provide social, cultural, and recreational activities for their members. Colleges have responsibility for admitting undergraduates and organizing their tuition; for graduates, this responsibility falls upon the departments. There is no common title for the heads of colleges: the titles used include Warden, Provost, Principal, President, Rector, Master and Dean.

    Oxford is regularly ranked within the top 5 universities in the world and is currently ranked first in the world in the Times Higher Education World University Rankings, as well as in Forbes’s World University Rankings. It held the number one position in The Times Good University Guide for eleven consecutive years, and the medical school has also maintained first place in the “Clinical, Pre-Clinical & Health” table of The Times Higher Education World University Rankings for the past seven consecutive years. In 2021, it was ranked sixth among universities worldwide by SCImago Institutions Rankings. The Times Higher Education has also recognised Oxford as one of the world’s “six super brands” in its World Reputation Rankings, along with The University of California-Berkeley, The University of Cambridge (UK), Harvard University, The Massachusetts Institute of Technology, and Stanford University. The university is fifth worldwide in the US News ranking. Its Saïd Business School came 13th in the world in The Financial Times Global MBA Ranking.
    Oxford was ranked ninth in the world in 2015 by The Nature Index, which measures the largest contributors to papers published in 82 leading journals. It is ranked fifth best university worldwide and first in Britain for forming CEOs according to The Professional Ranking World Universities, and first in the UK for the quality of its graduates as chosen by the recruiters of the UK’s major companies.

    In the 2018 Complete University Guide, all 38 subjects offered by Oxford ranked within the top 10 nationally, meaning Oxford was one of only two multi-faculty universities in the UK (along with Cambridge) to have 100% of its subjects in the top 10. Computer Science, Medicine, Philosophy, Politics and Psychology were ranked first in the UK by the guide.

    According to The QS World University Rankings by Subject, the University of Oxford also ranks as number one in the world for four Humanities disciplines: English Language and Literature, Modern Languages, Geography, and History. It also ranks second globally for Anthropology, Archaeology, Law, Medicine, Politics & International Studies, and Psychology.

     
  • richardmitnick 11:22 am on July 10, 2022 Permalink | Reply
    Tags: , "Do you see new physics in my CMB?", , , , Can You See Dark Matter and Dark Energy?, cosmic birefringence, , Dark Energy, , , ,   

    From astrobites : “Do you see new physics in my CMB?” 

    Astrobites bloc

    From astrobites

    Jul 9, 2022
    Kayla Kornoelje

    Title: New physics from the polarised light of the cosmic microwave background
    Authors: Eiichiro Komatsu
    First Author’s Institution: Max-Planck-Institut für Astrophysik, Karl-Schwarzschild Str. 1, 85741 Garching, Germany
    Status: Submitted to ArXiv [28 Feb 2022]

    Astronomers have painted an extraordinary picture of our Universe with the standard cosmological model, ΛCDM.

    The only problem is that astronomers don’t exactly know what ΛCDM really is. What are Dark Energy and Dark Matter? What is the physics behind Inflation? The answers to these fundamental questions in cosmology could be hidden right inside your TV.

    ___________________________________________________________________
    The Dark Energy Survey

    Dark Energy Camera [DECam] built at The DOE’s Fermi National Accelerator Laboratory.

    NOIRLab National Optical Astronomy Observatory Cerro Tololo Inter-American Observatory(CL) Victor M Blanco 4m Telescope which houses the Dark-Energy-Camera – DECam at Cerro Tololo, Chile at an altitude of 7200 feet.

    NOIRLabNSF NOIRLab NOAO Cerro Tololo Inter-American Observatory(CL) approximately 80 km to the East of La Serena, Chile, at an altitude of 2200 meters.

    Timeline of the Inflationary Universe WMAP.

    The Dark Energy Survey is an international, collaborative effort to map hundreds of millions of galaxies, detect thousands of supernovae, and find patterns of cosmic structure that will reveal the nature of the mysterious dark energy that is accelerating the expansion of our Universe. The Dark Energy Survey began searching the Southern skies on August 31, 2013.

    According to Albert Einstein’s Theory of General Relativity, gravity should lead to a slowing of the cosmic expansion. Yet, in 1998, two teams of astronomers studying distant supernovae made the remarkable discovery that the expansion of the universe is speeding up.
    Saul Perlmutter (center) [The Supernova Cosmology Project] shared the 2006 Shaw Prize in Astronomy, the 2011 Nobel Prize in Physics, and the 2015 Breakthrough Prize in Fundamental Physics with Brian P. Schmidt (right) and Adam Riess (left) [The High-z Supernova Search Team] for providing evidence that the expansion of the universe is accelerating.

    To explain cosmic acceleration, cosmologists are faced with two possibilities: either 70% of the universe exists in an exotic form, now called Dark Energy, that exhibits a gravitational force opposite to the attractive gravity of ordinary matter, or General Relativity must be replaced by a new theory of gravity on cosmic scales.
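The two quoted percentages map directly onto the deceleration parameter of textbook FLRW cosmology. A minimal sketch, assuming the standard flat-universe values Ωm ≈ 0.3 and ΩΛ ≈ 0.7 (illustrative textbook numbers, not the survey’s own pipeline):

```python
# Deceleration parameter for a flat universe with matter and a cosmological
# constant: q0 = Omega_m / 2 - Omega_Lambda. q0 < 0 means the expansion is
# accelerating. (A minimal textbook sketch; radiation is neglected.)

def deceleration_parameter(omega_m, omega_lambda):
    """q0 for a flat FLRW universe with matter and a cosmological constant."""
    return omega_m / 2 - omega_lambda

# Roughly the measured composition: ~30% matter, ~70% dark energy.
print(f"q0 = {deceleration_parameter(0.3, 0.7):.2f}")   # negative: accelerating

# A matter-only universe would decelerate:
print(f"q0 = {deceleration_parameter(1.0, 0.0):.2f}")   # positive: decelerating
```

With ~70% dark energy the sign of q0 flips, which is the quantitative sense in which dark energy “exhibits a gravitational force opposite to the attractive gravity of ordinary matter.”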

    The Dark Energy Survey is designed to probe the origin of the accelerating universe and help uncover the nature of Dark Energy by measuring the 14-billion-year history of cosmic expansion with high precision. More than 400 scientists from over 25 institutions in the United States, Spain, the United Kingdom, Brazil, Germany, Switzerland, and Australia are working on the project. The collaboration built and is using an extremely sensitive 570-Megapixel digital camera, DECam, mounted on the Blanco 4-meter telescope at Cerro Tololo Inter-American Observatory, high in the Chilean Andes, to carry out the project.

    Over six years (2013-2019), the Dark Energy Survey collaboration used 758 nights of observation to carry out a deep, wide-area survey to record information from 300 million galaxies that are billions of light-years from Earth. The survey imaged 5000 square degrees of the southern sky in five optical filters to obtain detailed information about each galaxy. A fraction of the survey time is used to observe smaller patches of sky roughly once a week to discover and study thousands of supernovae and other astrophysical transients.
    ___________________________________________________________________

    Dark Matter Background
    Fritz Zwicky discovered dark matter in the 1930s when observing the movement of the Coma Cluster. Vera Rubin, a woman in STEM denied the Nobel Prize, did most of the work on dark matter some 30 years later.

    Fritz Zwicky.
    Coma cluster via NASA/ESA Hubble, the original example of Dark Matter discovered during observations by Fritz Zwicky and confirmed 30 years later by Vera Rubin.
    In modern times, it was astronomer Fritz Zwicky, in the 1930s, who made the first observations of what we now call dark matter. His 1933 observations of the Coma Cluster of galaxies seemed to indicate that it had a mass 500 times greater than that previously calculated by Edwin Hubble. Furthermore, this extra mass seemed to be completely invisible. Although Zwicky’s observations were initially met with much skepticism, they were later confirmed by other groups of astronomers.

    Thirty years later, astronomer Vera Rubin provided a huge piece of evidence for the existence of dark matter. She discovered that the centers of galaxies rotate at the same speed as their extremities, whereas, of course, they should rotate faster. Think of a vinyl LP on a record deck: its center rotates faster than its edge. That’s what logic dictates we should see in galaxies too. But we do not. The only way to explain this is if the whole galaxy is only the center of some much larger structure, as if it is only the label on the LP so to speak, causing the galaxy to have a consistent rotation speed from center to edge.
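The record-player analogy can be checked with Newtonian orbits: v = √(GM(<r)/r). A minimal sketch with illustrative round numbers (not Rubin’s actual data): if all of a galaxy’s mass sat at its center, orbital speed would fall off as 1/√r, but if a dark halo’s enclosed mass grows linearly with radius, the rotation curve stays flat.

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2
KPC = 3.086e19  # one kiloparsec in metres

def circular_velocity(enclosed_mass_kg, radius_m):
    """Newtonian circular orbital speed: v = sqrt(G * M(<r) / r)."""
    return math.sqrt(G * enclosed_mass_kg / radius_m)

M_VISIBLE = 1e41  # kg; order of a galaxy's luminous mass (illustrative)

for r_kpc in (5, 10, 20, 40):
    r = r_kpc * KPC
    # Visible matter only, treated as central: v falls as 1/sqrt(r).
    v_visible = circular_velocity(M_VISIBLE, r)
    # Isothermal dark halo: M(<r) grows linearly with r, so v stays flat --
    # the behaviour Rubin actually observed.
    v_halo = circular_velocity(M_VISIBLE * (r_kpc / 5), r)
    print(f"r={r_kpc:>2} kpc: visible-only {v_visible/1e3:6.1f} km/s, "
          f"with halo {v_halo/1e3:6.1f} km/s")
```

The visible-only curve drops by a factor of √8 between 5 and 40 kpc, while the halo curve does not drop at all; the gap between the two is the “missing structure” Rubin attributed to dark matter.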

    Vera Rubin, following Zwicky, postulated that the missing structure in galaxies is dark matter. Her ideas were met with much resistance from the astronomical community, but her observations have been confirmed and are seen today as pivotal proof of the existence of dark matter.
    Astronomer Vera Rubin at the Lowell Observatory in 1965, worked on Dark Matter (The Carnegie Institution for Science).

    Vera Rubin, with Department of Terrestrial Magnetism (DTM) image tube spectrograph attached to the Kitt Peak 84-inch telescope, 1970.

    Vera Rubin measuring spectra, worked on Dark Matter(Emilio Segre Visual Archives AIP SPL).
    Dark Matter Research

    Super Cryogenic Dark Matter Search from DOE’s SLAC National Accelerator Laboratory (US) at Stanford University (US) at SNOLAB (Vale Inco Mine, Sudbury, Canada).

    LBNL LZ Dark Matter Experiment (US) xenon detector at Sanford Underground Research Facility(US) Credit: Matt Kapust.

    Lambda Cold Dark Matter, accelerated expansion of the universe (scinotions.com, “The cosmic inflation suggests the existence of parallel universes”). Credit: Alex Mittelmann.

    DAMA at Gran Sasso uses sodium iodide housed in copper to hunt for dark matter LNGS-INFN.

    Yale HAYSTAC axion dark matter experiment at Yale’s Wright Lab.

    DEAP Dark Matter detector, The DEAP-3600, suspended in the SNOLAB (CA) deep in Sudbury’s Creighton Mine.

    The LBNL LZ Dark Matter Experiment (US) Dark Matter project at SURF, Lead, SD, USA.

    DAMA-LIBRA Dark Matter experiment at the Italian National Institute for Nuclear Physics’ (INFN’s) Gran Sasso National Laboratories (LNGS) located in the Abruzzo region of central Italy.

    DARWIN Dark Matter experiment. A design study for a next-generation, multi-ton dark matter detector in Europe at The University of Zurich [Universität Zürich](CH).

    PandaX II Dark Matter experiment at Jin-ping Underground Laboratory (CJPL) in Sichuan, China.

    Inside the Axion Dark Matter eXperiment U Washington (US) Credit : Mark Stone U. of Washington. Axion Dark Matter Experiment.
    ___________________________________________________________________
    Cosmic Inflation Theory

    In physical cosmology, cosmic inflation (or cosmological inflation) is a theory of exponential expansion of space in the early universe. The inflationary epoch lasted from 10^−36 seconds after the conjectured Big Bang singularity to some time between 10^−33 and 10^−32 seconds after the singularity. Following the inflationary period, the universe continued to expand, but at a slower rate. The acceleration of this expansion due to dark energy began after the universe was already over 7.7 billion years old (5.4 billion years ago).
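The scale of that exponential growth is easy to work out: with a(t) ∝ e^{Ht}, a duration Δt corresponds to N = HΔt “e-folds” of expansion. A quick sketch (the ~60 e-fold benchmark is a standard textbook figure for solving the horizon and flatness problems, not a number from this article):

```python
import math

# During inflation the scale factor grows exponentially: a(t) = a0 * exp(H*t).
# N = H * dt e-folds of inflation expand the universe by a factor exp(N).
def expansion_factor(n_efolds):
    """Growth of the scale factor after n e-folds of exponential expansion."""
    return math.exp(n_efolds)

for n in (20, 40, 60):
    print(f"{n} e-folds -> expansion by a factor of {expansion_factor(n):.2e}")
```

Sixty e-folds stretch the universe by a factor of more than 10^26, which is why quantum fluctuations of microscopic size can end up seeding structure on cosmic scales.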

    Inflation theory was developed in the late 1970s and early 1980s, with notable contributions by several theoretical physicists, including Alexei Starobinsky at the Landau Institute for Theoretical Physics, Alan Guth at Cornell University, and Andrei Linde at the Lebedev Physical Institute. Alexei Starobinsky, Alan Guth, and Andrei Linde won the 2014 Kavli Prize “for pioneering the theory of cosmic inflation.” The theory explains the origin of the large-scale structure of the cosmos: quantum fluctuations in the microscopic inflationary region, magnified to cosmic size, become the seeds for the growth of structure in the Universe. Many physicists also believe that inflation explains why the universe appears to be the same in all directions (isotropic), why the cosmic microwave background radiation is distributed evenly, why the universe is flat, and why no magnetic monopoles have been observed.

    The detailed particle physics mechanism responsible for inflation is unknown. The basic inflationary paradigm is accepted by most physicists, as a number of inflation model predictions have been confirmed by observation; however, a substantial minority of scientists dissent from this position. The hypothetical field thought to be responsible for inflation is called the inflaton.

    In 2002 three of the original architects of the theory were recognized for their major contributions; physicists Alan Guth of M.I.T., Andrei Linde of Stanford, and Paul Steinhardt of Princeton shared the prestigious Dirac Prize “for development of the concept of inflation in cosmology”. In 2012 Guth and Linde were awarded the Breakthrough Prize in Fundamental Physics for their invention and development of inflationary cosmology.

    4
    Alan Guth, from M.I.T., who first proposed Cosmic Inflation.

    Lambda Cold Dark Matter, accelerated expansion of the universe (scinotions.com, “The cosmic inflation suggests the existence of parallel universes”). Credit: Alex Mittelmann.

    Alan Guth’s notes:
    Alan Guth’s original notes on inflation.
    ___________________________________________________________________

    The cosmic microwave background (CMB) is leftover radiation from the Big Bang.

    It’s some of the oldest light in the Universe, and yes, you can see that light in TV static!

    The CMB is rich with data that carries profound information about cosmology just waiting to be understood, but most important for our discussion today are the properties of the CMB’s polarization.

    Can You See Dark Matter and Dark Energy?

    3
    Figure 1: An illustration of cosmic birefringence. The left and right images represent the CMB before (left) and after (right) its photons begin to travel towards us. Notice that the CMB photon’s polarization is rotated by an angle β, which represents the rotation due to cosmic birefringence. This changes the polarization pattern (black lines in the image) of the CMB. Figure 3 in the paper.

    First, let’s try and answer our first question: what is the nature of dark matter and dark energy? When the CMB was formed around 380,000 years after the Big Bang, the Universe was hot, dense, and filled with electrons. As photons from the CMB made their long journey towards us, they scattered off of these electrons. From these scattering interactions at the appropriately named surface of last scattering, CMB photons naturally got linearly polarized at some specific angle, and some astronomers are on the hunt for a rotation of this initial polarization angle, called cosmic birefringence. This is exactly like the birefringence of a crystal, as light passing through a crystal can also be deflected at an angle relative to its initial path. The biggest difference between these two types of birefringence is merely that the photons from the CMB are polarized due to an energy field rather than a crystal. Some astronomers theorize that this energy field could be related to dark matter and dark energy, so a detection of this cosmic birefringence could tell us a lot about the ‘dark side’ of cosmology. Not only would a detection rule out Einstein’s cosmological constant as the origin of dark energy, but it would also tell us about the physics behind it. Also, since cosmic birefringence isn’t predicted by the standard ΛCDM cosmological model, it would also provide evidence for entirely new physics!
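In terms of Stokes parameters, the rotation being hunted for can be sketched in a few lines. A minimal sketch (the sign convention for the rotation varies between papers; the EB cross-spectrum formula is the standard one used in birefringence searches):

```python
import math

def rotate_stokes(Q, U, beta_deg):
    """Rotate the linear-polarization Stokes parameters by an angle beta.
    Convention used here: (Q + iU) -> exp(2i*beta) * (Q + iU)."""
    b = math.radians(2 * beta_deg)
    return (Q * math.cos(b) - U * math.sin(b),
            Q * math.sin(b) + U * math.cos(b))

def eb_spectrum(cl_ee, cl_bb, beta_deg):
    """Birefringence-induced EB cross-spectrum:
    C_l^EB = 0.5 * sin(4*beta) * (C_l^EE - C_l^BB)."""
    return 0.5 * math.sin(math.radians(4 * beta_deg)) * (cl_ee - cl_bb)

# A 0.3-degree rotation barely changes Q and U ...
print(rotate_stokes(1.0, 0.0, 0.3))
# ... but it generates a small non-zero EE-BB cross-correlation where
# standard LambdaCDM predicts exactly zero (illustrative spectrum values).
print(eb_spectrum(cl_ee=1.0, cl_bb=0.01, beta_deg=0.3))
```

The reason EB is the smoking-gun observable is visible in the second function: any non-zero β turns the (large) difference between the EE and BB spectra into an EB correlation that ΛCDM forbids.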

    Through an analysis of Planck polarization data, the author of today’s paper has found a tantalizing hint of cosmic birefringence. Using the latest reprocessing of Planck data, the author found a weak signal of cosmic birefringence corresponding to an angle of β = 0.30° ± 0.11°. However, while this is an exciting result, it is not conclusive enough to call a true detection of cosmic birefringence just yet, owing to limitations in the precision of the measurements of the initial rotation angle, along with other possible systematic effects.
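For a sense of scale, the quoted angle and uncertainty correspond to roughly a 2.7σ result under a simple Gaussian estimate (cosmologists conventionally reserve the word “detection” for 5σ):

```python
# Back-of-the-envelope significance of the measured rotation angle:
# how many standard deviations beta = 0.30 deg sits from zero,
# assuming Gaussian errors.
beta, sigma = 0.30, 0.11
significance = beta / sigma
print(f"{significance:.1f} sigma")  # suggestive, but well short of 5 sigma
```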

    Can You See Inflation?

    So, we haven’t detected cosmic birefringence, and we still don’t fully understand the nature of dark matter and dark energy. But what about inflation? While data from the CMB already provide support for inflation, astronomers are still on the lookout for a key piece of evidence: B-modes. Polarization angles from the CMB can be decomposed into two types of modes: E-modes, which describe parallel or perpendicular angles, and B-modes, which describe 45° angles. B-modes are important evidence for the inflationary model because the gravitational waves produced by inflation are the dominant contributor to B-modes. A detection of these B-modes would not only provide strong evidence for inflation, but also reveal the physics behind it through analysis of their shape and properties. Although these modes haven’t been detected yet either, by using one potential model of inflation, today’s author has shown that their detection may be possible (see Figure 2).

    3
    Figure 2: Plot of the B-mode power spectrum, which describes the power and properties of B-modes, as a function of multipole, which loosely describes angular size. The main takeaway is that at low multipoles (around 2 – 10), the energy from gravitational waves (blue) and the total contribution of new physics (green) is higher than the background energy (gray). So, with access to low-multipole data from missions such as the upcoming LiteBird satellite mission, detection of the B-modes from inflationary gravitational waves should be possible. Figure 5 in the paper.

    The CMB in the Future

    So, have we seen new physics in the CMB yet? Unfortunately, not quite: detecting cosmic birefringence or B-modes, as you have seen, is no easy task. Even small errors due to contamination, miscalibration, and systematic uncertainties can render these signals undetectable. However, the future looks bright. The noise level of CMB experiments has dropped nearly exponentially with time, and new CMB experiments such as SPT-4, CMB Stage-4, the Simons Observatory, and JAXA’s LiteBIRD are set to come online in the next decade. With new high-precision data on the horizon, and a little innovation, we may start to find the answers to these ambitious questions, so keep on the lookout for these new results. Who knows, maybe we’ll find new physics along the way too!

    See the full article here .



    Please help promote STEM in your local schools.


    Stem Education Coalition

    What do we do?

    Astrobites is a daily astrophysical literature journal written by graduate students in astronomy. Our goal is to present one interesting paper per day in a brief format that is accessible to undergraduate students in the physical sciences who are interested in active research.
    Why read Astrobites?

    Reading a technical paper from an unfamiliar subfield is intimidating. It may not be obvious how the techniques used by the researchers really work or what role the new research plays in answering the bigger questions motivating that field, not to mention the obscure jargon! For most people, it takes years for scientific papers to become meaningful.

    Our goal is to solve this problem, one paper at a time. In 5 minutes a day reading Astrobites, you should not only learn about one interesting piece of current work, but also get a peek at the broader picture of research in a new area of astronomy.

     
  • richardmitnick 11:07 am on July 7, 2022 Permalink | Reply
    Tags: , , "Cosmic Web": the large scale structure of the universe., , "Predicting the composition of dark matter", A new analysis by a team of physicists offers an innovative means to predict "cosmological signatures" for models of "dark matter"., , , Dark Energy, , , Dark matter detected only by its gravitational pull on ordinary matter., In this study the normal matter and dark matter and dark energy in a region of the universe are followed through to the present day using the equations of gravity and hydrodynamics and cosmology., , , , This research establishes new ways to find these cosmological signatures in more complex models.   

    From New York University via “phys.org” : “Predicting the composition of dark matter” 

    NYU BLOC

    From New York University

    Via

    “phys.org”

    July 6, 2022

    1
    An artist’s rendition of big bang nucleosynthesis, the early universe period in which protons “p” and neutrons “n” combine to form light elements. The presence of dark matter “χ” changes how much of each element will form. Credit: Cara Giovanetti/New York University.

    A new analysis by a team of physicists offers an innovative means to predict “cosmological signatures” for models of “dark matter”.

    A team of physicists has developed a method for predicting the composition of dark matter—invisible matter detected only by its gravitational pull on ordinary matter and whose discovery has been long sought by scientists.

    Its work, which appears in the journal Physical Review Letters, centers on predicting “cosmological signatures” for models of dark matter with a mass between that of the electron and the proton. Previous methods had predicted similar signatures for simpler models of dark matter. This research establishes new ways to find these signatures in more complex models, which experiments continue to search for, the paper’s authors note.

    “Experiments that search for dark matter are not the only way to learn more about this mysterious type of matter,” says Cara Giovanetti, a Ph.D. student in New York University’s Department of Physics and the lead author of the paper.


    Predicting the composition of dark matter.
    This visualization of a computer simulation showcases the “cosmic web”: the large-scale structure of the universe. Each bright knot is an entire galaxy, while the purple filaments show where material exists between the galaxies. To the human eye only the galaxies would be visible; this visualization lets us see the strands of material connecting the galaxies and forming the cosmic web. It is based on a scientific simulation of the growth of structure in the universe. The normal matter, dark matter, and dark energy in a region of the universe are followed from very early times through to the present day using the equations of gravity, hydrodynamics, and cosmology. The normal matter has been clipped to show only the densest regions, which are the galaxies, and is shown in white. The dark matter is shown in purple. The simulation volume is a cube with a side length of 134 megaparsecs (437 million light-years). Credit: Hubblesite; Visualization: Frank Summers, Space Telescope Science Institute; Simulation: Martin White and Lars Hernquist, Harvard University.

    “Precision measurements of different parameters of the universe—for example, the amount of helium in the universe, or the temperatures of different particles in the early universe—can also teach us a lot about dark matter,” adds Giovanetti, outlining the method described in the Physical Review Letters paper.

    In the research, conducted with Hongwan Liu, an NYU postdoctoral fellow, Joshua Ruderman, an associate professor in NYU’s Department of Physics, and Princeton physicist Mariangela Lisanti, Giovanetti and her co-authors focused on big bang nucleosynthesis (BBN)—a process by which light forms of matter, such as helium, hydrogen, and lithium, are created. The presence of invisible dark matter affects how each of these elements forms. Also vital to these phenomena is the cosmic microwave background (CMB)—the electromagnetic radiation, generated when electrons and protons first combined, that remained after the universe’s formation.

    The team sought a means to spot the presence of a specific category of dark matter—that with a mass between that of the electron and the proton—by creating models that took into account both BBN and CMB.

    “Such dark matter can modify the abundances of certain elements produced in the early universe and leave an imprint in the cosmic microwave background by modifying how quickly the universe expands,” Giovanetti explains.
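The expansion-rate effect can be sketched with the radiation-era Friedmann equation, in which H scales with the square root of the effective number of relativistic degrees of freedom g⋆. The numbers below are illustrative (one extra neutrino-like species in equilibrium), not the paper’s actual calculation:

```python
import math

# In the radiation-dominated early universe, H^2 = (8*pi*G/3) * rho with
# rho = (pi^2/30) * g_star * T^4 in natural units, so at fixed temperature
# H scales as sqrt(g_star). A light dark-matter particle in thermal
# equilibrium adds to g_star and speeds up the expansion.

def hubble_ratio(g_star_new, g_star_std):
    """Ratio H_new / H_std at a fixed temperature in the radiation era."""
    return math.sqrt(g_star_new / g_star_std)

G_STD = 10.75           # Standard Model g_star near BBN temperatures (~1 MeV)
G_EXTRA = G_STD + 1.75  # plus one neutrino-like species (Delta N_eff = 1)

print(f"H changes by a factor of {hubble_ratio(G_EXTRA, G_STD):.3f}")
```

A faster-expanding universe leaves less time for nuclear reactions to run to completion, which is how extra light species end up imprinted in the helium and deuterium abundances the paper uses as a probe.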

    In its research, the team made predictions of cosmological signatures linked to the presence of certain forms of dark matter. These signatures are the result of dark matter changing the temperatures of different particles or altering how fast the universe expands.

    Their results showed that dark matter that is too light will lead to different amounts of light elements than what astrophysical observations see.

    “Lighter forms of dark matter might make the universe expand so fast that these elements don’t have a chance to form,” says Giovanetti, outlining one scenario.

    “We learn from our analysis that some models of dark matter can’t have a mass that’s too small, otherwise the universe would look different from the one we observe,” she adds.
    __________________________________
    Dark Matter Background
    Fritz Zwicky discovered Dark Matter in the 1930s when observing the movement of the Coma Cluster., Vera Rubin a Woman in STEM, denied the Nobel, some 30 years later, did most of the work on Dark Matter.

    Fritz Zwicky.
    Coma cluster via NASA/ESA Hubble, the original example of Dark Matter discovered during observations by Fritz Zwicky and confirmed 30 years later by Vera Rubin.
    In modern times, it was astronomer Fritz Zwicky, in the 1930s, who made the first observations of what we now call dark matter. His 1933 observations of the Coma Cluster of galaxies seemed to indicated it has a mass 500 times more than that previously calculated by Edwin Hubble. Furthermore, this extra mass seemed to be completely invisible. Although Zwicky’s observations were initially met with much skepticism, they were later confirmed by other groups of astronomers.

    Thirty years later, astronomer Vera Rubin provided a huge piece of evidence for the existence of dark matter. She discovered that the centers of galaxies rotate at the same speed as their extremities, whereas, of course, they should rotate faster. Think of a vinyl LP on a record deck: its center rotates faster than its edge. That’s what logic dictates we should see in galaxies too. But we do not. The only way to explain this is if the whole galaxy is only the center of some much larger structure, as if it is only the label on the LP so to speak, causing the galaxy to have a consistent rotation speed from center to edge.

    Vera Rubin, following Zwicky, postulated that the missing structure in galaxies is dark matter. Her ideas were met with much resistance from the astronomical community, but her observations have been confirmed and are seen today as pivotal proof of the existence of dark matter.
    Astronomer Vera Rubin at the Lowell Observatory in 1965, worked on Dark Matter (The Carnegie Institution for Science).

    Vera Rubin, with Department of Terrestrial Magnetism (DTM) image tube spectrograph attached to the Kitt Peak 84-inch telescope, 1970.

    Vera Rubin measuring spectra, worked on Dark Matter(Emilio Segre Visual Archives AIP SPL).
    Dark Matter Research

    Super Cryogenic Dark Matter Search from DOE’s SLAC National Accelerator Laboratory (US) at Stanford University (US) at SNOLAB (Vale Inco Mine, Sudbury, Canada).

    LBNL LZ Dark Matter Experiment (US) xenon detector at Sanford Underground Research Facility(US) Credit: Matt Kapust.

    Lambda Cold Dark Matter: the accelerated expansion of the universe. Credit: Alex Mittelmann.

    DAMA at Gran Sasso uses sodium iodide housed in copper to hunt for dark matter LNGS-INFN.

    Yale HAYSTAC axion dark matter experiment at Yale’s Wright Lab.

    DEAP Dark Matter detector, The DEAP-3600, suspended in the SNOLAB (CA) deep in Sudbury’s Creighton Mine.

    The LBNL LZ Dark Matter Experiment (US) Dark Matter project at SURF, Lead, SD, USA.

    DAMA-LIBRA Dark Matter experiment at the Italian National Institute for Nuclear Physics’ (INFN’s) Gran Sasso National Laboratories (LNGS) located in the Abruzzo region of central Italy.

    DARWIN Dark Matter experiment. A design study for a next-generation, multi-ton dark matter detector in Europe at The University of Zurich [Universität Zürich](CH).

    PandaX II Dark Matter experiment at Jin-ping Underground Laboratory (CJPL) in Sichuan, China.

    Inside the Axion Dark Matter eXperiment at the University of Washington. Credit: Mark Stone, University of Washington.
    __________________________________

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    NYU Campus

    More than 175 years ago, Albert Gallatin, the distinguished statesman who served as secretary of the treasury under Presidents Thomas Jefferson and James Madison, declared his intention to establish “in this immense and fast-growing city … a system of rational and practical education fitting for all and graciously opened to all.” Founded in 1831, New York University is now one of the largest private universities in the United States. Of the more than 3,000 colleges and universities in America, New York University is one of only 60 member institutions of the distinguished Association of American Universities.

    New York University is a private research university in New York City. Chartered in 1831 by the New York State Legislature, NYU was founded by a group of New Yorkers led by then Secretary of the Treasury Albert Gallatin.

    In 1832, the initial non-denominational all-male institution began its first classes near City Hall based on a curriculum focused on a secular education. The university, in 1833, then moved and has maintained its main campus in Greenwich Village surrounding Washington Square Park. Since then, the university has added an engineering school in Brooklyn’s MetroTech Center and graduate schools throughout Manhattan. NYU has become the largest private university in the United States by enrollment, with a total of 51,848 enrolled students, including 26,733 undergraduate students and 25,115 graduate students, in 2019. NYU also receives the most applications of any private institution in the United States and admissions is considered highly selective.

    NYU is organized into 10 undergraduate schools, including the College of Arts & Science, the Gallatin School, the Steinhardt School, the Stern School of Business, the Tandon School of Engineering, and the Tisch School of the Arts. NYU’s 15 graduate schools include the Grossman School of Medicine, the School of Law, the Wagner Graduate School of Public Service, the School of Professional Studies, the Rory Meyers College of Nursing, and the Silver School of Social Work. The university’s internal academic centers include the Courant Institute of Mathematical Sciences, Center for Data Science, Center for Neural Science, Clive Davis Institute, Institute for the Study of the Ancient World, Institute of Fine Arts, and the NYU Langone Health System. NYU is a global university with degree-granting campuses at NYU Abu Dhabi and NYU Shanghai, and academic centers in Accra, Berlin, Buenos Aires, Florence, London, Los Angeles, Madrid, Paris, Prague, Sydney, Tel Aviv, and Washington, D.C.

    Past and present faculty and alumni include 38 Nobel Laureates, 8 Turing Award winners, 5 Fields Medalists, 31 MacArthur Fellows, 26 Pulitzer Prize winners, 3 heads of state, a U.S. Supreme Court justice, 5 U.S. governors, 4 mayors of New York City, 12 U.S. Senators, 58 members of the U.S. House of Representatives, two Federal Reserve Chairmen, 38 Academy Award winners, 30 Emmy Award winners, 25 Tony Award winners, 12 Grammy Award winners, 17 billionaires, and seven Olympic medalists. The university has also produced six Rhodes Scholars, three Marshall Scholars, 29 Schwarzman Scholars, and one Mitchell Scholar.

    Research

    NYU is classified among “R1: Doctoral Universities – Very high research activity” and research expenditures totaled $917.7 million in 2017. The university was the founding institution of the American Chemical Society. The NYU Grossman School of Medicine received $305 million in external research funding from the National Institutes of Health in 2014. NYU was granted 90 patents in 2014, the 19th most of any institution in the world. NYU owns the fastest supercomputer in New York City. As of 2016, NYU hardware researchers and their collaborators enjoy the largest outside funding level for hardware security of any institution in the United States, including grants from the National Science Foundation, the Office of Naval Research, the Defense Advanced Research Projects Agency, the United States Army Research Laboratory, the Air Force Research Laboratory, the Semiconductor Research Corporation, and companies including Twitter, Boeing, Microsoft, and Google.

    In 2019, four NYU Arts & Science departments ranked in Top 10 of Shanghai Academic Rankings of World Universities by Academic Subjects (Economics, Politics, Psychology, and Sociology).

     
  • richardmitnick 3:23 pm on March 24, 2022 Permalink | Reply
    Tags: "What Can We Learn About the Universe from Just One Galaxy?", , , , CAMELS: Cosmology and Astrophysics with MachinE Learning Simulations, , Dark Energy, , , Omega matter: a cosmological parameter that describes how much dark matter is in the universe, ,   

    From The New Yorker: “What Can We Learn About the Universe from Just One Galaxy?” 


    Rea Irvin

    From The New Yorker

    March 23, 2022
    Rivka Galchen

    Illustration by Nicholas Konrad / The New Yorker

    In new research, begun by an undergraduate, William Blake’s phrase “to see a world in a grain of sand” is suddenly relevant to astrophysics.

    Imagine if you could look at a snowflake at the South Pole and determine the size and the climate of all of Antarctica. Or study a randomly selected tree in the Amazon rain forest and, from that one tree—be it rare or common, narrow or wide, young or old—deduce characteristics of the forest as a whole. Or, what if, by looking at one galaxy among the hundred billion or so in the observable universe, one could say something substantial about the universe as a whole? A recent paper, whose lead authors include a cosmologist, a galaxy-formation expert, and an undergraduate named Jupiter (who did the initial work), suggests that this may be the case. The result at first seemed “crazy” to the paper’s authors. Now, having discussed their work with other astrophysicists and done various “sanity checks,” trying to find errors in their methods, the results are beginning to seem pretty clear. Francisco Villaescusa-Navarro, one of the lead authors of the work, said, “It does look like galaxies somehow retain a memory of the entire universe.”

    The research began as a sort of homework exercise. Jupiter Ding, while a freshman at Princeton University, wrote to the department of astrophysics, hoping to get involved in research. He mentioned that he had some experience with machine learning, a form of artificial intelligence that is adept at picking out patterns in very large data sets. Villaescusa-Navarro, an astrophysicist focused on cosmology, had an idea for what the student might work on. Villaescusa-Navarro had long wanted to look into whether machine learning could be used to help find relationships between galaxies and the universe. “I was thinking, What if you could look at only a thousand galaxies and from that learn properties about the entire universe? I wondered, What is the smallest number we could look at? What if you looked at only one hundred? I thought, O.K., we’ll start with one galaxy.”

    He had no expectation that one galaxy would provide much. But he thought that it would be a good way for Ding to practice using machine learning on a database known as CAMELS (Cosmology and Astrophysics with MachinE Learning Simulations). Shy Genel, an astrophysicist focussed on galaxy formation, who is another lead author on the paper, explained CAMELS this way: “We start with a description of reality shortly after the Big Bang. At that point, the universe is mostly hydrogen gas, and some helium and dark matter. And then, using what we know of the laws of physics, our best guess, we then run the cosmic history for roughly fourteen billion years.” Cosmological simulations have been around for about forty years, but they are increasingly sophisticated—and fast. CAMELS contains some four thousand simulated universes. Working with simulated universes, as opposed to our own, lets researchers ask questions that the gaps in our observational data preclude us from answering. They also let researchers play with different parameters, like the proportions of dark matter and hydrogen gas, to test their impact.

    Ding did the work on CAMELS from his dorm room, on his laptop. He wrote programs to work with the CAMELS data, then sent them to one of the university’s computing clusters, a collection of computers with far more power than his MacBook Air. That computing cluster contained the CAMELS data. Ding’s model trained itself by taking a set of simulated universes and looking at the galaxies within them. Once trained, the model would then be shown a sample galaxy and asked to predict features of the universe from which it was sampled.
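    The workflow described above, train on galaxies from simulated universes and then predict a cosmological label from a single held-out galaxy, can be caricatured with synthetic data. Everything below (the feature names, the numbers, the linear model) is fabricated for illustration; this is not the CAMELS data, and the paper used a neural network rather than the least-squares fit shown here:

```python
import numpy as np

rng = np.random.default_rng(0)

# Each toy "universe" carries a hidden Omega matter value, and a galaxy drawn
# from it has features (an invented "mass" and "size") correlated with it.
n = 2000
omega_m = rng.uniform(0.1, 0.5, n)                    # hidden label per universe
mass = 10.0 + 5.0 * omega_m + rng.normal(0, 0.2, n)   # fake galaxy feature 1
size = 3.0 - 2.0 * omega_m + rng.normal(0, 0.2, n)    # fake galaxy feature 2
X = np.column_stack([mass, size, np.ones(n)])         # features plus bias term

# "Train" on galaxies from the first 1500 universes with least squares...
coef, *_ = np.linalg.lstsq(X[:1500], omega_m[:1500], rcond=None)

# ...then predict Omega matter from one galaxy in each held-out universe.
pred = X[1500:] @ coef
rel_err = np.abs(pred - omega_m[1500:]) / omega_m[1500:]
print(f"median relative error: {np.median(rel_err):.1%}")
```

    Even this crude linear stand-in recovers the hidden parameter to within roughly ten per cent on the synthetic data, which is the qualitative point: if galaxy properties depend strongly enough on a cosmological parameter, a single galaxy can constrain it.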

    Ding is very humble about his contribution to the research, but he knows far more about astrophysics than even an exceptional first-year student typically does. Ding, a middle child with two sisters, grew up in State College, Pennsylvania. In high school, he took a series of college-level astronomy courses at Penn State and worked on a couple of research projects that involved machine learning. “My dad was really interested in astronomy as a high schooler,” Ding told me. “He went another direction, though.” His father is a professor of marketing at Penn State’s business school.

    Artificial intelligence is an umbrella concept for various disciplines, including machine learning. A famous early machine-learning task was to get a computer to recognize an image of a cat. This is something that a human can do easily, but, for a computer, there are no simple parameters that define the visual concept of a cat. Machine learning is now used for detecting patterns or relationships that are nearly impossible for humans to see, in part because the data is often in many dimensions. The programmer remains the captain, telling the computer what to learn, and deciding what input it’s trained on. But the computer adapts, iteratively, as it learns, and in that way becomes the author of its own algorithms. It was machine learning, for example, that discovered, through analyzing language patterns, the alleged main authors of the posts by “Q” (the supposed high-ranking government official who sparked the QAnon conspiracy theory). It was also able to identify which of Q’s posts appeared to be written by Paul Furber, a South African software developer, and by Ron Watkins, the son of the former owner of 8chan. Machine-learning programs have also been applied in health care, using data to predict which patients are most at risk of falling. Compared with the intuition of doctors, the machine-learning-based assessments reduced falls by about forty per cent, an enormous margin of improvement for a medical intervention.

    Machine learning has catapulted astrophysics research forward, too. Villaescusa-Navarro said, “As a community, we have been dealing with super-hard problems for many, many years. Problems that the smartest people in the field have been working on for decades. And from one day to the next, these problems are getting solved with machine learning.” Even generating a single simulated universe used to take a very long time. You gave a computer some initial conditions and then had to wait while it worked out what those conditions would produce some fourteen billion years down the line. It took less than fourteen billion years, of course, but there was no way to build up a large database of simulated universes in a timely way. Machine-learning advances have sped up these simulations, making a project like CAMELS possible. An even more ambitious project, Learning the Universe, will use machine learning to create simulated universes millions of times faster than CAMELS can; it will then use what’s called simulation-based inference—along with real observational data from telescopes—to determine which starting parameters lead to a universe that most closely resembles our own.

    Ding told me that one of the reasons he chose astronomy has been the proximity he feels to breakthroughs in the field, even as an undergraduate. “For example, I’m in a cosmology class right now, and when my professor talks about dark matter, she talks about it as something ‘a good friend of mine, Vera Rubin, put on the map,’ ” he said. “And dark energy was discovered by a team at Harvard University about twenty years ago, and I did a summer program there. So here I am, learning about this stuff pretty much in the places where these things were happening.” Ding’s research produced something profoundly unexpected. His model used a single galaxy in a simulated universe to pretty accurately say something about that universe. The specific characteristic it was able to predict is called Omega matter, which relates to the density of a universe. Its value was accurately predicted to within ten per cent.

    Ding was initially unsure how meaningful his results were and was curious to hear Villaescusa-Navarro’s perspective. Villaescusa-Navarro was more than skeptical. “My first thought was, This is completely crazy, I don’t believe it, this is the work of an undergraduate, there must be a mistake,” he said. “I asked him to run the program in a few other ways to see if he would still come up with similar results.” The results held.

    Villaescusa-Navarro began to do his own calculations. His doubt focussed foremost on the way that the machine learning itself worked. “One thing about neural networks is that they are amazing at finding correlations, but they also can pick up on numerical artifacts,” he said. Was a parameter wrong? Was there a bug in the code? Villaescusa-Navarro wrote his own program, to ask the same sort of question that he had assigned to Ding: What could information about one galaxy say about the universe in which it resided? Even when asked by a different program, written from scratch, the answer was still coming out the same. This suggested that the result was catching something real.

    “But we couldn’t just publish that,” Villaescusa-Navarro said. “We needed to try and understand why this might be working.” It was working for small galaxies, and for large galaxies, and for galaxies with very different features; only for a small handful of eccentric galaxies did the work not hold. Why?

    The recipe for making a universe is to start with a lot of hydrogen, a little helium, some dark matter, and some dark energy. Dark matter has mass, like the matter we’re familiar with, but it doesn’t reflect or emit light, so we can’t see it. We also can’t see dark energy, but we can think of it as working in the opposite direction of gravity. The universe’s matter, via gravity, pushes it to contract; the universe’s dark energy pushes it to expand.

    Omega matter is a cosmological parameter that describes how much dark matter is in the universe. Along with other parameters, it controls how much the universe is expanding. The higher its value, the slower the universe would grow. One of the research group’s hypotheses to explain their results is, roughly, that the amount of dark matter in a universe has a very strong effect on a galaxy’s properties—a stronger effect than other characteristics. For this reason, even one galaxy could have something to say about the Omega matter of its parent universe, since Omega matter is correlated to what can be pictured as the density of matter that makes a galaxy clump together.
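    The statement that a higher Omega matter means slower growth can be checked numerically with the flat-universe Friedmann equation, H(a) = H0 sqrt(Omega_m / a^3 + 1 - Omega_m). A minimal sketch, with Euler integration in arbitrary units; this is an illustration of the relationship, not a cosmology code:

```python
import math

def scale_factor_after(omega_m, t, h0=0.07, dt=0.001):
    """Grow the scale factor a from 1.0 for a time t using the flat-universe
    Friedmann equation: da/dt = a * h0 * sqrt(omega_m/a**3 + 1 - omega_m).
    Units and step size are arbitrary; a toy illustration only."""
    a = 1.0
    for _ in range(int(t / dt)):
        h = h0 * math.sqrt(omega_m / a**3 + 1.0 - omega_m)
        a += a * h * dt
    return a

low = scale_factor_after(0.2, t=10.0)    # less dark matter
high = scale_factor_after(0.5, t=10.0)   # more dark matter
print(low > high)  # more matter, more gravitational braking, less growth
```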

    In December, Genel, an expert on galaxy formation, presented the preliminary results of the paper to the galaxy-formation group he belongs to at The Flatiron Institute Center for Computational Astrophysics. “This was really one of the most fun things that happened to me,” he said. He told me that any galaxy-formation expert could have no other first reaction than to think, This is impossible. A galaxy is, on the scale of a universe, about as substantial as a grain of sand is, relative to the size of the Earth. To think that all by itself it can say something so substantial is, to the majority of the astrophysics community, extremely surprising, in a way analogous to the discovery that each of our cells—from a fingernail cell to a liver cell—contains coding describing our entire body. (Though maybe to the poetic way of thinking—to see the world in a grain of sand—the surprise is that this is surprising.)

    Rachel Somerville, an astrophysicist who was at the talk, recalled the initial reaction as “skepticism, but respectful skepticism, since we knew these were serious researchers.” She remembers being surprised that the approach had even been tried, since it seemed so tremendously unlikely that it would work. Since that time, the researchers have shared their coding and results with experts in the field; the results are taken to be credible and compelling, though the hesitations that the authors themselves have about the results remain.

    The results are not “robust”—for now, the computer can make valid predictions only on the type of universe that it has been trained on. Even within CAMELS, there are two varieties of simulations, and, if the machine is trained on one variety, it cannot be used to make predictions for galaxies in the other variety. That also means that the results cannot be used to make predictions about the universe we live in—at least not yet.

    Villaescusa-Navarro told me, “It is a very beautiful result—I know I shouldn’t say that about my own work.” But what is beauty to an astrophysicist? “It’s about an unexpected connection between two things that seemed not to be related. In this case, cosmology and galaxy formation. It’s about something hidden being revealed.”

    See the full article here .



     