Tagged: Albert Einstein’s theory of general relativity

  • richardmitnick 10:13 pm on January 22, 2022 Permalink | Reply
    Tags: "This New Record in Laser Beam Stability Could Help Answer Physics' Biggest Questions", Albert Einstein's theory of general relativity

    From The University of Western Australia (AU) via Science Alert (AU) : “This New Record in Laser Beam Stability Could Help Answer Physics’ Biggest Questions” 


    From The University of Western Australia (AU)

    via

    Science Alert (AU)

    The laser setup at the University of Western Australia. Credit: D. Gozzard/UWA.

    22 JANUARY 2022
    DAVID NIELD

    Scientists are on a mission to create a global network of atomic clocks that will enable us to, among other things, better understand the fundamental laws of physics, investigate dark matter, and navigate across Earth and space more precisely.

    However, to be at their most effective, these clocks will need to be reliably and speedily linked together through layers of the atmosphere, which is far from easy. New research outlines a successful experiment with a laser beam that has been kept stable across a distance of 2.4 kilometers (1.5 miles).

    For comparison, the new link is around 100 times more stable than anything that’s been put together before. It also demonstrates stability that’s around 1,000 times better than the atomic clocks these lasers could be used to monitor.

    “The result shows that the phase and amplitude stabilization technologies presented in this paper can provide the basis for ultra-precise timescale comparison of optical atomic clocks through the turbulent atmosphere,” write the researchers in their published paper [Physical Review Letters].

    The system builds on research carried out last year in which scientists developed a laser link capable of holding its own through the atmosphere with unprecedented stability.

    In the new study, researchers shot a laser beam from a fifth-floor window to a reflector 1.2 kilometers (0.74 miles) away. The beam was then bounced back to its source, doubling the path length to the total distance, and was held stable for a period of five minutes.

    Using noise reduction techniques, temperature controls, and tiny adjustments to the reflector, the team was able to keep the laser stable through the pockets of fluctuating air. The atmospheric turbulence encountered at ground level over this path is likely equivalent to that on a ground-to-satellite link several hundred kilometers long, since the air is calmer and less dense higher in the atmosphere.

    While laser accuracy has remained fairly constant for a decade or so, we’ve seen some significant improvements recently, including a laser setup operated by the Boulder Atomic Clock Optical Network (BACON) Collaboration and tested last March [Nature].

    That setup involved a pulsed laser rather than the continuous wave laser tested in this new study. Both have their advantages in different scenarios, but continuous wave lasers offer better stability and can transfer more data in a set period of time.

    “Both systems beat the current best atomic clock, so we’re splitting hairs here, but our ultimate precision is better,” says astrophysicist David Gozzard from the University of Western Australia.

    Once an atomic clock network is put together, scientists will be able to test Albert Einstein’s theory of general relativity and explore how its incompatibility with what we know about quantum physics could be resolved.

    By very precisely comparing the time-keeping of two atomic clocks – one on Earth and one in space – scientists hope eventually to work out where general relativity does and doesn’t hold up. If Einstein’s ideas are correct, the clock further away from Earth’s gravity should tick ever-so-slightly faster.
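As a rough illustration of the effect being tested (the constants below are standard textbook values, not taken from the article, and only the gravitational term is sketched, ignoring velocity time dilation), the rate difference between a ground clock and an orbiting clock can be estimated in a few lines:

```python
# Sketch: gravitational part of the clock-rate difference between a
# ground clock and an orbiting clock. Assumed textbook constants.

G_M = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
c = 299_792_458.0      # speed of light, m/s
R_earth = 6.371e6      # mean Earth radius, m

def gravitational_rate_shift(altitude_m: float) -> float:
    """Fractional frequency difference (orbiting clock minus ground clock)
    from the difference in gravitational potential alone."""
    phi_ground = -G_M / R_earth
    phi_orbit = -G_M / (R_earth + altitude_m)
    return (phi_orbit - phi_ground) / c**2

# A clock at a GPS-like altitude (~20,200 km) runs faster, gravitationally,
# by about 5e-10 - roughly 40 microseconds gained per day.
shift = gravitational_rate_shift(20_200e3)
```

The positive sign confirms the article's statement: the higher clock, farther from Earth's gravity, ticks faster.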

    But its usefulness doesn’t stop there. Lasers like this could eventually be used for managing the launching of objects into orbit, for communications between Earth and space, or for connecting two points in space.

    “Of course, you can’t run fiber optic cable to a satellite,” says Gozzard.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The University of Western Australia is a public research university in the Australian state of Western Australia. The university’s main campus is in Perth, the state capital, with a secondary campus in Albany and various other facilities elsewhere.

    UWA was established in 1911 by an act of the Parliament of Western Australia and began teaching students two years later. It is the sixth-oldest university in Australia and was Western Australia’s only university until the establishment of Murdoch University (AU) in 1973. Because of its age and reputation, UWA is classed as one of the “sandstone universities”, an informal designation given to the oldest university in each state. The university also belongs to several more formal groupings, including The Group of Eight (AU) and The Matariki Network of Universities. In recent years, UWA has generally been ranked either in the bottom half of the world’s top 100 universities or just outside it, depending on the ranking system used.

    Alumni of UWA include one Prime Minister of Australia (Bob Hawke), five Justices of the High Court of Australia (including one Chief Justice, Robert French, now Chancellor), one Governor of the Reserve Bank (H. C. Coombs), various federal cabinet ministers, and seven of Western Australia’s eight most recent premiers. In 2018, alumnus and mathematician Akshay Venkatesh was a recipient of the Fields Medal. As of 2021, the university had produced 106 Rhodes Scholars. Two members of the UWA faculty, Barry Marshall and Robin Warren, won Nobel Prizes as a result of research at the university.

    History

    The university was established in 1911 following the tabling of proposals by a royal commission in September 1910. The original campus, which received its first students in March 1913, was located on Irwin Street in the centre of Perth, and consisted of several buildings situated between Hay Street and St Georges Terrace. Irwin Street was also known as “Tin Pan Alley” as many buildings featured corrugated iron roofs. These buildings served as the university campus until 1932, when the campus relocated to its present-day site in Crawley.

    The founding chancellor, Sir John Winthrop Hackett, died in 1916, and bequeathed property which, after being carefully managed for ten years, yielded £425,000 to the university, a far larger sum than expected. This allowed the construction of the main buildings. Many buildings and landmarks within the university bear his name, including Winthrop Hall and Hackett Hall. In addition, his bequest funded many scholarships, because he did not wish eager students to be deterred from studying because they could not afford to do so.

    During UWA’s first decade there was controversy about whether the policy of free education was compatible with high expenditure on professorial chairs and faculties. An “old student” publicised his concern in 1921 that there were 13 faculties serving only 280 students.

    A remnant of the original buildings survives to this day in the form of the “Irwin Street Building”, so called after its former location. In the 1930s it was transported to the new campus, where it served a number of uses until its 1987 restoration, after which it was moved across campus to James Oval. More recently, the building has served as the Senate meeting room; it is currently in use as a cricket pavilion and office of the university archives. The building has been heritage-listed by both the National Trust and the Australian Heritage Council.

    The university introduced the Doctor of Philosophy degree in 1946 and made its first award in October 1950 to Warwick Bottomley for his research on the chemistry of native plants in Western Australia.

     
  • richardmitnick 11:48 am on December 24, 2021 Permalink | Reply
    Tags: "Lasers and Ultracold Atoms for a Changing Earth", Albert Einstein's theory of general relativity, Applying new technology rooted in quantum mechanics and relativity to terrestrial and space geodesy will sharpen our understanding of how the planet responds to natural and human-induced changes., Improving technology for laser interferometric ranging between spacecraft to achieve nanometer-scale accuracy, Laser altimetry, Measuring Earth’s gravity field from space requires precisely monitoring the changing distance between paired orbiting satellites., NASA Grace mission, NASA Grace-FO mission, The future of high-precision geodesy lies in the development and application of novel technologies based on quantum mechanics and relativity.

    From Eos: “Lasers and Ultracold Atoms for a Changing Earth” 

    From AGU

    From Eos

    20 December 2021
    Michel Van Camp
    F. Pereira dos Santos
    Michael Murböck
    Gérard Petit and
    Jürgen Müller

    Applying new technology rooted in quantum mechanics and relativity to terrestrial and space geodesy will sharpen our understanding of how the planet responds to natural and human-induced changes.

    Credit: VENTRIS/Science Photo Library via Getty Images.

    Quantum mechanics rules the atomic world, challenging our intuitions based on Newton’s classical mechanics. And yet atoms share at least one commonality with Newton’s apple and with you and me: They experience gravity and fall in the same way.

    Of course, observing free-falling atoms requires extremely sophisticated experimental devices, which became available only in the 1990s with the advent of laser cooling. Temperature reflects how much atoms move, so cooling atoms eases their manipulation, allowing scientists to measure their free fall and to quantify and study the effects of gravity with extraordinary precision. Creating samples of ultracold atoms involves slowing the atoms using the momentum of photons in specialized laser beams.
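How much can a single photon slow an atom? A back-of-the-envelope sketch (the rubidium-87 line and constants below are standard values assumed for illustration, not from the article) shows why cooling takes many thousands of photon kicks:

```python
# Sketch: photon recoil in laser cooling of rubidium-87 (assumed values).
# Each absorbed photon changes the atom's velocity by v = h / (lambda * m).

h = 6.62607015e-34                  # Planck constant, J s
wavelength = 780e-9                 # Rb-87 D2 cooling line, m
m_rb = 86.909 * 1.66053906660e-27   # Rb-87 atomic mass, kg

recoil_v = h / (wavelength * m_rb)  # ~6 mm/s of velocity change per photon

# A room-temperature Rb atom moves at roughly 300 m/s, so bringing it
# near rest takes tens of thousands of absorption events.
photons_to_stop = 300.0 / recoil_v
```

Because an atom can scatter millions of photons per second, this whole slowing process still completes in a few milliseconds.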

    Today novel developments in methods using ultracold atoms and laser technologies open enhanced prospects for applying quantum physics in both satellite and terrestrial geodesy—the science of measuring the shape, rotation, and gravity of Earth—and for improving measurement reference systems. Such methods have great potential for more accurately monitoring how the Earth system is responding to natural and human-induced forcing, from the planet’s solid surface shifting in response to tectonic and magmatic movements to sea level rising in response to melting glaciers.

    Taking Earth’s Measure

    Earth’s shape is always changing, even if the changes are mostly imperceptible to us humans. In the subsurface, large convection currents and plate tectonics influence each other, shifting huge masses of rock around and causing earthquakes and volcanic eruptions. On the surface, the ocean, atmosphere, glaciers, rivers, and aquifers never rest either—nor do we as we excavate rock, extract groundwater and oil, and generally move mass around. All these movements subtly affect not only the planet’s shape but also its rotation and its gravitational field.

    Fig. 1. The colored bubbles indicate the ranges of spatial resolution (in kilometers) and signal amplitude (in equivalent water height, EWH) characteristic of mass change processes related to continental hydrology (yellow), ice sheets and glaciers (pink), ocean processes (blue), and volcanoes and earthquakes (gray). The current measurement limits of laser interferometric ranging methods (e.g., aboard the Gravity Recovery and Climate Experiment (GRACE) and GRACE Follow-On (GRACE-FO) missions; solid black line) and of terrestrial absolute gravimetry (dashed green line) are shown, along with the directions of improvement in these technologies (arrows) needed to cover more of the ranges of the processes. Credit: IfE/LUH.

    NASA Grace

    National Aeronautics Space Agency (US)/GFZ German Research Centre for Geosciences [Deutsches Forschungszentrum für Geowissenschaften](GFZ)(DE) Grace-FO satellites.

    Geodetic methods allow us to measure minute quantities that tell scientists a lot about Earth’s size, shape, and makeup. As such, geodesy is essential to all branches of geophysics: tectonics, seismology, volcanology, oceanography, hydrology, glaciology, geomagnetism, climatology, meteorology, planetology, and even metrology, physics, and astronomy. Measuring these changes sheds light on many important Earth processes, such as mass loss from polar ice sheets, yet making these measurements accurately remains a challenging task (Figure 1).

    Determining the elevation of an ice sheet’s surface, to gauge whether it might have lost or gained mass, is often done using laser altimetry—that is, by observing the travel time of a laser beam emitted from a plane or a satellite and reflected off the ice surface back up to the observer. It’s a powerful technique, but the laser does not necessarily distinguish between light, fresh snow and dense, old ice, introducing uncertainty into the measurement and into our understanding of the ice sheet’s change.

    Beyond this ambiguity, what happens if Earth’s crust beneath the ice cap is deforming and influencing the elevation of the ice surface? Moreover, the altimeter’s observation is relative: The elevation of the ice sheet surface is measured with respect to the position of the observing aircraft or satellite, which itself must be determined in comparison to a reference height datum (typically sea level). This feat requires measuring quantities that are exceedingly small compared with the size of Earth. If you drew a circle representing Earth on a standard piece of printer paper, even the 20-kilometer difference in height between Mount Everest’s peak and the bottom of the deepest oceanic trenches would be thinner than your pencil line!

    Meanwhile, measuring variation in Earth’s rotation means determining its instantaneous orientation relative to fixed stars to within a fraction of a thousandth of an arc second – the angle through which Earth rotates in a few microseconds. Assessing velocities and deformations of the tectonic plates requires determining positions at the millimeter scale. And detecting groundwater mass changes requires measuring the associated gravitational effect of a 1-centimeter-thick layer of water (i.e., equivalent water height, or EWH) spread over a 160,000-square-kilometer area. In other words, changes in Earth’s rotation, deformations, and gravity must be measured with precisions 10 orders of magnitude finer than the length of the day, smaller than Earth’s diameter, and weaker than gravity itself, respectively.
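The "10 orders of magnitude weaker than gravity" claim for the water-layer case can be checked with the standard infinite-slab approximation (an assumption of this sketch, reasonable because the 160,000-square-kilometer layer is vastly wider than it is thick):

```python
# Sketch: gravitational pull of a 1-cm water layer, using the
# infinite-slab approximation g = 2 * pi * G * rho * h (assumed model).
import math

G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
rho_water = 1000.0  # density of water, kg/m^3
ewh = 0.01          # equivalent water height, m

g_slab = 2 * math.pi * G * rho_water * ewh  # ~4e-9 m/s^2
ratio = g_slab / 9.81                       # ~4e-10 of surface gravity
```

The result, about 4 parts in 10^10 of the roughly 9.8 m/s^2 background, matches the text's 10-orders-of-magnitude figure.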

    The Challenges of Attraction

    Performing gravity measurements and analyses remains especially demanding. For land-based measurements, gravimeters are generally cumbersome, expensive, and tricky to use, and the most precise of them, superconducting instruments, require a high-wattage (1,500-watt) continuous power supply. In addition, most gravimeters, including superconducting instruments, offer only relative measurements—that is, they inform us about spatial and temporal variations in gravitational attraction, but they drift with time and do not provide the absolute value of gravitational acceleration (about 9.8 meters per second squared). Absolute gravimeters do, but these instruments are rare, expensive (costing roughly $500,000 apiece), and heavy. And as most are mechanical, wear and tear prevents their use for continuous measurements.

    This absolute gravimeter developed by the SYRTE (Time and Space Reference Systems) department at the Paris Observatory uses ultracold-atom technology to make high-precision measurements of gravity. Credit: Sébastien Merlet, LNE-SYRTE.

    Moreover, terrestrial gravimeters are mostly sensitive to the mass distribution nearby, in a radius of a few hundred meters from the instrument. This sensitivity and scale allow observation of rapid and small-scale changes, such as from flash floods, in small watersheds or glaciers, and in volcanic systems, but they complicate data gathering over larger areas.

    On the other hand, space-based gravimetry, realized in the Gravity Recovery and Climate Experiment mission and its follow-on mission, GRACE-FO, is blind to structures smaller than a few hundred kilometers. However, it offers unique information of homogeneous quality about mass anomalies over larger areas within Earth or at its surface. These missions can detect and monitor a mass change equivalent to a 1-centimeter EWH spread over a 400- × 400-kilometer area, with a temporal resolution of 10 days.
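For a sense of scale, the detection threshold quoted above (1 cm of EWH over a 400 × 400 km cell; numbers from the text, density of water assumed) corresponds to a surprisingly large mass:

```python
# Sketch: mass corresponding to GRACE/GRACE-FO's quoted sensitivity of
# 1 cm equivalent water height over a 400 x 400 km cell.

side = 400e3        # cell side length, m
ewh = 0.01          # equivalent water height, m
rho_water = 1000.0  # density of water, kg/m^3

mass_kg = side**2 * ewh * rho_water  # 1.6e12 kg
mass_gt = mass_kg / 1e12             # 1.6 gigatonnes
```

So the smallest signal these missions can see is still on the order of a billion tonnes of water moving around.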

    To monitor change from important Earth processes—from flooding and volcanism to glacier melting and groundwater movement—reliably and across scales, we need gravitational data with better spatiotemporal resolution and higher accuracy than are currently available (Figure 1). We also need highly stable and accurate reference systems to provide the fundamental backbone required to monitor sea level changes and tectonic and human-induced deformation. The needed improvements can be achieved only by using innovative quantum technologies.

    The past few years have seen new efforts to develop such technologies for many uses. In 2018, for example, the European Commission began a long-term research and innovation initiative called Quantum Flagship. For geodetic applications, efforts are being coordinated and supported largely through the Novel Sensors and Quantum Technology for Geodesy (QuGe) program, a worldwide initiative organized under the umbrella of the International Association of Geodesy and launched in 2019. QuGe fosters synergies in technology development, space mission requirements, and geodetic and geophysical modeling by organizing workshops and conference sessions and by otherwise providing a platform where experts from different fields can collaborate.

    A Quantum Upgrade for Gravity Sensing

    QuGe emphasizes three pillars of development. The first focuses on investigations of ultracold-atom technologies for gravimetry on the ground and in space. Quantum gravimetry will benefit a comprehensive set of applications, from fast, localized gravity surveys and exploration to observing regional and global Earth system processes with high spatial and temporal resolution.

    On Earth, the ideal instrument is an absolute, rather than relative, gravimeter capable of taking continuous measurements. This is not possible with a classical mechanical absolute gravimeter, in which a test mass is repeatedly dropped and lifted. In atomic instruments, there are no mobile parts or mechanical wear; instead, lasers control falling rubidium atoms. Recent achievements should enable production of such instruments on a larger scale, allowing scientists to establish dense networks of absolute gravimetric instruments to monitor, for example, aquifer and volcanic systems.

    Today achieving dense coverage with gravimetric surveys, with measurements made at perhaps dozens of points, involves huge efforts, and sampling rates—with measurements taken typically once every month, year, or more—are still poor. Moreover, errors related to instrument calibration and drift remain problematic. Alternatively, a fixed instrument provides a measurement every second but at only a single location. The ability to continuously measure gravity at multiple locations, without the difficulties of drifting instruments, will allow much less ambiguous interpretations of gravity changes and related geophysical phenomena.

    Measuring Earth’s gravity field from space requires precisely monitoring the changing distance between paired orbiting satellites—as in the GRACE-FO mission—which accelerate and decelerate slightly as they are tugged more or less by the gravitational pull exerted by different masses on Earth. However, the satellites can also speed up and slow down because of forces other than changes in Earth’s gravity field, including aerodynamic drag in the thin upper atmosphere. Currently, these other forces acting on the satellites are measured using electrostatic, suspended-mass accelerometers, which also tend to exhibit gradual, low-frequency drifts that hamper their accuracy.

    The performance of these traditional accelerometers is thus challenged by quantum sensors, which have already demonstrated improved long-term stability and lower noise levels on the ground. In addition, hybrid systems combining the benefits of quantum accelerometers with electrostatic accelerometers, which still provide higher measurement rates, could cover a wider range of slower and faster accelerations and could greatly support navigation and inertial sensing on the ground and in space. Quantum accelerometers will also serve as a basis for developing the next generation of gravity-monitoring missions, such as the follow-on to the Gravity field and steady-state Ocean Circulation Explorer (GOCE) mission, which will measure gravity differences in 3D and allow higher-resolution mapping of Earth’s static gravity field.

    Wide-Ranging Improvement

    The second pillar of QuGe focuses on improving technology for laser interferometric ranging between spacecraft to achieve nanometer-scale accuracy, which will become the standard for future geodetic gravity-sensing missions. This method involves comparing the difference in phase between two laser beams: a reference beam and a test beam received back from the second satellite. Such optical measurements are much more precise than similar measurements using microwave ranging or mechanical devices, allowing intersatellite distances to be observed with an accuracy of tens of nanometers or better compared with micrometer accuracies achieved with microwaves.
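The phase-to-distance conversion behind this method can be sketched in a few lines (the 1064-nm wavelength is typical of the Nd:YAG lasers used for such links, and the phase resolution below is a hypothetical value chosen for illustration, not the missions' actual error budget):

```python
# Sketch: converting a measured phase change of the returned laser beam
# into a change in intersatellite separation. Assumed values throughout.
import math

wavelength = 1064e-9                   # m, typical Nd:YAG laser line
phase_resolution = 2 * math.pi / 1000  # hypothetical: 1/1000 of a cycle

# The beam travels the separation twice (out and back), so one full
# 2*pi phase cycle corresponds to lambda/2 of separation change.
dist_resolution = (phase_resolution / (2 * math.pi)) * wavelength / 2
# ~0.5 nm for this assumed phase resolution
```

Even with only modest phase resolution, optical wavelengths put sub-nanometer separation changes within reach, which is why laser interferometry so decisively outperforms microwave ranging.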

    High-precision laser ranging was successfully tested in 2017 during the Laser Interferometer Space Antenna (LISA) Pathfinder mission, in which the main goal was to hold the spacecraft as motionless as possible to test technology for use in future missions that will seek to detect gravitational waves with a space-based observatory. It has also been applied successfully in the GRACE-FO mission, demonstrating the superior performance for intersatellite tracking of laser interferometry over traditional microwave-based ranging methods used in the original GRACE mission.

    Although extremely useful, recent satellite gravity missions give only rather rough pictures of global mass variations. Enhanced monitoring of intersatellite distances should improve the ability to resolve 1-centimeter EWH to about 200 kilometers or finer, instead of the 400 kilometers presently. This improvement will allow better separation of overlapping effects, such as continental versus oceanic mass contributions along coastlines, changes in neighboring subsurface aquifers, and variations in glaciers and nearby groundwater tables.

    Even more refined concepts, like intersatellite tracking using laser interferometry for multiple satellite pairs or among a swarm of satellites, might be realized as well within the coming years. Using more satellites in next-generation geodetic missions would yield data with higher temporal and spatial resolution and accuracy—and hence with greater ability to distinguish smaller-scale processes—than are available with current two-satellite configurations.

    Measuring Height with Optical Clocks

    QuGe’s third pillar of development focuses on applying general relativity and optical clocks to improve measurement reference systems. Einstein told us that gravity distorts space and time. In particular, a clock closer to a mass—or, say, at a lower elevation on Earth’s surface, closer to the planet’s center of mass—runs slower than one farther away. Hence, comparing the ticking rates of accurate clocks placed at different locations on Earth informs us about height differences, a technique called chronometric leveling. This technique has been achieved by comparing outputs from highly precise optical clocks connected by optical links over distances on the order of 1,000 kilometers.
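The arithmetic of chronometric leveling follows from the standard near-surface approximation df/f = g·dh/c² (assumed here; g and c are textbook values):

```python
# Sketch: fractional frequency offset between two clocks separated in
# height, using the near-surface approximation df/f = g * dh / c^2.

g = 9.81           # surface gravity, m/s^2
c = 299_792_458.0  # speed of light, m/s

def rate_offset(height_diff_m: float) -> float:
    """Fractional frequency difference for a clock raised by height_diff_m."""
    return g * height_diff_m / c**2

per_cm = rate_offset(0.01)    # ~1e-18 per centimeter of height
per_km = rate_offset(1000.0)  # ~1e-13 per kilometer of height
```

Reading a 1-centimeter height difference therefore demands clocks (and frequency links) good to about one part in 10^18, which is exactly the accuracy class the best optical clocks have reached.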

    Today systems for measuring height are referenced to mean sea level in some way, for example, through tide gauges. However, sea level is not stable enough to be used as a reference.

    The transportable optical clock of the PTB (left) is housed inside a trailer (right). Credit: PTB Braunschweig, CC BY 4.0.

    Optical clocks keep time by measuring the high frequency of a laser light that is kept locked to the transition frequency between two given energy levels of electrons in ultracold (laser-cooled) atoms or ions. These clocks have demonstrated at least a 100-fold improvement in accuracy over the usual atomic clocks, which measure lower-frequency microwave transitions. With a global network of such optical clocks, if we can remotely compare the clocks’ frequencies with the same accuracy, we could realize a global height reference with 1-centimeter consistency. One can even imagine the reference clocks being placed in a high satellite orbit, far from the noisy Earth environment, to serve as a stable reference for terrestrial height systems and improve measurement accuracy.

    In addition to chronometric leveling, such clocks will improve the accuracy of the International Atomic Time standard—the basis for the Coordinated Universal Time used for civil timekeeping—and will have many other impacts on science and technology. For example, global navigation satellite systems could provide better navigation by using more predictable clocks on satellites, which would have the added advantage of requiring less input from the ground operators controlling the satellite orbits. Space navigation could rely on one-way range measurements instead of on more time-consuming two-way ranging if a spacecraft’s clock were highly accurate. And radio astronomers could make use of more stable frequency references for easier processing and better results in very long baseline interferometry experiments. More fundamental applications are also envisioned for optical clocks, such as detecting gravitational waves, testing the constancy of the fundamental constants of physics, and even redefining the second.

    The Best Tools for the Job

    Our knowledge of Earth’s shape and gravity and the subtle shifts they undergo in response to numerous natural and human-induced processes has grown immensely as geodetic methods and tools have matured. But with current technologies, the clarity and confidence with which we can discern these changes remain limited. Such limitations, namely, insufficient accuracy and resolution in time and space, will become increasingly important as we look to better understand and predict the consequences of accelerating—or even perhaps previously unrecognized—changes occurring as the planet responds to warming temperatures and other anthropogenic influences.

    The future of high-precision geodesy lies in the development and application of novel technologies based on quantum mechanics and relativity. QuGe is working to ensure that the Earth and planetary sciences benefit from the vast potential of these technologies. In particular, ultracold-atom accelerometry, high-precision laser ranging between satellites, and relativistic geodesy with optical clocks are very promising approaches that will overcome problems of classical gravimetric Earth observations. With such advances, we will have the best tools available not only to understand vital geophysical processes but also to better navigate on Earth and in space and to discern the fundamental physics that underlie our world.

    See the full article here .


    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

     
  • richardmitnick 10:38 am on December 14, 2021 Permalink | Reply
    Tags: "Einstein's theory passes rigorous 16-year tests", Albert Einstein's theory of general relativity, Double pulsar system

    From CSIRO -Commonwealth Scientific and Industrial Research Organisation (AU) : “Einstein’s theory passes rigorous 16-year tests” 


    From CSIRO -Commonwealth Scientific and Industrial Research Organisation (AU)

    12.13.21

    Ms. Mikayla Keen
    Communication advisor, Space
    Tel +61 2 9372 4433
    Fax +61 4 0148 8562

    © Michael Kramer/The MPG Institute for Radio Astronomy [MPG Institut für Radioastronomie](DE)

    The team, led by Professor Michael Kramer from The MPG Institute for Radio Astronomy [MPG Institut für Radioastronomie](DE), showed that Einstein’s theory published in 1915 still holds true.

    Dr Dick Manchester, a Fellow at Australia’s national science agency, CSIRO, and a member of the research team, explained how this result provides us with a more precise understanding of our Universe.

    “The theory of general relativity describes how gravity works at large scales in the Universe, but it breaks down at the atomic scale where quantum mechanics reigns supreme,” Dr Manchester said.

    “We needed to find ways of testing Einstein’s theory at an intermediate scale to see if it still holds true. Fortunately, just the right cosmic laboratory, known as the ‘double pulsar’, was found using the Parkes telescope in 2003.

    CSIRO’s Parkes Observatory radio telescope, part of the Australia Telescope National Facility [Murriyang, the traditional Indigenous name], located 20 kilometres north of the town of Parkes, New South Wales, Australia, 414.80 m above sea level.

    “Our observations of the double pulsar over the past 16 years proved to be amazingly consistent with Einstein’s General Theory of Relativity, within 99.99 per cent to be precise,” he said.

    The double pulsar system is made up of two pulsars, rapidly rotating compact stars that emit radio waves like a cosmic lighthouse and create very strong gravitational fields.

    One star rotates 45 times every second, while the second spins just 2.8 times per second. The stars complete an orbit every 2.5 hours.
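Kepler's third law gives a feel for how tight this orbit is (the pulsar masses below are typical published values for the double pulsar, assumed here for illustration; they are not stated in the article):

```python
# Sketch: size of the double pulsar's 2.5-hour orbit from Kepler's
# third law, a^3 = G * M_total * P^2 / (4 * pi^2). Assumed masses.
import math

G = 6.674e-11                     # gravitational constant, SI units
M_sun = 1.989e30                  # solar mass, kg
m_total = (1.34 + 1.25) * M_sun   # assumed pulsar masses
P = 2.5 * 3600.0                  # orbital period, s

# Semimajor axis of the relative orbit between the two pulsars.
a = (G * m_total * P**2 / (4 * math.pi**2)) ** (1 / 3)
# ~9e8 m: the entire orbit is comparable in size to the Sun itself.
```

Two stars each heavier than the Sun, whirling around each other within roughly a solar diameter, is what makes this system such a strong-gravity laboratory.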

    Dame Susan Jocelyn Bell Burnell discovered pulsars with radio astronomy. She is pictured here at the Mullard Radio Astronomy Observatory, University of Cambridge (UK), in a photograph taken for the Daily Herald newspaper in 1968. She was denied the Nobel Prize.

    According to general relativity, the extreme accelerations in the double pulsar system strain the fabric of space-time and send out ripples that carry away energy, gradually shrinking the orbit. The two pulsars are predicted to collide in 85 million years’ time.

    With such a long timescale for this energy loss, its effects are difficult to detect. Fortunately, the clock-like ticks coming from the spinning pulsars are perfect tools for tracing these tiny perturbations.

    Associate Professor Adam Deller from The Swinburne University of Technology (AU) and OzGrav, the ARC Centre of Excellence for Gravitational Wave Discovery (AU), another member of the research team, explained that the ticks from the pulsar ‘clocks’ had taken around 2,400 years to reach Earth.

    “We modelled the precise arrival times of more than 20 billion of these clock ticks over 16 years,” Dr Deller said.

    “That still wasn’t enough to tell us how far away the stars are, and we needed to know that to test general relativity.”

    By adding in data from the Global VLBI Array – a network of telescopes spread across the globe – the research team was able to spot a tiny wobble in the stars’ positions every year, which revealed their distance from Earth.

    GMVA, the Global VLBI Array

    “We’ll be back in the future using new radio telescopes and new data analysis hoping to spot a weakness in general relativity that will lead us to an even better gravitational theory,” Dr Deller said.

    The research is published today in the journal Physical Review X.

    Results in detail:

    An international research team has completed the most rigorous tests yet of Albert Einstein’s Theory of General Relativity, showing that the theory published in 1915 holds true.
    These tests include the emission of gravitational waves, effects of light propagation in strong gravitational fields, and the effect of ‘time dilation’ that makes clocks run slower in gravitational fields.
    These results provide us with a more precise understanding of our Universe.
    Key to the research was the double pulsar system, which was first discovered using CSIRO’s Parkes radio telescope, Murriyang, in 2003.
    The double pulsar system is made up of two pulsars, rapidly rotating compact stars that emit radio waves like a cosmic lighthouse and create very strong gravitational fields. One star rotates 45 times every second, while the second spins just 2.8 times per second. The stars complete an orbit every 2.5 hours.
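    The ‘time dilation’ effect listed above can be made concrete with the standard weak-field formula for how much slower a clock runs deep in a gravitational field. The neutron-star mass and radius below are typical textbook values assumed for illustration, not numbers from the article:

```python
import math

# Gravitational time-dilation factor at radius r outside a mass M:
# dtau/dt = sqrt(1 - 2GM/(r c^2)). Mass and radius are assumed
# typical neutron-star values, chosen for illustration.
G = 6.674e-11
c = 2.998e8
M_sun = 1.989e30

M = 1.4 * M_sun
r = 12e3  # metres
factor = math.sqrt(1 - 2 * G * M / (r * c**2))
print(f"a surface clock runs at {factor:.2f} of the far-away rate")
```

    For these values a clock on the star’s surface ticks roughly 20 per cent slower than one far away, which is why pulsar systems are such sensitive laboratories for relativity.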

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    CSIRO campus

    CSIRO, the Commonwealth Scientific and Industrial Research Organisation (AU), is Australia’s national science agency and one of the largest and most diverse research agencies in the world.

    CSIRO works with leading organisations around the world. From its headquarters in Canberra, CSIRO maintains more than 50 sites across Australia and in France, Chile and the United States, employing about 5,500 people.

    Federally funded scientific research began in Australia 104 years ago. The Advisory Council of Science and Industry was established in 1916 but was hampered by insufficient available finance. In 1926 the research effort was reinvigorated by establishment of the Council for Scientific and Industrial Research (CSIR), which strengthened national science leadership and increased research funding. CSIR grew rapidly and achieved significant early successes. In 1949 further legislated changes included renaming the organisation as CSIRO.

    Notable developments by CSIRO have included the invention of atomic absorption spectroscopy; essential components of Wi-Fi technology; the development of the first commercially successful polymer banknote; the invention of the insect repellent used in Aerogard; and the introduction of a series of biological controls into Australia, such as myxomatosis and rabbit calicivirus for the control of rabbit populations.

    Research and focus areas

    Research Business Units

    As at 2019, CSIRO’s research areas are identified as “Impact science” and organised into the following Business Units:

    Agriculture and Food
    Health and Biosecurity
    Data61
    Energy
    Land and Water
    Manufacturing
    Mineral Resources
    Oceans and Atmosphere

    National Facilities

    CSIRO manages national research facilities and scientific infrastructure on behalf of the nation to assist with the delivery of research. The national facilities and specialized laboratories are available to both international and Australian users from industry and research. As at 2019, the following National Facilities are listed:

    Australian Animal Health Laboratory (AAHL)
    Australia Telescope National Facility – radio telescopes included in the Facility include the Australia Telescope Compact Array, the Parkes Observatory, Mopra Observatory and the Australian Square Kilometre Array Pathfinder.

    CSIRO Pawsey Supercomputing Centre (AU)

    SKA, the Square Kilometre Array

     
  • richardmitnick 12:05 pm on September 8, 2019 Permalink | Reply
    Tags: Albert Einstein's theory of general relativity, Craig Callender, Second law of thermodynamics

    From WIRED: “Are We All Wrong About Black Holes?” 

    Wired logo

    From WIRED

    09.08.2019
    Brendan Z. Foster

    Craig Callender, a philosopher of science at the University of California San Diego, argues that the connection between black holes and thermodynamics is less ironclad than assumed. Photograph: Peggy Peattie/Quanta Magazine

    In the early 1970s, people studying general relativity, our modern theory of gravity, noticed rough similarities between the properties of black holes and the laws of thermodynamics. Stephen Hawking proved that the area of a black hole’s event horizon—the surface that marks its boundary—cannot decrease. That sounded suspiciously like the second law of thermodynamics, which says entropy—a measure of disorder—cannot decrease.

    Yet at the time, Hawking and others emphasized that the laws of black holes only looked like thermodynamics on paper; they did not actually relate to thermodynamic concepts like temperature or entropy.

    Then in quick succession, a pair of brilliant results—one by Hawking himself—suggested that the equations governing black holes were in fact actual expressions of the thermodynamic laws applied to black holes. In 1972, Jacob Bekenstein argued that a black hole’s surface area was proportional to its entropy [Physical Review D], and thus the second law similarity was a true identity. And in 1974, Hawking found that black holes appear to emit radiation [Nature]—what we now call Hawking radiation—and this radiation would have exactly the same “temperature” in the thermodynamic analogy.
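    Both quantities have compact closed forms: the Bekenstein–Hawking entropy S = kAc³/(4Għ) and the Hawking temperature T = ħc³/(8πGMk). As a hedged illustration (the one-solar-mass example is our choice, not from the article), the numbers work out as follows:

```python
import math

# Bekenstein-Hawking entropy S = k A c^3 / (4 G hbar) and Hawking
# temperature T = hbar c^3 / (8 pi G M k), evaluated for a
# one-solar-mass black hole chosen purely for illustration.
G = 6.674e-11       # gravitational constant
c = 2.998e8         # speed of light
hbar = 1.055e-34    # reduced Planck constant
k_B = 1.381e-23     # Boltzmann constant
M_sun = 1.989e30    # solar mass

M = M_sun
r_s = 2 * G * M / c**2                          # Schwarzschild radius, ~3 km
A = 4 * math.pi * r_s**2                        # horizon area
S = k_B * A * c**3 / (4 * G * hbar)             # entropy, J/K
T = hbar * c**3 / (8 * math.pi * G * M * k_B)   # temperature, K
print(f"r_s ~ {r_s / 1e3:.1f} km, S ~ {S:.1e} J/K, T ~ {T:.1e} K")
```

    The tiny temperature (tens of nanokelvin) and the colossal entropy are what make the thermodynamic reading of these formulas so striking, and so hard to probe experimentally.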

    This connection gave physicists a tantalizing window into what many consider the biggest problem in theoretical physics—how to combine quantum mechanics, our theory of the very small, with general relativity. After all, thermodynamics comes from statistical mechanics, which describes the behavior of all the unseen atoms in a system. If a black hole is obeying thermodynamic laws, we can presume that a statistical description of all its fundamental, indivisible parts can be made. But in the case of a black hole, those parts aren’t atoms. They must be a kind of basic unit of gravity that makes up the fabric of space and time.

    Modern researchers insist that any candidate for a theory of quantum gravity must explain how the laws of black hole thermodynamics arise from microscopic gravity, and in particular, why the entropy-to-area connection happens. And few question the truth of the connection between black hole thermodynamics and ordinary thermodynamics.

    But what if the connection between the two really is little more than a rough analogy, with little physical reality? What would that mean for the past decades of work in string theory, loop quantum gravity, and beyond? Craig Callender, a philosopher of science at the University of California, San Diego, argues that the notorious laws of black hole thermodynamics may be nothing more than a useful analogy stretched too far [Phil Sci]. The interview has been condensed and edited for clarity.

    Why did people ever think to connect black holes and thermodynamics?

    Callender: In the early ’70s, people noticed a few similarities between the two. One is that both seem to possess an equilibrium-like state. I have a box of gas. It can be described by a small handful of parameters—say, pressure, volume, and temperature. Same thing with a black hole. It might be described with just its mass, angular momentum, and charge. Further details don’t matter to either system.

    Nor does this state tell me what happened beforehand. I walk into a room and see a box of gas with stable values of pressure, volume and temperature. Did it just settle into that state, or did that happen last week, or perhaps a million years ago? Can’t tell. The black hole is similar. You can’t tell what type of matter fell in or when it collapsed.

    The second feature is that Hawking proved that the area of black holes is always non-decreasing. That reminds one of the thermodynamic second law, that entropy always increases. So both systems seem to be heading toward simply described states.

    Now grab a thermodynamics textbook, locate the laws, and see if you can find true statements when you replace the thermodynamic terms with black hole variables. In many cases you can, and the analogy improves.

    Hawking then discovers Hawking radiation, which further improves the analogy. At that point, most physicists start claiming the analogy is so good that it’s more than an analogy—it’s an identity! That’s a super-strong and surprising claim. It says that black hole laws, most of which are features of the geometry of space-time, are somehow identical to the physical principles underlying the physics of steam engines.

    Because the identity plays a huge role in quantum gravity, I want to reconsider this identity claim. Few in the foundations of physics have done so.

    So what’s the statistical mechanics for black holes?

    Well, that’s a good question. Why does ordinary thermodynamics hold? Well, we know that all these macroscopic thermodynamic systems are composed of particles. The laws of thermodynamics turn out to be descriptions of the most statistically likely configurations to happen from the microscopic point of view.

    Why does black hole thermodynamics hold? Are the laws also the statistically most likely way for black holes to behave? Although there are speculations in this direction, so far we don’t have a solid microscopic understanding of black hole physics. Absent this, the identity claim seems even more surprising.

    What led you to start thinking about the analogy?

    Many people are worried about whether theoretical physics has become too speculative. There’s a lot of commentary about whether holography, the string landscape—all sorts of things—are tethered enough to experiment. I have similar concerns. So my former Ph.D. student John Dougherty and I thought, where did it all start?

    To our mind a lot of it starts with this claimed identity between black holes and thermodynamics. When you look in the literature, you see people say, “The only evidence we have for quantum gravity, the only solid hint, is black hole thermodynamics.”

    If that’s the main thing we’re bouncing off for quantum gravity, then we ought to examine it very carefully. If it turns out to be a poor clue, maybe it would be better to spread our bets a little wider, instead of going all in on this identity.

    What problems do you see with treating a black hole as a thermodynamic system?

    I see basically three. The first problem is: What is a black hole? People often think of black holes as just kind of a dark sphere, like in a Hollywood movie or something; they’re thinking of it like a star that collapsed. But a mathematical black hole, the basis of black hole thermodynamics, is not the material from the star that’s collapsed. That’s all gone into the singularity. The black hole is what’s left.

    The black hole isn’t a solid thing at the center. The system is really the entire space-time.

    Yes, it’s this global notion for which black hole thermodynamics was developed, in which case the system really is the whole space-time.

    Here is another way to think about the worry. Suppose a star collapses and forms an event horizon. But now another star falls past this event horizon and it collapses, so it’s inside the first. You can’t think that each one has its own little horizon that is behaving thermodynamically. It’s only the one horizon.

    Here’s another. The event horizon changes shape depending on what’s about to be thrown into it. It’s clairvoyant. Weird, but there is nothing spooky here so long as we remember that the event horizon is only defined globally. It’s not a locally observable quantity.

    The picture is more counterintuitive than people usually think. To me, if the system is global, then it’s not at all like thermodynamics.

    The second objection is: Black hole thermodynamics is really a pale shadow of thermodynamics. I was surprised to see the analogy wasn’t as thorough as I expected it to be. If you grab a thermodynamics textbook and start replacing claims with their black hole counterparts, you will not find the analogy goes that deep.


    Craig Callender explains why the connection between black holes and thermodynamics is little more than an analogy.

    For instance, the zeroth law of thermodynamics sets up the whole theory and a notion of equilibrium — the basic idea that the features of the system aren’t changing. And it says that if one system is in equilibrium with another — A with B, and B with C — then A must be in equilibrium with C. The foundation of thermodynamics is this equilibrium relation, which sets up the meaning of temperature.

    The zeroth law for black holes is that the surface gravity of a black hole, which measures the gravitational acceleration, is a constant on the horizon. So the analogy treats constant temperature as the zeroth law. That’s not really right. Here we see a pale shadow of the original zeroth law.

    The counterpart of equilibrium is supposed to be “stationary,” a technical term that basically says the black hole is spinning at a constant rate. But there’s no sense in which one black hole can be “stationary with” another black hole. You can take any thermodynamic object and cut it in half and say one half is in equilibrium with the other half. But you can’t take a black hole and cut it in half. You can’t say that this half is stationary with the other half.

    Here’s another way in which the analogy falls flat. Black hole entropy is given by the black hole area. Well, area is length squared, volume is length cubed. So what do we make of all those thermodynamic relations that include volume, like Boyle’s law? Is volume, which is length times area, really length times entropy? That would ruin the analogy. So we have to say that volume is not the counterpart of volume, which is surprising.

    The most famous connection between black holes and thermodynamics comes from the notion of entropy. For normal stuff, we think of entropy as a measure of the disorder of the underlying atoms. But in the 1970s, Jacob Bekenstein said that the surface area of a black hole’s event horizon is equivalent to entropy. What’s the basis of this?

    This is my third concern. Bekenstein says, if I throw something into a black hole, the entropy vanishes. But this can’t happen, he thinks, according to the laws of thermodynamics, for entropy must always increase. So some sort of compensation must be paid when you throw things into a black hole.

    Bekenstein notices a solution. When I throw something into the black hole, the mass goes up, and so does the area. If I identify the area of the black hole as the entropy, then I’ve found my compensation. There is a nice deal between the two—one goes down while the other one goes up—and it saves the second law.

    When I saw that I thought, aha, he’s thinking that not knowing about the system anymore means its entropy value has changed. I immediately saw that this is pretty objectionable, because it identifies entropy with uncertainty and our knowledge.

    There’s a long debate in the foundations of statistical mechanics about whether entropy is a subjective notion or an objective notion. I’m firmly on the side of thinking it’s an objective notion. I think trees unobserved in a forest go to equilibrium regardless of what anyone knows about them or not, that the way heat flows has nothing to do with knowledge, and so on.

    Chuck a steam engine behind the event horizon. We can’t know anything about it apart from its mass, but I claim it can still do as much work as before. If you don’t believe me, we can test this by having a physicist jump into the black hole and follow the steam engine! There is only need for compensation if you think that what you can no longer know about ceases to exist.

    Do you think it’s possible to patch up black hole thermodynamics, or is it all hopeless?

    My mind is open, but I have to admit that I’m deeply skeptical about it. My suspicion is that black hole “thermodynamics” is really an interesting set of relationships about information from the point of view of the exterior of the black hole. It’s all about forgetting information.

    Because thermodynamics is more than information theory, I don’t think there’s a deep thermodynamic principle operating through the universe that causes black holes to behave the way they do, and I worry that physics is all in on it being a great hint for quantum gravity when it might not be.

    Playing the role of the Socratic gadfly in the foundations of physics is sometimes important. In this case, looking back invites a bit of skepticism that may be useful going forward.

    See the full article here.

     
  • richardmitnick 8:37 am on September 2, 2019 Permalink | Reply
    Tags: "Physicists mash quantum and gravity and find time but not as we know it", A new kind of quantum time order, Albert Einstein's theory of general relativity

    From University of Queensland via Science Bulletin: “Physicists mash quantum and gravity and find time, but not as we know it” 

    u-queensland-bloc

    From University of Queensland

    via

    Science Bulletin

    August 28, 2019

    A University of Queensland-led international team of researchers say they have discovered “a new kind of quantum time order.”

    UQ physicist Dr Magdalena Zych said the discovery arose from an experiment the team designed to bring together elements of the two big — but contradictory — physics theories developed in the past century.

    “Our proposal sought to discover: what happens when an object massive enough to influence the flow of time is placed in a quantum state?” Dr Zych said.

    She said Einstein’s theory described how the presence of a massive object slowed time.

    “Imagine two space ships, asked to fire at each other at a specified time while dodging the other’s attack,” she said.

    “If either fires too early, it will destroy the other.”

    “In Einstein’s theory, a powerful enemy could use the principles of general relativity by placing a massive object — like a planet — closer to one ship to slow the passing of time.”

    “Because of the time lag, the ship furthest away from the massive object will fire earlier, destroying the other.”

    Dr Zych said the second theory, of quantum mechanics, says any object can be in a state of “superposition”.

    “This means it can be found in different states — think Schrödinger’s cat,” she said.

    Dr Zych said using the theory of quantum mechanics, if the enemy put the planet into a state of “quantum superposition,” then time also should be disrupted.

    “There would be a new way for the order of events to unfold, with neither of the events being first or second — but in a genuine quantum state of being both first and second,” she said.

    UQ researcher Dr Fabio Costa said although “a superposition of planets” — as described in the paper — may never be possible, technology allowed a simulation of how time works in the quantum world without using gravity.

    “Even if the experiment can never be done, the study is relevant for future technologies,” Dr Costa said.

    “We are currently working towards quantum computers that — very simply speaking — could effectively jump through time to perform their operations much more efficiently than devices operating in fixed sequence in time, as we know it in our ‘normal’ world.”

    Stevens Institute of Technology and the University of Vienna scientists were co-authors on Bell’s Theorem for Temporal Order, published in Nature Communications.

    See the full article here.

    u-queensland-campus

    The University of Queensland (UQ) is one of Australia’s leading research and teaching institutions. We strive for excellence through the creation, preservation, transfer and application of knowledge. For more than a century, we have educated and worked with outstanding people to deliver knowledge leadership for a better world.

    UQ ranks in the top 50 as measured by the QS World University Rankings and the Performance Ranking of Scientific Papers for World Universities. The University also ranks 52 in the US News Best Global Universities Rankings, 60 in the Times Higher Education World University Rankings and 55 in the Academic Ranking of World Universities.

     
  • richardmitnick 5:44 pm on April 17, 2019 Permalink | Reply
    Tags: Albert Einstein's theory of general relativity, Einstein’s Unfinished Revolution: The Search for What Lies Beyond the Quantum

    From Scientific American: “Cosmologist Lee Smolin says that at certain key points, the scientific worldview is based on fallacious reasoning” 

    Scientific American

    From Scientific American

    April 17, 2019
    Jim Daley

    Lee Smolin, author of six books about the philosophical issues raised by contemporary physics, says every time he writes a new one, the experience completely changes the direction his own research is taking. In his latest book, Einstein’s Unfinished Revolution: The Search for What Lies Beyond the Quantum, Smolin, a cosmologist and quantum theorist at the Perimeter Institute for Theoretical Physics in Ontario, tackles what he sees as the limitations in quantum theory.

    Credit: Perimeter Institute

    “I want to say the scientific worldview is based on fallacious reasoning at certain key points,” Smolin says. In Einstein’s Unfinished Revolution, he argues one of those key points was the assumption that quantum physics is a complete theory. This incompleteness, Smolin argues, is the reason quantum physics has not been able to solve certain questions about the universe.

    “Most of what we do [in science] is take the laws that have been discovered by experiments to apply to parts of the universe, and just assume that they can be scaled up to apply to the whole universe,” Smolin says. “I’m going to be suggesting that’s wrong.”

    Join Smolin at the Perimeter Institute as he discusses his book and takes the audience on a journey through the basics of quantum physics and the experiments and scientists who have changed our understanding of the universe. The discussion, “Einstein’s Unfinished Revolution,” is part of Perimeter’s public lecture series and will take place on Wednesday, April 17, at 7 P.M. Eastern time. Online viewers can participate in the discussion by tweeting to @Perimeter using the #piLIVE hashtag.

    See the full article here.

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
  • richardmitnick 10:08 am on April 10, 2019 Permalink | Reply
    Tags: Albert Einstein's theory of general relativity, Although the telescopes are not physically connected they are able to synchronize their recorded data with atomic clocks — hydrogen masers — which precisely time their observations., BlackHoleCam, Data were flown to highly specialised supercomputers — known as correlators — at the Max Planck Institute for Radio Astronomy and MIT Haystack Observatory to be combined., Sagittarius A* the supermassive black hole at the center of our galaxy

    From European Southern Observatory: “Astronomers Capture First Image of a Black Hole” 

    ESO 50 Large

    From European Southern Observatory

    10 April 2019

    Heino Falcke
    Chair of the EHT Science Council, Radboud University
    The Netherlands
    Tel: +31 24 3652020
    Email: h.falcke@astro.ru.nl

    Luciano Rezzolla
    EHT Board Member, Goethe Universität
    Germany
    Tel: +49 69 79847871
    Email: rezzolla@itp.uni-frankfurt.de

    Eduardo Ros
    EHT Board Secretary, Max-Planck-Institut für Radioastronomie
    Germany
    Tel: +49 22 8525125
    Email: ros@mpifr.de

    Calum Turner
    ESO Public Information Officer
    Garching bei München, Germany
    Tel: +49 89 3200 6655
    Email: pio@eso.org

    ESO, ALMA, and APEX contribute to paradigm-shifting observations of the gargantuan black hole at the heart of the galaxy Messier 87.

    The Event Horizon Telescope (EHT) — a planet-scale array of eight ground-based radio telescopes forged through international collaboration — was designed to capture images of a black hole. Today, in coordinated press conferences across the globe, EHT researchers reveal that they have succeeded, unveiling the first direct visual evidence of a supermassive black hole and its shadow.

    This breakthrough was announced today in a series of six papers published in a special issue of The Astrophysical Journal Letters. The image reveals the black hole at the centre of Messier 87 [1], a massive galaxy in the nearby Virgo galaxy cluster. This black hole resides 55 million light-years from Earth and has a mass 6.5 billion times that of the Sun [2].

    The EHT links telescopes around the globe to form an unprecedented Earth-sized virtual telescope [3]. The EHT offers scientists a new way to study the most extreme objects in the Universe predicted by Einstein’s general relativity during the centenary year of the historic experiment that first confirmed the theory [4].

    “We have taken the first picture of a black hole,” said EHT project director Sheperd S. Doeleman of the Center for Astrophysics | Harvard & Smithsonian. “This is an extraordinary scientific feat accomplished by a team of more than 200 researchers.”

    Black holes are extraordinary cosmic objects with enormous masses but extremely compact sizes. The presence of these objects affects their environment in extreme ways, warping spacetime and superheating any surrounding material.

    “If immersed in a bright region, like a disc of glowing gas, we expect a black hole to create a dark region similar to a shadow — something predicted by Einstein’s general relativity that we’ve never seen before,” explained chair of the EHT Science Council Heino Falcke of Radboud University, the Netherlands. “This shadow, caused by the gravitational bending and capture of light by the event horizon, reveals a lot about the nature of these fascinating objects and has allowed us to measure the enormous mass of Messier 87’s black hole.”

    Multiple calibration and imaging methods have revealed a ring-like structure with a dark central region — the black hole’s shadow — that persisted over multiple independent EHT observations.

    “Once we were sure we had imaged the shadow, we could compare our observations to extensive computer models that include the physics of warped space, superheated matter and strong magnetic fields. Many of the features of the observed image match our theoretical understanding surprisingly well,” remarks Paul T.P. Ho, EHT Board member and Director of the East Asian Observatory [5]. “This makes us confident about the interpretation of our observations, including our estimation of the black hole’s mass.”

    “The confrontation of theory with observations is always a dramatic moment for a theorist. It was a relief and a source of pride to realise that the observations matched our predictions so well,” elaborated EHT Board member Luciano Rezzolla of Goethe Universität, Germany.

    Creating the EHT was a formidable challenge which required upgrading and connecting a worldwide network of eight pre-existing telescopes deployed at a variety of challenging high-altitude sites. These locations included volcanoes in Hawai`i and Mexico, mountains in Arizona and the Spanish Sierra Nevada, the Chilean Atacama Desert, and Antarctica.

    The EHT observations use a technique called very-long-baseline interferometry (VLBI) which synchronises telescope facilities around the world and exploits the rotation of our planet to form one huge, Earth-size telescope observing at a wavelength of 1.3mm. VLBI allows the EHT to achieve an angular resolution of 20 micro-arcseconds — enough to read a newspaper in New York from a café in Paris [6].
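    The quoted resolution follows from the usual diffraction limit, θ ≈ λ/D. A quick sketch, assuming (our assumption, not the article’s) that the longest baseline is roughly Earth’s diameter:

```python
# Diffraction-limited resolution theta ~ lambda / D for an Earth-sized
# baseline, as a sanity check on the quoted 20 micro-arcseconds.
wavelength = 1.3e-3        # 1.3 mm observing wavelength, from the article
baseline = 1.274e7         # Earth's diameter in metres (assumed baseline)
RAD_TO_ARCSEC = 206265.0   # arcseconds per radian

theta_uas = wavelength / baseline * RAD_TO_ARCSEC * 1e6
print(f"resolution ~ {theta_uas:.0f} micro-arcseconds")  # ~21 micro-arcseconds
```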

    The telescopes contributing to this result were ALMA, APEX, the IRAM 30-meter telescope, the James Clerk Maxwell Telescope, the Large Millimeter Telescope Alfonso Serrano, the Submillimeter Array, the Submillimeter Telescope, and the South Pole Telescope [7]. Petabytes of raw data from the telescopes were combined by highly specialised supercomputers hosted by the Max Planck Institute for Radio Astronomy and MIT Haystack Observatory.

    Max Planck Institute for Radio Astronomy Bonn Germany

    MIT Haystack Observatory, Westford, Massachusetts, USA, Altitude 131 m (430 ft)

    ESO/NRAO/NAOJ ALMA Array in Chile in the Atacama at Chajnantor plateau, at 5,000 metres

    ESO/MPIfR APEX high on the Chajnantor plateau in Chile’s Atacama region, at an altitude of over 4,800 m (15,700 ft)

    IRAM 30m Radio telescope, on Pico Veleta in the Spanish Sierra Nevada, Altitude 2,850 m (9,350 ft)

    East Asia Observatory James Clerk Maxwell telescope, Mauna Kea, Hawaii, USA, 4,207 m (13,802 ft) above sea level

    Large Millimeter Telescope Alfonso Serrano, Mexico, operated by The University of Massachusetts Amherst and Mexico’s Instituto Nacional de Astrofísica, Óptica y Electrónica, at an altitude of 4850 meters on top of the Sierra Negra

    CfA Submillimeter Array Mauna Kea, Hawaii, USA, Altitude 4,080 m (13,390 ft)

    U Arizona Submillimeter Telescope located on Mt. Graham near Safford, Arizona, USA, Altitude 3,191 m (10,469 ft)

    South Pole Telescope SPTPOL. The SPT collaboration is made up of over a dozen (mostly North American) institutions, including the University of Chicago, the University of California, Berkeley, Case Western Reserve University, Harvard/Smithsonian Astrophysical Observatory, the University of Colorado Boulder, McGill University, The University of Illinois at Urbana-Champaign, University of California, Davis, Ludwig Maximilian University of Munich, Argonne National Laboratory, and the National Institute for Standards and Technology. It is funded by the National Science Foundation. Altitude 2.8 km (9,200 ft)

    European facilities and funding played a crucial role in this worldwide effort, with the participation of advanced European telescopes and the support from the European Research Council — particularly a €14 million grant for the BlackHoleCam project [8]. Support from ESO, IRAM and the Max Planck Society was also key. “This result builds on decades of European expertise in millimetre astronomy”, commented Karl Schuster, Director of IRAM and member of the EHT Board.

    The construction of the EHT and the observations announced today represent the culmination of decades of observational, technical, and theoretical work. This example of global teamwork required close collaboration by researchers from around the world. Thirteen partner institutions worked together to create the EHT, using both pre-existing infrastructure and support from a variety of agencies. Key funding was provided by the US National Science Foundation (NSF), the EU’s European Research Council (ERC), and funding agencies in East Asia.

    “ESO is delighted to have significantly contributed to this result through its European leadership and pivotal role in two of the EHT’s component telescopes, located in Chile — ALMA and APEX,” commented ESO Director General Xavier Barcons. “ALMA is the most sensitive facility in the EHT, and its 66 high-precision antennas were critical in making the EHT a success.”

    “We have achieved something presumed to be impossible just a generation ago,” concluded Doeleman. “Breakthroughs in technology, connections between the world’s best radio observatories, and innovative algorithms all came together to open an entirely new window on black holes and the event horizon.”
    Notes

    [1] The shadow of a black hole is the closest we can come to an image of the black hole itself, a completely dark object from which light cannot escape. The black hole’s boundary — the event horizon from which the EHT takes its name — is around 2.5 times smaller than the shadow it casts and measures just under 40 billion km across.

    [2] Supermassive black holes are relatively tiny astronomical objects — which has made them impossible to directly observe until now. As the size of a black hole’s event horizon is proportional to its mass, the more massive a black hole, the larger the shadow. Thanks to its enormous mass and relative proximity, M87’s black hole was predicted to be one of the largest viewable from Earth — making it a perfect target for the EHT.
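    As a back-of-envelope check of the sizes in notes [1] and [2], the sketch below uses the EHT's published mass estimate of roughly 6.5 billion solar masses for M87*; the ~2.6 shadow-to-horizon ratio is the general-relativistic prediction. The event horizon diameter comes out just under 40 billion km.

```python
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
M_sun = 1.989e30   # solar mass, kg

M = 6.5e9 * M_sun                 # M87*'s mass, ~6.5 billion suns
r_s = 2 * G * M / c**2            # Schwarzschild radius, in metres

horizon_diameter_km = 2 * r_s / 1e3
shadow_diameter_km = 2.6 * horizon_diameter_km   # shadow is ~2.6x the horizon

print(f"event horizon diameter: {horizon_diameter_km:.1e} km")
print(f"shadow diameter:        {shadow_diameter_km:.1e} km")
```

    Since the event horizon scales linearly with mass, the same two lines give the horizon size of any black hole once its mass is known.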

    [3] Although the telescopes are not physically connected, they are able to synchronize their recorded data with atomic clocks — hydrogen masers — which precisely time their observations. These observations were collected at a wavelength of 1.3 mm during a 2017 global campaign. Each telescope of the EHT produced enormous amounts of data – roughly 350 terabytes per day – which was stored on high-performance helium-filled hard drives. These data were flown to highly specialised supercomputers — known as correlators — at the Max Planck Institute for Radio Astronomy and MIT Haystack Observatory to be combined. They were then painstakingly converted into an image using novel computational tools developed by the collaboration.
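    The correlation step can be illustrated in miniature. The toy example below (my own sketch, not EHT code) has two "stations" record the same noisy signal offset in time, then recovers the offset by testing candidate lags; finding that geometric delay between station pairs is the core of what a VLBI correlator does before imaging.

```python
import numpy as np

rng = np.random.default_rng(0)
n, true_lag = 4096, 37                    # samples per station; delay in samples
sky = rng.standard_normal(n + true_lag)   # the common signal from the source

# Each station records the same sky signal at a different delay, plus its own noise.
a = sky[:n] + 0.5 * rng.standard_normal(n)
b = sky[true_lag:true_lag + n] + 0.5 * rng.standard_normal(n)

def corr(k):
    """Correlation of the two recordings at trial lag k."""
    if k >= 0:
        return float(np.dot(a[k:], b[:n - k]))
    return float(np.dot(a[:n + k], b[-k:]))

best = max(range(-100, 101), key=corr)
print("recovered delay (samples):", best)
```

    The sharp correlation peak at the true delay survives even though each recording is dominated by noise, which is why the technique works across telescopes that never see each other's raw signal in real time.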

    [4] 100 years ago, two expeditions set out for Principe Island off the coast of Africa and Sobral in Brazil to observe the 1919 solar eclipse, with the goal of testing general relativity by seeing if starlight would be bent around the limb of the sun, as predicted by Einstein. In an echo of those observations, the EHT has sent team members to some of the world’s highest and most isolated radio facilities to once again test our understanding of gravity.

    [5] The East Asian Observatory (EAO) partner on the EHT project represents the participation of many regions in Asia, including China, Japan, Korea, Taiwan, Vietnam, Thailand, Malaysia, India and Indonesia.

    [6] Future EHT observations will see substantially increased sensitivity with the participation of the IRAM NOEMA Observatory, the Greenland Telescope and the Kitt Peak Telescope.

    [7] ALMA is a partnership of the European Southern Observatory (ESO; Europe, representing its member states), the U.S. National Science Foundation (NSF), and the National Institutes of Natural Sciences (NINS) of Japan, together with the National Research Council (Canada), the Ministry of Science and Technology (MOST; Taiwan), Academia Sinica Institute of Astronomy and Astrophysics (ASIAA; Taiwan), and Korea Astronomy and Space Science Institute (KASI; Republic of Korea), in cooperation with the Republic of Chile. APEX is operated by ESO, the 30-meter telescope is operated by IRAM (the IRAM Partner Organizations are MPG (Germany), CNRS (France) and IGN (Spain)), the James Clerk Maxwell Telescope is operated by the EAO, the Large Millimeter Telescope Alfonso Serrano is operated by INAOE and UMass, the Submillimeter Array is operated by SAO and ASIAA and the Submillimeter Telescope is operated by the Arizona Radio Observatory (ARO). The South Pole Telescope is operated by the University of Chicago with specialized EHT instrumentation provided by the University of Arizona.

    [8] BlackHoleCam is an EU-funded project to image, measure and understand astrophysical black holes. The main goal of BlackHoleCam and the Event Horizon Telescope (EHT) is to make the first ever images of the billion-solar-mass black hole in the nearby galaxy Messier 87 and of its smaller cousin, Sagittarius A*, the supermassive black hole at the centre of our Milky Way. This will allow the deformation of spacetime caused by a black hole to be determined with extreme precision.

    More information

    This research was presented in a series of six papers published today in a special issue of The Astrophysical Journal Letters.

    The EHT collaboration involves more than 200 researchers from Africa, Asia, Europe, North and South America. The international collaboration is working to capture the most detailed black hole images ever by creating a virtual Earth-sized telescope. Supported by considerable international investment, the EHT links existing telescopes using novel systems — creating a fundamentally new instrument with the highest angular resolving power that has yet been achieved.

    The EHT consortium consists of 13 stakeholder institutes; the Academia Sinica Institute of Astronomy and Astrophysics, the University of Arizona, the University of Chicago, the East Asian Observatory, Goethe-Universitaet Frankfurt, Institut de Radioastronomie Millimétrique, Large Millimeter Telescope, Max Planck Institute for Radio Astronomy, MIT Haystack Observatory, National Astronomical Observatory of Japan, Perimeter Institute for Theoretical Physics, Radboud University and the Smithsonian Astrophysical Observatory.

    Links

    ESO EHT web page
    EHT Website & Press Release
    ESOBlog on the EHT Project

    Papers:

    Paper I: The Shadow of the Supermassive Black Hole
    Paper II: Array and Instrumentation
    Paper III: Data processing and Calibration
    Paper IV: Imaging the Central Supermassive Black Hole
    Paper V: Physical Origin of the Asymmetric Ring
    Paper VI: The Shadow and Mass of the Central Black Hole

    See the full article here .



    Please help promote STEM in your local schools.


    Stem Education Coalition

    Visit ESO in Social Media-

    Facebook

    Twitter

    YouTube

    ESO Bloc Icon

    ESO is the foremost intergovernmental astronomy organisation in Europe and the world’s most productive ground-based astronomical observatory by far. It is supported by 16 countries: Austria, Belgium, Brazil, the Czech Republic, Denmark, France, Finland, Germany, Italy, the Netherlands, Poland, Portugal, Spain, Sweden, Switzerland and the United Kingdom, along with the host state of Chile. ESO carries out an ambitious programme focused on the design, construction and operation of powerful ground-based observing facilities enabling astronomers to make important scientific discoveries. ESO also plays a leading role in promoting and organising cooperation in astronomical research. ESO operates three unique world-class observing sites in Chile: La Silla, Paranal and Chajnantor. At Paranal, ESO operates the Very Large Telescope, the world’s most advanced visible-light astronomical observatory, and two survey telescopes. VISTA works in the infrared and is the world’s largest survey telescope, and the VLT Survey Telescope is the largest telescope designed to exclusively survey the skies in visible light. ESO is a major partner in ALMA, the largest astronomical project in existence. And on Cerro Armazones, close to Paranal, ESO is building the 39-metre European Extremely Large Telescope, the E-ELT, which will become “the world’s biggest eye on the sky”.

     
  • richardmitnick 9:16 am on August 20, 2018 Permalink | Reply
    Tags: Albert Einstein's theory of general relativity, , , , , , , , ,   

    From ARC Centres of Excellence via Science Alert: “We May Soon Know How a Crucial Einstein Principle Works in The Quantum Realm” 

    arc-centers-of-excellence-bloc

    From ARC Centres of Excellence

    via

    Science Alert

    (NiPlot/iStock)

    20 AUG 2018
    MICHELLE STARR

    The puzzle of how Einstein’s equivalence principle plays out in the quantum realm has vexed physicists for decades. Now two researchers may have finally figured out the key that will allow us to solve this mystery.

    Einstein’s physical theories have held up under pretty much every classical physics test thrown at them. But when you get down to the very smallest scales – the quantum realm – things start behaving a little bit oddly.

    The thing is, it’s not really clear how Einstein’s theory of general relativity and quantum mechanics work together. The laws that govern the two realms are incompatible with each other, and attempts to resolve these differences have come up short.

    But the equivalence principle – one of the cornerstones of modern physics – is an important part of general relativity. And if it can be resolved within the quantum realm, that may give us a toehold into resolving general relativity and quantum mechanics.

    The equivalence principle, in simple terms, means that gravity accelerates all objects equally, as can be observed in the famous feather and hammer experiment conducted by Apollo 15 Commander David Scott on the Moon.

    It also means that gravitational mass and inertial mass are equivalent; to put it simply, if you were sealed inside a chamber, like an elevator, you would be unable to tell whether the force you felt was gravity or an acceleration of the chamber equivalent to gravity. The effect is the same.

    “Einstein’s equivalence principle contends that the total inertial and gravitational mass of any objects are equivalent, meaning all bodies fall in the same way when subject to gravity,” explained physicist Magdalena Zych of the ARC Centre of Excellence for Engineered Quantum Systems in Australia.

    “Physicists have been debating whether the principle applies to quantum particles, so to translate it to the quantum world we needed to find out how quantum particles interact with gravity.

    “We realised that to do this we had to look at the mass.”
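    In Newtonian terms, the principle says the mass in F = ma and the mass in the law of gravitation are the same quantity, so the mass cancels and every object falls with the same acceleration. A small sketch (illustrative masses, not measured values):

```python
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_earth = 5.972e24   # Earth's mass, kg
r = 6.371e6          # distance from Earth's centre (the surface), m

def fall_acceleration(m_grav, m_inert):
    """Gravity pulls on the gravitational mass; inertia resists via the inertial mass."""
    force = G * M_earth * m_grav / r**2
    return force / m_inert

# With m_grav == m_inert (the equivalence principle), the mass cancels out:
feather = fall_acceleration(0.03, 0.03)   # kg
hammer = fall_acceleration(1.32, 1.32)    # kg
print(feather, hammer)   # identical accelerations, ~9.8 m/s^2
```

    Were the two masses ever found to differ, the two accelerations above would differ, and the feather-and-hammer demonstration would fail.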

    According to relativity, mass and energy are equivalent: a particle’s internal energy contributes to its mass. But in quantum mechanics, this gets a bit complicated. A quantum particle can occupy two different energy states, with different numerical values, at the same time, a configuration known as a superposition.

    And because it has a superposition of energy states, it also has a superposition of inertial masses.

    This means – theoretically, at least – that it should also have a superposition of gravitational masses. But the superposition of quantum particles isn’t accounted for by the equivalence principle.
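    Schematically (my notation, not the paper's): a particle in an equal superposition of two internal energy levels carries, through E = mc², a superposition of two masses.

```latex
|\psi\rangle = \frac{1}{\sqrt{2}}\bigl(|E_1\rangle + |E_2\rangle\bigr),
\qquad
m_i = m_{\text{rest}} + \frac{E_i}{c^2}, \quad i = 1, 2 .
```

    The question the researchers address is what the equivalence principle demands of the gravitational masses attached to such a state.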

    “We realised that we had to look at how particles in such quantum states of the mass behave in order to understand how a quantum particle sees gravity in general,” Zych said.

    “Our research found that for quantum particles in quantum superpositions of different masses, the principle implies additional restrictions that are not present for classical particles – this hadn’t been discovered before.”

    This discovery allowed the team to re-formulate the equivalence principle to account for the superposition of values in a quantum particle.

    The new formulation hasn’t yet been applied experimentally, but, the researchers said, it opens the door to experiments that could test the newly discovered restrictions.

    And it offers a new framework for testing the equivalence principle in the quantum realm – we can hardly wait.

    The team’s research has been published in the journal Nature Physics.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The objectives for the ARC Centres of Excellence are to:

    undertake highly innovative and potentially transformational research that aims to achieve international standing in the fields of research envisaged and leads to a significant advancement of capabilities and knowledge
    link existing Australian research strengths and build critical mass with new capacity for interdisciplinary, collaborative approaches to address the most challenging and significant research problems
    develop relationships and build new networks with major national and international centres and research programs to help strengthen research, achieve global competitiveness and gain recognition for Australian research
    build Australia’s human capacity in a range of research areas by attracting and retaining, from within Australia and abroad, researchers of high international standing as well as the most promising research students
    provide high-quality postgraduate and postdoctoral training environments for the next generation of researchers
    offer Australian researchers opportunities to work on large-scale problems over long periods of time
    establish Centres that have an impact on the wider community through interaction with higher education institutes, governments, industry and the private and non-profit sector.

     
  • richardmitnick 7:44 pm on April 24, 2018 Permalink | Reply
    Tags: Albert Einstein's theory of general relativity, “Newton was the first physicist” says Sylvester James Gates a physicist at Brown University, , Peter Woit - “When you go far enough back you really can’t tell who’s a physicist and who’s a mathematician”, , Riemannian geometry, , The relationship between physics and mathematics goes back to the beginning of both subjects   

    From Symmetry: “The coevolution of physics and math” 

    Symmetry Mag
    Symmetry

    04/24/18
    Evelyn Lamb

    Artwork by Sandbox Studio, Chicago

    Breakthroughs in physics sometimes require an assist from the field of mathematics—and vice versa.

    In 1912, Albert Einstein, then a 33-year-old theoretical physicist at the Eidgenössische Technische Hochschule in Zürich, was in the midst of developing an extension to his theory of special relativity.

    With special relativity, he had codified the relationship between the dimensions of space and time. Now, seven years later, he was trying to incorporate into his theory the effects of gravity. This feat—a revolution in physics that would supplant Isaac Newton’s law of universal gravitation and result in Einstein’s theory of general relativity—would require some new ideas.

    Fortunately, Einstein’s friend and collaborator Marcel Grossmann swooped in like a waiter bearing an exotic, appetizing delight (at least in a mathematician’s overactive imagination): Riemannian geometry.

    This mathematical framework, developed in the mid-19th century by German mathematician Bernhard Riemann, was something of a revolution itself. It represented a shift in mathematical thinking from viewing mathematical shapes as subsets of the three-dimensional space they lived in to thinking about their properties intrinsically. For example, a sphere can be described as the set of points in 3-dimensional space that lie exactly 1 unit away from a central point. But it can also be described as a 2-dimensional object that has particular curvature properties at every single point. This alternative definition isn’t terribly important for understanding the sphere itself but ends up being very useful with more complicated manifolds or higher-dimensional spaces.
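    The intrinsic viewpoint sketched above can be made concrete. By Gauss's Theorema Egregium, a sphere of radius $r$ has constant Gaussian curvature that a surveyor confined to the surface could measure, for instance by comparing the circumference of a circle of geodesic radius $\rho$ with its flat-space value:

```latex
K = \frac{1}{r^2},
\qquad
C(\rho) = 2\pi r \sin\!\left(\frac{\rho}{r}\right) \;<\; 2\pi\rho .
```

    The circumference deficit is detectable without ever leaving the surface, which is exactly the sense in which Riemannian geometry treats curvature as an intrinsic property.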

    By Einstein’s time, the theory was still new enough that it hadn’t completely permeated through mathematics, but it happened to be exactly what Einstein needed. Riemannian geometry gave him the foundation he needed to formulate the precise equations of general relativity. Einstein and Grossmann were able to publish their work later that year.

    “It’s hard to imagine how he would have come up with relativity without help from mathematicians,” says Peter Woit, a theoretical physicist in the Mathematics Department at Columbia University.

    The story of general relativity could go to mathematicians’ heads. Here mathematics seems to be a benevolent patron, blessing the benighted world of physics with just the right equations at the right time.

    But of course the interplay between mathematics and physics is much more complicated than that. They weren’t even separate disciplines for most of recorded history. Ancient Greek, Egyptian and Babylonian mathematics took as an assumption the fact that we live in a world in which distance, time and gravity behave in a certain way.

    “Newton was the first physicist,” says Sylvester James Gates, a physicist at Brown University. “In order to reach the pinnacle, he had to invent a new piece of mathematics; it’s called calculus.”

    Calculus made some classical geometry problems easier to solve, but its foremost purpose to Newton was to give him a way to analyze the motion and change he observed in physics. In that story, mathematics is perhaps more of a butler, hired to help keep the affairs in order, than a savior.

    Even after physics and mathematics began their separate evolutionary paths, the disciplines were closely linked. “When you go far enough back, you really can’t tell who’s a physicist and who’s a mathematician,” Woit says. (As a mathematician, I was a bit scandalized the first time I saw Emmy Noether’s name attached to physics! I knew her primarily through abstract algebra.)

    Throughout the history of the two fields, mathematics and physics have each contributed important ideas to the other. Mathematician Hermann Weyl’s work on mathematical objects called Lie groups provided an important basis for understanding symmetry in quantum mechanics.

    In his 1930 book The Principles of Quantum Mechanics, theoretical physicist Paul Dirac introduced the Dirac delta function to help describe the concept in particle physics of a pointlike particle—anything so small that it would be modeled by a point in an idealized situation. A picture of the Dirac delta function looks like a horizontal line lying along the bottom of the x axis of a graph, at x=0, except at the place where it intersects with the y axis, where it explodes into a line pointing up to infinity. Dirac declared that the integral of this function, the measure of the area underneath it, was equal to 1.

    Strictly speaking, no such function exists, but Dirac’s use of the Dirac delta eventually spurred mathematician Laurent Schwartz to develop the theory of distributions in a mathematically rigorous way. Today distributions are extraordinarily useful in the mathematical fields of ordinary and partial differential equations.
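    Dirac's heuristic can be made respectable numerically: replace the delta with Gaussians of shrinking width. The sketch below checks that the area under each approximation stays 1 while its integral against a test function converges to that function's value at zero.

```python
import numpy as np

def delta_eps(x, eps):
    """Gaussian of width eps, normalized to unit area: a stand-in for Dirac's delta."""
    return np.exp(-x**2 / (2 * eps**2)) / (eps * np.sqrt(2 * np.pi))

x = np.linspace(-10, 10, 200001)
dx = x[1] - x[0]
for eps in (1.0, 0.1, 0.01):
    area = float(np.sum(delta_eps(x, eps)) * dx)              # stays ~1 at every width
    sift = float(np.sum(np.cos(x) * delta_eps(x, eps)) * dx)  # -> cos(0) = 1 as eps -> 0
    print(f"eps={eps}: area={area:.6f}, sift={sift:.6f}")
```

    The "sifting" integral is exactly the property ∫ f(x) δ(x) dx = f(0) that Schwartz's distributions later made rigorous.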

    Though modern researchers focus their work more and more tightly, the line between physics and mathematics is still a blurry one. A physicist has won the Fields Medal, one of the most prestigious accolades in mathematics. And a mathematician, Maxim Kontsevich, has won the new Breakthrough Prizes in both mathematics and physics. One can attend seminar talks about quantum field theory, black holes, and string theory in both math and physics departments. Since 2011, the annual String Math conference has brought mathematicians and physicists together to work on the intersection of their fields in string theory and quantum field theory.

    String theory is perhaps the best recent example of the interplay between mathematics and physics, for reasons that eventually bring us back to Einstein and the question of gravity.

    String theory is a theoretical framework in which those pointlike particles Dirac was describing become one-dimensional objects called strings. Part of the theoretical model for those strings corresponds to gravitons, theoretical particles that carry the force of gravity.

    Most humans will tell you that we perceive the universe as having three spatial dimensions and one dimension of time. But string theory naturally lives in 10 dimensions. In 1984, as the number of physicists working on string theory ballooned, a group of researchers including Edward Witten, the physicist who was later awarded a Fields Medal, discovered that the extra six dimensions of string theory needed to be part of a space known as a Calabi-Yau manifold.

    When mathematicians joined the fray to try to figure out what structures these manifolds could have, physicists were hoping for just a few candidates. Instead, they found boatloads of Calabi-Yaus. Mathematicians still have not finished classifying them. They haven’t even determined whether their classification has a finite number of pieces.

    As mathematicians and physicists studied these spaces, they discovered an interesting duality between Calabi-Yau manifolds. Two manifolds that seem completely different can end up describing the same physics. This idea, called mirror symmetry, has blossomed in mathematics, leading to entire new research avenues. The framework of string theory has almost become a playground for mathematicians, yielding countless new avenues of exploration.

    Mina Aganagic, a theoretical physicist at the University of California, Berkeley, believes string theory and related topics will continue to provide these connections between physics and math.

    “In some sense, we’ve explored a very small part of string theory and a very small number of its predictions,” she says. Mathematicians and their focus on detailed rigorous proofs bring one point of view to the field, and physicists, with their tendency to prioritize intuitive understanding, bring another. “That’s what makes the relationship so satisfying.”

    The relationship between physics and mathematics goes back to the beginning of both subjects; as the fields have advanced, this relationship has gotten more and more tangled, a complicated tapestry. There is seemingly no end to the places where a well-placed set of tools for making calculations could help physicists, or where a probing question from physics could inspire mathematicians to create entirely new mathematical objects or theories.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 8:22 pm on March 18, 2018 Permalink | Reply
    Tags: Albert Einstein's theory of general relativity, , Mathematics vs Physics, , , , Shake a Black Hole, , The black hole stability conjecture   

    From Quanta: “To Test Einstein’s Equations, Poke a Black Hole” 

    Quanta Magazine
    Quanta Magazine

    mathematical physics
    https://sciencesprings.wordpress.com/2018/03/17/from-ethan-siegel-where-is-the-line-between-mathematics-and-physics/

    March 8, 2018
    Kevin Hartnett

    Fantastic animation. Olena Shmahalo/Quanta Magazine

    In November 1915, in a lecture before the Prussian Academy of Sciences, Albert Einstein described an idea that upended humanity’s view of the universe. Rather than accepting the geometry of space and time as fixed, Einstein explained that we actually inhabit a four-dimensional reality called space-time whose form fluctuates in response to matter and energy.

    Einstein elaborated this dramatic insight in several equations, referred to as his “field equations,” that form the core of his theory of general relativity. That theory has been vindicated by every experimental test thrown at it in the century since.

    Yet even as Einstein’s theory seems to describe the world we observe, the mathematics underpinning it remain largely mysterious. Mathematicians have been able to prove very little about the equations themselves. We know they work, but we can’t say exactly why. Even Einstein had to fall back on approximations, rather than exact solutions, to see the universe through the lens he’d created.

    Over the last year, however, mathematicians have brought the mathematics of general relativity into sharper focus. Two groups have come up with proofs related to an important problem in general relativity called the black hole stability conjecture. Their work proves that Einstein’s equations match a physical intuition for how space-time should behave: If you jolt it, it shakes like Jell-O, then settles down into a stable form like the one it began with.

    “If these solutions were unstable, that would imply they’re not physical. They’d be a mathematical ghost that exists mathematically and has no significance from a physical point of view,” said Sergiu Klainerman, a mathematician at Princeton University and co-author, with Jérémie Szeftel, of one of the two new results [https://arxiv.org/abs/1711.07597].

    To complete the proofs, the mathematicians had to resolve a central difficulty with Einstein’s equations. To describe how the shape of space-time evolves, you need a coordinate system — like lines of latitude and longitude — that tells you which points are where. And in space-time, as on Earth, it’s hard to find a coordinate system that works everywhere.

    Shake a Black Hole

    General relativity famously describes space-time as something like a rubber sheet. Absent any matter, the sheet is flat. But start dropping balls onto it — stars and planets — and the sheet deforms. The balls roll toward one another. And as the objects move around, the shape of the rubber sheet changes in response.

    Einstein’s field equations describe the evolution of the shape of space-time. You give the equations information about curvature and energy at each point, and the equations tell you the shape of space-time in the future. In this way, Einstein’s equations are like equations that model any physical phenomenon: This is where the ball is at time zero, this is where it is five seconds later.
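    The field equations themselves are compact; in the usual notation (with the cosmological constant omitted):

```latex
G_{\mu\nu} \;\equiv\; R_{\mu\nu} - \tfrac{1}{2} R \, g_{\mu\nu}
\;=\; \frac{8\pi G}{c^{4}} \, T_{\mu\nu} .
```

    The left-hand side encodes the curvature of space-time, built from the metric $g_{\mu\nu}$; the right-hand side, the stress-energy tensor $T_{\mu\nu}$, encodes its matter and energy content. Given the sources, one solves for the metric.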

    “They’re a mathematically precise quantitative version of the statement that space-time curves in the presence of matter,” said Peter Hintz, a Clay research fellow at the University of California, Berkeley, and co-author, with András Vasy, of the other recent result [https://arxiv.org/abs/1606.04014].

    In 1916, almost immediately after Einstein released his theory of general relativity, the German physicist Karl Schwarzschild found an exact solution to the equations that describes what we now know as a black hole (the term wouldn’t be invented for another five decades). Later, physicists found exact solutions that describe a rotating black hole and one with an electrical charge.
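    Schwarzschild's solution can be written down explicitly. In spherical coordinates, for a non-rotating, uncharged mass $M$:

```latex
ds^{2} = -\left(1 - \frac{r_s}{r}\right) c^{2}\, dt^{2}
       + \left(1 - \frac{r_s}{r}\right)^{-1} dr^{2}
       + r^{2}\, d\Omega^{2},
\qquad r_s = \frac{2GM}{c^{2}} .
```

    The coefficients degenerate at $r = r_s$, the event horizon; this is a coordinate artifact of the chart, of the kind discussed below, rather than a physical singularity.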

    These remain the only exact solutions that describe a black hole. If you add even a second black hole, the interplay of forces becomes too complicated for present-day mathematical techniques to handle in all but the most special situations.

    Yet you can still ask important questions about this limited group of solutions. One such question developed out of work in 1952 by the French mathematician Yvonne Choquet-Bruhat. It asks, in effect: What happens when you shake a black hole?

    Lucy Reading-Ikkanda/Quanta Magazine

    This problem is now known as the black hole stability conjecture. The conjecture predicts that solutions to Einstein’s equations will be “stable under perturbation.” Informally, this means that if you wiggle a black hole, space-time will shake at first, before eventually settling down into a form that looks a lot like the form you started with. “Roughly, stability means if I take special solutions and perturb them a little bit, change data a little bit, then the resulting dynamics will be very close to the original solution,” Klainerman said.

    So-called “stability” results are an important test of any physical theory. To understand why, it’s useful to consider an example that’s more familiar than a black hole.

    Imagine a pond. Now imagine that you perturb the pond by tossing in a stone. The pond will slosh around for a bit and then become still again. Mathematically, the solutions to whatever equations you use to describe the pond (in this case, the Navier-Stokes equations) should describe that basic physical picture. If the initial and long-term solutions don’t match, you might question the validity of your equations.
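    A minimal numerical sketch of this kind of stability, with a damped oscillator standing in for the pond (the parameters are arbitrary): kick the system at t = 0 and check that the response decays back toward equilibrium.

```python
# Damped oscillator  x'' + 2*gamma*x' + omega0^2 * x = 0
gamma, omega0 = 0.1, 2.0      # damping rate, natural frequency
x, v, dt = 1.0, 0.0, 0.001    # initial perturbation x(0) = 1

peak_late = 0.0
for step in range(60000):     # integrate to t = 60 with semi-implicit Euler
    v += (-2 * gamma * v - omega0**2 * x) * dt
    x += v * dt
    if step * dt > 50:        # record the worst excursion at late times
        peak_late = max(peak_late, abs(x))

print(f"late-time amplitude ~ {peak_late:.4f} (initial kick was 1.0)")
```

    The late-time amplitude shrinks like exp(-gamma * t); proving the analogous decay for gravitational-wave ripples on a black hole background, in a suitable gauge, is the hard content of the stability proofs discussed below.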

    “This equation might have whatever properties, it might be perfectly fine mathematically, but if it goes against what you expect physically, it can’t be the right equation,” Vasy said.

    For mathematicians working on Einstein’s equations, stability proofs have been even harder to find than solutions to the equations themselves. Consider the case of flat, empty Minkowski space — the simplest of all space-time configurations. This solution to Einstein’s equations was found in 1908 in the context of Einstein’s earlier theory of special relativity. Yet it wasn’t until 1993 that mathematicians managed to prove that if you wiggle flat, empty space-time, you eventually get back flat, empty space-time. That result, by Klainerman and Demetrios Christodoulou, is a celebrated work in the field.

    One of the main difficulties with stability proofs has to do with keeping track of what is going on in four-dimensional space-time as the solution evolves. You need a coordinate system that allows you to measure distances and identify points in space-time, just as lines of latitude and longitude allow us to define locations on Earth. But it’s not easy to find a coordinate system that works at every point in space-time and then continues to work as the shape of space-time evolves.

    “We don’t know of a one-size-fits-all way to do this,” Hintz wrote in an email. “After all, the universe does not hand you a preferred coordinate system.”

    The Measurement Problem

    The first thing to recognize about coordinate systems is that they’re a human invention. The second is that not every coordinate system works to identify every point in a space.

    Take lines of latitude and longitude: They’re arbitrary. Cartographers could have anointed any number of imaginary lines to be 0 degrees longitude.


    And while latitude and longitude work to identify just about every location on Earth, they stop making sense at the North and South poles. If you knew nothing about Earth itself, and only had access to latitude and longitude readings, you might wrongly conclude there’s something topologically strange going on at those points.
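    The breakdown at the poles corresponds to a concrete degeneracy. In latitude-longitude-type coordinates $(\theta, \varphi)$ the round metric on a sphere of radius $r$ reads:

```latex
ds^{2} = r^{2}\left( d\theta^{2} + \sin^{2}\!\theta \, d\varphi^{2} \right) .
```

    At $\theta = 0$ and $\theta = \pi$ the coefficient of $d\varphi^{2}$ vanishes, so every value of $\varphi$ labels the same point: a singularity of the chart, not of the sphere.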

    This possibility — of drawing wrong conclusions about the properties of physical space because the coordinate system used to describe it is inadequate — is at the heart of why it’s hard to prove the stability of space-time.

    “It could be the case that stability is true, but you’re using coordinates that are not stable and thus you miss the fact that stability is true,” said Mihalis Dafermos, a mathematician at the University of Cambridge and a leading figure in the study of Einstein’s equations.

    In the context of the black hole stability conjecture, whatever coordinate system you’re using has to evolve as the shape of space-time evolves — like a snugly fitting glove adjusting as the hand it encloses changes shape. The fit between the coordinate system and space-time has to be good at the start and remain good throughout. If it doesn’t, there are two things that can happen that would defeat efforts to prove stability.

    First, your coordinate system might change shape in a way that makes it break down at certain points, just as latitude and longitude fail at the poles. Such points are called “coordinate singularities” (to distinguish them from physical singularities, like an actual black hole). They are undefined points in your coordinate system that make it impossible to follow an evolving solution all the way through.

    Second, a poorly fitting coordinate system might disguise the underlying physical phenomena it’s meant to measure. To prove that solutions to Einstein’s equations settle down into a stable state after being perturbed, mathematicians must keep careful track of the ripples in space-time that are set in motion by the perturbation. To see why, it’s worth considering the pond again. A rock thrown into a pond generates waves. The long-term stability of the pond results from the fact that those waves decay over time — they grow smaller and smaller until there’s no sign they were ever there.

    The situation is similar for space-time. A perturbation will set off a cascade of gravitational waves, and proving stability requires proving that those gravitational waves decay. And proving decay requires a coordinate system — referred to as a “gauge” — that allows you to measure the size of the waves. The right gauge allows mathematicians to see the waves flatten and eventually disappear altogether.

    “The decay has to be measured relative to something, and it’s here where the gauge issue shows up,” Klainerman said. “If I’m not in the right gauge, even though in principle I have stability, I can’t prove it because the gauge will just not allow me to see that decay. If I don’t have decay rates of waves, I can’t prove stability.”

    The trouble is, while the coordinate system is crucial, it’s not obvious which one to choose. “You have a lot of freedom about what this gauge condition can be,” Hintz said. “Most of these choices are going to be bad.”

    Partway There

    A full proof of the black hole stability conjecture requires proving that all known black hole solutions to Einstein’s equations (with the spin of the black hole below a certain threshold) are stable after being perturbed. These known solutions include the Schwarzschild solution, which describes space-time with a nonrotating black hole, and the Kerr family of solutions, which describe configurations of space-time empty of everything save a single rotating black hole (where the properties of that rotating black hole — its mass and angular momentum — vary within the family of solutions).

    Both of the new results make partial progress toward a proof of the full conjecture.

    Hintz and Vasy, in a paper posted to the scientific preprint site arxiv.org in 2016 [see above 1606.04014], proved that slowly rotating black holes are stable. But their work did not cover black holes rotating above a certain threshold.

    Their proof also makes some assumptions about the nature of space-time. The original conjecture is in Minkowski space, which is not just flat and empty but also fixed in size. Hintz and Vasy’s proof takes place in what’s called de Sitter space, where space-time is accelerating outward, just like in the actual universe. This change of setting makes the problem simpler from a technical point of view, which is easy enough to appreciate at a conceptual level: If you drop a rock into an expanding pond, the expansion is going to stretch the waves and cause them to decay faster than they would have if the pond were not expanding.

    “You’re looking at a universe undergoing an accelerated expansion,” Hintz said. “This makes the problem a little easier as it appears to dilute the gravitational waves.”
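The dilution intuition can be sketched numerically (our toy model, with arbitrary numbers): in an exponentially expanding background with Hubble rate $H$, a wave's amplitude picks up an extra damping factor roughly like $e^{-Ht}$, so decay that is merely polynomial in a static setting becomes exponential:

```python
import math

def static_decay(t, p=2.0):
    """Toy polynomial decay of a wave amplitude in a non-expanding setting."""
    return 1.0 / (1.0 + t) ** p

def expanding_decay(t, p=2.0, hubble=0.5):
    """Same toy decay with extra exponential damping from the expansion."""
    return static_decay(t, p) * math.exp(-hubble * t)

# Expansion only speeds up the decay: at every positive time the
# amplitude in the expanding background is strictly smaller.
for t in [1.0, 5.0, 10.0, 25.0]:
    assert expanding_decay(t) < static_decay(t)
```

Faster decay is what makes the de Sitter setting technically more forgiving than the original one.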

    Klainerman and Szeftel’s work has a slightly different flavor. Their proof, the first part of which was posted online last November [see above 1711.07597], takes place in Schwarzschild space-time — closer to the original, more difficult setting for the problem. They prove the stability of a nonrotating black hole, but they do not address solutions in which the black hole is spinning. Moreover, they only prove the stability of black hole solutions for a narrow class of perturbations — where the gravitational waves generated by those perturbations are symmetric in a certain way.

    Both results involve new techniques for finding the right coordinate system for the problem. Hintz and Vasy start with an approximate solution to the equations, based on an approximate coordinate system, and gradually increase the precision of their answer until they arrive at exact solutions and well-behaved coordinates. Klainerman and Szeftel take a more geometric approach to the challenge.
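The "start with an approximation and refine" strategy is, in spirit, a Newton-type iteration. As a loose analogy only (not the actual Hintz-Vasy machinery, which operates on nonlinear partial differential equations), here is Newton's method refining a rough guess into a solution exact to machine precision:

```python
def newton(f, df, x0, tol=1e-12, max_iter=50):
    """Refine an approximate solution of f(x) = 0, one correction at a time."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Toy problem: solve x^2 - 2 = 0 starting from the crude guess x = 1.
root = newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, x0=1.0)
assert abs(root - 2.0 ** 0.5) < 1e-10
```

Each pass measures how far the current guess misses the equation and corrects accordingly; the PDE analogue must additionally correct the coordinate system at every stage.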

    The two teams are now trying to build on their respective methods to find a proof of the full conjecture. Some expert observers think the day might not be far off.

    “I really think things are now at the stage that the remaining difficulties are just technical,” Dafermos said. “Somehow one doesn’t need new ideas to solve this problem.” He emphasized that a final proof could come from any one of the large number of mathematicians currently working on the problem.

    For 100 years Einstein’s equations have served as a reliable experimental guide to the universe. Now mathematicians may be getting closer to demonstrating exactly why they work so well.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Formerly known as Simons Science News, Quanta Magazine is an editorially independent online publication launched by the Simons Foundation to enhance public understanding of science. Why Quanta? Albert Einstein called photons “quanta of light.” Our goal is to “illuminate science.” At Quanta Magazine, scientific accuracy is every bit as important as telling a good story. All of our articles are meticulously researched, reported, edited, copy-edited and fact-checked.

     