Tagged: Albert Einstein’s theory of general relativity

  • richardmitnick 10:45 am on November 26, 2022 Permalink | Reply
    Tags: "Black holes explained", Albert Einstein's theory of general relativity

    From The University of Chicago: “Black holes explained” 


    From The University of Chicago

    Oct 13, 2022 [Just found this.]
    Louise Lerner

    Messier 87* vs Sagittarius A*, via The Event Horizon Telescope. Credit: The European Southern Observatory [La Observatorio Europeo Austral] [Observatoire européen austral][Europäische Südsternwarte](EU)(CL).

    Black holes are regions in space where an enormous amount of mass is packed into a tiny volume. This creates a gravitational pull so strong that not even light can escape. They are created when giant stars collapse, and perhaps by other methods that are still unknown.

    Black holes fascinate both the public and scientists—they push the limits of our understanding about matter, space and time.

    Scientists at the University of Chicago and across the world have made many discoveries about our universe with the help of black holes, but there’s a lot we still don’t know about these extraordinary cosmic phenomena.

    What is a black hole?

    Black holes are made of matter packed so tightly that gravity overwhelms all other forces.

    When you pick up a bowling ball, it’s heavy because the matter is densely packed. If you packed more and more mass into the same tiny space, eventually it would create gravity so strong that it would exert a significant pull on passing rays of light.

    Black holes are created when massive stars collapse at the end of their lives (and perhaps under other circumstances that we don’t know about yet).

    One of the first steps toward the discovery of black holes was made by University of Chicago Professor and Nobel laureate Subrahmanyan Chandrasekhar [below], when he realized that massive stars would have to collapse after they ran out of fuel for the fusion reactions which keep them hot and bright.

    The universe is full of black holes. In the past decade, scientists have detected the signals of their collisions and taken images of the light from the gas swirling around them—and this has helped us learn many things about the universe. For example, black holes have helped us test Albert Einstein’s Theory of General Relativity, which describes how mass, space, and time are related to one another. Scientists think they can tell us much more about these and other essential rules of the universe.

    And on a more personal level, the supermassive black hole at the center of our own Milky Way galaxy may have played a role in how Earth came to be here!

    What do black holes look like?

    Black holes themselves are invisible—they emit virtually no light and so cannot be seen directly. But we have developed several ways to find them anyway.

    By looking for the stuff that’s falling in. If material is falling into a black hole, it travels at such high speeds that it gets hot and glows very brightly, and we can detect that. (That’s how the Event Horizon Telescope took its famous first images of black holes.) Scientists hope to use this method to learn a lot more about how and what black holes “eat.”

    By seeing their gravity pulling on other things. We can find black holes by watching the movements of visible objects around them. For example, a black hole’s gravity is so strong that nearby stars will orbit around it, so we can look for stars behaving strangely around a patch of “empty” space. From this, we can calculate exactly how heavy that black hole must be. That’s how Nobel Prize winner Andrea Ghez and her team detected the supermassive black hole at the center of our own galaxy.

    By detecting the gravitational ripples when they collide. We can also detect black holes by detecting the ripples in space-time created when two of them crash into each other. From that signal, we can tell how massive the black holes were, how far away they were, and how fast they were traveling when they collided.

    What’s inside a black hole?

    The short answer is that no one knows!

    “In some ways that’s one of the most profound questions in physics,” said University of Chicago Prof. Daniel Holz. “There are not many cases in physics where we simply cannot predict what happens, but this is one of them.”

    Black holes have two parts. There is the event horizon, which you can think of as the surface, though it’s simply the point where the gravity gets too strong for anything to escape. And then, at the center, is the singularity. That’s the word we use to describe a point that is infinitely small and infinitely dense.

    We have a good understanding of what the event horizon looks like, thanks to the laws of general relativity. But as you get close to the singularity itself, we lose the ability to even predict what it looks like.

    “Very near the singularity, one would expect quantum effects to become important. However, we don’t yet have a quantum theory of gravity (or, at least, one capable of reliably making such predictions), so we just don’t know the correct description of the singularity—or even whether it really is a singularity,” said University of Chicago Prof. Robert Wald.

    Scientists think that black holes eventually will explode, but it will take many, many times longer than the current age of the universe for that to happen. What will it look like when that happens? That’s another big mystery.

    “Maybe there’s a little nugget left behind containing all of the information that fell into the black hole, maybe there’s a portal to a new universe, maybe the information is just gone forever; we simply don’t know,” said Holz.

    (If all of this is unsatisfying, know that it keeps scientists awake at night, too.)

    How do black holes form?

    Scientists know about one way that black holes form, but there may be others.

    One way to make a black hole is to have a massive star collapse at the end of its life. Prof. Subrahmanyan Chandrasekhar was the first to calculate that when a massive star burns up all its fuel, it will collapse. The idea was ridiculed at first, but other scientists calculated that the star continues forever to fall inward toward its center—thus creating what we call a black hole.

    Black holes can grow more massive over time as they “eat” gas, stars, planets and even other black holes!

    There’s another type of black hole called a supermassive black hole. These are way too massive to have been created by one star collapsing; it’s still a mystery how they form. Black holes can eat other black holes, so it’s possible that the supermassive ones are made of many small black holes merged together. “Or perhaps these big black holes were especially hungry, and ate so much of their surroundings that they grew to enormous size,” said Prof. Holz. But we can see these supermassive black holes formed very early on in the universe—maybe too early to have been made by stars getting old enough to collapse—so it’s possible there’s some other way to make a black hole that we don’t know about yet.

    An artist’s rendition of a supermassive black hole being a “messy eater.” Sometimes matter can be flung off at high speeds in the form of jets—which may create the conditions for stars to form. © M. Kornmesser/ESO.

    What is a supermassive black hole?

    There are two kinds of black holes: stellar mass black holes and supermassive black holes.

    Supermassive black holes are so named because they contain on the order of millions to billions of times the mass of our sun.

    As far as we can tell, nearly every galaxy in the universe has one of these supermassive black holes sitting right at its center like a seed. And they are correlated—a bigger galaxy has a bigger black hole, and a smaller galaxy has a smaller black hole. All of this makes scientists think these supermassive black holes have something to do with how the galaxies formed. But that relationship is still a mystery, and so is how the supermassive black holes formed in the first place.

    Our “neighborhood” supermassive black hole, the one at the center of our own Milky Way galaxy, is called Sagittarius A* (pronounced A-star).

    It’s about 15 million miles across and contains the equivalent of 4 million suns’ worth of mass. Don’t worry; it’s much too far away to pose any danger to Earth.

    What do black holes eat?

    Contrary to what you may have seen in movies, black holes don’t actually “suck” things in. For example, there are stars orbiting the supermassive black hole at the center of our galaxy, and they’ll keep orbiting without falling in unless something else disturbs them [see “Star S0-2” above]. An object really has to fall right into the mouth of a black hole for it to be eaten. (And the mouth of a black hole, which we call the event horizon, is tiny; if the entire Earth were to collapse and form a black hole, its mouth would be less than an inch across!)
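Both sizes quoted here (Sagittarius A* at roughly 15 million miles across, and an Earth-mass black hole under an inch) follow from the Schwarzschild radius formula, r_s = 2GM/c². A quick sketch of the arithmetic, using standard values for the constants:

```python
# Back-of-the-envelope check of the two horizon sizes mentioned above,
# using the Schwarzschild radius r_s = 2GM/c^2 (the event-horizon radius
# of a non-rotating black hole). Constants are standard SI values.

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
M_sun = 1.989e30   # solar mass, kg
M_earth = 5.972e24 # Earth mass, kg

def horizon_diameter_m(mass_kg):
    """Event-horizon diameter (2 * r_s) in meters for a given mass."""
    return 2 * (2 * G * mass_kg / c**2)

# Sagittarius A*: about 4 million solar masses.
sgr_a_miles = horizon_diameter_m(4e6 * M_sun) / 1609.34
print(f"Sgr A* horizon diameter: {sgr_a_miles / 1e6:.1f} million miles")

# Earth collapsed into a black hole: its "mouth" in inches.
earth_inches = horizon_diameter_m(M_earth) / 0.0254
print(f"Earth-mass horizon diameter: {earth_inches:.2f} inches")
```

This lands at roughly 15 million miles for Sagittarius A* and well under an inch for an Earth-mass black hole, matching the figures in the text.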

    But the movements of stars and galaxies do sometimes mean that stuff falls into a black hole’s mouth. Sagittarius A*, the black hole at the center of our galaxy, mostly eats interstellar gas and dust that is drifting around. With telescopes, we have seen other black holes eating stars and even the gas from neighboring galaxies.

    Black holes can be “messy eaters.” As objects are being ripped apart, some of the gas and matter can be flung off at high speeds. Sometimes this is so powerful that it forms jets and winds shooting outwards at nearly the speed of light, and this can affect the galaxy containing it. These jets can blow apart nearby stars and planets; or they can provide just the right amount of churn to create the ideal conditions for making new stars over millions of years.

    How were black holes discovered?

    The first inkling that anyone had about black holes came when 19-year-old astrophysicist Subrahmanyan Chandrasekhar was mulling over the consequences of several recent discoveries, including Albert Einstein’s Theory of Special Relativity.

    Prof. Subrahmanyan Chandrasekhar.

    He calculated that all stars larger than 1.4 times the mass of our sun would eventually run out of fuel and collapse.

    Black Holes: The Day Tomorrow Began at the University of Chicago.
    Learn how Prof. Subrahmanyan Chandrasekhar’s pioneering research—once ridiculed by his peers—paved the way to the discovery of black holes. Video by UChicago Creative.

    Scientists at the time were shocked and skeptical. The most famous astrophysicist of the era, Arthur Eddington, publicly trashed the idea at a gathering, saying, “I think there should be a law of nature to prevent a star from behaving in this absurd way!”

    However, the damage was done. “Once the astrophysics community had come to grips with a calculation performed by a 19-year-old student sailing off to graduate school, the heavens could never again be seen as a perfect and tranquil dominion,” physicist Freeman Dyson later wrote.

    Scientists soon worked out that other laws, including Albert Einstein’s Theory of General Relativity, required black holes to exist.

    The idea became increasingly accepted. In the latter half of the 20th century, eminent theoretical scientists, including Stephen Hawking at Cambridge, John Wheeler and Jacob Bekenstein at Princeton, Chandrasekhar and Robert Wald at the University of Chicago, and many others, explored the details of the mathematics and physics behind black holes.

    Meanwhile, evidence from telescopes began to pile up that black holes were out there in the universe.

    In the 1960s, quasars were discovered—faraway objects that were emitting such strong radiation that there was no explanation other than gigantic black holes chewing up and spitting out matter.

    Throughout the 1990s, scientists including Andrea Ghez and Reinhard Genzel precisely tracked the movements of stars around the center of our galaxy, proving they were orbiting around something invisible but so massive that it had to be a black hole. (They would receive the Nobel Prize in 2020 for this work [above].)

    Then, in 2015, two special detectors known as the Laser Interferometer Gravitational-Wave Observatory (LIGO) [above] picked up the ripples from a pair of black holes colliding. (This also received a Nobel Prize, in 2017.) They have since detected nearly 100 such collisions.

    In 2019, the Event Horizon Telescope, a collection of telescopes around the world acting in concert, was able to take an image of the gas swirling around a gigantic black hole in another galaxy.

    Event Horizon Telescope Array

    The locations of the radio dishes that are part of the Event Horizon Telescope array. Image credit: Event Horizon Telescope, via University of Arizona.

    About the Event Horizon Telescope (EHT)

    The EHT consortium consists of 13 stakeholder institutes: The Academia Sinica Institute of Astronomy & Astrophysics [中央研究院天文及天文物理研究所](TW), The University of Arizona, The University of Chicago, The East Asian Observatory, Goethe University Frankfurt [Goethe-Universität](DE), Institut de Radioastronomie Millimétrique, Large Millimeter Telescope, The MPG Institute for Radio Astronomy [MPG Institut für Radioastronomie](DE), MIT Haystack Observatory, The National Astronomical Observatory of Japan [国立天文台](JP), The Perimeter Institute for Theoretical Physics (CA), Radboud University [Radboud Universiteit](NL) and The Center for Astrophysics | Harvard & Smithsonian.

    They followed this in 2022 with an image of our “own” black hole—the one that sits in the center of the Milky Way [above]. We are making more discoveries all the time!

    What do black holes tell us about the universe?

    Black holes are kind of like a playground for physicists. “They are literally made out of space and time,” said Prof. Holz. Because they are so extreme, they are the perfect place to test the limits of the rules of the universe.

    Observing them and thinking about their properties have yielded enormous insights about the nature of the universe. For example, detecting their collisions allowed us to test Einstein’s theories about how mass, space, and time are related (as well as lots of other theories about the universe). Black holes also seem to play a role in the formation of galaxies; it’s likely our supermassive black hole has something to do with how we came to be here today.

    Some other things we can learn about the universe include:

    Understanding extreme physics and how stars and planets grow. Some supermassive black holes are extremely active, gobbling up stars amid swirling magnetic fields and flinging out jets of superheated gas and material; these systems are known as quasars. Watching this process can tell us about the physics of these extreme environments. It can also tell us about the conditions under which stars, planets, and galaxies are born, grow, and die.

    Understanding how fast the universe is expanding and therefore how it evolved. As we get more and more data on pairs of black holes colliding, Holz and others have worked out methods to use them to calculate how fast the universe is expanding. This number, called the Hubble Constant, is key to understanding the past, present, and future behavior of the universe, as well as the nature of dark matter and dark energy.
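The idea behind these "standard siren" measurements can be sketched very simply. The gravitational-wave signal from a merger gives the distance to the event directly, the host galaxy's redshift gives the recession velocity, and for nearby events the Hubble constant is approximately velocity divided by distance. The numbers below are illustrative, not from any real event:

```python
# Toy "standard siren" estimate of the Hubble constant. A black hole
# merger's gravitational-wave signal yields its distance directly; the
# host galaxy's redshift yields its recession velocity (v ≈ c*z for
# small z). Then H0 ≈ v / d. Numbers here are purely illustrative.

c_km_s = 2.998e5  # speed of light, km/s

def hubble_constant(redshift, distance_mpc):
    """H0 in km/s/Mpc from a low-redshift event, using v = c*z."""
    return c_km_s * redshift / distance_mpc

# Hypothetical merger: redshift 0.023 at a measured distance of 100 Mpc.
h0 = hubble_constant(0.023, 100.0)
print(f"H0 ≈ {h0:.1f} km/s/Mpc")
```

Real analyses are far more involved (they marginalize over the binary's orientation, the detector noise, and peculiar velocities), but this division is the core of the method.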

    Reconciling our major theories of the universe. One of the most fundamental questions in modern physics is how to reconcile quantum mechanics, which is the law for the very smallest particles in the universe, with general relativity, which is the law for the very biggest things in the universe. These two sets of laws don’t quite match up. But black holes are the perfect place to explore the links between them.

    For example, Stephen Hawking theorized that the laws of Quantum Mechanics suggest that black holes have a very tiny temperature—which was surprising to scientists, since that implies some radiation is leaving the black hole. This has all sorts of implications for our understanding of physics. (One such implication: Black holes should eventually lose mass faster and faster over time until they explode. However, that will take trillions upon trillions of years to happen, so none of us will be around when it does.)
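Just how tiny that temperature is follows from Hawking's formula, T = ħc³ / (8πGMk_B): the more massive the black hole, the colder it is. A quick calculation with standard constants shows a black hole with the mass of our sun would sit at only a few tens of billionths of a degree above absolute zero:

```python
import math

# Hawking temperature of a black hole: T = ħ c^3 / (8 π G M k_B).
# Standard physical constants, SI units.
hbar = 1.0546e-34  # reduced Planck constant, J s
c = 2.998e8        # speed of light, m/s
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
k_B = 1.381e-23    # Boltzmann constant, J/K
M_sun = 1.989e30   # solar mass, kg

def hawking_temperature(mass_kg):
    """Temperature in kelvin of a black hole with the given mass."""
    return hbar * c**3 / (8 * math.pi * G * mass_kg * k_B)

T = hawking_temperature(M_sun)
print(f"Solar-mass black hole: T ≈ {T:.1e} K")  # tens of nanokelvin
```

Because the temperature rises as the mass shrinks, evaporation runs away: the black hole radiates faster and faster as it loses mass, which is why the process ends in an explosion, albeit an unimaginably distant one.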

    How was the first picture of a black hole taken?

    In 2019, people around the world were thrilled to see the first image ever taken of a black hole—and then, in 2022, an image of our “personal” supermassive black hole in the Milky Way. The bright ring around each one is created by material glowing very hot as it circles the black hole.

    This was the first direct image of a black hole ever taken—all of the other pictures you’ve seen are simulations or artist illustrations.

    These black holes are so far away that no normal telescope would ever be powerful enough to see them. You would need a telescope the size of the Earth—but scientists figured out that they could piece together images taken simultaneously from telescopes situated all around the Earth instead. (One of these was the South Pole Telescope, run by a collaboration headed by the University of Chicago, which provided the view from the bottom of the Earth.)

    All together, this network of combined telescopes is referred to as the Event Horizon Telescope [above]. They next hope to create a “movie” of the glowing gas moving around a black hole as it’s pulled in. Learn more about the quest to take the images here.

    What do scientists still not know about black holes?

    Even as new detectors and telescopes have been able to tell us more and more about black holes in the past decades, scientists still have hundreds of questions about black holes. What do they eat, and how often? What happens as stuff falls in? When it falls in, how much comes back out? Does this stuff end up causing the black hole to spin? How are these black holes created in the first place?

    There are more fundamental questions, too, ranging from ‘What’s inside a black hole?’ to ‘How are supermassive black holes tied to their galaxies?’

    “Everything about black holes is absurd. It’s very appealing to say they can’t possibly exist, except that both our theories and our observations show that they must and in fact do exist,” said Prof. Holz.

    One thing that keeps scientists awake at night is whether information that falls into a black hole is truly gone forever. There are other laws of physics that say that all information in the universe is preserved; even if you burn a notebook, its information could theoretically be recovered from the traces and gases that are left behind, as well as the light that was emitted.

    “The Black Hole War”: Leonard Susskind’s account of his long debate with Stephen Hawking over whether information can be lost in black holes.

    But as far as we can tell, it’s possible that the information within a notebook dropped into a black hole could be truly erased from the universe.

    Understanding Sagittarius A*, the supermassive black hole at the center of the Milky Way, could help us understand how the Milky Way formed, as well as the strange physics that happen in and near black holes. University of Chicago Profs. John Carlstrom and Daniel Holz explain what it takes to “photograph” a black hole and what mysteries remain about black holes. Video by UChicago Creative.

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct.


    Please help promote STEM in your local schools.

    Stem Education Coalition


    The University of Chicago is an urban research university that has driven new ways of thinking since 1890. Our commitment to free and open inquiry draws inspired scholars to our global campuses, where ideas are born that challenge and change the world.

    We empower individuals to challenge conventional thinking in pursuit of original ideas. Students in the College develop critical, analytic, and writing skills in our rigorous, interdisciplinary core curriculum. Through graduate programs, students test their ideas with University of Chicago scholars, and become the next generation of leaders in academia, industry, nonprofits, and government.

    University of Chicago research has led to such breakthroughs as discovering the link between cancer and genetics, establishing revolutionary theories of economics, and developing tools to produce reliably excellent urban schooling. We generate new insights for the benefit of present and future generations with our national and affiliated laboratories: DOE’s Argonne National Laboratory, DOE’s Fermi National Accelerator Laboratory , and the Marine Biological Laboratory in Woods Hole, Massachusetts.
    The University of Chicago is enriched by the city we call home. In partnership with our neighbors, we invest in Chicago’s mid-South Side across such areas as health, education, economic growth, and the arts. Together with our medical center, we are the largest private employer on the South Side.

    In all we do, we are driven to dig deeper, push further, and ask bigger questions—and to leverage our knowledge to enrich all human life. Our diverse and creative students and alumni drive innovation, lead international conversations, and make masterpieces. Alumni and faculty, lecturers and postdocs go on to become Nobel laureates, CEOs, university presidents, attorneys general, literary giants, and astronauts. The University of Chicago is a private research university in Chicago, Illinois. Founded in 1890, its main campus is located in Chicago’s Hyde Park neighborhood. It enrolled 16,445 students in Fall 2019, including 6,286 undergraduates and 10,159 graduate students. The University of Chicago is ranked among the top universities in the world by major education publications, and it is among the most selective in the United States.

    The university is composed of one undergraduate college and five graduate research divisions, which contain all of the university’s graduate programs and interdisciplinary committees. Chicago has eight professional schools: the Law School, the Booth School of Business, the Pritzker School of Medicine, the School of Social Service Administration, the Harris School of Public Policy, the Divinity School, the Graham School of Continuing Liberal and Professional Studies, and the Pritzker School of Molecular Engineering. The university has additional campuses and centers in London, Paris, Beijing, Delhi, and Hong Kong, as well as in downtown Chicago.

    University of Chicago scholars have played a major role in the development of many academic disciplines, including economics, law, literary criticism, mathematics, religion, sociology, and the behavioralism school of political science, establishing the Chicago schools in various fields. Chicago’s Metallurgical Laboratory produced the world’s first man-made, self-sustaining nuclear reaction in Chicago Pile-1 beneath the viewing stands of the university’s Stagg Field. Advances in chemistry led to the “radiocarbon revolution” in the carbon-14 dating of ancient life and objects. The university research efforts include administration of DOE’s Fermi National Accelerator Laboratory and DOE’s Argonne National Laboratory, as well as the U Chicago Marine Biological Laboratory in Woods Hole, Massachusetts (MBL). The university is also home to the University of Chicago Press, the largest university press in the United States. The Barack Obama Presidential Center is expected to be housed at the university and will include both the Obama presidential library and offices of the Obama Foundation.

    The University of Chicago’s students, faculty, and staff have included 100 Nobel laureates as of 2020, giving it the fourth-most affiliated Nobel laureates of any university in the world. The university’s faculty members and alumni also include 10 Fields Medalists, 4 Turing Award winners, 52 MacArthur Fellows, 26 Marshall Scholars, 27 Pulitzer Prize winners, 20 National Humanities Medalists, 29 living billionaire graduates, and have won eight Olympic medals.



    According to the National Science Foundation, University of Chicago spent $423.9 million on research and development in 2018, ranking it 60th in the nation. It is classified among “R1: Doctoral Universities – Very high research activity” and is a founding member of the Association of American Universities and was a member of the Committee on Institutional Cooperation from 1946 through June 29, 2016, when the group’s name was changed to the Big Ten Academic Alliance. The University of Chicago is not a member of the rebranded consortium, but will continue to be a collaborator.

    The university operates more than 140 research centers and institutes on campus. Among these are the Oriental Institute—a museum and research center for Near Eastern studies owned and operated by the university—and a number of National Resource Centers, including the Center for Middle Eastern Studies. Chicago also operates or is affiliated with several research institutions apart from the university proper. The university manages DOE’s Argonne National Laboratory, part of the United States Department of Energy’s national laboratory system, and co-manages DOE’s Fermi National Accelerator Laboratory, a nearby particle physics laboratory, as well as a stake in the Apache Point Observatory in Sunspot, New Mexico.

    SDSS Telescope at Apache Point Observatory, near Sunspot, New Mexico, altitude 2,788 meters (9,147 ft).

    Faculty and students at the adjacent Toyota Technological Institute at Chicago collaborate with the university. In 2013, the university formed an affiliation with the formerly independent Marine Biological Laboratory in Woods Hole, Mass. Although formally unrelated, the National Opinion Research Center is located on Chicago’s campus.

  • richardmitnick 4:48 pm on September 26, 2022 Permalink | Reply
    Tags: "Isometry", "Unitarity", Albert Einstein's theory of general relativity

    From “Quanta Magazine” : “Physicists Rewrite a Quantum Rule That Clashes With Our Universe” 

    From “Quanta Magazine”

    September 26, 2022
    Charlie Wood

    The past and the future are tightly linked in conventional Quantum Mechanics. Perhaps too tightly. A tweak to the theory could let quantum possibilities increase as space expands.

    The expansion of space spells trouble for Quantum Mechanics, by presenting particles with a growing smorgasbord of options for where to be. DVDP for Quanta Magazine.

    A jarring divide cleaves modern Physics. On one side lies Quantum Theory, which portrays subatomic particles as probabilistic waves. On the other lies Albert Einstein’s Theory of General Relativity, which holds that space and time can bend, causing gravity. For 90 years, physicists have sought a reconciliation, a more fundamental description of reality that encompasses both Quantum Mechanics and Relativity. But the quest has run up against thorny paradoxes.

    Hints are mounting that at least part of the problem lies with a principle at the center of Quantum Mechanics, an assumption about how the world works that seems so obvious it’s barely worth stating, much less questioning.

    “Unitarity”, as the principle is called, says that something always happens. When particles interact, the probability of all possible outcomes must sum to 100%. Unitarity severely limits how atoms and subatomic particles might evolve from moment to moment. It also ensures that change is a two-way street: Any imaginable event at the quantum scale can be undone, at least on paper. These requirements have long guided physicists as they derive valid quantum formulas. “It’s a very restrictive condition, even though it might seem a little bit trivial at first glance,” said Yonatan Kahn, an assistant professor at the University of Illinois.

    But what once seemed an essential scaffold may have become a stifling straitjacket preventing physicists from reconciling Quantum Mechanics and Relativity. “Unitarity in Quantum Gravity is a very open question,” said Bianca Dittrich, a theorist at the Perimeter Institute for Theoretical Physics in Waterloo, Canada.

    The main problem is that the universe is expanding.

    This expansion is well described by General Relativity. But it means that the future of the cosmos looks totally different from its past, while unitarity demands a tidy symmetry between past and future on the quantum level. “There is a tension there, and it’s something quite puzzling if you think about it,” said Steve Giddings, a Quantum Gravity theorist at the University of California-Santa Barbara.

    Concern over this conflict has been in the air for years. But recently, two Quantum Gravity theorists may have found a way to loosen unitarity’s buckles to better fit our growing cosmos. Andrew Strominger and Jordan Cotler of Harvard University argue that a more relaxed principle called “Isometry” can accommodate an expanding universe while still satisfying the stringent requirements that first made unitarity a guiding light.

    “You don’t need unitarity,” said Strominger. “Unitarity is too strong of a condition.”

    While many physicists are receptive to the isometry proposal — some have even come to similar conclusions independently — opinions vary as to whether the update is too radical or not radical enough.

    A Fixed Sum

    In everyday life, events can’t help but play out in a unitary way. A coin toss, for instance, has a 100% chance of coming up heads or tails.

    But a century ago, the pioneers of Quantum Mechanics made a surprising discovery — one that elevated unitarity from common sense to a hallowed principle. The surprise was that, mathematically, the quantum world operates not by probabilities but by more complicated numbers known as amplitudes. An amplitude is essentially the degree to which a particle is in a certain state; it can be a positive or negative number, or even a complex “imaginary” one. To calculate the probability of actually observing a particle in a certain state, physicists square the amplitude’s magnitude, which gets rid of the imaginary and negative bits and produces a positive probability. Unitarity says the sum of these probabilities (really, the squared magnitudes of all the amplitudes) must equal 1.
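The squaring rule above is easy to see in a few lines of code. Here is a minimal sketch of a two-outcome quantum state whose amplitudes are complex numbers; squaring each amplitude's magnitude gives the probabilities, and unitarity demands they sum to 1:

```python
# A quantum state assigns a complex "amplitude" to each possible
# outcome. The probability of observing an outcome is the squared
# magnitude of its amplitude, and unitarity says the total must be 1.

amplitudes = [0.6 + 0.0j, 0.0 + 0.8j]  # a valid two-outcome state

probabilities = [abs(a) ** 2 for a in amplitudes]
print(probabilities)       # roughly [0.36, 0.64]
print(sum(probabilities))  # 1, up to floating-point rounding
```

Note that the second amplitude is purely imaginary, yet its squared magnitude is an ordinary positive probability, which is exactly the "twist" described above.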

    Merrill Sherman/Quanta Magazine.

    It’s this twist — the squaring of hidden amplitudes to calculate the outcomes we actually see — that gives unitarity teeth. As a particle’s state changes (as it flies through a magnetic field, say, or collides with another particle), its amplitudes change too. In working out how a particle is allowed to evolve or interact, physicists use the fact that amplitudes never change in a way that disrupts the fixed sum of their squares. In the 1920s, for instance, this unitarity requirement guided the British physicist Paul Dirac to discover an equation that implied the existence of antimatter. “I was not interested in considering any theory which would not fit in with my darling,” Dirac wrote, referring to unitarity.

    Physicists keep probabilities and amplitudes in line by tracking how the quantum state of a particle moves around in Hilbert space — an abstract space representing all possible states available to the particle. The particle’s amplitudes correspond to its coordinates in Hilbert space, and physicists capture changes to the particle with mathematical objects called matrices, which transform its coordinates. Unitarity dictates that a physically allowed change must correspond to a special “unitary” matrix that rotates the particle’s state in Hilbert space without changing that the sum of the squares of its coordinates equals 1.

    It’s a mathematical fact with philosophical consequences: If you know the specific unitary matrix corresponding to some change over time, any quantum state can be swiveled into the future or unswiveled into the past. It will always land on another viable state in the Hilbert space, which never grows or shrinks. “The past completely determines the future, and the future completely determines the past,” said Cotler. “It’s related to the statement that information is neither created nor destroyed.”

    And yet, this bedrock assumption seems to conflict with the universe that surrounds us.

    A Cosmic Clash

    Galaxies are flying ever farther apart.

    Nobel Prize in Physics for 2011: Expansion of the Universe

    4 October 2011

    The Royal Swedish Academy of Sciences has decided to award the Nobel Prize in Physics for 2011

    with one half to

    Saul Perlmutter
    The Supernova Cosmology Project
    The DOE’s Lawrence Berkeley National Laboratory and The University of California-Berkeley,

    and the other half jointly to

    Brian P. Schmidt
    The High-z Supernova Search Team, The Australian National University, Weston Creek, Australia.


    Adam G. Riess

    The High-z Supernova Search Team, The Johns Hopkins University and The Space Telescope Science Institute, Baltimore, MD.

    Written in the stars

    “Some say the world will end in fire, some say in ice…” *

    What will be the final destiny of the Universe? Probably it will end in ice, if we are to believe this year’s Nobel Laureates in Physics. They have studied several dozen exploding stars, called supernovae, and discovered that the Universe is expanding at an ever-accelerating rate. The discovery came as a complete surprise even to the Laureates themselves.

    In 1998, cosmology was shaken at its foundations as two research teams presented their findings. Headed by Saul Perlmutter, one of the teams had set to work in 1988. Brian Schmidt headed another team, launched at the end of 1994, where Adam Riess was to play a crucial role.

    The research teams raced to map the Universe by locating the most distant supernovae. More sophisticated telescopes on the ground and in space, as well as more powerful computers and new digital imaging sensors (CCD, Nobel Prize in Physics in 2009), opened the possibility in the 1990s to add more pieces to the cosmological puzzle.

    The teams used a particular kind of supernova, called a Type Ia supernova. It is an explosion of an old compact star that is as heavy as the Sun but as small as the Earth. A single such supernova can emit as much light as a whole galaxy. All in all, the two research teams found over 50 distant supernovae whose light was weaker than expected – this was a sign that the expansion of the Universe was accelerating. The potential pitfalls had been numerous, and the scientists found reassurance in the fact that both groups had reached the same astonishing conclusion.

    For almost a century, the Universe has been known to be expanding as a consequence of the Big Bang about 14 billion years ago. However, the discovery that this expansion is accelerating is astounding. If the expansion continues to speed up, the Universe will end in ice.

    The acceleration is thought to be driven by dark energy, but what that dark energy is remains an enigma – perhaps the greatest in physics today. What is known is that dark energy constitutes about three quarters of the Universe. Therefore the findings of the 2011 Nobel Laureates in Physics have helped to unveil a Universe that to a large extent is unknown to science. And everything is possible again.

    *Robert Frost, Fire and Ice, 1920

    While our expanding universe is a perfectly valid solution to the equations of General Relativity, physicists have increasingly realized that its growth spells trouble for Quantum Mechanics, by presenting particles with an expanding smorgasbord of options for where to be and how to behave. As space grows, how can the Hilbert space of possibilities not grow with it? “It’s definitely true that there are more degrees of freedom in the universe now than in the early universe,” said Nima Arkani-Hamed, a theoretical physicist at the Institute for Advanced Study in Princeton.

    “I’ve felt for many years [that] it was the elephant in the room,” said Strominger.

    Andrew Strominger, left, and Jordan Cotler of Harvard University have collaborated on an effort to replace unitarity in quantum physics with an alternative rule called isometry. Credit: Miguel Montrero.

    Giddings sharpens the issue with a paradoxical thought experiment set in a universe that’s both unitary and expanding. Imagine taking the current state of the universe, said Giddings, and adding “one innocuous photon” — perhaps lodged in newly created space halfway between here and the Andromeda galaxy. Unitarity insists that we must be able to calculate what this universe looked like in the past, unswiveling its quantum state as much as we wish.

    But rewinding the state of the universe plus an extra photon creates a glitch. Going into the past, the universe gets smaller, and the wavelength of photons will shrink too. In our real universe, this isn’t a problem: A photon shrinks only until the moment of its creation through some subatomic process; the reversal of that process will make it disappear. But the extra photon wasn’t created by that special process, so instead of disappearing when you turn back time, its wavelength will eventually get impossibly small, concentrating its energy so greatly that the photon collapses into a black hole. This creates a paradox, absurdly implying that — in this fictional, expanding universe — microscopic black holes convert into photons. The thought experiment suggests that a naïve mashup of unitarity and cosmic expansion doesn’t work.
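    The scale at which that collapse happens can be estimated with a back-of-the-envelope calculation (mine, not the article's): set the photon's wavelength equal to the Schwarzschild radius of its own energy.

```python
import math

# Photon energy: E = h c / wavelength.
# Schwarzschild radius of that energy: r_s = 2 G E / c^4.
# Setting r_s = wavelength gives wavelength^2 = 2 G h / c^3.
G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2
h = 6.626e-34  # Planck constant, J s
c = 2.998e8    # speed of light, m/s

critical_wavelength = math.sqrt(2 * G * h / c**3)
print(f"{critical_wavelength:.1e} m")  # ~1e-35 m: the Planck-length scale
```

On this crude criterion, a photon rewound to roughly the Planck length would sit inside its own event horizon, which is the absurdity the thought experiment exploits.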

    Dittrich thinks unitarity smells fishy on more general grounds. Quantum Mechanics treats time as absolute, but General Relativity messes with the ticking of clocks, complicating the notion of change from one moment to the next. “I personally never relied so much on unitarity,” she said.

    The question is: What sort of alternative framework could accommodate both cosmic expansion and the rigid mathematics of Quantum Theory?

    Unitarity 2.0

    Last year, Strominger struck up a collaboration with Cotler, who splits his time between Quantum Gravity research and Quantum Information Theory — the study of information stored in quantum states. The duo realized that there is a well-studied scheme in Quantum Information Theory that resembles the expanding universe: quantum error correction, a scheme where a small message made from quantum states is redundantly encoded inside a bigger system. Perhaps, they thought, the contents of the young universe are similarly stitched into the modern cosmos’s swollen form.

    “In hindsight, the obvious answer is this is exactly what people doing quantum encoding have been doing,” Strominger said.

    In a paper earlier this year, the two homed in on a class of transformations that quantum error-correcting codes belong to, known as isometries. An isometric change resembles a unitary one with added flexibility.

    Think of an electron that can occupy two possible locations. Its Hilbert space consists of all possible combinations of amplitudes in the two locations. These possibilities can be imagined as the points on a circle — every point has some value in both the horizontal and vertical directions. Unitary changes rotate states around the circle but do not expand or shrink the set of possibilities.

    To visualize an isometric change, though, let the universe of this electron swell just enough to allow a third position. The electron’s Hilbert space grows, but in a special way: It gains another dimension. The circle becomes a sphere, on which the particle’s quantum state can swivel around to accommodate mixtures of all three locations. The distance between any two states on the circle holds steady under the change — another requirement of unitarity. In short, the options increase, but without unphysical consequences.
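    In matrix terms (a toy construction of my own), an isometry is a rectangular matrix with orthonormal columns: it embeds a small Hilbert space in a larger one while preserving all inner products.

```python
import numpy as np

# An isometry V from a 2-D Hilbert space (the circle) into a 3-D one (the sphere).
# Its columns are orthonormal, so V-transpose times V = I, even though V is not square.
theta = 0.3
V = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)],
              [0.0,            0.0          ]])

assert np.allclose(V.T @ V, np.eye(2))  # the isometry condition

state = np.array([0.6, 0.8])      # a normalized state on the circle
bigger_state = V @ state          # the same state, now living on the sphere
print(np.sum(bigger_state ** 2))  # ≈ 1.0 -- norms survive the embedding
```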

    “Working with isometries is sort of a generalization” of unitarity, said Giddings. “It keeps some of the essence.”

    Our universe would have a Hilbert space with a huge number of dimensions that proliferate continuously as real space expands. As a simpler proof of concept, Strominger and Cotler studied the expansion of a toy universe consisting of a line ending in a receding mirror. They calculated the probability that the universe would grow from one length to another.

    For such calculations, quantum practitioners often use the Schrödinger equation, which predicts how a quantum system evolves in time. But changes dictated by the Schrödinger equation are perfectly reversible; its “literal purpose in life is to enforce unitarity,” Arkani-Hamed said. So instead, Strominger and Cotler used an alternative version of Quantum Mechanics dreamed up by Richard Feynman, called the path integral. This method, which involves tallying up all the paths a quantum system can take from some starting point to an endpoint, has no trouble accommodating the creation of new states (which appear as branching paths leading to multiple endpoints). In the end, Strominger and Cotler’s path integral spit out a matrix encapsulating the growth of the toy cosmos, and it was indeed an isometric matrix rather than a unitary one.

    “If you want to describe an expanding universe, the Schrödinger equation as it stands just won’t work,” Cotler said. “But in the Feynman formulation, it keeps on working on its own volition.” Cotler concludes that this alternative way of doing Quantum Mechanics based on isometry “will be more useful to us in understanding an expanding universe.”

    A Mirage of Possibilities

    Relaxing unitarity could resolve the glitches in the thought experiment that has troubled Giddings and others. It would do so through a conceptual change to how we think about the relationship between the past and the future, and which states of the universe are really possible.

    Merrill Sherman/Quanta Magazine.

    To see why isometry solves the problem, Cotler describes a toy universe, one born in one of two possible initial states, 0 or 1 (a two-dimensional Hilbert space). He makes up an isometric rule to govern this universe’s expansion: At every successive moment, each 0 becomes 01, and each 1 becomes 10. If the universe starts at 0, its first three moments will see it grow as follows: 0 → 01 → 0110 → 01101001 (an 8D Hilbert space). If it starts at 1, it will become 10010110. The string captures everything about this universe — all its particles’ positions, for instance. A considerably longer string made from superpositions of 0s and 1s presumably describes the real universe.

    At any given time, the toy universe has two possible states: one arising from 0 and another arising from 1. The initial one-digit configuration has been “encoded” in a larger, eight-digit state. That evolution resembles a unitary one, in that there are two possibilities at the beginning and two at the end. But the isometric evolution provides a more capable framework for describing the expanding universe. Crucially, it does so without creating the freedom to add, say, an extra photon between here and Andromeda, which would spell trouble when you turn back the clock. Imagine, for instance, that the universe is in the 01101001 state. Flip the first 0 to a 1 — representing a minor, local tweak, such as the extra photon — and you’ll get a state that looks fine on paper (11101001), with a seemingly valid set of coordinates in the larger Hilbert space. But knowing the specific isometric rule, you can see that such a state has no parent state. This imaginary universe could never have arisen.

    “There are some configurations of the future that don’t correspond to anything in the past,” Cotler said. “There’s nothing in the past that would evolve into them.”
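    The substitution rule is simple enough to run directly (a sketch with function names of my own choosing):

```python
RULE = {"0": "01", "1": "10"}

def step(state: str) -> str:
    """Advance the toy universe one moment: each 0 becomes 01, each 1 becomes 10."""
    return "".join(RULE[bit] for bit in state)

def evolve(initial: str, moments: int) -> str:
    state = initial
    for _ in range(moments):
        state = step(state)
    return state

def has_parent(state: str) -> bool:
    """A state is physical only if some earlier state evolves into it."""
    if len(state) % 2:
        return False
    pairs = (state[i : i + 2] for i in range(0, len(state), 2))
    return all(p in ("01", "10") for p in pairs)

print(evolve("0", 3))          # 01101001 -- the article's example
print(has_parent("01101001"))  # True: it descends from 0
print(has_parent("11101001"))  # False: the "extra photon" tweak has no past
```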

    Giddings has proposed a similar principle for ruling out paradoxical states he encountered while studying black holes last year. He calls it “history matters,” and it holds that a given state of the universe is only physically possible if it can evolve backward without generating contradictions. “This has been kind of a lingering puzzle,” he said. Strominger and Cotler “are taking that puzzle and using it to try to motivate possibly a new way of thinking about things.”

    Giddings thinks the approach deserves further development. So does Dittrich, who came to similar realizations about isometry a decade ago while attempting to formulate a toy quantum theory of space-time with her collaborator Philipp Höhn. One hope is that such work could eventually lead to the specific isometric rule that might govern our universe — a rather more complicated prescription than “0 goes to 01.” A true cosmological isometry, Cotler speculates, could be verified by calculating which specific patterns in the distribution of the matter in the sky are possible and which aren’t, and then testing those predictions against observational data. “If you look closer at it, you’ll find this but not this,” he said. “That could be really useful.”

    To Isometry and Beyond

    While such experimental evidence could accrue in the future, in the near term, evidence for isometry is more likely to come from theoretical studies and thought experiments showing that it helps combine the malleability of space-time with the amplitudes of quantum theory.

    One thought experiment where unitarity looks creaky involves black holes, intense concentrations of matter that warp space-time into a dead end. Stephen Hawking calculated in 1974 that black holes evaporate over time, erasing the quantum state of anything that fell in — a seemingly blatant unitarity violation known as the black hole information paradox. If black holes have Hilbert spaces that mature isometrically, as Cotler and Strominger hypothesize, physicists may face a somewhat different puzzle than they thought. “I don’t think there can be a solution that doesn’t take this into account,” Strominger said.

    Another prize would be a detailed quantum theory that described not just how the cosmos grows, but where everything came from in the first place. “We have no universe, and all of a sudden we have a universe,” Arkani-Hamed said. “What the hell kind of unitary evolution is that?”

    For his part, however, Arkani-Hamed doubts that swapping in isometry for unitarity goes far enough. He is one of the leaders of a research program that is trying to break free of many fundamental assumptions in quantum theory and general relativity, not just unitarity.

    Whatever theory comes next, he suspects, will take a completely novel form, just as quantum mechanics was a clean break from Isaac Newton’s laws of motion. As an illustrative example of what a new form might look like, he points to a research program stemming from a 2014 discovery he made together with Jaroslav Trnka, his student at the time. They showed that when certain particles collide, the amplitude of each possible outcome equals the volume of a geometric object, dubbed the amplituhedron. Calculating the object’s volume is much easier than using standard methods for computing the amplitudes, which laboriously reconstruct all the ways a particle collision might play out, moment by moment.

    Intriguingly, while the amplituhedron gives answers that obey unitarity, the principle isn’t used to construct the shape itself. Neither are any assumptions about how particles move in space and time. The success of this purely geometric formulation of particle physics raises the possibility of a fresh perspective on reality, one free from the cherished principles that currently conflict. Researchers have gradually been generalizing the approach to explore related geometric shapes pertaining to different particles and quantum theories.

    “[It] may be a different way to organize unitarity,” Cotler said, “and perhaps it has the seeds to transcend it.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Formerly known as Simons Science News, Quanta Magazine is an editorially independent online publication launched by the Simons Foundation to enhance public understanding of science. Why Quanta? Albert Einstein called photons “quanta of light.” Our goal is to “illuminate science.” At Quanta Magazine, scientific accuracy is every bit as important as telling a good story. All of our articles are meticulously researched, reported, edited, copy-edited and fact-checked.

  • richardmitnick 10:13 pm on January 22, 2022 Permalink | Reply
    Tags: "This New Record in Laser Beam Stability Could Help Answer Physics’ Biggest Questions", Albert Einstein's theory of general relativity

    From The University of Western Australia (AU) via Science Alert (AU): “This New Record in Laser Beam Stability Could Help Answer Physics’ Biggest Questions” 

    U Western Australia bloc

    From The University of Western Australia (AU)


    Science Alert (AU)

    The laser setup at the University of Western Australia. Credit: D. Gozzard/UWA.

    22 JANUARY 2022

    Scientists are on a mission to create a global network of atomic clocks that will enable us to, among other things, better understand the fundamental laws of physics, investigate dark matter, and navigate across Earth and space more precisely.

    However, to be at their most effective, these clocks will need to be reliably and speedily linked together through layers of the atmosphere, which is far from easy. New research outlines a successful experiment with a laser beam that has been kept stable across a distance of 2.4 kilometers (1.5 miles).

    For comparison, the new link is around 100 times more stable than anything that’s been put together before. It also demonstrates stability that’s around 1,000 times better than the atomic clocks these lasers could be used to monitor.

    “The result shows that the phase and amplitude stabilization technologies presented in this paper can provide the basis for ultra-precise timescale comparison of optical atomic clocks through the turbulent atmosphere,” write the researchers in their published paper [Physical Review Letters].

    The system builds on research carried out last year in which scientists developed a laser link capable of holding its own through the atmosphere with unprecedented stability.

    In the new study, researchers shot a laser beam from a fifth-floor window to a reflector 1.2 kilometers (0.74 miles) away. The beam was then bounced back to the source, doubling the path to the full 2.4-kilometer distance, and the link was held stable for a period of five minutes.

    Using noise reduction techniques, temperature controls, and tiny adjustments to the reflector, the team was able to keep the laser stable through the pockets of fluctuating air. Because the air is calmer and less dense higher in the atmosphere, the turbulence encountered at ground level here is likely equivalent to that along a ground-to-satellite path of several hundred kilometers.

    While laser accuracy has remained fairly constant for a decade or so, we’ve seen some significant improvements recently, including a laser setup operated by the Boulder Atomic Clock Optical Network (BACON) Collaboration and tested last March [Nature].

    That setup involved a pulse laser rather than the continuous wave laser tested in this new study. Both have their advantages in different scenarios, but continuous wave lasers offer better stability and can transfer more data in a set period of time.

    “Both systems beat the current best atomic clock, so we’re splitting hairs here, but our ultimate precision is better,” says astrophysicist David Gozzard from the University of Western Australia.

    Once an atomic clock network is put together, scientists will be able to test Albert Einstein’s theory of general relativity and probe how its incompatibility with what we know about quantum physics could be resolved.

    By very precisely comparing the time-keeping of two atomic clocks – one on Earth and one in space – scientists are eventually hoping to be able to work out where General Relativity does and doesn’t hold up. If Einstein’s ideas are correct, the clock further away from Earth’s gravity should tick ever-so-slightly faster.
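    The size of the effect can be estimated with the standard weak-field formula (an illustrative calculation of mine, not from the article), here for a GPS-like orbit:

```python
# A clock higher in Earth's gravity well ticks faster; to leading order the
# fractional frequency shift is df/f = (GM/c^2) * (1/r_ground - 1/r_orbit).
GM = 3.986e14        # Earth's gravitational parameter, m^3/s^2
c = 2.998e8          # speed of light, m/s
r_ground = 6.371e6   # Earth's mean radius, m
r_orbit = r_ground + 2.02e7  # an orbit about 20,200 km up (illustrative)

shift = (GM / c**2) * (1 / r_ground - 1 / r_orbit)
print(f"fractional frequency shift: {shift:.1e}")  # ~5e-10
print(f"gain per day: {shift * 86400 * 1e6:.0f} microseconds")  # ~46
```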

    But its usefulness doesn’t stop there. Lasers like this could eventually be used for managing the launching of objects into orbit, for communications between Earth and space, or for connecting two points in space.

    “Of course, you can’t run fiber optic cable to a satellite,” says Gozzard.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The University of Western Australia is a public research university in the Australian state of Western Australia. The university’s main campus is in Perth, the state capital, with a secondary campus in Albany and various other facilities elsewhere.

    UWA was established in 1911 by an act of the Parliament of Western Australia and began teaching students two years later. It is the sixth-oldest university in Australia and was Western Australia’s only university until the establishment of Murdoch University (AU) in 1973. Because of its age and reputation, UWA is classed as one of the “sandstone universities”, an informal designation given to the oldest university in each state. The university also belongs to several more formal groupings, including The Group of Eight (AU) and The Matariki Network of Universities. In recent years, UWA has generally been ranked either in the bottom half or just outside the world’s top 100 universities, depending on the system used.

    Alumni of UWA include one Prime Minister of Australia (Bob Hawke), five Justices of the High Court of Australia (including one Chief Justice, Robert French, now Chancellor), one Governor of the Reserve Bank (H. C. Coombs), various federal cabinet ministers, and seven of Western Australia’s eight most recent premiers. In 2018 alumnus mathematician Akshay Venkatesh was a recipient of the Fields Medal. As of 2021, the university had produced 106 Rhodes Scholars. Two members of the UWA faculty, Barry Marshall and Robin Warren, shared a Nobel Prize as a result of research at the university.


    The university was established in 1911 following the tabling of proposals by a royal commission in September 1910. The original campus, which received its first students in March 1913, was located on Irwin Street in the centre of Perth, and consisted of several buildings situated between Hay Street and St Georges Terrace. Irwin Street was also known as “Tin Pan Alley” as many buildings featured corrugated iron roofs. These buildings served as the university campus until 1932, when the campus relocated to its present-day site in Crawley.

    The founding chancellor, Sir John Winthrop Hackett, died in 1916, and bequeathed property which, after being carefully managed for ten years, yielded £425,000 to the university, a far larger sum than expected. This allowed the construction of the main buildings. Many buildings and landmarks within the university bear his name, including Winthrop Hall and Hackett Hall. In addition, his bequest funded many scholarships, because he did not wish eager students to be deterred from studying because they could not afford to do so.

    During UWA’s first decade there was controversy about whether the policy of free education was compatible with high expenditure on professorial chairs and faculties. An “old student” publicised his concern in 1921 that there were 13 faculties serving only 280 students.

    A remnant of the original buildings survives to this day in the form of the “Irwin Street Building”, so called after its former location. In the 1930s it was transported to the new campus and served a number of uses until its 1987 restoration, after which it was moved across campus to James Oval. Recently, the building has served as the Senate meeting room and is currently in use as a cricket pavilion and office of the university archives. The building has been heritage-listed by both the National Trust and the Australian Heritage Council.

    The university introduced the Doctor of Philosophy degree in 1946 and made its first award in October 1950 to Warwick Bottomley for his research on the chemistry of native plants in Western Australia.

  • richardmitnick 11:48 am on December 24, 2021 Permalink | Reply
    Tags: "Lasers and Ultracold Atoms for a Changing Earth", Albert Einstein's theory of general relativity, Applying new technology rooted in quantum mechanics and relativity to terrestrial and space geodesy will sharpen our understanding of how the planet responds to natural and human-induced changes., Improving technology for laser interferometric ranging between spacecraft to achieve nanometer-scale accuracy, Laser altimetry, Measuring Earth’s gravity field from space requires precisely monitoring the changing distance between paired orbiting satellites., NASA Grace mission, NASA Grace-FO mission, The future of high-precision geodesy lies in the development and application of novel technologies based on quantum mechanics and relativity.

    From Eos: “Lasers and Ultracold Atoms for a Changing Earth” 

    From AGU
    Eos news bloc

    From Eos

    20 December 2021
    Michel Van Camp
    F. Pereira dos Santos
    Michael Murböck
    Gérard Petit and
    Jürgen Müller

    Applying new technology rooted in quantum mechanics and relativity to terrestrial and space geodesy will sharpen our understanding of how the planet responds to natural and human-induced changes.

    Credit: VENTRIS/Science Photo Library via Getty Images.

    Quantum mechanics rules the atomic world, challenging our intuitions based on Newton’s classical mechanics. And yet atoms share at least one commonality with Newton’s apple and with you and me: They experience gravity and fall in the same way.

    Of course, observing free-falling atoms requires extremely sophisticated experimental devices, which became available only in the 1990s with the advent of laser cooling. Heat represents the extent to which atoms move, so cooling atoms eases their manipulation, allowing scientists to measure their free fall and to quantify and study the effects of gravity with extraordinary precision. Creating samples of ultracold atoms involves slowing the atoms using the momentum of photons in specialized laser beams.

    Today novel developments in methods using ultracold atoms and laser technologies open enhanced prospects for applying quantum physics in both satellite and terrestrial geodesy—the science of measuring the shape, rotation, and gravity of Earth—and for improving measurement reference systems. Such methods have great potential for more accurately monitoring how the Earth system is responding to natural and human-induced forcing, from the planet’s solid surface shifting in response to tectonic and magmatic movements to sea level rising in response to melting glaciers.

    Taking Earth’s Measure

    Earth’s shape is always changing, even if the changes are mostly imperceptible to us humans. In the subsurface, large convection currents and plate tectonics influence each other, shifting huge masses of rock around and causing earthquakes and volcanic eruptions. On the surface, the ocean, atmosphere, glaciers, rivers, and aquifers never rest either—nor do we as we excavate rock, extract groundwater and oil, and generally move mass around. All these movements subtly affect not only the planet’s shape but also its rotation and its gravitational field.

    Fig. 1. The colored bubbles indicate the ranges of spatial resolution (in kilometers) and signal amplitude (in equivalent water height, EWH) characteristic of mass change processes related to continental hydrology (yellow), ice sheets and glaciers (pink), ocean processes (blue), and volcanoes and earthquakes (gray). The current measurement limits of laser interferometric ranging methods (e.g., aboard the Gravity Recovery and Climate Experiment (GRACE) and GRACE Follow-On (GRACE-FO) missions; solid black line) and of terrestrial absolute gravimetry (dashed green line) are shown, along with the directions of improvement in these technologies (arrows) needed to cover more of the ranges of the processes. Credit: IfE/LUH.

    NASA Grace

    National Aeronautics Space Agency (US)/GFZ German Research Centre for Geosciences [Deutsches Forschungszentrum für Geowissenschaften](GFZ)(DE) Grace-FO satellites.

    Geodetic methods allow us to measure minute quantities that tell scientists a lot about Earth’s size, shape, and makeup. As such, geodesy is essential to all branches of geophysics: tectonics, seismology, volcanology, oceanography, hydrology, glaciology, geomagnetism, climatology, meteorology, planetology, and even metrology, physics, and astronomy. Measuring these changes sheds light on many important Earth processes, such as mass loss from polar ice sheets, yet making these measurements accurately remains a challenging task (Figure 1).

    Determining the elevation of an ice sheet’s surface, to gauge whether it might have lost or gained mass, is often done using laser altimetry—that is, by observing the travel time of a laser beam emitted from a plane or a satellite and reflected off the ice surface back up to the observer. It’s a powerful technique, but the laser does not necessarily distinguish between light, fresh snow and dense, old ice, introducing uncertainty into the measurement and into our understanding of the ice sheet’s change.

    Beyond this ambiguity, what happens if Earth’s crust beneath the ice cap is deforming and influencing the elevation of the ice surface? Moreover, the altimeter’s observation is relative: The elevation of the ice sheet surface is measured with respect to the position of the observing aircraft or satellite, which itself must be determined in comparison to a reference height datum (typically sea level). This feat requires measuring quantities that are exceedingly small compared with the size of Earth. If you drew a circle representing Earth on a standard piece of printer paper, even the 20-kilometer difference in height between Mount Everest’s peak and the bottom of abyssal oceanic trenches would be thinner than the thickness of your pencil line!

    Meanwhile, measuring variation in Earth’s rotation means determining its instantaneous orientation relative to fixed stars to within a fraction of a thousandth of an arc second — the amount Earth rotates in a few microseconds. Assessing velocities and deformations of the tectonic plates requires determining positions at the millimeter scale. And detecting groundwater mass changes requires measuring the associated gravitational effect of a 1-centimeter-thick layer of water (i.e., equivalent water height, or EWH) spread over a 160,000-square-kilometer area. In other words, changes in Earth’s rotation, deformations, and gravity must be measured with precisions that are 10 orders of magnitude finer than the length of the day, than Earth’s diameter, and than gravity itself, respectively.

    The Challenges of Attraction

    Performing gravity measurements and analyses remains especially demanding. For land-based measurements, gravimeters are generally cumbersome, expensive, tricky to use and, in the case of the most precise superconducting instruments, require a high-wattage (1,500-watt) continuous power supply. In addition, most gravimeters, including superconducting instruments, offer only relative measurements—that is, they inform us about spatial and temporal variations in gravitational attraction, but they drift with time and do not provide the absolute value of gravitational acceleration (about 9.8 meters per second squared). Absolute gravimeters do, but these instruments are rare, expensive (costing roughly $500,000 apiece), and heavy. And as most are mechanical, wear and tear prevents their use for continuous measurements.

    This absolute gravimeter developed by the SYRTE (Time and Space Reference Systems) department at the Paris Observatory uses ultracold-atom technology to make high-precision measurements of gravity. Credit: Sébastien Merlet, LNE­SYRTE.

    Moreover, terrestrial gravimeters are mostly sensitive to the mass distribution nearby, within a radius of a few hundred meters of the instrument. This sensitivity and scale allow observation of rapid, small-scale changes, such as those caused by flash floods, in small watersheds, glaciers, and volcanic systems, but they complicate data gathering over larger areas.

    On the other hand, space-based gravimetry, realized in the Gravity Recovery and Climate Experiment mission and its follow-on mission, GRACE-FO, is blind to structures smaller than a few hundred kilometers. However, it offers unique information of homogeneous quality about mass anomalies over larger areas within Earth or at its surface. These missions can detect and monitor a mass change equivalent to a 1-centimeter EWH spread over a 400- × 400-kilometer area, with a temporal resolution of 10 days.
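
    To put that detection threshold in terms of mass, a quick calculation (illustrative, not from the article) shows that 1 centimeter of EWH over a 400- × 400-kilometer area corresponds to about 1.6 gigatonnes of water:

    ```python
    RHO_WATER = 1000.0     # density of water, kg/m^3

    def ewh_mass_kg(ewh_m, side_km):
        """Mass of a water layer of the given equivalent water height over a square region."""
        area_m2 = (side_km * 1e3) ** 2
        return RHO_WATER * ewh_m * area_m2

    mass = ewh_mass_kg(0.01, 400)                 # the GRACE/GRACE-FO threshold
    print(f"{mass:.1e} kg (~{mass / 1e12:.1f} gigatonnes)")
    ```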

    To monitor change from important Earth processes—from flooding and volcanism to glacier melting and groundwater movement—reliably and across scales, we need gravitational data with better spatiotemporal resolution and higher accuracy than are currently available (Figure 1). We also need highly stable and accurate reference systems to provide the fundamental backbone required to monitor sea level changes and tectonic and human-induced deformation. The needed improvements can be achieved only by using innovative quantum technologies.

    The past few years have seen new efforts to develop such technologies for many uses. In 2018, for example, the European Commission began a long-term research and innovation initiative called Quantum Flagship. For geodetic applications, efforts are being coordinated and supported largely through the Novel Sensors and Quantum Technology for Geodesy (QuGe) program, a worldwide initiative organized under the umbrella of the International Association of Geodesy and launched in 2019. QuGe fosters synergies in technology development, space mission requirements, and geodetic and geophysical modeling by organizing workshops and conference sessions and by otherwise providing a platform where experts from different fields can collaborate.

    A Quantum Upgrade for Gravity Sensing

    QuGe emphasizes three pillars of development. The first focuses on investigations of ultracold-atom technologies for gravimetry on the ground and in space. Quantum gravimetry will benefit a comprehensive set of applications, from fast, localized gravity surveys and exploration to observing regional and global Earth system processes with high spatial and temporal resolution.

    On Earth, the ideal instrument is an absolute, rather than relative, gravimeter capable of taking continuous measurements. This is not possible with a classical mechanical absolute gravimeter, in which a test mass is repeatedly dropped and lifted. In atomic instruments, there are no mobile parts or mechanical wear; instead, lasers control falling rubidium atoms. Recent achievements should enable production of such instruments on a larger scale, allowing scientists to establish dense networks of absolute gravimetric instruments to monitor, for example, aquifer and volcanic systems.

    Today, achieving dense coverage with gravimetric surveys, in which measurements are made at perhaps dozens of points, involves huge effort, and sampling rates—typically one measurement every month, year, or longer—are still poor. Moreover, errors related to instrument calibration and drift remain problematic. Alternatively, a fixed instrument provides a measurement every second but at only a single location. The ability to continuously measure gravity at multiple locations, without the difficulties of drifting instruments, will allow much less ambiguous interpretations of gravity changes and related geophysical phenomena.

    Measuring Earth’s gravity field from space requires precisely monitoring the changing distance between paired orbiting satellites—as in the GRACE-FO mission—which accelerate and decelerate slightly as they are tugged more or less by the gravitational pull exerted by different masses on Earth. However, the satellites can also speed up and slow down because of forces other than changes in Earth’s gravity field, including aerodynamic drag in the thin upper atmosphere. Currently, these other forces acting on the satellites are measured using electrostatic, suspended-mass accelerometers, which also tend to exhibit gradual, low-frequency drifts that hamper their accuracy.

    The performance of these traditional accelerometers is thus challenged by quantum sensors, which have already demonstrated improved long-term stability and lower noise levels on the ground. In addition, hybrid systems combining the benefits of quantum accelerometers with electrostatic accelerometers, which still provide higher measurement rates, could cover a wider range of slower and faster accelerations and could greatly support navigation and inertial sensing on the ground and in space. Quantum accelerometers will also serve as a basis for developing the next generation of gravity-monitoring missions, such as the follow-on to the Gravity field and steady-state Ocean Circulation Explorer (GOCE) mission, which will measure gravity differences in 3D and allow higher-resolution mapping of Earth’s static gravity field.

    Wide-Ranging Improvement

    The second pillar of QuGe focuses on improving technology for laser interferometric ranging between spacecraft to achieve nanometer-scale accuracy, which will become the standard for future geodetic gravity-sensing missions. This method involves comparing the difference in phase between two laser beams: a reference beam and a test beam received back from the second satellite. Such optical measurements are much more precise than similar measurements using microwave ranging or mechanical devices, allowing intersatellite distances to be observed with an accuracy of tens of nanometers or better compared with micrometer accuracies achieved with microwaves.
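
    The precision argument rests on the relation between optical phase and path length. Assuming a 1,064-nanometer laser (the wavelength flown on GRACE-FO) and, for simplicity, a one-way path (the exact factor depends on the round-trip geometry), a sketch:

    ```python
    import math

    WAVELENGTH = 1064e-9   # m; Nd:YAG laser wavelength, as used on GRACE-FO

    def phase_shift(path_change_m, wavelength=WAVELENGTH):
        """One-way phase shift (radians) for a given change in path length."""
        return 2 * math.pi * path_change_m / wavelength

    phi = phase_shift(10e-9)           # a 10-nanometer ranging target
    cycles = phi / (2 * math.pi)       # fraction of an interference fringe
    print(f"{phi:.4f} rad, or {cycles:.4f} of a fringe")
    ```

    Resolving tens of nanometers thus means reading the interferometer phase to about a hundredth of a fringe, well within reach of modern phase meters.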

    High-precision laser ranging was successfully tested in 2017 during the Laser Interferometer Space Antenna (LISA) Pathfinder mission, in which the main goal was to hold the spacecraft as motionless as possible to test technology for use in future missions that will seek to detect gravitational waves with a space-based observatory. It has also been applied successfully in the GRACE-FO mission, demonstrating the superior performance for intersatellite tracking of laser interferometry over traditional microwave-based ranging methods used in the original GRACE mission.

    Although extremely useful, recent satellite gravity missions give only rather rough pictures of global mass variations. Enhanced monitoring of intersatellite distances should improve the ability to resolve 1-centimeter EWH to about 200 kilometers or finer, instead of the 400 kilometers presently. This improvement will allow better separation of overlapping effects, such as continental versus oceanic mass contributions along coastlines, changes in neighboring subsurface aquifers, and variations in glaciers and nearby groundwater tables.

    Even more refined concepts, like intersatellite tracking using laser interferometry for multiple satellite pairs or among a swarm of satellites, might be realized as well within the coming years. Using more satellites in next-generation geodetic missions would yield data with higher temporal and spatial resolution and accuracy—and hence with greater ability to distinguish smaller-scale processes—than are available with current two-satellite configurations.

    Measuring Height with Optical Clocks

    QuGe’s third pillar of development focuses on applying general relativity and optical clocks to improve measurement reference systems. Einstein told us that gravity distorts space and time. In particular, a clock closer to a mass—or, say, at a lower elevation on Earth’s surface, closer to the planet’s center of mass—runs slower than one farther away. Hence, comparing the ticking rates of accurate clocks placed at different locations on Earth informs us about height differences, a technique called chronometric leveling. This technique has been achieved by comparing outputs from highly precise optical clocks connected by optical links over distances on the order of 1,000 kilometers.
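
    In the weak-field limit, the fractional rate difference between two clocks separated in height by Δh is gΔh/c², which is why clocks accurate to about one part in 10¹⁸ correspond to roughly 1 centimeter of height. A sketch with illustrative numbers:

    ```python
    G_SURFACE = 9.81       # surface gravity, m/s^2
    C = 2.998e8            # speed of light, m/s

    def fractional_rate_shift(dh_m, g=G_SURFACE):
        """Fractional clock-rate difference for a height difference dh (weak-field limit)."""
        return g * dh_m / C**2

    shift = fractional_rate_shift(0.01)   # a 1-centimeter height difference
    print(f"{shift:.2e}")                 # about 1e-18
    ```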

    Today systems for measuring height are referenced to mean sea level in some way, for example, through tide gauges. However, sea level is not stable enough to be used as a reference.

    The transportable optical clock of the PTB (left) is housed inside a trailer (right). Credit: PTB Braunschweig, CC BY 4.0.

    Optical clocks keep time by measuring the very high frequency of laser light that is locked to the transition frequency between two energy levels of electrons in ultracold (laser-cooled) atoms or ions. These clocks have demonstrated at least a 100-fold improvement in accuracy over the usual atomic clocks, which measure lower-frequency microwave transitions. With a global network of such optical clocks, if we could remotely compare the clocks’ frequencies with the same accuracy, we could realize a global height reference with 1-centimeter consistency. One can even imagine reference clocks placed in a high satellite orbit, far from the noisy Earth environment, to serve as a stable reference for terrestrial height systems and improve measurement accuracy.

    In addition to chronometric leveling, such clocks will improve the accuracy of the International Atomic Time standard—the basis for the Coordinated Universal Time used for civil timekeeping—and will have many other impacts on science and technology. For example, global navigation satellite systems could provide better navigation by using more predictable clocks on satellites, which would have the added advantage of requiring less input from the ground operators controlling the satellite orbits. Space navigation could rely on one-way range measurements instead of on more time-consuming two-way ranging if a spacecraft’s clock were highly accurate. And radio astronomers could make use of more stable frequency references for easier processing and better results in very long baseline interferometry experiments. More fundamental applications are also envisioned for optical clocks, such as detecting gravitational waves, testing the constancy of the fundamental constants of physics, and even redefining the second.

    The Best Tools for the Job

    Our knowledge of Earth’s shape and gravity and the subtle shifts they undergo in response to numerous natural and human-induced processes has grown immensely as geodetic methods and tools have matured. But with current technologies, the clarity and confidence with which we can discern these changes remain limited. Such limitations, namely, insufficient accuracy and resolution in time and space, will become increasingly important as we look to better understand and predict the consequences of accelerating—or even perhaps previously unrecognized—changes occurring as the planet responds to warming temperatures and other anthropogenic influences.

    The future of high-precision geodesy lies in the development and application of novel technologies based on quantum mechanics and relativity. QuGe is working to ensure that the Earth and planetary sciences benefit from the vast potential of these technologies. In particular, ultracold-atom accelerometry, high-precision laser ranging between satellites, and relativistic geodesy with optical clocks are very promising approaches that will overcome problems of classical gravimetric Earth observations. With such advances, we will have the best tools available not only to understand vital geophysical processes but also to better navigate on Earth and in space and to discern the fundamental physics that underlie our world.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

  • richardmitnick 10:38 am on December 14, 2021 Permalink | Reply
    Tags: "Einstein's theory passes rigorous 16-year tests", Albert Einstein's theory of general relativity, Double pulsar system

    From CSIRO -Commonwealth Scientific and Industrial Research Organisation (AU) : “Einstein’s theory passes rigorous 16-year tests” 

    CSIRO bloc

    From CSIRO -Commonwealth Scientific and Industrial Research Organisation (AU)


    Ms. Mikayla Keen
    Communication advisor, Space
    Tel +61 2 9372 4433
    Fax +61 4 0148 8562

    © Michael Kramer/The MPG Institute for Radio Astronomy [MPG Institut für Radioastronomie](DE)

    The team, led by Professor Michael Kramer from The MPG Institute for Radio Astronomy [MPG Institut für Radioastronomie](DE), showed that Einstein’s theory published in 1915 still holds true.

    Dr Dick Manchester, a Fellow at Australia’s national science agency, CSIRO, and a member of the research team, explained how this result provides us with a more precise understanding of our Universe.

    “The theory of general relativity describes how gravity works at large scales in the Universe, but it breaks down at the atomic scale where quantum mechanics reigns supreme,” Dr Manchester said.

    “We needed to find ways of testing Einstein’s theory at an intermediate scale to see if it still holds true. Fortunately, just the right cosmic laboratory, known as the ‘double pulsar’, was found using the Parkes telescope in 2003.

    CSIRO-Commonwealth Scientific and Industrial Research Organisation (AU) CSIRO Parkes Observatory Australia Telescope National Facility [ Murriyang, the traditional Indigenous name], located 20 kilometres north of the town of Parkes, New South Wales, Australia, 414.80m above sea level.

    “Our observations of the double pulsar over the past 16 years proved to be amazingly consistent with Einstein’s General Theory of Relativity, within 99.99 per cent to be precise,” he said.

    The double pulsar system is made up of two pulsars, rapidly rotating compact stars that emit radio waves like a cosmic lighthouse and create very strong gravitational fields.

    One star rotates 45 times every second, while the second spins just 2.8 times per second. The stars complete an orbit every 2.5 hours.

    Dame Susan Jocelyn Bell Burnell discovered pulsars with radio astronomy. Jocelyn Bell at the Mullard Radio Astronomy Observatory, University of Cambridge (UK), photographed for the Daily Herald newspaper in 1968. She was denied the Nobel Prize.

    According to general relativity, the extreme accelerations in the double pulsar system strain the fabric of space-time and send out ripples that will slow the system. The two pulsars are predicted to collide in 85 million years’ time.

    With such a long timescale for this energy loss, its effects are difficult to detect. Fortunately, the clock-like ticks coming from the spinning pulsars are perfect tools for tracing these tiny perturbations.

    Associate Professor Adam Deller from The Swinburne University of Technology (AU) and OzGrav-ARC CENTRE OF EXCELLENCE FOR GRAVITATIONAL WAVE DISCOVERY (AU), another member of the research team, explained that the ticks from the pulsar ‘clocks’ had taken around 2,400 years to reach Earth.

    “We modelled the precise arrival times of more than 20 billion of these clock ticks over 16 years,” Dr Deller said.

    “That still wasn’t enough to tell us how far away the stars are, and we needed to know that to test general relativity.”

    By adding in data from the Global VLBI Array – a network of telescopes spread across the globe – the research team was able to spot a tiny wobble in the stars’ positions every year, which revealed their distance from Earth.
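
    Both figures in this passage check out with quick arithmetic; the parallax estimate below uses the standard relation p = 1/d (p in arcseconds, d in parsecs) and is illustrative, as the article does not quote a measured parallax:

    ```python
    SECONDS_PER_YEAR = 3.156e7
    LIGHT_YEARS_PER_PARSEC = 3.262

    # The faster pulsar ticks 45 times per second; over 16 years of monitoring:
    ticks = 45 * 16 * SECONDS_PER_YEAR     # ~2.3e10, i.e. "more than 20 billion"

    # The annual wobble (parallax) of a source about 2,400 light-years away:
    distance_pc = 2400 / LIGHT_YEARS_PER_PARSEC
    parallax_mas = 1000 / distance_pc      # ~1.4 milliarcseconds, VLBI territory
    print(f"{ticks:.2e} ticks, parallax ~ {parallax_mas:.2f} mas")
    ```

    A wobble of about a milliarcsecond is far below the resolution of a single dish but routine for intercontinental VLBI baselines, which is why the Global VLBI Array data were needed.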

    GMVA The Global VLBI Array

    “We’ll be back in the future using new radio telescopes and new data analysis hoping to spot a weakness in general relativity that will lead us to an even better gravitational theory,” Dr Deller said.

    The research is published today in the journal Physical Review X.

    Results in detail:

    An international research team has completed the most rigorous tests yet of Albert Einstein’s Theory of General Relativity, showing that the theory published in 1915 holds true.
    These tests include the emission of gravitational waves, effects of light propagation in strong gravitational fields, and the effect of ‘time dilation’ that makes clocks run slower in gravitational fields.
    These results provide us with a more precise understanding of our Universe.
    Key to the research was the double pulsar system, which was first discovered using CSIRO’s Parkes radio telescope, Murriyang, in 2003.
    The double pulsar system is made up of two pulsars, rapidly rotating compact stars that emit radio waves like a cosmic lighthouse and create very strong gravitational fields. One star rotates 45 times every second, while the second spins just 2.8 times per second. The stars complete an orbit every 2.5 hours.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    CSIRO campus

    CSIRO -Commonwealth Scientific and Industrial Research Organisation (AU) , is Australia’s national science agency and one of the largest and most diverse research agencies in the world.

    CSIRO works with leading organisations around the world. From its headquarters in Canberra, CSIRO maintains more than 50 sites across Australia and in France, Chile and the United States, employing about 5,500 people.

    Federally funded scientific research began in Australia 104 years ago. The Advisory Council of Science and Industry was established in 1916 but was hampered by insufficient available finance. In 1926 the research effort was reinvigorated by establishment of the Council for Scientific and Industrial Research (CSIR), which strengthened national science leadership and increased research funding. CSIR grew rapidly and achieved significant early successes. In 1949 further legislated changes included renaming the organisation as CSIRO.

    Notable developments by CSIRO have included the invention of atomic absorption spectroscopy; essential components of Wi-Fi technology; development of the first commercially successful polymer banknote; the invention of the insect repellent in Aerogard and the introduction of a series of biological controls into Australia, such as the introduction of myxomatosis and rabbit calicivirus for the control of rabbit populations.

    Research and focus areas

    Research Business Units

    As at 2019, CSIRO’s research areas are identified as “Impact science” and organised into the following Business Units:

    Agriculture and Food
    Health and Biosecurity
    Data 61
    Land and Water
    Mineral Resources
    Oceans and Atmosphere

    National Facilities

    CSIRO manages national research facilities and scientific infrastructure on behalf of the nation to assist with the delivery of research. The national facilities and specialized laboratories are available to both international and Australian users from industry and research. As at 2019, the following National Facilities are listed:

    Australian Animal Health Laboratory (AAHL)
    Australia Telescope National Facility – radio telescopes included in the Facility include the Australia Telescope Compact Array, the Parkes Observatory, Mopra Observatory and the Australian Square Kilometre Array Pathfinder.


    CSIRO Pawsey Supercomputing Centre (AU)

    Others not shown


    SKA – Square Kilometre Array


  • richardmitnick 12:05 pm on September 8, 2019 Permalink | Reply
    Tags: Albert Einstein's theory of general relativity, Craig Callender, Second law of thermodynamics

    From WIRED: “Are We All Wrong About Black Holes?” 

    Wired logo

    From WIRED

    Brendan Z. Foster

    Craig Callender, a philosopher of science at the University of California, San Diego, argues that the connection between black holes and thermodynamics is less ironclad than assumed. Photograph: Peggy Peattie/Quanta Magazine.

    In the early 1970s, people studying general relativity, our modern theory of gravity, noticed rough similarities between the properties of black holes and the laws of thermodynamics. Stephen Hawking proved that the area of a black hole’s event horizon—the surface that marks its boundary—cannot decrease. That sounded suspiciously like the second law of thermodynamics, which says entropy—a measure of disorder—cannot decrease.

    Yet at the time, Hawking and others emphasized that the laws of black holes only looked like thermodynamics on paper; they did not actually relate to thermodynamic concepts like temperature or entropy.

    Then in quick succession, a pair of brilliant results—one by Hawking himself—suggested that the equations governing black holes were in fact actual expressions of the thermodynamic laws applied to black holes. In 1972, Jacob Bekenstein argued that a black hole’s surface area was proportional to its entropy [Physical Review D], and thus the second law similarity was a true identity. And in 1974, Hawking found that black holes appear to emit radiation [Nature]—what we now call Hawking radiation—and this radiation would have exactly the same “temperature” in the thermodynamic analogy.
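
    The strength of that claimed identity can be made concrete with the standard formulas: the Hawking temperature T = ħc³/(8πGMk_B) and the Bekenstein–Hawking entropy S/k_B = A/(4ℓ_p²). A sketch for a one-solar-mass black hole (illustrative numbers, not part of the article):

    ```python
    import math

    G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
    C = 2.998e8            # speed of light, m/s
    HBAR = 1.055e-34       # reduced Planck constant, J s
    KB = 1.381e-23         # Boltzmann constant, J/K
    M_SUN = 1.989e30       # solar mass, kg

    def hawking_temperature(M):
        """Hawking temperature of a Schwarzschild black hole, in kelvin."""
        return HBAR * C**3 / (8 * math.pi * G * M * KB)

    def bh_entropy_in_kB(M):
        """Bekenstein-Hawking entropy S/k_B = horizon area / (4 * Planck length^2)."""
        r_s = 2 * G * M / C**2             # Schwarzschild radius
        area = 4 * math.pi * r_s**2        # horizon area
        l_p2 = HBAR * G / C**3             # Planck length squared
        return area / (4 * l_p2)

    T = hawking_temperature(M_SUN)     # ~6e-8 K, far colder than the CMB
    S = bh_entropy_in_kB(M_SUN)        # ~1e77, enormous compared with a star's entropy
    print(f"T = {T:.2e} K, S/kB = {S:.2e}")
    ```

    The tiny temperature and gigantic entropy illustrate why the thermodynamic reading is both tantalizing and hard to test: no instrument could detect a 10⁻⁸-kelvin glow from a stellar-mass black hole.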

    This connection gave physicists a tantalizing window into what many consider the biggest problem in theoretical physics—how to combine quantum mechanics, our theory of the very small, with general relativity. After all, thermodynamics comes from statistical mechanics, which describes the behavior of all the unseen atoms in a system. If a black hole is obeying thermodynamic laws, we can presume that a statistical description of all its fundamental, indivisible parts can be made. But in the case of a black hole, those parts aren’t atoms. They must be a kind of basic unit of gravity that makes up the fabric of space and time.

    Modern researchers insist that any candidate for a theory of quantum gravity must explain how the laws of black hole thermodynamics arise from microscopic gravity, and in particular, why the entropy-to-area connection happens. And few question the truth of the connection between black hole thermodynamics and ordinary thermodynamics.

    But what if the connection between the two really is little more than a rough analogy, with little physical reality? What would that mean for the past decades of work in string theory, loop quantum gravity, and beyond? Craig Callender, a philosopher of science at the University of California, San Diego, argues that the notorious laws of black hole thermodynamics may be nothing more than a useful analogy stretched too far [Phil Sci]. In the interview below, he explains why; it has been condensed and edited for clarity.

    Why did people ever think to connect black holes and thermodynamics?

    Callender: In the early ’70s, people noticed a few similarities between the two. One is that both seem to possess an equilibrium-like state. I have a box of gas. It can be described by a small handful of parameters—say, pressure, volume, and temperature. Same thing with a black hole. It might be described with just its mass, angular momentum, and charge. Further details don’t matter to either system.

    Nor does this state tell me what happened beforehand. I walk into a room and see a box of gas with stable values of pressure, volume and temperature. Did it just settle into that state, or did that happen last week, or perhaps a million years ago? Can’t tell. The black hole is similar. You can’t tell what type of matter fell in or when it collapsed.

    The second feature is that Hawking proved that the area of black holes is always non-decreasing. That reminds one of the thermodynamic second law, that entropy always increases. So both systems seem to be heading toward simply described states.

    Now grab a thermodynamics textbook, locate the laws, and see if you can find true statements when you replace the thermodynamic terms with black hole variables. In many cases you can, and the analogy improves.

    Hawking then discovers Hawking radiation, which further improves the analogy. At that point, most physicists start claiming the analogy is so good that it’s more than an analogy—it’s an identity! That’s a super-strong and surprising claim. It says that black hole laws, most of which are features of the geometry of space-time, are somehow identical to the physical principles underlying the physics of steam engines.

    Because the identity plays a huge role in quantum gravity, I want to reconsider this identity claim. Few in the foundations of physics have done so.

    So what’s the statistical mechanics for black holes?

    Well, that’s a good question. Why does ordinary thermodynamics hold? Well, we know that all these macroscopic thermodynamic systems are composed of particles. The laws of thermodynamics turn out to be descriptions of the most statistically likely configurations to happen from the microscopic point of view.

    Why does black hole thermodynamics hold? Are the laws also the statistically most likely way for black holes to behave? Although there are speculations in this direction, so far we don’t have a solid microscopic understanding of black hole physics. Absent this, the identity claim seems even more surprising.

    What led you to start thinking about the analogy?

    Many people are worried about whether theoretical physics has become too speculative. There’s a lot of commentary about whether holography, the string landscape—all sorts of things—are tethered enough to experiment. I have similar concerns. So my former Ph.D. student John Dougherty and I thought, where did it all start?

    To our mind a lot of it starts with this claimed identity between black holes and thermodynamics. When you look in the literature, you see people say, “The only evidence we have for quantum gravity, the only solid hint, is black hole thermodynamics.”

    If that’s the main thing we’re bouncing off for quantum gravity, then we ought to examine it very carefully. If it turns out to be a poor clue, maybe it would be better to spread our bets a little wider, instead of going all in on this identity.

    What problems do you see with treating a black hole as a thermodynamic system?

    I see basically three. The first problem is: What is a black hole? People often think of black holes as just kind of a dark sphere, like in a Hollywood movie or something; they’re thinking of it like a star that collapsed. But a mathematical black hole, the basis of black hole thermodynamics, is not the material from the star that’s collapsed. That’s all gone into the singularity. The black hole is what’s left.

    The black hole isn’t a solid thing at the center. The system is really the entire space-time.

    Yes, it’s this global notion for which black hole thermodynamics was developed, in which case the system really is the whole space-time.

    Here is another way to think about the worry. Suppose a star collapses and forms an event horizon. But now another star falls past this event horizon and it collapses, so it’s inside the first. You can’t think that each one has its own little horizon that is behaving thermodynamically. It’s only the one horizon.

    Here’s another. The event horizon changes shape depending on what’s about to be thrown into it. It’s clairvoyant. Weird, but there is nothing spooky here so long as we remember that the event horizon is only defined globally. It’s not a locally observable quantity.

    The picture is more counterintuitive than people usually think. To me, if the system is global, then it’s not at all like thermodynamics.

    The second objection is: Black hole thermodynamics is really a pale shadow of thermodynamics. I was surprised to see the analogy wasn’t as thorough as I expected it to be. If you grab a thermodynamics textbook and start replacing claims with their black hole counterparts, you will not find the analogy goes that deep.

    Craig Callender explains why the connection between black holes and thermodynamics is little more than an analogy.

    For instance, the zeroth law of thermodynamics sets up the whole theory and a notion of equilibrium — the basic idea that the features of the system aren’t changing. And it says that if one system is in equilibrium with another — A with B, and B with C — then A must be in equilibrium with C. The foundation of thermodynamics is this equilibrium relation, which sets up the meaning of temperature.

    The zeroth law for black holes is that the surface gravity of a black hole, which measures the gravitational acceleration, is a constant on the horizon. So that assumes temperature being constant is the zeroth law. That’s not really right. Here we see a pale shadow of the original zeroth law.

    The counterpart of equilibrium is supposed to be “stationary,” a technical term that basically says the black hole is spinning at a constant rate. But there’s no sense in which one black hole can be “stationary with” another black hole. You can take any thermodynamic object and cut it in half and say one half is in equilibrium with the other half. But you can’t take a black hole and cut it in half. You can’t say that this half is stationary with the other half.

    Here’s another way in which the analogy falls flat. Black hole entropy is given by the black hole area. Well, area is length squared, volume is length cubed. So what do we make of all those thermodynamic relations that include volume, like Boyle’s law? Is volume, which is length times area, really length times entropy? That would ruin the analogy. So we have to say that volume is not the counterpart of volume, which is surprising.

    The most famous connection between black holes and thermodynamics comes from the notion of entropy. For normal stuff, we think of entropy as a measure of the disorder of the underlying atoms. But in the 1970s, Jacob Bekenstein said that the surface area of a black hole’s event horizon is equivalent to entropy. What’s the basis of this?

    This is my third concern. Bekenstein says, if I throw something into a black hole, the entropy vanishes. But this can’t happen, he thinks, according to the laws of thermodynamics, for entropy must always increase. So some sort of compensation must be paid when you throw things into a black hole.

    Bekenstein notices a solution. When I throw something into the black hole, the mass goes up, and so does the area. If I identify the area of the black hole as the entropy, then I’ve found my compensation. There is a nice deal between the two—one goes down while the other one goes up—and it saves the second law.

    When I saw that I thought, aha, he’s thinking that not knowing about the system anymore means its entropy value has changed. I immediately saw that this is pretty objectionable, because it identifies entropy with uncertainty and our knowledge.

    There’s a long debate in the foundations of statistical mechanics about whether entropy is a subjective notion or an objective notion. I’m firmly on the side of thinking it’s an objective notion. I think trees unobserved in a forest go to equilibrium regardless of what anyone knows about them or not, that the way heat flows has nothing to do with knowledge, and so on.

    Chuck a steam engine behind the event horizon. We can’t know anything about it apart from its mass, but I claim it can still do as much work as before. If you don’t believe me, we can test this by having a physicist jump into the black hole and follow the steam engine! There is only need for compensation if you think that what you can no longer know about ceases to exist.

    Do you think it’s possible to patch up black hole thermodynamics, or is it all hopeless?

    My mind is open, but I have to admit that I’m deeply skeptical about it. My suspicion is that black hole “thermodynamics” is really an interesting set of relationships about information from the point of view of the exterior of the black hole. It’s all about forgetting information.

    Because thermodynamics is more than information theory, I don’t think there’s a deep thermodynamic principle operating through the universe that causes black holes to behave the way they do, and I worry that physics is all in on it being a great hint for quantum gravity when it might not be.

    Playing the role of the Socratic gadfly in the foundations of physics is sometimes important. In this case, looking back invites a bit of skepticism that may be useful going forward.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

  • richardmitnick 8:37 am on September 2, 2019 Permalink | Reply
    Tags: "Physicists mash quantum and gravity and find time but not as we know it", A new kind of quantum time order, Albert Einstein's theory of general relativity

    From University of Queensland via Science Bulletin: “Physicists mash quantum and gravity and find time, but not as we know it” 


    From University of Queensland



    Science Bulletin

    August 28, 2019


    A University of Queensland-led international team of researchers say they have discovered “a new kind of quantum time order.”

    UQ physicist Dr Magdalena Zych said the discovery arose from an experiment the team designed to bring together elements of the two big — but contradictory — physics theories developed in the past century.

    “Our proposal sought to discover: what happens when an object massive enough to influence the flow of time is placed in a quantum state?” Dr Zych said.

    She said Einstein’s theory described how the presence of a massive object slowed time.

    “Imagine two space ships, asked to fire at each other at a specified time while dodging the other’s attack,” she said.

    “If either fires too early, it will destroy the other.”

    “In Einstein’s theory, a powerful enemy could use the principles of general relativity by placing a massive object — like a planet — closer to one ship to slow the passing of time.”

    “Because of the time lag, the ship furthest away from the massive object will fire earlier, destroying the other.”
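    The slowing Dr Zych describes is gravitational time dilation. In the simplest (Schwarzschild) case, a clock at distance r from a mass M accumulates proper time

```latex
d\tau = dt \sqrt{1 - \frac{2GM}{r c^2}}
```

    so the ship nearer the planet logs less elapsed time and, by its own clock, fires later than its distant opponent.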

    Dr Zych said the second theory, of quantum mechanics, says any object can be in a state of “superposition”.

    “This means it can be found in different states — think Schrödinger’s cat,” she said.

    Dr Zych said using the theory of quantum mechanics, if the enemy put the planet into a state of “quantum superposition,” then time also should be disrupted.

    “There would be a new way for the order of events to unfold, with neither of the events being first or second — but in a genuine quantum state of being both first and second,” she said.

    UQ researcher Dr Fabio Costa said that although a “superposition of planets” as described in the paper may never be possible, technology allowed a simulation of how time works in the quantum world, without using gravity.

    “Even if the experiment can never be done, the study is relevant for future technologies,” Dr Costa said.

    “We are currently working towards quantum computers that — very simply speaking — could effectively jump through time to perform their operations much more efficiently than devices operating in fixed sequence in time, as we know it in our ‘normal’ world.”

    Scientists from the Stevens Institute of Technology and the University of Vienna were co-authors on “Bell’s Theorem for Temporal Order,” published in Nature Communications.

    See the full article here.


    The University of Queensland (UQ) is one of Australia’s leading research and teaching institutions. We strive for excellence through the creation, preservation, transfer and application of knowledge. For more than a century, we have educated and worked with outstanding people to deliver knowledge leadership for a better world.

    UQ ranks in the top 50 as measured by the QS World University Rankings and the Performance Ranking of Scientific Papers for World Universities. The University also ranks 52 in the US News Best Global Universities Rankings, 60 in the Times Higher Education World University Rankings and 55 in the Academic Ranking of World Universities.

  • richardmitnick 5:44 pm on April 17, 2019 Permalink | Reply
    Tags: Albert Einstein's theory of general relativity, Einstein’s Unfinished Revolution: The Search for What Lies Beyond the Quantum

    From Scientific American: “Cosmologist Lee Smolin says that at certain key points, the scientific worldview is based on fallacious reasoning” 

    From Scientific American

    April 17, 2019
    Jim Daley

    Lee Smolin, author of six books about the philosophical issues raised by contemporary physics, says every time he writes a new one, the experience completely changes the direction his own research is taking. In his latest book, Einstein’s Unfinished Revolution: The Search for What Lies Beyond the Quantum, Smolin, a cosmologist and quantum theorist at the Perimeter Institute for Theoretical Physics in Ontario, tackles what he sees as the limitations in quantum theory.

    Credit: Perimeter Institute

    “I want to say the scientific worldview is based on fallacious reasoning at certain key points,” Smolin says. In Einstein’s Unfinished Revolution, he argues one of those key points was the assumption that quantum physics is a complete theory. This incompleteness, Smolin argues, is the reason quantum physics has not been able to solve certain questions about the universe.

    “Most of what we do [in science] is take the laws that have been discovered by experiments to apply to parts of the universe, and just assume that they can be scaled up to apply to the whole universe,” Smolin says. “I’m going to be suggesting that’s wrong.”

    Join Smolin at the Perimeter Institute as he discusses his book and takes the audience on a journey through the basics of quantum physics and the experiments and scientists who have changed our understanding of the universe. The discussion, “Einstein’s Unfinished Revolution,” is part of Perimeter’s public lecture series and will take place on Wednesday, April 17, at 7 P.M. Eastern time. Online viewers can participate in the discussion by tweeting to @Perimeter using the #piLIVE hashtag.

    See the full article here.

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

  • richardmitnick 10:08 am on April 10, 2019 Permalink | Reply
    Tags: Albert Einstein's theory of general relativity, Although the telescopes are not physically connected they are able to synchronize their recorded data with atomic clocks — hydrogen masers — which precisely time their observations., BlackHoleCam, Data were flown to highly specialised supercomputers — known as correlators — at the Max Planck Institute for Radio Astronomy and MIT Haystack Observatory to be combined., Sagittarius A* the supermassive black hole at the center of our galaxy

    From European Southern Observatory: “Astronomers Capture First Image of a Black Hole” 


    From European Southern Observatory

    10 April 2019

    Heino Falcke
    Chair of the EHT Science Council, Radboud University
    The Netherlands
    Tel: +31 24 3652020
    Email: h.falcke@astro.ru.nl

    Luciano Rezzolla
    EHT Board Member, Goethe Universität
    Tel: +49 69 79847871
    Email: rezzolla@itp.uni-frankfurt.de

    Eduardo Ros
    EHT Board Secretary, Max-Planck-Institut für Radioastronomie
    Tel: +49 22 8525125
    Email: ros@mpifr.de

    Calum Turner
    ESO Public Information Officer
    Garching bei München, Germany
    Tel: +49 89 3200 6655
    Email: pio@eso.org

    ESO, ALMA, and APEX contribute to paradigm-shifting observations of the gargantuan black hole at the heart of the galaxy Messier 87.

    The Event Horizon Telescope (EHT) — a planet-scale array of eight ground-based radio telescopes forged through international collaboration — was designed to capture images of a black hole. Today, in coordinated press conferences across the globe, EHT researchers reveal that they have succeeded, unveiling the first direct visual evidence of a supermassive black hole and its shadow.

    This breakthrough was announced today in a series of six papers published in a special issue of The Astrophysical Journal Letters. The image reveals the black hole at the centre of Messier 87 [1], a massive galaxy in the nearby Virgo galaxy cluster. This black hole resides 55 million light-years from Earth and has a mass 6.5 billion times that of the Sun [2].

    The EHT links telescopes around the globe to form an unprecedented Earth-sized virtual telescope [3]. The EHT offers scientists a new way to study the most extreme objects in the Universe predicted by Einstein’s general relativity during the centenary year of the historic experiment that first confirmed the theory [4].

    “We have taken the first picture of a black hole,” said EHT project director Sheperd S. Doeleman of the Center for Astrophysics | Harvard & Smithsonian. “This is an extraordinary scientific feat accomplished by a team of more than 200 researchers.”

    Black holes are extraordinary cosmic objects with enormous masses but extremely compact sizes. The presence of these objects affects their environment in extreme ways, warping spacetime and superheating any surrounding material.

    “If immersed in a bright region, like a disc of glowing gas, we expect a black hole to create a dark region similar to a shadow — something predicted by Einstein’s general relativity that we’ve never seen before,” explained chair of the EHT Science Council Heino Falcke of Radboud University, the Netherlands. “This shadow, caused by the gravitational bending and capture of light by the event horizon, reveals a lot about the nature of these fascinating objects and has allowed us to measure the enormous mass of Messier 87’s black hole.”

    Multiple calibration and imaging methods have revealed a ring-like structure with a dark central region — the black hole’s shadow — that persisted over multiple independent EHT observations.

    “Once we were sure we had imaged the shadow, we could compare our observations to extensive computer models that include the physics of warped space, superheated matter and strong magnetic fields. Many of the features of the observed image match our theoretical understanding surprisingly well,” remarks Paul T.P. Ho, EHT Board member and Director of the East Asian Observatory [5]. “This makes us confident about the interpretation of our observations, including our estimation of the black hole’s mass.”

    “The confrontation of theory with observations is always a dramatic moment for a theorist. It was a relief and a source of pride to realise that the observations matched our predictions so well,” elaborated EHT Board member Luciano Rezzolla of Goethe Universität, Germany.

    Creating the EHT was a formidable challenge which required upgrading and connecting a worldwide network of eight pre-existing telescopes deployed at a variety of challenging high-altitude sites. These locations included volcanoes in Hawai`i and Mexico, mountains in Arizona and the Spanish Sierra Nevada, the Chilean Atacama Desert, and Antarctica.

    The EHT observations use a technique called very-long-baseline interferometry (VLBI) which synchronises telescope facilities around the world and exploits the rotation of our planet to form one huge, Earth-size telescope observing at a wavelength of 1.3mm. VLBI allows the EHT to achieve an angular resolution of 20 micro-arcseconds — enough to read a newspaper in New York from a café in Paris [6].
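    The quoted figure follows from the diffraction limit of an interferometer, θ ≈ λ/D, with the Earth’s diameter as the longest possible baseline. A rough back-of-the-envelope check (the values below are approximations; the exact resolution depends on the real baseline geometry):

```python
# Back-of-the-envelope diffraction limit for the EHT.
WAVELENGTH_M = 1.3e-3        # observing wavelength: 1.3 mm
EARTH_DIAMETER_M = 1.2742e7  # longest possible baseline: Earth's diameter

# Angular resolution of an interferometer, theta ~ lambda / D (radians)
theta_rad = WAVELENGTH_M / EARTH_DIAMETER_M

# 1 radian = 206,264.8 arcseconds; scale up to micro-arcseconds
theta_uas = theta_rad * 206264.8 * 1e6

print(f"resolution ~ {theta_uas:.0f} micro-arcseconds")  # ~21 micro-arcseconds
```

    The result lands right at the ~20 micro-arcsecond scale cited in the press release.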

    The telescopes contributing to this result were ALMA, APEX, the IRAM 30-meter telescope, the James Clerk Maxwell Telescope, the Large Millimeter Telescope Alfonso Serrano, the Submillimeter Array, the Submillimeter Telescope, and the South Pole Telescope [7]. Petabytes of raw data from the telescopes were combined by highly specialised supercomputers hosted by the Max Planck Institute for Radio Astronomy and MIT Haystack Observatory.

    Max Planck Institute for Radio Astronomy Bonn Germany

    MIT Haystack Observatory, Westford, Massachusetts, USA, Altitude 131 m (430 ft)

    ESO/NRAO/NAOJ ALMA Array in Chile in the Atacama at Chajnantor plateau, at 5,000 metres

    ESO/MPIfR APEX high on the Chajnantor plateau in Chile’s Atacama region, at an altitude of over 4,800 m (15,700 ft)

    IRAM 30m Radio telescope, on Pico Veleta in the Spanish Sierra Nevada, Altitude 2,850 m (9,350 ft)

    East Asia Observatory James Clerk Maxwell telescope, Mauna Kea, Hawaii, USA, 4,207 m (13,802 ft) above sea level

    The University of Massachusetts Amherst and Mexico’s Instituto Nacional de Astrofísica, Óptica y Electrónica
    Large Millimeter Telescope Alfonso Serrano, Mexico, at an altitude of 4850 meters on top of the Sierra Negra

    CfA Submillimeter Array Mauna Kea, Hawaii, USA, Altitude 4,080 m (13,390 ft)

    U Arizona Submillimeter Telescope located on Mt. Graham near Safford, Arizona, USA, Altitude 3,191 m (10,469 ft)

    South Pole Telescope SPTPOL. The SPT collaboration is made up of over a dozen (mostly North American) institutions, including the University of Chicago, the University of California, Berkeley, Case Western Reserve University, Harvard/Smithsonian Astrophysical Observatory, the University of Colorado Boulder, McGill University, The University of Illinois at Urbana-Champaign, University of California, Davis, Ludwig Maximilian University of Munich, Argonne National Laboratory, and the National Institute for Standards and Technology. It is funded by the National Science Foundation. Altitude 2.8 km (9,200 ft)

    European facilities and funding played a crucial role in this worldwide effort, with the participation of advanced European telescopes and the support from the European Research Council — particularly a €14 million grant for the BlackHoleCam project [8]. Support from ESO, IRAM and the Max Planck Society was also key. “This result builds on decades of European expertise in millimetre astronomy”, commented Karl Schuster, Director of IRAM and member of the EHT Board.

    The construction of the EHT and the observations announced today represent the culmination of decades of observational, technical, and theoretical work. This example of global teamwork required close collaboration by researchers from around the world. Thirteen partner institutions worked together to create the EHT, using both pre-existing infrastructure and support from a variety of agencies. Key funding was provided by the US National Science Foundation (NSF), the EU’s European Research Council (ERC), and funding agencies in East Asia.

    “ESO is delighted to have significantly contributed to this result through its European leadership and pivotal role in two of the EHT’s component telescopes, located in Chile — ALMA and APEX,” commented ESO Director General Xavier Barcons. “ALMA is the most sensitive facility in the EHT, and its 66 high-precision antennas were critical in making the EHT a success.”

    “We have achieved something presumed to be impossible just a generation ago,” concluded Doeleman. “Breakthroughs in technology, connections between the world’s best radio observatories, and innovative algorithms all came together to open an entirely new window on black holes and the event horizon.”

    [1] The shadow of a black hole is the closest we can come to an image of the black hole itself, a completely dark object from which light cannot escape. The black hole’s boundary — the event horizon from which the EHT takes its name — is around 2.5 times smaller than the shadow it casts and measures just under 40 billion km across.

    [2] Supermassive black holes are relatively tiny astronomical objects — which has made them impossible to directly observe until now. As the size of a black hole’s event horizon is proportional to its mass, the more massive a black hole, the larger the shadow. Thanks to its enormous mass and relative proximity, M87’s black hole was predicted to be one of the largest viewable from Earth — making it a perfect target for the EHT.
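    As a sanity check on the sizes in footnotes [1] and [2], the horizon scale follows from the Schwarzschild radius, r_s = 2GM/c². A minimal sketch using the article’s quoted mass of 6.5 billion solar masses and textbook constants (all values approximate):

```python
# Rough size of M87*'s event horizon from the article's quoted mass.
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg

m = 6.5e9 * M_SUN                  # mass quoted for M87*'s black hole
r_s = 2 * G * m / C**2             # Schwarzschild radius, metres
horizon_diameter_km = 2 * r_s / 1e3

print(f"event horizon diameter ~ {horizon_diameter_km:.1e} km")
```

    The result is about 3.8 × 10¹⁰ km, matching the footnote’s “just under 40 billion km across.”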

    [3] Although the telescopes are not physically connected, they are able to synchronize their recorded data with atomic clocks — hydrogen masers — which precisely time their observations. These observations were collected at a wavelength of 1.3 mm during a 2017 global campaign. Each telescope of the EHT produced enormous amounts of data – roughly 350 terabytes per day – which was stored on high-performance helium-filled hard drives. These data were flown to highly specialised supercomputers — known as correlators — at the Max Planck Institute for Radio Astronomy and MIT Haystack Observatory to be combined. They were then painstakingly converted into an image using novel computational tools developed by the collaboration.

    [4] 100 years ago, two expeditions set out for Principe Island off the coast of Africa and Sobral in Brazil to observe the 1919 solar eclipse, with the goal of testing general relativity by seeing if starlight would be bent around the limb of the sun, as predicted by Einstein. In an echo of those observations, the EHT has sent team members to some of the world’s highest and most isolated radio facilities to once again test our understanding of gravity.

    [5] The East Asian Observatory (EAO) partner on the EHT project represents the participation of many regions in Asia, including China, Japan, Korea, Taiwan, Vietnam, Thailand, Malaysia, India and Indonesia.

    [6] Future EHT observations will see substantially increased sensitivity with the participation of the IRAM NOEMA Observatory, the Greenland Telescope and the Kitt Peak Telescope.

    [7] ALMA is a partnership of the European Southern Observatory (ESO; Europe, representing its member states), the U.S. National Science Foundation (NSF), and the National Institutes of Natural Sciences(NINS) of Japan, together with the National Research Council (Canada), the Ministry of Science and Technology (MOST; Taiwan), Academia Sinica Institute of Astronomy and Astrophysics (ASIAA; Taiwan), and Korea Astronomy and Space Science Institute (KASI; Republic of Korea), in cooperation with the Republic of Chile. APEX is operated by ESO, the 30-meter telescope is operated by IRAM (the IRAM Partner Organizations are MPG (Germany), CNRS (France) and IGN (Spain)), the James Clerk Maxwell Telescope is operated by the EAO, the Large Millimeter Telescope Alfonso Serrano is operated by INAOE and UMass, the Submillimeter Array is operated by SAO and ASIAA and the Submillimeter Telescope is operated by the Arizona Radio Observatory (ARO). The South Pole Telescope is operated by the University of Chicago with specialized EHT instrumentation provided by the University of Arizona.

    [8] BlackHoleCam is an EU-funded project to image, measure and understand astrophysical black holes. The main goal of BlackHoleCam and the Event Horizon Telescope (EHT) is to make the first ever images of the billion solar masses black hole in the nearby galaxy Messier 87 and of its smaller cousin, Sagittarius A*, the supermassive black hole at the centre of our Milky Way. This allows the determination of the deformation of spacetime caused by a black hole with extreme precision.

    More information

    This research was presented in a series of six papers published today in a special issue of The Astrophysical Journal Letters.

    The EHT collaboration involves more than 200 researchers from Africa, Asia, Europe, North and South America. The international collaboration is working to capture the most detailed black hole images ever by creating a virtual Earth-sized telescope. Supported by considerable international investment, the EHT links existing telescopes using novel systems — creating a fundamentally new instrument with the highest angular resolving power that has yet been achieved.

    The EHT consortium consists of 13 stakeholder institutes; the Academia Sinica Institute of Astronomy and Astrophysics, the University of Arizona, the University of Chicago, the East Asian Observatory, Goethe-Universitaet Frankfurt, Institut de Radioastronomie Millimétrique, Large Millimeter Telescope, Max Planck Institute for Radio Astronomy, MIT Haystack Observatory, National Astronomical Observatory of Japan, Perimeter Institute for Theoretical Physics, Radboud University and the Smithsonian Astrophysical Observatory.


    ESO EHT web page
    EHT Website & Press Release
    ESOBlog on the EHT Project


    Paper I: The Shadow of the Supermassive Black Hole
    Paper II: Array and Instrumentation
    Paper III: Data processing and Calibration
    Paper IV: Imaging the Central Supermassive Black Hole
    Paper V: Physical Origin of the Asymmetric Ring
    Paper VI: The Shadow and Mass of the Central Black Hole

    See the full article here.


    ESO is the foremost intergovernmental astronomy organisation in Europe and the world’s most productive ground-based astronomical observatory by far. It is supported by 16 countries: Austria, Belgium, Brazil, the Czech Republic, Denmark, France, Finland, Germany, Italy, the Netherlands, Poland, Portugal, Spain, Sweden, Switzerland and the United Kingdom, along with the host state of Chile. ESO carries out an ambitious programme focused on the design, construction and operation of powerful ground-based observing facilities enabling astronomers to make important scientific discoveries. ESO also plays a leading role in promoting and organising cooperation in astronomical research. ESO operates three unique world-class observing sites in Chile: La Silla, Paranal and Chajnantor. At Paranal, ESO operates the Very Large Telescope, the world’s most advanced visible-light astronomical observatory and two survey telescopes. VISTA works in the infrared and is the world’s largest survey telescope and the VLT Survey Telescope is the largest telescope designed to exclusively survey the skies in visible light. ESO is a major partner in ALMA, the largest astronomical project in existence. And on Cerro Armazones, close to Paranal, ESO is building the 39-metre European Extremely Large Telescope, the E-ELT, which will become “the world’s biggest eye on the sky”.

  • richardmitnick 9:16 am on August 20, 2018 Permalink | Reply
    Tags: Albert Einstein's theory of general relativity

    From ARC Centres of Excellence via Science Alert: “We May Soon Know How a Crucial Einstein Principle Works in The Quantum Realm” 


    From ARC Centres of Excellence


    Science Alert


    20 AUG 2018

    The puzzle of how Einstein’s equivalence principle plays out in the quantum realm has vexed physicists for decades. Now two researchers may have finally figured out the key that will allow us to solve this mystery.

    Einstein’s physical theories have held up under pretty much every classical physics test thrown at them. But when you get down to the very smallest scales – the quantum realm – things start behaving a little bit oddly.

    The thing is, it’s not really clear how Einstein’s theory of general relativity and quantum mechanics work together. The laws that govern the two realms are incompatible with each other, and attempts to resolve these differences have come up short.

    But the equivalence principle – one of the cornerstones of modern physics – is an important part of general relativity. And if it can be resolved within the quantum realm, that may give us a toehold into resolving general relativity and quantum mechanics.

    The equivalence principle, in simple terms, means that gravity accelerates all objects equally, as can be observed in the famous feather and hammer experiment conducted by Apollo 15 Commander David Scott on the Moon.

    It also means that gravitational mass and inertial mass are equivalent; to put it simply, if you were in a sealed chamber, like an elevator, you would be unable to tell if the force outside the chamber was gravity or acceleration equivalent to gravity. The effect is the same.
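    In Newtonian language, the principle is the statement that inertial mass m_i (resistance to acceleration) equals gravitational mass m_g (coupling to gravity), which is why free-fall acceleration is the same for every body:

```latex
m_i \, a = m_g \, g
\quad\Longrightarrow\quad
a = \frac{m_g}{m_i}\, g = g
\qquad (m_g = m_i)
```

    The feather and the hammer therefore fall identically, regardless of their masses.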

    “Einstein’s equivalence principle contends that the total inertial and gravitational mass of any objects are equivalent, meaning all bodies fall in the same way when subject to gravity,” explained physicist Magdalena Zych of the ARC Centre of Excellence for Engineered Quantum Systems in Australia.

    “Physicists have been debating whether the principle applies to quantum particles, so to translate it to the quantum world we needed to find out how quantum particles interact with gravity.

    “We realised that to do this we had to look at the mass.”

    According to relativity, mass is held together by energy. But in quantum mechanics, that gets a bit complicated. A quantum particle can have two different energy states, with different numerical values, known as a superposition.

    And because it has a superposition of energy states, it also has a superposition of inertial masses.

    This means – theoretically, at least – that it should also have a superposition of gravitational masses. But the superposition of quantum particles isn’t accounted for by the equivalence principle.
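    The step from energy superposition to mass superposition is mass–energy equivalence. Schematically (this notation is illustrative, not taken from the paper), a particle in a superposition of internal energies carries a corresponding superposition of rest masses:

```latex
|\psi\rangle = \alpha\,|E_1\rangle + \beta\,|E_2\rangle,
\qquad
m_j = m_0 + \frac{E_j}{c^2}
```

    where m_0 is the ground-state mass. Whether the matching gravitational masses superpose in the same way is precisely what a quantum equivalence principle has to settle.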

    “We realised that we had to look at how particles in such quantum states of mass behave in order to understand how a quantum particle sees gravity in general,” Zych said.

    “Our research found that for quantum particles in quantum superpositions of different masses, the principle implies additional restrictions that are not present for classical particles – this hadn’t been discovered before.”

    This discovery allowed the team to re-formulate the equivalence principle to account for the superposition of values in a quantum particle.

    The new formulation hasn’t yet been tested experimentally but, the researchers said, it opens the door to experiments that could probe the newly discovered restrictions.

    And it offers a new framework for testing the equivalence principle in the quantum realm – we can hardly wait.

    The team’s research has been published in the journal Nature Physics.

    See the full article here.

    The objectives for the ARC Centres of Excellence are to:

    undertake highly innovative and potentially transformational research that aims to achieve international standing in the fields of research envisaged and leads to a significant advancement of capabilities and knowledge
    link existing Australian research strengths and build critical mass with new capacity for interdisciplinary, collaborative approaches to address the most challenging and significant research problems
    develop relationships and build new networks with major national and international centres and research programs to help strengthen research, achieve global competitiveness and gain recognition for Australian research
    build Australia’s human capacity in a range of research areas by attracting and retaining, from within Australia and abroad, researchers of high international standing as well as the most promising research students
    provide high-quality postgraduate and postdoctoral training environments for the next generation of researchers
    offer Australian researchers opportunities to work on large-scale problems over long periods of time
    establish Centres that have an impact on the wider community through interaction with higher education institutes, governments, industry and the private and non-profit sector.
