Tagged: General Relativity

  • richardmitnick 6:26 pm on January 19, 2016 Permalink | Reply
    Tags: EHT, eLISA, General Relativity

    From PI: “Preparing for a cosmological challenge” 

    Perimeter Institute

    January 19, 2016
    Rose Simone

    Einstein’s theory of general relativity may soon be put to the ultimate test through measurements of a black hole’s shadow, say a pair of Perimeter researchers.
    Even though it is over 100 years old, Albert Einstein’s theory of general relativity is still a formidable prizefighter.

    The theory, which successfully describes gravity as a consequence of the curvature of spacetime itself, has withstood all the experimental tests that physicists have been able to throw at it over the decades.

    So now, to have any hope of challenging general relativity, physicists need to bring in a heavyweight. Enter the closest challenger: the smallish but still formidable 4.5-million-solar-mass black hole at the centre of our own Milky Way galaxy.

    The challenge will be assisted by the Event Horizon Telescope (EHT), a radio telescope array as large as the Earth, being configured to take precise images of the silhouette (or the shadow) of that black hole, known as Sagittarius A*.

    Sgr A*. This image was taken with NASA’s Chandra X-ray Observatory. Ellipses indicate light echoes.

    NASA Chandra Telescope

    Event Horizon Telescope map

    Meanwhile, Tim Johannsen, a postdoctoral fellow at Perimeter Institute and the University of Waterloo, has led a group of researchers in calculating the measurements that will be used to determine whether general relativity really does stand up in the strong gravity regime of that black hole. Johannsen works with Avery Broderick, an Associate Faculty member at Perimeter Institute jointly appointed at Waterloo.

    Perimeter postdoctoral researcher Tim Johannsen.

    Perimeter Associate Faculty member Avery Broderick.

    Their paper was recently published in Physical Review Letters, along with an accessible synopsis of the work.

    When the images from the black hole come in and the measurements outlined in the recent paper are actually taken, it will be the first truly broad test of general relativity in the strong gravity regime.

    “That is very exciting and we expect to be able to do that within the next few years,” Johannsen says.

    Black holes are regions of spacetime where gravity is so strong that not even light can escape once it has passed the threshold of no return: the event horizon. So, as the name implies, they are dark.

    But owing to its immense gravity, the black hole pulls in vast quantities of dust and gas from surrounding stars. These accrete into a hot swirling plasma disk that illuminates the silhouette of the black hole. The EHT will be able to capture this, in images that will be historic firsts.

    A lot of physics will be done with the data gleaned from those images, but putting general relativity to the test is perhaps the most exciting challenge.

    General relativity has been fantastically successful. In every experiment that has been done to test how the sun and stars in our cosmos affect spacetime and exert gravitational pull on other objects, its predictions have held up.

    But the question is whether the theory will continue to hold up in a strong gravity environment, such as the surroundings of a black hole.

    Black holes are so massive and compact that the spacetime-warping effects predicted by general relativity would be more evident around them than around the sun or other stars. They are “orders of magnitude” different as gravitational environments go, Broderick says.

    “That means that this is terra incognita and we don’t know what we are going to find,” Broderick says. The EHT provides “an opportunity to begin probing in a critical way the non-linear nature of general relativity in the strong gravity regime.”

    This is important to physicists because even though general relativity has been enormously successful in explaining the cosmos that we can see, there are a number of difficulties with it. “It is not clear, for example, exactly how it should be combined with the quantum theory that we have, and in fact, it is very difficult to reconcile the two in a grand unification scheme,” Johannsen says.

    Moreover, there is the problem of the mysterious “dark energy” driving the accelerated expansion of spacetime, as well as the conundrum of “dark matter,” the unseen mass invoked to explain observed galaxy rotation rates and to keep galaxy clusters from flying apart. Physicists hope that probing general relativity in the strong gravity regime will offer insights into these mysteries.
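    The rotation-curve puzzle mentioned above can be made concrete with a rough calculation. With only the visible mass, circular speeds should fall off as v = √(GM/r); observed speeds in galaxy outskirts stay roughly flat instead. The mass and radii below are round illustrative values, not data from the article:

    ```python
    import math

    # Illustrative only: why flat galaxy rotation curves suggest unseen mass.
    # With just the visible mass M enclosed, circular speed should fall off
    # Keplerian-style as v = sqrt(G*M/r).

    G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
    M_visible = 1e41       # ~5e10 solar masses of visible matter, kg (assumed)
    KPC = 3.086e19         # kiloparsec in metres

    def keplerian_speed(r_m):
        """Circular orbital speed if only M_visible lies inside radius r_m."""
        return math.sqrt(G * M_visible / r_m)

    for r_kpc in (5, 10, 20, 40):
        v = keplerian_speed(r_kpc * KPC) / 1000  # km/s
        print(f"{r_kpc:>3} kpc: {v:6.1f} km/s")
    # The predicted speed halves each time r quadruples, yet measured curves
    # stay roughly flat, pointing to additional, unseen mass.
    ```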

    Johannsen’s team has developed a way of checking how much the gravitational environment of this black hole might deviate from the theory of general relativity and other gravity theories.

    The paper sets constraints on the size of the shadow consistent with general relativity. Other models propose modifications to general relativity, such as the Modified Gravity Theory (MOG) and the Randall-Sundrum-type braneworld model (RS2); the paper sets the corresponding constraints for these models as well.

    “We have made the first realistic estimate of the high precision with which the EHT can detect the size of the shadow,” Johannsen says. “We show that such a measurement can be a precise test of general relativity.”

    A nice bonus from this work is that researchers will also get much more precise measurements of the mass of the black hole and its distance. “Sharpening the precision is great because that will enable us to get even more precise constraints on deviations from general relativity,” Johannsen adds.

    There are already good measurements of how far away Sagittarius A* is and how massive it is, based on other experiments that have looked at the motion of stars as they orbit the black hole, as well as of masers throughout the Milky Way, Johannsen explains. “People have been doing this for about 20 years.”

    This can be used to figure out what it should look like. But once the images from the EHT are available, it will be possible to check: “Do we get what we expect? Or do we get something else?” Johannsen says.

    Getting the measurements is really a matter of drawing a series of lines from the centre of the black hole image to the edge of its shadow. On the image, it looks like a pie shape with slices. Measuring the lines of each slice and calculating an average “gives us the angular radius of the shadow and then we know how big it is,” Johannsen says.
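    The slice-averaging idea can be sketched in a few lines. The edge coordinates below are invented for illustration (a slightly non-circular mock shadow), not EHT data:

    ```python
    import math

    # Hypothetical illustration of the averaging scheme described above:
    # given points sampled along the shadow's edge in the image (in
    # micro-arcseconds relative to the image centre), average the radial
    # distance of each "slice" to estimate the shadow's angular radius.

    def angular_radius(edge_points):
        """Average distance from the image centre to the shadow edge."""
        radii = [math.hypot(x, y) for (x, y) in edge_points]
        return sum(radii) / len(radii)

    # A slightly egg-shaped mock shadow, roughly 27 microarcseconds in radius:
    edge = []
    for k in range(12):
        a = 2 * math.pi * k / 12
        edge.append((27.5 * math.cos(a), 26.5 * math.sin(a)))

    print(round(angular_radius(edge), 1))  # average radius in micro-arcseconds
    ```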

    A reconstructed image of Sgr A* for an EHT observation at 230 GHz with a seven-station array.

    From the measurements of the size of the shadow, it is possible to see how closely the gravity in the black hole environment matches the predictions of general relativity and of other theories of gravity.

    “If general relativity is not correct, there can be significant change in the size. The shadow can also become asymmetric so that it is no longer circular, but egg-shaped, for example,” Johannsen says.
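    For reference, the general-relativity baseline that such deviations would be measured against can be estimated from the Schwarzschild shadow formula, θ ≈ √27·GM/(c²D). The mass and distance below are the approximate values quoted for Sagittarius A*, not precise EHT inputs:

    ```python
    import math

    # A minimal sketch of the GR prediction being tested: for a non-rotating
    # (Schwarzschild) black hole, the shadow's angular radius is
    # sqrt(27) * G * M / (c^2 * D).

    G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
    c = 2.998e8            # speed of light, m/s
    M_SUN = 1.989e30       # solar mass, kg
    KPC = 3.086e19         # kiloparsec in metres

    M = 4.5e6 * M_SUN      # mass of Sgr A* (the article's figure)
    D = 8.0 * KPC          # approximate distance to the galactic centre

    theta_rad = math.sqrt(27) * G * M / (c**2 * D)
    theta_uas = math.degrees(theta_rad) * 3600 * 1e6  # micro-arcseconds

    print(round(theta_uas, 1))  # roughly 29 uas in radius, i.e. ~58 uas across
    ```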

    Getting to the point of making these measurements will take a couple more years because at least seven or eight of the telescopes in the EHT array must be coordinated to get the data at the same time in a massive worldwide collaboration.

    The amount of raw data that has to be gathered to get the images is so enormous, it can’t even be transmitted over the internet.

    “These are humongous data sets. So they literally have to save all this data on hard drives and put them in a box and ship them,” Johannsen says.

    The hard drives get shipped to the MIT Haystack Observatory, which is the headquarters for the EHT. From there, the raw data is analyzed and the images are produced.

    After the images are produced, Johannsen gets to use his measurement technique to find out if general relativity is correct for the strong gravity environment around this black hole.

    This isn’t the only test of general relativity in the strong gravity regime in the works. There are other sophisticated experiments to detect, for example, the gravitational waves that are predicted by general relativity. But the prime experimental candidate to confirm the existence of gravitational waves would be the Evolved Laser Interferometer Space Antenna (eLISA), a space-based telescope with an estimated launch date of 2034.

    LISA graphic

    The EHT will produce images in the next few years.

    If it turns out that the measurements yield what was expected and general relativity holds up, that would be interesting, “because Einstein had this theory 100 years ago, and then we will know that it is true,” Johannsen says.

    But if the challenger should prevail, and strong gravity does strike a blow to the theory of general relativity, “that would be big,” he adds.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    STEM Education Coalition

    About Perimeter

    Perimeter Institute is a leading centre for scientific research, training and educational outreach in foundational theoretical physics. Founded in 1999 in Waterloo, Ontario, Canada, its mission is to advance our understanding of the universe at the most fundamental level, stimulating the breakthroughs that could transform our future. Perimeter also trains the next generation of physicists through innovative programs, and shares the excitement and wonder of science with students, teachers and the general public.

  • richardmitnick 4:19 pm on January 11, 2016 Permalink | Reply
    Tags: General Relativity

    From Ethan Siegel: “A distant galaxy cluster and the power of Einstein’s gravity” 

    Starts with a Bang

    Ethan Siegel

    Image credit: NASA, ESA, and G. Tremblay (European Southern Observatory).

    The ability of mass to bend and magnify background light is a unique feature of General Relativity. But it can fool us, too.

    “Gravitational and electromagnetic interactions are long-range interactions, meaning they act on objects no matter how far they are separated from each other.” -Francois Englert

    A century ago, [Albert] Einstein put forth a new theory of gravity: General Relativity. The solar eclipse of 1919 finally confirmed that mass gravitationally bent light around it.
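    The 1919 measurement can be reproduced with a one-line calculation: GR predicts a deflection of 4GM/(c²b) for light grazing a mass M at impact parameter b, twice the Newtonian value. Plugging in the Sun's mass and radius:

    ```python
    import math

    # Light-bending check for the 1919 eclipse test: deflection angle
    # 4 * G * M / (c^2 * b) for a ray grazing the solar limb (b = R_sun).

    G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
    c = 2.998e8          # speed of light, m/s
    M_sun = 1.989e30     # solar mass, kg
    R_sun = 6.957e8      # solar radius, m

    deflection_rad = 4 * G * M_sun / (c**2 * R_sun)
    deflection_arcsec = math.degrees(deflection_rad) * 3600

    print(round(deflection_arcsec, 2))  # ~1.75 arcseconds, as measured in 1919
    ```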

    Images credit: New York Times, 10 November 1919 (L); Illustrated London News, 22 November 1919 (R).

    But only much later was the phenomenon of gravitational lensing confirmed, in which a distant galaxy cluster acts as a lens, magnifying and distorting the background galaxies behind it.

    view the mp4 video here.

    In 2014, the Hubble Space Telescope imaged an ultra-massive galaxy cluster found by the Sloan Digital Sky Survey [SDSS], and unveiled what appeared to be a spectacular, multiply-imaged distortion of blue, star-forming background galaxies.

    NASA/ESA Hubble Space Telescope

    SDSS telescope at Apache Point, NM, USA

    Image credit: NASA, ESA, and G. Tremblay (European Southern Observatory).

    The multiple images of similar structures, the distortions and the similar colorations all pointed to gravitational lensing.

    Image credit: NASA, ESA, and G. Tremblay (European Southern Observatory).

    But a careful analysis of the data showed that while the outer arcs are indeed lensed background galaxies…

    Image credit: K. Sharon et al., 2014, via http://arxiv.org/abs/1407.2266.

    the brightest blue lights, interconnecting the two giant ellipticals at the cluster’s center, come from the merger of the galaxies and the surrounding gas themselves.

    Image credit: NASA, ESA, and G. Tremblay (European Southern Observatory).

    What we’re looking at is a combination of the stars and galaxies of the foreground cluster, some 4,000 times as massive as the Milky Way, a transient burst of star formation, and only a few background objects.

    view mp4 video here.

    Despite our excellent intuition, there’s no substitute for good data.

    Image credit: NASA, ESA, and G. Tremblay (European Southern Observatory).

    See the full article here.


    “Starts With A Bang! is a blog/video blog about cosmology, physics, astronomy, and anything else I find interesting enough to write about. I am a firm believer that the highest good in life is learning, and the greatest evil is willful ignorance. The goal of everything on this site is to help inform you about our world, how we came to be here, and to understand how it all works. As I write these pages for you, I hope to not only explain to you what we know, think, and believe, but how we know it, and why we draw the conclusions we do. It is my hope that you find this interesting, informative, and accessible,” says Ethan

  • richardmitnick 4:24 pm on December 24, 2015 Permalink | Reply
    Tags: General Relativity

    From Ethan Siegel: “What Are Quantum Gravity’s Alternatives To String Theory?” 

    Starts with a Bang

    Ethan Siegel

    Image credit: CPEP (Contemporary Physics Education Project), NSF/DOE/LBNL.

    If there is a quantum theory of gravity, is String Theory the only game in town?

    “I just think too many nice things have happened in string theory for it to be all wrong. Humans do not understand it very well, but I just don’t believe there is a big cosmic conspiracy that created this incredible thing that has nothing to do with the real world.” –Edward Witten

    The Universe we know and love — with [Albert] Einstein’s General Relativity as our theory of gravity and quantum field theories of the other three forces — has a problem that we don’t often talk about: it’s incomplete, and we know it. Einstein’s theory on its own is just fine, describing how matter-and-energy relate to the curvature of space-and-time. Quantum field theories on their own are fine as well, describing how particles interact and experience forces. Normally, the quantum field theory calculations are done in flat space, where spacetime isn’t curved. We can do them in the curved space described by Einstein’s theory of gravity as well (although they’re harder — but not impossible — to do), which is known as semi-classical gravity. This is how we calculate things like Hawking radiation and black hole decay.

    Image credit: NASA, via http://www.nasa.gov/topics/universe/features/smallest_blackhole.html

    But even that semi-classical treatment is only valid near and outside the black hole’s event horizon, not at the location where gravity is truly at its strongest: at the singularities (or the mathematically nonsensical predictions) theorized to be at the center. There are multiple physical instances where we need a quantum theory of gravity, all having to do with strong gravitational physics on the smallest of scales: at tiny, quantum distances. Important questions, such as:

    What happens to the gravitational field of an electron when it passes through a double slit?
    What happens to the information of the particles that form a black hole, if the black hole’s eventual state is thermal radiation?
    And what is the behavior of a gravitational field/force at and around a singularity?

    Image credit: Nature 496, 20–23 (04 April 2013) doi:10.1038/496020a, via http://www.nature.com/news/astrophysics-fire-in-the-hole-1.12726.

    In order to explain what happens at short distances in the presence of gravitational sources — or masses — we need a quantum, discrete, and hence particle-based theory of gravity. The known quantum forces are mediated by particles known as bosons, or particles with integer spin. The photon mediates the electromagnetic force, the W-and-Z bosons mediate the weak force, while the gluons mediate the strong force. All these types of particles have a spin of 1, which for massive (W and Z) particles means they can take on spin values of -1, 0, or +1, while massless ones (like gluons and photons) can take on values of -1 or +1 only.

    The Higgs boson is also a boson, although it doesn’t mediate any forces, and has a spin of 0. Because of what we know about gravitation — General Relativity is a tensor theory of gravity — it must be mediated by a massless particle with a spin of 2, meaning it can take on a spin value of -2 or +2 only.
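    The spin-state bookkeeping described above can be summarized in a few lines: a massive spin-s particle has 2s+1 states, while a massless one has only the two extreme helicities, -s and +s.

    ```python
    # A small sketch of the particle-spin counting rules named in the text.

    def spin_states(s, massless):
        """Allowed spin projections for an integer-spin particle."""
        if massless:
            # Massless particles keep only the two extreme helicities.
            return [-s, +s] if s > 0 else [0]
        # Massive particles have all 2s+1 projections from -s to +s.
        return [m - s for m in range(2 * s + 1)]

    print(spin_states(1, massless=False))  # massive spin-1 (W, Z): [-1, 0, 1]
    print(spin_states(1, massless=True))   # massless spin-1 (photon, gluon): [-1, 1]
    print(spin_states(2, massless=True))   # massless spin-2 (graviton): [-2, 2]
    print(spin_states(0, massless=False))  # spin-0 (Higgs): [0]
    ```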

    This is fantastic! It means that we already know a few things about a quantum theory of gravity before we even try to formulate one! We know this because whatever the true quantum theory of gravity turns out to be, it must be consistent with General Relativity when we’re not at very small distances from a massive particle or object, just as — 100 years ago — we knew that General Relativity needed to reduce to Newtonian gravity in the weak-field regime.
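    For reference, the weak-field correspondence mentioned here is usually written as follows (a standard textbook sketch, not from the article): for a slowly varying Newtonian potential Φ with |Φ|/c² ≪ 1,

    ```latex
    ds^2 \approx -\left(1 + \frac{2\Phi}{c^2}\right)c^2\,dt^2
               + \left(1 - \frac{2\Phi}{c^2}\right)d\mathbf{x}^2 ,
    % and for slow motion the geodesic equation reduces to Newton's law:
    \ddot{\mathbf{x}} = -\nabla\Phi .
    ```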

    Image credit: NASA, of an artist’s concept of Gravity Probe B orbiting the Earth to measure space-time curvature.

    NASA Gravity Probe B

    The big question, of course, is how? How do you quantize gravity in a way that’s correct (at describing reality), consistent (with both GR and QFT), and hopefully leads to calculable predictions for new phenomena that might be observed, measured, or somehow tested? The leading contender, of course, is something you’ve long heard of: String Theory.

    String Theory is an interesting framework — it can include all of the standard model fields and particles, both the fermions and the bosons.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    It also includes a 10-dimensional Tensor-Scalar theory of gravity: with 9 space and 1 time dimensions, and a scalar field parameter. If we erase six of those spatial dimensions (through an incompletely defined process that people just call compactification) and let the parameter (ω) that defines the scalar interaction go to infinity, we can recover General Relativity.

    Image credit: NASA/Goddard/Wade Sisler, of Brian Greene presenting on String Theory.

    But there are a whole host of phenomenological problems with String Theory. One is that it predicts a large number of new particles, including all the supersymmetric ones, none of which have been found.

    Standard Model of Supersymmetry

    It claims not to need “free parameters” the way the standard model does (for the masses of the particles), but it replaces that problem with an even worse one. String theory refers to “10⁵⁰⁰ possible solutions,” where these solutions refer to the vacuum expectation values of the string fields, and there’s no mechanism to recover them; if you want String Theory to work, you need to give up on dynamics, and simply say, “well, it must’ve been anthropically selected.” There are frustrations, drawbacks, and problems with the very idea of String Theory. But the biggest problem with it may not be these mathematical ones. Instead, it may be that there are four other alternatives that may lead us to quantum gravity instead; approaches that are completely independent of String Theory.

    Image credit: Wikimedia Commons user Linfoxman, of an illustration of a quantized “fabric of space.”

    1.) Loop Quantum Gravity [reader, please take the time to visit this link and read the article]. LQG is an interesting take on the problem: rather than trying to quantize particles, LQG has as one of its central features that space itself is discrete. Imagine a common analogy for gravity: a bedsheet pulled taut, with a bowling ball in the center. Rather than a continuous fabric, though, we know that the bedsheet itself is really quantized, in that it’s made up of molecules, which in turn are made of atoms, which in turn are made of nuclei (quarks and gluons) and electrons.

    Space might be the same way! Perhaps it acts like a fabric, but perhaps it’s made up of finite, quantized entities. And perhaps it’s woven out of “loops,” which is where the theory gets its name. Weave these loops together and you get a spin network, which represents a quantum state of the gravitational field. In this picture, not just the matter itself but space itself is quantized. The way to go from this idea of a spin network to a perhaps realistic way of doing gravitational computations is an active area of research, one that saw a tremendous leap forward made in just 2007/8, so this is still actively advancing.

    Image credit: Wikimedia Commons user & reasNink, generated with Wolfram Mathematica 8.0.

    2.) Asymptotically Safe Gravity. This is my personal favorite of the attempts at a quantum theory of gravity. Asymptotic freedom was developed in the 1970s to explain the unusual nature of the strong interaction: it was a very weak force at extremely short distances, then got stronger as (color) charged particles got farther and farther apart. Unlike electromagnetism, which had a very small coupling constant, the strong force has a large one. Due to some interesting properties of QCD, if you wound up with a (color) neutral system, the strength of the interaction fell off rapidly. This was able to account for properties like the physical sizes of baryons (protons and neutrons, for example) and mesons (pions, for example).

    Asymptotic safety, on the other hand, looks to solve a fundamental problem that’s related to this: you don’t need small couplings (or couplings that tend to zero), but rather for the couplings to simply be finite in the high-energy limit. All coupling constants change with energy, so what asymptotic safety does is pick a high-energy fixed point for the constant (technically, for the renormalization group, from which the coupling constant is derived), and then everything else can be calculated at lower energies.
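    The fixed-point idea in the preceding paragraph can be illustrated with a toy model (this is not the actual gravitational calculation): evolve a coupling g toward high energies under a made-up beta function β(g) = g(2 − g). Instead of blowing up, g flows to the finite ultraviolet fixed point g* = 2 from either side.

    ```python
    # Toy illustration of asymptotic safety: a coupling that runs to a
    # finite UV fixed point rather than diverging. The beta function
    # beta(g) = g * (2 - g) is invented for this sketch.

    def run_coupling(g0, steps=10000, dt=1e-3):
        """Euler-integrate dg/d(ln mu) = g * (2 - g), starting from g0."""
        g = g0
        for _ in range(steps):
            g += g * (2 - g) * dt
        return g

    print(round(run_coupling(0.1), 4))  # flows up toward the fixed point 2.0
    print(round(run_coupling(3.0), 4))  # driven back down to 2.0 as well
    ```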

    At least, that’s the idea! We’ve figured out how to do this in 1+1 dimensions (one space and one time), but not yet in 3+1 dimensions. Still, progress has been made, most notably by Christof Wetterich, who had two groundbreaking papers in the 1990s. More recently, Wetterich used asymptotic safety — just six years ago — to calculate a prediction for the mass of the Higgs boson before the LHC found it. The result?

    Image credit: Mikhail Shaposhnikov & Christof Wetterich.

    Amazingly, what it indicated was perfectly in line with what the LHC wound up finding.

    LHC at CERN

    It’s such an amazing prediction that if asymptotic safety is correct, and — when the error bars are beaten down further — the masses of the top quark, the W-boson and the Higgs boson are finalized, there may not even be a need for any other fundamental particles (like SUSY particles) for physics to be stable all the way up to the Planck scale. It’s not only very promising, it has many of the same appealing properties as string theory: it quantizes gravity successfully, reduces to GR in the low energy limit, and is UV-finite. In addition, it beats string theory on at least one account: it doesn’t need the addition of new particles or parameters that we have no evidence for! Of all the string theory alternatives, this one is my favorite.

    3.) Causal Dynamical Triangulations. This idea, CDT, is one of the new kids in town, first developed only in 2000 by Renate Loll and expanded on by others since. It’s similar to LQG in that space itself is discrete, but is primarily concerned with how that space itself evolves. One interesting property of this idea is that time must be discrete as well! As an interesting feature, it gives us a 4-dimensional spacetime (not even something put in a priori, but something that the theory gives us) at the present time, but at very, very high energies and small distances (like the Planck scale), it displays a 2-dimensional structure. It’s based on a mathematical structure called a simplex, which is a multi-dimensional analogue of a triangle.

    Image credit: screenshot from the Wikipedia page for Simplex, via https://en.wikipedia.org/wiki/Simplex.

    A 2-simplex is a triangle, a 3-simplex is a tetrahedron, and so on. One of the “nice” features of this option is that causality — a notion held sacred by most human beings — is explicitly preserved in CDT. (Sabine has some words on CDT here, and its possible relation to asymptotically safe gravity.) It might be able to explain gravity, but it isn’t 100% certain that the standard model of elementary particles can fit suitably into this framework. It’s only major advances in computation that have enabled this to become a fairly well-studied alternative of late, and so work in this is both ongoing and relatively young.
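    The simplex counting above follows a simple combinatorial rule: an n-simplex has n+1 vertices, and its number of k-dimensional faces is C(n+1, k+1).

    ```python
    from math import comb

    # Counting the pieces of a simplex, the building block named in the text.

    def face_count(n, k):
        """Number of k-dimensional faces of an n-simplex: C(n+1, k+1)."""
        return comb(n + 1, k + 1)

    # 2-simplex (triangle): 3 vertices, 3 edges
    print(face_count(2, 0), face_count(2, 1))
    # 3-simplex (tetrahedron): 4 vertices, 6 edges, 4 triangular faces
    print(face_count(3, 0), face_count(3, 1), face_count(3, 2))
    ```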

    4.) Emergent gravity. And finally, we come to what’s probably the most speculative and most recent of the quantum gravity possibilities. Emergent gravity only gained prominence in 2009, when Erik Verlinde proposed entropic gravity, a model where gravity was not a fundamental force, but rather emerged as a phenomenon linked to entropy. In fact, the seeds of emergent gravity go back to the discoverer of the conditions for generating a matter-antimatter asymmetry, Andrei Sakharov, who proposed the concept back in 1967. This research is still in its infancy, but as far as developments in the last 5–10 years go, it’s hard to ask for more than this.

    Image credit: flickr gallery of J. Gabas Esteban.

    We’re sure we need a quantum theory of gravity to make the Universe work at a fundamental level, but we’re not sure what that theory looks like or whether any of these five avenues (string theory included) are going to prove fruitful or not. String Theory is the best studied of all the options, but Loop Quantum Gravity is a rising second, with the others being given serious consideration at long last. They say the answer’s always in the last place you look, and perhaps that’s motivation enough to start looking, seriously, in newer places.

    See the full article here.



  • richardmitnick 8:20 am on December 12, 2015 Permalink | Reply
    Tags: General Relativity

    From Daily Galaxy: “Gravity Alters the Quantum Nature of Particles on Earth: What Does It Imply at Cosmological Scales?”

    The Daily Galaxy

    December 11, 2015
    University of Vienna


    “It is quite surprising that gravity can play any role in quantum mechanics,” says Igor Pikovski, a theoretical physicist working at the Harvard-Smithsonian Center for Astrophysics. “Gravity is usually studied on astronomical scales, but it seems that it also alters the quantum nature of the smallest particles on Earth.” “It remains to be seen what the results imply on cosmological scales, where gravity can be much stronger,” adds Časlav Brukner, University Professor at the University of Vienna and Director of the Institute for Quantum Optics and Quantum Information.

    In 1915 Albert Einstein formulated the theory of general relativity, which fundamentally changed our understanding of gravity. He explained gravity as the manifestation of the curvature of space and time. Einstein’s theory predicts that the flow of time is altered by mass. This effect, known as “gravitational time dilation,” causes time to be slowed down near a massive object. It affects everything and everybody; in fact, people working on the ground floor will age slower than their colleagues a floor above, by about 10 nanoseconds in one year. This tiny effect has actually been confirmed in many experiments with very precise clocks.
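    That 10-nanosecond figure can be checked with the weak-field time-dilation rate gΔh/c². The 3-metre floor height below is an assumed typical value:

    ```python
    # Checking the article's figure: two clocks separated by height h in
    # Earth's gravity differ in elapsed time by roughly (g * h / c^2) * T.

    g = 9.81                 # surface gravity, m/s^2
    c = 2.998e8              # speed of light, m/s
    h = 3.0                  # metres between floors (assumed)
    T = 365.25 * 24 * 3600   # one year in seconds

    delta_t = (g * h / c**2) * T
    print(round(delta_t * 1e9, 1))  # nanoseconds; ~10 ns, matching the article
    ```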

    This past June, a team of researchers from the University of Vienna, Harvard University and the University of Queensland discovered that the slowing down of time can explain another perplexing phenomenon: the transition from quantum behavior to our classical, everyday world.

    The image below is an illustration of a molecule in the presence of gravitational time dilation. The molecule is in a quantum superposition of being several places at the same time.


    Quantum theory, the other major discovery in physics in the early 20th century, predicts that the fundamental building blocks of nature show fascinating and mind-boggling behavior. Extrapolated to the scales of our everyday life, quantum theory leads to situations such as the famous example of Schroedinger’s cat: the cat is neither dead nor alive, but in a so-called quantum superposition of both.


    Yet such behavior has only been confirmed experimentally with small particles and has never been observed with real-world cats. Therefore, scientists conclude that something must cause the suppression of quantum phenomena on larger, everyday scales. Typically this happens because of interaction with other surrounding particles.

    The research team, headed by Časlav Brukner from the University of Vienna and the Institute of Quantum Optics and Quantum Information, found that time dilation also plays a major role in the demise of quantum effects. They calculated that once the small building blocks form larger, composite objects – such as molecules and eventually larger structures like microbes or dust particles – the time dilation on Earth can cause a suppression of their quantum behavior.

    The tiny building blocks jitter ever so slightly, even as they form larger objects. And this jitter is affected by time dilation: it is slowed down on the ground and speeds up at higher altitudes. The researchers have shown that this effect destroys the quantum superposition and, thus, forces larger objects to behave as we expect in everyday life.

    The results of Pikovski and his co-workers reveal how larger particles lose their quantum behavior due to their own composition, if one takes time dilation into account. This prediction should be observable in experiments in the near future, which could shed some light on the fascinating interplay between the two great theories of the 20th century, quantum theory and general relativity.

    Publication in Nature Physics: “Universal decoherence due to gravitational time dilation”. I. Pikovski, M. Zych, F. Costa, C. Brukner. Nature Physics (2015) doi:10.1038/nphys3366

    See the full article here.


  • richardmitnick 11:05 pm on November 24, 2015 Permalink | Reply
    Tags: General Relativity, Leonard Susskind

    From Nature: “Theoretical physics: Complexity on the horizon” 2014 

    Nature Mag

    28 May 2014
    Amanda Gefter


    When physicist Leonard Susskind gives talks these days, he often wears a black T-shirt proclaiming “I ♥ Complexity”. In place of the heart is a Mandelbrot set, a fractal pattern widely recognized as a symbol for complexity at its most beautiful.

    Initial image of a Mandelbrot set zoom sequence with a continuously colored environment

    That pretty much sums up his message. The 74-year-old Susskind, a theorist at Stanford University in California, has long been a leader in efforts to unify quantum mechanics with the general theory of relativity, Albert Einstein’s framework for gravity. The quest for the elusive unified theory has led him to advocate counter-intuitive ideas, such as superstring theory or the concept that our three-dimensional Universe is actually a two-dimensional hologram. But now he is part of a small group of researchers arguing for a new and equally odd idea: that the key to this mysterious theory of everything is to be found in the branch of computer science known as computational complexity.

    This is not a subfield to which physicists have tended to look for fundamental insight. Computational complexity is grounded in practical matters, such as how many logical steps are required to execute an algorithm. But if the approach works, says Susskind, it could resolve one of the most baffling theoretical conundrums to hit his field in recent years: the black-hole firewall paradox, which seems to imply that either quantum mechanics or general relativity must be wrong. And more than that, he says, computational complexity could give theorists a whole new way to unify the two branches of their science — using ideas based fundamentally on information.

    Behind a firewall

    It all began 40 years ago, when physicist Stephen Hawking at the University of Cambridge, UK, realized that quantum effects would cause a black hole to radiate photons and other particles until it completely evaporates away.

    As other researchers were quick to point out, this revelation brings a troubling contradiction. According to the rules of quantum mechanics, the outgoing stream of radiation has to retain information about everything that ever fell into the black hole, even as the matter falling in carries exactly the same information through the black hole’s event horizon, the boundary inside which the black hole’s gravity gets so strong that not even light can escape. Yet this two-way flow could violate a key law of quantum mechanics known as the no-cloning theorem, which dictates that making a perfect copy of quantum information is impossible.
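    The no-cloning theorem invoked here follows from linearity alone; a standard short argument (not spelled out in the article) runs as follows:

```latex
\text{Suppose a unitary } U \text{ could clone: } U\,(|\psi\rangle \otimes |0\rangle) = |\psi\rangle \otimes |\psi\rangle \text{ for every } |\psi\rangle.
\text{Taking inner products of the outputs for two states } |\psi\rangle,\ |\phi\rangle \text{ and using unitarity,}
\langle\psi|\phi\rangle \;=\; \langle\psi|\phi\rangle^{2} \quad\Longrightarrow\quad \langle\psi|\phi\rangle \in \{0,\,1\}.
```

    So only orthogonal or identical states could ever be copied; a device that duplicates an arbitrary unknown quantum state is impossible.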

    Happily, as Susskind and his colleagues observed (1) in 1995, nature seemed to sidestep any such violation by making it impossible to see both copies at once: an observer who remains outside the horizon cannot communicate with one who has fallen in. But in 2012, four physicists at the University of California, Santa Barbara — Ahmed Almheiri, Donald Marolf, Joseph Polchinski and James Sully, known collectively as AMPS — spotted a dangerous exception to this rule (2). They found a scenario in which an observer could decode the information in the radiation, jump into the black hole and then compare that information with its forbidden duplicate on the way down.

    AMPS concluded that nature prevents this abomination by creating a blazing firewall just inside the horizon that will incinerate any observer — or indeed, any particle — trying to pass through. In effect, space would abruptly end at the horizon, even though Einstein’s gravitational theory says that space must be perfectly continuous there. If AMPS’s theory is true, says Raphael Bousso, a theoretical physicist at the University of California, Berkeley, “this is a terrible blow to general relativity”.

    Does not compute

    Fundamental physics has been in an uproar ever since, as practitioners have struggled to find a resolution to this paradox. The first people to bring computational complexity into the debate were Stanford’s Patrick Hayden, a physicist who also happens to be a computer scientist, and Daniel Harlow, a physicist at Princeton University in New Jersey. If the firewall argument hinges on an observer’s ability to decode the outgoing radiation, they wondered, just how hard is that to do?

    Impossibly hard, they discovered. A computational-complexity analysis showed that the number of steps required to decode the outgoing information would rise exponentially with the number of radiation particles that carry it. No conceivable computer could finish the calculations until long after the black hole had radiated all of its energy and vanished, along with the forbidden information clones. So the firewall has no reason to exist: the decoding scenario that demands it cannot happen, and the paradox disappears.
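    A toy comparison makes the scales vivid. The numbers below are illustrative assumptions, not the Harlow–Hayden calculation: a decoding task needing 2^n sequential operations, set against the standard Hawking evaporation time, which grows as the cube of the black hole's mass.

```python
# Toy scales: exponential decoding time vs. black-hole evaporation time.
# Assumed numbers for illustration only (not the Harlow-Hayden analysis).

import math

G, hbar, c = 6.674e-11, 1.055e-34, 2.998e8
M_sun = 1.989e30  # kg, one solar mass

def evaporation_time_s(M_kg):
    """Hawking evaporation time, t = 5120*pi*G^2*M^3 / (hbar*c^4)."""
    return 5120 * math.pi * G**2 * M_kg**3 / (hbar * c**4)

def decode_time_s(n_qubits, ops_per_s=1e15):
    """Time for 2^n sequential operations at an assumed petahertz rate."""
    return 2.0**n_qubits / ops_per_s

t_evap = evaporation_time_s(M_sun)   # ~6.6e74 s, i.e. ~2e67 years
t_decode = decode_time_s(400)        # 400 radiation qubits already suffice
print(t_decode > t_evap)             # True: the hole is gone long before
```

    A real stellar black hole emits vastly more than 400 radiation quanta, so under these assumptions the exponential wall is insurmountable by an even wider margin.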

    “The black hole’s interior is protected by an armour of computational complexity.”

    Hayden was sceptical of the result at first. But then he and Harlow found much the same answer for many types of black hole (3). “It did seem to be a robust principle,” says Hayden: “a conspiracy of nature preventing you from performing this decoding before the black hole had disappeared on you.”

    The Harlow–Hayden argument made a big impression on Scott Aaronson, who works on computational complexity and the limits of quantum computation at the Massachusetts Institute of Technology in Cambridge. “I regard what they did as one of the more remarkable syntheses of physics and computer science that I’ve seen in my career,” he says.

    It also resonated strongly among theoretical physicists. But not everyone is convinced. Even if the calculation is correct, says Polchinski, “it is hard to see how one would build a fundamental theory on this framework”. Nevertheless, some physicists are trying to do just that. There is a widespread belief in the field that the laws of nature must somehow be based on information. And the idea that the laws might actually be upheld by computational complexity — which is defined entirely in terms of information — offers a fresh perspective.

    It certainly inspired Susskind to dig deeper into the role of complexity. For mathematical clarity, he chose to make his calculations in a theoretical realm known as anti-de Sitter space (AdS). This describes a cosmos that is like our own Universe in the sense that everything in it, including black holes, is governed by gravity. Unlike our Universe, however, it has a boundary — a domain where there is no gravity, just elementary particles and fields governed by quantum physics. Despite this difference, studying physics in AdS has led to many insights, because every object and physical process inside the space can be mathematically mapped to an equivalent object or process on its boundary. A black hole in AdS, for example, is equivalent to a hot gas of ordinary quantum particles on the boundary. Better still, calculations that are complicated in one domain often turn out to be simple in the other. And after the calculations are complete, the insights gained in AdS can generally be translated back into our own Universe.

    Increasing complexity

    Susskind decided to look at a black hole sitting at the centre of an AdS universe, and to use the boundary description to explore what happens inside a black hole’s event horizon. Others had attempted this and failed, and Susskind could see why after he viewed the problem through the lens of computational complexity. Translating from the boundary of the AdS universe to the interior of a black hole requires an enormous number of computational steps, and that number increases exponentially as one moves closer to the event horizon (4). As Aaronson puts it, “the black hole’s interior is protected by an armour of computational complexity”.

    Furthermore, Susskind noticed, the computational complexity tends to grow with time. This is not the increase of disorder, or entropy, that is familiar from everyday physics. Rather, it is a pure quantum effect arising from the way that interactions between the boundary particles cause an explosive growth in the complexity of their collective quantum state.

    If nothing else, Susskind argued, this growth means that complexity behaves much like a gravitational field. Imagine an object floating somewhere outside the black hole. Because this is AdS, he said, the object can be described by some configuration of particles and fields on the boundary. And because the complexity of that boundary description tends to increase over time, the effect is to make the object move towards regions of higher complexity in the interior of the space. But that, said Susskind, is just another way of saying that the object will be pulled down towards the black hole. He captured that idea in a slogan (4): “Things fall because there is a tendency toward complexity.”

    Another implication of increasing complexity turns out to be closely related to an argument (5) that Susskind made last year in collaboration with Juan Maldacena, a physicist at the Institute for Advanced Study in Princeton, New Jersey, and the first researcher to recognize the unique features of AdS. According to general relativity, Susskind and Maldacena noted, two black holes can be many light years apart yet still have their interiors connected by a space-time tunnel known as a wormhole. But according to quantum theory, these widely separated black holes can also be connected by having their states entangled, meaning that information about their quantum states is shared between them in a way that is independent of distance.

    After exploring the many similarities between these connections, Susskind and Maldacena concluded that they were two aspects of the same thing — that the black hole’s degree of entanglement, a purely quantum phenomenon, will determine the wormhole’s width, a matter of pure geometry.

    With his latest work, Susskind says, it turns out that the growth of complexity on the boundary of AdS shows up as an increase in the wormhole’s length. So putting it all together, it seems that entanglement is somehow related to space, and that computational complexity is somehow related to time.

    Susskind is the first to admit that such ideas by themselves are only provocative suggestions; they do not make up a fully fledged theory. But he and his allies are confident that the ideas transcend the firewall paradox.

    “I don’t know where all of this will lead,” says Susskind. “But I believe these complexity–geometry connections are the tip of an iceberg.”

    See the full article for References

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Nature is a weekly international journal publishing the finest peer-reviewed research in all fields of science and technology on the basis of its originality, importance, interdisciplinary interest, timeliness, accessibility, elegance and surprising conclusions. Nature also provides rapid, authoritative, insightful and arresting news and interpretation of topical and coming trends affecting science, scientists and the wider public.

  • richardmitnick 11:22 am on July 26, 2015 Permalink | Reply
    Tags: , , General Relativity, , Time Travel   

    From RT: “Time-traveling photons connect general relativity to quantum mechanics” 

    RT Logo


    23 Jun, 2014
    No Writer Credit

    Space-time structure exhibiting closed paths in space (horizontal) and time (vertical). A quantum particle travels through a wormhole back in time and returns to the same location in space and time. (Photo credit: Martin Ringbauer)

    Scientists have simulated time travel by using particles of light acting as quantum particles sent away and then brought back to their original space-time location. This is a huge step toward marrying two of the most irreconcilable theories in physics.

    Since traveling all the way to a black hole to see if an object you’re holding would bend, break or put itself back together in inexplicable ways is a bit of a trek, scientists have decided to find a point of convergence between general relativity and quantum mechanics in lab conditions, and they achieved success.

    Australian researchers from the UQ’s School of Mathematics and Physics wanted to resolve the discrepancies between two of our most commonly accepted physics theories, which is no easy task: on the one hand, you have Einstein’s theory of general relativity, which predicts the behavior of massive objects like planets and galaxies; but on the other, you have the theory of quantum mechanics, whose laws completely clash with Einstein’s and which describes our world at the molecular level. And this is where things get interesting: we still have no concrete idea of all the principles of movement and interaction that underpin this theory.

    Natural laws of space and time simply break down there.

    The light particles used in the study are known as photons, and in this University of Queensland study, they stood in for actual quantum particles for the purpose of finding out how they behaved while moving through space and time.

    The team simulated the behavior of a single photon that travels back in time through a wormhole and meets its older self – an identical photon. “We used single photons to do this but the time-travel was simulated by using a second photon to play the part of the past incarnation of the time traveling photon,” said UQ Physics Professor Tim Ralph, as quoted by The Speaker.

    The findings were published in the journal Nature Communications and gained support from the country’s key institutions on quantum physics.

    Some of the biggest examples of why the two approaches can’t be reconciled concern the so-called space-time loop. Einstein suggested that you can travel back in time and return to the starting point in space and time. This presented a problem, known commonly as the ‘grandparents paradox,’ theorized by Kurt Gödel in 1949: if you were to travel back in time and prevent your grandparents from meeting, and in so doing prevent your own birth, the classical laws of physics would prevent you from being born.

    But Tim Ralph noted that, as shown in 1991, such situations could be avoided by harnessing the flexible laws of quantum mechanics: “The properties of quantum particles are ‘fuzzy’ or uncertain to start with, so this gives them enough wiggle room to avoid inconsistent time travel situations,” he said.
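    The 1991 resolution alluded to here is David Deutsch's consistency condition: the state of a particle on a closed timelike curve must be a fixed point of one trip around the loop. A minimal sketch of the "paradox" case, where the returning bit is negated, shows why fuzziness helps; this toy example is mine, not the UQ experiment's circuit.

```python
import numpy as np

X = np.array([[0., 1.], [1., 0.]])   # Pauli X: "negate the bit"

def paradox_map(rho):
    """One trip around the time loop: the returning qubit is flipped."""
    return X @ rho @ X

ket0 = np.diag([1., 0.])      # definite classical state "0"
mixed = np.eye(2) / 2         # 50/50 quantum mixture

# The definite classical state is NOT self-consistent (flipping changes it):
print(np.allclose(paradox_map(ket0), ket0))    # False
# ...but the maximally mixed state IS a consistent fixed point:
print(np.allclose(paradox_map(mixed), mixed))  # True
```

    A classical bit has no such escape: it must come back as either 0 or 1, and either choice contradicts itself. The fuzzy quantum mixture is exactly the "wiggle room" Ralph describes.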

    There are still ways in which science hasn’t tested the meeting points between general relativity and quantum mechanics – such as when relativity is tested under extreme conditions, where its laws visibly seem to bend, just like near the event horizon of a black hole.

    But since it’s not really easy to approach one, the UQ scientists were content with testing out these points of convergence on photons.

    “Our study provides insights into where and how nature might behave differently from what our theories predict,” Professor Ralph said.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

  • richardmitnick 10:31 am on July 18, 2015 Permalink | Reply
    Tags: , , , General Relativity, ,   

    From NOVA: “How Time Got Its Arrow” 



    15 Jul 2015

    Lee Smolin, Perimeter Institute for Theoretical Physics

    I believe in time.

    I haven’t always believed in it. Like many physicists and philosophers, I had once concluded from general relativity and quantum gravity that time is not a fundamental aspect of nature, but instead emerges from another, deeper description. Then, starting in the 1990s and accelerated by an eight-year collaboration with the Brazilian philosopher Roberto Mangabeira Unger, I came to believe instead that time is fundamental. (How I came to this is another story.) Now, I believe that by taking time to be fundamental, we might be able to understand how general relativity and the standard model emerge from a deeper theory, why time only goes one way, and how the universe was born.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    Flickr user Robert Couse-Baker, adapted under a Creative Commons license.

    The story starts with change. Science, most broadly defined, is the systematic study of change. The world we observe and experience is constantly changing. And most of the changes we observe are irreversible. We are born, we grow, we age, we die, as do all living things. We remember the past and our actions influence the future. Spilled milk is hard to clean up; a cool drink or a hot bath tend towards room temperature. The whole world, living and non-living, is dominated by irreversible processes, as captured mathematically by the second law of thermodynamics, which holds that the entropy of a closed system usually increases and seldom decreases.

    It may come as a surprise, then, that physics regards this irreversibility as a cosmic accident. The laws of nature as we know them are all reversible when you change the direction of time. Film a process described by those laws, and then run the movie backwards: the rewound version is also allowed by the laws of physics. To be more precise, you may have to change left for right and particles for antiparticles, along with reversing the direction of time, but the standard model of particle physics predicts that the original process and its reverse are equally likely.

    The same is true of Einstein’s theory of general relativity, which describes gravity and cosmology. If the whole universe were observed to run backwards in time, so that it heated up while it collapsed, rather than cooled as it expanded, that would be equally consistent with these fundamental laws, as we currently understand them.

    This leads to a fundamental question: Why, if the laws are reversible, is the universe so dominated by irreversible processes? Why does the second law of thermodynamics hold so universally?

    Gravity is one part of the answer. The second law tells us that the entropy of a closed system, which is a measure of disorder or randomness in the motions of the atoms making up that system, will most likely increase until a state of maximum disorder is reached. This state is called equilibrium. Once it is reached, the system is as mixed as possible, so all parts have the same temperature and all the elements are equally distributed.
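    The drive toward equilibrium can be seen in a classic toy model, the Ehrenfest "two-box" urn (my illustration, not from the essay): particles hop at random between two boxes, and the entropy, the logarithm of the number of arrangements, almost always climbs toward its maximum at the 50/50 split.

```python
import math, random

# Ehrenfest two-box model: N particles, one chosen at random each step
# and moved to the other box. Entropy S = ln C(N, n_left) rises toward
# its maximum at the equal split - the "equilibrium" of the text.

random.seed(1)
N = 1000
n_left = N          # start far from equilibrium: everything in one box

def entropy(n):
    """ln of the number of microstates with n particles in the left box."""
    return math.lgamma(N + 1) - math.lgamma(n + 1) - math.lgamma(N - n + 1)

S0 = entropy(n_left)                   # zero: only one such arrangement
for _ in range(20000):
    if random.random() < n_left / N:   # the chosen particle was on the left
        n_left -= 1
    else:
        n_left += 1

print(entropy(n_left) > S0)          # True: disorder has increased
print(abs(n_left - N // 2) < 100)    # True: near the 50/50 equilibrium
```

    The individual hops are perfectly reversible; the one-way rise of entropy comes entirely from starting in an overwhelmingly improbable state, which is precisely the puzzle the essay goes on to pose for the universe as a whole.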

    But on large scales, the universe is far from equilibrium. Galaxies like ours are continually forming stars, turning nuclear potential energy into heat and light, as they drive the irreversible flows of energy and materials that characterize the galactic disks. On these large scales, gravity fights the decay to equilibrium by causing matter to clump, creating subsystems like stars and planets. This is beautifully illustrated in some recent papers by Barbour, Koslowski and Mercati.

    But this is only part of the answer to why the universe is out of equilibrium. There remains the mystery of why the universe at the big bang was not created in equilibrium to start with, for the picture of the universe given us by observations requires that the universe be created in an extremely improbable state—very far from equilibrium. Why?

    So when we say that our universe started off in a state far from equilibrium, we are saying that it started off in a state that would be very improbable, were the initial state chosen randomly from the set of all possible states. Yet we must accept this vast improbability to explain the ubiquity of irreversible processes in our world in terms of the reversible laws we know.

    In particular, the conditions present in the early universe, being far from equilibrium, are highly irreversible. Run the early universe backwards to a big crunch and they look nothing like the late universe that might be in our future.

    In 1979 Roger Penrose proposed a radical answer to the mystery of irreversibility. His proposal concerned quantum gravity, the long-searched-for unification of all the known laws, which is believed to govern the processes that created the universe in the big bang—or transformed it from whatever state it was in before the big bang.

    Penrose hypothesized that quantum gravity, as the most fundamental law, will be unlike the laws we know in that it will be irreversible. The known laws, along with their time-reversibility, emerge as approximations to quantum gravity when the universe grows large and cool and dilute, Penrose argued. But those approximate laws will act within a universe whose early conditions were set up by the more fundamental, irreversible laws. In this way the improbability of the early conditions can be explained.

    In the intervening years our knowledge of the early universe has been dramatically improved by a host of cosmological observations, but these have only deepened the mysteries we have been discussing. So a few years ago, Marina Cortes, a cosmologist from the Institute for Astronomy in Edinburgh, and I decided to revive Penrose’s suggestion in the light of all the knowledge gained since, both observationally and theoretically.

    Dr. Cortes argued that time is not only fundamental but fundamentally irreversible. She proposed that the universe is made of processes that continuously generate new events from present events. Events happen, but cannot unhappen. The reversal of an event does not erase that event, Cortes says: It is a new event, which happens after it.

    In December of 2011, Dr. Cortes began a three-month visit to Perimeter Institute, where I work, and challenged me to collaborate with her on realizing these ideas. The first result was a model we developed of a universe created by events, which we called an energetic causal set model.

    This is a version of a kind of model called a causal set model, in which the history of the universe is considered to be a discrete set of events related only by cause-and-effect. Our model was different from earlier models, though. In it, events are created by a process which maximizes their uniqueness. More precisely, the process produces a universe created by events, each of which is different from all the others. Space is not fundamental, only the events and the causal process that creates them are fundamental. But if space is not fundamental, energy is. The events each have a quantity of energy, which they gain from their predecessors and pass on to their successors. Everything else in the world emerges from these events and the energy they convey.

    We studied the model universes created by these processes and found that they generally pass through two stages of evolution. In the first stage, they are dominated by the irreversible processes that create the events, each unique. The direction of time is clear. But this gives rise to a second stage in which trails of events appear to propagate, creating emergent notions of particles. Particles emerge only when the second, approximately reversible stage is reached. These emergent particles propagate and appear to interact through emergent laws which seem reversible. In fact, we found, there are many possible models in which particles and approximately reversible laws emerge after a time from a more fundamental irreversible, particle-free system.
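    The flavor of such a construction can be conveyed with a deliberately crude toy (my sketch, emphatically not the Cortes–Smolin dynamics): events are created only "forward", each caused by an existing event and inheriting part of its energy, so the causal set grows irreversibly while total energy is conserved.

```python
import random

# Toy causal-set growth: events happen but cannot unhappen, and energy
# flows from predecessors to successors. Purely illustrative; the real
# energetic causal set model also maximizes the uniqueness of events.

random.seed(0)
events = [{"id": 0, "energy": 1.0, "parents": []}]  # primordial event

for i in range(1, 50):
    parent = random.choice(events)
    transferred = parent["energy"] / 2      # parent passes on half its energy
    parent["energy"] -= transferred
    events.append({"id": i, "energy": transferred, "parents": [parent["id"]]})

total = sum(e["energy"] for e in events)
print(len(events), round(total, 10))        # 50 events, total energy 1.0
```

    Even in this caricature, the two ingredients the authors insist on are visible: causation is intrinsically directed, and energy, not space, is what the events fundamentally carry.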

    This might explain how general relativity and the standard model emerged from a more fundamental theory, as Penrose hypothesized. Could we, we wondered, start with general relativity and, staying within the language of that theory, modify it to describe an irreversible theory? This would give us a framework to bridge the transition between the early, irreversible stage and the later, reversible stage.

    In a recent paper, Marina Cortes, PI postdoc Henrique Gomes and I showed one way to modify general relativity in a way that introduces a preferred direction of time, and we explored the possible consequences for the cosmology of the early universe. In particular, we showed that there were analogies of dark matter and dark energy, but which introduce a preferred direction of time, so a contracting universe is no longer the time-reverse of an expanding universe.

    To do this we had to first modify general relativity to include a physically preferred notion of time. Without that there is no notion of reversing time. Fortunately, such a modification already existed. Called shape dynamics, it had been proposed in 2011 by three young people, including Gomes. Their work was inspired by Julian Barbour, who had proposed that general relativity could be reformulated so that a relativity of size substituted for a relativity of time.

    Using the language of shape dynamics, Cortes, Gomes and I found a way to gently modify general relativity so that little is changed on the scale of stars, galaxies and planets. Nor are the predictions of general relativity regarding gravitational waves affected. But on the scale of the whole universe, and for the early universe, there are deviations where one cannot escape the consequences of a fundamental direction of time.

    Very recently I found still another way to modify the laws of general relativity to make them irreversible. General relativity incorporates effects of two fixed constants of nature, Newton’s constant, which measures the strength of the gravitational force, and the cosmological constant [usually denoted by the Greek capital letter lambda: Λ], which measures the density of energy in empty space. Usually these both are fixed constants, but I found a way they could evolve in time without destroying the beautiful harmony and consistency of the Einstein equations of general relativity.

    These developments are very recent and are far from demonstrating that the irreversibility we see around us is a reflection of a fundamental arrow of time. But they open a way to an understanding of how time got its direction that does not rely on our universe being a consequence of a cosmic accident.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NOVA is the highest rated science series on television and the most watched documentary series on public television. It is also one of television’s most acclaimed series, having won every major television award, most of them many times over.

  • richardmitnick 7:59 am on June 19, 2015 Permalink | Reply
    Tags: , General Relativity,   

    From NOVA: “Do We Need to Rewrite General Relativity?” 



    18 Jun 2015
    Matthew Francis

    A cosmological computer simulation shows dark matter density overlaid with a gas velocity field. Credit: Illustris Collaboration/Illustris Simulation

    General relativity, the theory of gravity Albert Einstein published 100 years ago, is one of the most successful theories we have. It has passed every experimental test; every observation from astronomy is consistent with its predictions. Physicists and astronomers have used the theory to understand the behavior of binary pulsars, predict the black holes we now know pepper every galaxy, and obtain deep insights into the structure of the entire universe.

    Yet most researchers think general relativity is wrong.

    To be more precise: most believe it is incomplete. After all, the other forces of nature are governed by quantum physics; gravity alone has stubbornly resisted a quantum description. Meanwhile, a small but vocal group of researchers thinks that phenomena such as dark matter are actually failures of general relativity, requiring us to look at alternative ideas.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NOVA is the highest rated science series on television and the most watched documentary series on public television. It is also one of television’s most acclaimed series, having won every major television award, most of them many times over.

  • richardmitnick 10:05 am on March 17, 2015 Permalink | Reply
    Tags: , General Relativity, ,   

    From phys.org: “Confirming Einstein, scientists find ‘spacetime foam’ not slowing down photons from faraway gamma-ray burst (Update)” 


    Mar 16, 2015
    No Writer Credit

    This is the “South Pillar” region of the star-forming region called the Carina Nebula. Like cracking open a watermelon and finding its seeds, the infrared telescope “busted open” this murky cloud to reveal star embryos tucked inside finger-like pillars of thick dust. Credit: NASA

    One hundred years after Albert Einstein formulated the General Theory of Relativity, an international team has proposed another experimental proof. In a paper published today in Nature Physics, researchers from the Hebrew University of Jerusalem, the Open University of Israel, Sapienza University of Rome, and University of Montpellier in France, describe a proof for one of the theory’s basic assumptions: the idea that all light particles, or photons, propagate at exactly the same speed.

    The researchers analyzed data, obtained by NASA’s Fermi Gamma-ray Space Telescope, of the arrival times of photons from a distant gamma-ray burst [GRB]. The data showed that photons traveling for billions of years from the distant burst toward Earth all arrived within a fraction of a second of each other.

    NASA Fermi Telescope

    This finding indicates that the photons all moved at the same speed, even though different photons had different energies. This is one of the best measurements ever of the independence of the speed of light from the energy of the light particles.

    Beyond confirming the general theory of relativity, the observation rules out one of the interesting ideas concerning the unification of general relativity and quantum theory. While these two theories are the pillars of physics today, they are still inconsistent, and there is an intrinsic contradiction between the two that is partially based on Heisenberg’s uncertainty principle, which lies at the heart of quantum theory.

    One of the attempts to reconcile the two theories is the idea of “space-time foam.” According to this concept, on a microscopic scale space is not continuous, and instead it has a foam-like structure. The size of these foam elements is so tiny that it is difficult to imagine and is at present impossible to measure directly. However, light particles that are traveling within this foam will be affected by the foamy structure, and this will cause them to propagate at slightly different speeds depending on their energy.

    Yet this experiment shows otherwise. The fact that all the photons with different energies arrived with no time delay relative to each other indicates that such a foamy structure, if it exists at all, has a much smaller size than previously expected.
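    The logic of the bound is simple arithmetic, sketched below with round illustrative numbers (the paper's actual analysis is more careful): if photons travel for time T and arrive within a spread dt, any fractional speed difference can be at most about dt/T.

```python
# Back-of-envelope bound on photon speed differences from a gamma-ray
# burst. Illustrative numbers, not the published analysis.

SECONDS_PER_YEAR = 3.156e7

def speed_difference_bound(dt_s, travel_time_yr):
    """Max fractional speed difference dv/c ~ arrival spread / travel time."""
    return dt_s / (travel_time_yr * SECONDS_PER_YEAR)

# Photons traveling ~7 billion years that arrive within ~1 second:
print(speed_difference_bound(1.0, 7e9))   # ~4.5e-18
```

    A fractional difference of parts in 10^18 is what lets a simple arrival-time measurement constrain structure near the almost unimaginably small Planck scale.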

    “When we began our analysis, we didn’t expect to obtain such a precise measurement,” said Prof. Tsvi Piran, the Schwartzmann University Chair at the Hebrew University’s Racah Institute of Physics and a leader of the research. “This new limit is at the level expected from quantum gravity theories and can direct us how to combine quantum theory and relativity.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    About Phys.org in 100 Words

    Phys.org™ (formerly Physorg.com) is a leading web-based science, research and technology news service which covers a full range of topics. These include physics, earth science, medicine, nanotechnology, electronics, space, biology, chemistry, computer sciences, engineering, mathematics and other sciences and technologies. Launched in 2004, Phys.org’s readership has grown steadily to include 1.75 million scientists, researchers, and engineers every month. Phys.org publishes approximately 100 quality articles every day, offering some of the most comprehensive coverage of sci-tech developments world-wide. Quancast 2009 includes Phys.org in its list of the Global Top 2,000 Websites. Phys.org community members enjoy access to many personalized features such as social networking, a personal home page set-up, RSS/XML feeds, article comments and ranking, the ability to save favorite articles, a daily newsletter, and other options.

  • richardmitnick 3:44 pm on November 10, 2014 Permalink | Reply
    Tags: General Relativity

    From Quanta: “Multiverse Collisions May Dot the Sky” 

    Quanta Magazine

    November 10, 2014
    Jennifer Ouellette

    Like many of her colleagues, Hiranya Peiris, a cosmologist at University College London, once largely dismissed the notion that our universe might be only one of many in a vast multiverse. It was scientifically intriguing, she thought, but also fundamentally untestable. She preferred to focus her research on more concrete questions, like how galaxies evolve.

    Then one summer at the Aspen Center for Physics, Peiris found herself chatting with the Perimeter Institute’s Matt Johnson, who mentioned his interest in developing tools to study the idea. He suggested that they collaborate.

    At first, Peiris was skeptical. “I think as an observer that any theory, however interesting and elegant, is seriously lacking if it doesn’t have testable consequences,” she said. But Johnson convinced her that there might be a way to test the concept. If the universe that we inhabit had long ago collided with another universe, the crash would have left an imprint on the cosmic microwave background (CMB), the faint afterglow from the Big Bang. And if physicists could detect such a signature, it would provide a window into the multiverse.

    The cosmic microwave background, per ESA/Planck

    Erick Weinberg, a physicist at Columbia University, explains this multiverse by comparing it to a boiling cauldron, with the bubbles representing individual universes — isolated pockets of space-time. As the pot boils, the bubbles expand and sometimes collide. A similar process may have occurred in the first moments of the cosmos.

    In the years since their initial meeting, Peiris and Johnson have studied how a collision with another universe in the earliest moments of time would have sent something similar to a shock wave across our universe. They think they may be able to find evidence of such a collision in data from the Planck space telescope, which maps the CMB.

    The project might not work, Peiris concedes. It requires not only that we live in a multiverse but also that our universe collided with another in our primal cosmic history. But if physicists succeed, they will have the first improbable evidence of a cosmos beyond our own.

    When Bubbles Collide

    Multiverse theories were once relegated to science fiction or crackpot territory. “It sounds like you’ve gone to crazy land,” said Johnson, who holds joint appointments at the Perimeter Institute for Theoretical Physics and York University. But scientists have come up with many versions of what a multiverse might be, some less crazy than others.

    The multiverse that Peiris and her colleagues are interested in is not the controversial “many worlds” hypothesis that was first proposed in the 1950s and holds that every quantum event spawns a separate universe. Nor is this concept of a multiverse related to the popular science-fiction trope of parallel worlds, new universes that pinch off from our space-time and become separate realms. Rather, this version arises as a consequence of inflation, a widely accepted theory of the universe’s first moments.

    Inflation holds that our universe experienced a sudden burst of rapid expansion an instant after the Big Bang, blowing up from an infinitesimally small speck to one spanning a quarter of a billion light-years in mere fractions of a second.

    Yet inflation, once started, tends never to stop completely. According to the theory, once the universe begins inflating, the expansion will end in some places, creating regions like the universe we see all around us today. But elsewhere, inflation will simply keep on going eternally into the future.

    This feature has led cosmologists to contemplate a scenario called eternal inflation. In this picture, individual regions of space stop inflating and become “bubble universes” like the one in which we live. But on larger scales, exponential expansion continues forever, and new bubble universes are continually being created. Each bubble is deemed a universe in its own right, despite being part of the same space-time, because an observer could not travel from one bubble to the next without moving faster than the speed of light. And each bubble may have its own distinct laws of physics. “If you buy eternal inflation, it predicts a multiverse,” Peiris said.

    In 2012, Peiris and Johnson teamed up with Anthony Aguirre and Max Wainwright — both physicists at the University of California, Santa Cruz — to build a simulated multiverse with only two bubbles. They studied what happened after the bubbles collided to determine what an observer would see. The team concluded that a collision of two bubble universes would appear to us as a disk on the CMB with a distinctive temperature profile.

    Olena Shmahalo/Quanta Magazine; source: S. M. Feeney et al., Physical Review Letters

    An ancient collision with a bubble universe would have altered the temperature of the cosmic microwave background (left), creating a faint disk in the sky (right) that could potentially be observed.

    To guard against human error — we tend to see the patterns we want to see — they devised a set of algorithms to automatically search for these disks in data from the Wilkinson Microwave Anisotropy Probe (WMAP), a space-based observatory. The program identified four potential regions with temperature fluctuations consistent with what could be a signature of a bubble collision. When data from the Planck satellite becomes available later this year, researchers should be able to improve on that earlier analysis.
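A toy version of such a search can be sketched in a few lines. Here I assume, as a simplification of the published collision templates (paraphrased, not reproduced), that the disk’s temperature modulation falls off linearly in the cosine of the angle from the disk centre, and I recover its amplitude from noisy mock data with a matched filter; all units are dimensionless and none of the numbers correspond to real WMAP data:

```python
import numpy as np

rng = np.random.default_rng(0)

def disk_template(theta, theta_crit):
    """Azimuthally symmetric disk: linear-in-cos(theta) modulation inside
    the critical angle, zero outside."""
    t = np.cos(theta) - np.cos(theta_crit)
    return np.where(theta < theta_crit, t, 0.0)

theta = np.linspace(0.0, np.pi / 4, 2000)          # angle from disk centre
template = disk_template(theta, np.radians(10.0))  # a 10-degree disk

# Mock sky patch: the template at a hidden amplitude, buried in white noise.
true_amp = 1.0
data = true_amp * template + rng.normal(0.0, 0.01, theta.size)

# Matched-filter estimate of the amplitude: a_hat = <d, t> / <t, t>.
amp_hat = np.dot(data, template) / np.dot(template, template)
print(f"recovered amplitude: {amp_hat:.3f} (true value {true_amp})")
```

A real pipeline also scans over the disk’s position and size and weighs each candidate against the no-collision hypothesis, but the amplitude estimate at any candidate location has this same matched-filter form.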


    ESA Planck

    Yet detecting convincing signatures of the multiverse is tricky. Simply knowing what an encounter might look like requires a thorough understanding of the dynamics of bubble collisions — something quite difficult to model on a computer, given the complexity of such interactions.

    When tackling a new problem, physicists typically find a good model that they already understand and adapt it by making minor tweaks they call “perturbations.” For instance, to model the trajectory of a satellite in space, a physicist might use the classical laws of motion outlined by Isaac Newton in the 17th century and then make small refinements by calculating the effects of other factors that might influence its motion, such as pressure from the solar wind. For simple systems, there should be only small discrepancies from the unperturbed model. Try to calculate the airflow patterns of a complex system like a tornado, however, and those approximations break down. Perturbations introduce sudden, very large changes to the original system instead of smaller, predictable refinements.

    Modeling bubble collisions during the inflationary period of the early universe is akin to modeling a tornado. By its very nature, inflation stretches out space-time at an exponential rate — precisely the kind of large jumps in values that make calculating the dynamics so challenging.

    “Imagine you start with a grid, but within an instant, the grid has expanded to a massive size,” Peiris said. With her collaborators, she has used techniques like adaptive mesh refinement — an iterative process of winnowing out the most relevant details in such a grid at increasingly finer scales — in her simulations of inflation to deal with the complexity. Eugene Lim, a physicist at King’s College London, has found that an unusual type of traveling wave might help simplify matters even further.
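The refinement idea itself is simple to illustrate. The minimal one-dimensional sketch below (my own toy, not the production simulation code) keeps cells coarse where a field is smooth and recursively splits them where it varies quickly, concentrating resolution on a steep “wall”:

```python
import numpy as np

def field(x):
    """A field with a steep wall near x = 0.5, mimicking a bubble edge."""
    return np.tanh(50.0 * (x - 0.5))

def refine(a, b, depth=0, tol=0.05, max_depth=12):
    """Recursively split [a, b] until the field changes by < tol per cell
    (or the maximum depth is reached); returns the list of leaf cells."""
    if depth < max_depth and abs(field(b) - field(a)) > tol:
        m = 0.5 * (a + b)
        return refine(a, m, depth + 1, tol, max_depth) + \
               refine(m, b, depth + 1, tol, max_depth)
    return [(a, b)]

cells = refine(0.0, 1.0)
widths = [b - a for a, b in cells]
print(f"{len(cells)} cells; finest {min(widths):.1e}, coarsest {max(widths):.2f}")
```

Nearly all the cells pile up around x = 0.5: the grid ends up hundreds of times finer at the wall than in the smooth regions, which is exactly the economy that makes resolving fine structure on an exponentially stretching grid tractable.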

    Waves of Translation

    In August 1834, a Scottish engineer named John Scott Russell was conducting experiments along the Union Canal with an eye toward improving the efficiency of the canal boats. One boat being drawn by a team of horses stopped suddenly, and Russell noted a solitary wave in the water that kept rolling forward at a constant speed without losing its shape. The behavior was unlike typical waves, which tend to flatten out or rise to a peak and topple quickly. Intrigued, Russell tracked the wave on horseback for a couple of miles before it finally dissipated in the channel waters. This was the first recorded observation of a soliton.

    Russell was so intrigued by the indomitable wave that he built a 30-foot wave tank in his garden to further study the phenomenon, noting key characteristics of what he called “the wave of translation.” Such a wave could maintain size, shape and speed over longer distances than usual. The speed depended on the wave’s size, and the width depended on the depth of the water. And if a large solitary wave overtook a smaller one, the larger, faster wave would just pass right through.

    Russell’s observations were largely dismissed by his peers because his findings seemed to contradict what was known about water wave physics at the time. It wasn’t until the mid-1960s that such waves were dubbed solitons and physicists realized their usefulness in modeling problems in diverse areas such as fiber optics, biological proteins and DNA. Solitons also turn up in certain configurations of quantum field theory. Poke a quantum field and you will create an oscillation that usually dissipates outward, but configure things in just the right way and that oscillation will maintain its shape — just like Russell’s wave of translation.
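Russell’s observations are captured neatly by the Korteweg–de Vries (KdV) equation, whose one-soliton solution u(x, t) = (c/2) sech²((√c/2)(x − ct)) travels at speed c without changing shape, with taller waves both faster and narrower. A small sketch in dimensionless KdV units (illustrative, not tied to Russell’s actual canal measurements):

```python
import numpy as np

def soliton(x, t, c):
    """One-soliton solution of the KdV equation: height c/2, speed c."""
    return 0.5 * c / np.cosh(0.5 * np.sqrt(c) * (x - c * t)) ** 2

x = np.linspace(-20.0, 60.0, 4001)

# Shape-preserving travel: after time t = 10 the crest has moved by c * t.
c = 2.0
early, late = soliton(x, 0.0, c), soliton(x, 10.0, c)
shift = x[np.argmax(late)] - x[np.argmax(early)]
print(f"crest moved {shift:.2f} (expected {c * 10.0:.2f})")

# Speed grows with size: the peak height is c/2, so faster waves are taller.
print(f"peak heights: c=1 -> {soliton(0.0, 0.0, 1.0):.2f}, "
      f"c=4 -> {soliton(0.0, 0.0, 4.0):.2f}")
```

The same amplitude–speed relation underlies the overtaking behaviour Russell saw: in the full two-soliton KdV solution, the larger wave catches the smaller one and both emerge intact, shifted slightly in position.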

    Because solitons are so stable, Lim believes they could work as a simplified toy model for the dynamics of bubble collisions in the multiverse, providing physicists with better predictions of what kinds of signatures might show up in the CMB. If his hunch is right, the expanding walls of our bubble universe are much like solitons.

    However, while it is a relatively straightforward matter to model a solitary standing wave, the dynamics become vastly more complicated and difficult to calculate when solitons collide and interact, forcing physicists to rely on computer simulations instead. In the past, researchers have used a particular class of soliton with an exact mathematical solution and tweaked that model to suit their purposes. But this approach only works if the target system under study is already quite similar to the toy model; otherwise the changes are too large to calculate.

    To get around that hurdle, Lim devised a neat trick based on a quirky feature of soliton collisions. When imagining two objects colliding, we naturally assume that the faster they are moving, the greater the impact and the more complicated the dynamics. Two cars ramming each other at high speeds, for instance, will produce scattered debris, heat, noise and other effects. The same is true for colliding solitons — at least initially. Collide two solitons very slowly, and there will be very little interaction, according to Lim. As the speed increases, the solitons interact more strongly.

    But Lim found that as the speed continues to increase, the pattern eventually reverses: The soliton interaction begins to decrease. By the time they are traveling at the speed of light, there is no interaction at all. “They just fly right past each other,” Lim said. “The faster you collide two solitons, the simpler they become.” The lack of interactions makes it easier to model the dynamics of colliding solitons, as well as colliding bubble universes with solitons as their “edges,” since the systems are roughly similar.
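That pass-through behaviour is easy to reproduce numerically in the sine-Gordon model, a standard integrable toy for soliton (“kink”) collisions; the setup below is my own illustration, not the model from Lim’s work. A kink and an antikink are launched at each other at 0.9c and, after colliding, re-emerge and keep going:

```python
import numpy as np

L, N = 40.0, 1600
x = np.linspace(-L, L, N)
dx = x[1] - x[0]
dt = 0.4 * dx                      # safely inside the CFL stability limit
v = 0.9                            # collision speed (units of c)
gamma = 1.0 / np.sqrt(1.0 - v * v)

def pair(t):
    """Kink at x = -15 moving right plus antikink at x = +15 moving left."""
    kink = 4.0 * np.arctan(np.exp(gamma * (x + 15.0 - v * t)))
    anti = 4.0 * np.arctan(np.exp(-gamma * (x - 15.0 + v * t)))
    return kink + anti - 2.0 * np.pi

# Leapfrog integration of the sine-Gordon equation phi_tt = phi_xx - sin(phi).
phi_prev, phi = pair(-dt), pair(0.0)
for _ in range(int(30.0 / dt)):    # run well past the collision at t ~ 17
    lap = (np.roll(phi, 1) - 2.0 * phi + np.roll(phi, -1)) / dx**2
    phi_prev, phi = phi, 2.0 * phi - phi_prev + dt**2 * (lap - np.sin(phi))

# Each surviving (anti)kink is a place where |phi| passes through pi.
crossings = int(np.sum(np.diff(np.sign(np.abs(phi) - np.pi)) != 0))
print(f"solitons after the collision: {crossings}")
```

Both solitons survive the ultra-relativistic collision and fly apart again, matching the “fly right past each other” limit described above.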

    According to Johnson, Lim has uncovered a very simple rule that can be applied broadly: Multiverse interactions are weak during high-speed collisions, making it easier to simulate the dynamics of those encounters. One can simply create a new model of the multiverse, use solitons as a tool to map the new model’s expected signatures onto cosmic microwave data, and rule out any theories that don’t match what researchers see. This process would help physicists identify the most viable models for the multiverse, which — while still speculative — would be consistent both with the latest observational data and with inflationary theory.

    The Multiverse’s Case for String Theory

    One reason that more physicists are taking the idea of the multiverse seriously is that certain such models could help resolve a significant challenge in string theory. One of the goals of string theory has been to unify quantum mechanics and general relativity, two separate “rule books” in physics that govern very different size scales, into a single, simple solution.

    But around 10 years ago, “the dream of string theory kind of exploded,” Johnson said — and not in a good way. Researchers began to realize that string theory doesn’t provide a unique solution. Instead, it “gives you the theory of a vast number of worlds,” Weinberg said. A common estimate — one that Weinberg thinks is conservative — is 10^500 possibilities. This panoply of worlds implies that string theory can predict every possible outcome.

    The multiverse would provide a possible means of incorporating all the different worlds predicted by string theory. Each version could be realized in its own bubble universe. “Everything depends on which part of the universe you live in,” Lim said.

    Peiris acknowledges that this argument has its critics. “It can predict anything, and therefore it’s not valid,” Peiris said of the reasoning typically used to dismiss the notion of a multiverse as a tautology, rather than a true scientific theory. “But I think that’s the wrong way to think about it.” The theory of evolution, Peiris argues, also resembles a tautology in certain respects — “an organism exists because it survived” — yet it holds tremendous explanatory power. It is a simple model that requires little initial input to produce the vast diversity of species we see today.

    A multiverse model tied to eternal inflation could have the same kind of explanatory power. In this case, the bubble universes function much like speciation. Those universes that happen to have the right laws of physics will eventually “succeed” — that is, they will become home to conscious observers like ourselves. If our universe is one of many in a much larger multiverse, our existence seems less unlikely.

    Uncertain Signals

    Ultimately, however, Peiris’ initial objection still stands: Without some means of gathering experimental evidence, the multiverse hypothesis will be untestable by definition. As such, it will lurk on the fringes of respectable physics — hence the strong interest in detecting bubble collision signatures in the CMB.

    Of course, “just because these bubble collisions can leave a signature doesn’t mean they do leave a signature,” Peiris emphasized. “We need nature to be kind to us.” An observable signal could be a rare find, given how quickly space expanded during inflation. The collisions may not have been rare, but subsequent inflation “tends to dilute away the effects of the collision just like it dilutes away all other prior ‘structure’ in the early universe, leaving you with a small chance of seeing a signal in the CMB sky,” Peiris said.

    “My own feeling is you need to adjust the numbers rather finely to get it to work,” Weinberg said. The rate of formation of the bubble universes is key. If they had formed slowly, collisions would not have been possible because space would have expanded and driven the bubbles apart long before any collision could take place. Alternatively, if the bubbles had formed too quickly, they would have merged before space could expand sufficiently to form disconnected pockets. Somewhere in between is the Goldilocks rate, the “just right” rate at which the bubbles would have had to form for a collision to be possible.

    Researchers also worry about finding a false positive. Even if such a collision did happen and evidence was imprinted on the CMB, spotting the telltale pattern would not necessarily constitute evidence of a multiverse. “You can get an effect and say it will be consistent with the calculated predictions for these [bubble] collisions,” Weinberg said. “But it might well be consistent with lots of other things.” For instance, a distorted CMB might be evidence of theoretical entities called cosmic strings. These are like the cracks that form in the ice when a lake freezes over, except here the ice is the fabric of space-time. Magnetic monopoles are another hypothetical defect that could affect the CMB, as could knots or twists in space-time called textures.

    Weinberg isn’t sure it would even be possible to tell the difference between these different possibilities, especially because many models of eternal inflation exist. Without knowing the precise details of the theory, trying to make a positive identification of the multiverse would be like trying to distinguish between the composition of two meteorites that hit the roof of a house solely by the sound of the impacts, without knowing how the house is constructed and with what materials.

    Should a signature for a bubble collision be confirmed, Peiris doesn’t see a way to study another bubble universe any further because by now it would be entirely out of causal contact with ours. But it would be a stunning validation that the notion of a multiverse deserves a seat at the testable physics table.

    And should that signal turn out to be evidence for cosmic strings or magnetic monopoles instead, it would still constitute exciting new physics at the frontier of cosmology. In that respect, “the cosmic microwave background radiation is the underpinning of modern cosmology,” Peiris said. “It’s the gift that keeps on giving.”

    See the full article, with video, here.

    Formerly known as Simons Science News, Quanta Magazine is an editorially independent online publication launched by the Simons Foundation to enhance public understanding of science. Why Quanta? Albert Einstein called photons “quanta of light.” Our goal is to “illuminate science.” At Quanta Magazine, scientific accuracy is every bit as important as telling a good story. All of our articles are meticulously researched, reported, edited, copy-edited and fact-checked.


