Tagged: Theoretical Physics

  • richardmitnick, 1:52 pm on May 13, 2016
    Tags: Theoretical Physics

    From FNAL: “What do theorists do?” 


    Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

    May 13, 2016
    Leah Hesla
    Rashmi Shivni

    Pilar Coloma (left) and Seyda Ipek write calculations from floor to ceiling as they try to find solutions to lingering questions about our current models of the universe. Photo: Rashmi Shivni, OC

    Some of the ideas you’ve probably had about theoretical physicists are true.

    They toil away at complicated equations. The amount of time they spend on their computers rivals that of millennials on their hand-held devices. And almost nothing of what they turn up will ever be understood by most of us.

    The statements are true, but as you might expect, the resulting portrait of ivory tower isolation misses the mark.

    The theorist’s task is to explain why we see what we see and predict what we might expect to see, and such pronouncements can’t be made from the proverbial armchair. Theorists work with experimentalists, their counterparts in the proverbial field, as a vital part of the feedback loop of scientific investigation.

    “Sometimes I bounce ideas off experimentalists and learn from what they have seen in their results,” said Fermilab theorist Pilar Coloma, who studies neutrino physics. “Or they may find something profound in theory models that they want to test. My job is all about pushing the knowledge forward so other people can use it.”

    Predictive power

    Theorists in particle physics — the Higgses and Hawkings of the world — push knowledge by making predictions about particle interactions. Starting from the framework known as the Standard Model, they calculate, say, the likelihood of numerous outcomes from the interaction of two electrons, like a blackjack player scanning through the possibilities for the dealer’s next draw.

    The Standard Model of elementary particles (schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    Experimentalists can then seek out the predicted phenomena, rooting around in the data for a never-before-seen signal.

    Theorists’ predictions keep experimentalists from having to shoot in the dark. Like an experienced paleontologist, the theorist can tell the experimentalist where to dig to find something new.

    “We simulate many fake events,” Coloma said. “The simulated data determines the prospects for an experiment or puts a bound on a new physics model.”
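What "simulating fake events" to put a bound on a model means can be sketched with a toy example. The code below is not Fermilab's actual simulation chain, and every number in it (background rate, observed count, trial counts) is invented for illustration: it generates Poisson-distributed pseudo-experiments and asks how much hypothetical signal could hide above a known background before the data would rule it out.

```python
import math
import random

random.seed(1)

# Toy illustration (not Fermilab's actual tools): generate Poisson
# pseudo-experiments and ask how much hypothetical signal could hide
# above a known background. All numbers are invented for the sketch.
expected_background = 50.0   # mean background events in the search region
observed = 55                # hypothetical observed event count

def poisson_sample(mean):
    """Draw one Poisson-distributed count (Knuth's method; fine at this scale)."""
    threshold = math.exp(-mean)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

def fraction_at_or_below(signal, trials=5000):
    """Fraction of background-plus-signal pseudo-experiments that yield
    no more events than actually observed. A tiny fraction means the
    signal hypothesis overshoots the data and can be excluded."""
    hits = sum(poisson_sample(expected_background + signal) <= observed
               for _ in range(trials))
    return hits / trials

# Scan hypothetical signal strengths; below ~0.05 a signal of that size
# is disfavored at roughly the 95% confidence level.
for s in [0, 10, 20, 30]:
    print(s, round(fraction_at_or_below(s), 3))
```

Real analyses layer detector simulation and far more careful statistics on top of this skeleton, but the logic (pseudo-data determines the prospects or the bound) is the same.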

    The Higgs boson provides one example.

    CERN ATLAS Higgs event

    By 2011, a year before CERN’s ATLAS and CMS experiments announced they’d discovered the Higgs boson, theorists had put forth nearly 100 predictions for the particle’s mass, arrived at by almost as many different methods. Many of the predictions were indeed in the neighborhood of the mass as measured by the two experiments.


    And like the paleontologist presented with a new artifact, the theorist also offers explanations for unexplained sightings in experimentalists’ data. She might compare the particle signatures in the detector against her many fake events. Or given an intriguing measurement, she might fold it into the next iteration of calculations. If experimentalists see a particle made of a quark combination not yet on the books, theorists would respond by explaining the underlying mechanism or, if there isn’t one yet, work it out.

    “Experimentalists give you information. ‘We think this particle is of this type. Do you know of any Standard Model particle that fits?’” said Seyda Ipek, a theorist studying the matter-antimatter imbalance in the universe. “At first it might not be obvious, because when you add something new, you change the other observations you know are in the Standard Model, and that puts a constraint on your models.”

    And since the grand aim of particle physics theory is to be able to explain all of nature, the calculation developed to explain a new phenomenon must be extendible to a general principle.

    “Unless you have a very good prediction from theory, you can’t convert that experimental measurement into a parameter that appears in the underlying theory of the Standard Model,” said Fermilab theorist John Campbell, who works on precision theoretical predictions for the ATLAS and CMS experiments at the Large Hadron Collider.

    Calculating moves

    The theorist’s calculation starts with the prospect of a new measurement or a hole in a theory.

    “You look at the interesting things that an experiment is going to measure or that you have a chance of measuring,” Campbell said. “If the data agrees with theory everywhere, there’s not much room for new physics. So you look for small deviations that might be a sign of something. You’re really trying to dream up a new set of interactions that might explain why the data doesn’t agree somewhere.”

    In its raw form, particle physics data is the amount and location of the energy a particle deposits in a particle detector. The more sensitive the detector, the more accurate the experimentalists’ measurement, and the more precise the corresponding calculation needs to be.

    Fermilab theorists John Campbell (left) and Ye Li work on a calculation that describes the interactions you might expect to see in the complicated environment of the LHC. Photo: Rashmi Shivni

    The CMS detector at the Large Hadron Collider, for example, allows scientists to measure some probabilities of particle interactions to within a few percent. And that’s after taking into account that it takes one million or even one billion proton-proton collisions to produce just one interesting interaction that CMS would like to measure.

    “When you’re making the measurement that accurately, it demands a prediction at a very high level,” Campbell said. “If you’re looking for something unexpected, then you need to know the expected part in quite a lot of detail.”

    A paleontologist recognizes the vertebra of a brachiosaurus, and the theoretical particle physicist knows what the production of a pair of top quarks looks like in the detector. A departure from the known picture triggers him to take action.

    “So then you embark on this calculation,” Campbell said.

    Embark, indeed. These calculations are not pencil-and-paper assignments. A single calculation predicting the details of a particle interaction, for example, can be a prodigious effort that takes months or years.

    So-called loop corrections are one example: Theorists home in on what happens during a particle event by adding detail — a correction — to an approximate picture.

    Consider two electrons that approach each other, exchange a photon and diverge. Zooming in further, you predict that the photon emits and reabsorbs yet another pair of particles before it itself is reabsorbed by the electron pair. And perhaps you predict that, at the same time, one of the electrons emits and reabsorbs another photon all on its own.

    Each additional quantum-scale effect, or loop, in the big-picture interaction is like pennies on the dollar, changing the accounting of the total transaction — the precision of a particle mass calculation or of the interaction strength between two particles.
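This bookkeeping has a famous concrete instance: the electron's anomalous magnetic moment, whose QED prediction is a series in (alpha/pi) with one coefficient per loop order. The sketch below uses the published coefficients quoted to a few digits (Schwinger's exact 1/2 at one loop, numerical results beyond), purely to show how each loop order adjusts the total by ever-smaller amounts.

```python
import math

# Loop corrections in miniature: the electron's anomalous magnetic
# moment a_e as a series in (alpha/pi), one coefficient per loop order.
# Coefficients are the standard published values, quoted to a few digits.
alpha = 1 / 137.035999          # fine-structure constant (approximate)
x = alpha / math.pi

coefficients = [0.5, -0.328478965, 1.181241456]   # 1-, 2-, 3-loop

partial = 0.0
for loops, c in enumerate(coefficients, start=1):
    partial += c * x ** loops
    print(f"{loops}-loop prediction: a_e ≈ {partial:.12f}")

# Each order shifts the total by ever-smaller "pennies on the dollar",
# yet precision experiments resolve exactly those pennies.
```

The three-loop total agrees with the measured value to better than one part in ten million, which is why experimenters keep demanding (and theorists keep computing) the next loop.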

    With each additional loop, the task of performing the calculation becomes that much more formidable. (“Loop” reflects how the effects are represented pictorially in Feynman diagrams — details in the approximate picture of the interaction.) Theorists finished the one-loop corrections for the production of a Higgs boson from two protons in 1991. It took another 10 years to complete the two-loop corrections for the process, and it wasn’t until this year, 2016, that they finished computing the three-loop corrections. Precise measurements at the Large Hadron Collider would (and do) require precise predictions to determine the kind of Higgs boson that scientists would see, demanding the decades-long investment.

    “Doing these calculations is not straightforward, or we would have done them a long time ago,” Campbell said.

    Once the theorist completes a calculation, they might publish a paper or otherwise make their code broadly available. From there, experimentalists can use the code to simulate how it will look in the detector. Farms of computers map out millions of fake events that take into account the new predictions provided courtesy of the theorist.

    “Without a network of computers available, our studies can’t be done in a reasonable time,” Coloma said. “A single computer can not analyze millions of data points, just as a human being could never take on such a task.”

    If the simulation shows that, for example, a particle might decay in more ways than what the experiment has seen, the theorist could suggest that experimentalists expand their search.

    “We’ve pushed experiments to look in different channels,” Ipek said. “They could look into decays of particles into two-body states, but why not also 10-body states?”

    Theorists also work with an experiment, or multiple experiments, to put their calculations to best use. Armed with code, experimentalists can change a parameter or two to guide them in their search for new physics. What happens, for example, if the Higgs boson interacts a little more strongly with the top quark than we expect? How would that change what we see in our detectors?

    “That’s a question they can ask and then answer,” Campbell said. “Anyone can come up with a new theory. It is best to try to provide a concrete plan that they can follow.”

    Outlandish theories and concrete plans

    Concrete plans ensure a fruitful relationship between experiment and theory. The wilder, unconventional theories scientists dream up take the field into exciting, uncharted territory, and they, too, have their utility.

    Theorists who specialize in physics beyond the Standard Model, for example, generate thousands of theories worldwide for new physics: phenomena that would show up as energy deposits in the detector where you don’t expect to see them.

    “Even if things don’t end up existing, it encourages the experiment to look at its data in different ways,” Campbell said. An experiment could take so much data that you might worry that some fun effect is hiding, never to be seen. Having truckloads of theories helps mitigate against that. “You’re trying to come up with as many outlandish ideas as you can in the hope that you cover as many of those possibilities as you can.”

    Theorists bridge the gap between the pure mathematics that describes nature and the data through which nature manifests.

    “The field itself is challenging, but theory takes us to new places and helps us imagine new phenomena,” Ipek said. “We collectively work toward understanding every detail of our universe, and that’s what ultimately matters most.”

    See the full article here.

    Please help promote STEM in your local schools.


    STEM Education Coalition


    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a U.S. Department of Energy national laboratory specializing in high-energy particle physics. It is America’s premier laboratory for particle physics and accelerator research. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.

     
  • richardmitnick, 4:12 pm on January 2, 2016
    Tags: Theoretical Physics

    From PI via The Daily Galaxy: “The Big Bang was a Mirage from a Collapsing Higher-Dimensional Star” (February 2015, but very interesting)


    February 14, 2015
    No writer credit


    The Big Bang was a mirage from a collapsing higher-dimensional star, theorists propose. While the recent ESA Planck results “prove that inflation is correct,” they leave open the question of how inflation happened.


    The new study could help to show how inflation was triggered by the motion of the Universe through a higher-dimensional reality.
    The event horizon of a black hole — the point of no return for anything that falls in — is a spherical surface. In a higher-dimensional universe, a black hole could have a three-dimensional event horizon, which could spawn a whole new universe as it forms.

    It could be time to bid the Big Bang bye-bye. Cosmologists have speculated that the Universe formed from the debris ejected when a four-dimensional star collapsed into a black hole — a scenario that would help to explain why the cosmos seems to be so uniform in all directions.

    Cosmic microwave background as measured by ESA/Planck

    The standard Big Bang model tells us that the Universe exploded out of an infinitely dense point, or singularity. But nobody knows what would have triggered this outburst: the known laws of physics cannot tell us what happened at that moment.

    “For all physicists know, dragons could have come flying out of the singularity,” says Niayesh Afshordi, an astrophysicist at the Perimeter Institute for Theoretical Physics in Waterloo, Canada.

    It is also difficult to explain how a violent Big Bang would have left behind a Universe that has an almost completely uniform temperature, because there does not seem to have been enough time since the birth of the cosmos for it to have reached temperature equilibrium.

    To most cosmologists, the most plausible explanation for that uniformity is that, soon after the beginning of time, some unknown form of energy made the young Universe inflate at a rate that was faster than the speed of light. That way, a small patch with roughly uniform temperature would have stretched into the vast cosmos we see today. But Afshordi notes that “the Big Bang was so chaotic, it’s not clear there would have been even a small homogenous patch for inflation to start working on”.

    In a paper posted last week on the arXiv preprint server [1], Afshordi and his colleagues turn their attention to a proposal made in 2000 by a team including Gia Dvali, a physicist now at the Ludwig Maximilian University of Munich, Germany. In that model, our three-dimensional (3D) Universe is a membrane, or brane, that floats through a ‘bulk universe’ that has four spatial dimensions.

    Afshordi’s team realized that if the bulk universe contained its own four-dimensional (4D) stars, some of them could collapse, forming 4D black holes in the same way that massive stars in our Universe do: they explode as supernovae, violently ejecting their outer layers, while their inner layers collapse into a black hole.

    In our Universe, a black hole is bounded by a spherical surface called an event horizon. Whereas in ordinary three-dimensional space a black hole’s boundary is a two-dimensional object (a surface), in the bulk universe the event horizon of a 4D black hole would be a 3D object, a shape called a hypersphere. When Afshordi’s team modelled the death of a 4D star, they found that the ejected material would form a 3D brane surrounding that 3D event horizon and slowly expand.
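The jump from a 2D horizon to a 3D one follows from standard geometry rather than anything specific to the paper. As a generic sketch: the boundary of a ball in n-dimensional space has measure 2·pi^(n/2)/Gamma(n/2)·r^(n-1), which reproduces the familiar 4·pi·r² surface in three dimensions and gives the 3D hypersphere's "surface volume" 2·pi²·r³ in four.

```python
from math import pi, gamma

# Generic geometry (not the paper's own calculation): the boundary of a
# ball in n-dimensional space has measure 2 * pi^(n/2) / Gamma(n/2) * r^(n-1).
def sphere_surface(n_ambient, r):
    n = n_ambient
    return 2 * pi ** (n / 2) / gamma(n / 2) * r ** (n - 1)

# Ordinary 3D space: the horizon is a 2D surface of area 4*pi*r^2.
print(sphere_surface(3, 1.0))
# A 4D bulk: the horizon is a 3D hypersphere of "surface volume" 2*pi^2*r^3.
print(sphere_surface(4, 1.0))
```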

    The authors postulate that the 3D Universe we live in might be just such a brane — and that we detect the brane’s growth as cosmic expansion. “Astronomers measured that expansion and extrapolated back that the Universe must have begun with a Big Bang — but that is just a mirage,” says Afshordi.

    The model also naturally explains our Universe’s uniformity. Because the 4D bulk universe could have existed for an infinitely long time in the past, there would have been ample opportunity for different parts of the 4D bulk to reach an equilibrium, which our 3D Universe would have inherited.

    The picture has some problems, however. Earlier this year, the European Space Agency’s Planck space observatory released data that mapped the slight temperature fluctuations in the cosmic microwave background — the relic radiation that carries imprints of the Universe’s early moments. The observed patterns matched predictions made by the standard Big Bang model and inflation, but the black-hole model deviates from Planck’s observations by about 4%. Hoping to resolve the discrepancy, Afshordi says that his team is now refining its model.

    Despite the mismatch, Dvali praises the ingenious way in which the team threw out the Big Bang model. “The singularity is the most fundamental problem in cosmology and they have rewritten history so that we never encountered it,” he says. Whereas the Planck results “prove that inflation is correct”, they leave open the question of how inflation happened, Dvali adds. The study could help to show how inflation is triggered by the motion of the Universe through a higher-dimensional reality, he says.

    Nature doi:10.1038/nature.2013.13743

    See the full article here.


     
  • richardmitnick, 3:47 pm on November 25, 2015
    Tags: Theoretical Physics

    From Nature: “Theoretical physics: The origins of space and time” (2013, but very informative)


    28 August 2013
    Zeeya Merali


    “Imagine waking up one day and realizing that you actually live inside a computer game,” says Mark Van Raamsdonk, describing what sounds like a pitch for a science-fiction film. But for Van Raamsdonk, a physicist at the University of British Columbia in Vancouver, Canada, this scenario is a way to think about reality. If it is true, he says, “everything around us — the whole three-dimensional physical world — is an illusion born from information encoded elsewhere, on a two-dimensional chip”. That would make our Universe, with its three spatial dimensions, a kind of hologram, projected from a substrate that exists only in lower dimensions.

    This ‘holographic principle’ is strange even by the usual standards of theoretical physics. But Van Raamsdonk is one of a small band of researchers who think that the usual ideas are not yet strange enough. If nothing else, they say, neither of the two great pillars of modern physics — general relativity, which describes gravity as a curvature of space and time, and quantum mechanics, which governs the atomic realm — gives any account for the existence of space and time. Neither does string theory, which describes elementary threads of energy.

    Van Raamsdonk and his colleagues are convinced that physics will not be complete until it can explain how space and time emerge from something more fundamental — a project that will require concepts at least as audacious as holography. They argue that such a radical reconceptualization of reality is the only way to explain what happens when the infinitely dense ‘singularity‘ at the core of a black hole distorts the fabric of space-time beyond all recognition, or how researchers can unify atomic-level quantum theory and planet-level general relativity — a project that has resisted theorists’ efforts for generations.

    “All our experiences tell us we shouldn’t have two dramatically different conceptions of reality — there must be one huge overarching theory,” says Abhay Ashtekar, a physicist at Pennsylvania State University in University Park.

    Finding that one huge theory is a daunting challenge. Here, Nature explores some promising lines of attack — as well as some of the emerging ideas about how to test these concepts.


    Gravity as thermodynamics

    One of the most obvious questions to ask is whether this endeavour is a fool’s errand. Where is the evidence that there actually is anything more fundamental than space and time?

    A provocative hint comes from a series of startling discoveries made in the early 1970s, when it became clear that quantum mechanics and gravity were intimately intertwined with thermodynamics, the science of heat.

    In 1974, most famously, Stephen Hawking of the University of Cambridge, UK, showed that quantum effects in the space around a black hole will cause it to spew out radiation as if it was hot. Other physicists quickly determined that this phenomenon was quite general. Even in completely empty space, they found, an astronaut undergoing acceleration would perceive that he or she was surrounded by a heat bath. The effect would be too small to be perceptible for any acceleration achievable by rockets, but it seemed to be fundamental. If quantum theory and general relativity are correct — and both have been abundantly corroborated by experiment — then the existence of Hawking radiation seemed inescapable.

    A second key discovery was closely related. In standard thermodynamics, an object can radiate heat only by decreasing its entropy, a measure of the number of quantum states inside it. And so it is with black holes: even before Hawking’s 1974 paper, Jacob Bekenstein, now at the Hebrew University of Jerusalem, had shown that black holes possess entropy. But there was a difference. In most objects, the entropy is proportional to the number of atoms the object contains, and thus to its volume. But a black hole’s entropy turned out to be proportional to the surface area of its event horizon — the boundary out of which not even light can escape. It was as if that surface somehow encoded information about what was inside, just as a two-dimensional hologram encodes a three-dimensional image.
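The entropy-area relationship can be put in numbers. A minimal sketch of the standard Bekenstein-Hawking formula, S = k_B·A/(4·l_p²), applied to a black hole of one solar mass (SI constants rounded to four digits):

```python
from math import pi

# Numerical illustration of the entropy-area law: Bekenstein-Hawking
# entropy S = k_B * A / (4 * l_p^2), with A the horizon area and l_p
# the Planck length. SI constants rounded to four digits.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
hbar = 1.055e-34     # reduced Planck constant, J s
M_sun = 1.989e30     # solar mass, kg

r_s = 2 * G * M_sun / c ** 2     # Schwarzschild radius, ~3 km
A = 4 * pi * r_s ** 2            # horizon area
l_p_sq = hbar * G / c ** 3       # Planck length squared
S_over_kB = A / (4 * l_p_sq)     # entropy in units of Boltzmann's constant

print(f"r_s ≈ {r_s:.0f} m, S/k_B ≈ {S_over_kB:.2e}")
```

The result, around 10^77 in units of Boltzmann's constant, scales with the horizon area rather than the enclosed volume: doubling the mass quadruples the entropy.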

    In 1995, Ted Jacobson, a physicist at the University of Maryland in College Park, combined these two findings, and postulated that every point in space lies on a tiny black-hole horizon that also obeys the entropy–area relationship. From that, he found, the mathematics yielded Albert Einstein’s equations of general relativity — but using only thermodynamic concepts, not the idea of bending space-time (1).

    “This seemed to say something deep about the origins of gravity,” says Jacobson. In particular, the laws of thermodynamics are statistical in nature — a macroscopic average over the motions of myriad atoms and molecules — so his result suggested that gravity is also statistical, a macroscopic approximation to the unseen constituents of space and time.

    In 2010, this idea was taken a step further by Erik Verlinde, a string theorist at the University of Amsterdam, who showed (2) that the statistical thermodynamics of the space-time constituents — whatever they turned out to be — could automatically generate Newton’s law of gravitational attraction.

    And in separate work, Thanu Padmanabhan, a cosmologist at the Inter-University Centre for Astronomy and Astrophysics in Pune, India, showed (3) that Einstein’s equations can be rewritten in a form that makes them identical to the laws of thermodynamics — as can many alternative theories of gravity. Padmanabhan is currently extending the thermodynamic approach in an effort to explain the origin and magnitude of dark energy: a mysterious cosmic force that is accelerating the Universe’s expansion.

    Testing such ideas empirically will be extremely difficult. In the same way that water looks perfectly smooth and fluid until it is observed on the scale of its molecules — a fraction of a nanometre — estimates suggest that space-time will look continuous all the way down to the Planck scale: roughly 10−35 metres, or some 20 orders of magnitude smaller than a proton.
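The "20 orders of magnitude" comparison is simple arithmetic on the Planck length, l_p = sqrt(ħG/c³), against a proton's size:

```python
from math import sqrt, log10

# Arithmetic behind the "20 orders of magnitude" claim: the Planck
# length compared with a proton's approximate charge radius.
G, c, hbar = 6.674e-11, 2.998e8, 1.055e-34
l_planck = sqrt(hbar * G / c ** 3)   # ≈ 1.6e-35 m
r_proton = 0.84e-15                  # approximate proton charge radius, m

print(f"Planck length ≈ {l_planck:.2e} m")
print(f"orders of magnitude below a proton: {log10(r_proton / l_planck):.1f}")
```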

    But it may not be impossible. One often-mentioned way to test whether space-time is made of discrete constituents is to look for delays as high-energy photons travel to Earth from distant cosmic events such as supernovae and γ-ray bursts. In effect, the shortest-wavelength photons would sense the discreteness as a subtle bumpiness in the road they had to travel, which would slow them down ever so slightly. Giovanni Amelino-Camelia, a quantum-gravity researcher at the University of Rome, and his colleagues have found (4) hints of just such delays in the photons from a γ-ray burst recorded in April. The results are not definitive, says Amelino-Camelia, but the group plans to expand its search to look at the travel times of high-energy neutrinos produced by cosmic events. He says that if theories cannot be tested, “then to me, they are not science. They are just religious beliefs, and they hold no interest for me.”
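The scale of the effect being hunted can be sketched with a generic linear toy model (not any specific theory): a Planck-suppressed delay of roughly dt ~ (E_photon/E_Planck) × travel time. The photon energy and flight time below are hypothetical round numbers chosen to be typical of a γ-ray-burst observation.

```python
# Back-of-envelope version of the proposed test (a generic linear toy
# model, not any specific theory): Planck-scale bumpiness would delay a
# photon by roughly dt ~ (E_photon / E_Planck) * travel_time.
E_planck_GeV = 1.22e19     # Planck energy
E_photon_GeV = 10.0        # a hypothetical high-energy burst photon
travel_time_s = 4.0e17     # ~ a few billion years in flight (hypothetical)

delay_s = (E_photon_GeV / E_planck_GeV) * travel_time_s
print(f"expected delay ≈ {delay_s:.2f} s")
```

A delay of a fraction of a second, accumulated over billions of years, is why only the most distant, most energetic sources offer any hope of seeing such an effect.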

    Other physicists are looking at laboratory tests. In 2012, for example, researchers from the University of Vienna and Imperial College London proposed (5) a tabletop experiment in which a microscopic mirror would be moved around with lasers. They argued that Planck-scale granularities in space-time would produce detectable changes in the light reflected from the mirror (see Nature http://doi.org/njf; 2012).

    Loop quantum gravity

    Even if it is correct, the thermodynamic approach says nothing about what the fundamental constituents of space and time might be. If space-time is a fabric, so to speak, then what are its threads?

    One possible answer is quite literal. The theory of loop quantum gravity, which has been under development since the mid-1980s by Ashtekar and others, describes the fabric of space-time as an evolving spider’s web of strands that carry information about the quantized areas and volumes of the regions they pass through (6). The individual strands of the web must eventually join their ends to form loops — hence the theory’s name — but have nothing to do with the much better-known strings of string theory. The latter move around in space-time, whereas strands actually are space-time: the information they carry defines the shape of the space-time fabric in their vicinity.

    Because the loops are quantum objects, however, they also define a minimum unit of area in much the same way that ordinary quantum mechanics defines a minimum ground-state energy for an electron in a hydrogen atom. This quantum of area is a patch roughly one Planck scale on a side. Try to insert an extra strand that carries less area, and it will simply disconnect from the rest of the web. It will not be able to link to anything else, and will effectively drop out of space-time.

    One welcome consequence of a minimum area is that loop quantum gravity cannot squeeze an infinite amount of curvature onto an infinitesimal point. This means that it cannot produce the kind of singularities that cause Einstein’s equations of general relativity to break down at the instant of the Big Bang and at the centres of black holes.

    In 2006, Ashtekar and his colleagues reported (7) a series of simulations that took advantage of that fact, using the loop quantum gravity version of Einstein’s equations to run the clock backwards and visualize what happened before the Big Bang. The reversed cosmos contracted towards the Big Bang, as expected. But as it approached the fundamental size limit dictated by loop quantum gravity, a repulsive force kicked in and kept the singularity open, turning it into a tunnel to a cosmos that preceded our own.

    This year, physicists Rodolfo Gambini at the Uruguayan University of the Republic in Montevideo and Jorge Pullin at Louisiana State University in Baton Rouge reported (8) a similar simulation for a black hole. They found that an observer travelling deep into the heart of a black hole would encounter not a singularity, but a thin space-time tunnel leading to another part of space. “Getting rid of the singularity problem is a significant achievement,” says Ashtekar, who is working with other researchers to identify signatures that would have been left by a bounce, rather than a bang, on the cosmic microwave background — the radiation left over from the Universe’s massive expansion in its infant moments.

    Loop quantum gravity is not a complete unified theory, because it does not include any other forces. Furthermore, physicists have yet to show how ordinary space-time would emerge from such a web of information. But Daniele Oriti, a physicist at the Max Planck Institute for Gravitational Physics in Golm, Germany, is hoping to find inspiration in the work of condensed-matter physicists, who have produced exotic phases of matter that undergo transitions described by quantum field theory. Oriti and his colleagues are searching for formulae to describe how the Universe might similarly change phase, transitioning from a set of discrete loops to a smooth and continuous space-time. “It is early days and our job is hard because we are fishes swimming in the fluid at the same time as trying to understand it,” says Oriti.

    Causal sets

    Such frustrations have led some investigators to pursue a minimalist programme known as causal set theory. Pioneered by Rafael Sorkin, a physicist at the Perimeter Institute in Waterloo, Canada, the theory postulates that the building blocks of space-time are simple mathematical points that are connected by links, with each link pointing from past to future. Such a link is a bare-bones representation of causality, meaning that an earlier point can affect a later one, but not vice versa. The resulting network is like a growing tree that gradually builds up into space-time. “You can think of space emerging from points in a similar way to temperature emerging from atoms,” says Sorkin. “It doesn’t make sense to ask, ‘What’s the temperature of a single atom?’ You need a collection for the concept to have meaning.”
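A causal set is easy to sketch in miniature. The toy below (a generic illustration, not Sorkin's actual construction) sprinkles random points into a patch of 1+1-dimensional Minkowski space-time and records a link i → j whenever point j lies in point i's future light cone; the resulting partial order, not the coordinates, is what the theory treats as fundamental.

```python
import random

random.seed(0)

# Minimal causal-set sketch: sprinkle points into a patch of
# 1+1-dimensional Minkowski space-time and record i -> j whenever point
# j lies in point i's future light cone (dt > |dx|).
points = [(random.random(), random.random()) for _ in range(50)]   # (t, x)

def precedes(p, q):
    dt, dx = q[0] - p[0], q[1] - p[1]
    return dt > abs(dx)

relations = {(i, j)
             for i, p in enumerate(points)
             for j, q in enumerate(points)
             if precedes(p, q)}

# Causality composes: if i precedes j and j precedes k, then i precedes k.
assert all((i, k) in relations
           for (i, j) in relations
           for (j2, k) in relations if j == j2)
print(f"{len(points)} points, {len(relations)} causal relations")
```

Recovering smooth geometry means asking only order-theoretic questions of such a network, in the same spirit as asking a gas of atoms for its temperature.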

    In the late 1980s, Sorkin used this framework to estimate (9) the number of points that the observable Universe should contain, and reasoned that they should give rise to a small intrinsic energy that causes the Universe to accelerate its expansion. A few years later, the discovery of dark energy confirmed his guess. “People often think that quantum gravity cannot make testable predictions, but here’s a case where it did,” says Joe Henson, a quantum-gravity researcher at Imperial College London. “If the value of dark energy had been larger, or zero, causal set theory would have been ruled out.”

    Causal dynamical triangulations

    That hardly constituted proof, however, and causal set theory has offered few other predictions that could be tested. Some physicists have found it much more fruitful to use computer simulations. The idea, which dates back to the early 1990s, is to approximate the unknown fundamental constituents with tiny chunks of ordinary space-time caught up in a roiling sea of quantum fluctuations, and to follow how these chunks spontaneously glue themselves together into larger structures.

    The earliest efforts were disappointing, says Renate Loll, a physicist now at Radboud University in Nijmegen, the Netherlands. The space-time building blocks were simple hyper-pyramids — four-dimensional counterparts to three-dimensional tetrahedrons — and the simulation’s gluing rules allowed them to combine freely. The result was a series of bizarre ‘universes’ that had far too many dimensions (or too few), and that folded back on themselves or broke into pieces. “It was a free-for-all that gave back nothing that resembles what we see around us,” says Loll.

    But, like Sorkin, Loll and her colleagues found that adding causality changed everything. After all, says Loll, the dimension of time is not quite like the three dimensions of space. “We cannot travel back and forth in time,” she says. So the team changed its simulations to ensure that effects could not come before their cause — and found that the space-time chunks started consistently assembling themselves into smooth four-dimensional universes with properties similar to our own (10).

    Intriguingly, the simulations also hint that soon after the Big Bang, the Universe went through an infant phase with only two dimensions — one of space and one of time. This prediction has also been made independently by others attempting to derive equations of quantum gravity, and even some who suggest that the appearance of dark energy is a sign that our Universe is now growing a fourth spatial dimension. Others have shown that a two-dimensional phase in the early Universe would create patterns similar to those already seen in the cosmic microwave background.

    Holography

    Meanwhile, Van Raamsdonk has proposed a very different idea about the emergence of space-time, based on the holographic principle. Inspired by the hologram-like way that black holes store all their entropy at the surface, this principle was first given an explicit mathematical form by Juan Maldacena, a string theorist at the Institute for Advanced Study in Princeton, New Jersey, who published (11) his influential model of a holographic universe in 1998. In that model, the three-dimensional interior of the universe contains strings and black holes governed only by gravity, whereas its two-dimensional boundary contains elementary particles and fields that obey ordinary quantum laws without gravity.

    Hypothetical residents of the three-dimensional space would never see this boundary, because it would be infinitely far away. But that does not affect the mathematics: anything happening in the three-dimensional universe can be described equally well by equations in the two-dimensional boundary, and vice versa.

    In 2010, Van Raamsdonk studied what that means when quantum particles on the boundary are ‘entangled’ — meaning that measurements made on one inevitably affect the other (12). He discovered that if every particle entanglement between two separate regions of the boundary is steadily reduced to zero, so that the quantum links between the two disappear, the three-dimensional space responds by gradually dividing itself like a splitting cell, until the last, thin connection between the two halves snaps. Repeating that process will subdivide the three-dimensional space again and again, while the two-dimensional boundary stays connected. So, in effect, Van Raamsdonk concluded, the three-dimensional universe is being held together by quantum entanglement on the boundary — which means that in some sense, quantum entanglement and space-time are the same thing.

    Or, as Maldacena puts it: “This suggests that quantum is the most fundamental, and space-time emerges from it.”

    [For references, please see the full article.]

    See the full article here.


    Nature is a weekly international journal publishing the finest peer-reviewed research in all fields of science and technology on the basis of its originality, importance, interdisciplinary interest, timeliness, accessibility, elegance and surprising conclusions. Nature also provides rapid, authoritative, insightful and arresting news and interpretation of topical and coming trends affecting science, scientists and the wider public.

     