Tagged: Gravity

  • richardmitnick 4:33 pm on August 20, 2018 Permalink | Reply
    Tags: Anomalies, Bosons and fermions, Branes, Gravity, Murray Gell-Mann, Parity violation, The second superstring revolution, Theorist John Schwarz

    From Caltech: “Long and Winding Road: A Conversation with String Theory Pioneer John Schwarz”


    From Caltech

    08/20/2018

    Whitney Clavin
    (626) 395-1856
    wclavin@caltech.edu

    John Schwarz discusses the history and evolution of superstring theory.

    John Schwarz. Credit: Seth Hansen for Caltech

    The decades-long quest for a theory that would unify all the known forces—from the microscopic quantum realm to the macroscopic world where gravity dominates—has had many twists and turns. The current leading theory, known as superstring theory and more informally as string theory, grew out of an approach to theoretical particle physics, called S-matrix theory, which was popular in the 1960s. Caltech’s John H. Schwarz, the Harold Brown Professor of Theoretical Physics, Emeritus, began working on the problem in 1971, while a junior faculty member at Princeton University. He moved to Caltech in 1972, where he continued his research with various collaborators from other universities. Their studies in the 1970s and 1980s would dramatically shift the evolution of the theory and, in 1984, usher in what’s known as the first superstring revolution.

    Essentially, string theory postulates that our universe is made up, at its most fundamental level, of infinitesimally tiny vibrating strings and contains 10 dimensions—three for space, one for time, and six other spatial dimensions curled up in such a way that we don’t perceive them in everyday life or even with the most sensitive experimental searches to date. One of the many states of a string is thought to correspond to the particle that carries the gravitational force, the graviton, thereby linking the two pillars of fundamental physics—quantum mechanics and the general theory of relativity, which includes gravity.

    We sat down with Schwarz to discuss the history and evolution of string theory and how the theory itself might have moved past strings.

    What are the earliest origins of string theory?

    The first study often regarded as the beginning of string theory came from an Italian physicist named Gabriele Veneziano in 1968. He discovered a mathematical formula that had many of the properties that people were trying to incorporate in a fundamental theory of the strong nuclear force [a fundamental force that holds nuclei together]. This formula was kind of pulled out of the blue, and ultimately Veneziano and others realized, within a couple years, that it was actually describing a quantum theory of a string—a one-dimensional extended object.

    How did the field grow after this paper?

    In the early ’70s, there were several hundred people worldwide working on string theory. But then everything changed when quantum chromodynamics, or QCD—which was developed by Caltech’s Murray Gell-Mann [Nobel Laureate, 1969] and others—became the favored theory of the strong nuclear force. Almost everyone was convinced QCD was the right way to go and stopped working on string theory. The field shrank down to just a handful of people in the course of a year or two. I was one of the ones who remained.

    How did Gell-Mann become interested in your work?

    Gell-Mann is the one who brought me to Caltech and was very supportive of my work. He took an interest in studies I had done with a French physicist, André Neveu, when we were at Princeton. Neveu and I introduced a second string theory. The initial Veneziano version had many problems. There are two kinds of fundamental particles called bosons and fermions, and the Veneziano theory only described bosons. The one I developed with Neveu included fermions. And not only did it include fermions but it led to the discovery of a new kind of symmetry that relates bosons and fermions, which is called supersymmetry. Because of that discovery, this version of string theory is called superstring theory.

    When did the field take off again?

    A pivotal change happened after work I did with another French physicist, Joël Scherk, whom Gell-Mann and I had brought to Caltech as a visitor in 1974. During that period, we realized that many of the problems we were having with string theory could be turned into advantages if we changed the purpose. Instead of insisting on constructing a theory of the strong nuclear force, we took this beautiful theory and asked what it was good for. And it turned out it was good for gravity. Neither of us had worked on gravity. It wasn’t something we were especially interested in but we realized that this theory, which was having trouble describing the strong nuclear force, gives rise to gravity. Once we realized this, I knew what I would be doing for the rest of my career. And I believe Joël felt the same way. Unfortunately, he died six years later. He made several important discoveries during those six years, including a supergravity theory in 11 dimensions.

    Surprisingly, the community didn’t respond very much to our papers and lectures. We were generally respected and never had a problem getting our papers published, but there wasn’t much interest in the idea. We were proposing a quantum theory of gravity, but in that era physicists who worked on quantum theory weren’t interested in gravity, and physicists who worked on gravity weren’t interested in quantum theory.

    That changed after I met Michael Green [a theoretical physicist then at the University of London and now at the University of Cambridge], at the CERN cafeteria in Switzerland in the summer of 1979. Our collaboration was very successful, and Michael visited Caltech for several extended visits over the next few years. We published a number of papers during that period, which are much cited, but our most famous work was something we did in 1984, which had to do with a problem known as anomalies.

    What are anomalies in string theory?

    One of the facts of nature is that there is what’s called parity violation, which means that the fundamental laws are not invariant under mirror reflection. For example, a neutrino always spins clockwise and not counterclockwise, so it would look wrong viewed in a mirror. When you try to write down a fundamental theory with parity violation, mathematical inconsistencies often arise when you take account of quantum effects. This is referred to as the anomaly problem. It appeared that one couldn’t make a theory based on strings without encountering these anomalies, which, if that were the case, would mean strings couldn’t give a realistic theory. Green and I discovered that these anomalies cancel one another in very special situations.

    When we released our results in 1984, the field exploded. That’s when Edward Witten [a theoretical physicist at the Institute for Advanced Study in Princeton], probably the most influential theoretical physicist in the world, got interested. Witten and three collaborators wrote a paper early in 1985 making a particular proposal for what to do with the six extra dimensions, the ones other than the four for space and time. That proposal looked, at the time, as if it could give a theory that is quite realistic. These developments, together with the discovery of another version of superstring theory, constituted the first superstring revolution.

    Richard Feynman was here at Caltech during that time, before he passed away in 1988. What did he think about string theory?

    After the 1984 to 1985 breakthroughs in our understanding of superstring theory, the subject could no longer be ignored. At that time it acquired some prominent critics, including Richard Feynman and Stephen Hawking. Feynman’s skepticism of superstring theory was based mostly on the concern that it could not be tested experimentally. This was a valid concern, which my collaborators and I shared. However, Feynman did want to learn more, so I spent several hours explaining the essential ideas to him. Thirty years later, it is still true that there is no smoking-gun experimental confirmation of superstring theory, though it has proved its value in other ways. The most likely possibility for experimental support in the foreseeable future would be the discovery of supersymmetric particles. So far, they have not shown up.

    What was the second superstring revolution about?

    The second superstring revolution occurred 10 years later in the mid ’90s. What happened then is that string theorists discovered what happens when particle interactions become strong. Before, we had been studying weakly interacting systems. But as you crank up the strength of the interaction, a 10th dimension of space can emerge. New objects called branes also emerge. Strings are one dimensional; branes have all sorts of dimensions ranging from zero to nine. An important class of these branes, called D-branes, was discovered by the late Joseph Polchinski [BS ’75]. Strings do have a special role, but when the system is strongly interacting, then the strings become less fundamental. It’s possible that in the future the subject will get a new name but until we understand better what the theory is, which we’re still struggling with, it’s premature to invent a new name.

    What can we say now about the future of string theory?

    It’s now over 30 years since a large community of scientists began pooling their talents, and there’s been enormous progress in those 30 years. But the more big problems we solve, the more new questions arise. So, you don’t even know the right questions to ask until you solve the previous questions. Interestingly, some of the biggest spin-offs of our efforts to find the most fundamental theory of nature are in pure mathematics.

    Do you think string theory will ultimately unify the forces of nature?

    Yes, but I don’t think we’ll have a final answer in my lifetime. The journey has been worth it, even if it did take some unusual twists and turns. I’m convinced that, in other intelligent civilizations throughout the galaxy, similar discoveries will occur, or already have occurred, in a different sequence than ours. We’ll find the same result and reach the same conclusions as other civilizations, but we’ll get there by a very different route.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The California Institute of Technology (commonly referred to as Caltech) is a private research university located in Pasadena, California, United States. Caltech has six academic divisions with strong emphases on science and engineering. Its 124-acre (50 ha) primary campus is located approximately 11 mi (18 km) northeast of downtown Los Angeles. “The mission of the California Institute of Technology is to expand human knowledge and benefit society through research integrated with education. We investigate the most challenging, fundamental problems in science and technology in a singularly collegial, interdisciplinary atmosphere, while educating outstanding students to become creative members of society.”

    Caltech campus

     
  • richardmitnick 11:11 am on August 17, 2018 Permalink | Reply
    Tags: Gravity, Is Gravity Quantum?

    From Scientific American: “Is Gravity Quantum?” 


    From Scientific American

    August 14, 2018
    Charles Q. Choi

    Artist’s rendition of gravitational waves generated by merging neutron stars. The primordial universe is another source of gravitational waves, which, if detected, could help physicists devise a quantum theory of gravity. Credit: R. Hurt, Caltech-JPL

    All the fundamental forces of the universe are known to follow the laws of quantum mechanics, save one: gravity. Finding a way to fit gravity into quantum mechanics would bring scientists a giant leap closer to a “theory of everything” that could entirely explain the workings of the cosmos from first principles. A crucial first step in this quest to know whether gravity is quantum is to detect the long-postulated elementary particle of gravity, the graviton. In search of the graviton, physicists are now turning to experiments involving microscopic superconductors, free-falling crystals and the afterglow of the big bang.

    Quantum mechanics suggests everything is made of quanta, or packets of energy, that can behave like both a particle and a wave—for instance, quanta of light are called photons. Detecting gravitons, the hypothetical quanta of gravity, would prove gravity is quantum. The problem is that gravity is extraordinarily weak. To directly observe the minuscule effects a graviton would have on matter, physicist Freeman Dyson famously noted, a graviton detector would have to be so massive that it collapses on itself to form a black hole.

    “One of the issues with theories of quantum gravity is that their predictions are usually nearly impossible to experimentally test,” says quantum physicist Richard Norte of Delft University of Technology in the Netherlands. “This is the main reason why there exist so many competing theories and why we haven’t been successful in understanding how it actually works.”

    In 2015 [Physical Review Letters], however, theoretical physicist James Quach, now at the University of Adelaide in Australia, suggested a way to detect gravitons by taking advantage of their quantum nature. Quantum mechanics suggests the universe is inherently fuzzy—for instance, one can never absolutely know a particle’s position and momentum at the same time. One consequence of this uncertainty is that a vacuum is never completely empty, but instead buzzes with a “quantum foam” of so-called virtual particles that constantly pop in and out of existence. These ghostly entities may be any kind of quanta, including gravitons.

    Decades ago, scientists found that virtual particles can generate detectable forces. For example, the Casimir effect is the attraction or repulsion seen between two mirrors placed close together in vacuum. These reflective surfaces move due to the force generated by virtual photons winking in and out of existence. Previous research suggested that superconductors might reflect gravitons more strongly than normal matter, so Quach calculated that looking for interactions between two thin superconducting sheets in vacuum could reveal a gravitational Casimir effect. The resulting force could be roughly 10 times stronger than that expected from the standard virtual-photon-based Casimir effect.
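    For scale, the standard photon-based Casimir pressure between ideal parallel mirrors has the closed form P = π²ħc/(240 d⁴). A quick back-of-envelope calculation (a sketch added for illustration; the numbers are not from the article) shows how steeply the force grows as the plates approach:

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J*s
C = 2.99792458e8        # speed of light, m/s

def casimir_pressure(d):
    """Attractive Casimir pressure (Pa) between ideal parallel plates separated by d metres."""
    return math.pi**2 * HBAR * C / (240 * d**4)

for d_nm in (1000, 100, 10):
    d = d_nm * 1e-9
    print(f"d = {d_nm:4d} nm -> P = {casimir_pressure(d):.2e} Pa")
```

    At a one-micrometre separation the pressure is only about a millipascal, which is why such measurements need microfabricated plates and extreme isolation; the gravitational analogue Quach predicted would be roughly an order of magnitude stronger.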

    Recently, Norte and his colleagues developed a microchip to perform this experiment. This chip held two microscopic aluminum-coated plates that were cooled almost to absolute zero so that they became superconducting. One plate was attached to a movable mirror, and a laser was fired at that mirror. If the plates moved because of a gravitational Casimir effect, the frequency of light reflecting off the mirror would measurably shift. As detailed online July 20 in Physical Review Letters, the scientists failed to see any gravitational Casimir effect. This null result does not necessarily rule out the existence of gravitons—and thus gravity’s quantum nature. Rather, it may simply mean that gravitons do not interact with superconductors as strongly as prior work estimated, says quantum physicist and Nobel laureate Frank Wilczek of the Massachusetts Institute of Technology, who did not participate in this study and was unsurprised by its null results. Even so, Quach says, this “was a courageous attempt to detect gravitons.”

    Although Norte’s microchip did not discover whether gravity is quantum, other scientists are pursuing a variety of approaches to find gravitational quantum effects. For example, in 2017 two independent studies suggested that if gravity is quantum it could generate a link known as “entanglement” between particles, so that one particle instantaneously influences another no matter where either is located in the cosmos. A tabletop experiment using laser beams and microscopic diamonds might help search for such gravity-based entanglement. The crystals would be kept in a vacuum to avoid collisions with atoms, so they would interact with one another through gravity alone. Scientists would let these diamonds fall at the same time, and if gravity is quantum the gravitational pull each crystal exerts on the other could entangle them together.

    The researchers would seek out entanglement by shining lasers into each diamond’s heart after the drop. If particles in the crystals’ centers spin one way, they would fluoresce, but they would not if they spin the other way. If the spins in both crystals are in sync more often than chance would predict, this would suggest entanglement. “Experimentalists all over the world are curious to take the challenge up,” says quantum gravity researcher Anupam Mazumdar of the University of Groningen in the Netherlands, co-author of one of the entanglement studies.

    Another strategy to find evidence for quantum gravity is to look at the cosmic microwave background [CMB] radiation, the faint afterglow of the big bang, says cosmologist Alan Guth of M.I.T.

    Cosmic microwave background radiation map. Credit: ESA/Planck, 2009 to 2013

    Quanta such as gravitons fluctuate like waves, and the shortest wavelengths would have the most intense fluctuations. When the cosmos expanded staggeringly in size within a sliver of a second after the big bang, according to Guth’s widely supported cosmological model known as inflation, these short wavelengths would have stretched to longer scales across the universe.

    Alan Guth, from Highland Park High School and M.I.T., who first proposed cosmic inflation

    Lambda-Cold Dark Matter, accelerated expansion of the universe, Big Bang and inflation (timeline of the universe), 2010. Credit: Alex Mittelmann, Coldcreation

    This evidence of quantum gravity could be visible as swirls in the polarization, or alignment, of photons from the cosmic microwave background radiation.

    However, the intensity of these patterns of swirls, known as B-modes, depends very much on the exact energy and timing of inflation. “Some versions of inflation predict that these B-modes should be found soon, while other versions predict that the B-modes are so weak that there will never be any hope of detecting them,” Guth says. “But if they are found, and the properties match the expectations from inflation, it would be very strong evidence that gravity is quantized.”

    One more way to find out whether gravity is quantum is to look directly for quantum fluctuations in gravitational waves, which are thought to be made up of gravitons that were generated shortly after the big bang. The Laser Interferometer Gravitational-Wave Observatory (LIGO) made its first detection of gravitational waves in 2015 (announced in 2016), but it is not sensitive enough to detect the fluctuating gravitational waves in the early universe that inflation stretched to cosmic scales, Guth says.


    VIRGO Gravitational Wave interferometer, near Pisa, Italy

    Caltech/MIT Advanced aLigo Hanford, WA, USA installation


    Caltech/MIT Advanced aLigo detector installation Livingston, LA, USA

    Cornell SXS, the Simulating eXtreme Spacetimes (SXS) project

    Gravitational waves. Credit: MPI for Gravitational Physics/W.Benger-Zib

    ESA/eLISA the future of gravitational wave research

    Skymap showing how adding Virgo to LIGO helps reduce the size of the likely source region in the sky. Credit: Giuseppe Greco (Virgo Urbino group)

    A gravitational-wave observatory in space, such as the Laser Interferometer Space Antenna (eLISA, just above), could potentially detect these waves, Wilczek adds.

    In a paper recently accepted by the journal Classical and Quantum Gravity, however, astrophysicist Richard Lieu of the University of Alabama, Huntsville, argues that LIGO should already have detected gravitons if they carry as much energy as some current models of particle physics suggest. It might be that the graviton just packs less energy than expected, but Lieu suggests it might also mean the graviton does not exist. “If the graviton does not exist at all, it will be good news to most physicists, since we have been having such a horrid time in developing a theory of quantum gravity,” Lieu says.

    Still, devising theories that eliminate the graviton may be no easier than devising theories that keep it. “From a theoretical point of view, it is very hard to imagine how gravity could avoid being quantized,” Guth says. “I am not aware of any sensible theory of how classical gravity could interact with quantum matter, and I can’t imagine how such a theory might work.”

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
  • richardmitnick 2:21 pm on September 21, 2017 Permalink | Reply
    Tags: But quantum mechanics doesn’t really define what a measurement is, Gravity, Gravity at its most fundamental comes in indivisible parcels called quanta, GRW model (Ghirardi–Rimini–Weber theory), In quantum theory the state of a particle is described by its wave function, Much like the electromagnetic force comes in quanta called photons

    From New Scientist: “Gravity may be created by strange flashes in the quantum realm” 


    New Scientist

    20 September 2017
    Anil Ananthaswamy

    Gravity comes about in a flash. Emma Johnson/Getty

    HOW do you reconcile the two pillars of modern physics: quantum theory and gravity? One or both will have to give way. A new approach says gravity could emerge from random fluctuations at the quantum level, making quantum mechanics the more fundamental of the two theories.

    Of our two main explanations of reality, quantum theory governs the interactions between the smallest bits of matter. And general relativity deals with gravity and the largest structures in the universe. Ever since Einstein, physicists have been trying to bridge the gap between the two, with little success.

    Part of the problem is knowing which strands of each theory are fundamental to our understanding of reality.

    One approach towards reconciling gravity with quantum mechanics has been to show that gravity at its most fundamental comes in indivisible parcels called quanta, much like the electromagnetic force comes in quanta called photons. But this road to a theory of quantum gravity has so far proved impassable.

    Now Antoine Tilloy at the Max Planck Institute of Quantum Optics in Garching, Germany, has attempted to get at gravity by tweaking standard quantum mechanics.

    In quantum theory, the state of a particle is described by its wave function. The wave function lets you calculate, for example, the probability of finding the particle in one place or another on measurement. Before the measurement, it is unclear whether the particle exists and if so, where. Reality, it seems, is created by the act of measurement, which “collapses” the wave function.
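    To make the probability rule concrete (a minimal numerical sketch, not part of the article): squaring the wave function’s magnitude and integrating over a region gives the probability of finding the particle there. For a Gaussian wave packet of width σ = 1, the chance of finding the particle within one σ of the centre comes out near 68 per cent:

```python
import numpy as np

# Discretize a 1-D Gaussian wave packet psi(x) on a grid.
x = np.linspace(-10.0, 10.0, 4001)
sigma = 1.0
psi = np.exp(-x**2 / (4 * sigma**2))          # unnormalized amplitude
psi /= np.sqrt(np.trapz(np.abs(psi)**2, x))   # normalize: total probability = 1

# Probability of finding the particle in the interval [-sigma, sigma].
mask = (x >= -sigma) & (x <= sigma)
p = np.trapz(np.abs(psi[mask])**2, x[mask])
print(f"P(-1 <= x <= 1) = {p:.3f}")
```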

    But quantum mechanics doesn’t really define what a measurement is. For instance, does it need a conscious human? The measurement problem leads to paradoxes like Schrödinger’s cat, in which a cat can be simultaneously dead and alive inside a box, until someone opens the box to look.

    One solution to such paradoxes is the GRW model (named for Ghirardi, Rimini and Weber), developed in the late 1980s. It incorporates “flashes”: spontaneous random collapses of the wave functions of quantum systems. The outcome is exactly as if measurements were being made, but without explicit observers.

    Tilloy has modified this model to show how it can lead to a theory of gravity. In his model, when a flash collapses a wave function and causes a particle to be in one place, it creates a gravitational field at that instant in space-time. A massive quantum system with a large number of particles is subject to numerous flashes, and the result is a fluctuating gravitational field.

    It turns out that the average of these fluctuations is the gravitational field one expects from Newton’s theory of gravity (arxiv.org/abs/1709.03809). This approach to unifying gravity with quantum mechanics is called semiclassical: gravity arises from quantum processes but remains a classical force. “There is no real reason to ignore this semiclassical approach, in which gravity is classical at the fundamental level,” says Tilloy.
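    The claim that the fluctuating field averages out to Newton’s law can be caricatured numerically (a toy model for intuition only; it is not Tilloy’s construction, and the flash statistics are invented): jitter a point source around a mean position, average the field it produces at a fixed test point, and compare against Newton’s law applied to the mean position.

```python
import numpy as np

G = 6.674e-11   # gravitational constant, SI units
M = 1.0         # source mass, kg
R = 1.0         # distance from the mean source position to the test point, m
JITTER = 1e-3   # spread of flash positions, m (tiny compared with R)

rng = np.random.default_rng(0)
flashes = rng.normal(0.0, JITTER, size=100_000)  # flash positions along the line

# Field magnitude at the test point for each flash, then the average over flashes.
g_per_flash = G * M / (R - flashes) ** 2
g_avg = g_per_flash.mean()
g_newton = G * M / R**2

print(f"average flash-sourced field: {g_avg:.6e} m/s^2")
print(f"Newtonian field:             {g_newton:.6e} m/s^2")
```

    For flash spreads much smaller than the distance to the test point, the two agree to a few parts per million; the residual fluctuations about that average are what would distinguish this picture experimentally.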

    “I like this idea in principle,” says Klaus Hornberger at the University of Duisburg-Essen in Germany. But he points out that other problems need to be tackled before this approach can be a serious contender for unifying all the fundamental forces underpinning the laws of physics on scales large and small. For example, Tilloy’s model can be used to get gravity as described by Newton’s theory, but the maths still has to be worked out to see if it is effective in describing gravity as governed by Einstein’s general relativity.

    Tilloy agrees. “This is very hard to generalise to relativistic settings,” he says. He also cautions that no one knows which of the many tweaks to quantum mechanics is the correct one.

    Nonetheless, his model makes predictions that can be tested. For example, it predicts that gravity will behave differently at the scale of atoms from how it does on larger scales. Should those tests find that Tilloy’s model reflects reality and gravity does indeed originate from collapsing quantum fluctuations, it would be a big clue that the path to a theory of everything would involve semiclassical gravity.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

     
  • richardmitnick 6:57 am on May 16, 2017 Permalink | Reply
    Tags: EPR paradox, Gravity

    From COSMOS: “Using Einstein’s ‘spooky action at a distance’ to hear ripples in spacetime” 


    COSMOS

    16 May 2017
    Cathal O’Connell

    The new technique will aid in the detection of gravitational waves caused by colliding black holes. Henze / NASA

    In new work that connects two of Albert Einstein’s ideas in a way he could scarcely have imagined, physicists have proposed a way to improve gravitational wave detectors, using the weirdness of quantum physics.

    The new proposal, published in Nature Physics, could double the sensitivity of future detectors listening out for ripples in spacetime caused by catastrophic collisions across the universe.

    When the advanced Laser Interferometer Gravitational-Wave Observatory (LIGO) detected gravitational waves in late 2015 it was the first direct evidence of the gravitational waves Einstein had predicted a century before.


    Caltech/MIT Advanced aLigo Hanford, WA, USA installation


    Caltech/MIT Advanced aLigo detector installation Livingston, LA, USA

    Cornell SXS, the Simulating eXtreme Spacetimes (SXS) project


    Gravitational waves. Credit: MPI for Gravitational Physics/W.Benger-Zib

    Now another of Einstein’s predictions – one he regarded as a failure – could potentially double the sensitivity of LIGO’s successors.

    The story starts with his distaste for quantum theory – or at least for the fundamental fuzziness of all things it seemed to demand.

    Einstein thought the universe would ultimately prove predictable and exact, a clockwork universe rather than one where God “plays dice”. In 1935 he teamed up with Boris Podolsky and Nathan Rosen to publish a paper they thought would be a sort of reductio ad absurdum. They hoped to disprove quantum mechanics by following it to its logical, ridiculous conclusion. Their ‘EPR paradox’ (named for their initials) described the instantaneous influence of one particle on another, what Einstein called “spooky action at a distance” because it seemed at first to be impossible.

    Yet this assault on the foundations of quantum physics failed, as the EPR effect turned out not to be a paradox after all. Quantum entanglement, as it’s now known, has been repeatedly proven to exist, and features in several proposed quantum technologies, including quantum computation and quantum cryptography.

    Artistic rendering of the generation of an entangled pair of photons by spontaneous parametric down-conversion as a laser beam passes through a nonlinear crystal. Inspired by an image in Dance of the Photons by Anton Zeilinger. Credit: J-Wiki at English Wikipedia, 31 March 2011

    Now we can add gravitational-wave detection to the list.

    LIGO works by measuring the minute wobbling of mirrors as a gravitational wave stretches and squashes spacetime around them. It is astonishingly sensitive – able to detect wobbling as small as one ten-thousandth the width of a single proton.
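    That figure corresponds to a dimensionless strain, the fractional change in arm length. A back-of-envelope check (assuming 4 km arms and a proton diameter of about 1.7 × 10⁻¹⁵ m; neither number appears in the article):

```python
proton_width = 1.7e-15                # approximate proton diameter, m (assumed)
displacement = proton_width / 10_000  # smallest measurable mirror motion, m
arm_length = 4_000.0                  # LIGO arm length, m (assumed)

strain = displacement / arm_length    # fractional change in arm length
print(f"displacement ~ {displacement:.2e} m")
print(f"strain sensitivity ~ {strain:.2e}")
```

    The result, a few times 10⁻²³, sits in the strain regime usually quoted for advanced gravitational-wave detectors.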

    At this level of sensitivity the quantum nature of light becomes a problem. This means the instrument is limited by the inherent fuzziness of the photons bouncing between its mirrors — this quantum noise washes out weak signals.

    To get around this, physicists plan to use so-called squeezed light to dial down the level of quantum noise near the detector (while increasing it elsewhere).

    The new scheme aids this by adding two new, entangled laser beams to the mix. Because of the ‘spooky’ connection between the two entangled beams, their quantum noise is correlated – detecting one allows the prediction of the other.

    This way, the two beams can be used to probe the main LIGO beam, helping nudge it into a squeezed light state. This reduces the noise to a level that standard quantum theory would deem impossible.
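    The logic of using a correlated companion beam to predict and cancel noise can be sketched with ordinary classical statistics (an illustrative toy only; the real scheme manipulates quantum noise, and all numbers here are invented):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
common = rng.normal(size=n)            # noise shared by the two correlated beams
a = common + 0.1 * rng.normal(size=n)  # readout channel
b = common + 0.1 * rng.normal(size=n)  # witness channel, correlated with a

# Use the witness channel to predict the readout noise, then subtract it.
gain = np.cov(a, b)[0, 1] / np.var(b)
residual = a - gain * b

print(f"raw noise variance:   {np.var(a):.3f}")
print(f"after subtraction:    {np.var(residual):.3f}")
```

    Because most of the noise is common to both channels, the subtraction removes it almost entirely; in the quantum-optical version, exploiting such correlations is what lets the detector reach below the standard quantum limit.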

    The authors of the new proposal write that it is “appropriate for all future gravitational-wave detectors for achieving sensitivities beyond the standard quantum limit”.

    Indeed, the proposal could as much as double the sensitivity of future detectors.

    Over the next 30 years, astronomers aim to improve the sensitivity of the detectors, like LIGO, by 30-fold. At that level, we’d be able to hear all black hole mergers in the observable universe.

    ESA/eLISA, the future of gravitational wave research

    However, along with improved sensitivity, the proposed system would also increase the number of photons lost in the detector. In a perspective piece for Nature Physics [no link], Raffaele Flaminio, a physicist at the National Astronomical Observatory of Japan, points out that the team needs to do more work to understand how this will affect the ultimate performance.

    “But the idea of using Einstein’s most famous (mistaken) paradox to improve the sensitivity of gravitational-wave detectors, enabling new tests of his general theory of relativity, is certainly intriguing,” Flaminio writes. “Einstein’s ideas – whether wrong or right – continue to have a strong influence on physics and astronomy.”

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

     
  • richardmitnick 9:54 am on April 23, 2017 Permalink | Reply
    Tags: , , , Computer modelling, , , , Gravity, Modified Newtonian Dynamics, or MOND, , Simulating galaxies,   

    From Durham: “Simulated galaxies provide fresh evidence of dark matter” 

    Durham U bloc

    Durham University

    21 April 2017
    No writer credit

    1
    A simulated galaxy is pictured, showing the main ingredients that make up a galaxy: the stars (blue), the gas from which the stars are born (red), and the dark matter halo that surrounds the galaxy (light grey). No image credit.

    Further evidence of the existence of dark matter – the mysterious substance that is believed to hold the Universe together – has been produced by cosmologists at Durham University.

    Using sophisticated computer modelling techniques, the research team simulated the formation of galaxies in the presence of dark matter and were able to demonstrate that their size and rotation speed were linked to their brightness in a similar way to observations made by astronomers.


    Alternative theories

    Until now, theories of dark matter have predicted a much more complex relationship between the size, mass and brightness (or luminosity) of galaxies than is actually observed, which has led to dark matter sceptics proposing alternative theories that are seemingly a better fit with what we see.

    The research led by Dr Aaron Ludlow of the Institute for Computational Cosmology, is published in the academic journal, Physical Review Letters.

    Most cosmologists believe that more than 80 per cent of the total mass of the Universe is made up of dark matter – a mysterious particle that has so far not been detected but explains many of the properties of the Universe such as the microwave background measured by the Planck satellite.

    CMB per ESA/Planck

    ESA/Planck

    Convincing explanations

    Alternative theories include Modified Newtonian Dynamics, or MOND. While this does not explain some observations of the Universe as convincingly as dark matter theory it has, until now, provided a simpler description of the coupling of the brightness and rotation velocity, observed in galaxies of all shapes and sizes.
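    The coupling of brightness and rotation velocity mentioned here is the baryonic Tully–Fisher relation, which in the deep-MOND regime takes the simple form v⁴ = G·a₀·M, where a₀ is MOND's acceleration scale. A minimal sketch, assuming the commonly quoted a₀ ≈ 1.2 × 10⁻¹⁰ m/s² and an illustrative galaxy mass:

```python
G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
A0 = 1.2e-10        # MOND acceleration scale, m s^-2 (commonly quoted value)
M_SUN = 1.989e30    # solar mass, kg

def mond_flat_velocity(baryonic_mass_kg):
    """Asymptotic (flat) rotation speed predicted by MOND:
    v**4 = G * a0 * M -- the origin of the tight baryonic
    Tully-Fisher relation between mass and rotation velocity."""
    return (G * A0 * baryonic_mass_kg) ** 0.25

# A roughly Milky-Way-like baryonic mass of ~6e10 solar masses (illustrative)
v = mond_flat_velocity(6e10 * M_SUN)
print(f"predicted flat rotation speed: {v / 1000:.0f} km/s")
```

    The fourth-power dependence is what makes the relation so tight: a tenfold change in mass shifts the predicted velocity by less than a factor of two.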

    The Durham team used powerful supercomputers to model the formation of galaxies of various sizes, compressing billions of years of evolution into a few weeks, in order to demonstrate that the existence of dark matter is consistent with the observed relationship between mass, size and luminosity of galaxies.

    Long-standing problem resolved

    Dr Ludlow said: “This solves a long-standing problem that has troubled the dark matter model for over a decade. The dark matter hypothesis remains the main explanation for the source of the gravity that binds galaxies. Although the particles are difficult to detect, physicists must persevere.”

    Durham University collaborated on the project with Leiden University, Netherlands; Liverpool John Moores University, England and the University of Victoria, Canada. The research was funded by the European Research Council, the Science and Technology Facilities Council, Netherlands Organisation for Scientific Research, COFUND and The Royal Society.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Durham U campus

    Durham University is distinctive – a residential collegiate university with long traditions and modern values. We seek the highest distinction in research and scholarship and are committed to excellence in all aspects of education and transmission of knowledge. Our research and scholarship affect every continent. We are proud to be an international scholarly community which reflects the ambitions of cultures from around the world. We promote individual participation, providing a rounded education in which students, staff and alumni gain both the academic and the personal skills required to flourish.

     
  • richardmitnick 8:38 am on December 29, 2016 Permalink | Reply
    Tags: , , , , DDM hypothesis, , , Gravity, Institute for Nuclear Research in Moscow, , The Universe is losing dark matter and researchers have finally measured how much   

    From Science Alert: “The Universe is losing dark matter, and researchers have finally measured how much” 

    ScienceAlert

    Science Alert

    28 DEC 2016
    JOSH HRALA

    1
    MIPT

    Researchers from Russia have, for the first time, been able to measure the amount of dark matter the Universe has lost since the Big Bang some 13.7 billion years ago, and calculate that as much as 5 percent of dark matter could have deteriorated.

    The finding could explain one of the biggest mysteries in physics – why our Universe appears to function in a slightly different way than it did in the years just after the Big Bang, and it could also shed insight into how it might continue to evolve in future.

    “The discrepancy between the cosmological parameters in the modern Universe and the Universe shortly after the Big Bang can be explained by the fact that the proportion of dark matter has decreased,” said co-author Igor Tkachev, from the Institute for Nuclear Research in Moscow.

    “We have now, for the first time, been able to calculate how much dark matter could have been lost, and what the corresponding size of the unstable component would be.”

    The mystery surrounding dark matter was first brought up way back in the 1930s, when astrophysicists and astronomers observed that galaxies moved in weird ways, appearing to be under the effect of way more gravity than could be explained by the visible matter and energy in the Universe.

    This gravitational pull has to come from somewhere. So, researchers came up with a new type of ‘dark matter’ to describe the invisible mass responsible for the things they were witnessing.

    As of right now, the current hypothesis states that the Universe is made up of 4.9 percent normal matter – the stuff we can see, such as galaxies and stars – 26.8 percent dark matter, and 68.3 percent dark energy, a hypothetical type of energy that’s spread throughout the Universe, and which might be responsible for the Universe’s expansion.

    But even though the majority of matter predicted to be in the Universe is actually dark, very little is known about dark matter – in fact, scientists still haven’t been able to prove that it actually exists.

    One of the ways scientists study dark matter is by examining the cosmic microwave background (CMB), which some call the ‘echo of the Big Bang’.

    CMB per ESA/Planck
    CMB per ESA/Planck

    The CMB is the thermal radiation left over from the Big Bang, making it somewhat of an astronomical time capsule that researchers can use to understand the early, newly born Universe.

    The problem is that the cosmological parameters that govern how our Universe works – such as the speed of light and the way gravity works – appear to differ ever so slightly in the CMB compared to the parameters we know to exist in the modern Universe.

    “This variance was significantly more than margins of error and systematic errors known to us,” Tkachev explains. “Therefore, we are either dealing with some kind of unknown error, or the composition of the ancient universe is considerably different to the modern Universe.”

    One of the hypotheses that might explain why the early Universe was so different is the ‘decaying dark matter‘ [Nature] (DDM) hypothesis – the idea that dark matter has slowly been disappearing from the Universe.

    And that’s exactly what Tkachev and his colleagues set out to analyse on a mathematical level, looking for just how much dark matter might have decayed since the creation of the Universe.

    The study’s lead author, Dmitry Gorbunov, also from the Institute for Nuclear Research, explains:

    “Let us imagine that dark matter consists of several components, as in ordinary matter (protons, electrons, neutrons, neutrinos, photons). And one component consists of unstable particles with a rather long lifespan.

    In the era of the formation of hydrogen, hundreds of thousands of years after the Big Bang, they are still in the Universe, but by now (billions of years later), they have disappeared, having decayed into neutrinos or hypothetical relativistic particles. In that case, the amount of dark matter in the era of hydrogen formation and today will be different.”

    To come up with a figure, the team analysed data taken from the Planck Telescope observations on the CMB, and compared it to different dark matter models like DDM.

    ESA/Planck
    ESA/Planck

    They found that the DDM model accurately depicts the observational data found in the modern Universe over other possible explanations for why our Universe looks so different today compared to straight after the Big Bang.

    The team was able to take the study a step further by comparing the CMB data to the modern observational studies of the Universe and error-correcting for various cosmological effects – such as gravitational lensing, which can amplify regions of space thanks to the way gravity can bend light.

    In the end, they suggest that the Universe has lost somewhere between 2 and 5 percent of its dark matter since the Big Bang, as a result of these hypothetical dark matter particles decaying over time.

    “This means that in today’s Universe, there is 5 percent less dark matter than in the recombination era,” Tkachev concludes.

    “We are not currently able to say how quickly this unstable part decayed; dark matter may still be disintegrating even now, although that would be a different and considerably more complex model.”
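    The quoted 2–5 percent loss can be turned into a rough lower bound on the lifetime of the decaying component. As a crude one-component illustration (not the paper's full model), assume the lost fraction L built up via simple exponential decay over the ~13.7-billion-year age of the Universe, so L = 1 − exp(−t/τ):

```python
import math

T_UNIVERSE_GYR = 13.7  # approximate age of the Universe, billions of years

def lifetime_for_loss(fraction_lost):
    """If a fraction L of the dark matter decayed exponentially since
    the Big Bang, L = 1 - exp(-t / tau), so tau = -t / ln(1 - L).
    A crude one-component illustration, not the study's full model."""
    return -T_UNIVERSE_GYR / math.log(1 - fraction_lost)

for loss in (0.02, 0.05):
    print(f"{loss:.0%} lost -> lifetime ~ {lifetime_for_loss(loss):.0f} Gyr")
```

    Even a 5 percent loss implies a lifetime of hundreds of billions of years, far longer than the age of the Universe, which is why such decay would be so hard to observe directly.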

    These findings suggest that dark matter decays over time, making the Universe move in different ways than it had in the past, though the findings call for more outside research before anything is said for certain.

    Even so, this research is another step closer to potentially understanding the nature of dark matter, and solving one of science’s greatest mysteries – why the Universe looks the way it does, and how it will evolve in the future.

    The team’s work was published in Physical Review D.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 10:43 am on November 12, 2016 Permalink | Reply
    Tags: , , , Gravity,   

    From EarthSky: “No need for dark matter?” 

    1

    EarthSky

    November 10, 2016
    Deborah Byrd

    Erik Verlinde just released the latest installment of his new theory of gravity. He now says he doesn’t need dark matter to explain the motions of stars in galaxies.

    Theoretical physicist Erik Verlinde has a new theory of gravity, which describes gravity not as a force but as an illusion. The theory says gravity is an emergent phenomenon that can be derived from the microscopic building blocks that make up our universe. This week, he published the latest installment of his theory, showing that – if he’s correct – there’s no need for dark matter to describe the motions of stars in galaxies.

    Verlinde, who is at the University of Amsterdam, first released his new theory in 2010. According to a statement released this week (November 8, 2016):

    … gravity is not a fundamental force of nature, but an emergent phenomenon. In the same way that temperature arises from the movement of microscopic particles, gravity emerges from the changes of fundamental bits of information, stored in the very structure of spacetime.

    Dark matter – the invisible “something” that most modern physicists believe makes up a substantial fraction of our universe – came to be seen as necessary when astronomers found, in the mid-20th century, that they couldn’t explain why stars in galaxies moved as they did. The outer parts of galaxies, including our own Milky Way, rotate much faster around their centers than they should, according to the theories of gravity as explained by Isaac Newton and Albert Einstein. According to these well-accepted theories, there must be more mass in galaxies than we can see, and thus scientists began speaking of invisible matter, which they called dark matter.

    They’ve been speaking of it, and trying to understand it, ever since.
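    The mismatch that motivated dark matter can be sketched with simple Newtonian kinematics. For an orbit outside most of the visible mass, gravity predicts a Keplerian fall-off v = √(GM/r), while observed galactic rotation curves stay roughly flat. (The galaxy mass below is illustrative.)

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # solar mass, kg
KPC = 3.086e19     # metres per kiloparsec

def keplerian_speed(enclosed_mass_kg, radius_m):
    """Circular orbital speed if all the mass sits inside the orbit:
    v = sqrt(G * M / r). Observed galactic curves stay flat instead
    of falling off like 1/sqrt(r) -- the dark matter discrepancy."""
    return math.sqrt(G * enclosed_mass_kg / radius_m)

M_visible = 1e11 * M_SUN  # rough visible mass of a large galaxy (illustrative)
for r_kpc in (10, 20, 40):
    v = keplerian_speed(M_visible, r_kpc * KPC)
    print(f"r = {r_kpc:2d} kpc -> Keplerian v ~ {v / 1000:.0f} km/s")
```

    Quadrupling the radius should halve the orbital speed; real galaxies show no such drop, which is the discrepancy dark matter (or Verlinde's emergent gravity) must explain.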

    Verlinde is now saying we don’t need dark matter to explain what’s happening in galaxies. He says his new theory of gravity accurately predicts star velocities in the Milky Way and other galaxies. In his statement, he said:

    “We have evidence that this new view of gravity actually agrees with the observations. At large scales, it seems, gravity just doesn’t behave the way Einstein’s theory predicts.”

    If true, it’s a revolution in science, since essentially all of modern cosmology – including the Big Bang theory that describes how our universe began – is based on Einstein’s theory of gravity. In recent decades, dark matter and its cousin dark energy have been bugaboos to the accepted theories; despite searches, for example, no one has ever actually observed dark matter.

    If Verlinde’s theory of gravity is true, it doesn’t mean Einstein’s theory is wrong, just as Einstein’s description of gravity didn’t exactly nullify Isaac Newton’s theory of gravity from two centuries before. Newton’s theory is still taught in physics classes, but Einstein’s theory was a refinement – a major one – in our way of thinking about gravity. Likewise, Verlinde’s theory, if correct, would be a refinement of Einstein’s ideas and a chance to have a deeper understanding of the way our universe works. Verlinde commented in his statement:

    “Many theoretical physicists like me are working on a revision of the [accepted modern theories of gravity], and some major advancements have been made. We might be standing on the brink of a new scientific revolution that will radically change our views on the very nature of space, time and gravity.”

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 10:13 am on July 22, 2016 Permalink | Reply
    Tags: , , Gravity   

    From ars technica: “Gravity doesn’t care about quantum spin” 

    Ars Technica
    ars technica

    7/16/2016
    Chris Lee

    1
    An atomic clock based on a fountain of atoms. NSF

    Physics, as you may have read before, is based around two wildly successful theories. On the grand scale, galaxies, planets, and all the other big stuff dance to the tune of gravity. But, like your teenage daughter, all the little stuff stares in bewildered embarrassment at gravity’s dancing. Quantum mechanics is the only beat the little stuff is willing to get down to. Unlike teenage rebellion, though, no one claims to understand what keeps relativity and quantum mechanics from getting along.

    Because we refuse to believe that these two theories are separate, physicists are constantly trying to find a way to fit them together. Part and parcel with creating a unifying model is finding evidence of a connection between gravity and quantum mechanics. For example, showing that the gravitational force experienced by a particle depends on the particle’s internal quantum state would be a great sign of a deeper connection between the two theories. The latest attempt to show this uses a new way to look for coupling between gravity and the quantum property called spin.

    I’m free, free fallin’

    One of the cornerstones of general relativity is that objects move in straight lines through a curved spacetime. So, if two objects have identical masses and are in free fall, they should follow identical trajectories. And this is what we have observed since the time of Galileo (although I seem to recall that Galileo’s public experiment came to an embarrassing end due to differences in air resistance).

    The quantum state of an object doesn’t seem to make a difference. However, if there is some common theory that underlies general relativity and quantum mechanics, at some level, gravity probably has to act differently on different quantum states.

    To see this effect means measuring very tiny differences in free fall trajectories. Until recently, that was close to impossible. But it may be possible now thanks to the realization of Bose-Einstein condensates. The condensates themselves don’t necessarily provide the tools we need, but the equipment used to create a condensate allows us to manipulate clouds of atoms with exquisite precision. This precision is the basis of a new free fall test from researchers in China.

    Surge like a fountain, like tide

    The basic principle behind the new work is simple. If you want to measure acceleration due to gravity, you create a fountain of atoms and measure how long it takes for an atom to travel from the bottom of the fountain to the top and back again. As long as you know the starting velocity of the atoms and measure the time accurately, then you can calculate the force due to gravity. To do that, you need to impart a well-defined momentum to the cloud at a specific time.
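    In the idealised picture, the measurement reduces to simple kinematics: a cloud launched upward at speed v₀ returns to its starting height after T = 2·v₀/g, so g = 2·v₀/T. A minimal sketch with illustrative numbers (the real experiment infers the timing interferometrically rather than with a stopwatch):

```python
def g_from_fountain(launch_speed, round_trip_time):
    """Idealised atom fountain: a cloud launched upward at v0 returns
    to its starting height after T = 2 * v0 / g, so g = 2 * v0 / T.
    The real experiment infers the timing interferometrically."""
    return 2 * launch_speed / round_trip_time

# Illustrative numbers: a 3 m/s launch with a 0.6118 s round trip
print(f"g ~ {g_from_fountain(3.0, 0.6118):.3f} m/s^2")
```

    The precision of the experiment then comes down entirely to how well the launch velocity and the round-trip time can be pinned down, which is where the quantum-state clockwork below comes in.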

    Quantum superposition

    Superposition is nothing more than addition for waves. Let’s say we have two sets of waves that overlap in space and time. At any given point, a trough may line up with a peak, their peaks may line up, or anything in between. Superposition tells us how to add up these waves so that the result reconstructs the patterns that we observe in nature.

    Then you need to measure the transit time. This is done using the way quantum states evolve in time, which also means you need to prepare the cloud of atoms in a precisely defined quantum state.

    If I put the cloud into a superposition of two states, then that superposition will evolve in time. What do I mean by that? Let’s say that I set up a superposition between states A and B. Now, when I take a measurement, I won’t get a mixture of A and B; I only ever get A or B. But the probability of obtaining A (or B) oscillates in time. So at one moment, the probability might be 50 percent, a short time later it is 75 percent, then a little while later it is 100 percent. Then it starts to fall until it reaches zero and then it starts to increase again.

    This oscillation has a regular period that is defined by the environment. So, under controlled circumstances, I set the superposition state as the atomic cloud drifts out the top of the fountain, and at a certain time later, I make a measurement. Each atom reports either state A or state B. The ratio of the amount of A and B tells me how much time has passed for the atoms, and, therefore, what the force of gravity was during their time in the fountain.
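    The oscillation described above can be modelled as a two-state phase evolution: for an equal superposition of A and B whose relative phase advances at angular rate ω, the probability of measuring A is P(A) = cos²(ωt/2). A minimal sketch (ω is illustrative):

```python
import math

def prob_state_a(omega, t):
    """Probability of measuring state A for an equal superposition of
    A and B whose relative phase advances at angular rate omega:
    P(A) = cos(omega * t / 2)**2, oscillating between 1 and 0."""
    return math.cos(omega * t / 2) ** 2

omega = 2 * math.pi  # one full oscillation per unit time (illustrative)
for t in (0.0, 0.25, 0.5, 1.0):
    print(f"t = {t:4.2f} -> P(A) = {prob_state_a(omega, t):.2f}")
```

    Reading off the measured A:B ratio against this known oscillation is what converts a population measurement into an elapsed time, and hence into a value of g.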

    Once you have that working, the experiment is dead simple (he says in the tone of someone who is confident he will never have to actually build the apparatus or perform the experiment). Essentially, you take your atomic cloud and choose a couple of different atomic states. Place the atoms in one of those states and measure the free fall time. Then repeat the experiment for the second state. Any difference, in this ideal case, is due to gravity acting differently on the two quantum states. Simple, right?

    Practically speaking, this is kind-a-sorta really, really difficult.

    I feel like I’m spinnin’

    Obviously, you have to choose a pair of quantum states to compare. In the case of our Chinese researchers, they chose to test for coupling between gravity and a particle’s intrinsic angular momentum, called spin. This choice makes sense because we know that in macroscopic bodies, the rotation of a body (in other words, its angular momentum) modifies the local gravitational field. So, depending on the direction and magnitude of the angular momentum, the local gravitational field will be different. Maybe we can see this classical effect in quantum states, too?

    However, quantum spin is, confusingly, not related to the rotation of a body. Indeed, if you calculate how fast an electron needs to rotate in order to generate its spin angular momentum, you’ll come up with a ridiculous number (especially if you take the idea of the electron being a point particle seriously). Nevertheless, particles like electrons and protons, as well as composite particles like atoms, have intrinsic spin angular momentum. So, an experiment comparing the free fall of particles with the same spin, but oriented in different directions, makes perfect sense.

    Except for one thing: magnetic fields. The spin of a particle is also coupled to its magnetic moment. That means that if there are any changes in the magnetic field around the atom fountain, the atomic cloud will experience a force due to these variations. Since the researchers want to measure a difference between two spin states that have opposite orientations, this is bad. They will always find that the two spin populations have different fountain trajectories, but the difference will largely be due to variations in the magnetic field, rather than to differences in gravitational forces.

    So the story of this research is eliminating stray magnetic fields. Indeed, the researchers spend most of their paper describing how they test for magnetic fields before using additional electromagnets to cancel out stray fields. They even invented a new measurement technique that partially compensates for any remaining variations in the magnetic fields. To a large extent, the researchers were successful.
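    The “ridiculous number” mentioned above can be reproduced with a quick estimate. Model the electron as a classical solid sphere of the classical electron radius spinning with angular momentum ħ/2; since L = (2/5)·m·r·v for a solid sphere, the required equatorial speed is v = 5ħ/(4·m·r), which comes out far above the speed of light:

```python
HBAR = 1.0546e-34   # reduced Planck constant, J s
M_E = 9.109e-31     # electron mass, kg
R_E = 2.818e-15     # classical electron radius, m
C = 2.998e8         # speed of light, m/s

def equatorial_speed_for_spin_half():
    """Equatorial speed a solid sphere of the classical electron radius
    would need so its angular momentum L = (2/5) * m * r * v equals
    hbar / 2. The answer is far above c, which is why quantum spin
    cannot be literal rotation."""
    return 5 * HBAR / (4 * M_E * R_E)

v = equatorial_speed_for_spin_half()
print(f"required speed: {v:.2e} m/s  ({v / C:.0f}x the speed of light)")
```

    The equator would need to move at over a hundred times the speed of light, and the problem only gets worse if the electron is treated as smaller than the classical radius.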

    So, does gravity care about your spin?

    Short answer: no. The researchers obtained a null result, meaning that, to within the precision of their measurements, there was no detectable difference in atomic free falls when atoms were in different spin states.

    But this is really just the beginning of the experiment. We can expect even more sensitive measurements from the same researchers within the next few years. And the strategies that they used to increase accuracy can be transferred to other high-precision measurements.

    Physical Review Letters, 2016, DOI: 10.1103/PhysRevLett.117.023001

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon
    Stem Education Coalition
    Ars Technica was founded in 1998 when Founder & Editor-in-Chief Ken Fisher announced his plans for starting a publication devoted to technology that would cater to what he called “alpha geeks”: technologists and IT professionals. Ken’s vision was to build a publication with a simple editorial mission: be “technically savvy, up-to-date, and more fun” than what was currently popular in the space. In the ensuing years, with formidable contributions by a unique editorial staff, Ars Technica became a trusted source for technology news, tech policy analysis, breakdowns of the latest scientific advancements, gadget reviews, software, hardware, and nearly everything else found in between layers of silicon.

    Ars Technica innovates by listening to its core readership. Readers have come to demand devotedness to accuracy and integrity, flanked by a willingness to leave each day’s meaningless, click-bait fodder by the wayside. The result is something unique: the unparalleled marriage of breadth and depth in technology journalism. By 2001, Ars Technica was regularly producing news reports, op-eds, and the like, but the company stood out from the competition by regularly providing long thought-pieces and in-depth explainers.

    And thanks to its readership, Ars Technica also accomplished a number of industry leading moves. In 2001, Ars launched a digital subscription service when such things were non-existent for digital media. Ars was also the first IT publication to begin covering the resurgence of Apple, and the first to draw analytical and cultural ties between the world of high technology and gaming. Ars was also first to begin selling its long form content in digitally distributable forms, such as PDFs and eventually eBooks (again, starting in 2001).

     
  • richardmitnick 7:26 am on July 11, 2016 Permalink | Reply
    Tags: , , Gravity,   

    From COSMOS: “Gravity shrugs off differences of quantum spin” 

    Cosmos Magazine bloc

    COSMOS

    11 July 2016
    Cathal O’Connell

    1
    Italian astronomer and scientist Galileo Galilei performs his legendary experiment – dropping a cannonball and a wooden ball from the top of the Leaning Tower of Pisa, circa 1620.
    Hulton Archive/Getty Images

    A new experiment, based on measuring the free fall of rubidium atoms in a vacuum, confirms that atoms of different quantum spin experience identical acceleration due to gravity.

    The result is important in the quest to unify general relativity with quantum mechanics, and may already rule out some proposed theories of quantum gravity.

    Resolving the mismatch between the two great pillars of physics – the theory of gravity, and the theory of matter – is probably the grandest challenge in contemporary physics.

    Einstein’s theory of general relativity tells how gravity arises from mass bending space and time. The theory describes the universe on the largest of scales, from the orbits of the planets to the rotation of galaxies to the Big Bang itself.

    Quantum mechanics, on the other hand, describes the microscopic world of particles and how they join together to make the matter around us.

    The problem is these two ideas don’t seem to mesh. The very large and the very small seem to play by different rules.

    Now, a new breed of experiments is allowing physicists to measure the force of gravity at the scale of quantum objects and so test, for the first time, some of the theories proposing to bridge the chasm between gravity and quantum mechanics.

    In the new work, a team of Chinese scientists from Huazhong University of Science and Technology in Wuhan has compared the acceleration of rubidium atoms due to gravity and found it to be identical regardless of the orientation of the atom’s spin.

    This research is published in Physical Review Letters.

    At heart, this experiment is a test of the equivalence principle, which says the acceleration due to gravity is identical for any object.

    Tests of this principle have been performed in various guises over the centuries, from renaissance Europe to the surface of the moon.

    One of the most famous images in all of physics is that of the Italian scientist Galileo Galilei, atop the Leaning Tower of Pisa, letting go of two metal balls of different masses to show they fell at the same rate. Although this account may be apocryphal, Galileo certainly did describe an experiment rolling balls down a slope, which showed the same thing.

    And in 1971, Commander David Scott famously tested the equivalence principle by dropping a hammer and feather at the same time, while standing on the moon.


    The hammer and the feather on the Moon
    Access mp4 video here .

    Although the equivalence principle is central to general relativity, many quantum theories of gravity, which attempt to describe gravity using quantum mechanics, predict that the equivalence principle could be violated.

    In particular, some quantum properties, such as the spin of an atom, might affect free fall, the theories say.

    To test this, the Chinese team, led by physicist Zhong-Kun Hu, set up an intricate experiment, which measured the rate of free fall of atoms of rubidium.

    The experiment is based on atom interferometry, which exploits the wave nature of atoms to monitor their motion extremely precisely.

    First, the team isolated and cooled a collection of rubidium atoms to a few millionths of a degree above absolute zero.

    The atoms started out at the bottom of a tube that had been emptied completely of air.

    The team then pointed a laser beam from below, using the light to give the cold atoms a kick and propel them upwards in the tube. But what goes up must come down. This set up a “fountain” of atoms, rising and falling.

    The scientists found that the free fall acceleration of the rubidium atoms with opposite spins agreed to within one part in 10 million.
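    Such comparisons are conventionally quoted as an Eötvös parameter, the normalised difference between the two measured accelerations: η = 2(g₁ − g₂)/(g₁ + g₂). “One part in 10 million” corresponds to |η| ≲ 10⁻⁷. A minimal sketch with illustrative numbers:

```python
def eotvos_parameter(g1, g2):
    """Normalised difference between two measured free-fall
    accelerations: eta = 2 * (g1 - g2) / (g1 + g2).
    Equivalence-principle tests report an upper bound on |eta|."""
    return 2 * (g1 - g2) / (g1 + g2)

# Illustrative: two accelerations differing by one part in ten million
g_up = 9.80
g_down = g_up * (1 + 1e-7)
print(f"eta = {eotvos_parameter(g_down, g_up):.2e}")
```

    Quoting η rather than a raw difference makes results from different apparatus (torsion balances, lunar laser ranging, atom fountains) directly comparable.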

    In the past decade, similar experiments have already verified the universality of free fall for different atoms, and for different isotopes of the same element.

    But this is the first time gravity has been tested in terms of quantum spin. It means that several exotic theories which had predicted a significant interaction between quantum spin and gravity will have to be modified, or thrown out.

    Back to the drawing board.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     