Tagged: Relativity

  • richardmitnick 4:21 pm on January 24, 2018
    Tags: Relativity

    From Don Lincoln at Fermilab – Video – “What is relativity all about?” 

    FNAL II photo

    FNAL Art Image by Angela Gonzales

    Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.


    Published on Jan 24, 2018
    Einstein’s theory of special relativity is one of the fascinating scientific advances of the 20th century. Fermilab’s Dr. Don Lincoln has decided to make a series of videos describing this amazing idea. In this video, he lays out what relativity is all about… what is the entire point. And it’s not what you think. It’s not about clocks moving slower and objects shrinking. It’s about… well, you’ll have to watch to see.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    FNAL Icon

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a U.S. Department of Energy national laboratory specializing in high-energy particle physics, and America’s premier laboratory for particle physics and accelerator research. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.

  • richardmitnick 3:27 pm on May 30, 2016
    Tags: Relativity

    From PI: “Bridging Two Roads of Physics” Women in Science 

    Perimeter Institute

    May 30, 2016
    Rose Simone

    Recent Perimeter research based on the holographic principle seeks new connections between general relativity and quantum field theory.

    Imagine driving along a road that traverses a beautiful landscape. Around every corner, there is a new vista of natural beauty to explore. Suddenly you come to a chasm.

    You can see a road on the other side, but how do you get there to complete the journey? You need a bridge.

    That’s the state of physics today, and Bianca Dittrich, Perimeter Institute researcher in mathematical physics and quantum gravity, is one of the people trying to build that bridge.

    Bianca Dittrich

    On one side of the chasm is the road built by Albert Einstein’s theory of general relativity. It describes the force of gravity as the warping of spacetime by large masses such as planets and stars.

    On the other side is quantum field theory, our best description of interacting particles and the three other forces (the strong and weak nuclear forces and electromagnetism) operating at minuscule subatomic distances.

    The theories are incredibly successful in their respective realms, yet they are so different, both in formulation and conceptually, that it is difficult to bridge them.

    “Basically, we are trying to bridge all of the scales that we know,” Dittrich says. “That is what physics is about, but it is very hard. You need to bridge all of these scales by modelling the tiny scales, and show that this model actually does indeed describe reality as we know it at macroscopic scales.”

    In general relativity, spacetime is smooth and continuous. If you were to zoom in with a microscope to arbitrarily small distances, it should look the same as it does when you zoom out for the larger view. Quantum field theory, on the other hand, describes particles and forces that come as discrete “packets,” and spacetime would also have to be discrete and granular, like the pixels in a photograph.

    Scientists need a theory to describe the force of gravity at the quantum scale, and it must be consistent with the larger picture of general relativity. Building the bridge to a theory of quantum gravity is what occupies many physicists around the world today.

    It is easier said than done. If general relativity is scaled down to the quantum size, you start to get nonsensical “infinities” in the calculations. “Quantizing gravity sounds simple, in that it should be just the quantization of another force, besides the three forces (the non-gravitational forces) that were quantized decades ago,” Dittrich says. “But in fact it is a very hard and open problem.”

    There are many approaches to this longstanding problem. In loop quantum gravity, for example, physicists speak in terms of “spacetime atoms” linked together in a network like a fine mesh. This provides a model of what spacetime itself is made of.

    But in a recent paper*, “3D Holography: From Discretum to Continuum,” Dittrich and co-author Valentin Bonzom, now an assistant professor at Université Paris 13 who was previously a postdoctoral researcher at Perimeter Institute, tested a different approach, based on the holographic principle.

    The holographic principle says everything that happens in a given space can be explained in terms of information stored on the boundary of that space. (The principle takes its name from holograms, in which two-dimensional surfaces contain all the information needed to project a three-dimensional image.)

    A popular mathematical framework based on the holographic principle is known as the AdS/CFT correspondence. AdS is short for anti-de Sitter space, which describes a particular kind of geometry. Just as a bowling ball will stretch a rubber sheet, the negatively curved geometry of anti-de Sitter space can also stretch or contract, allowing it to describe gravity.

    CFT, meanwhile, is short for conformal field theory. Field theories are the language of quantum mechanics and can describe, for example, how an electrical field might change over space and time.

    The holographic principle applies because the AdS/CFT correspondence basically states that for every conformal field theory, there is a corresponding theory of gravity with one more dimension. So a two-dimensional CFT would correspond to a three-dimensional theory of gravity, for instance.

    But the holographic principle applies to infinitely large boundaries, and Dittrich and Bonzom wanted to see if it could also hold for finite boundaries, and for other types of geometries apart from AdS. This would then provide a more manageable way of describing a piece of spacetime, and understanding the microscopic details as they reconstruct the spacetime bulk.

    Working with a boundary without worrying too much about the bulk “very much simplifies the construction of a theory of quantum gravity,” Dittrich explains.

    They tested this in three spacetime dimensions, and “it turned out that the holographic principle indeed holds for finite boundaries, and we also obtained a very simple description of how to translate the boundary data into the geometry of the bulk,” she says.

    That this could be done in 3D was not too surprising, but the more challenging part will be extending this work into 4D space, Dittrich adds.

    Most theories of quantum gravity require the force of gravity to also be mediated by hypothetical particles called gravitons. If Dittrich can get her model to work in 4D, then she will have successfully taken it into a realm where gravitons exist. “Gravity can propagate through that spacetime,” Dittrich says.

    Dittrich has been on the physics road for some time. She grew up in Germany, reading a lot of popular books about science, as well as history and literature, and when she finished high school she considered various options, including areas such as geo-ecology.

    But she realized it was physics that could take her on the journey to the most complete understanding of nature. “If you want to understand why something works, the answer is in physics,” she says.

    Now, she is designing another bridge that will span that chasm between the two great roads and carry physicists to that more complete understanding of nature.

    *Science paper:
    3D holography: from discretum to continuum

    See the full article here.


    About Perimeter

    Perimeter Institute is a leading centre for scientific research, training and educational outreach in foundational theoretical physics. Founded in 1999 in Waterloo, Ontario, Canada, its mission is to advance our understanding of the universe at the most fundamental level, stimulating the breakthroughs that could transform our future. Perimeter also trains the next generation of physicists through innovative programs, and shares the excitement and wonder of science with students, teachers and the general public.

  • richardmitnick 10:56 am on November 25, 2015
    Tags: Relativity

    From EPFL: “100 years of relativity and enthusiasm for bringing science to public” 

    EPFL bloc

    École polytechnique fédérale de Lausanne EPFL

    download mp4 video here.

    25.11.15 – Time and space are celebrating their 100th wedding anniversary. To mark the centenary of relativity theory, Anais Rassat and her cross-Channel accomplices have put [Albert] Einstein front and center in an entertaining animated film. The EPFL physicist explains her approach.

    On November 25, 1915, Albert Einstein presented his theory of general relativity at the Prussian Academy of Science. Time and space became just two sides of a single coin, and we never saw things in quite the same way again. The Universe got a birthday with the Big Bang; GPS satellites, which rely on Einstein’s equations, could be developed, much to drivers’ delight; and the portrait of the physicist sticking his tongue out became the iconic image of the genius. Today, Anais Rassat and her colleagues are launching an animated film to explain the theory to the public.

    This isn’t the EPFL physicist’s first attempt at communicating difficult scientific concepts to the layperson. Her first was in Hyde Park, where she took to a podium amongst all sorts of religious orators and political satirists. She has been a Huffington Post contributor, lead of the Euclid project’s education and public outreach activities, a member of the TEDx Paris committee, a LIFT conference participant… In short, she combines her passion for research with her vocation as a science communicator, and here she explains the reasons behind her commitment.

    Physicists are saying that the 100-year mark of general relativity is not just another birthday. Do you feel this way?
    Anais Rassat: This was an extraordinary moment in science, one that changed the world. For a century, or nearly a century, relativity has been practically unassailable. It gave a verifiable explanation for the phenomenon of gravity; it allowed us to give the Universe a birth date, to imagine the Big Bang, whereas before, we didn’t even know if it had a beginning. At a more down-to-earth level, without Einstein’s equations, GPS satellites wouldn’t work. At the same time, we’re coming to the point where the theory appears to be reaching its limits. In the late 90s, we realized that the expansion of the Universe was accelerating. That doesn’t jibe with the equations, which basically tell us what the Universe should look like as a function of the matter that’s in it. In reality, for the theory of relativity to explain the Universe accurately, there would need to be a whole lot more matter than what is observed.

    Does that mean the theory is wrong, or has mistakes in it?
    That’s exactly what I’m trying to explain to the public. In science, we don’t talk about absolute truth. The theory of relativity is correct insomuch as it agrees with observations. And for it to agree, we’ve introduced, among other things, the concept of dark matter. Recent observations are giving us very good reason to think that this invisible matter actually exists. It must be in the form of exotic particles, though we don’t yet know exactly which.


    The theory had somehow predicted the existence of this matter even before we observed it?
    That’s probably the case. In fact, it’s possible that general relativity only describes about 5% of the Universe, in other words, visible matter. Dark matter makes up another 25%. The remaining 70% would be dark energy, a sort of exotic force that decrees that the laws of gravity change at the very large scale. But that introduces more complex problems. It’s possible that it doesn’t really exist, and corresponds instead to a problem in the theory. That’s why I try not to talk about it too much! Whatever the case may be, we have a theory that precisely explains everything that we can directly observe, but doesn’t account for up to 95% of everything that exists!

    Do you think we’ll figure it out in the near future?
    A lot could happen in the next 20 to 30 years. For example, via the Euclid project, in which I’m involved along with 1,300 other physicists from all over the world.

    ESA Euclid spacecraft

    Some people think that the equations should be changed to correspond to the observations, and others think that we should add other elements that still haven’t been observed, like dark energy. We will map out the Universe in detail and obtain new elements of a solution. The final goal is to understand if the theory needs to be changed, or if there are new elements that exist that have been invisible up to this point.

    Let’s go back to your animation. It doesn’t go into these details.
    It’s a film for the public, and a way of giving this historically important scientific anniversary some visibility. I’ve been preparing it for years with director Jamie Lochhead. Together, we came up with the script, and the lion’s share of the work was done by animation expert Eoin Duffy. Finally, we had the good fortune to obtain the participation of David Tennant, the British actor famous for playing the lead role in Doctor Who, for the voice-over. Our project was funded by the British Science and Technology Facilities Council.

    You seem to have a very good network of scientific communicators on the other side of the Channel!
    It was in England that I began to be interested in the issue of scientific communication. I was writing the introduction to my thesis, fifty or so pages in which I had to recount the entire history of cosmology, among other things. It seemed to me that I could do a better job of this by explaining it to the public. I went to Hyde Park in London, to the well-known “Speakers’ Corner,” where anyone can stand up on a soapbox and hold forth on anything their heart desires. I had brought along a telescope for observing the Sun and a huge poster on which I had written “ask me about the Big Bang.” I found myself among lots of other speakers, mostly carrying on about religion or politics, and I did some speed communication, a bit like speed dating. I gave myself three minutes to give my speech, no more.

    Was it a good experience?
    I found that it helped me better understand my own subject matter. I learned to speak without using jargon. That’s essential, because the more you use a specialized language, the more you lose sight of what you really want to say. You sometimes even forget what the fundamental question you’re trying to answer is. I discovered that by communicating to the public, I was able to gain perspective on my own work.

    Be that as it may, scientific communication doesn’t count for much on an academic CV.
    Even so, it’s an investment that could realistically have an enormous return. Eventually, if you have the public’s support, you will obtain funding to do your research. But it’s a collective investment, which benefits the entire scientific community. Hence the importance of motivating individuals and rewarding this kind of ability. Fortunately, things are changing and communication with the public is becoming increasingly valued. For example, if you are applying for a position at NASA, this aspect counts for 25% of the evaluation.

    Is it part of scientists’ mission to gain public support?
    As scientists we have a responsibility. We’re funded by the public, and we owe them something in return. In a time of crisis, such as the one we’re currently in, I’m often asked why we should continue to fund basic research. My favorite example is the discovery of quantum mechanics. We had funded this research in the 1930s, at a time of economic and social crisis far greater than anything we’re experiencing today. Yet without this research, we wouldn’t now have computers or the Internet. Science transforms society, sometimes in unpredictable ways, but it has always and will always do so.

    Science also brings about social change, and new ways of seeing the world.
    Of course. That’s what I call cultural capital, as opposed to technological capital. Where did we come from? What is our place in the Universe? These are very good questions that are important in themselves, but they also have a real impact on how we see the world. For example, when you realize that we’re not at the center of the Universe, that our solar system is just one of innumerable solar systems in our galaxy, and that there are billions of galaxies… that understanding unites us and brings us closer together as human beings. I don’t want to wax philosophical, but I think that this vision puts plenty of things into perspective and opens up new ways of thinking about who we are.

    See the full article here.


    EPFL campus

    EPFL is Europe’s most cosmopolitan technical university with students, professors and staff from over 120 nations. A dynamic environment, open to Switzerland and the world, EPFL is centered on its three missions: teaching, research and technology transfer. EPFL works together with an extensive network of partners including other universities and institutes of technology, developing and emerging countries, secondary schools and colleges, industry and economy, political circles and the general public, to bring about real impact for society.

  • richardmitnick 7:43 pm on November 19, 2015
    Tags: Relativity

    From SPACE.com: “Einstein’s Unfinished Dream: Marrying Relativity to the Quantum World” 

    space-dot-com logo


    November 18, 2015
    FNAL Don Lincoln
    Don Lincoln, Senior Scientist, Fermi National Accelerator Laboratory; Adjunct Professor of Physics, University of Notre Dame

    This artist’s illustration depicts how the foamy structure of space-time may appear, showing tiny bubbles quadrillions of times smaller than the nucleus of an atom that are constantly fluctuating and last for only infinitesimal fractions of a second. Credit: NASA/CXC/M.Weiss

    This November marks the centennial of Albert Einstein’s theory of general relativity. This theory was the crowning achievement of Einstein’s extraordinary scientific life. It taught us that space itself is malleable, bending and stretching under the influence of matter and energy. His ideas revolutionized humanity’s vision of the universe and added such mind-blowing concepts as black holes and wormholes to our imagination.

    Einstein’s theory of general relativity describes a broad range of phenomena, from nearly the moment of creation to the end of time, and even a journey spiraling from the deepest space down into a ravenous black hole, passing through the point of no return of the event horizon, down, down, down, to nearly the center, where the singularity lurks.

    Deep into a quantum world

    If you were reading that last paragraph carefully, you’ll note that I used the word “nearly” twice. And that wasn’t an accident. Einstein’s theory has been brilliantly demonstrated at large size scales. It deftly explains the behavior of orbiting binary pulsars and the orbit of Mercury. It is a crucial component of the GPS system that helps many of us navigate in our cars every day.
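    The GPS point can be put in rough numbers. This is a back-of-the-envelope sketch, not part of the article: the orbital figures are nominal GPS values (about 20,200 km altitude) assumed for illustration, and the constants are standard.

```python
import math

# Why GPS needs relativity: two competing clock effects on a satellite.
GM_EARTH = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
C        = 2.99792458e8     # speed of light, m/s
R_EARTH  = 6.371e6          # mean Earth radius, m
R_ORBIT  = 2.6571e7         # nominal GPS orbital radius, m
DAY      = 86400.0          # seconds per day

# Circular-orbit speed: v = sqrt(GM / r)
v = math.sqrt(GM_EARTH / R_ORBIT)

# Special relativity: the moving clock runs SLOW by about v^2 / (2 c^2).
sr_per_day = -(v**2) / (2 * C**2) * DAY

# General relativity: the higher clock runs FAST because it sits in a
# weaker gravitational potential: GM/c^2 * (1/R_earth - 1/r_orbit).
gr_per_day = GM_EARTH / C**2 * (1.0 / R_EARTH - 1.0 / R_ORBIT) * DAY

net = sr_per_day + gr_per_day
print(f"special relativity: {sr_per_day * 1e6:+.1f} us/day")
print(f"general relativity: {gr_per_day * 1e6:+.1f} us/day")
print(f"net clock drift:    {net * 1e6:+.1f} us/day")
```

    The net drift of a few tens of microseconds per day corresponds, at the speed of light, to kilometers of accumulated ranging error per day if left uncorrected.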

    But the beginning of the universe and the region near the center of a black hole are very different worlds — quantum worlds. The size scales involved in those environments are subatomic. And that’s where the trouble starts.

    Einstein’s heyday coincided with the birth of quantum mechanics, and the stories of his debates with physicist Niels Bohr over the theory’s counterintuitive and probabilistic predictions are legendary. “God does not play dice with the universe,” he is famously reported to have said.

    However, regardless of his disdain for the theory of quantum mechanics, Einstein was well aware of the need to understand the quantum realm. And, in his quest to understand and explain general relativity, he sought to understand how gravity behaved when his epic theory was applied to the world of the supersmall. The result can be summarized in three words: It failed badly.

    download the mp4 video here.

    Bridging the quantum world to relativity

    Einstein spent the rest of his life, without success, pursuing ways to integrate his theory of general relativity with quantum mechanics. While it is tempting to describe the history of this attempt, the effort is of interest primarily to historians. After all, he didn’t succeed, nor did anyone in the decades that followed.

    Instead, it is more interesting to get a sense of the fundamental problems associated with wedding these two pivotal theories of the early 20th century. The initial issue was a systemic one: General relativity uses a set of differential equations that describe what mathematicians call a smooth and differentiable space. In layman’s terms, this means that the mathematics of general relativity is smooth, without any sharp edges.

    In contrast, quantum mechanics describes a quantized world, i.e., a world in which matter comes in discrete chunks. This means that there is an object here, but not there. Sharp edges abound.

    The water analogy

    In order to clarify these different mathematical formulations, one needs to think a bit more deeply than usual about a very familiar substance: liquid water. Without knowing it, you already hold two different ideas about water that illustrate the tension between differential equations and discrete mathematics.

    For example, when you think of the familiar experience of running your hand through water, you think of water as a continuous substance. The water near your hand is similar to the water a foot away. That distant water might be hotter or colder or moving at a different speed, but the essence of water is the same. As you consider different volumes of water that get closer and closer to your hand, your experience is the same. Even if you think about two volumes of water separated by just a millimeter or half a millimeter, the space between them consists of more water. In fact, the mathematics of fluid flow and turbulence assumes that there is no smallest, indivisible bit of water. Between any two arbitrarily-close distances, there will be water. The mathematics that describes this situation is differential equations. Digging down to its very essence, you find that differential equations assume that there is no smallest distance.

    But you also know that this isn’t true. You know about water molecules. If you consider distances smaller than about three angstroms (the size of a water molecule), everything changes. You can’t get smaller than that, because when you probe even smaller distances, water is no longer a sensible concept. At that point, you’re beginning to probe the empty space inside atoms, in which electrons swirl around a small and dense nucleus. In fact, quantum mechanics is built around the idea that there are smallest objects and discrete distances and energies. This is the reason that a heated gas emits light at specific wavelengths: the electrons orbit at specific energies, with no orbits between the prescribed few.
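    The claim that discrete electron energies yield light at specific wavelengths can be made concrete with hydrogen. The Rydberg formula and constant are standard physics; the script and the helper name are my own illustration, not from the article.

```python
# Discrete electron energy levels imply discrete emission wavelengths.
# For hydrogen, the Rydberg formula gives the photon wavelength when an
# electron drops from level n2 to level n1: 1/lambda = R * (1/n1^2 - 1/n2^2)
R_H = 1.0973731568e7  # Rydberg constant, 1/m

def hydrogen_line_nm(n1, n2):
    """Wavelength (in nm) of the n2 -> n1 transition (n2 > n1)."""
    inv_wavelength = R_H * (1.0 / n1**2 - 1.0 / n2**2)
    return 1e9 / inv_wavelength

# The Balmer series (drops to n=2) contains hydrogen's visible lines;
# the n=3 -> n=2 drop is the familiar red H-alpha line near 656 nm.
for n2 in (3, 4, 5):
    print(f"n={n2} -> n=2: {hydrogen_line_nm(2, n2):.1f} nm")
```

    Only these wavelengths appear, with nothing in between, because there are no orbits between the prescribed few.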

    Thus a proper quantum theory of water has to take into account the fact that there are individual molecules. There is a smallest distance for which the idea of “water” has any meaning.

    Thus, at the very core, the mathematics of the two theories (i.e., the differential equations of general relativity and the discrete mathematics of quantum mechanics) are fundamentally at odds.

    download the mp4 video here.

    Can the theories merge?

    This is not, in and of itself, an insurmountable difficulty. After all, parts of quantum mechanics are well described by differential equations. But a related problem is that when one tries to merge the two theories, infinities abound; and when an infinity arises in a calculation, this is a red flag that you have somehow done something wrong.

    As an example, suppose you treat an electron as a classical object with no size and calculate how much energy it takes to bring two electrons together. If you did that, you’d find that the energy is infinite. And infinity, to a mathematician, is serious business: it’s more energy than all of the energy emitted by all of the stars in the visible universe. While that energy is mind-boggling in its scale, it isn’t infinite. Imagining the energy of the entire universe concentrated in a single point is already unbelievable, and infinite energy is much more than that.
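    The divergence is easy to see numerically. A minimal sketch, using the classical Coulomb potential energy between two elementary charges (constants are standard values; the script is illustrative, not the article's calculation):

```python
# Classical Coulomb energy of two elementary charges at separation r:
# U(r) = k * e^2 / r, which blows up without bound as r -> 0.
K = 8.9875517923e9    # Coulomb constant, N*m^2/C^2
E = 1.602176634e-19   # elementary charge, C

def coulomb_energy_J(r_m):
    return K * E**2 / r_m

# Shrink the separation tenfold at each step: the energy grows tenfold
# each time, with no limit -- the "infinity" described above.
r = 1e-10  # roughly an atomic diameter, m
for _ in range(6):
    print(f"r = {r:.1e} m  ->  U = {coulomb_energy_J(r):.3e} J")
    r /= 10
```

    For a truly point-like (zero-size) electron the limit is infinite energy, which signals that the classical model has been pushed beyond its realm of applicability.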

    Therefore, infinities in real calculations are a clear sign that you’ve pushed your model beyond the realm of applicability and you need to start looking to find some new physical principles that you’ve overlooked in your simplified model.

    In the modern day, scientists have tried to solve the same conundrum that so flummoxed Einstein. And the reason is simple: The goal of science is to explain all of physical reality, from the smallest possible objects to the grand vista of the cosmos.

    The hope is to show that all matter originates from a small number of building blocks (perhaps only one) and a single underlying force from which the forces we currently recognize originate. Of the four known fundamental forces of nature, we have been able to devise quantum theories of three: electromagnetism, the strong nuclear force, and the weak nuclear force. However, a quantum theory of gravity has eluded us.

    General relativity is no doubt an important advance, but until we can devise a quantum theory of gravity, there is no hope of devising a unified theory of everything. While there is no consensus in the scientific community on the right direction in which to proceed, there have been some ideas that have had limited success.

    Superstring theory

    The best-known theory that can describe gravity in the microworld is called superstring theory. In this theory, the smallest known particles should not be thought of as little balls, but rather tiny strings, kind of like an incredibly small stick of uncooked spaghetti or a micro-miniature Hula-Hoop. The basic idea is that these tiny strings (which are smaller compared to a proton than a proton is compared to you) vibrate, and each vibration represents a different fundamental particle.

    Employing a musical metaphor, an electron might be an A-sharp, while a photon could be a D-flat. In the same way that a single violin string can have many overtones, the vibrations of a single superstring can be different particles. The beauty of superstring theory is that it allows for one of the vibrations to be a graviton, which is a particle that has never been discovered but is thought to be the particle that causes gravity.

    It should be noted that superstring theory is not generally accepted, and indeed, some in the scientific community don’t even consider it to be a scientific theory at all. The reason is that, in order for a theory to be scientific, it must be able to be tested, and have the potential to be proven wrong. However, the very small scale of these theoretical strings makes it difficult to imagine any tests that could be done in the foreseeable future. And, some say, if you can’t realistically do a test, it isn’t science.

    Personally, I think that is an extreme opinion, as one can imagine doing such a test when technology advances. But that time will be far in the future.

    Another idea for explaining quantum gravity is called loop quantum gravity. This theory actually quantizes space-time itself. In other words, this model says that there is a smallest bit of space and a shortest time. This provocative idea suggests, among other things, that the speed of light might be different for different wavelengths. However, this effect, if it exists, is small and requires that light travel for great distances before such differences could be observed. Toward that end, scientists are looking at gamma-ray bursts, explosions so bright that they can be seen across billions of light-years — an example of the cosmic helping scientists study the microscopic.
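    The gamma-ray-burst test can be put in rough numbers. The sketch below assumes the simplest possible model, a photon delay linear in energy and suppressed by the Planck scale; this is only one candidate form of the effect, and the function is mine, not from any specific analysis.

```python
# Order-of-magnitude estimate of an energy-dependent photon delay under
# an assumed linear Planck-suppressed dispersion:
#   delta_t ~ (E_photon / E_Planck) * (light-travel time)
E_PLANCK_GEV = 1.22e19   # Planck energy, GeV
SEC_PER_YEAR = 3.156e7

def delay_s(photon_energy_gev, distance_gly):
    travel_time_s = distance_gly * 1e9 * SEC_PER_YEAR
    return (photon_energy_gev / E_PLANCK_GEV) * travel_time_s

# A 10 GeV gamma ray from a burst 1 billion light-years away arrives
# a few tens of milliseconds behind its low-energy companions.
print(f"{delay_s(10.0, 1.0) * 1e3:.1f} ms")
```

    A delay of tens of milliseconds over a billion years of travel shows why only the brightest, most distant explosions offer any hope of detecting such a tiny effect.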

    The simple fact is that we don’t yet have a good and generally accepted theory of quantum gravity. The question is simply too difficult, for now. The microworld of the quantum and the macroworld of gravity have long resisted a life of wedded bliss and, at least for the moment, they continue to resist. However, scientists continue to search for the linkage that blends the two. In the meantime, a theory of quantum gravity remains one of the most ambitious goals of modern science — the hope that we will one day fulfill Einstein’s unfinished dream.

    See the full article here.


  • richardmitnick 3:27 pm on November 9, 2015
    Tags: Relativity

    From Princeton: “Princeton celebrates 100 years of Einstein’s theory of general relativity” 

    Princeton University

    November 9, 2015
    Catherine Zandonella

    This month the world is celebrating the 100th anniversary of Albert Einstein’s theory of general relativity, which shaped our concepts of space, time and gravity, and spurred generations of scientists to contemplate new ideas about the universe. The anniversary was celebrated on Nov. 5-6 at a conference co-hosted by Princeton University and the Institute for Advanced Study in the town of Princeton.

    The conference was sponsored by Institute Trustee Eric Schmidt, who graduated from Princeton in 1976 and is executive chairman of Alphabet Inc., and his wife, Wendy.

    One of Princeton’s most notable residents, Einstein was a faculty member at the Institute for Advanced Study (IAS) from 1933 until his death in 1955. IAS is an independent research institution located about one mile from Princeton University. During construction of the institute, from 1933 to 1939, Einstein’s office was located in Fine Hall (now Jones Hall) on the University campus.

    Albert Einstein was a faculty member at the Institute for Advanced Study from 1933 until his death in 1955 and had an office on the University campus from 1933 to 1939. (Image courtesy of Münchner Stadtmuseum, Sammlung Fotografie, Archiv Landshoff)

    Einstein’s theory of general relativity, set down in a series of lectures in Berlin in late 1915, predicted many features of the universe — including black holes and gravitational waves — for which we now have experimental evidence.

    The theory also predicted some things that have not yet been discovered, like wormholes and travel back in time. In addition to revelations about the universe, the theory has enabled technologies in our everyday lives, like the accurate GPS systems in smartphones.

    “Einstein’s theory of general relativity completely changed our view of the universe,” said Lyman Page, the James S. McDonnell Distinguished University Professor in Physics and chair of the Department of Physics. “It had a huge impact on researchers in physics, astrophysical sciences and mathematics, here at Princeton and around the world.”

    Robbert Dijkgraaf, director of IAS and the Leon Levy Professor, called Einstein’s theory of general relativity “the largest intellectual achievement in the last few centuries.”

    “The fact that this celebration is happening in Princeton is important for two reasons,” Dijkgraaf said about the conference. “Princeton was the home of Einstein for a long time, and it was also the home of the revival of interest [in the 1950s and 1960s] in the study of general relativity.”

    A theory of how gravity works

    At the time Einstein developed his theory, people already knew from the work of Sir Isaac Newton more than 200 years earlier that massive objects, such as stars and planets, attract each other through the force of gravity. While Newton’s laws enabled highly accurate predictions of planetary orbits, they didn’t explain how the attractive force of gravity comes about.

    Einstein’s theory of general relativity takes care of that, according to David Spergel, the Charles A. Young Professor of Astronomy on the Class of 1897 Foundation and chair of the Department of Astrophysical Sciences. “It essentially describes how gravity works.”

    Einstein’s theory showed that massive objects cause distortions in the fabric of the universe. Imagine that the universe is a large bedsheet held on all four corners so that the sheet is taut but can still deform, and that on the sheet sits our sun, represented by a bowling ball. The mass of the sun deforms the fabric of the universe the way a bowling ball causes a depression on the sheet. A marble placed on the sheet would begin a circular trajectory around the bowling ball, just as the planets orbit the sun.

    Through an elegant set of mathematical “field equations,” Einstein explained that gravity is the curving of this fabric, which is made of the three dimensions of space and the fourth dimension of time. He also showed that just as space-time is curved by matter such as stars, this matter is also influenced by the curvature of space-time. One of the first predictions to come out of the theory was that light passing by a star would be bent due to the star’s gravitational pull. Just a few years later in 1919, scientists observed this effect during a solar eclipse. The confirmation of this key prediction of his theory catapulted Einstein to international fame.
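    For readers curious what that “elegant set of mathematical field equations” looks like, here is the standard textbook form (notation is conventional, not taken from the article):

```latex
% Einstein's field equations: the curvature of space-time (left-hand side)
% is determined by the matter and energy it contains (right-hand side).
G_{\mu\nu} \;\equiv\; R_{\mu\nu} - \tfrac{1}{2}\,R\,g_{\mu\nu}
  \;=\; \frac{8\pi G}{c^{4}}\,T_{\mu\nu}
```

    Here $g_{\mu\nu}$ encodes the geometry of space-time, $R_{\mu\nu}$ and $R$ measure its curvature, and $T_{\mu\nu}$ describes the distribution of matter and energy, capturing in one line the idea that matter curves space-time and curved space-time tells matter how to move.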

    Over the next decades, a few scientists and mathematicians studied Einstein’s equations and made interesting discoveries. For example, the physicist Karl Schwarzschild, who was the father of Princeton professor Martin Schwarzschild, found that the theory predicted points of extreme gravity that Princeton faculty member John Archibald Wheeler later named “black holes.”

    For the most part, however, the development of general relativity languished as the physics community became focused on the theory of quantum mechanics.

    Einstein’s theory of general relativity has helped scientists understand how the universe’s faint temperature fluctuations, known as the cosmic microwave background, can reveal the structure of the early universe. The image shows these fluctuations as captured by the Wilkinson Microwave Anisotropy Probe (WMAP), named after Princeton faculty member David Wilkinson and launched in 2001 by NASA in partnership with Princeton and other institutions. (Image courtesy of NASA / WMAP Science Team)

    A renaissance of relativity

    That changed in the late 1950s and early 1960s, said Page, largely due to the work of Wheeler and his contemporary Robert Dicke. The two made major contributions to the development of the theory, and inspired many more people to study general relativity, said Michael Strauss, professor of astrophysical sciences. “Wheeler and Dicke trained a generation of people who had an enormous impact on the field,” he said.

    Dicke also made major contributions to experiments designed to detect the effects of general relativity, Page said. “Dicke was a genius at experimentation, and came up with tests that answered many questions about the theory,” he said.

    The renewed interest in general relativity led to new ideas about the formation and structure of the universe, an area of science known as cosmology. The theory has helped scientists understand the importance of the universe’s faint temperature fluctuations, or cosmic microwave background, left over from the birth of the universe. One of the first comprehensive studies of these fluctuations was the Wilkinson Microwave Anisotropy Probe (WMAP), named after Princeton faculty member David Wilkinson and carried out by NASA in partnership with Princeton and other institutions.

    Page, Spergel, Norman Jarosik and many others were involved in the successful 2001 launch and later analysis of the project’s data. “They found spectacular agreement with the predictions of general relativity and the Big Bang model developed by Jim Peebles [Princeton’s Albert Einstein Professor of Science, Emeritus] and others, and were able to precisely quantify the amount of dark matter and dark energy in the universe,” Strauss said.

    Gravitational waves

    Another prediction to emerge from Einstein’s theory is that the universe is bathed in ripples in space-time called gravitational waves. These waves can be created by the collision of two very dense and massive objects, such as two neutron stars or two black holes. Joseph Taylor, Princeton’s James S. McDonnell Distinguished University Professor of Physics, Emeritus, and his graduate student Russell Hulse earned the 1993 Nobel Prize in physics for their discovery of a pair of neutron stars whose orbit closely matched the predictions of general relativity, including the emission of gravitational waves. The newly built Laser Interferometer Gravitational-Wave Observatory (LIGO), composed of two gravity-wave detectors in Louisiana and Washington, is expected to directly observe the waves in the near future.

    Caltech Ligo
    MIT/Caltech LIGO

    The detection of the waves would not be possible, however, without first having some idea of what the waves will look like. “The detectors are so sensitive,” said Princeton’s Frans Pretorius, professor of physics, “that we need a sort of template that will allow us to filter out ordinary vibrations.” Pretorius made a major contribution to this effort by solving Einstein’s general relativity equations on a computer to determine what signals will come from two colliding black holes.

    Professor of Physics Frans Pretorius uses computer simulations based on Einstein’s equations of general relativity to model the merging of two neutron stars, which can create ripples of gravity known as gravitational waves. Simulations such as the ones by Pretorius yield insight into what these waves will look like by the time they reach Earth, information that could help in their detection. (Image courtesy of Frans Pretorius, Department of Physics)

    Mathematics implications

    Einstein’s work also spurred developments in the field of mathematics. Einstein’s equations are difficult to solve, so Pretorius and others do so by using sophisticated computer algorithms. Yet mathematicians at Princeton have made major strides in proving that Einstein’s equations accurately represent our physical world.

    Mihalis Dafermos, Princeton’s Thomas D. Jones Professor of Mathematical Physics, is one of the mathematicians who studies black holes. “We look at questions such as what do black holes look like, and if you were unfortunate enough to go inside one, what would it look like from the inside?” Dafermos said. “There is really no other way than by using mathematics to know what is going on inside a black hole.” Dafermos earned his Ph.D. at Princeton and was advised by Demetrios Christodoulou, a former Princeton faculty member now at ETH Zurich, who, with Sergiu Klainerman, the Eugene Higgins Professor of Mathematics, made important contributions to the mathematical understanding of Einstein’s theory.

    New horizons

    Einstein’s equations also led to predictions that have not yet been realized, like wormholes, hypothetical tunnels through space-time that could connect regions separated by a billion light years or more. Wormholes, if they exist, could enable travel of the type featured in the 2014 movie Interstellar, which was based partly on the work of California Institute of Technology physicist Kip Thorne, who earned his Ph.D. at Princeton with John Wheeler as his adviser.

    The movie’s plot built on several features of general relativity, including the finding that time and space can be stretched or squished depending on the effects of gravity. As one travels away from the Earth in a spaceship, the influence of the Earth’s gravity weakens and time passes more quickly. Thus a clock on Earth moves slightly slower than a clock in orbit around the Earth.

    This slowed passage of time amounts to tiny fractions of a second, but it is enough to impede the accuracy of GPS systems. These systems work via very accurate timing between satellites and ground-based instruments. To make the systems as accurate as possible, it is essential that the slight effects of general relativity on time be taken into account.
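    The size of that GPS correction is easy to estimate. The sketch below is a back-of-envelope calculation; the orbital radius, constants, and circular-orbit approximation are standard published values assumed for illustration, not figures from the article:

```python
import math

# Rough estimate of the net relativistic clock offset for a GPS satellite.
GM = 3.986004418e14      # Earth's gravitational parameter, m^3/s^2
c = 2.99792458e8         # speed of light, m/s
R_earth = 6.371e6        # mean Earth radius, m
r_gps = 2.6571e7         # GPS orbital radius (~20,200 km altitude), m

# Gravitational term: the satellite clock runs fast (weaker gravity aloft).
grav = GM / c**2 * (1 / R_earth - 1 / r_gps)

# Velocity term: the moving clock runs slow (special relativity).
v = math.sqrt(GM / r_gps)           # circular orbital speed, ~3.9 km/s
vel = v**2 / (2 * c**2)

net_per_day = (grav - vel) * 86400  # seconds gained per day
print(f"satellite clock gains ~{net_per_day * 1e6:.1f} microseconds/day")
# Uncorrected, that drift maps onto a ranging error of c * (time error):
print(f"position error ~{net_per_day * c / 1000:.0f} km/day")
```

    The gravitational effect dominates the velocity effect at GPS altitude, so the satellite clocks run fast overall by a few tens of microseconds per day, which multiplied by the speed of light is a positioning error of roughly ten kilometers per day if left uncorrected.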

    Reflecting on the anniversary of general relativity, Pretorius said: “It is not just that the past 100 years were exceptional, but with the impending detection of gravitational waves, and the mysteries that are still out there, such as dark energy and dark matter, we are really entering a new era in the study of gravity. We are not just celebrating the past but also looking forward, and I think the next couple of decades are going to be very exciting.”

    Einstein’s desk at the Institute for Advanced Study. (Photo by Alan W. Richards, courtesy of the Department of Rare Books and Special Collections)

    See the full article here.

    Princeton University Campus

    About Princeton: Overview

    Princeton University is a vibrant community of scholarship and learning that stands in the nation’s service and in the service of all nations. Chartered in 1746, Princeton is the fourth-oldest college in the United States. Princeton is an independent, coeducational, nondenominational institution that provides undergraduate and graduate instruction in the humanities, social sciences, natural sciences and engineering.

    As a world-renowned research university, Princeton seeks to achieve the highest levels of distinction in the discovery and transmission of knowledge and understanding. At the same time, Princeton is distinctive among research universities in its commitment to undergraduate teaching.

    Today, more than 1,100 faculty members instruct approximately 5,200 undergraduate students and 2,600 graduate students. The University’s generous financial aid program ensures that talented students from all economic backgrounds can afford a Princeton education.

    Princeton Shield

  • richardmitnick 9:30 am on November 9, 2015 Permalink | Reply
    Tags: , , , Relativity   

    From ESA: “Galileo satellites set for year-long Einstein experiment” 

    European Space Agency

    9 November 2015

    Javier Ventura-Traveset
    ESA Global Navigation Satellite Systems Senior Advisor
    Email: Javier.ventura-traveset@esa.int

    Dr Pacôme Delva
    SYRTE, Observatoire de Paris
    Email: pacome.delva@obspm.fr

    Dr Sven Hermann
    ZARM Center of Applied Space Technology and Microgravity
    Email: sven.herrmann@zarm.uni-bremen.de

    ESA Galileo Spacecraft
    ESA/Galileo satellites

    Europe’s fifth and sixth Galileo satellites – subject to complex salvage manoeuvres following their launch last year into incorrect orbits – will help to perform an ambitious year-long test of Einstein’s most famous theory.

    Galileos 5 and 6 were launched together by a Soyuz rocket on 22 August 2014. But the faulty upper stage stranded them in elongated orbits that blocked their use for navigation.

    ESA’s specialists moved into action and oversaw a demanding set of manoeuvres to raise the low points of their orbits and make them more circular.

    Corrected Galileo orbits

    “The satellites can now reliably operate their navigation payloads continuously, and the European Commission, with the support of ESA, is assessing their eventual operational use,” explains ESA’s senior satnav advisor Javier Ventura-Traveset.

    “In the meantime, the satellites have accidentally become extremely useful scientifically, as tools to test Einstein’s General Theory of Relativity by measuring more accurately than ever before the way that gravity affects the passing of time.”

    Although the satellites’ orbits have been adjusted, they remain elliptical, with each satellite climbing and falling some 8500 km twice per day.

    It is those regular shifts in height, and therefore gravity levels, that are valuable to researchers.
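    The magnitude of those twice-daily shifts can be sketched with a one-line gravitational redshift estimate. The orbital radii below are rough values inferred from the article’s 8500 km figure; they are illustrative assumptions, not mission data:

```python
# Back-of-envelope: periodic clock-rate modulation for Galileo 5 and 6.
GM = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
c = 2.99792458e8      # speed of light, m/s
r_perigee = 2.37e7    # assumed perigee radius, m
r_apogee = r_perigee + 8.5e6  # apogee ~8500 km higher, per the article

# Gravitational redshift difference between the orbit's low and high points:
modulation = GM / c**2 * (1 / r_perigee - 1 / r_apogee)
print(f"fractional clock-rate swing ~{modulation:.1e}")
```

    A fractional swing of a few parts in 10¹¹, repeating twice a day for hundreds of orbits, is exactly the kind of clean periodic signal that lets the experimenters average down noise and sharpen the test.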

    Albert Einstein predicted a century ago that time would pass more slowly close to a massive object. This prediction has been verified experimentally, most significantly in 1976, when a hydrogen maser atomic clock aboard Gravity Probe A was launched 10 000 km into space, confirming it to within 140 parts in a million.

    Galileo maser clock

    Gravity Probe A

    Atomic clocks on navigation satellites have to take into account that they run faster in orbit than on the ground. Left uncorrected, the discrepancy – a few tenths of a microsecond per day – would produce navigation errors of around 10 km per day.

    “Now, for the first time since Gravity Probe A, we have the opportunity to improve the precision and confirm Einstein’s theory to a higher degree,” comments Javier.

    “This increased precision is of great interest because it will test several alternative theories of gravity.”

    This new effort takes advantage of the passive hydrogen maser atomic clock aboard each Galileo, the elongated orbits creating varying time dilation, and the continuous monitoring thanks to the global network of ground stations.

    “Moreover, while the Gravity Probe A experiment involved a single orbit of Earth, we will be able to monitor hundreds of orbits over the course of a year,” explains Javier.

    “This opens up the prospect of gradually refining our measurements by identifying and removing errors. Eliminating those errors is actually one of the big challenges.

    “For that we count on the support of the best experts in Europe, plus precise tracking from the International Global Navigation Satellite System Service, along with tracking to centimetre accuracy by laser.”

    The results are expected in about one year and are projected to be four times more accurate than the Gravity Probe A results.

    The two teams devising the experiments are Germany’s ZARM Center of Applied Space Technology and Microgravity, and France’s Systèmes de Référence Temps-Espace, both specialists in fundamental physics research.

    ESA’s forthcoming Atomic Clock Ensemble in Space experiment, planned to fly on the International Space Station in 2017, will go on to test Einstein’s theory down to 2–3 parts per million.

    See the full article here.


    The European Space Agency (ESA), established in 1975, is an intergovernmental organization dedicated to the exploration of space, currently with 19 member states. Headquartered in Paris, ESA has a staff of more than 2,000. ESA’s space flight program includes human spaceflight, mainly through the participation in the International Space Station program, the launch and operations of unmanned exploration missions to other planets and the Moon, Earth observation, science, telecommunication as well as maintaining a major spaceport, the Guiana Space Centre at Kourou, French Guiana, and designing launch vehicles. ESA science missions are based at ESTEC in Noordwijk, Netherlands, Earth Observation missions at ESRIN in Frascati, Italy, ESA Mission Control (ESOC) is in Darmstadt, Germany, the European Astronaut Centre (EAC) that trains astronauts for future missions is situated in Cologne, Germany, and the European Space Astronomy Centre is located in Villanueva de la Cañada, Spain.

    ESA50 Logo large

  • richardmitnick 11:44 pm on October 29, 2015 Permalink | Reply
    Tags: , , , Relativity   

    From Nautilus: “Will Quantum Mechanics Swallow Relativity?” 



    October 29, 2015
    By Corey S. Powell
    Illustration by Nicholas Garber


    The contest between gravity and quantum physics takes a new turn.

    It is the biggest of problems, it is the smallest of problems.

    At present physicists have two separate rulebooks explaining how nature works. There is general relativity, which beautifully accounts for gravity and all of the things it dominates: orbiting planets, colliding galaxies, the dynamics of the expanding universe as a whole. That’s big. Then there is quantum mechanics, which handles the other three forces—electromagnetism and the two nuclear forces [weak interaction and strong interaction]. Quantum theory is extremely adept at describing what happens when a uranium atom decays, or when individual particles of light hit a solar cell. That’s small.

    Now for the problem: Relativity and quantum mechanics are fundamentally different theories that have different formulations. It is not just a matter of scientific terminology; it is a clash of genuinely incompatible descriptions of reality.

    The conflict between the two halves of physics has been brewing for more than a century—sparked by a pair of 1905 papers by [Albert] Einstein, one outlining relativity and the other introducing the quantum—but recently it has entered an intriguing, unpredictable new phase. Two notable physicists have staked out extreme positions in their camps, conducting experiments that could finally settle which approach is paramount.

    Just as a pixel is the smallest unit of an image on your screen, so there might be an unbreakable smallest unit of distance: a quantum of space.

    Basically you can think of the division between the relativity and quantum systems as “smooth” versus “chunky.” In general relativity, events are continuous and deterministic, meaning that every cause matches up to a specific, local effect. In quantum mechanics, events produced by the interaction of subatomic particles happen in jumps (yes, quantum leaps), with probabilistic rather than definite outcomes. Quantum rules allow connections forbidden by classical physics. This was demonstrated in a much-discussed recent experiment in which Dutch researchers defied locality: they showed that two particles—in this case, electrons—could influence each other instantly, even though they were a mile apart. When you try to interpret smooth relativistic laws in a chunky quantum style, or vice versa, things go dreadfully wrong.

    Relativity gives nonsensical answers when you try to scale it down to quantum size, eventually descending to infinite values in its description of gravity. Likewise, quantum mechanics runs into serious trouble when you blow it up to cosmic dimensions. Quantum fields carry a certain amount of energy, even in seemingly empty space, and the amount of energy gets bigger as the fields get bigger. According to Einstein, energy and mass are equivalent (that’s the message of E = mc²), so piling up energy is exactly like piling up mass. Go big enough, and the amount of energy in the quantum fields becomes so great that it creates a black hole that causes the universe to fold in on itself. Oops.

    Craig Hogan, a theoretical astrophysicist at the University of Chicago and the director of the Center for Particle Astrophysics at Fermilab, is reinterpreting the quantum side with a novel theory in which the quantum units of space itself might be large enough to be studied directly. Meanwhile, Lee Smolin, a founding member of the Perimeter Institute for Theoretical Physics in Waterloo, Canada, is seeking to push physics forward by returning back to Einstein’s philosophical roots and extending them in an exciting direction.

    To understand what is at stake, look back at the precedents. When Einstein unveiled general relativity, he not only superseded Isaac Newton’s theory of gravity; he also unleashed a new way of looking at physics that led to the modern conception of the Big Bang and black holes, not to mention atomic bombs and the time adjustments essential to your phone’s GPS. Likewise, quantum mechanics did much more than reformulate James Clerk Maxwell’s textbook equations of electricity, magnetism, and light. It provided the conceptual tools for the Large Hadron Collider, solar cells, and all of modern microelectronics.

    What emerges from the dustup could be nothing less than a third revolution in modern physics, with staggering implications. It could tell us where the laws of nature came from, and whether the cosmos is built on uncertainty or whether it is fundamentally deterministic, with every event linked definitively to a cause.

    THE MAN WITH THE HOLOMETER: Craig Hogan, a theoretical astrophysicist at Fermilab, has built a device to measure what he sees as the exceedingly fine graininess of space. “I’m hoping for an experimental result that forces people to focus the theoretical thinking in a different direction,” Hogan says. The Department of Astronomy and Astrophysics, the University of Chicago.

    A Chunky Cosmos

    Hogan, champion of the quantum view, is what you might call a lamp-post physicist: Rather than groping about in the dark, he prefers to focus his efforts where the light is bright, because that’s where you are most likely to be able to see something interesting. That’s the guiding principle behind his current research. The clash between relativity and quantum mechanics happens when you try to analyze what gravity is doing over extremely short distances, he notes, so he has decided to get a really good look at what is happening right there. “I’m betting there’s an experiment we can do that might be able to see something about what’s going on, about that interface that we still don’t understand,” he says.

    A basic assumption in Einstein’s physics—an assumption going all the way back to Aristotle, really—is that space is continuous and infinitely divisible, so that any distance could be chopped up into even smaller distances. But Hogan questions whether that is really true. Just as a pixel is the smallest unit of an image on your screen and a photon is the smallest unit of light, he argues, so there might be an unbreakable smallest unit of distance: a quantum of space.

    In Hogan’s scenario, it would be meaningless to ask how gravity behaves at distances smaller than a single chunk of space. There would be no way for gravity to function at the smallest scales because no such scale would exist. Or put another way, general relativity would be forced to make peace with quantum physics, because the space in which physicists measure the effects of relativity would itself be divided into unbreakable quantum units. The theater of reality in which gravity acts would take place on a quantum stage.

    The holometer will show the right way (or rule out the wrong way) to understand the underlying quantum structure of space.

    Hogan acknowledges that his concept sounds a bit odd, even to a lot of his colleagues on the quantum side of things. Since the late 1960s, a group of physicists and mathematicians have been developing a framework called string theory to help reconcile general relativity with quantum mechanics; over the years, it has evolved into the default mainstream theory, even as it has failed to deliver on much of its early promise. Like the chunky-space solution, string theory assumes a fundamental structure to space, but from there the two diverge. String theory posits that every object in the universe consists of vibrating strings of energy. Like chunky space, string theory averts gravitational catastrophe by introducing a finite, smallest scale to the universe, although the unit strings are drastically smaller even than the spatial structures Hogan is trying to find.

    Chunky space does not neatly align with the ideas in string theory—or in any other proposed physics model, for that matter. “It’s a new idea. It’s not in the textbooks; it’s not a prediction of any standard theory,” Hogan says, sounding not the least bit concerned. “But there isn’t any standard theory, right?”

    If he is right about the chunkiness of space, that would knock out a lot of the current formulations of string theory and inspire a fresh approach to reformulating general relativity in quantum terms. It would suggest new ways to understand the inherent nature of space and time. And weirdest of all, perhaps, it would bolster an au courant notion that our seemingly three-dimensional reality is composed of more basic, two-dimensional units. Hogan takes the “pixel” metaphor seriously: Just as a TV picture can create the impression of depth from a bunch of flat pixels, he suggests, so space itself might emerge from a collection of elements that act as if they inhabit only two dimensions.

    Like many ideas from the far edge of today’s theoretical physics, Hogan’s speculations can sound suspiciously like late-night philosophizing in the freshman dorm. What makes them drastically different is that he plans to put them to a hard experimental test. As in, right now.

    Starting in 2007, Hogan began thinking about how to build a device that could measure the exceedingly fine graininess of space. As it turns out, his colleagues had plenty of ideas about how to do that, drawing on technology developed to search for gravitational waves. Within two years Hogan had put together a proposal and was working with collaborators at Fermilab, the University of Chicago, and other institutions to build a chunk-detecting machine, which he more elegantly calls a “holometer.” (The name is an esoteric pun, referencing both a 17th-century surveying instrument and the theory that 2-D space could appear three-dimensional, analogous to a hologram.)

    Beneath its layers of conceptual complexity, the holometer is technologically little more than a laser beam, a half-reflective mirror to split the laser into two perpendicular beams, and two other mirrors to bounce those beams back along a pair of 40-meter-long tunnels. The beams are calibrated to register the precise locations of the mirrors. If space is chunky, the locations of the mirrors would constantly wander about (strictly speaking, space itself is doing the wandering), creating a constant, random variation in their separation. When the two beams are recombined, they’d be slightly out of sync, and the amount of the discrepancy would reveal the scale of the chunks of space.

    For the scale of chunkiness that Hogan hopes to find, he needs to measure distances to an accuracy of 10⁻¹⁸ meters, about 100 million times smaller than a hydrogen atom, and collect data at a rate of about 100 million readings per second. Amazingly, such an experiment is not only possible, but practical. “We were able to do it pretty cheaply because of advances in photonics, a lot of off-the-shelf parts, fast electronics, and things like that,” Hogan says. “It’s a pretty speculative experiment, so you wouldn’t have done it unless it was cheap.” The holometer is currently humming away, collecting data at the target accuracy; he expects to have preliminary readings by the end of the year.
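    For a sense of where a sensitivity target that extreme comes from, here is an order-of-magnitude estimate. It assumes a simplified Hogan-style scaling in which the holographic position jitter grows as the square root of the Planck length times the apparatus length; the experiment’s actual noise model is more involved, so treat this only as a rough plausibility check:

```python
import math

# Order-of-magnitude estimate of holographic-noise jitter in the holometer.
l_planck = 1.616e-35   # Planck length, m
arm = 40.0             # holometer arm length, m (from the article)

# Assumed scaling: jitter amplitude ~ sqrt(Planck length * arm length)
jitter = math.sqrt(l_planck * arm)
print(f"predicted transverse jitter ~{jitter:.1e} m")
```

    The estimate lands within a couple of orders of magnitude of the quoted measurement accuracy, which is why a tabletop-scale interferometer with modern photonics can plausibly probe effects rooted at the Planck scale.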

    Hogan has his share of fierce skeptics, including many within the theoretical physics community. The reason for the disagreement is easy to appreciate: A success for the holometer would mean failure for a lot of the work being done in string theory. Despite this superficial sparring, though, Hogan and most of his theorist colleagues share a deep core conviction: They broadly agree that general relativity will ultimately prove subordinate to quantum mechanics. The other three laws of physics follow quantum rules, so it makes sense that gravity must as well.

    For most of today’s theorists, though, belief in the primacy of quantum mechanics runs deeper still. At a philosophical—epistemological—level, they regard the large-scale reality of classical physics as a kind of illusion, an approximation that emerges from the more “true” aspects of the quantum world operating at an extremely small scale. Chunky space certainly aligns with that worldview.

    Hogan likens his project to the landmark Michelson-Morley experiment of the 19th century, which searched for the aether—the hypothetical substance of space that, according to the leading theory of the time, transmitted light waves through a vacuum. The experiment found nothing; that perplexing null result helped inspire Einstein’s special theory of relativity, which in turn spawned the general theory of relativity and eventually turned the entire world of physics upside down. Adding to the historical connection, the Michelson-Morley experiment also measured the structure of space using mirrors and a split beam of light, following a setup remarkably similar to Hogan’s.

    “We’re doing the holometer in that kind of spirit. If we don’t see something or we do see something, either way it’s interesting. The reason to do the experiment is just to see whether we can find something to guide the theory,” Hogan says. “You find out what your theorist colleagues are made of by how they react to this idea. There’s a world of very mathematical thinking out there. I’m hoping for an experimental result that forces people to focus the theoretical thinking in a different direction.”

    Whether or not he finds his quantum structure of space, Hogan is confident the holometer will help physics address its big-small problem. It will show the right way (or rule out the wrong way) to understand the underlying quantum structure of space and how that affects the relativistic laws of gravity flowing through it.

    Sidebar: The Black Hole Resolution


    Here on Earth, the clash between the top-down and bottom-up views of physics is playing out in academic journals and in a handful of complicated experimental apparatuses. Theorists on both sides concede that neither pure thought nor technologically feasible tests may be enough to break the deadlock, however. Fortunately, there are other places to look for a more definitive resolution. One of the most improbable of these is also one of the most promising—an idea embraced by physicists almost regardless of where they stand ideologically.

    “Black hole physics gives us a clean experimental target to look for,” says Craig Hogan, a theoretical astrophysicist at the University of Chicago and the director of the Center for Particle Astrophysics at Fermilab. “The issues around quantum black holes are important,” agrees Lee Smolin, a founding member of the Perimeter Institute for Theoretical Physics in Waterloo, Canada.

    Black holes? Really? Granted, these objects are more commonly associated with questions than with answers. They are not things you can create in the laboratory, or poke and prod with instruments, or even study up close with a space probe. Nevertheless, they are the only places in the universe where Hogan’s ideas unavoidably smash into Smolin’s and, more importantly, where the whole of quantum physics collides with general relativity in a way that is impossible to ignore.

    At the outer boundary of the black hole—the event horizon—gravity is so extreme that even light cannot escape, making it an extreme test of how general relativity behaves. At the event horizon, atomic-scale events become enormously stretched out and slowed down; the horizon also divides the physical world into two distinct zones, inside and outside. And there is a very interesting meeting place in terms of the size of a black hole. A stellar-mass black hole is about the size of Los Angeles; a black hole with the mass of the Earth would be roughly the size of a marble. Black holes literally bring the big-small problem in physics home to the human scale.
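    Those two size comparisons follow directly from the Schwarzschild radius formula, r_s = 2GM/c². A quick check, using standard values for the constants and masses (assumed here, not taken from the article):

```python
# Schwarzschild radius r_s = 2GM/c^2 for the two examples in the text.
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
c = 2.99792458e8       # speed of light, m/s
M_sun = 1.989e30       # solar mass, kg
M_earth = 5.972e24     # Earth mass, kg

def schwarzschild_radius(mass_kg):
    """Event-horizon radius of a non-rotating black hole of the given mass."""
    return 2 * G * mass_kg / c**2

# A ~10-solar-mass stellar black hole: tens of kilometers across,
# comparable to the span of a large city.
print(f"10 M_sun: {schwarzschild_radius(10 * M_sun) / 1000:.0f} km")
# An Earth-mass black hole: under two centimeters in diameter, marble-sized.
print(f"Earth mass: {schwarzschild_radius(M_earth) * 1000:.0f} mm")
```

    The radius scales linearly with mass, which is why squeezing the entire Earth into a black hole yields something you could hold between two fingers.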

    The importance of black holes for resolving that problem is the reason why Stephen Hawking and his cohorts debate about them so often and so vigorously. It turns out that we don’t actually need to cozy up close to black holes in order to run experiments with them. Quantum theory implies that a single particle could potentially exist both inside and outside the event horizon, which makes no sense. There is also the question of what happens to information about things that fall into a black hole; the information seems to vanish, even though theory says that information cannot be destroyed. Addressing these contradictions is forcing theoretical physicists to grapple more vigorously than ever before with the interplay of quantum mechanics and general relativity.

    Best of all, the answers will not be confined to the world of theory. Astrophysicists have increasingly sophisticated ways to study the region just outside the event horizon by monitoring the hot, brilliant clouds of particles that swirl around some black holes. An even greater breakthrough is just around the corner: the Event Horizon Telescope. This project is in the process of linking together about a dozen radio dishes from around the world, creating an enormous networked telescope so powerful that it will be able to get a clear look at Sagittarius A*, the massive black hole that resides in the center of our galaxy. Soon, possibly by 2020, the Event Horizon Telescope should deliver its first good portraits. What they show will help constrain the theories of black holes, and so offer telling clues about how to solve the big-small problem.

    Human researchers using football stadium-size radio telescopes, linked together into a planet-size instrument, to study a star-size black hole, to reconcile the subatomic-and-cosmic-level enigma at the heart of physics … if it works, the scale of the achievement will be truly unprecedented.

    See the full article here.


    Welcome to Nautilus. We are delighted you joined us. We are here to tell you about science and its endless connections to our lives. Each month we choose a single topic. And each Thursday we publish a new chapter on that topic online. Each issue combines the sciences, culture and philosophy into a single story told by the world’s leading thinkers and writers. We follow the story wherever it leads us. Read our essays, investigative reports, and blogs. Fiction, too. Take in our games, videos, and graphic stories. Stop in for a minute, or an hour. Nautilus lets science spill over its usual borders. We are science, connected.

  • richardmitnick 10:12 am on October 1, 2015 Permalink | Reply
    Tags: , , , Relativity   

    From Lawrence Krauss at Nautilus: “The Trouble with Theories of Everything” 



    October 1, 2015

    By Lawrence M. Krauss
    Illustrations by Melinda Beck

    Whenever you say anything about your daily life, a scale is implied. Try it out. “I’m too busy” only works for an assumed time scale: today, for example, or this week. Not this century or this nanosecond. “Taxes are onerous” only makes sense for a certain income range. And so on.

    Surely the same restriction doesn’t hold true in science, you might say. After all, for centuries after the introduction of the scientific method, conventional wisdom held that there were theories that were absolutely true for all scales, even if we could never be empirically certain of this in advance. [Sir Isaac] Newton’s universal law of gravity, for example, was, after all, universal! It applied to falling apples and falling planets alike, and accounted for every significant observation made under the sun, and over it as well.

    With the advent of relativity, and general relativity in particular, it became clear that Newton’s law of gravity was merely an approximation of a more fundamental theory. But the more fundamental theory, general relativity, was so mathematically beautiful that it seemed reasonable to assume that it codified perfectly and completely the behavior of space and time in the presence of mass and energy.

    The advent of quantum mechanics changed everything. When quantum mechanics is combined with relativity, it turns out, rather unexpectedly in fact, that the detailed nature of the physical laws that govern matter and energy actually depend on the physical scale at which you measure them. This led to perhaps the biggest unsung scientific revolution in the 20th century: We know of no theory that both makes contact with the empirical world, and is absolutely and always true. (I don’t envisage this changing anytime soon, string theorists’ hopes notwithstanding.) Despite this, theoretical physicists have devoted considerable energy to chasing exactly this kind of theory. So, what is going on? Is a universal theory a legitimate goal, or will scientific truth always be scale-dependent?


    The combination of quantum mechanics and relativity implies an immediate scaling problem. Heisenberg’s famous uncertainty principle, which lies at the heart of quantum mechanics, implies that on small scales, for short times, it is impossible to completely constrain the behavior of elementary particles. There is an inherent uncertainty in energy and momenta that can never be reduced. When this fact is combined with special relativity, the conclusion is that you cannot actually even constrain the number of particles present in a small volume for short times. So called “virtual particles” can pop in and out of the vacuum on timescales so short you cannot measure their presence directly.

    One striking effect of this is that when we measure the force between electrons, say, the actual measured charge on the electron—the thing that determines how strong the electric force is—depends on what scale you measure it at. The closer you get to the electron, the more deeply you are penetrating inside of the “cloud” of virtual particles that are surrounding the electron. Since positive virtual particles are attracted to the electron, the deeper you penetrate into the cloud, the less of the positive cloud and more of the negative charge on the electron you see.
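    This screening effect can be made concrete with the one-loop running of the QED coupling, α(Q²) = α / (1 − (α/3π) ln(Q²/m_e²)). A rough sketch, including only the electron-positron screening cloud (so the high-energy value undershoots the full Standard Model result):

```python
import math

ALPHA_0 = 1 / 137.036      # fine-structure constant at low energy
M_E = 0.511e-3             # electron mass, GeV

def alpha_qed(q_gev):
    """One-loop QED running coupling at momentum scale q (GeV),
    with screening from electron-positron virtual pairs only."""
    log = math.log(q_gev**2 / M_E**2)
    return ALPHA_0 / (1 - (ALPHA_0 / (3 * math.pi)) * log)

# The closer you probe (higher Q), the larger the effective charge:
a_low = alpha_qed(1e-3)    # ~1 MeV scale: essentially 1/137
a_high = alpha_qed(91.19)  # Z-boson scale: noticeably stronger
print(f"1/alpha at 1 MeV: {1/a_low:.1f}, at 91 GeV: {1/a_high:.1f}")
```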

    Then, when you set out to calculate the force between two particles, you need to include the effects of all possible virtual particles that could pop out of empty space during the period of measuring the force. This includes particles with arbitrarily large amounts of mass and energy, appearing for arbitrarily small amounts of time. When you include such effects, the calculated force is infinite.

    Richard Feynman shared the Nobel Prize for arriving at a method to consistently calculate a finite residual force after extracting a variety of otherwise ambiguous infinities. As a result, we can now compute, from fundamental principles, quantities such as the magnetic moment of the electron to 10 significant figures, comparing it with experiments at a level unachievable in any other area of science.

    But Feynman was ultimately disappointed with what he had accomplished—something that is clear from his 1965 Nobel lecture, where he said, “I think that the renormalization theory is simply a way to sweep the difficulties of the divergences of electrodynamics under the rug.” He thought that no sensible complete theory should produce infinities in the first place, and that the mathematical tricks he and others had developed were ultimately a kind of kludge.

    Now, though, we understand things differently. Feynman’s concerns were, in a sense, misplaced. The problem was not with the theory, but with trying to push the theory beyond the scales where it provides the correct description of nature.

    There is a reason that the infinities produced by virtual particles with arbitrarily large masses and energies are not physically relevant: They are based on the erroneous presumption that the theory is complete. Or, put another way, that the theory describes physics on all scales, even arbitrarily small scales of distance and time. But if we expect our theories to be complete, that means that before we can have a theory of anything, we would first have to have a theory of everything—a theory that included the effects of all elementary particles we already have discovered, plus all the particles we haven’t yet discovered! That is impractical at best, and impossible at worst.

    Thus, theories that make sense must be insensitive, at the scales we can measure in the laboratory, to the effects of possible new physics at much smaller distance scales (or less likely, on much bigger scales). This is not just a practical workaround of a temporary problem, which we expect will go away as we move toward ever-better descriptions of nature. Since our empirical knowledge is likely to always be partially incomplete, the theories that work to explain that part of the universe we can probe will, by practical necessity, be insensitive to possible new physics at scales beyond our current reach. It is a feature of our epistemology, and something we did not fully appreciate before we began to explore the extreme scales where quantum mechanics and relativity both become important.

    This applies even to the best physical theory we have in nature: quantum electrodynamics, which describes the quantum interactions between electrons and light. The reason we can, following Feynman’s lead, throw away with impunity the infinities that theory produces is that they are artificial. They correspond to extrapolating the theory to domains where it is probably no longer valid. Feynman was wrong to have been disappointed with his own success in maneuvering around these infinities—that is the best he could have done without understanding new physics at scales far smaller than could have been probed at the time. Even today, half a century later, the theory that takes over at the scales where quantum electrodynamics is no longer the correct description is itself expected to break down at still smaller scales.


    There is an alternative narrative to the story of scale in physical theory. Rather than legitimately separating theories into their individual domains, outside of which they are ineffective, scaling arguments have revealed hidden connections between theories, and pointed the way to new unified theories that encompass the original theories and themselves apply at a broader range of scale.

    For example, all of the hoopla over the past several years associated with the discovery of the Higgs particle was due to the fact that it was the last missing link in a theory that unifies quantum electrodynamics with another force, called the weak interaction. These are two of the four known forces in nature, and on the surface they appear very different. But we now understand that on very small scales, and very high energies, the two forces can be understood as different manifestations of the same underlying force, called the electroweak force.

    Scale has also motivated physicists to try to unify another of nature’s basic forces, the strong force, into a broader theory. The strong force, which acts on the quarks that make up protons and neutrons, resisted understanding until 1973. That year, three theorists, David Gross, Frank Wilczek, and David Politzer, demonstrated something absolutely unexpected and remarkable: a candidate theory to describe this force, called quantum chromodynamics—in analogy with quantum electrodynamics—possessed a property they called Asymptotic Freedom.

    Asymptotic Freedom causes the strong force between quarks to get weaker as the quarks are brought closer together. This explained not only an experimental phenomenon that had become known as “scaling”—where quarks within protons appeared to behave as if they were independent non-interacting particles at high energies and small distances—but it also offered the possibility to explain why no free quarks are observed in nature. If the strong force becomes weaker at small distances, it presumably can be strong enough at large distances to ensure that no free quarks ever escape their partners.
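    Asymptotic Freedom shows up directly in the one-loop formula for the strong coupling, α_s(Q²) = 12π / ((33 − 2n_f) ln(Q²/Λ²)). A sketch of that scale dependence (n_f = 5 active quark flavors and Λ ≈ 0.2 GeV are illustrative choices, and one loop is only a rough estimate):

```python
import math

N_F = 5            # active quark flavors (illustrative choice)
LAMBDA_QCD = 0.2   # QCD scale in GeV (illustrative choice)

def alpha_s(q_gev):
    """One-loop strong coupling at momentum scale q (GeV)."""
    return 12 * math.pi / ((33 - 2 * N_F) * math.log(q_gev**2 / LAMBDA_QCD**2))

# Opposite behavior to electromagnetism: the force WEAKENS as quarks
# are probed at shorter distances (higher momentum scales Q).
for q in (1.0, 2.0, 10.0, 91.19):
    print(f"alpha_s({q:6.2f} GeV) = {alpha_s(q):.3f}")
```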

    The discovery that the strong force gets weaker at small distances, while electromagnetism, which gets united with the weak force, gets stronger at small distances, led theorists in the 1970s to propose that at sufficiently small scales, perhaps 15 orders of magnitude smaller than the size of a proton, all three forces (strong, weak, and electromagnetic) get unified together as a single force in what has become known as a Grand Unified Theory. Over the past 40 years we have been searching for direct evidence of this—in fact the Large Hadron Collider is just now searching for a whole set of new elementary particles that appear to be necessary for the scaling of the three forces to be just right.


    But while there is indirect evidence, no direct smoking gun has yet been found.

    Naturally, efforts to unify three of the four known forces led to further efforts to incorporate the fourth force, gravity, into the mix. In order to do this, proposals have been made that gravity itself is merely an effective theory and at sufficiently small scales it gets merged with the other forces, but only if there are a host of extra spatial dimensions in nature that we do not observe. This theory, often called superstring theory, produced a great deal of excitement among theorists in the 1980s and 1990s, but to date there is not any evidence that it actually describes the universe we live in.

    If it does, then it will possess a unique and new feature: superstring theory may ultimately produce no infinities at all. Therefore, it has the potential to apply at all distance scales, no matter how small. For this reason it has become known to some as a “theory of everything”—though, in fact, the scale where all the exotica of the theory would actually appear is so small as to be essentially physically irrelevant as far as foreseeable experimental measurements would be concerned.

    The recognition of the scale dependence of our understanding of physical reality has led us, over time, toward a proposed theory—string theory—for which this limitation vanishes. Is that effort the reflection of a misplaced audacity by theoretical physicists accustomed to success after success in understanding reality at ever-smaller scales?

    While we don’t know the answers to that question, we should, at the very least, be skeptical. There is no example so far where an extrapolation as grand as that associated with string theory, not grounded by direct experimental or observational results, has provided a successful model of nature. In addition, the more we learn about string theory, the more complicated it appears to be, and many early expectations about its universalism may have been optimistic.

    At least as likely is the possibility that nature, as Feynman once speculated, could be like an onion, with a huge number of layers. As we peel back each layer we may find that our beautiful existing theories get subsumed in a new and larger framework. So there would always be new physics to discover, and there would never be a final, universal theory that applies for all scales of space and time, without modification.

    Which road is the real road to reality is up for grabs. If we knew the correct path to discovery, it wouldn’t be discovery. Perhaps my own predilection is just based on a misplaced hope of continued job security for physicists! But I also like the possibility that there will forever be mysteries to solve. Because life without mystery can get very boring, at any scale.

    Lawrence M. Krauss is a theoretical physicist and cosmologist, Director of the Origins Project and Foundation Professor in the School of Earth and Space Exploration at Arizona State University. He is also the author of bestselling books including A Universe from Nothing and The Physics of Star Trek.

    See the full article here.



  • richardmitnick 11:41 am on August 28, 2015 Permalink | Reply
    Tags: , , , Quantum Weirdness, Relativity   

    From New Scientist: “Quantum weirdness proved real in first loophole-free experiment” 


    New Scientist

    28 August 2015
    Jacob Aron

    Welcome to quantum reality (Image: Julie Guiche/Picturetank)

    It’s official: the universe is weird. Our everyday experience tells us that distant objects cannot influence each other, and don’t disappear just because no one is looking at them. Even Albert Einstein was dead against such ideas because they clashed so badly with our views of the real world.

    But it turns out we’re wrong – the quantum nature of reality means, on some level, these things can and do actually happen. A groundbreaking experiment puts the final nail in the coffin of our ordinary “local realism” view of the universe, settling an argument that has raged through physics for nearly a century.

    Teams of physicists around the world have been racing to complete this experiment for decades. Now, a group led by Ronald Hanson at Delft University of Technology in the Netherlands has finally cracked it. “It’s a very nice and beautiful experiment, and one can only congratulate the group for that,” says Anton Zeilinger, head of one of the rival teams at the University of Vienna, Austria. “Very well done.”

    To understand what Hanson and his colleagues did, we have to go back to the 1930s, when physicists were struggling to come to terms with the strange predictions of the emerging science of quantum mechanics. The theory suggested that particles could become entangled, so that measuring one would instantly influence the measurement of the other, even if they were separated by a great distance. Einstein dubbed this “spooky action at a distance”, unhappy with the implication that particles could apparently communicate faster than any signal could pass between them.

    What’s more, the theory also suggested that the properties of a particle are only fixed when measured, and prior to that they exist in a fuzzy cloud of probabilities.

    Nonsense, said Einstein, who famously proclaimed that God does not play dice with the universe. He and others were guided by the principle of local realism, which broadly says that only nearby objects can influence each other and that the universe is “real” – our observing it doesn’t bring it into existence by crystallising vague probabilities. They argued that quantum mechanics was incomplete, and that “hidden variables” operating at some deeper layer of reality could explain the theory’s apparent weirdness.

    On the other side, physicists like Niels Bohr insisted that we just had to accept the new quantum reality, since it explained problems that classical theories of light and energy couldn’t handle.

    Test it out

    It wasn’t until the 1960s that the debate shifted further to Bohr’s side, thanks to John Bell, a physicist at CERN. He realised that there was a limit to how connected the properties of two particles could be if local realism was to be believed. So he formulated this insight into a mathematical expression called an inequality. If tests showed that the connection between particles exceeded the limit he set, local realism was toast.

    “This is the magic of Bell’s inequality,” says Zeilinger’s colleague Johannes Kofler. “It brought an almost purely philosophical thing, where no one knew how to decide between two positions, down to a thing you could experimentally test.”

    And test they did. Experiments have been violating Bell’s inequality for decades, and the majority of physicists now believe Einstein’s views on local realism were wrong. But doubts remained. All prior experiments were subject to a number of potential loopholes, leaving a gap that could allow Einstein’s camp to come surging back.

    “The notion of local realism is so ingrained into our daily thinking, even as physicists, that it is very important to definitely close all the loopholes,” says Zeilinger.

    Loophole trade-off

    A typical Bell test begins with a source of photons, which spits out two at the same time and sends them in different directions to two waiting detectors, operated by a hypothetical pair conventionally known as Alice and Bob. The pair have independently chosen the settings on their detectors so that only photons with certain properties can get through. If the photons are entangled according to quantum mechanics, they can influence each other and repeated tests will show a stronger pattern between Alice and Bob’s measurements than local realism would allow.
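    That “stronger pattern” is quantified by the CHSH form of Bell’s inequality: local hidden variables cap the correlation sum S at 2, while quantum mechanics predicts up to 2√2 ≈ 2.83 at the right analyzer angles. A sketch using the textbook singlet-state correlation E(a, b) = −cos(a − b) (a spin-measurement idealization, not the article’s photon setup):

```python
import math

def E(a, b):
    """Quantum correlation of spin measurements on a singlet pair
    along analyzer angles a and b (radians)."""
    return -math.cos(a - b)

# Standard CHSH angle choices that maximize the quantum violation.
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

S = abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))
print(f"S = {S:.3f}  (local realism requires S <= 2)")
```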

    But what if Alice and Bob are passing unseen signals – perhaps through Einstein’s deeper hidden layer of reality – that allow one detector to communicate with the other? Then you couldn’t be sure that the particles are truly influencing each other in their instant, spooky quantum-mechanical way – instead, the detectors could be in cahoots, altering their measurements. This is known as the locality loophole, and it can be closed by moving the detectors far enough apart that there isn’t enough time for a signal to cross over before the measurement is complete. Previously Zeilinger and others have done just that, including shooting photons between two Canary Islands 144 kilometres apart.

    Close one loophole, though, and another opens. The Bell test relies on building up a statistical picture through repeated experiments, so it doesn’t work if your equipment doesn’t pick up enough photons. Other experiments closed this detection loophole, but the problem gets worse the further you separate the detectors, as photons can get lost on the way. So moving the detectors apart to close the locality loophole begins to widen the detection one.

    “There’s a trade-off between these two things,” says Kofler. That meant hard-core local realists always had a loophole to explain away previous experiments – until now.

    “Our experiment realizes the first Bell test that simultaneously addressed both the detection loophole and the locality loophole,” writes Hanson’s team in a paper detailing the study. Hanson declined to be interviewed because the work is currently under review for publication in a journal.

    Entangled diamonds

    In this set-up, Alice and Bob sit in two laboratories 1.3 kilometres apart. Light takes 4.27 microseconds to travel this distance and their measurement takes only 3.7 microseconds, so this is far enough to close the locality loophole.
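    The timing requirement is simple arithmetic: each measurement must finish before a light-speed signal could cross between the labs. A quick check (1,280 m is an assumed separation that reproduces the quoted 4.27 microseconds; the article rounds it to 1.3 km):

```python
C = 299_792_458.0        # speed of light, m/s
SEPARATION_M = 1_280.0   # lab separation (assumed; article rounds to 1.3 km)
MEASUREMENT_S = 3.7e-6   # time to complete one measurement, seconds

light_crossing_s = SEPARATION_M / C   # time for light to cross between labs

# The locality loophole is closed only if no light-speed signal
# could link the two measurements before they finish.
assert MEASUREMENT_S < light_crossing_s
print(f"light crossing: {light_crossing_s * 1e6:.2f} us; measurement: 3.70 us")
```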

    Each laboratory has a diamond that contains an electron with a property called spin. The team hits the diamonds with randomly produced microwave pulses. This makes them each emit a photon, which is entangled with the electron’s spin. These photons are then sent to a third location, C, in between Alice and Bob, where another detector clocks their arrival time.

    If photons arrive from Alice and Bob at exactly the same time, they transfer their entanglement to the spins in each diamond. So the electrons are entangled across the distance of the two labs – just what we need for a Bell test. What’s more, the electrons’ spin is constantly monitored, and the detectors are of high enough quality to close the detection loophole.

    But the downside is that the two photons arriving at C rarely coincide – just a few per hour. The team took 245 measurements, so it was a long wait. “This is really a very tough experiment,” says Kofler.

    The result was clear: the labs detected more highly correlated spins than local realism would allow. The weird world of quantum mechanics is our world.

    “If they’ve succeeded, then without any doubt they’ve done a remarkable experiment,” says Sandu Popescu of the University of Bristol, UK. But he points out that most people expected this result – “I can’t say everybody was holding their breath to see what happens.” What’s important is that these kinds of experiments drive the development of new quantum technology, he says.

    One of the most important quantum technologies in use today is quantum cryptography. Data networks that use the weird properties of the quantum world to guarantee absolute secrecy are already springing up across the globe, but the loopholes are potential bugs in the laws of physics that might have allowed hackers through. “Bell tests are a security guarantee,” says Kofler. You could say Hanson’s team just patched the universe.

    Freedom of choice

    There are still a few ways to quibble with the result. The experiment was so tough that the p-value – a measure of statistical significance – was relatively high for work in physics. Other sciences like biology normally accept a p-value below 5 per cent as a significant result, but physicists tend to insist on values millions of times smaller, meaning the result is more statistically sound. Hanson’s group reports a p-value of around 4 per cent, just below that higher threshold.
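    For scale: the particle-physics convention of a “five sigma” discovery corresponds to a two-sided Gaussian p-value near 6 × 10⁻⁷, orders of magnitude below the 0.05 threshold common elsewhere. A quick check with the standard complementary error function:

```python
import math

def two_sided_p(n_sigma):
    """Two-sided Gaussian p-value for an n-sigma deviation."""
    return math.erfc(n_sigma / math.sqrt(2))

p_common = 0.05              # conventional threshold in many sciences
p_5sigma = two_sided_p(5.0)  # the physics discovery standard

print(f"5-sigma p-value: {p_5sigma:.2e}  (vs. the common {p_common})")
```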

    That isn’t too concerning, says Zeilinger. “I expect they have improved the experiment, and by the time it is published they’ll have better data,” he says. “There is no doubt it will withstand scrutiny.”

    And there is one remaining loophole for local realists to cling to, but no experiment can ever rule it out. What if there is some kind of link between the random microwave generators and the detectors? Then Alice and Bob might think they’re free to choose the settings on their equipment, but hidden variables could interfere with their choice and thwart the Bell test.

    Hanson’s team note this is a possibility, but assume it isn’t the case. Zeilinger’s experiment attempts to deal with this freedom of choice loophole by separating their random number generators and detectors, while others have proposed using photons from distant quasars to produce random numbers, resulting in billions of years of separation.

    None of this helps in the long run. Suppose the universe is somehow entirely predetermined, the flutter of every photon carved in stone since time immemorial. In that case, no one would ever have a choice about anything. “The freedom of choice loophole will never be closed fully,” says Kofler. As such, it’s not really worth experimentalists worrying about – if the universe is predetermined, the complete lack of free will means we’ve got bigger fish to fry.

    So what would Einstein have made of this new result? Unfortunately he died before Bell proposed his inequality, so we don’t know if subsequent developments would have changed his mind, but he’d likely be enamoured with the lengths people have gone to prove him wrong. “I would give a lot to know what his reaction would be,” says Zeilinger. “I think he would be very impressed.”

    Journal reference: arxiv.org/abs/1508.05949v1

    See the full article here.


  • richardmitnick 10:55 am on March 18, 2015 Permalink | Reply
    Tags: , , Relativity,   

    From ars technica: “Shining an X-Ray torch on quantum gravity” 

    Ars Technica

    Mar 17, 2015
    Chris Lee

    This free electron laser could eventually provide a test of quantum gravity. BNL

    Quantum mechanics has been successful beyond the wildest dreams of its founders. The lives and times of atoms, governed by quantum mechanics, play out before us on the grand stage of space and time. And the stage is an integral part of the show, bending and warping around the actors according to the rules of general relativity. The actors—atoms and molecules—respond to this shifting stage, but they have no influence on how it warps and flows around them.

    This is puzzling to us. Why is it such a one-directional thing: general relativity influences quantum mechanics, but quantum mechanics has no influence on general relativity? It’s a puzzle that is born of human expectation rather than evidence. We expect that, since quantum mechanics is punctuated by sharp jumps, somehow space and time should do the same.

    There’s also the expectation that, if space and time acted a bit more quantum-ish, then the equations of general relativity would be better behaved. In general relativity, it is possible to bend space and time infinitely sharply. This is something we simply cannot understand: what would infinitely bent space look like? To most physicists, it looks like something that cannot actually be real, indicating a problem with the theory. Might this be where the actors influence the stage?

    Quantum mechanics and relativity on the clock

    To try and catch the actors modifying the stage requires the most precise experiments ever devised. Nothing we have so far will get us close, so a new idea from a pair of German physicists is very welcome. They focus on what’s perhaps the most promising avenue for detecting quantum influences on space-time: time-dilation experiments. Modern clocks rely on the quantum nature of atoms to measure time. And the flow of time depends on relative speed and gravitational acceleration. Hence, we can test general relativity, special relativity, and quantum mechanics all in the same experiment.

    To get an idea of how this works, let’s take a look at the traditional atomic clock. In an atomic clock, we carefully prepare some atoms in a predefined superposition state: that is, the atom is prepared such that it has a fifty percent chance of being in state A, and a fifty percent chance of being in state B. As time passes, the environment around the atom forces the superposition state to change. At some later point, it will have a seventy-five percent chance of being in state A; even later, it will certainly be in state A. Keep on going, however, and the chance of being in state A starts to shrink, and it continues to do so until the atom is certainly in state B. Provided that the atom is undisturbed, these oscillations will continue.

    These periodic oscillations provide the perfect ticking clock. We simply define the period of an oscillation to be our base unit of time. To couple this to general relativity measurements is, in principle, rather simple. Build two clocks and place them beside each other. At a certain moment, we start counting ticks from both clocks. When one clock reaches a thousand (for instance), we compare the number of ticks from the two clocks. If we have done our job right, both clocks should have reached a thousand ticks.

    If we shoot one into space, however, and perform the same experiment, relativity demands that the clock in orbit record more ticks than the clock on Earth. The way we record the passing of time is by a phenomenon that is purely quantum in nature, while the passing of time itself is modified by gravity. These experiments work really well. But at present, they are not sensitive enough to detect any deviation from either quantum mechanics or general relativity.
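    The orbital prediction can be estimated to first order: the satellite clock runs fast by the gravitational potential difference GM(1/r_Earth − 1/r_orbit)/c² and slow by the kinematic factor v²/2c². A sketch for a GPS-like orbit (the orbital parameters are standard values, not from the article):

```python
import math

GM_EARTH = 3.986004e14   # Earth's gravitational parameter, m^3/s^2
C = 299_792_458.0        # speed of light, m/s
R_EARTH = 6.371e6        # Earth radius, m
R_ORBIT = 2.6571e7       # GPS-like orbital radius, m
SECONDS_PER_DAY = 86_400

# General relativity: weaker gravity aloft -> orbiting clock ticks faster.
grav_rate = GM_EARTH * (1 / R_EARTH - 1 / R_ORBIT) / C**2

# Special relativity: orbital speed -> orbiting clock ticks slower.
v = math.sqrt(GM_EARTH / R_ORBIT)        # circular-orbit speed, ~3.9 km/s
kinematic_rate = -(v**2) / (2 * C**2)

net_us_per_day = (grav_rate + kinematic_rate) * SECONDS_PER_DAY * 1e6
print(f"orbiting clock gains ~{net_us_per_day:.0f} microseconds per day")
```

    The two effects pull in opposite directions, and at this altitude the gravitational term wins: the orbiting clock nets a gain of roughly 38 microseconds per day, which GPS must correct for.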

    Going nuclear

    That’s where the new ideas come in. The researchers propose, essentially, to create something similar to an atomic clock, but instead of tracking oscillating atomic states, they want to track nuclear states. Usually, when I discuss atoms, I ignore the nucleus entirely. Yes, it is there, but I only really care about the influence the nucleus has on the energetic states of the electrons that surround it. However, in one key way the nucleus is just like the electron cloud that surrounds it: it has its own set of energetic states. It is possible to excite nuclear states (using X-Ray radiation) and, afterwards, they will return to the ground state by emitting an X-Ray.

    So let’s imagine that we have a crystal of silver sitting on the surface of the Earth. The silver atoms all experience a slightly different flow of time because the atoms at the top of the crystal are further away from the center of the Earth compared to the atoms at the bottom of the crystal.
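How different is the flow of time across the crystal? The gravitational redshift formula gives the fractional rate difference between top and bottom as gh/c². For a millimetre-thick crystal the number is absurdly small; the calculation below is a back-of-the-envelope check, not a figure taken from the paper.

```python
# Fractional difference in clock rate between the top and bottom of a
# 1 mm crystal sitting in Earth's gravity: g * h / c^2.
G_EARTH = 9.81       # surface gravity, m/s^2
C = 2.998e8          # speed of light, m/s
HEIGHT = 1e-3        # crystal thickness, m (the article's ~1 mm scale)

fractional_shift = G_EARTH * HEIGHT / C**2
print(f"fractional rate difference across the crystal: {fractional_shift:.2e}")
```

That is a shift of about one part in 10^19, which is why the experiment needs a coherent, crystal-wide quantum state to amplify it into something observable.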

    To kick things off, we send in a single X-Ray photon, which is absorbed by the crystal. This is where the awesomeness of quantum mechanics puts on sunglasses and starts dancing. We don’t know which silver atom absorbed the photon, so we have to consider that all of them absorbed a tiny fraction of the photon. This shared absorption now means that all of the silver atoms enter a superposition state of having absorbed and not absorbed a photon. This superposition state changes with time, just like in an atomic clock.

    In the absence of an outside environment, all the silver atoms will change in lockstep. And when the photon is re-emitted from the crystal, all the atoms will contribute to that emission. So each atom behaves as if it is emitting a partial photon. These photons add together, and a single photon flies off in the same direction as the absorbed photon had been traveling. Essentially because all the atoms are in lockstep, the charge oscillations that emit the photon add up in phase only in the direction that the absorbed photon was flying.
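The lockstep emission can be pictured as a toy phased array: each atom contributes a unit phasor, and the phasors add coherently only when their phases line up. The atom count and per-atom phase lag below are purely illustrative assumptions.

```python
import cmath

N = 1000       # toy number of atoms stacked vertically (illustrative)
delta = 0.002  # assumed phase lag per atom induced by gravity (illustrative)

# All atoms in lockstep: unit phasors add coherently in the forward direction.
forward_in_step = abs(sum(cmath.exp(1j * 0.0) for _ in range(N)))

# A linear phase gradient across the stack (top atoms slightly ahead of
# bottom atoms) leaves the phasors misaligned, reducing the forward sum...
forward_tilted = abs(sum(cmath.exp(1j * delta * k) for k in range(N)))

# ...but observing at a slight angle contributes a compensating geometric
# phase per atom, and the contributions add up in phase once again.
tilt = delta
off_axis = abs(sum(cmath.exp(1j * (delta * k - tilt * k)) for k in range(N)))

print(forward_in_step, forward_tilted, off_axis)
```

The full amplitude is recovered only in the slightly tilted direction, which is the geometric origin of the deflection discussed next.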

    Gravity, though, causes the atoms to fall out of lockstep. So when the time comes to emit, the charge oscillations are all slightly out of phase with each other. But they are not random: those at the top of the crystal are just slightly ahead of those at the bottom of the crystal. As a result, the direction for which the individual contributions add up in phase is not in the same direction as the flight path of the absorbed photon, but at a very slight angle.

    How big is this angle? That depends on the size of the crystal and how long it takes the environment to randomize the emission process. For a crystal of silver atoms that is less than 1mm thick, the angle could be as large as 100 micro-degrees, which is small but probably measurable.
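A rough way to see where that number comes from: over the storage time τ, the top of the crystal runs ahead of the bottom, tilting the re-emitted wavefront by roughly θ ≈ gτ/c (the crystal height cancels out of this estimate). The storage time used below, about 40 seconds, is my assumption, roughly the half-life of silver’s long-lived nuclear isomer; the article does not state the lifetime.

```python
import math

# Wavefront-tilt estimate: theta ~ g * tau / c, with tau the time the
# excitation is stored coherently in the crystal (assumed ~40 s here).
G_EARTH = 9.81   # m/s^2
C = 2.998e8      # m/s
TAU = 40.0       # assumed coherent storage time, s

theta_rad = G_EARTH * TAU / C
theta_micro_deg = math.degrees(theta_rad) * 1e6
print(f"deflection angle ~ {theta_micro_deg:.0f} micro-degrees")
```

With these assumed inputs the estimate lands in the tens of micro-degrees, consistent with the "as large as 100 micro-degrees" quoted above.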

    Spinning crystals

    That, however, is only the beginning of a rich seam of cleverness. If the crystal is placed on the outside of a cylinder and rotated during the experiment, then the atoms at the top of the crystal are moving faster than those at the bottom, meaning that the time dilation experienced at the top of the crystal is greater than that at the bottom. This has exactly the same effect as placing the crystal in a gravitational field, but now the strength of that field is governed by the rate of rotation.

    In any case, by spinning a 10mm diameter cylinder very fast (70,000 revolutions per second), the angular deflection is vastly increased. For silver, for instance, it reaches 90 degrees. With such a large signal, even smaller deviations from the predictions of general relativity should be detectable in the lab. Importantly, these deviations happen on very small length scales, where we would normally start thinking about quantum effects in matter. Experiments like these may even be sensitive enough to see the influence of quantum mechanics on space and time.
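The enhancement from spinning is easy to estimate: the centripetal acceleration ω²r at the crystal plays the role of g. For the quoted 10 mm cylinder at 70,000 revolutions per second, the effective "gravity" comes out around a hundred million times Earth's, which is why the deflection grows from micro-degrees to tens of degrees.

```python
import math

# Effective acceleration at the rim of the spinning cylinder: a = omega^2 * r.
REV_PER_S = 7.0e4        # 70,000 revolutions per second (from the article)
RADIUS = 5.0e-3          # half of the quoted 10 mm diameter, m
G_EARTH = 9.81           # m/s^2, for comparison

omega = 2.0 * math.pi * REV_PER_S   # angular frequency, rad/s
a_eff = omega**2 * RADIUS           # effective "gravity" at the crystal
print(f"effective acceleration: {a_eff:.2e} m/s^2 "
      f"(~{a_eff / G_EARTH:.0e} times Earth's gravity)")
```

The roughly eight-orders-of-magnitude boost over g is what turns a barely measurable tilt into a large, laboratory-friendly signal.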

    A physical implementation of this experiment will be challenging but not impossible. The biggest issue is probably the X-Ray source and doing single photon experiments in the X-Ray regime. Following that, the crystals need to be extremely pure, and something called a coherent state needs to be created within them. This is certainly not trivial. Given that it took atomic physicists a long time to achieve this for electronic transitions, I think it will take a lot more work to make it happen at X-Ray frequencies.

    On the upside, free-electron lasers have come a very long way, and they offer much better control over beam intensities and stability. This is, hopefully, the sort of challenge that beam-line scientists live for.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon
    Stem Education Coalition
    Ars Technica was founded in 1998 when Founder & Editor-in-Chief Ken Fisher announced his plans for starting a publication devoted to technology that would cater to what he called “alpha geeks”: technologists and IT professionals. Ken’s vision was to build a publication with a simple editorial mission: be “technically savvy, up-to-date, and more fun” than what was currently popular in the space. In the ensuing years, with formidable contributions by a unique editorial staff, Ars Technica became a trusted source for technology news, tech policy analysis, breakdowns of the latest scientific advancements, gadget reviews, software, hardware, and nearly everything else found in between layers of silicon.

    Ars Technica innovates by listening to its core readership. Readers have come to demand devotedness to accuracy and integrity, flanked by a willingness to leave each day’s meaningless, click-bait fodder by the wayside. The result is something unique: the unparalleled marriage of breadth and depth in technology journalism. By 2001, Ars Technica was regularly producing news reports, op-eds, and the like, but the company stood out from the competition by regularly providing long thought-pieces and in-depth explainers.

    And thanks to its readership, Ars Technica also accomplished a number of industry-leading moves. In 2001, Ars launched a digital subscription service when such things were non-existent for digital media. Ars was also the first IT publication to begin covering the resurgence of Apple, and the first to draw analytical and cultural ties between the world of high technology and gaming. Ars was also first to begin selling its long form content in digitally distributable forms, such as PDFs and eventually eBooks (again, starting in 2001).
