Tagged: Quantum Mechanics

  • richardmitnick 9:08 am on October 8, 2015
    Tags: Quantum Mechanics

    From Nautilus: “How Big Can Schrödinger’s Kittens Get?” 



    October 8, 2015
    Philip Ball
    Illustration by Ellen Weinstein

    Scientists are slowly scaling up quantum effects from atomic to human size.

    It’s time we thought again about quantum theory. There’s nothing actually wrong with the theory itself—it works fantastically well for understanding how atoms and subatomic particles behave.

    The problem is how we talk about quantum theory. We keep insisting that it’s weird: waves becoming particles, things being in two places (or two states) at once, spooky action at a distance, that sort of thing. Isn’t it perverse to clothe in mystery a theory that scientists use routinely to understand the world?

    Part of the issue is that everyday objects are discrete, localized, and unambiguous, and so, very different to quantum objects. But why is that the case? Why is our everyday world always “this or that” and never “this and that”? Why, as things get bigger, does quantum physics turn into classical physics, governed by laws like those that [Sir] Isaac Newton wrote down over three centuries ago?

    This switch is called the quantum-classical transition, and it has puzzled scientists for many decades. We still don’t completely understand it. But over the past two or so decades, new experimental techniques have pushed the transition to ever-larger sizes. Most scientists agree that technical difficulties will prevent us from ever putting a basketball, or even a human, in two places at once. But an emerging understanding of the quantum-classical transition also suggests that there is nothing in principle that prohibits it—no cosmic censorship separates our “normal” world from the “weird” world that lurks beneath it. In other words, the quantum world may not be so weird after all.

    Life-And-Death physics: If a quantum event determined whether a cat in a box were killed, would it be both alive and dead? Jie Qi / Flickr

    Imagine a broken drying machine that spits out pairs of unmatched socks. They come in complementary contrasts: if one is red, the other is green. Or, if one is white, the other is black, and so on. We don’t know which of these options we’ll get until we look—but we do know that if we find one is red, we can be sure the other is green. Whatever the actual colors are, they are correlated with one another.

    Now imagine the quantum mechanical version of this same machine. According to the Copenhagen interpretation of quantum mechanics developed in the mid-1920s by Niels Bohr, Werner Heisenberg, and collaborators, quantum socks in a correlated state (where the color of one is linked to the color of the other) don’t actually have any fixed colors until we look. The very act of looking at one quantum sock determines the color of the other. If we look in one way, the first sock might be red (and the other therefore green). If we look in another, the first is white (and the other black).
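    The correlated-sock picture maps directly onto a two-qubit Bell state, and a few lines of linear algebra can check its behavior. A minimal sketch (the red/green encoding is my own labeling for illustration, not from the article):

```python
import numpy as np

# Encode the sock pair as two qubits: |0> = red (sock A) / green (sock B),
# |1> = white (sock A) / black (sock B). The correlated pair is the Bell state
# (|00> + |11>)/sqrt(2): neither sock has a definite color on its own.
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)  # amplitudes for |00>,|01>,|10>,|11>
rho = np.outer(bell, bell)  # density matrix of the pair

# Looking "one way" (the computational basis): do the outcomes agree?
p_match = rho[0, 0] + rho[3, 3]      # <00|rho|00> + <11|rho|11>
p_mismatch = rho[1, 1] + rho[2, 2]   # <01|rho|01> + <10|rho|10>

# Looking "another way": rotate both socks with a Hadamard before measuring.
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)
rho_rot = np.kron(H, H) @ rho @ np.kron(H, H).T
p_match_rot = rho_rot[0, 0] + rho_rot[3, 3]

print(p_match)      # 1.0: the paired outcomes always agree
print(p_mismatch)   # 0.0: never red paired with black
print(p_match_rot)  # 1.0: perfectly correlated in the rotated basis too
```

    Whichever basis you measure in, the two outcomes are perfectly correlated, yet neither qubit alone carries a definite value beforehand.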

    Crudely, you could say that in these correlated pairs the colors of the socks are characteristics that extend well beyond the socks themselves. The color of a given sock is not local, that is, not contained in the properties of just that one sock. The two colors are said to be entangled with each other.

    The physicist Erwin Schrödinger described entanglement as the key to quantum behavior, and used it to construct a famous paradox. It begins with an unfortunate cat that Schrödinger imagined trapped inside a box, into which a lethal poison was released by the outcome of some quantum event. Because the event was quantum, it could be in what physicists call a superposition state: both triggering the poison release, and not triggering it.

    These superpositions are not unusual for tiny objects like atoms that are firmly in the quantum realm. But, because Schrödinger entangled the event with a large cat, the result is the paradoxical conclusion that the cat is both killed and not killed.

    The conventional resolution to the paradox was to claim that making a measurement on a superposition state, like the live–dead cat, forces a choice, so that the superposition collapses the cat—indeed, in effect the whole universe—into one state or another: The cat is either dead or alive, but not both. In that view, we can never see the live–dead cat.

    But what was the state of the cat before we looked? According to the Copenhagen interpretation, the question has no meaning. Reality, it maintains, is what we can observe and measure, and it makes no sense to wonder about what things are really like before we make those observations.

    Others, most prominently Albert Einstein, couldn’t accept this. They stuck with the classical “realist” view, which says that everything has particular, objective properties, whether we look or not. Einstein and two young colleagues, Boris Podolsky and Nathan Rosen, came up with a version of the “quantum drying machine” thought experiment to try to demonstrate how quantum theory led to a paradox, in which a measurement in one place instantly affected an object in another place. But in the 1980s, measurements of laser photons showed that entanglement really does work that way—not because of “faster-than-light” communication, but because quantum properties can be genuinely non-local, spread over more than one particle.

    Since then, experimentalists have been working on building ever-larger quantum objects, which are big compared with atoms but small compared with real cats. They are often called “Schrödinger’s kittens,” and they are rapidly growing up.

    A Schrödinger Kitten: The beam in the center of this shape (surrounded with a dotted red line) can be made to vibrate in two different ways at the same time, an example of quantum superposition. Oskar Painter

    One key to these kittens becoming cats has been learning how to maintain quantum coherence, or roughly, the ability for the peaks and troughs of wave-like quantum particles to stay synchronized. As a quantum state evolves, it gets entangled with its environment, and quantum coherence can leak away into the surroundings. One might very crudely imagine it to be a little like the way heat in a hot body gets dissipated into a cooler surrounding environment.

    Another way to think of it is to say that information gets increasingly local. The point about quantum systems is that non-local correlations mean you can’t know everything about some part of it by making measurements just on that part. There’s always some residual ignorance. In contrast, once we have established that a sock is red or green, there’s nothing left to be known about what color it is. Wojciech Zurek of Los Alamos National Laboratory in New Mexico has formulated an expression for the ignorance that remains once the state of the measuring apparatus has been determined, which he calls quantum discord. For a classical system, the discord is zero. If it is greater than zero, the system has some quantumness to it.

    Decoherence bleeds away discord. Quantum phenomena are converted to ones that obey classical rules: no superpositions, no entanglement, no non-locality, and a time and a place for everything.
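    A toy model makes that leak concrete: in the simplest treatments, decoherence leaves a qubit’s populations (the diagonal of its density matrix) alone while the off-diagonal coherence terms decay exponentially at some environment-set rate. A sketch, with an arbitrary assumed rate:

```python
import numpy as np

# Toy decoherence model: a qubit starts in the superposition (|0>+|1>)/sqrt(2).
# Coupling to the environment leaves the populations (diagonal) untouched but
# damps the coherences (off-diagonal) as exp(-GAMMA * t).
GAMMA = 1.0  # assumed decoherence rate, arbitrary units

def rho_at(t):
    decay = np.exp(-GAMMA * t)
    return np.array([[0.5, 0.5 * decay],
                     [0.5 * decay, 0.5]])

rho_early = rho_at(0.0)   # full coherence: a genuine superposition
rho_late = rho_at(50.0)   # coherence gone: a classical 50/50 mixture

print(rho_early[0, 1])              # 0.5
print(round(rho_late[0, 1], 6))     # 0.0 -- "this or that", no longer "this and that"
```

    Once the off-diagonal terms vanish, the state is indistinguishable from a classical coin toss: superposition, entanglement, and non-locality are gone.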

    How big, then, can quantum systems get before decoherence starts to destroy their quantumness? We have known that very small particles like electrons can behave as coherent quantum waves ever since the groundbreaking observation of electron interference in the late 1920s. Soon after, the wavelike properties of entire atoms were demonstrated. But it wasn’t until the 1990s, when it became possible to create coherent “matter waves,” that quantum wave interference was observed for atoms and molecules.

    How big can these chunks of matter get while still undergoing interference? In 1999 a team at the University of Vienna led by Anton Zeilinger and Markus Arndt marshaled 60-atom carbon molecules called fullerenes (C60) into a beam, passed it through a grating of slits spaced 100 nanometers apart and made from the ceramic silicon nitride, and detected an interference pattern on the far side. Arndt and his coworkers have now demonstrated that this quantum waviness persists for tailor-made organic molecules containing 430 atoms and up to 6 nanometers across: easily big enough to see in an electron microscope and comparable to the size of small proteins. The interference patterns can be washed out by decoherence: They vanish as the researchers admit gas into the apparatus, increasing the interactions of the molecules with their environment.
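    The scale of the challenge is easy to estimate. A back-of-envelope sketch of the C60 de Broglie wavelength and the resulting fringe spacing, assuming a beam speed of ~200 m/s and a ~1.2 m flight path (plausible orders of magnitude for such experiments, not figures from the article):

```python
# Back-of-envelope de Broglie wavelength for a C60 fullerene.
H = 6.62607015e-34      # Planck constant, J*s
AMU = 1.66053907e-27    # atomic mass unit, kg

m_c60 = 60 * 12.011 * AMU  # mass of a C60 molecule, kg
v = 200.0                  # assumed molecular speed, m/s

lam = H / (m_c60 * v)      # de Broglie wavelength, m
print(f"lambda = {lam*1e12:.1f} pm")  # a few picometers -- far smaller than the molecule itself

# Fringe spacing on a detector a distance L behind a grating of period d:
d = 100e-9   # 100 nm grating period, as in the text
L = 1.2      # assumed grating-to-detector distance, m
fringe = lam * L / d
print(f"fringe spacing = {fringe*1e6:.0f} um")  # tens of micrometers
```

    The wavelength is thousands of times smaller than the molecule, which is why such fine gratings and long flight paths are needed to resolve the pattern.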

    Because this interference depends on the molecules being in superposition states—in effect, each passes through more than one slit at a time—the molecules can be thought of as molecular Schrödinger’s kittens. They’re still very tiny, though, and obviously not alive. Might it be possible to push up the size scale to that at which life becomes possible—for example, to look for interference in “Schrödinger’s viruses?”

    That idea has been proposed by Ignacio Cirac and Oriol Romero-Isart at the Max Planck Institute for Quantum Optics in Garching, Germany. They have outlined an experimental method for preparing superposition states not only for viruses (with sizes of around 100 nanometers or more) but also for extremely hardy microscopic creatures called tardigrades or water bears (which are up to 1 millimeter or so in size). These objects would be levitated in an optical trap made of intense laser-light fields and then coaxed into a superposition of their vibrational states within the trapping force field (like balls rolling back and forth in the bottom of a bowl). Tardigrades have been shown to survive on the outside of spacecraft, and so might withstand the rigors of a high-vacuum experiment like this. So far, however, it’s just a proposal.

    We know already, however, that objects large enough to see with the naked eye can be placed in entangled states. A team led by Ian Walmsley, a physicist at the University of Oxford, achieved this in 2011 using laser pulses to excite entangled quantum vibrations (phonons) in two diamond crystals 3 millimeters wide and 15 centimeters apart. Each phonon involves the coherent vibration of about 10^16 atoms, corresponding to a region of the crystal measuring about 0.05 by 0.25 millimeters. To create the superposition, the researchers first placed a laser photon in an entangled state by using a beam splitter to send it toward either diamond with equal probability. So long as they don’t detect this path, the photon creates an entangled vibration in both crystals. When a phonon is excited, it emits a secondary photon, which the researchers could detect without finding out which crystal it came from. In that case the phonon must be considered non-local, in a sense embracing both diamonds.

    Another way to look at quantum effects in relatively large systems is to study the vibrations of very small springy structures like nanometer-scale cantilevers and other “nanomechanical resonators.” At the scale of molecules, vibrations are quantized: They can only occur at well-defined frequencies, or in mixed superpositions of these allowed quantum states. Nanomechanical resonators are also small and light enough to have, in theory, distinguishable quantized vibration states. An ideal way to read out the vibrational state of the resonating element is to couple its mechanical motion to light, an approach called optomechanics. In its simplest form, this might involve making a chamber in which light can bounce back and forth between mirrors, with one of the mirrors attached to a spring so that it can oscillate.

    Several groups have now demonstrated quantum behavior in such nanoscale optomechanical systems. John Teufel and his coworkers at the National Institute of Standards and Technology in Boulder, Colorado, for example, used a drum-like aluminum membrane 100 nanometers thick and 15 micrometers (μm) wide as the resonator, coupled to a microwave-frequency cavity, while Oskar Painter and colleagues at the California Institute of Technology in Pasadena used a thin silicon beam 15 micrometers long, with a 600 by 100 nanometer cross-section, clamped at both ends. You need a microscope to see those objects, but they’re immense compared with molecules. To ensure that their oscillators stayed in a single, lowest-energy vibrating state, both teams chilled their devices close to absolute zero using cryogenics, and then used laser beams or microwaves to reduce the temperature even further.
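    The need for all that cooling can be read off the Bose–Einstein distribution: a mechanical mode sits near its quantum ground state only when its mean thermal phonon number is far below one. A sketch for an assumed ~10 MHz resonator (order-of-magnitude only, not the exact frequencies of the devices in the article):

```python
import numpy as np

# Why cryogenics: the mean thermal occupation n = 1/(exp(hbar*omega/kT) - 1)
# of a mechanical mode. The quantum ground state requires n << 1.
HBAR = 1.054571817e-34  # reduced Planck constant, J*s
KB = 1.380649e-23       # Boltzmann constant, J/K

def n_thermal(freq_hz, temp_k):
    """Bose-Einstein occupation of a mode at frequency freq_hz and temperature temp_k."""
    x = HBAR * 2 * np.pi * freq_hz / (KB * temp_k)
    return 1.0 / np.expm1(x)

f = 10e6  # assumed ~10 MHz mechanical mode
print(n_thermal(f, 300.0))   # room temperature: ~6e5 thermal phonons
print(n_thermal(f, 0.025))   # 25 mK dilution fridge: still ~50 phonons
```

    Even a dilution refrigerator leaves tens of phonons in such a mode, which is why both teams added laser or microwave cooling on top of the cryogenics.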

    If you want to generate quantum effects such as superpositions and entanglement in these resonators, you need to be able to control their quantum behavior. One way to do this is to couple the resonators to a quantum object whose state can be switched at will, such as a two-state “quantum bit” of the kind being used to build quantum computers. Andrew Cleland of the University of California at Santa Barbara and his coworkers achieved this for a microscopic sheet of aluminum nitride. Others are hoping to prepare oscillators in superposition states and then watch how they decohere as they get entangled with their environment: middle-sized Schrödinger kittens bouncing in the void.

    Ellen Weinstein

    If we could totally suppress decoherence, would that get us all the way to a full-size Schrödinger cat? It might not be that simple. This is because, to know that you’d made one, you’d have to look at it. Sure, the act of entangling a system with a measuring apparatus could itself decohere it—but the problem might be even worse than that. Physicists Johannes Kofler, now at the Max Planck Institute for Quantum Optics in Garching, and Caslav Brukner of the University of Vienna proposed in 2007 that the very act of studying a large quantum system experimentally may induce the emergence of classical behavior even without any decoherence. Measurement itself can turn quantum multiplicity into classical uniqueness.

    This, say Kofler and Brukner, is because measurements can’t be infinitely precise. The argument is often made in textbooks that the limits of experimental resolution prevent us from being able to see quantum discreteness in a macroscopic system: Because the discrete energy states get ever closer as the size of the system increases, they seem to blur into the continuum of energies that we perceive in, say, a moving tennis ball. But that can’t be the only reason why tennis balls are “classical,” because it doesn’t actually eliminate the quantumness of the object—forbidding, for example, a superposition of tennis-ball velocities.

    Kofler and Brukner showed that, when a measurement is “coarse-grained,” so that the resolution is insufficient to distinguish several closely spaced quantum states of a very large system, the quantum-mechanical equations describing how it evolves in time collapse into the classical equations of mechanics devised by Isaac Newton. “We can rigorously show that under the coarse-grained measurements, entanglement or nonlocal features of many-particle states are washed out,” says Brukner. Classical physics emerges from quantum physics when measurement becomes fuzzy, as it always must for “big” systems: ones composed of many particles with many possible states.
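    The effect of coarse-graining can be imitated with a toy interference pattern: a fine-grained detector sees full fringe contrast, while one that bins the same screen into a few pixels sees none. A schematic illustration of the idea, not Kofler and Brukner’s actual calculation:

```python
import numpy as np

# A fine-grained detector resolves interference fringes; binning the same
# signal at coarse resolution averages them away.
x = np.arange(1000) / 1000
fine = 1 + np.cos(2 * np.pi * 50 * x)   # 50 fringes across the screen

# A detector that can only resolve 10 pixels across the same screen:
coarse = fine.reshape(10, 100).mean(axis=1)

print(fine.max() - fine.min())                 # 2.0: full fringe contrast
print(round(coarse.max() - coarse.min(), 6))   # 0.0: fringes averaged away
```

    The coarse detector reports a flat, featureless intensity, exactly what a classical description would predict, even though the underlying signal is fully quantum.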

    The argument is not airtight: It’s possible in principle (though extremely hard in practice) to create exotic situations in which the coarse-graining of measuring some property of the system doesn’t ensure classicality. But Hyunseok Jeong of Seoul National University in South Korea and his collaborators have shown that even here there’s an aspect of measurement that destroys quantum behavior. In addition to some inevitable fuzziness in what we measure, says Jeong, there is also a degree of ambiguity about exactly when and where we measure: what he calls the measurement references. This too has the effect of making a quantum system appear to behave like a classical one.

    Kofler says that decoherence and coarse-graining of measurements offer two complementary routes to the classical world. “If you have sufficiently strong decoherence, you get classicality independent of your measurements,” he says. “And if you have coarse-grained measurement, you get classicality independent of the interaction with the environment.”

    This picture offers a striking resolution of the Schrödinger’s cat puzzle. We could never see it in a live–dead superposition, Brukner says, not because it can’t exist as such, or because of decoherence, but because, well, we just couldn’t actually see it. “Even if somebody would prepare a Schrödinger-cat state in front of us, we would not be able to reveal it as such without having an instrument of sufficient precision.” That’s to say, any measurement we could actually make on the cat wouldn’t show anything that couldn’t equally be explained by a classical picture. Even for the oscillators of optomechanical devices, detecting genuine superposition states will be challenging, involving positional differences of just fractions of an ångström (10^-10 meters). For such reasons, “it is quite challenging to test these ideas in a real experiment,” Jeong admits. Even so, he optimistically adds, “I hope to see my idea be tested in a laboratory in the near future.”

    There are other arguments, too, for why decoherence isn’t the whole explanation for the quantum-classical transition. In the 1980s and 1990s the eminent mathematical physicist Roger Penrose, and independently the Hungarian physicist Lajos Diósi, suggested that quantum behavior of mechanical systems might also be disrupted by gravity. If that’s so, it means that classical behavior is bound to manifest itself at a certain mass limit even if you could entirely suppress decoherence—because there is never any hiding from gravity. When one object “feels” the position of the other via gravity, it amounts to a kind of measurement that can destroy the quantum coherence.

    Some researchers, such as Markus Aspelmeyer at the University of Vienna and Dirk Bouwmeester at the University of California at Santa Barbara, are hoping to test this type of decoherence using optomechanics. Among the proposals, Aspelmeyer and colleagues want to conduct an experiment called MAQRO on a space satellite in zero gravity, where they could very sensitively probe matter-wave interference of particles about 100 nanometers across (huge in quantum terms) as they undergo free fall. Some theories, such as the gravitational-collapse idea of Penrose and Diósi, predict that for large enough particles the interference should vanish.

    Very recently, physicist Roman Schnabel of the University of Hamburg outlined another experimental test of gravity-induced decoherence. It would involve two large mirrors, weighing 100 grams each and attached to springs that let them oscillate, that become entangled with light beams bouncing between them, so that entanglement in the light (which is relatively easy to arrange) can be converted into entanglement of the two mirrors. By switching off the light and watching how the mirrors’ oscillations evolve over the ensuing microseconds, it would be possible to look for quantum correlations between them, and to search for deviations of the decoherence rate beyond that predicted by standard quantum theory owing to gravitational effects.

    There’s no doubt that strictly quantum-mechanical effects can be seen at the macroscale: Both superfluidity, when an ultracold fluid flows with no viscosity, and superconductivity, when a material carries an electrical current without resistance, are examples of that. And in a sense pretty much everything we experience, from vision to the solidity of objects, depends on effects that only quantum physics can explain.

    But what seem to us to be the real peculiarities of quantum physics (entanglement and superpositions, or in other words retaining quantum discord) are another matter. There is a chance we may not need to scale these effects up to large sizes to see them: The human eye can register just three or so photons, and physicists at the University of Illinois, Urbana-Champaign, are hoping to find out how the brain responds to photons in a superposition or entangled state. Some researchers have argued that such a superposition could persist in the nerve signal sent from the retina to the brain, so that fleeting “perceptual superpositions” are possible.

    Still, engineering entanglement and superposition into macroscopically large systems remains an important goal, even if it’s a distant one. Putting large systems in Schrödinger’s cat states isn’t just a question of seeing whether curiosity really does kill/not kill the cat. There would be practical benefits too: Quantum computers, which use quantum effects to give a huge boost to processing power, will need to achieve the entanglement and superpositions of large numbers of quantum bits to be practical. So understanding how decoherence kicks in as the scale increases, and finding ways to suppress it, is one of the keys to a viable quantum information technology.

    More and more, though, physicists seem to be concluding that the roadblocks to real-life Schrödinger cats are technical, not fundamental. For now, that distinction might not matter much, because of the limits on what an experiment can realistically attain. “I think, it is practically impossible to completely suppress decoherence of macroscopic superpositions or entanglement,” says Jeong. “And even if you could, another enemy—coarsening of measurements—might be waiting to kill macroscopic quantum superpositions.” But he thinks that, if we were ever to develop instruments fine enough, and systems isolated enough, there’s no reason to suppose that quantum effects wouldn’t survive to human-size scales. So far, nothing we have discovered about objects in the middle ground between micro and macro contradicts that view.

    For 2,000 years we have assumed that Plato’s common-sense view in the Republic applies to our tangible world: “The same thing cannot ever act or be acted upon in two opposite ways, or be two opposite things, at the same time.” Now we’re not so sure. With Schrödinger’s kittens growing up, weird isn’t what it used to be.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Welcome to Nautilus. We are delighted you joined us. We are here to tell you about science and its endless connections to our lives. Each month we choose a single topic. And each Thursday we publish a new chapter on that topic online. Each issue combines the sciences, culture and philosophy into a single story told by the world’s leading thinkers and writers. We follow the story wherever it leads us. Read our essays, investigative reports, and blogs. Fiction, too. Take in our games, videos, and graphic stories. Stop in for a minute, or an hour. Nautilus lets science spill over its usual borders. We are science, connected.

  • richardmitnick 9:43 am on October 2, 2015
    Tags: Heisenberg's Uncertainty Principle, Quantum Mechanics

    From AAAS: “Physicists observe weird quantum fluctuations of empty space—maybe” 



    1 October 2015
    Adrian Cho

    The setup in which a long “pump” light pulse (red) changes the polarization of a short “probe” light pulse (green) also serves to measure the effect of vacuum fluctuations—just by turning off the pump beam. ADAPTED FROM C. RIEK ET AL., SCIENCE (2015)

    Empty space is anything but, according to quantum mechanics: Instead, it roils with quantum particles flitting in and out of existence. Now, a team of physicists claims it has measured those fluctuations directly, without disturbing or amplifying them. However, others say it’s unclear exactly what the new experiment measures—which may be fitting for a phenomenon that originates in quantum mechanics’ famous uncertainty principle.

    “There are many experiments that have observed indirect effects of vacuum fluctuations,” says Diego Dalvit, a theorist at Los Alamos National Laboratory in New Mexico who was not involved in the current work. “If this [new experiment] is correct, it would be the first direct observation of the field [of fluctuations] itself.”

    Thanks to the [Heisenberg] uncertainty principle, the vacuum buzzes with particle-antiparticle pairs popping in and out of existence. They include, among many others, electron-positron pairs and pairs of photons, which are their own antiparticles. Ordinarily, those “virtual” particles cannot be directly captured. But like some spooky Greek chorus, they exert subtle influences on the “real” world.

    For example, the virtual photons flitting in and out of existence produce a randomly fluctuating electric field. In 1947, physicists found that the field shifts the energy levels of an electron inside a hydrogen atom and hence the spectrum of radiation the atom emits. A year later, Dutch theorist Hendrik Casimir predicted that the field would also exert a subtle force on two closely spaced metal plates, squeezing them together. That’s because the electric field must vanish on the plates’ surfaces, so only certain wavelike ripples of the electric field can fit between the plates. In contrast, more ripples can push on the plates from the outside, exerting a net force. The Casimir effect was observed in 1997.
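    Casimir’s prediction has a closed form for ideal parallel plates, P = π²ħc / (240 d⁴), which shows how steeply the vacuum’s push grows at small separations. A quick evaluation:

```python
import math

# Magnitude of the Casimir pressure between ideal, perfectly conducting
# parallel plates: P = (pi^2 / 240) * hbar * c / d^4.
HBAR = 1.054571817e-34  # reduced Planck constant, J*s
C = 2.99792458e8        # speed of light, m/s

def casimir_pressure(d):
    """Attractive pressure (Pa) between ideal plates separated by d meters."""
    return math.pi**2 * HBAR * C / (240 * d**4)

# At 1 micrometer the net push of the "missing" vacuum modes is tiny...
print(casimir_pressure(1e-6))   # ~1.3e-3 Pa
# ...but it grows as 1/d^4, so at 10 nm it is roughly atmospheric pressure:
print(casimir_pressure(10e-9))  # ~1.3e5 Pa
```

    That steep 1/d⁴ scaling is why the effect went unmeasured until 1997: it only becomes appreciable when the plates are brought within nanometers to micrometers of each other.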

    But now, Claudius Riek, Alfred Leitenstorfer, and colleagues at the University of Konstanz in Germany say they have directly observed those electric field fluctuations by charting their influence on a light wave. The experiment riffs on a technique they developed to study a longer light pulse with a much shorter one by shooting them simultaneously through a crystal (see diagram). The longer “pump” pulse is polarized horizontally, meaning that the electric field in it oscillates sideways. The shorter “probe” pulse starts out polarized vertically. However, the properties of the crystal depend on the electric field in it, so the pump beam causes the polarization of the probe beam to change and emerge from the crystal tracing an elliptical pattern. By adjusting the timing of the pulses, researchers can use the polarization effect to map out the wiggles in the electric field in the pump wave.

    But vacuum fluctuations themselves will affect the crystal and hence the polarization of the probe pulse, Leitenstorfer says. So to measure the fluctuations of the vacuum field, “we only put in the probe pulse, nothing else.” On average the polarization of the lone probe pulse remained vertical. But over many repeated trials, it varied slightly, and that noise was the sign of the vacuum fluctuations, the team says.

    Spotting the effect is no mean feat, as the polarization also varies because of random variation in the number of photons in each pulse, or “shot noise.” To tease the two apart, the physicists vary the duration and width of the pulse, but not the number of photons in it. The shot noise should stay constant, whereas the noise from quantum fluctuations should shrink as the pulses become bigger. The researchers saw a change of a few percent in the noise, an effect they attribute to vacuum fluctuations.

    Some physicists question what the new experiment actually measures, however. The researchers assume that fluctuating optical properties of the crystal reflect the vacuum fluctuations, says Steve Lamoreaux, a physicist at Yale University and one of the first to observe the Casimir effect. But the variations in the crystal’s optical properties could have some other source, such as thermal fluctuations, he says. “The material properties will fluctuate on their own,” he says, so “how does one attribute these fluctuations to the vacuum alone?”

    Moreover, Leitenstorfer’s group is not the first to directly probe such fluctuations. In 2011, Christopher Wilson, a physicist now at the University of Waterloo in Canada, and colleagues reported in Nature that they had pumped up vacuum fluctuations and turned them into real photons. In principle, that can be done by accelerating a mirror back and forth at near light speed. Wilson used a more practical analog: a system in which the effective length of a small superconducting cavity could be changed electronically. Leitenstorfer notes that the new experiment differs from Wilson’s in that it does not require amplifying the fluctuations. Wilson responds, “While I agree that that’s a difference, I don’t think that it’s fundamental.”

    Leitenstorfer contends that the new work makes a qualitative advance over previous efforts. “We clearly have gone a significant step further in comparison to anybody else by directly measuring the electric field amplitude of the vacuum as it fluctuates in space and time,” he says. Others seem less certain about that.

    See the full article here.

    The American Association for the Advancement of Science is an international non-profit organization dedicated to advancing science for the benefit of all people.


  • richardmitnick 2:31 pm on September 27, 2015
    Tags: Quantum Mechanics

    From NPR: “Quantum Physics And The Need For A New Paradigm” 


    National Public Radio (NPR)

    September 27, 2015
    Ruth E. Kastner


    Quantum physics, celebrated for its predictive success, has also become notorious for being an inscrutable mass of paradoxes.

    One of the founders of the theory, Niels Bohr, stated that “those who are not shocked when they first come across quantum theory cannot possibly have understood it.” Nobel laureate Richard Feynman said, “I think I can safely say that nobody understands quantum mechanics.”

    The shocking aspects of quantum theory can be summarized by three issues: uncertainty, nonlocality, and the measurement problem (or the problem of Schrödinger’s Cat).

    The first issue consists in the fact that the tiny objects described by quantum theory, such as the constituents of atoms — protons and electrons, for example — cannot be pinned down to definite locations and speeds at the same time. If one of these properties is definite, the other must be in a quantum superposition, a kind of “fuzziness” that we never see in the ordinary macroscopic world of experience.
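    The trade-off is quantitative: Δx·Δp ≥ ħ/2, with equality for a Gaussian wavepacket. A numerical sketch of that minimum-uncertainty case (units with ħ = 1; the packet width is an arbitrary choice for illustration):

```python
import numpy as np

# Numerical check of the Heisenberg bound Delta_x * Delta_p >= hbar/2 for a
# Gaussian wavepacket, which saturates it. Units chosen so that hbar = 1.
HBAR = 1.0
sigma = 0.7  # assumed position spread of the packet

x = np.linspace(-20, 20, 4001)
dx = x[1] - x[0]
psi = (2 * np.pi * sigma**2) ** -0.25 * np.exp(-x**2 / (4 * sigma**2))

# Position uncertainty from the probability density |psi|^2 (mean is zero):
prob = np.abs(psi) ** 2
delta_x = np.sqrt(np.sum(x**2 * prob) * dx)

# Momentum uncertainty via the operator -i*hbar d/dx (here <p> = 0, psi real):
dpsi = np.gradient(psi, dx)
d2psi = np.gradient(dpsi, dx)
delta_p = np.sqrt(-HBAR**2 * np.sum(psi * d2psi) * dx)

print(round(delta_x * delta_p, 3))  # ~0.5 = hbar/2: the smallest value allowed
```

    Squeezing the packet in position (smaller sigma) inflates its momentum spread, and vice versa; no state can shrink the product below ħ/2.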

    The second issue arises in certain kinds of composite systems, such as pairs of electrons, in a so-called “entangled” state. If you send two such electrons off to the opposite ends of the galaxy, quantum physics tells us that they are still somehow in direct communication, such that the result of a measurement performed on one of them is instantly known to the other. This seems to be in conflict with another very successful theory, [Albert] Einstein’s theory of relativity, which tells us that no signal can be transferred faster than the speed of light.

    The third issue comes from Erwin Schrödinger’s observation that quantum physics seems to tell us that measuring instruments become “entangled” with the quantum objects they are measuring in a way that dictates that even macroscopic objects, like cats, inherit the “fuzziness” of the quantum world. In this case, the famous unfortunate cat seemingly ends up in a superposition of “alive and dead” based on the superposition of a radioactive atom in an uncertain state of “decayed and undecayed.”

    Schrödinger’s cat: a cat, a flask of poison, and a radioactive source are placed in a sealed box. If an internal monitor detects radioactivity (i.e., a single atom decaying), the flask is shattered, releasing the poison that kills the cat. The Copenhagen interpretation of quantum mechanics implies that after a while, the cat is simultaneously alive and dead. Yet, when one looks in the box, one sees the cat either alive or dead, not both alive and dead. This poses the question of when exactly quantum superposition ends and reality collapses into one possibility or the other.

    It may come as a surprise to learn that there is a way to make sense of all three of these seemingly paradoxical features of quantum mechanics. However, there is, of course, a price to pay for that solution: a paradigm change as startling as the one that accompanied Einstein’s theory of relativity — which told us, despite our intuitions, that there is no such thing as absolute space or time. Quantum physics requires that we “think outside the box,” and that box turns out to be space-time itself. The message of quantum physics is that not only is there no absolute space or time, but that reality extends beyond space-time. Metaphorically speaking, space-time is just the “tip of the iceberg”: Below the surface is a vast, unseen world of possibility. And it is that vast, unseen world that is described by quantum physics.

    This is not a wholly new idea: Another founder of quantum theory, Werner Heisenberg, stated that a quantum object is “something standing in the middle between the idea of an event and the actual event, a strange kind of physical reality just in the middle between possibility and reality.” Heisenberg called this potentia, a concept originally introduced by the ancient Greek philosopher Aristotle. It turns out that if we apply Heisenberg’s insight to an intriguing interpretation of quantum theory called the Transactional Interpretation (TI), we gain a unified understanding of all three paradoxical aspects of quantum theory.

    TI was originally proposed by John G. Cramer, professor emeritus at the University of Washington. Its key feature is that the process of absorption of a quantum state is just as important as the process of emission of a quantum state. This symmetry is nicely consistent with relativistic quantum theory, in which quantum states are both created and destroyed. But it comes with a counterintuitive feature: The absorption (or destruction) process involves quantum states with negative energy. For this reason, TI has generally been neglected by the mainstream physics community.

    However, it turns out that if you include this “response of the absorber,” you get a solution to the so-called “measurement problem” — the problem of Schrödinger’s Cat. A clear physical account can be given for why the cat does not end up in a “fuzzy” superposition of alive and dead. We even get a natural explanation for the rule used to calculate the probabilities of measurement outcomes (the so-called “Born Rule” after its inventor, Max Born).
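    For reference, the Born Rule states that the probability of obtaining outcome $i$ is the squared magnitude of that outcome's amplitude $c_i$ in the quantum state (standard notation, not specific to TI):

```latex
|\psi\rangle \;=\; \sum_i c_i\,|i\rangle, \qquad P(i) \;=\; |c_i|^{2}
```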

    In TI, the “collapse of the quantum state” is called a transaction, because it involves an “offer” from the emitter and a “confirmation” from the absorber, much like the negotiation in a financial transaction. When these occur, we get a “measurement,” and that allows us to define what a measurement is — and explains why we never see things like cats in quantum superpositions. But, in the new development of TI, the offers and confirmations are only possibilities — they are outside the realm of ordinary space-time. In fact, it is the transactional process that creates space-time events: “Collapse” is the crystallizing of the possibilities of the quantum realm into the concrete actualities of the space-time realm. So, collapse is not something that happens anywhere in space-time. It is the creation of space-time itself.

    The preceding is just the barest introduction to this new, updated version of TI that I call “possibilist TI” or PTI. (The details appear in peer-reviewed publications and in my books.) But if we accept the idea that quantum physics is describing possibilities that exist beyond space-time, then it can begin to make sense that those possibilities are “fuzzier” than the objects we experience in space-time, and that their correlations are not subject to the relativistic “speed limit,” which applies only to the space-time realm. And we gain a clear account of measurement that explains why Schrödinger’s Cat is never alive and dead at the same time.

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition


  • richardmitnick 11:47 am on September 11, 2015 Permalink | Reply
    Tags: , , Mott transition, , Quantum Mechanics,   

    From phys.org: “Team announces breakthrough observation of Mott transition in a superconductor” 


    September 11, 2015
    Joost Bruysters


    An international team of researchers, including the MESA+ Institute for Nanotechnology at the University of Twente in The Netherlands and the U.S. Department of Energy’s Argonne National Laboratory, announced today in Science the observation of a dynamic Mott transition in a superconductor.

    The discovery experimentally connects the worlds of classical and quantum mechanics and illuminates the mysterious nature of the Mott transition. It also could shed light on non-equilibrium physics, which is poorly understood but governs most of what occurs in our world. The finding may also represent a step towards more efficient electronics based on the Mott transition.

    Since its foundations were laid in the early part of the 20th century, scientists have been trying to reconcile quantum mechanics with the rules of classical or Newtonian physics (like how you describe the path of an apple thrown into the air—or dropped from a tree). Physicists have made strides in linking the two approaches, but experiments that connect the two are still few and far between; physics phenomena are usually classified as either quantum or classical, but not both.

    One system that unites the two is found in superconductors, certain materials that conduct electricity perfectly when cooled to very low temperatures. Magnetic fields penetrate the superconducting material in the form of tiny filaments called vortices, which control the electronic and magnetic properties of the materials.

    These vortices display both classical and quantum properties, which led researchers to study them for access to one of the most enigmatic phenomena of modern condensed matter physics: the Mott insulator-to-metal transition.

    The Mott transition occurs in certain materials that according to textbook quantum mechanics should be metals, but in reality turn out to be insulators. A complex phenomenon controlled by the interactions of many quantum particles, the Mott transition remains mysterious—even whether it is a classical or a quantum phenomenon is not quite clear. Moreover, scientists have never directly observed a dynamic Mott transition, in which a phase transition from an insulating to a metallic state is induced by driving an electrical current through the system; the disorder inherent in real systems disguises Mott properties.

    At the University of Twente, researchers built a system containing 90,000 superconducting niobium nano-sized islands on top of a gold film. In this configuration, the vortices find it energetically easiest to settle into dimples arranged like an egg crate—making the material act as a Mott insulator, since the vortices won’t move if the applied electric current is small.


    When they applied a large enough electric current, however, the scientists saw a dynamic Mott transition as the system flipped to become a conducting metal; the properties of the material had changed as the current pushed it out of equilibrium.

    The vortex system behaved exactly like an electronic Mott transition driven by temperature, said Valerii Vinokur, an Argonne Distinguished Fellow and corresponding author on the study. He and study co-author Tatyana Baturina, then at Argonne, analyzed the data and recognized the Mott behavior.

    “This experimentally materializes the correspondence between quantum and classical physics,” Vinokur said. “We can controllably induce a phase transition between a state of locked vortices to itinerant vortices by applying an electric current to the system,” said Hans Hilgenkamp, head of the University of Twente research group. “Studying these phase transitions in our artificial systems is interesting in its own right, but may also provide further insight in the electronic transitions in real materials.”

    The system could further provide scientists with insight into two categories of physics that have been hard to understand: many-body systems and out-of-equilibrium systems.

    “This is a classical system that is easy to experiment with and provides what looks like access to very complicated many-body systems,” said Vinokur. “It looks a bit like magic.”

    As the name implies, many-body problems involve a large number of particles interacting; with current theory they are very difficult to model or understand.


    “Furthermore, this system will be key to building a general understanding of out-of-equilibrium physics, which would be a major breakthrough in physics,” Vinokur said.

    The Department of Energy named five great basic energy scientific challenges of our time; one of them is understanding and controlling out-of-equilibrium phenomena. Equilibrium systems—where there’s no energy moving around—are now understood quite well. But nearly everything in our lives involves energy flow, from photosynthesis to digestion to tropical cyclones, and we don’t yet have the physics to describe it well. Scientists think a better understanding could lead to huge improvements in energy capture, batteries and energy storage, electronics and more.

    As we seek to make electronics faster and smaller, Mott systems also offer a possible alternative to the silicon transistor. Since they can be flipped between conducting and insulating with small changes in voltage, they may be able to encode 1s and 0s at smaller scales and higher accuracy than silicon transistors.

    “Initially, we were studying the structures for completely different reasons, namely to investigate the effects of inhomogeneities on superconductivity,” Hilgenkamp said. “After discussing with Valerii Vinokur at Argonne, we looked more specifically into our data and were quite amazed to see that it revealed so nicely the details of the transition between the state of locked and moving vortices. There are many ideas for follow up studies, and we look forward to our continued collaboration.”

    The results were published today in Science in the study Critical behavior at a dynamic vortex insulator-to-metal transition. Other co-authors are associated with the Siberian Branch of Russian Academy of Science, the Rome International Center for Materials Science Superstripes, Novosibirsk State University, the Moscow Institute of Physics and Technology and Queen Mary University of London.

    See the full article here.


    About Phys.org in 100 Words

    Phys.org™ (formerly Physorg.com) is a leading web-based science, research and technology news service which covers a full range of topics. These include physics, earth science, medicine, nanotechnology, electronics, space, biology, chemistry, computer sciences, engineering, mathematics and other sciences and technologies. Launched in 2004, Phys.org’s readership has grown steadily to include 1.75 million scientists, researchers, and engineers every month. Phys.org publishes approximately 100 quality articles every day, offering some of the most comprehensive coverage of sci-tech developments world-wide. Quancast 2009 includes Phys.org in its list of the Global Top 2,000 Websites. Phys.org community members enjoy access to many personalized features such as social networking, a personal home page set-up, RSS/XML feeds, article comments and ranking, the ability to save favorite articles, a daily newsletter, and other options.

  • richardmitnick 3:10 pm on August 31, 2015 Permalink | Reply
    Tags: , , , Quantum Mechanics   

    From Caltech: “Seeing Quantum Motion” 

    Caltech Logo

    Jessica Stoller-Conrad

    Credit: Chan Lei and Keith Schwab/Caltech

    Consider the pendulum of a grandfather clock. If you forget to wind it, you will eventually find the pendulum at rest, unmoving. However, this simple observation is only valid at the level of classical physics—the laws and principles that appear to explain the physics of relatively large objects at human scale. But quantum mechanics, the underlying physical rules that govern the fundamental behavior of matter and light at the atomic scale, states that nothing can ever be completely at rest.

    For the first time, a team of Caltech researchers and collaborators has found a way to observe—and control—this quantum motion of an object that is large enough to see. Their results are published in the August 27 online issue of the journal Science.

    Researchers have known for years that in classical physics, physical objects indeed can be motionless. Drop a ball into a bowl, and it will roll back and forth a few times. Eventually, however, this motion will be overcome by other forces (such as gravity and friction), and the ball will come to a stop at the bottom of the bowl.

    “In the past couple of years, my group and a couple of other groups around the world have learned how to cool the motion of a small micrometer-scale object to produce this state at the bottom, or the quantum ground state,” says Keith Schwab, a Caltech professor of applied physics, who led the study. “But we know that even at the quantum ground state, at zero-temperature, very small amplitude fluctuations—or noise—remain.”

    Because this quantum motion, or noise, is theoretically an intrinsic part of the motion of all objects, Schwab and his colleagues designed a device that would allow them to observe this noise and then manipulate it.

    The micrometer-scale device consists of a flexible aluminum plate that sits atop a silicon substrate. The plate is coupled to a superconducting electrical circuit and vibrates 3.5 million times per second. According to the laws of classical mechanics, the vibrating structure eventually should come to a complete rest if cooled to the ground state.

    But that is not what Schwab and his colleagues observed when they actually cooled the spring to the ground state in their experiments. Instead, the residual energy—quantum noise—remained.

    “This energy is part of the quantum description of nature—you just can’t get it out,” says Schwab. “We all know quantum mechanics explains precisely why electrons behave weirdly. Here, we’re applying quantum physics to something that is relatively big, a device that you can see under an optical microscope, and we’re seeing the quantum effects in a trillion atoms instead of just one.”
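    The residual energy Schwab describes is the zero-point energy of a quantum harmonic oscillator (a textbook result, added here for reference): even in the ground state $n = 0$, an oscillator of angular frequency $\omega$ retains half a quantum of energy,

```latex
E_n \;=\; \hbar\omega\left(n + \tfrac{1}{2}\right), \qquad E_0 \;=\; \tfrac{1}{2}\hbar\omega
```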

    Because this noisy quantum motion is always present and cannot be removed, it places a fundamental limit on how precisely one can measure the position of an object.

    But that limit, Schwab and his colleagues discovered, is not insurmountable. Coauthors Aashish Clerk from McGill University and Florian Marquardt from the Max Planck Institute for the Science of Light proposed a novel method to manipulate the inherent quantum noise, which was expected to reduce it periodically. This technique was then implemented on a micron-scale mechanical device in Schwab’s low-temperature laboratory at Caltech.

    “There are two main variables that describe the noise or movement,” Schwab explains. “We showed that we can actually make the fluctuations of one of the variables smaller—at the expense of making the quantum fluctuations of the other variable larger. That is what’s called a quantum squeezed state; we squeezed the noise down in one place, but because of the squeezing, the noise has to squirt out in other places. But as long as those more noisy places aren’t where you’re obtaining a measurement, it doesn’t matter.”
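    The two variables Schwab mentions are the quadratures $X_1$ and $X_2$ of the motion, and their fluctuations obey an uncertainty relation (in one common dimensionless normalization; a textbook form, not from the article). Squeezing shrinks one factor at the expense of the other while respecting the bound:

```latex
\Delta X_1 \,\Delta X_2 \;\geq\; \tfrac{1}{4}
```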

    The ability to control quantum noise could one day be used to improve the precision of very sensitive measurements, such as those obtained by LIGO, the Laser Interferometry Gravitational-wave Observatory, a Caltech-and-MIT-led project searching for signs of gravitational waves, ripples in the fabric of space-time.

    Caltech Ligo

    “We’ve been thinking a lot about using these methods to detect gravitational waves from pulsars—incredibly dense stars that are the mass of our sun compressed into a 10 km radius and spin at 10 to 100 times a second,” Schwab says. “In the 1970s, Kip Thorne [Caltech’s Richard P. Feynman Professor of Theoretical Physics, Emeritus] and others wrote papers saying that these pulsars should be emitting gravity waves that are nearly perfectly periodic, so we’re thinking hard about how to use these techniques on a gram-scale object to reduce quantum noise in detectors, thus increasing the sensitivity to pick up on those gravity waves.”

    In order to do that, the current device would have to be scaled up. “Our work aims to detect quantum mechanics at bigger and bigger scales, and one day, our hope is that this will eventually start touching on something as big as gravitational waves,” he says.

    These results were published in an article titled, Quantum squeezing of motion in a mechanical resonator. In addition to Schwab, Clerk, and Marquardt, other coauthors include former graduate student Emma E. Wollman (PhD ’15); graduate students Chan U. Lei and Ari J. Weinstein; former postdoctoral scholar Junho Suh; and Andreas Kronwald of Friedrich-Alexander-Universität in Erlangen, Germany. The work was funded by the National Science Foundation (NSF), the Defense Advanced Research Projects Agency, and the Institute for Quantum Information and Matter, an NSF Physics Frontiers Center that also has support from the Gordon and Betty Moore Foundation.

    See the full article here.


    The California Institute of Technology (commonly referred to as Caltech) is a private research university located in Pasadena, California, United States. Caltech has six academic divisions with strong emphases on science and engineering. Its 124-acre (50 ha) primary campus is located approximately 11 mi (18 km) northeast of downtown Los Angeles. “The mission of the California Institute of Technology is to expand human knowledge and benefit society through research integrated with education. We investigate the most challenging, fundamental problems in science and technology in a singularly collegial, interdisciplinary atmosphere, while educating outstanding students to become creative members of society.”
    Caltech buildings

  • richardmitnick 10:28 am on August 31, 2015 Permalink | Reply
    Tags: , , Quantum Mechanics   

    From Forbes: “What Has Quantum Mechanics Ever Done For Us?” 


    Forbes Magazine

    Aug 13, 2015
    Chad Orzel

    Intel Corp. CEO Paul Otellini shows off chips on a wafer built on so-called 22-nanometer technology at the Intel Developers’ Forum in San Francisco, Tuesday, Sept. 22, 2009. Those chips are still being developed in Intel’s factories and won’t go into production until 2011. Each chip on the silicon “wafer” Otellini showed off has 2.9 billion transistors. (AP Photo/Paul Sakuma)

    In a different corner of the social media universe, someone left comments on a link to Tuesday’s post about quantum randomness declaring that they weren’t aware of any practical applications of quantum physics. There’s a kind of Life of Brian absurdity to posting this on the Internet, which is a giant world-spanning, life-changing practical application of quantum mechanics. But just to make things a little clearer, here’s a quick look at some of the myriad everyday things that depend on quantum physics for their operation.

    Computers and Smartphones

    At bottom, the entire computer industry is built on quantum mechanics. Modern semiconductor-based electronics rely on the band structure of solid objects. This is fundamentally a quantum phenomenon, depending on the wave nature of electrons, and because we understand that wave nature, we can manipulate the electrical properties of silicon. Mixing in just a tiny fraction of the right other elements changes the band structure and thus the conductivity; we know exactly what to add and how much to use thanks to our detailed understanding of the quantum nature of matter.

    Stacking up layers of silicon doped with different elements allows us to make transistors on the nanometer scale. Millions of these packed together in a single block of material make the computer chips that power all the technological gadgets that are so central to modern life. Desktops, laptops, tablets, smartphones, even small household appliances and kids’ toys are driven by computer chips that simply would not be possible to make without our modern understanding of quantum physics.

    Green LED lights and rows of fibre optic cables are seen feeding into a computer server inside a comms room at an office in London, U.K., on Tuesday, Dec. 23, 2014. Photographer: Simon Dawson/Bloomberg

    Unless my grumpy correspondent was posting from the exact server hosting the comment files (which would be really creepy), odds are very good that comment took a path to me that also relies on quantum physics, specifically fiber optic telecommunications. The fibers themselves are pretty classical, but the light sources used to send messages down the fiber optic cables are lasers, which are quantum devices.

    The key physics of the laser is contained in a 1917 paper [Albert] Einstein wrote on the statistics of photons (though the term “photon” was coined later) and their interaction with atoms. This introduces the idea of stimulated emission, where an atom in a high-energy state encountering a photon of the right wavelength is induced to emit a second photon identical to the first. This process is responsible for two of the letters in the word “laser,” originally an acronym for “Light Amplification by Stimulated Emission of Radiation.”
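    Einstein's treatment can be summarized in a rate equation (a textbook sketch, not from the article): an excited-state population $N_2$ bathed in radiation of spectral energy density $\rho(\nu)$ decays by spontaneous and stimulated emission, while the ground-state population $N_1$ absorbs,

```latex
\frac{dN_2}{dt} \;=\; -A_{21}\,N_2 \;-\; B_{21}\,\rho(\nu)\,N_2 \;+\; B_{12}\,\rho(\nu)\,N_1
```

    The $B_{21}$ term is the stimulated emission that lasers exploit: it adds photons identical to those already present.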

    Any time you use a laser, whether indirectly by making a phone call, directly by scanning a UPC label on your groceries, or frivolously to torment a cat, you’re making practical use of quantum physics.

    Atomic Clocks and GPS

    One of the most common uses of Internet-connected smart phones is to find directions to unfamiliar places, another application that is critically dependent on quantum physics. Smartphone navigation is enabled by the Global Positioning System, a network of satellites each broadcasting the time. The GPS receiver in your phone picks up the signal from multiple clocks, and uses the different arrival times from different satellites to determine your distance from each of those satellites. The computer inside the receiver then does a bit of math to figure out the single point on the surface of the Earth that is that distance from those satellites, and locates you to within a few meters.

    This trilateration relies on the constant speed of light to convert time to distance. Light moves at about a foot per nanosecond, so the timing accuracy of the satellite signals needs to be extremely good; each satellite in the GPS constellation therefore contains an ensemble of atomic clocks. These rely on quantum mechanics: the “ticking” of the clock is the oscillation of microwaves driving a transition between two particular quantum states in a cesium atom (or rubidium, in some of the clocks).
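    The geometry behind trilateration can be sketched in a few lines. The toy below works in two dimensions with three noise-free signals (a real receiver solves the 3-D problem with at least four satellites and must also solve for its own clock offset); the positions and times are made-up illustrative numbers:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def trilaterate_2d(sats, times):
    """Recover a 2-D receiver position from three known satellite
    positions and the signal travel times (toy, noise-free version)."""
    d = [C * t for t in times]  # convert travel times to ranges
    (x0, y0), (x1, y1), (x2, y2) = sats
    # Subtracting the circle equations pairwise linearizes them:
    # 2(xi-x0)x + 2(yi-y0)y = d0^2 - di^2 + xi^2 - x0^2 + yi^2 - y0^2
    a1, b1 = 2 * (x1 - x0), 2 * (y1 - y0)
    c1 = d[0]**2 - d[1]**2 + x1**2 - x0**2 + y1**2 - y0**2
    a2, b2 = 2 * (x2 - x0), 2 * (y2 - y0)
    c2 = d[0]**2 - d[2]**2 + x2**2 - x0**2 + y2**2 - y0**2
    det = a1 * b2 - a2 * b1  # solve the 2x2 linear system
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Receiver secretly at (3000 m, 4000 m); satellites at known points.
sats = [(0.0, 0.0), (10_000.0, 0.0), (0.0, 10_000.0)]
true_pos = (3000.0, 4000.0)
times = [math.dist(s, true_pos) / C for s in sats]
x, y = trilaterate_2d(sats, times)  # recovers (3000.0, 4000.0)
```

    Note how unforgiving the timing is: a 10-nanosecond clock error already corresponds to about 3 meters of range error.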

    Any time you use your phone to get you from point A to point B, the trip is made possible by quantum physics.

    Magnetic Resonance Imaging

    Leila Wehbe, a Ph.D. student at Carnegie Mellon University in Pittsburgh, talks about an experiment that used brain scans made in this brain-scanning MRI machine on campus, Wednesday, Nov. 26, 2014. Volunteers were scanned as each word of a chapter of “Harry Potter and the Sorcerer’s Stone” was flashed for half a second onto a screen inside the machine. Images showing combinations of data and graphics were collected. (AP Photo/Keith Srakocic)

    The transition used for atomic clocks is a “hyperfine” transition, which comes from a small energy shift depending on how the spin of an electron is oriented relative to the spin of the nucleus of the atom. Those spins are an intrinsically quantum phenomenon (in fact, spin emerges only when you combine special relativity with quantum mechanics), causing the electrons, protons, and neutrons making up ordinary matter to behave like tiny magnets.

    This spin is responsible for the fourth and final practical application of quantum physics that I’ll talk about today, namely Magnetic Resonance Imaging (MRI). The central process in an MRI machine is called Nuclear Magnetic Resonance (but “nuclear” is a scary word, so it’s avoided for a consumer medical process), and works by flipping the spins in the nuclei of hydrogen atoms. A clever arrangement of magnetic fields lets doctors measure the concentration of hydrogen appearing in different parts of the body, which in turn distinguishes between a lot of softer tissues that don’t show up well in traditional x-rays.
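    The “resonance” in NMR is the Larmor precession of those nuclear spins: the resonance frequency is proportional to the applied magnetic field. A quick sketch using the well-known proton gyromagnetic ratio (the 3 T field is an illustrative choice, typical of clinical scanners):

```python
# Proton gyromagnetic ratio: gamma / 2*pi is about 42.577 MHz per tesla.
GAMMA_OVER_2PI = 42.577e6  # Hz per tesla

def larmor_frequency(b_field_tesla):
    """Resonance (Larmor) frequency of hydrogen nuclei in a magnetic field."""
    return GAMMA_OVER_2PI * b_field_tesla

f = larmor_frequency(3.0)  # ~127.7 MHz for a 3 T clinical scanner
```

    This is why MRI machines work in the radio-frequency band: flipping the hydrogen spins takes RF pulses tuned to this frequency, and the returning RF signal maps where the hydrogen is.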

    So any time you, a loved one, or your favorite professional athlete undergoes an MRI scan, you have quantum physics to thank for their diagnosis and hopefully successful recovery.

    So, while it may sometimes seem like quantum physics is arcane and remote from everyday experience (a self-inflicted problem for physicists, to some degree, as we often over-emphasize the weirder aspects when talking about quantum mechanics), in fact it is absolutely essential to modern life. Semiconductor electronics, lasers, atomic clocks, and magnetic resonance scanners all fundamentally depend on our understanding of the quantum nature of light and matter.

    But, you know, other than computers, smartphones, the Internet, GPS, and MRI, what has quantum physics ever done for us?

    See the full article here.


  • richardmitnick 11:41 am on August 28, 2015 Permalink | Reply
    Tags: , , Quantum Mechanics, Quantum Weirdness,   

    From New Scientist: “Quantum weirdness proved real in first loophole-free experiment” 


    New Scientist

    28 August 2015
    Jacob Aron

    Welcome to quantum reality (Image: Julie Guiche/Picturetank)

    It’s official: the universe is weird. Our everyday experience tells us that distant objects cannot influence each other, and don’t disappear just because no one is looking at them. Even Albert Einstein was dead against such ideas because they clashed so badly with our views of the real world.

    But it turns out we’re wrong – the quantum nature of reality means, on some level, these things can and do actually happen. A groundbreaking experiment puts the final nail in the coffin of our ordinary “local realism” view of the universe, settling an argument that has raged through physics for nearly a century.

    Teams of physicists around the world have been racing to complete this experiment for decades. Now, a group led by Ronald Hanson at Delft University of Technology in the Netherlands has finally cracked it. “It’s a very nice and beautiful experiment, and one can only congratulate the group for that,” says Anton Zeilinger, head of one of the rival teams at the University of Vienna, Austria. “Very well done.”

    To understand what Hanson and his colleagues did, we have to go back to the 1930s, when physicists were struggling to come to terms with the strange predictions of the emerging science of quantum mechanics. The theory suggested that particles could become entangled, so that measuring one would instantly influence the measurement of the other, even if they were separated by a great distance. Einstein dubbed this “spooky action at a distance”, unhappy with the implication that particles could apparently communicate faster than any signal could pass between them.

    What’s more, the theory also suggested that the properties of a particle are only fixed when measured, and prior to that they exist in a fuzzy cloud of probabilities.

    Nonsense, said Einstein, who famously proclaimed that God does not play dice with the universe. He and others were guided by the principle of local realism, which broadly says that only nearby objects can influence each other and that the universe is “real” – our observing it doesn’t bring it into existence by crystallising vague probabilities. They argued that quantum mechanics was incomplete, and that “hidden variables” operating at some deeper layer of reality could explain the theory’s apparent weirdness.

    On the other side, physicists like Niels Bohr insisted that we just had to accept the new quantum reality, since it explained problems that classical theories of light and energy couldn’t handle.

    Test it out

    It wasn’t until the 1960s that the debate shifted further to Bohr’s side, thanks to John Bell, a physicist at CERN. He realised that there was a limit to how connected the properties of two particles could be if local realism was to be believed. So he formulated this insight into a mathematical expression called an inequality. If tests showed that the connection between particles exceeded the limit he set, local realism was toast.
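    Bell's limit is easiest to see in the later CHSH form of the inequality: for any local-realist theory, a particular combination S of measured correlations satisfies |S| ≤ 2, while quantum mechanics predicts values up to 2√2 ≈ 2.83. The sketch below uses the standard textbook correlation function for a spin singlet, with the detector angles chosen to maximize the quantum violation (none of these specifics are from the article):

```python
import math

def correlation(a, b):
    """Quantum-mechanical correlation E(a, b) of spin measurements at
    detector angles a and b on a singlet state (textbook result)."""
    return -math.cos(a - b)

# Detector settings that maximize the quantum violation.
a1, a2 = 0.0, math.pi / 2          # Alice's two settings
b1, b2 = math.pi / 4, 3 * math.pi / 4  # Bob's two settings

# CHSH combination: local realism requires |S| <= 2.
S = abs(correlation(a1, b1) - correlation(a1, b2)
        + correlation(a2, b1) + correlation(a2, b2))
# S equals 2*sqrt(2) ~ 2.828, exceeding the local-realist bound.
```

    An experiment that measures S above 2 (with all loopholes closed) therefore rules out local realism, which is exactly what the Delft result claims to do.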

    “This is the magic of Bell’s inequality,” says Zeilinger’s colleague Johannes Kofler. “It brought an almost purely philosophical thing, where no one knew how to decide between two positions, down to a thing you could experimentally test.”

    And test they did. Experiments have been violating Bell’s inequality for decades, and the majority of physicists now believe Einstein’s views on local realism were wrong. But doubts remained. All prior experiments were subject to a number of potential loopholes, leaving a gap that could allow Einstein’s camp to come surging back.

    “The notion of local realism is so ingrained into our daily thinking, even as physicists, that it is very important to definitely close all the loopholes,” says Zeilinger.

    Loophole trade-off

    A typical Bell test begins with a source of photons, which spits out two at the same time and sends them in different directions to two waiting detectors, operated by a hypothetical pair conventionally known as Alice and Bob. The pair have independently chosen the settings on their detectors so that only photons with certain properties can get through. If the photons are entangled according to quantum mechanics, they can influence each other and repeated tests will show a stronger pattern between Alice and Bob’s measurements than local realism would allow.

    But what if Alice and Bob are passing unseen signals – perhaps through Einstein’s deeper hidden layer of reality – that allow one detector to communicate with the other? Then you couldn’t be sure that the particles are truly influencing each other in their instant, spooky quantum-mechanical way – instead, the detectors could be in cahoots, altering their measurements. This is known as the locality loophole, and it can be closed by moving the detectors far enough apart that there isn’t enough time for a signal to cross over before the measurement is complete. Previously Zeilinger and others have done just that, including shooting photons between two Canary Islands 144 kilometres apart.
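    The arithmetic behind closing the locality loophole is simple: each measurement must finish before light could cross between the detectors. A quick check of the light-crossing time, using the 144 km Canary Islands separation mentioned above:

```python
C = 299_792_458.0  # speed of light, m/s

def light_crossing_time(separation_m):
    """Minimum time for any signal to travel between two detectors."""
    return separation_m / C

# Canary Islands experiment: detectors 144 km apart.
t = light_crossing_time(144e3)  # ~480 microseconds
```

    Any choice of settings and completed detection inside that window cannot have been influenced by a signal from the other side, no matter what hidden mechanism is imagined.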

    Close one loophole, though, and another opens. The Bell test relies on building up a statistical picture through repeated experiments, so it doesn’t work if your equipment doesn’t pick up enough photons. Other experiments closed this detection loophole, but the problem gets worse the further you separate the detectors, as photons can get lost on the way. So moving the detectors apart to close the locality loophole begins to widen the detection one.

    “There’s a trade-off between these two things,” says Kofler. That meant hard-core local realists always had a loophole to explain away previous experiments – until now.

    “Our experiment realizes the first Bell test that simultaneously addressed both the detection loophole and the locality loophole,” writes Hanson’s team in a paper detailing the study. Hanson declined to be interviewed because the work is currently under review for publication in a journal.

    Entangled diamonds

    In this set-up, Alice and Bob sit in two laboratories 1.3 kilometres apart. Light takes 4.27 microseconds to travel this distance and their measurement takes only 3.7 microseconds, so this is far enough to close the locality loophole.
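The timing argument is simple arithmetic, checked below. (The 1.3 km figure in the article is rounded; a separation of about 1,280 m reproduces the quoted 4.27 microseconds.)

```python
c = 299_792_458.0        # speed of light in vacuum, m/s
separation_m = 1_280.0   # lab separation consistent with the quoted 4.27 microseconds

light_time_us = separation_m / c * 1e6  # one-way light travel time, microseconds
measurement_us = 3.7                    # duration of each measurement, from the article

print(round(light_time_us, 2))  # 4.27
# The measurement finishes before any light-speed signal could cross between labs.
assert measurement_us < light_time_us
```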

    Each laboratory has a diamond that contains an electron with a property called spin. The team hits the diamonds with randomly produced microwave pulses. This makes them each emit a photon, which is entangled with the electron’s spin. These photons are then sent to a third location, C, in between Alice and Bob, where another detector clocks their arrival time.

    If photons arrive from Alice and Bob at exactly the same time, they transfer their entanglement to the spins in each diamond. So the electrons are entangled across the distance of the two labs – just what we need for a Bell test. What’s more, the electrons’ spin is constantly monitored, and the detectors are of high enough quality to close the detector loophole.


    But the downside is that the two photons arriving at C rarely coincide – just a few per hour. The team took 245 measurements, so it was a long wait. “This is really a very tough experiment,” says Kofler.

    The result was clear: the labs detected more highly correlated spins than local realism would allow. The weird world of quantum mechanics is our world.

    “If they’ve succeeded, then without any doubt they’ve done a remarkable experiment,” says Sandu Popescu of the University of Bristol, UK. But he points out that most people expected this result – “I can’t say everybody was holding their breath to see what happens.” What’s important is that these kinds of experiments drive the development of new quantum technology, he says.

One of the most important quantum technologies in use today is quantum cryptography. Data networks that use the weird properties of the quantum world to guarantee absolute secrecy are already springing up across the globe, but the loopholes are potential bugs in the laws of physics that might have allowed hackers through. “Bell tests are a security guarantee,” says Kofler. You could say Hanson’s team just patched the universe.

    Freedom of choice

There are still a few ways to quibble with the result. The experiment was so tough that the p-value, a measure of statistical significance, was relatively high for work in physics. Other sciences like biology normally accept a p-value below 5 per cent as significant, but physicists tend to insist on values millions of times smaller, which makes a result far more statistically sound. Hanson’s group reports a p-value of around 4 per cent: just under the looser 5 per cent mark, but well short of the standard physicists usually demand.
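For a sense of scale, physicists' usual "five sigma" discovery standard corresponds to a one-sided tail probability of roughly 3 in 10 million. The small calculation below makes the comparison with the 4 per cent figure quoted above (using only the standard Gaussian tail formula).

```python
import math

def one_sided_p(z_sigma):
    """One-sided Gaussian tail probability for a z-sigma result."""
    return 0.5 * math.erfc(z_sigma / math.sqrt(2))

p_biology = 0.05                   # conventional significance threshold
p_physics_5sigma = one_sided_p(5)  # ~2.9e-7, the particle-physics discovery standard
p_hanson = 0.039                   # value quoted in the article

# 4 per cent clears the biology bar but is ~100,000x larger than 5 sigma.
print(p_physics_5sigma)
```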

    That isn’t too concerning, says Zeilinger. “I expect they have improved the experiment, and by the time it is published they’ll have better data,” he says. “There is no doubt it will withstand scrutiny.”

    And there is one remaining loophole for local realists to cling to, but no experiment can ever rule it out. What if there is some kind of link between the random microwave generators and the detectors? Then Alice and Bob might think they’re free to choose the settings on their equipment, but hidden variables could interfere with their choice and thwart the Bell test.

    Hanson’s team note this is a possibility, but assume it isn’t the case. Zeilinger’s experiment attempts to deal with this freedom of choice loophole by separating their random number generators and detectors, while others have proposed using photons from distant quasars to produce random numbers, resulting in billions of years of separation.

    None of this helps in the long run. Suppose the universe is somehow entirely predetermined, the flutter of every photon carved in stone since time immemorial. In that case, no one would ever have a choice about anything. “The freedom of choice loophole will never be closed fully,” says Kofler. As such, it’s not really worth experimentalists worrying about – if the universe is predetermined, the complete lack of free will means we’ve got bigger fish to fry.

So what would Einstein have made of this new result? Unfortunately he died before Bell proposed his inequality, so we don’t know if subsequent developments would have changed his mind, but he’d likely be struck by the lengths to which people have gone to prove him wrong. “I would give a lot to know what his reaction would be,” says Zeilinger. “I think he would be very impressed.”

    Journal reference: arxiv.org/abs/1508.05949v1

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

  • richardmitnick 11:22 am on July 26, 2015 Permalink | Reply
    Tags: , , , Quantum Mechanics, Time Travel   

    From RT: “Time-traveling photons connect general relativity to quantum mechanics” 


    23 Jun, 2014
    No Writer Credit

    Space-time structure exhibiting closed paths in space (horizontal) and time (vertical). A quantum particle travels through a wormhole back in time and returns to the same location in space and time. (Photo credit: Martin Ringbauer)

    Scientists have simulated time travel by using particles of light acting as quantum particles sent away and then brought back to their original space-time location. This is a huge step toward marrying two of the most irreconcilable theories in physics.

    Since traveling all the way to a black hole to see if an object you’re holding would bend, break or put itself back together in inexplicable ways is a bit of a trek, scientists have decided to find a point of convergence between general relativity and quantum mechanics in lab conditions, and they achieved success.

Australian researchers from the University of Queensland’s School of Mathematics and Physics wanted to resolve the discrepancies between two of our most commonly accepted physics theories, which is no easy task: on the one hand, you have Einstein’s theory of general relativity, which predicts the behavior of massive objects like planets and galaxies; on the other, you have quantum mechanics, whose laws clash completely with Einstein’s and which describes our world at the molecular level. And this is where things get interesting: we still have no concrete idea of all the principles of movement and interaction that underpin this theory.

    Natural laws of space and time simply break down there.

    The light particles used in the study are known as photons, and in this University of Queensland study, they stood in for actual quantum particles for the purpose of finding out how they behaved while moving through space and time.

The team simulated the behavior of a single photon that travels back in time through a wormhole and meets its older self – an identical photon. “We used single photons to do this but the time-travel was simulated by using a second photon to play the part of the past incarnation of the time traveling photon,” said UQ Physics Professor Tim Ralph, as quoted by The Speaker.

    The findings were published in the journal Nature Communications and gained support from the country’s key institutions on quantum physics.

Some of the biggest examples of why the two approaches can’t be reconciled concern so-called closed time-like loops in space-time, which Kurt Gödel showed in 1949 are permitted by Einstein’s equations: paths that let you travel back in time and return to your starting point in space and time. This presents a problem, commonly known as the ‘grandfather paradox’: if you were to travel back in time and prevent your grandparents from meeting, you would prevent your own birth, making the very trip that stopped them impossible.

But Tim Ralph pointed out that in 1991 David Deutsch showed such situations could be avoided by harnessing quantum mechanics’ flexible laws: “The properties of quantum particles are ‘fuzzy’ or uncertain to start with, so this gives them enough wiggle room to avoid inconsistent time travel situations,” he said.

    There are still ways in which science hasn’t tested the meeting points between general relativity and quantum mechanics – such as when relativity is tested under extreme conditions, where its laws visibly seem to bend, just like near the event horizon of a black hole.

    But since it’s not really easy to approach one, the UQ scientists were content with testing out these points of convergence on photons.

    “Our study provides insights into where and how nature might behave differently from what our theories predict,” Professor Ralph said.


  • richardmitnick 10:31 am on July 18, 2015 Permalink | Reply
    Tags: , , , , , Quantum Mechanics   

    From NOVA: “How Time Got Its Arrow” 



    15 Jul 2015

    Lee Smolin, Perimeter Institute for Theoretical Physics

    I believe in time.

I haven’t always believed in it. Like many physicists and philosophers, I had once concluded from general relativity and quantum gravity that time is not a fundamental aspect of nature, but instead emerges from another, deeper description. Then, starting in the 1990s and accelerated by an eight-year collaboration with the Brazilian philosopher Roberto Mangabeira Unger, I came to believe instead that time is fundamental. (How I came to this is another story.) Now, I believe that by taking time to be fundamental, we might be able to understand how general relativity and the standard model emerge from a deeper theory, why time only goes one way, and how the universe was born.

The Standard Model of elementary particles, with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    Flickr user Robert Couse-Baker, adapted under a Creative Commons license.

    The story starts with change. Science, most broadly defined, is the systematic study of change. The world we observe and experience is constantly changing. And most of the changes we observe are irreversible. We are born, we grow, we age, we die, as do all living things. We remember the past and our actions influence the future. Spilled milk is hard to clean up; a cool drink or a hot bath tend towards room temperature. The whole world, living and non-living, is dominated by irreversible processes, as captured mathematically by the second law of thermodynamics, which holds that the entropy of a closed system usually increases and seldom decreases.

    It may come as a surprise, then, that physics regards this irreversibility as a cosmic accident. The laws of nature as we know them are all reversible when you change the direction of time. Film a process described by those laws, and then run the movie backwards: the rewound version is also allowed by the laws of physics. To be more precise, you may have to change left for right and particles for antiparticles, along with reversing the direction of time, but the standard model of particle physics predicts that the original process and its reverse are equally likely.

    The same is true of Einstein’s theory of general relativity, which describes gravity and cosmology. If the whole universe were observed to run backwards in time, so that it heated up while it collapsed, rather than cooled as it expanded, that would be equally consistent with these fundamental laws, as we currently understand them.

    This leads to a fundamental question: Why, if the laws are reversible, is the universe so dominated by irreversible processes? Why does the second law of thermodynamics hold so universally?

    Gravity is one part of the answer. The second law tells us that the entropy of a closed system, which is a measure of disorder or randomness in the motions of the atoms making up that system, will most likely increase until a state of maximum disorder is reached. This state is called equilibrium. Once it is reached, the system is as mixed as possible, so all parts have the same temperature and all the elements are equally distributed.
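How reversible microscopic rules nevertheless produce a one-way march to equilibrium is nicely captured by a classic toy model, the Ehrenfest urn: particles hop at random between two halves of a box, every individual hop is reversible, yet the coarse-grained entropy (the log of the number of microstates consistent with the current count) almost always climbs. A minimal simulation, with arbitrary illustrative particle number and step count:

```python
import math
import random

random.seed(1)
N = 100    # particles in a box with two halves
left = N   # start far from equilibrium: every particle on the left

def entropy(k):
    """Coarse-grained entropy: log of the number of microstates with k particles on the left."""
    return math.log(math.comb(N, k))

history = [entropy(left)]
for _ in range(2000):
    # Reversible microdynamics: pick one particle at random, move it to the other side.
    if random.random() < left / N:
        left -= 1
    else:
        left += 1
    history.append(entropy(left))

# Entropy starts at 0 (a single microstate) and climbs toward the maximum near k = N/2,
# where it then fluctuates: that plateau is equilibrium.
print(history[0], round(max(history), 1))
```

The hops never prefer a direction; the arrow comes entirely from the improbable starting state, which is exactly the point made above.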

But on large scales, the universe is far from equilibrium. Galaxies like ours are continually forming stars, turning nuclear potential energy into heat and light, as they drive the irreversible flows of energy and materials that characterize the galactic disks. On these large scales, gravity fights the decay to equilibrium by causing matter to clump, creating subsystems like stars and planets. This is beautifully illustrated in some recent papers by Barbour, Koslowski and Mercati.

    But this is only part of the answer to why the universe is out of equilibrium. There remains the mystery of why the universe at the big bang was not created in equilibrium to start with, for the picture of the universe given us by observations requires that the universe be created in an extremely improbable state—very far from equilibrium. Why?

    So when we say that our universe started off in a state far from equilibrium, we are saying that it started off in a state that would be very improbable, were the initial state chosen randomly from the set of all possible states. Yet we must accept this vast improbability to explain the ubiquity of irreversible processes in our world in terms of the reversible laws we know.

In particular, the conditions present in the early universe, being far from equilibrium, are highly irreversible: run the early universe backwards to a big crunch and those conditions look nothing like the late universe that might be in our future.

    In 1979 Roger Penrose proposed a radical answer to the mystery of irreversibility. His proposal concerned quantum gravity, the long-searched-for unification of all the known laws, which is believed to govern the processes that created the universe in the big bang—or transformed it from whatever state it was in before the big bang.

    Penrose hypothesized that quantum gravity, as the most fundamental law, will be unlike the laws we know in that it will be irreversible. The known laws, along with their time-reversibility, emerge as approximations to quantum gravity when the universe grows large and cool and dilute, Penrose argued. But those approximate laws will act within a universe whose early conditions were set up by the more fundamental, irreversible laws. In this way the improbability of the early conditions can be explained.

    In the intervening years our knowledge of the early universe has been dramatically improved by a host of cosmological observations, but these have only deepened the mysteries we have been discussing. So a few years ago, Marina Cortes, a cosmologist from the Institute for Astronomy in Edinburgh, and I decided to revive Penrose’s suggestion in the light of all the knowledge gained since, both observationally and theoretically.

    Dr. Cortes argued that time is not only fundamental but fundamentally irreversible. She proposed that the universe is made of processes that continuously generate new events from present events. Events happen, but cannot unhappen. The reversal of an event does not erase that event, Cortes says: It is a new event, which happens after it.

    In December of 2011, Dr. Cortes began a three-month visit to Perimeter Institute, where I work, and challenged me to collaborate with her on realizing these ideas. The first result was a model we developed of a universe created by events, which we called an energetic causal set model.

    This is a version of a kind of model called a causal set model, in which the history of the universe is considered to be a discrete set of events related only by cause-and-effect. Our model was different from earlier models, though. In it, events are created by a process which maximizes their uniqueness. More precisely, the process produces a universe created by events, each of which is different from all the others. Space is not fundamental, only the events and the causal process that creates them are fundamental. But if space is not fundamental, energy is. The events each have a quantity of energy, which they gain from their predecessors and pass on to their successors. Everything else in the world emerges from these events and the energy they convey.
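As a rough caricature only (this toy is an illustration of the general idea, not Cortes and Smolin's actual dynamics), such a model can be sketched in a few lines: events are created one at a time from pairs of parent events, each child inherits energy from its parents, total energy is conserved, and every event strictly follows its causes.

```python
import random

random.seed(0)

# Two primordial events seed the causal process; all energy lives on events, not in space.
events = [{"id": 0, "parents": (), "energy": 1.0},
          {"id": 1, "parents": (), "energy": 1.0}]

for i in range(2, 50):
    # Each new event is caused by two existing events (illustrative choice of rule)...
    p, q = random.sample(events, 2)
    # ...and receives half of each parent's remaining energy, conserving the total.
    passed = p["energy"] / 2 + q["energy"] / 2
    p["energy"] /= 2
    q["energy"] /= 2
    events.append({"id": i, "parents": (p["id"], q["id"]), "energy": passed})

total = sum(e["energy"] for e in events)
print(round(total, 9))  # 2.0: energy is conserved across the whole causal set
```

Note the built-in irreversibility: events are only ever appended, never undone, so every event's causes have strictly smaller ids.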

    We studied the model universes created by these processes and found that they generally pass through two stages of evolution. In the first stage, they are dominated by the irreversible processes that create the events, each unique. The direction of time is clear. But this gives rise to a second stage in which trails of events appear to propagate, creating emergent notions of particles. Particles emerge only when the second, approximately reversible stage is reached. These emergent particles propagate and appear to interact through emergent laws which seem reversible. In fact, we found, there are many possible models in which particles and approximately reversible laws emerge after a time from a more fundamental irreversible, particle-free system.

    This might explain how general relativity and the standard model emerged from a more fundamental theory, as Penrose hypothesized. Could we, we wondered, start with general relativity and, staying within the language of that theory, modify it to describe an irreversible theory? This would give us a framework to bridge the transition between the early, irreversible stage and the later, reversible stage.

In a recent paper, Marina Cortes, PI postdoc Henrique Gomes and I showed one way to modify general relativity so that it picks out a preferred direction of time, and we explored the possible consequences for the cosmology of the early universe. In particular, we showed that there are analogues of dark matter and dark energy that respect this preferred direction, so a contracting universe is no longer the time-reverse of an expanding universe.

    To do this we had to first modify general relativity to include a physically preferred notion of time. Without that there is no notion of reversing time. Fortunately, such a modification already existed. Called shape dynamics, it had been proposed in 2011 by three young people, including Gomes. Their work was inspired by Julian Barbour, who had proposed that general relativity could be reformulated so that a relativity of size substituted for a relativity of time.

    Using the language of shape dynamics, Cortes, Gomes and I found a way to gently modify general relativity so that little is changed on the scale of stars, galaxies and planets. Nor are the predictions of general relativity regarding gravitational waves affected. But on the scale of the whole universe, and for the early universe, there are deviations where one cannot escape the consequences of a fundamental direction of time.

    Very recently I found still another way to modify the laws of general relativity to make them irreversible. General relativity incorporates effects of two fixed constants of nature, Newton’s constant, which measures the strength of the gravitational force, and the cosmological constant [usually denoted by the Greek capital letter lambda: Λ], which measures the density of energy in empty space. Usually these both are fixed constants, but I found a way they could evolve in time without destroying the beautiful harmony and consistency of the Einstein equations of general relativity.

    These developments are very recent and are far from demonstrating that the irreversibility we see around us is a reflection of a fundamental arrow of time. But they open a way to an understanding of how time got its direction that does not rely on our universe being a consequence of a cosmic accident.


    NOVA is the highest rated science series on television and the most watched documentary series on public television. It is also one of television’s most acclaimed series, having won every major television award, most of them many times over.

  • richardmitnick 3:32 pm on May 29, 2015 Permalink | Reply
    Tags: , , , Quantum Mechanics   

    From ANU: “Physicists solve quantum tunneling mystery” 

    Australian National University

    28 May 2015
    No Writer Credit

    Professor Anatoli Kheifets’ theory tackles ultrafast physics. Composite image Stuart Hay, ANU

An international team of scientists studying ultrafast physics has solved a mystery of quantum mechanics, finding that quantum tunneling is an instantaneous process.

    The new theory could lead to faster and smaller electronic components, for which quantum tunneling is a significant factor. It will also lead to a better understanding of diverse areas such as electron microscopy, nuclear fusion and DNA mutations.

    “Timescales this short have never been explored before. It’s an entirely new world,” said one of the international team, Professor Anatoli Kheifets, from The Australian National University (ANU).

    “We have modelled the most delicate processes of nature very accurately.”

    At very small scales quantum physics shows that particles such as electrons have wave-like properties – their exact position is not well defined. This means they can occasionally sneak through apparently impenetrable barriers, a phenomenon called quantum tunneling.

    Quantum tunneling plays a role in a number of phenomena, such as nuclear fusion in the sun, scanning tunneling microscopy, and flash memory for computers. However, the leakage of particles also limits the miniaturisation of electronic components.
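The exponential sensitivity that makes tunneling both useful and troublesome shows up in the standard WKB estimate for a rectangular barrier, T ≈ exp(−2κL) with κ = √(2m(V − E))/ħ. A small illustrative calculation (the barrier height, electron energy, and width below are arbitrary example values, not figures from the study):

```python
import math

hbar = 1.054_571_8e-34   # reduced Planck constant, J*s
m_e  = 9.109_383_7e-31   # electron mass, kg
eV   = 1.602_176_6e-19   # one electronvolt in joules

def transmission(V_eV, E_eV, L_m):
    """WKB tunneling probability through a rectangular barrier of height V, width L,
    for a particle of energy E < V (valid in the opaque-barrier limit)."""
    kappa = math.sqrt(2 * m_e * (V_eV - E_eV) * eV) / hbar
    return math.exp(-2 * kappa * L_m)

# Example: a 0.5 eV electron facing a 1 eV, 1 nm barrier.
T = transmission(1.0, 0.5, 1e-9)
print(T)  # roughly 7e-4: small, and exponentially sensitive to the width L
```

Doubling the width squares the suppression factor, which is why shrinking insulating layers in electronic components makes leakage grow so quickly.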

Professor Kheifets and Dr. Igor Ivanov, from the ANU Research School of Physics and Engineering, are members of a team which studied ultrafast experiments at the attosecond scale (10⁻¹⁸ seconds), a field that has developed in the last 15 years.

Until their work, a number of attosecond-scale phenomena could not be adequately explained, such as the time delay that occurs when a photon ionises an atom.

    “At that timescale the time an electron takes to quantum tunnel out of an atom was thought to be significant. But the mathematics says the time during tunneling is imaginary – a complex number – which we realised meant it must be an instantaneous process,” said Professor Kheifets.

“A very interesting paradox arises, because electron velocity during tunneling may become greater than the speed of light. However, this does not contradict the special theory of relativity, as the tunneling velocity is also imaginary,” said Dr Ivanov, who recently took up a position at the Center for Relativistic Laser Science in Korea.

    The team’s calculations, which were made using the Raijin supercomputer, revealed that the delay in photoionisation originates not from quantum tunneling but from the electric field of the nucleus attracting the escaping electron.

    The results give an accurate calibration for future attosecond-scale research, said Professor Kheifets.

    “It’s a good reference point for future experiments, such as studying proteins unfolding, or speeding up electrons in microchips,” he said.

    The research is published in Nature Physics.


    ANU is a world-leading university in Australia’s capital city, Canberra. Our location points to our unique history, ties to the Australian Government and special standing as a resource for the Australian people.

    Our focus on research as an asset, and an approach to education, ensures our graduates are in demand the world-over for their abilities to understand, and apply vision and creativity to addressing complex contemporary challenges.
