Tagged: Quantum entanglement

  • richardmitnick 10:27 am on July 12, 2017 Permalink | Reply
    Tags: Micius satellite, Quantum entanglement, Teleportation achieved

    From MIT Tech Review: “First Object Teleported from Earth to Orbit” 

    MIT Technology Review

    July 10, 2017
    No writer credit found

    Researchers in China have teleported a photon from the ground to a satellite orbiting more than 500 kilometers above.

    Last year, a Long March 2D rocket took off from the Jiuquan Satellite Launch Centre in the Gobi Desert carrying a satellite called Micius, named after an ancient Chinese philosopher who died in 391 B.C. The rocket placed Micius in a Sun-synchronous orbit so that it passes over the same point on Earth at the same time each day.

    Micius is a highly sensitive photon receiver that can detect the quantum states of single photons fired from the ground. That’s important because it should allow scientists to test the technological building blocks for various quantum feats such as entanglement, cryptography, and teleportation.

    Micius satellite. https://www.fusecrunch.com/chinas-first-quantum-satellite.html

    Today, the Micius team announced the results of its first experiments. The team created the first satellite-to-ground quantum network, in the process smashing the record for the longest distance over which entanglement has been measured. And they’ve used this quantum network to teleport the first object from the ground to orbit.

    Teleportation has become a standard operation in quantum optics labs around the world. The technique relies on the strange phenomenon of entanglement. This occurs when two quantum objects, such as photons, form at the same instant and point in space and so share the same existence. In technical terms, they are described by the same wave function.

    The curious thing about entanglement is that this shared existence continues even when the photons are separated by vast distances. So a measurement on one immediately influences the state of the other, regardless of the distance between them.

    Back in the 1990s, scientists realized they could use this link to transmit quantum information from one point in the universe to another. The idea is to “download” all the information associated with one photon in one place and transmit it over an entangled link to another photon in another place.

    This second photon then takes on the identity of the first. To all intents and purposes, it becomes the first photon. That’s the nature of teleportation and it has been performed many times in labs on Earth.
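
    The protocol described above can be checked end to end in a few lines of code. The following is a generic state-vector simulation of textbook teleportation, not a model of the Micius hardware; the amplitudes `a` and `b` are arbitrary example values.

```python
import math

# Minimal 3-qubit state-vector simulator. Amplitude index bit (2 - q)
# holds qubit q's value, so qubit 0 is the leftmost bit of the index.
N = 3

def apply_1q(gate, q, state):
    """Apply a 2x2 gate to qubit q of the state vector."""
    out = [0j] * len(state)
    shift = N - 1 - q
    for i, amp in enumerate(state):
        if amp == 0:
            continue
        bit = (i >> shift) & 1
        for new in (0, 1):
            j = (i & ~(1 << shift)) | (new << shift)
            out[j] += gate[new][bit] * amp
    return out

def apply_cnot(c, t, state):
    """Flip qubit t wherever qubit c is 1."""
    out = [0j] * len(state)
    for i, amp in enumerate(state):
        j = i ^ (1 << (N - 1 - t)) if (i >> (N - 1 - c)) & 1 else i
        out[j] += amp
    return out

s = 1 / math.sqrt(2)
H = [[s, s], [s, -s]]

# Alice's unknown qubit |psi> = a|0> + b|1> (example amplitudes).
a, b = 0.6, 0.8j

# Qubit 0 holds |psi>; qubits 1 (Alice) and 2 (Bob) start in |00>.
state = [0j] * 8
state[0b000] = a
state[0b100] = b

# Entangle qubits 1 and 2 into a Bell pair: the shared quantum link.
state = apply_1q(H, 1, state)
state = apply_cnot(1, 2, state)

# Alice's Bell-measurement circuit on qubits 0 and 1.
state = apply_cnot(0, 1, state)
state = apply_1q(H, 0, state)

# For each of Alice's four possible 2-bit outcomes, project the state,
# apply Bob's classically controlled corrections, and confirm that his
# qubit has taken on the identity of |psi>.
for m0 in (0, 1):
    for m1 in (0, 1):
        bob = [state[(m0 << 2) | (m1 << 1) | k] for k in (0, 1)]
        norm = math.sqrt(sum(abs(x) ** 2 for x in bob))
        bob = [x / norm for x in bob]
        if m1:  # X correction: swap the amplitudes
            bob = [bob[1], bob[0]]
        if m0:  # Z correction: flip the sign of |1>
            bob = [bob[0], -bob[1]]
        assert abs(bob[0] - a) < 1e-9 and abs(bob[1] - b) < 1e-9
print("teleported", a, b, "for all four measurement outcomes")
```

    Note that Bob's corrections depend on Alice's two classical measurement bits, which must travel to him at light speed or slower; this is why teleportation cannot transmit information faster than light.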

    Teleportation is a building block for a wide range of technologies. “Long-distance teleportation has been recognized as a fundamental element in protocols such as large-scale quantum networks and distributed quantum computation,” says the Chinese team.

    In theory, there should be no maximum distance over which this can be done. But entanglement is a fragile thing because photons interact with matter in the atmosphere or inside optical fibers, causing the entanglement to be lost.

    As a result, the distance over which scientists have measured entanglement or performed teleportation is severely limited. “Previous teleportation experiments between distant locations were limited to a distance on the order of 100 kilometers, due to photon loss in optical fibers or terrestrial free-space channels,” says the team.

    But Micius changes all that because it orbits at an altitude of 500 kilometers, and for most of this distance, any photons making the journey travel through a vacuum. To minimize the amount of atmosphere in the way, the Chinese team set up its ground station in Ngari in Tibet at an altitude of over 4,000 meters. So the distance from the ground to the satellite varies from 1,400 kilometers when it is near the horizon to 500 kilometers when it is overhead.

    To perform the experiment, the Chinese team created entangled pairs of photons on the ground at a rate of about 4,000 per second. They then beamed one of these photons to the satellite, which passed overhead every day at midnight. They kept the other photon on the ground.

    Finally, they measured the photons on the ground and in orbit to confirm that entanglement was taking place, and that they were able to teleport photons in this way. Over 32 days, they sent millions of photons and found positive results in 911 cases. “We report the first quantum teleportation of independent single-photon qubits from a ground observatory to a low Earth orbit satellite—through an up-link channel—with a distance up to 1400 km,” says the Chinese team.
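
    A back-of-envelope check makes these numbers concrete. The pair rate, campaign length, and success count come from the article; the usable link time per nightly pass is an assumption introduced here (the article says only that "millions" of photons were sent), so treat the result as order-of-magnitude.

```python
# Rough sanity check of the uplink statistics quoted in the article.
pair_rate = 4000       # entangled pairs generated per second (article)
pass_seconds = 5 * 60  # assumed usable link time per midnight pass
nights = 32            # duration of the campaign (article)
successes = 911        # verified teleportation events (article)

photons_sent = pair_rate * pass_seconds * nights
success_fraction = successes / photons_sent

print(f"photons sent ~ {photons_sent:.2e}")          # tens of millions
print(f"success fraction ~ {success_fraction:.1e}")  # a few per 100,000
```

    Under these assumptions roughly 38 million photons were beamed up, consistent with the article's "millions," and only a few in every hundred thousand survived the trip and were verified.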

    This is the first time that any object has been teleported from Earth to orbit, and it smashes the record for the longest distance for entanglement.

    That’s impressive work that sets the stage for much more ambitious goals in the future. “This work establishes the first ground-to-satellite up-link for faithful and ultra-long-distance quantum teleportation, an essential step toward global-scale quantum internet,” says the team.

    It also shows China’s clear lead in a field that, until recently, was dominated by Europe and the U.S.—Micius himself would surely have been impressed. But an important question now is how the West will respond.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Education Coalition

    The mission of MIT Technology Review is to equip its audiences with the intelligence to understand a world shaped by technology.

     
  • richardmitnick 1:27 pm on June 18, 2017 Permalink | Reply
    Tags: China has taken the leadership in quantum communication, China Shatters 'Spooky Action at a Distance' Record, For now the system remains mostly a proof of concept, Global quantum communication is possible and will be achieved in the near future, Preps for Quantum Internet, Quantum entanglement

    From SA: “China Shatters ‘Spooky Action at a Distance’ Record, Preps for Quantum Internet” 

    Scientific American

    June 15, 2017
    Lee Billings

    Credit: Alfred Pasieka/Getty Images

    In a landmark study, a team of Chinese scientists using an experimental satellite has tested quantum entanglement over unprecedented distances, beaming entangled pairs of photons to three ground stations across China—each separated by more than 1,200 kilometers. The test verifies a mysterious and long-held tenet of quantum theory, and firmly establishes China as the front-runner in a burgeoning “quantum space race” to create a secure, quantum-based global communications network—that is, a potentially unhackable “quantum internet” that would be of immense geopolitical importance. The findings were published Thursday in Science.

    “China has taken the leadership in quantum communication,” says Nicolas Gisin, a physicist at the University of Geneva who was not involved in the study. “This demonstrates that global quantum communication is possible and will be achieved in the near future.”

    The concept of quantum communications is considered the gold standard for security, in part because any compromising surveillance leaves its imprint on the transmission. Conventional encrypted messages require secret keys to decrypt, but those keys are vulnerable to eavesdropping as they are sent out into the ether. In quantum communications, however, these keys can be encoded in various quantum states of entangled photons—such as their polarization—and these states will be unavoidably altered if a message is intercepted by eavesdroppers. Ground-based quantum communications typically send entangled photon pairs via fiber-optic cables or open air. But collisions with ordinary atoms along the way disrupt the photons’ delicate quantum states, limiting transmission distances to a few hundred kilometers. Sophisticated devices called “quantum repeaters”—equipped with “quantum memory” modules—could in principle be daisy-chained together to receive, store and retransmit the quantum keys across longer distances, but this task is so complex and difficult that such systems remain largely theoretical.

    “A quantum repeater has to receive photons from two different places, then store them in quantum memory, then interfere them directly with each other” before sending further signals along a network, says Paul Kwiat, a physicist at the University of Illinois in Urbana–Champaign who is unaffiliated with the Chinese team. “But in order to do all that, you have to know you’ve stored them without actually measuring them.” The situation, Kwiat says, is a bit like knowing what you have received in the mail without looking in your mailbox or opening the package inside. “You can shake the package—but that’s difficult to do if what you’re receiving is just photons. You want to make sure you’ve received them but you don’t want to absorb them. In principle it’s possible—no question—but it’s very hard to do.”

    To form a globe-girdling secure quantum communications network, then, the only available solution is to beam quantum keys through the vacuum of space and then distribute them across tens to hundreds of kilometers using ground-based nodes. Launched into low Earth orbit in 2016 and named after an ancient Chinese philosopher, the 600-kilogram “Micius” satellite is China’s premier effort to do just that, and it is only the first of a fleet the nation plans as part of its $100-million Quantum Experiments at Space Scale (QUESS) program.

    Micius carries in its heart an assemblage of crystals and lasers that generates entangled photon pairs, then splits and transmits them on separate beams to ground stations in its line of sight on Earth. For the latest test, the three receiving stations were located in the cities of Delingha and Ürümqi—both on the Tibetan Plateau—as well as in the city of Lijiang in China’s far southwest. At 1,203 kilometers, the geographical distance between Delingha and Lijiang is the record-setting stretch over which the entangled photon pairs were transmitted.

    For now the system remains mostly a proof of concept, because the current reported data transmission rate between Micius and its receiving stations is too low to sustain practical quantum communications. Of the roughly six million entangled pairs that Micius’s crystalline core produced during each second of transmission, only about one pair per second reached the ground-based detectors after the beams weakened as they passed through Earth’s atmosphere and each receiving station’s light-gathering telescopes. Team leader Jian-Wei Pan—a physicist at the University of Science and Technology of China in Hefei who has pushed and planned for the experiment since 2003—compares the feat with detecting a single photon from a lone match struck by someone standing on the moon. Even so, he says, Micius’s transmission of entangled photon pairs is “a trillion times more efficient than using the best telecommunication fibers. … We have done something that was absolutely impossible without the satellite.” Within the next five years, Pan says, QUESS will launch more practical quantum communications satellites.
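
    Pan's efficiency claim can be sanity-checked with a crude link budget. The 0.2 dB/km attenuation figure for low-loss telecom fiber is a standard textbook value, not a number from the article, so this is order-of-magnitude only; the crude ratio here comes out even larger than the quoted trillion, which rests on the experiment's detailed accounting.

```python
import math

# Crude link-budget comparison behind the "trillion times more efficient"
# remark: satellite free-space loss vs. a hypothetical 1,200 km fiber.
produced_per_s = 6e6  # entangled pairs generated per second (article)
detected_per_s = 1.0  # pairs reaching the ground detectors (article)

sat_loss_db = 10 * math.log10(produced_per_s / detected_per_s)

fiber_atten_db_per_km = 0.2  # assumed low-loss telecom fiber
distance_km = 1200           # Delingha-Lijiang separation (article)
fiber_loss_db = fiber_atten_db_per_km * distance_km

advantage_db = fiber_loss_db - sat_loss_db
print(f"satellite link loss ~ {sat_loss_db:.0f} dB")   # ~68 dB
print(f"1200 km fiber loss  ~ {fiber_loss_db:.0f} dB") # 240 dB
print(f"satellite advantage ~ 10^{advantage_db / 10:.0f}x")
```

    The point of the comparison: fiber loss grows exponentially with distance (linearly in dB), so at 1,200 km a fiber link would attenuate the signal by some 240 dB, while the satellite's atmospheric path costs only tens of dB.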

    Although Pan and his team plan for Micius and its nascent network of sister satellites to eventually distribute quantum keys, their initial demonstration instead aimed to achieve a simpler task: proving Einstein wrong.

    Einstein famously derided as “spooky action at a distance” one of the most bizarre elements of quantum theory—the way that measuring one member of an entangled pair of particles seems to instantaneously change the state of its counterpart, even if that counterpart particle is on the other side of the galaxy. This was abhorrent to Einstein, because it suggests information might be transmitted between the particles faster than light, breaking the universal speed limit set by his theory of special relativity. Instead, he and others posited, perhaps the entangled particles somehow shared “hidden variables” that are inaccessible to experiment but would determine the particles’ subsequent behavior when measured. In 1964 the physicist John Bell devised a way to test Einstein’s idea, calculating a limit that physicists could statistically measure for how much hidden variables could possibly correlate with the behavior of entangled particles. If experiments showed this limit to be exceeded, then Einstein’s idea of hidden variables would be incorrect.

    Ever since the 1970s, “Bell tests” by physicists across ever-larger swaths of spacetime have shown that Einstein was indeed mistaken, and that entangled particles do in fact surpass Bell’s strict limits. The most definitive test arguably occurred in the Netherlands in 2015, when a team at Delft University of Technology closed several potential “loopholes” that had plagued past experiments and offered slim-but-significant opportunities for the influence of hidden variables to slip through. That test, though, involved separating entangled particles by scarcely more than a kilometer. With Micius’s transmission of entangled photons between widely separated ground stations, Pan’s team has now performed a Bell test at distances a thousand times greater. Just as before, their results confirm that Einstein was wrong. The quantum realm remains a spooky place—although no one yet understands why.

    “Of course, no one who accepts quantum mechanics could possibly doubt that entanglement can be created over that distance—or over any distance—but it’s still nice to see it made concrete,” says Scott Aaronson, a physicist at The University of Texas at Austin. “Nothing we knew suggested this goal was unachievable. The significance of this news is not that it was unexpected or that it overturns anything previously believed, but simply that it’s a satisfying culmination of years of hard work.”

    That work largely began in the 1990s when Pan, leader of the Chinese team, was a graduate student in the lab of the physicist Anton Zeilinger at the University of Innsbruck in Austria. Zeilinger was Pan’s PhD adviser, and they collaborated closely to test and further develop ideas for quantum communication. Pan returned to China to start his own lab in 2001, and Zeilinger started one as well at the Austrian Academy of Sciences in Vienna. For the next seven years they would compete fiercely to break records for transmitting entangled photon pairs across ever-wider gaps, and in ever-more extreme conditions, in ground-based experiments. All the while each man lobbied his respective nation’s space agency to green-light a satellite that could be used to test the technique from space. But Zeilinger’s proposals perished in a bureaucratic swamp at the European Space Agency whereas Pan’s were quickly embraced by the China National Space Administration. Ultimately, Zeilinger chose to collaborate again with his old pupil rather than compete against him; today the Austrian Academy of Sciences is a partner in QUESS, and the project has plans to use Micius to perform an intercontinental quantum key distribution experiment between ground stations in Vienna and Beijing.

    “I am happy that the Micius works so well,” Zeilinger says. “But one has to realize that it is a missed opportunity for Europe and others, too.”

    For years now, other researchers and institutions have been scrambling to catch up, pushing governments for more funding for further experiments on the ground and in space—and many of them see Micius’s success as the catalytic event they have been waiting for. “This is a major milestone, because if we are ever to have a quantum internet in the future, we will need to send entanglement over these sorts of long distances,” says Thomas Jennewein, a physicist at the University of Waterloo in Canada who was not involved with the study. “This research is groundbreaking for all of us in the community—everyone can point to it and say, ‘see, it does work!’”

    Jennewein and his collaborators are pursuing a space-based approach from the ground up, partnering with the Canadian Space Agency to plan a smaller, simpler satellite that could launch as soon as five years from now to act as a “universal receiver” and redistribute entangled photons beamed up from ground stations. At the National University of Singapore, an international collaboration led by the physicist Alexander Ling has already launched cheap shoe box–size CubeSats to create, study and perhaps even transmit photon pairs that are “correlated”—a situation just shy of full entanglement. And in the U.S., Kwiat at the University of Illinois is using NASA funding to develop a device that could someday test quantum communications using “hyperentanglement” (the simultaneous entanglement of photon pairs in multiple ways) onboard the International Space Station.

    Perhaps most significantly, a team led by Gerd Leuchs and Christoph Marquardt at the Max Planck Institute for the Science of Light in Germany is developing quantum communications protocols for commercially available laser systems already in space onboard the European Copernicus and SpaceDataHighway satellites. Using one of these systems, the team successfully encoded and sent simple quantum states to ground stations using photons beamed from a satellite in geostationary orbit, some 38,000 kilometers above Earth. This approach, Marquardt explains, does not rely on entanglement and is very different from that of QUESS—but it could, with minimal upgrades, nonetheless be used to distribute quantum keys for secure communications in as little as five years. Their results appear in Optica.

    “Our purpose is really to find a shortcut into making things like quantum key distribution with satellites economically viable and employable, pretty fast and soon,” Marquardt says. “[Engineers] invested 20 years of hard work making these systems, so it’s easier to upgrade them than to design everything from scratch. … It is a very good advantage if you can rely on something that is already qualified in space, because space qualification is very complicated. It usually takes five to 10 years just to develop that.”

    Marquardt and others suspect, however, that this field could be much further advanced than has been publicly acknowledged, with developments possibly hidden behind veils of official secrecy in the U.S. and elsewhere. It may be that the era of quantum communication is already upon us. “Some colleague of mine made the joke, ‘the silence of the U.S. is very loud,’” Marquardt says. “They had some very good groups concerning free-space satellites and quantum key distribution at Los Alamos [National Laboratory] and other places, and suddenly they stopped publishing. So we always say there are two reasons that they stopped publishing: either it didn’t work, or it worked really well!”

    See the full article here.


    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
  • richardmitnick 11:07 am on June 11, 2017 Permalink | Reply
    Tags: Bell test, Cosmic Bell test, Experiment Reaffirms Quantum Weirdness, John Bell, Quantum entanglement, Superdeterminism

    From Quanta: “Experiment Reaffirms Quantum Weirdness” 

    Quanta Magazine

    February 7, 2017 [I wonder where this was hiding. It just appeared today in social media.]
    Natalie Wolchover

    Physicists are closing the door on an intriguing loophole around the quantum phenomenon Einstein called “spooky action at a distance.”

    Olena Shmahalo/Quanta Magazine

    There might be no getting around what Albert Einstein called “spooky action at a distance.” With an experiment described today in Physical Review Letters — a feat that involved harnessing starlight to control measurements of particles shot between buildings in Vienna — some of the world’s leading cosmologists and quantum physicists are closing the door on an intriguing alternative to “quantum entanglement.”

    “Technically, this experiment is truly impressive,” said Nicolas Gisin, a quantum physicist at the University of Geneva who has studied this loophole around entanglement.

    According to standard quantum theory, particles have no definite states, only relative probabilities of being one thing or another — at least, until they are measured, when they seem to suddenly roll the dice and jump into formation. Stranger still, when two particles interact, they can become “entangled,” shedding their individual probabilities and becoming components of a more complicated probability function that describes both particles together. This function might specify that two entangled photons are polarized in perpendicular directions, with some probability that photon A is vertically polarized and photon B is horizontally polarized, and some chance of the opposite. The two photons can travel light-years apart, but they remain linked: Measure photon A to be vertically polarized, and photon B instantaneously becomes horizontally polarized, even though B’s state was unspecified a moment earlier and no signal has had time to travel between them. This is the “spooky action” that Einstein was famously skeptical about in his arguments against the completeness of quantum mechanics in the 1930s and ’40s.

    In 1964, the Northern Irish physicist John Bell found a way to put this paradoxical notion to the test. He showed that if particles have definite states even when no one is looking (a concept known as “realism”) and if indeed no signal travels faster than light (“locality”), then there is an upper limit to the amount of correlation that can be observed between the measured states of two particles. But experiments have shown time and again that entangled particles are more correlated than Bell’s upper limit, favoring the radical quantum worldview over local realism.
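
    Bell's limit is usually tested in its CHSH form, which the snippet below sketches; the specific combination of correlations and the optimal analyzer angles are the standard textbook choices, not details from this article. Brute-force enumeration of every deterministic local-realistic assignment gives the classical bound of 2, while quantum correlations for entangled photons reach 2√2.

```python
import math
from itertools import product

# CHSH form of Bell's limit. A local-realistic model assigns each photon
# a definite outcome (+1 or -1) for each of two measurement settings.
# Enumerating all such assignments bounds the CHSH combination
#   S = E(a,b) + E(a,b') + E(a',b) - E(a',b')
# at |S| <= 2.
classical_max = 0
for A0, A1, B0, B1 in product((+1, -1), repeat=4):
    S = A0 * B0 + A0 * B1 + A1 * B0 - A1 * B1
    classical_max = max(classical_max, abs(S))

# Quantum mechanics: for a singlet pair measured at analyzer angles
# (x, y), the correlation is E = -cos(x - y). Standard optimal angles:
a, a2, b, b2 = 0, math.pi / 2, math.pi / 4, -math.pi / 4
E = lambda x, y: -math.cos(x - y)
S_quantum = abs(E(a, b) + E(a, b2) + E(a2, b) - E(a2, b2))

print(classical_max)        # 2
print(round(S_quantum, 3))  # 2.828, i.e. 2*sqrt(2)
```

    Measured values above 2, seen in experiment after experiment, are exactly what rules out local realism, subject to the loopholes discussed next.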

    Only there’s a hitch: In addition to locality and realism, Bell made another, subtle assumption to derive his formula — one that went largely ignored for decades. “The three assumptions that go into Bell’s theorem that are relevant are locality, realism and freedom,” said Andrew Friedman of the Massachusetts Institute of Technology, a co-author of the new paper. “Recently it’s been discovered that you can keep locality and realism by giving up just a little bit of freedom.” This is known as the “freedom-of-choice” loophole.

    In a Bell test, entangled photons A and B are separated and sent to far-apart optical modulators — devices that either block photons or let them through to detectors, depending on whether the modulators are aligned with or against the photons’ polarization directions. Bell’s inequality puts an upper limit on how often, in a local-realistic universe, photons A and B will both pass through their modulators and be detected. (Researchers find that entangled photons are correlated more often than this, violating the limit.) Crucially, Bell’s formula assumes that the two modulators’ settings are independent of the states of the particles being tested. In experiments, researchers typically use random-number generators to set the devices’ angles of orientation. However, if the modulators are not actually independent — if nature somehow restricts the possible settings that can be chosen, correlating these settings with the states of the particles in the moments before an experiment occurs — this reduced freedom could explain the outcomes that are normally attributed to quantum entanglement.

    The universe might be like a restaurant with 10 menu items, Friedman said. “You think you can order any of the 10, but then they tell you, ‘We’re out of chicken,’ and it turns out only five of the things are really on the menu. You still have the freedom to choose from the remaining five, but you were overcounting your degrees of freedom.” Similarly, he said, “there might be unknowns, constraints, boundary conditions, conservation laws that could end up limiting your choices in a very subtle way” when setting up an experiment, leading to seeming violations of local realism.

    This possible loophole gained traction in 2010, when Michael Hall, now of Griffith University in Australia, developed a quantitative way of reducing freedom of choice [Phys.Rev.Lett.]. In Bell tests, measuring devices have two possible settings (corresponding to one bit of information: either 1 or 0), and so it takes two bits of information to specify their settings when they are truly independent. But Hall showed that if the settings are not quite independent — if, once in every 22 runs, only one bit specifies them instead of two — this halves the number of possible measurement settings available in those runs. This reduced freedom of choice correlates measurement outcomes enough to exceed Bell’s limit, creating the illusion of quantum entanglement.

    The idea that nature might restrict freedom while maintaining local realism has become more attractive in light of emerging connections between information and the geometry of space-time. Research on black holes, for instance, suggests that the stronger the gravity in a volume of space-time, the fewer bits can be stored in that region. Could gravity be reducing the number of possible measurement settings in Bell tests, secretly striking items from the universe’s menu?

    Members of the cosmic Bell test team calibrating the telescope used to choose the settings of one of their two detectors located in far-apart buildings in Vienna. Credit: Jason Gallicchio

    Friedman, Alan Guth and colleagues at MIT were entertaining such speculations a few years ago when Anton Zeilinger, a famous Bell test experimenter at the University of Vienna, came for a visit.

    Alan Guth of MIT, who first proposed cosmic inflation.

    Lambda-Cold Dark Matter, accelerated expansion of the universe, Big Bang-inflation timeline of the universe (2010). Credit: Alex Mittelmann/Coldcreation

    Alan Guth’s notes. http://www.bestchinanews.com/Explore/4730.html

    Zeilinger also had his sights on the freedom-of-choice loophole. Together, they and their collaborators developed an idea for how to distinguish between a universe that lacks local realism and one that curbs freedom.

    In the first of a planned series of “cosmic Bell test” experiments, the team sent pairs of photons from the roof of Zeilinger’s lab in Vienna through the open windows of two other buildings and into optical modulators, tallying coincident detections as usual. But this time, they attempted to lower the chance that the modulator settings might somehow become correlated with the states of the photons in the moments before each measurement. They pointed a telescope out of each window, trained each telescope on a bright and conveniently located (but otherwise random) star, and, before each measurement, used the color of an incoming photon from each star to set the angle of the associated modulator. The colors of these photons were decided hundreds of years ago, when they left their stars, increasing the chance that they (and therefore the measurement settings) were independent of the states of the photons being measured.

    And yet, the scientists found that the measurement outcomes still violated Bell’s upper limit, boosting their confidence that the polarized photons in the experiment exhibit spooky action at a distance after all.

    Nature could still exploit the freedom-of-choice loophole, but the universe would have had to delete items from the menu of possible measurement settings at least 600 years before the measurements occurred (when the closer of the two stars sent its light toward Earth). “Now one needs the correlations to have been established even before Shakespeare wrote, ‘Until I know this sure uncertainty, I’ll entertain the offered fallacy,’” Hall said.

    Next, the team plans to use light from increasingly distant quasars to control their measurement settings, probing further back in time and giving the universe an even smaller window to cook up correlations between future device settings and restrict freedoms. It’s also possible (though extremely unlikely) that the team will find a transition point where measurement settings become uncorrelated and violations of Bell’s limit disappear — which would prove that Einstein was right to doubt spooky action.

    “For us it seems like kind of a win-win,” Friedman said. “Either we close the loophole more and more, and we’re more confident in quantum theory, or we see something that could point toward new physics.”

    There’s a final possibility that many physicists abhor. It could be that the universe restricted freedom of choice from the very beginning — that every measurement was predetermined by correlations established at the Big Bang. “Superdeterminism,” as this is called, is “unknowable,” said Jan-Åke Larsson, a physicist at Linköping University in Sweden; the cosmic Bell test crew will never be able to rule out correlations that existed before there were stars, quasars or any other light in the sky. That means the freedom-of-choice loophole can never be completely shut.

    But given the choice between quantum entanglement and superdeterminism, most scientists favor entanglement — and with it, freedom. “If the correlations are indeed set [at the Big Bang], everything is preordained,” Larsson said. “I find it a boring worldview. I cannot believe this would be true.”

    See the full article here.


    Formerly known as Simons Science News, Quanta Magazine is an editorially independent online publication launched by the Simons Foundation to enhance public understanding of science. Why Quanta? Albert Einstein called photons “quanta of light.” Our goal is to “illuminate science.” At Quanta Magazine, scientific accuracy is every bit as important as telling a good story. All of our articles are meticulously researched, reported, edited, copy-edited and fact-checked.

     
  • richardmitnick 2:16 pm on May 16, 2017 Permalink | Reply
    Tags: Quantum entanglement, Tim Maudlin

    From Quanta: “A Defense of the Reality of Time” Tim Maudlin 

    Quanta Magazine

    May 16, 2017
    George Musser

    Tim Maudlin. Edwin Tse for Quanta Magazine

    Time isn’t just another dimension, argues Tim Maudlin. To make his case, he’s had to reinvent geometry.

    Physicists and philosophers seem to like nothing more than telling us that everything we thought about the world is wrong. They take a peculiar pleasure in exposing common sense as nonsense. But Tim Maudlin thinks our direct impressions of the world are a better guide to reality than we have been led to believe.

    Not that he thinks they always are. Maudlin, who is a professor at New York University and one of the world’s leading philosophers of physics, made his name studying the strange behavior of “entangled” quantum particles, which display behavior that is as counterintuitive as can be; if anything, he thinks physicists have downplayed how transformative entanglement is.

    Quantum entanglement. ATCA

    At the same time, though, he thinks physicists can be too hasty to claim that our conventional views are misguided, especially when it comes to the nature of time.

    He defends a homey and unfashionable view of time. It has a built-in arrow. It is fundamental rather than derived from some deeper reality. Change is real, as opposed to an illusion or an artifact of perspective. The laws of physics act within time to generate each moment. Mixing mathematics, physics and philosophy, Maudlin bats away the reasons that scientists and philosophers commonly give for denying this folk wisdom.

    The mathematical arguments are the target of his current project, the second volume of New Foundations for Physical Geometry (the first appeared in 2014). Modern physics, he argues, conceptualizes time in essentially the same way as space. Space, as we commonly understand it, has no innate direction — it is isotropic. When we apply spatial intuitions to time, we unwittingly assume that time has no intrinsic direction, either. New Foundations rethinks topology in a way that allows for a clearer distinction between time and space. Conventionally, topology — the first level of geometrical structure — is defined using open sets, which describe the neighborhood of a point in space or time. “Open” means a region has no sharp edge; every point in the set is surrounded by other points in the same set.

    Maudlin proposes instead to base topology on lines. He sees this as closer to our everyday geometrical intuitions, which are formed by thinking about motion. And he finds that, to match the results of standard topology, the lines need to be directed, just as time is. Maudlin’s approach differs from other approaches that extend standard topology to endow geometry with directionality; it is not an extension, but a rethinking that builds in directionality at the ground level.

    Maudlin discussed his ideas with Quanta Magazine in March. Here is a condensed and edited version of the interview.

    Why might one think that time has a direction to it? That seems to go counter to what physicists often say.

    I think that’s a little bit backwards. Go to the man on the street and ask whether time has a direction, whether the future is different from the past, and whether time doesn’t march on toward the future. That’s the natural view. The more interesting view is how the physicists manage to convince themselves that time doesn’t have a direction.

    They would reply that it’s a consequence of Einstein’s special theory of relativity, which holds that time is a fourth dimension.

    This notion that time is just a fourth dimension is highly misleading. In special relativity, the time directions are structurally different from the space directions. In the timelike directions, you have a further distinction into the future and the past, whereas any spacelike direction I can continuously rotate into any other spacelike direction. The two classes of timelike directions can’t be continuously transformed into one another.

    Standard geometry just wasn’t developed for the purpose of doing space-time. It was developed for the purpose of just doing spaces, and spaces have no directedness in them. And then you took this formal tool that you developed for this one purpose and then pushed it to this other purpose.

    When relativity was developed in the early part of the 20th century, did people begin to see this problem?

    I don’t think they saw it as a problem. The development was highly algebraic, and the more algebraic the technique, the further you get from having a geometrical intuition about what you’re doing. So if you develop the standard account of, say, the metric of space-time, and then you ask, “Well, what happens if I start putting negative numbers in this thing?” That’s a perfectly good algebraic question to ask. It’s not so clear what it means geometrically. And people do the same thing now when they say, “Well, what if time had two dimensions?” As a purely algebraic question, I can say that. But if you ask me what could it mean, physically, for time to have two dimensions, I haven’t the vaguest idea. Is it consistent with the nature of time that it be a two-dimensional thing? Because if you think that what time does is order events, then that order is a linear order, and you’re talking about a fundamentally one-dimensional kind of organization.

    And so you are trying to allow for the directionality of time by rethinking geometry. How does that work?

    I really was not starting from physics. I was starting from just trying to understand topology. When you teach, you’re forced to confront your own ignorance. I was trying to explain standard topology to some students when I was teaching a class on space and time, and I realized that I didn’t understand it. I couldn’t see the connection between the technical machinery and the concepts that I was using.

    Suppose I just hand you a bag of points. It doesn’t have a geometry. So I have to add some structure to give it anything that is recognizably geometrical. In the standard approach, I specify which sets of points are open sets. In my approach, I specify which sets of points are lines.

    How does this differ from ordinary geometry taught in high school?

    In this approach that’s based on lines, a very natural thing to do is to put directionality on the lines. It’s very easy to implement at the level of axioms. If you’re doing Euclidean geometry, this isn’t going to occur to you, because your idea in Euclidean geometry is if I have a continuous line from A to B, it’s just as well a continuous line B to A — that there’s no directionality in a Euclidean line.

    From the pure mathematical point of view, why might your approach be preferable?

    In my approach, you put down a linear structure on a set of points. If you put down lines according to my axioms, there’s then a natural definition of an open set, and it generates a topology.

    Another important conceptual advantage is that there’s no problem thinking of a line that’s discrete. People form lines where there are only finitely many people, and you can talk about who’s the next person in line, and who’s the person behind them, and so on. The notion of a line is neutral between it being discrete and being continuous. So you have this general approach.
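
    The construction described above can be sketched in code. As a toy illustration (my own, not Maudlin’s actual axioms), take a finite, discrete directed line, i.e. a linear order on a set of points; the open intervals and open rays determined by that order serve as basic open sets and generate a topology.

```python
from itertools import combinations

def order_topology_basis(line):
    """Basic open sets induced by a finite directed line (a linear order):
    open intervals strictly between two points, open rays, the empty set,
    and the whole set."""
    pts = list(line)
    n = len(pts)
    basis = {frozenset(), frozenset(pts)}
    for i, j in combinations(range(n), 2):
        basis.add(frozenset(pts[i + 1:j]))   # open interval (pts[i], pts[j])
    for i in range(n):
        basis.add(frozenset(pts[:i]))        # open ray before pts[i]
        basis.add(frozenset(pts[i + 1:]))    # open ray after pts[i]
    return basis

basis = order_topology_basis(["a", "b", "c", "d"])
# In the discrete case every singleton comes out open, e.g. {"b"} is the
# interval ("a", "c"), so the induced topology is the discrete one.
```

    Note that reversing the line produces exactly the same basis: the open sets forget the direction, which is the sense in which directed lines carry strictly more structure than the topology they generate.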

    Why is this kind of modification important for physics?

    As soon as you start talking about space-time, the idea that time has a directionality is obviously something we begin with. There’s a tremendous difference between the past and the future. And so, as soon as you start to think geometrically of space-time, of something that has temporal characteristics, a natural thought is that you are thinking of something that does now have an intrinsic directionality. And if your basic geometrical objects can have directionality, then you can use them to represent this physical directionality.

    Physicists have other arguments for why time doesn’t have a direction.

    Often one will hear that there’s a time-reversal symmetry in the laws. But the normal way you describe a time-reversal symmetry presupposes there’s a direction of time. Someone will say the following: “According to Newtonian physics, if the glass can fall off the table and smash on the floor, then it’s physically possible for the shards on the floor to be pushed by the concerted effort of the floor, recombine into the glass and jump back up on the table.” That’s true. But notice, both of those descriptions are ones that presuppose there’s a direction of time. That is, they presuppose that there’s a difference between the glass falling and the glass jumping, and there’s a difference between the glass shattering and the glass recombining. And the difference between those two is always which direction is the future, and which direction is the past.

    So I’m certainly not denying that there is this time-reversibility. But the time-reversibility doesn’t imply that there isn’t a direction of time. It just says that for every event that the laws of physics allow, there is a corresponding event in which various things have been reversed, velocities have been reversed and so on. But in both of these cases, you think of them as allowing a process that’s running forward in time.
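
    The reversibility described here is easy to check numerically. A minimal sketch (my own illustration), using a velocity-Verlet step, which is exactly time-reversible, for an object in free fall:

```python
def verlet_step(x, v, a, dt):
    # One velocity-Verlet step under constant acceleration a;
    # this integrator retraces its path exactly when v is reversed.
    return x + v * dt + 0.5 * a * dt * dt, v + a * dt

g, dt = -9.8, 0.01
x, v = 10.0, 0.0             # a glass at rest, 10 m above the floor
for _ in range(100):         # let it fall for one second
    x, v = verlet_step(x, v, g, dt)

v = -v                       # reverse the velocity
for _ in range(100):         # the same law runs the film backwards
    x, v = verlet_step(x, v, g, dt)
# x is back at 10.0 and v back at 0.0, up to floating-point error
```

    Both runs are equally lawful Newtonian motions; as the interview notes, calling one of them “falling” and the other “jumping back up” already presupposes a direction of time.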

    Now that raises a puzzle: Why do we often see the one kind of thing and not the other kind of thing? And that’s the puzzle about thermodynamics and entropy and so on.

    If time has a direction, is the thermodynamic arrow of time still a problem?

    The problem there isn’t with the arrow. The problem is with understanding why things started out in a low-entropy state. Once you have that it starts in a low-entropy state, the normal thermodynamic arguments lead you to expect that most of the possible initial states are going to yield an increasing entropy. So the question is, why did things start out so low entropy?

    One choice is that the universe is only finite in time and had an initial state, and then there’s the question: “Can you explain why the initial state was low?” which is a subpart of the question, “Can you explain an initial state at all?” It didn’t come out of anything, so what would it mean to explain it in the first place?

    The other possibility is that there was something before the big bang. If you imagine the big bang is the bubbling-off of this universe from some antecedent proto-universe or from chaotically inflating space-time, then there’s going to be the physics of that bubbling-off, and you would hope the physics of the bubbling-off might imply that the bubbles would be of a certain character.

    Given that we still need to explain the initial low-entropy state, why do we need the internal directedness of time? If time didn’t have a direction, wouldn’t specification of a low-entropy state be enough to give it an effective direction?

    If time didn’t have a direction, it seems to me that would make time into just another spatial dimension, and if all we’ve got are spatial dimensions, then it seems to me nothing’s happening in the universe. I can imagine a four-dimensional spatial object, but nothing occurs in it. This is the way people often talk about the, quote, “block universe” as being fixed or rigid or unchanging or something like that, because they’re thinking of it like a four-dimensional spatial object. If you had that, then I don’t see how any initial condition put on it — or any boundary condition put on it; you can’t say “initial” anymore — could create time. How can a boundary condition change the fundamental character of a dimension from spatial to temporal?

    Suppose on one boundary there’s low entropy; from that I then explain everything. You might wonder: “But why that boundary? Why not go from the other boundary, where presumably things are at equilibrium?” The peculiar characteristics at this boundary are not low entropy — there’s high entropy there — but that the microstate is one of the very special ones that leads to a long period of decreasing entropy. Now it seems to me that it has the special microstate because it developed from a low-entropy initial state. But now I’m using “initial” and “final,” and I’m appealing to certain causal notions and productive notions to do the explanatory work. If you don’t have a direction of time to distinguish the initial from the final state and to underwrite these causal locutions, I’m not quite sure how the explanations are supposed to go.

    But all of this seems so — what can I say? It seems so remote from the physical world. We’re sitting here and time is going on, and we know what it means to say that time is going on. I don’t know what it means to say that time really doesn’t pass and it’s only in virtue of entropy increasing that it seems to.

    You don’t sound like much of a fan of the block universe.

    There’s a sense in which I believe a certain understanding of the block universe. I believe that the past is equally real as the present, which is equally real as the future. Things that happened in the past were just as real. Pains in the past were pains, and in the future they’ll be real too, and there was one past and there will be one future. So if that’s all it means to believe in a block universe, fine.

    People often say, “I’m forced into believing in a block universe because of relativity.” The block universe, again, is some kind of rigid structure. The totality of concrete physical reality is specifying that four-dimensional structure and what happens everywhere in it. In Newtonian mechanics, this object is foliated by these planes of absolute simultaneity. And in relativity you don’t have that; you have this light-cone structure instead. So it has a different geometrical character. But I don’t see how that different geometrical character gets rid of time or gets rid of temporality.

    The idea that the block universe is static drives me crazy. What is it to say that something is static? It’s to say that as time goes on, it doesn’t change. But it’s not that the block universe is in time; time is in it. When you say it’s static, it somehow suggests that there is no change, nothing really changes, change is an illusion. It blows your mind. Physics has discovered some really strange things about the world, but it has not discovered that change is an illusion.

    What does it mean for time to pass? Is that synonymous with “time has a direction,” or is there something in addition?

    There’s something in addition. For time to pass means for events to be linearly ordered, by earlier and later. The causal structure of the world depends on its temporal structure. The present state of the universe produces the successive states. To understand the later states, you look at the earlier states and not the other way around. Of course, the later states can give you all kinds of information about the earlier states, and, from the later states and the laws of physics, you can infer the earlier states. But you normally wouldn’t say that the later states explain the earlier states. The direction of causation is also the direction of explanation.

    Am I accurate in getting from you that there’s a generation or production going on here — that there’s a machinery that sits grinding away, one moment giving rise to the next, giving rise to the next?

    Well, that’s certainly a deep part of the picture I have. The machinery is exactly the laws of nature. That gives a constraint on the laws of nature — namely, that they should be laws of temporal evolution. They should be laws that tell you, as time goes on, how will new states succeed old ones. The claim would be there are no fundamental laws that are purely spatial and that where you find spatial regularities, they have temporal explanations.

    Does this lead you to a different view of what a law even is?

    It leads me to a different view than the majority view. I think of laws as having a kind of primitive metaphysical status, that laws are not derivative on anything else. It’s, rather, the other way around: Other things are derivative from, produced by, explained by, derived from the laws operating. And there, the word “operating” has this temporal characteristic.

    Why is yours a minority view? Because it seems to me, if you ask most people on the street what the laws of physics do, they would say, “It’s part of a machinery.”

    I often say my philosophical views are just kind of the naïve views you would have if you took a physics class or a cosmology class and you took seriously what you were being told. In a physics class on Newtonian mechanics, they’ll write down some laws and they’ll say, “Here are the laws of Newtonian mechanics.” That’s really the bedrock from which you begin.

    I don’t think I hold really bizarre views. I take “time doesn’t pass” or “the passage of time is an illusion” to be a pretty bizarre view. Not to say it has to be false, but one that should strike you as not what you thought.

    What does this all have to say about whether time is fundamental or emergent?

    I’ve never been able to quite understand what the emergence of time, in its deeper sense, is supposed to be. The laws are usually differential equations in time. They talk about how things evolve. So if there’s no time, then things can’t evolve. How do we understand — and is the emergence a temporal emergence? It’s like, in a certain phase of the universe, there was no time; and then in other phases, there is time, where it seems as though time emerges temporally out of non-time, which then seems incoherent.

    Where do you stop offering analyses? Where do you stop — where is your spade turned, as Wittgenstein would say? And for me, again, the notion of temporality or of time seems like a very good place to think I’ve hit a fundamental feature of the universe that is not explicable in terms of anything else.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Formerly known as Simons Science News, Quanta Magazine is an editorially independent online publication launched by the Simons Foundation to enhance public understanding of science. Why Quanta? Albert Einstein called photons “quanta of light.” Our goal is to “illuminate science.” At Quanta Magazine, scientific accuracy is every bit as important as telling a good story. All of our articles are meticulously researched, reported, edited, copy-edited and fact-checked.

     
  • richardmitnick 6:57 am on May 16, 2017 Permalink | Reply
    Tags: , , EPR paradox, , , , Quantum entanglement,   

    From COSMOS: “Using Einstein’s ‘spooky action at a distance’ to hear ripples in spacetime” 

    Cosmos Magazine bloc

    COSMOS

    16 May 2017
    Cathal O’Connell

    1
    The new technique will aid in the detection of gravitational waves caused by colliding black holes. Henze / NASA

    In new work that connects two of Albert Einstein’s ideas in a way he could scarcely have imagined, physicists have proposed a way to improve gravitational wave detectors, using the weirdness of quantum physics.

    The new proposal, published in Nature Physics, could double the sensitivity of future detectors listening out for ripples in spacetime caused by catastrophic collisions across the universe.

    When the advanced Laser Interferometer Gravitational-Wave Observatory (LIGO) detected gravitational waves in late 2015 it was the first direct evidence of the gravitational waves Einstein had predicted a century before.


    Caltech/MIT Advanced aLigo Hanford, WA, USA installation


    Caltech/MIT Advanced aLigo detector installation Livingston, LA, USA

    Cornell SXS, the Simulating eXtreme Spacetimes (SXS) project


    Gravitational waves. Credit: MPI for Gravitational Physics/W.Benger-Zib

    Now another of Einstein’s predictions – one he regarded as a failure – could potentially double the sensitivity of LIGO’s successors.

    The story starts with his distaste for quantum theory – or at least for the fundamental fuzziness of all things it seemed to demand.

    Einstein thought the universe would ultimately prove predictable and exact, a clockwork universe rather than one where God “plays dice”. In 1935 he teamed up with Boris Podolsky and Nathan Rosen to publish a paper they thought would be a sort of reductio ad absurdum. They hoped to disprove quantum mechanics by following it to its logical, ridiculous conclusion. Their ‘EPR paradox’ (named for their initials) described the instantaneous influence of one particle on another, what Einstein called “spooky action at a distance” because it seemed at first to be impossible.

    Yet this sally against the foundations of quantum physics failed, as the EPR effect turned out not to be a paradox after all. Quantum entanglement, as it’s now known, has been repeatedly proven to exist, and features in several proposed quantum technologies, including quantum computation and quantum cryptography.

    2
    Artistic rendering of the generation of an entangled pair of photons by spontaneous parametric down-conversion as a laser beam passes through a nonlinear crystal. Inspired by an image in Dance of the Photons by Anton Zeilinger. However, this depiction is from a different angle, to better show the “figure 8” pattern typical of this process, clearly shows that the pump beam continues across the entire image, and better represents that the photons are entangled.
    Date 31 March 2011
    Source Entirely self-generated using computer graphics applications.
    Author J-Wiki at English Wikipedia

    Now we can add gravitational-wave detection to the list.

    LIGO works by measuring the minute wobbling of mirrors as a gravitational wave stretches and squashes spacetime around them. It is insanely sensitive – able to detect wobbling as small as one ten-thousandth the width of a single proton.

    At this level of sensitivity the quantum nature of light becomes a problem. This means the instrument is limited by the inherent fuzziness of the photons bouncing between its mirrors — this quantum noise washes out weak signals.

    To get around this, physicists plan to use so-called squeezed light to dial down the level of quantum noise near the detector (while increasing it elsewhere).

    The new scheme aids this by adding two new, entangled laser beams to the mix. Because of the ‘spooky’ connection between the two entangled beams, their quantum noise is correlated – detecting one allows the prediction of the other.

    This way, the two beams can be used to probe the main LIGO beam, helping nudge it into a squeezed light state. This reduces the noise to a level that standard quantum theory would deem impossible.

    The authors of the new proposal write that it is “appropriate for all future gravitational-wave detectors for achieving sensitivities beyond the standard quantum limit”.

    Indeed, the proposal could as much as double the sensitivity of future detectors.

    Over the next 30 years, astronomers aim to improve the sensitivity of detectors like LIGO 30-fold. At that level, we’d be able to hear all black hole mergers in the observable universe.
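
    Some back-of-envelope arithmetic on these figures (mine, not from the paper): each decibel of effective squeezing lowers amplitude noise by a factor of 10^(dB/20), so a doubling of sensitivity corresponds to about 6 dB; and because the volume surveyed grows as the cube of the detection range, a 30-fold sensitivity improvement would multiply the expected event rate by roughly 27,000.

```python
def sensitivity_gain(squeeze_db):
    # Amplitude (strain) noise falls by 10^(-dB/20), so the weakest
    # detectable signal improves by the inverse factor.
    return 10 ** (squeeze_db / 20)

def volume_gain(sensitivity_factor):
    # Detection range scales with sensitivity; the surveyed volume,
    # and hence the expected event rate, scales with range cubed.
    return sensitivity_factor ** 3

doubling = sensitivity_gain(6.0)   # about 2: 6 dB of squeezing doubles sensitivity
reach = volume_gain(30)            # 27000: volume factor for a 30-fold improvement
```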

    ESA/eLISA, the future of gravitational wave research

    However, along with improved sensitivity, the proposed system would also increase the number of photons lost in the detector. In a perspective piece for Nature Physics, Raffaele Flaminio, a physicist at the National Astronomical Observatory of Japan, points out that the team needs to do more work to understand how this loss will affect the ultimate performance.

    “But the idea of using Einstein’s most famous (mistaken) paradox to improve the sensitivity of gravitational-wave detectors, enabling new tests of his general theory of relativity, is certainly intriguing,” Flaminio writes. “Einstein’s ideas – whether wrong or right – continue to have a strong influence on physics and astronomy.”

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 2:40 pm on April 9, 2017 Permalink | Reply
    Tags: , NIST Team Proves 'Spooky Action at a Distance' is Really Real, , Quantum entanglement   

    From NIST: “NIST Team Proves ‘Spooky Action at a Distance’ is Really Real” 

    NIST

    March 28, 2017

    1
    US physicists have made a breakthrough in proving that quantum mechanics is indeed “spooky” and that trapped ions can be relied on for the quantum entanglement crucial for building super-fast futuristic computers. iStock

    Adding to strong recent demonstrations that particles of light perform what Einstein called “spooky action at a distance,” in which two separated objects can have a connection that exceeds everyday experience, physicists at the National Institute of Standards and Technology (NIST) have confirmed that particles of matter can act really spooky too.

    The NIST team entangled a pair of beryllium ions (charged atoms) in a trap, thus linking their properties, and then separated the pair and performed one of a set of possible manipulations on each ion’s properties before measuring them. Across thousands of runs, the pair’s measurement outcomes in certain cases matched, or in other cases differed, more often than everyday experience would predict. These strong correlations are hallmarks of quantum entanglement.

    What’s more, statistical calculations found the ion pairs displayed a rare high level of spookiness.

    “We are confident that the ions are 67 percent spooky,” said Ting Rei Tan, lead author of a new Physical Review Letters paper about the experiments.

    The experiments were “chained” Bell tests, meaning that they were constructed from a series of possible sets of manipulations on two ions. Unlike earlier experiments, these were enhanced Bell tests in which the number of possible manipulations for each ion was chosen randomly from sets of at least two and as many as 15 choices.

    This method produces stronger statistical results than conventional Bell tests. That’s because as the number of options grows for manipulating each ion, the chance automatically decreases that the ions are behaving by classical, or non-quantum, rules. According to classical rules, all objects must have definite “local” properties and can only influence each other at the speed of light or slower. Bell tests have long been used to show that through quantum physics, objects can break one or both of these rules, demonstrating spooky action.
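
    For concreteness, here is the arithmetic of the simplest, non-chained Bell test, the CHSH inequality. The angle settings and the singlet-state correlation formula below are textbook values, not the NIST team’s:

```python
import math

def correlation(a, b):
    # Quantum correlation of two spin measurements at analyzer angles
    # a and b on a maximally entangled singlet pair: E = -cos(a - b).
    return -math.cos(a - b)

# Standard CHSH angle settings
a0, a1 = 0.0, math.pi / 2
b0, b1 = math.pi / 4, 3 * math.pi / 4

S = abs(correlation(a0, b0) - correlation(a0, b1)
        + correlation(a1, b0) + correlation(a1, b1))
# Any local classical model obeys S <= 2; the quantum value is
# 2 * sqrt(2), roughly 2.83, which is what S evaluates to here.
```

    Chained Bell tests generalize this idea to many settings per particle, which is what drives the probability of a classical explanation toward zero.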

    Conventional Bell tests produce data that are a mixture of local and spooky action. Perfect chained Bell tests can, in theory, prove there is zero chance of local influence. The NIST results got down to a 33 percent chance of local influence—lower than conventional Bell tests can achieve, although not the lowest ever reported for a chained test, Tan said.

    However, the NIST experiment broke new ground by closing two of three “loopholes” that could undermine the results, the only chained Bell test to do this using three or more options for manipulating material particles. The results are good enough to infer the high quality of the entangled states using minimal assumptions about the experiment—a rare achievement, Tan said.

    Last year, a different group of NIST researchers and collaborators closed all three loopholes in conventional Bell tests with particles of light. The new ion experiments confirm again that spooky action is real.

    “Actually, I believed in quantum mechanics before this experiment,” Tan said with a chuckle. “Our motivation was we were trying to use this experiment to showcase how good our trapped ion quantum computing technology is, and what we can do with it.”

    The researchers used the same ion trap setup as in previous quantum computing experiments. With this apparatus, researchers use electrodes and lasers to perform all the basic steps needed for quantum computing, including preparing and measuring ions’ quantum states; transporting ions between multiple trap zones; and creating stable quantum bits (qubits), qubit rotations, and reliable two-qubit logic operations. All these features were needed to conduct the chained Bell tests. Quantum computers are expected to one day solve problems that are currently intractable such as simulating superconductivity (the flow of electricity without resistance) and breaking today’s most popular data encryption codes.

    In NIST’s chained Bell tests, the number of settings (options for different manipulations before measurement) ranged from two to 15. The manipulations acted on the ions’ internal energy states called “spin up” or “spin down.” The researchers used lasers to rotate the spins of the ions by specific angles before the final measurements.

    Researchers performed several thousand runs for each setting and collected two data sets 6 months apart. The measurements determined the ions’ spin states. There were four possible final results: (1) both ions spin up, (2) first ion spin up and second ion spin down, (3) first ion spin down and second ion spin up, or (4) both ions spin down. Researchers measured the states based on how much the ions fluoresced or scattered light—bright was spin up and dark was spin down.
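
    As a sketch of where those four outcome frequencies come from, assuming an idealized singlet state (which may differ from the state the NIST team actually prepared), the joint probabilities depend only on the difference between the two rotation angles:

```python
import math

def outcome_probs(theta_a, theta_b):
    # Singlet-state probabilities for the four joint spin outcomes after
    # rotating the two measurement axes by theta_a and theta_b.
    d = theta_a - theta_b
    same = 0.5 * math.sin(d / 2) ** 2   # up-up (and likewise down-down)
    diff = 0.5 * math.cos(d / 2) ** 2   # up-down (and likewise down-up)
    return {"up-up": same, "up-down": diff, "down-up": diff, "down-down": same}

p = outcome_probs(0.0, 0.0)
# With identical settings a singlet is perfectly anticorrelated:
# p["up-up"] is 0 and p["up-down"] is 0.5.
```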

    The NIST experiment closed the detection and memory loopholes, which might otherwise allow ordinary classical systems to appear spooky.

    The detection loophole is opened if detectors are inefficient and a subset of the data are used to represent the entire data set. The NIST tests closed this loophole because the fluorescence detection was near 100 percent efficient, and the measurement outcomes of every trial in each experiment were recorded and used to calculate results.

    The memory loophole is opened if one assumes that the outcomes of the trials are identically distributed or there are no experimental drifts. Previous chained Bell tests have relied on this assumption, but the NIST test was able to drop it. The NIST team closed the memory loophole by performing thousands of extra trials over many hours with the set of six possible settings, using a randomly chosen setting for each trial and developing a more robust statistical analysis technique.

    The NIST experiments did not close the locality loophole, which is open if it is possible for the choice of settings to be communicated between the ions. To close this loophole, one would need to separate the ions by such a large distance that communication between them would be impossible, even at light speed. In the NIST experiment, the ions had to be positioned close together (at most, 340 micrometers apart) to be entangled and subsequently measured, Tan explained.

    This work was supported by the Office of the Director of National Intelligence (ODNI), Intelligence Advanced Research Projects Activity (IARPA) and the Office of Naval Research.

    Paper: T.R. Tan, Y. Wan, S. Erickson, P. Bierhorst, D. Kienzler, S. Glancy, E. Knill, D. Leibfried and D.J. Wineland. 2017. Chained Bell Inequality Experiment With High-Efficiency Measurements. Physical Review Letters. DOI: 10.1103/PhysRevLett.118.130403

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NIST Campus, Gaithersburg, MD, USA

    NIST Mission, Vision, Core Competencies, and Core Values

    NIST’s mission

    To promote U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology in ways that enhance economic security and improve our quality of life.
    NIST’s vision

    NIST will be the world’s leader in creating critical measurement solutions and promoting equitable standards. Our efforts stimulate innovation, foster industrial competitiveness, and improve the quality of life.
    NIST’s core competencies

    Measurement science
    Rigorous traceability
    Development and use of standards

    NIST’s core values

    NIST is an organization with strong values, reflected both in our history and our current work. NIST leadership and staff will uphold these values to ensure a high performing environment that is safe and respectful of all.

    Perseverance: We take the long view, planning the future with scientific knowledge and imagination to ensure continued impact and relevance for our stakeholders.
    Integrity: We are ethical, honest, independent, and provide an objective perspective.
    Inclusivity: We work collaboratively to harness the diversity of people and ideas, both inside and outside of NIST, to attain the best solutions to multidisciplinary challenges.
    Excellence: We apply rigor and critical thinking to achieve world-class results and continuous improvement in everything we do.

     
  • richardmitnick 9:10 am on March 29, 2017 Permalink | Reply
    Tags: , , Quantum entanglement, Quantum memories, SN   

    From SN: “Millions of atoms entangled in record-breaking quantum tests” 

    ScienceNews bloc

    ScienceNews

    March 27, 2017
    Emily Conover

    Two teams report pushing the spooky effect to larger scales than ever before.

    1
    QUANTUM TANGLES Scientists have pushed quantum entanglement to new levels in two experiments. In one study, researchers linked up millions of atoms, and in another, intertwined hundreds of large groups consisting of billions of atoms. VAlex/Shutterstock

    Researchers from Geneva demonstrated quantum entanglement of 16 million atoms, smashing the previous record of about 3,000 entangled atoms (SN Online: 3/25/2015). Meanwhile, scientists from Canada and the United States used a similar technique to entangle over 200 groups of a billion atoms each. The teams published their results online March 14 in a pair of papers posted at arXiv.org.

    Through quantum entanglement, seemingly independent particles become intertwined. Entangled atoms can no longer be considered separate entities, but make sense only as part of a whole — even though the atoms may be far apart. The process typically operates on small scales, hooking up tiny numbers of particles, but the researchers convinced atoms to defy that tendency.

    “It’s a beautiful result,” says atomic physicist Vladan Vuletić of MIT, who was part of the team that previously demonstrated the 3,000-atom entanglement. Quantum effects typically don’t appear at the large scales that humans deal with every day. Instead, particles’ delicate quantum properties are smeared out through interactions with the messy world. But under the right conditions, quantum effects like entanglement can proliferate. “What this work shows us is that there are certain types of quantum mechanical states that are actually quite robust,” Vuletić says.

    Both teams demonstrated entanglement using devices known as “quantum memories.” Consisting of a crystal interspersed with rare-earth ions — exotic elements like neodymium and thulium — the researchers’ quantum memories are designed to absorb a single photon and re-emit it after a short delay. The single photon is collectively absorbed by many rare-earth ions at once, entangling them. After tens of nanoseconds, the quantum memory emits an echo of the original photon: another photon continuing in the same direction as the photon that entered the crystal.

    By studying the echoes from single photons, the scientists quantified how much entanglement occurred in the crystals. The more reliable the timing and direction of the echo, the more extensive the entanglement was. While the U.S.-Canadian team based its measurement on the timing of the emitted photon, the Swiss team focused on the direction of the photon.

    The quantum memories used to entangle the atoms aren’t new technologies. “The experiments are not complicated,” says physicist Erhan Saglamyurek of the University of Alberta in Canada, who was not involved with the research. Instead, the advance is mainly in the theoretical physics the researchers established to quantify the entanglement that was expected to arise inside such quantum memories. This allowed them to actually prove that such large numbers of particles were entangled, Saglamyurek says.

    Scientists from the two research teams declined to comment, as the papers reporting the work are still undergoing peer review by a journal.

    The results don’t have any obvious practical use. Instead, the work grows out of technology that is being developed for its potential applications: Quantum memories could be used in quantum communication networks to allow for storage of quantum information.

    Eventually, physicists hope to push weird quantum effects to larger and larger scales. For quantum entanglement, “it would be a dream if you could make that visible to the naked eye,” says quantum physicist Jakob Reichel of École Normale Supérieure in Paris. The latest results don’t go that far.

    “It’s not a revolution,” Reichel says. But, “I think it helps us [get] a better feeling for entangled states.”

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 2:41 pm on August 16, 2016 Permalink | Reply
    Tags: , , , Pan Jian-Wei, , Quantum entanglement   

    From Nature- “China’s quantum space pioneer: We need to explore the unknown” 

    Nature Mag
    Nature

    14 January 2016 [Just appeared in social media, probably because of new Chinese spacecraft that went up today.]
    Celeste Biever

    1
    Pan Jian-Wei is leading a satellite project that will probe quantum entanglement. Tengyun Chen

    Physicist Pan Jian-Wei is the architect of the world’s first attempt to set up a quantum communications link between Earth and space — an experiment that is set to begin with the launch of a satellite in June.

    The satellite will test whether the quantum property of entanglement extends over record-breaking distances of more than 1,000 kilometres, by beaming individual entangled photons between space and various ground stations on Earth. It will also test whether it is possible, using entangled photons, to teleport information securely between Earth and space.

    On 8 January, Pan, who works at the University of Science and Technology of China in Hefei, won a major national Chinese science prize (worth 200,000 yuan, or US$30,000) for his contributions to quantum science. He spoke to Nature about why his experiments are necessary and about the changing nature of Chinese space-science missions.

    How are preparations for the launch going?

    We always have two feelings. We feel, “Yes, everything is all right,” and then we are happy and excited. But we have, a couple of times, thought, “Probably our project will collapse and never work.” I think the satellite should be launched on time.

    What technical challenges do you face?

    The satellite will fly so fast (it takes just 90 minutes to orbit Earth) and there will be turbulence and other problems — so the single-photon beam can be seriously affected. Also we have to overcome background noise from sunlight, the Moon and light noise from cities, which are much stronger than our single photon.

    What is the aim of the satellite?

    Our first mission is to see if we can establish quantum key distribution [the encoding and sharing of a secret cryptographic key using the quantum properties of photons] between a ground station in Beijing and the satellite, and between the satellite and Vienna. Then we can see whether it is possible to establish a quantum key between Beijing and Vienna, using the satellite as a relay.
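    The relay step has a simple classical skeleton. In a standard trusted-node scheme (an assumption here; the interview does not spell out the exact protocol), the satellite establishes one key per link and publicly announces their XOR, which lets the two ground stations derive a shared key:

```python
import secrets

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# Stand-ins for keys produced by QKD on each space-to-ground link
k_beijing = secrets.token_bytes(32)  # satellite <-> Beijing link key
k_vienna = secrets.token_bytes(32)   # satellite <-> Vienna link key

# The satellite broadcasts only the XOR of the two link keys;
# the broadcast alone reveals neither key to an eavesdropper.
announcement = xor_bytes(k_beijing, k_vienna)

# Vienna combines its own link key with the broadcast to recover
# Beijing's key, giving the two cities a shared secret.
k_shared = xor_bytes(k_vienna, announcement)
assert k_shared == k_beijing
```

    The catch is that this scheme requires trusting the satellite, which momentarily holds both keys; distributing entanglement directly, as in the second step below, avoids that.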

    The second step will be to perform long-distance entanglement distribution, over about 1,000 kilometres. We have technology on the satellite that can produce pairs of entangled photons. We beam one photon of an entangled pair to a station in Delingha, in Qinghai province, and the other to a station in Lijiang or Nanshan. The distance between the two ground stations is about 1,200 kilometres. Previous tests were done on the order of 100 kilometres.

    Does anyone doubt that entanglement happens no matter how far apart two particles are?

    Not too many people doubt quantum mechanics, but if you want to explore new physics, you must push the limit. Sure, in principle, quantum entanglement can exist for any distance. But we want to see if there is some physical limit. People ask whether there is some sort of boundary between the classical world and the quantum world: we hope to build some sort of macroscopic system in which we can show that the quantum phenomena can still exist.

    In future, we also want to see if it is possible to distribute entanglement between Earth and the Moon. We hope to use the Chang’e programme (China’s Moon programme) to send a quantum satellite to one of the gravitationally-stable points [Lagrangian points] in the Earth-Moon system.

    How does entanglement relate to quantum teleportation?

    We will beam one photon from an entangled pair created at a ground station in Ali, Tibet, to the satellite. The quantum state of a third photon in Ali can then be teleported to the particle in space, using the entangled photon in Ali as a conduit.
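    The protocol Pan outlines is the standard quantum teleportation circuit, which is small enough to simulate exactly. Here is a minimal NumPy sketch (an idealized textbook simulation, not a model of the satellite experiment): an arbitrary qubit state is teleported through a shared entangled pair, with the usual classically communicated correction applied for each of the four possible measurement outcomes.

```python
import numpy as np

# Single-qubit gates
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
# CNOT with qubit 0 as control, qubit 1 as target
CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0],
                 [0, 0, 0, 1], [0, 0, 1, 0]], dtype=complex)

psi = np.array([0.6, 0.8j])            # state to teleport (qubit 0)
bell = np.zeros(4, dtype=complex)      # entangled pair on qubits 1 and 2
bell[0] = bell[3] = 1 / np.sqrt(2)     # (|00> + |11>) / sqrt(2)

state = np.kron(psi, bell)             # full 3-qubit state, 8 amplitudes

# Bell measurement on qubits 0 and 1: CNOT, then Hadamard, then readout
state = np.kron(CNOT, I) @ state
state = np.kron(H, np.eye(4)) @ state

fidelities = []
branches = state.reshape(2, 2, 2)      # axes: qubit 0, qubit 1, qubit 2
for a in range(2):                     # measurement outcome on qubit 0
    for b in range(2):                 # measurement outcome on qubit 1
        remote = branches[a, b] / np.linalg.norm(branches[a, b])
        # Correction on the distant qubit: apply X if b, then Z if a
        corrected = (np.linalg.matrix_power(Z, a)
                     @ np.linalg.matrix_power(X, b) @ remote)
        fidelities.append(abs(np.vdot(psi, corrected)) ** 2)

print(fidelities)  # every branch recovers |psi> with fidelity 1
```

    Note that the state itself never travels: only the two classical measurement bits do, which is why teleportation does not allow faster-than-light signalling.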

    The quantum satellite is a basic-science space mission, as is the Dark Matter Particle Explorer (DAMPE), which China launched in December.

    Are basic-research satellites a new trend for China?

    Yes, and my colleagues at the Chinese Academy of Sciences (CAS) and I helped to force things in this direction. In the past, China had only two organizations that could launch satellites: the army and the Ministry of Industry and Information Technology. So scientists had no way to launch a satellite for scientific research. One exception is the Double Star probe, launched in collaboration with the European Space Agency in 2003 to study magnetic storms on Earth.

    What changed?

    We at CAS really worked hard to convince our government that it is important that we have a way to launch science satellites. In 2011, the central government established the Strategic Priority Program on Space Science, which DAMPE and our quantum satellite are part of. This is a very important step.

    I think China has an obligation not just to do something for ourselves — many other countries have been to the Moon, have done manned spaceflight — but to explore something unknown.

    Will scientists also be involved in China’s programme to build a space station, Tiangong?

    The mechanism to make decisions for which projects can go to the space station has been significantly changed. Originally, the army wanted to take over the responsibility, but it was finally agreed that CAS is the right organization.

    We will have a quantum experiment on the space station and it will make our studies easier because we can from time to time upgrade our experiment (unlike on the quantum satellite). We are quite happy with this mechanism. We need only talk to the leaders of CAS — and they are scientists, so you can communicate with them much more easily.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Nature is a weekly international journal publishing the finest peer-reviewed research in all fields of science and technology on the basis of its originality, importance, interdisciplinary interest, timeliness, accessibility, elegance and surprising conclusions. Nature also provides rapid, authoritative, insightful and arresting news and interpretation of topical and coming trends affecting science, scientists and the wider public.

     
  • richardmitnick 10:52 am on July 13, 2016 Permalink | Reply
    Tags: , Quantum entanglement,   

    From UC Santa Barbara- “Entanglement : Chaos” 

    UC Santa Barbara Name bloc

    July 11, 2016
    Sonia Fernandez

    1
    A quantum qubit array. Photo Credit: Michael Fang/Martinis Lab

    2
    Experimental link between quantum entanglement (left) and classical chaos (right) found using a small quantum computer. Photo Credit: Courtesy Image

    3
    The Google and UCSB researchers, from left to right: Jimmy Chen, John Martinis, Pedram Roushan, Yu Chen, Anthony Megrant and Charles Neill. Photo Credit: Sonia Fernandez

    Using a small quantum system consisting of three superconducting qubits, researchers at UC Santa Barbara and Google have uncovered a link between aspects of classical and quantum physics thought to be unrelated: classical chaos and quantum entanglement. Their findings suggest that it would be possible to use controllable quantum systems to investigate certain fundamental aspects of nature.

    “It’s kind of surprising because chaos is this totally classical concept — there’s no idea of chaos in a quantum system,” said Charles Neill, a researcher in the UCSB Department of Physics and lead author of a paper that appears in Nature Physics. “Similarly, there’s no concept of entanglement within classical systems. And yet it turns out that chaos and entanglement are really very strongly and clearly related.”

    Initiated in the 17th century, classical physics generally examines and describes systems larger than atoms and molecules. It consists of hundreds of years’ worth of study including Newton’s laws of motion, electrodynamics, relativity, thermodynamics as well as chaos theory — the field that studies the behavior of highly sensitive and unpredictable systems. One classic example of a chaotic system is the weather, in which a relatively small change in one part of the system is enough to foil predictions — and vacation plans — anywhere on the globe.

    At smaller size and length scales in nature, however, such as those involving atoms and photons and their behaviors, classical physics falls short. In the early 20th century quantum physics emerged, with its seemingly counterintuitive and sometimes controversial science, including the notions of superposition (the theory that a particle can be located in several places at once) and entanglement (particles that are deeply linked behave as such despite physical distance from one another).

    And so began the continuing search for connections between the two fields.

    All systems are fundamentally quantum systems, according to Neill, but the means of describing, in quantum terms, the chaotic behavior of, say, air molecules in an evacuated room remain limited.

    Imagine taking a balloon full of air molecules, somehow tagging them so you could see them and then releasing them into a room with no air molecules, noted co-author and UCSB/Google researcher Pedram Roushan. One possible outcome is that the air molecules remain clumped together in a little cloud following the same trajectory around the room. And yet, he continued, as we can probably intuit, the molecules will more likely take off in a variety of velocities and directions, bouncing off walls and interacting with each other, resting after the room is sufficiently saturated with them.

    “The underlying physics is chaos, essentially,” he said. The molecules coming to rest — at least on the macroscopic level — is the result of thermalization, or of reaching equilibrium after they have achieved uniform saturation within the system. But in the infinitesimal world of quantum physics, there is still little to describe that behavior. The mathematics of quantum mechanics, Roushan said, do not allow for the chaos described by Newtonian laws of motion.

    To investigate, the researchers devised an experiment using three quantum bits, the basic computational units of the quantum computer. Unlike classical computer bits, which utilize a binary system of two possible states (e.g., zero/one), a qubit can also use a superposition of both states (zero and one) as a single state. Additionally, multiple qubits can entangle, or link so closely that their measurements will automatically correlate. By manipulating these qubits with electronic pulses, Neill caused them to interact, rotate and evolve in the quantum analog of a highly sensitive classical system.

    The result is a map of entanglement entropy of a qubit that, over time, comes to strongly resemble that of classical dynamics — the regions of entanglement in the quantum map resemble the regions of chaos on the classical map. The islands of low entanglement in the quantum map are located in the places of low chaos on the classical map.
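    The entanglement entropy behind such a map can be computed directly for small systems. A minimal NumPy sketch (illustrative only; the experiment's driven three-qubit dynamics are far richer): the von Neumann entropy of one qubit's reduced density matrix, evaluated for an unentangled product state and for a maximally entangled three-qubit GHZ state.

```python
import numpy as np

def entanglement_entropy(state: np.ndarray, qubit: int, n: int) -> float:
    """Von Neumann entropy (in bits) of one qubit's reduced density matrix."""
    psi = state.reshape([2] * n)
    # Move the chosen qubit's axis to the front and flatten the rest,
    # so rows index the qubit and columns index everything else.
    psi = np.moveaxis(psi, qubit, 0).reshape(2, -1)
    rho = psi @ psi.conj().T                  # 2x2 reduced density matrix
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]              # drop zero eigenvalues
    return float(-(evals * np.log2(evals)).sum())

n = 3
# Product state |000>: the qubit shares no entanglement with the rest
product = np.zeros(2 ** n, dtype=complex)
product[0] = 1.0
# GHZ state (|000> + |111>)/sqrt(2): each qubit is maximally entangled
# with the other two
ghz = np.zeros(2 ** n, dtype=complex)
ghz[0] = ghz[-1] = 1 / np.sqrt(2)

print(entanglement_entropy(product, 0, n))  # 0.0 bits
print(entanglement_entropy(ghz, 0, n))      # 1.0 bit
```

    A product state gives zero entropy; the GHZ state gives the maximal one bit for a single qubit, which is what the high-entanglement regions of the quantum map indicate.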

    “There’s a very clear connection between entanglement and chaos in these two pictures,” said Neill. “And, it turns out that thermalization is the thing that connects chaos and entanglement. It turns out that they are actually the driving forces behind thermalization.

    “What we realize is that in almost any quantum system, including on quantum computers, if you just let it evolve and you start to study what happens as a function of time, it’s going to thermalize,” added Neill, referring to the quantum-level equilibration. “And this really ties together the intuition between classical thermalization and chaos and how it occurs in quantum systems that entangle.”

    The study’s findings have fundamental implications for quantum computing. At the level of three qubits, the computation is relatively simple, said Roushan, but as researchers push to build increasingly sophisticated and powerful quantum computers that incorporate more qubits to study highly complex problems that are beyond the ability of classical computing — such as those in the realms of machine learning, artificial intelligence, fluid dynamics or chemistry — a quantum processor optimized for such calculations will be a very powerful tool.

    “It means we can study things that are completely impossible to study right now, once we get to bigger systems,” said Neill.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition
    UC Santa Barbara Seal
    The University of California, Santa Barbara (commonly referred to as UC Santa Barbara or UCSB) is a public research university and one of the 10 general campuses of the University of California system. Founded in 1891 as an independent teachers’ college, UCSB joined the University of California system in 1944 and is the third-oldest general-education campus in the system. The university is a comprehensive doctoral university and is organized into five colleges offering 87 undergraduate degrees and 55 graduate degrees. In 2012, UCSB was ranked 41st among “National Universities” and 10th among public universities by U.S. News & World Report. UCSB houses twelve national research centers, including the renowned Kavli Institute for Theoretical Physics.

     
  • richardmitnick 12:13 pm on June 9, 2016 Permalink | Reply
    Tags: , , Quantum entanglement   

    From MIT Tech Review: “Proof That Quantum Computers Will Change Everything” 

    MIT Technology Review
    MIT Technology Review

    First Demonstration of 10-Photon Quantum Entanglement

    June 9, 2016
    Emerging Technology from the arXiv

    The ability to entangle 10 photons should allow physicists to prove, once and for all, that quantum computers really can do things classical computers cannot.

    Entanglement is the strange phenomenon in which quantum particles become so deeply linked that they share the same existence. Once rare, entangling particles has become routine in labs all over the world.

    Quantum approach to big data. MIT
    Quantum approach to big data. MIT

    Physicists have learned how to create entanglement, transfer it from one particle to another, and even distil it. Indeed, entanglement has become a resource in itself and a crucial one for everything from cryptography and teleportation to computing and simulation.

    But a significant problem remains. To carry out ever more complex and powerful experiments, physicists need to produce entanglement on ever-larger scales by entangling more particles at the same time.

    The current numbers are paltry, though. Photons are the quantum workhorses in most labs and the record for the number of entangled photons is a mere eight, produced at a rate of about nine events per hour.

    Using the same techniques to create a 10-photon count rate would result in only 170 per year, too few even to measure easily. So the prospects of improvement have seemed remote.

    Which is why the work of Xi-Lin Wang and pals at the University of Science and Technology of China in Hefei is impressive. Today, they announce that they’ve produced 10-photon entanglement for the first time, and they’ve done it at a count rate that is three orders of magnitude higher than anything possible until now.

    The biggest bottleneck in entangling photons is the way they are produced. This involves a process called spontaneous parametric down conversion, in which one energetic photon is converted into two photons of lower energy inside a crystal of beta-barium borate. These daughter photons are naturally entangled.

    2
    Experiment setup for generating ten-photon polarization-entangled GHZ, from the science paper

    By zapping the crystal continuously with a laser beam, it is possible to create a stream of entangled photon pairs. However, the rate of down conversion is tiny, just one photon per trillion. So collecting the entangled pairs efficiently is hugely important.

    That’s no easy task, not least because the photons come out of the crystal in slightly different directions, neither of which can be easily predicted. Physicists collect the photons from the two points where they are most likely to appear but most of the entangled photons are lost.

    Xi-Lin and co have tackled this problem by reducing the uncertainty in the photon directions. Indeed, they have been able to shape the beams of entangled photons so that they form two separate circular beams, which can be more easily collected.

    In this way, the team has generated entangled photon pairs at the rate of about 10 million per watt of laser power. This is brighter than previous entanglement generators by a factor of about four. It is this improvement that makes 10-photon entanglement possible.
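    The quoted figures make the scaling easy to check with back-of-envelope arithmetic (a sketch under a naive multiplicative model: the factor-of-four brightness gain is assumed to apply independently to each of the five photon pairs).

```python
# Figures quoted in the article; the simple multiplicative scaling
# model below is an illustrative assumption, not the paper's analysis.
HOURS_PER_YEAR = 24 * 365

rate_8 = 9.0                          # 8-photon events per hour (old record)
rate_10_old = 170.0 / HOURS_PER_YEAR  # projected 10-photon events per hour

# Implied cost of adding one more entangled pair with the old source:
pair_penalty = rate_8 / rate_10_old   # roughly a factor of ~460

# The new source is ~4x brighter per pair; a 10-photon event needs
# 5 pairs, so the naive gain compounds across all five pairs.
brightness_gain = 4.0
n_pairs = 5
rate_10_new = rate_10_old * brightness_gain ** n_pairs  # events per hour

print(f"per-pair penalty: ~{pair_penalty:.0f}x")
print(f"naive new 10-photon rate: ~{rate_10_new:.0f} per hour")
```

    The naive estimate lands near 20 events per hour, above the reported four per hour (losses in the beam-splitter network are not modeled), but it shows why a four-fold per-pair gain compounds to roughly three orders of magnitude, since 4^5 is about 1,000.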

    Their method is to collect five successively generated pairs of entangled photons and pass them into an optical network of four beam splitters. The team then introduces time delays that ensure the photons arrive at the beam splitters simultaneously and so become entangled.

    This creates the 10-photon entangled state, albeit at a rate of about four per hour, which is low but finally measurable for the first time. “We demonstrate, for the first time, genuine and distillable entanglement of 10 single photons,” say Xi-Lin and co.

    That’s impressive work that immediately opens the prospect of a new generation of experiments. The most exciting of these is a technique called boson sampling that physicists hope will prove that quantum computers really are capable of things classical computers are not.

    That’s important because nobody has built a quantum computer more powerful than a pocket calculator (the controversial D-Wave results aside), and nobody is likely to in the near future. So boson sampling is quantum physicists’ greatest hope for showing off the mind-boggling power of quantum computation for the first time.

    Other things also become possible, such as the quantum teleportation of three degrees of freedom in a single photon and multi-photon experiments over very long distances.

    But it is the possibility of boson sampling that will send a frisson through the quantum physics community.

    Ref: arxiv.org/abs/1605.08547: Experimental Ten-Photon Entanglement

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The mission of MIT Technology Review is to equip its audiences with the intelligence to understand a world shaped by technology.

     