Tagged: Quantum Computing

  • richardmitnick 5:46 pm on January 28, 2015 Permalink | Reply
    Tags: Isotropy, Quantum Computing

    From UC Berkeley: “Quantum computer as detector shows space is not squeezed” 

    UC Berkeley

    January 28, 2015
    Robert Sanders

    As the Earth rotates every 24 hours, the orientation of the ions in the quantum computer/detector changes with respect to the Sun’s rest frame. If space were squeezed in one direction and not another, the energies of the electrons in the ions would have shifted with a 12-hour period. Hartmut Häffner image.

    A new experiment by University of California, Berkeley, physicists used partially entangled atoms – identical to the qubits in a quantum computer – to demonstrate more precisely than ever before that space is the same in all directions, to one part in a billion billion.

    The classic experiment that inspired Albert Einstein was performed in Cleveland by Albert Michelson and Edward Morley in 1887 and disproved the existence of an “ether” permeating space through which light was thought to move like a wave through water. What it also proved, said Hartmut Häffner, a UC Berkeley assistant professor of physics, is that space is isotropic and that light travels at the same speed up, down and sideways.

    “Michelson and Morley proved that space is not squeezed,” Häffner said. “This isotropy is fundamental to all physics, including the Standard Model of physics. If you take away isotropy, the whole Standard Model will collapse. That is why people are interested in testing this.”

    The Standard Model of elementary particles, with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    The Standard Model of particle physics describes how all fundamental particles interact, and requires that all particles and fields be invariant under Lorentz transformations, and in particular that they behave the same no matter what direction they move.

    Häffner and his team conducted an experiment analogous to the Michelson-Morley experiment, but with electrons instead of photons of light. In a vacuum chamber he and his colleagues isolated two calcium ions, partially entangled them as in a quantum computer, and then monitored the electron energies in the ions as Earth rotated over 24 hours.

    If space were squeezed in one or more directions, the energy of the electrons would change with a 12-hour period. It didn’t, showing that space is in fact isotropic to one part in a billion billion (10^18), 100 times better than previous experiments involving electrons, and five times better than experiments like Michelson and Morley’s that used light.
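    The analysis idea described above can be sketched in a few lines: record the electron energy over a day and fit the amplitude of a 12-hour sinusoid; a fit consistent with zero bounds the anisotropy. This is an illustrative toy with made-up noise figures, not the group's actual pipeline.

```python
import numpy as np

# Hypothetical sketch: look for a 12-hour modulation in the measured
# electron energy as Earth rotates. All numbers are illustrative.
rng = np.random.default_rng(0)
t = np.linspace(0, 24, 500)                  # 24 hours of samples
noise = rng.normal(0, 1e-19, t.size)         # fractional energy noise
signal = 0.0 * np.sin(2 * np.pi * t / 12.0)  # isotropic space: no 12 h term
data = signal + noise

# Least-squares fit of a 12 h sine/cosine pair to bound the amplitude.
design = np.column_stack([np.sin(2 * np.pi * t / 12.0),
                          np.cos(2 * np.pi * t / 12.0)])
coeffs, *_ = np.linalg.lstsq(design, data, rcond=None)
amplitude = np.hypot(*coeffs)
print(f"fitted 12 h amplitude: {amplitude:.2e}")  # consistent with zero
```

    With real data the quoted bound would come from the fit uncertainty rather than a single amplitude estimate.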

    The results disprove at least one theory that extends the Standard Model by assuming some anisotropy of space, he said.

    Häffner and his colleagues, including former graduate student Thaned Pruttivarasin, now at the Quantum Metrology Laboratory in Saitama, Japan, will report their findings in the Jan. 29 issue of the journal Nature.

    Entangled qubits

    Häffner came up with the idea of using entangled ions to test the isotropy of space while building quantum computers, which involve using ionized atoms as quantum bits, or qubits, entangling their electron wave functions, and forcing them to evolve to do calculations not possible with today’s digital computers. It occurred to him that two entangled qubits could serve as sensitive detectors of slight disturbances in space.

    “I wanted to do the experiment because I thought it was elegant and that it would be a cool thing to apply our quantum computers to a completely different field of physics,” he said. “But I didn’t think we would be competitive with experiments being performed by people working in this field. That was completely out of the blue.”

    He hopes to make more sensitive quantum computer detectors using other ions, such as ytterbium, to gain another 10,000-fold increase in the precision measurement of Lorentz symmetry. He is also exploring with colleagues future experiments to detect the spatial distortions caused by the effects of dark matter particles, which are a complete mystery despite comprising 27 percent of the mass of the universe.

    “For the first time we have used tools from quantum information to perform a test of fundamental symmetries, that is, we engineered a quantum state which is immune to the prevalent noise but sensitive to the Lorentz-violating effects,” Häffner said. “We were surprised the experiment just worked, and now we have a fantastic new method at hand which can be used to make very precise measurements of perturbations of space.”

    Other co-authors are UC Berkeley graduate student Michael Ramm, former UC Berkeley postdoc Michael Hohensee of Lawrence Livermore National Laboratory, and colleagues from the University of Delaware and University of Maryland and institutions in Russia. The work was supported by the National Science Foundation.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Founded in the wake of the gold rush by leaders of the newly established 31st state, the University of California’s flagship campus at Berkeley has become one of the preeminent universities in the world. Its early guiding lights, charged with providing education (both “practical” and “classical”) for the state’s people, gradually established a distinguished faculty (with 22 Nobel laureates to date), a stellar research library, and more than 350 academic programs.

    UC Berkeley Seal

     
  • richardmitnick 2:54 pm on January 22, 2015 Permalink | Reply
    Tags: Quantum Computing

    From Quanta: “Quantum Computing Without Qubits” 

    Quanta Magazine

    January 22, 2015
    Peter Byrne

    A quantum computing pioneer explains why analog simulators may beat out general-purpose digital quantum machines — for now.

    Ivan Deutsch is building quantum computers out of base-16 “qudits,” quantum information units that can occupy any of d states rather than just two.

    For more than 20 years, Ivan H. Deutsch has struggled to design the guts of a working quantum computer. He has not been alone. The quest to harness the computational might of quantum weirdness continues to occupy hundreds of researchers around the world. Why hasn’t there been more to show for their work? As physicists have known since quantum computing’s beginnings, the same characteristics that make quantum computing exponentially powerful also make it devilishly difficult to control. The quantum computing “nightmare” has always been that a quantum computer’s advantages in speed would be wiped out by the machine’s complexity.

    Yet progress is arriving on two main fronts. First, researchers are developing unique quantum error-correction techniques that will help keep quantum processors up and running for the time needed to complete a calculation. Second, physicists are working with so-called analog quantum simulators — machines that can’t act like a general-purpose computer, but rather are designed to explore specific problems in quantum physics. A classical computer would have to run for thousands of years to compute the quantum equations of motion for just 100 atoms. A quantum simulator could do it in less than a second.
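    To see why 100 atoms are out of reach classically, count the amplitudes a brute-force simulation would have to store (a back-of-the-envelope sketch, not a statement about any particular simulator):

```python
# Even treating each atom as a simple two-level system, the joint
# quantum state of 100 atoms has 2**100 complex amplitudes. Storing
# them at 16 bytes apiece (complex128) is beyond any conceivable memory.
n_atoms = 100
amplitudes = 2 ** n_atoms           # ~1.3e30 amplitudes
bytes_needed = amplitudes * 16      # ~2e31 bytes
print(f"{amplitudes:.3e} amplitudes, {bytes_needed:.1e} bytes")
```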

    Quanta Magazine spoke with Deutsch about recent progress in the field, his hopes for the near future, and his own work at the University of New Mexico’s Center for Quantum Information and Control on scaling up binary quantum bits into base-16 digits.

    QUANTA MAGAZINE: Why would a universal quantum machine be so uniquely powerful?

    IVAN DEUTSCH: In a classical computer, information is stored in retrievable bits, binary-coded as 0 or 1. But in a quantum computer, elementary particles inhabit a probabilistic limbo called superposition, where a “qubit” can be coded as both 0 and 1 at the same time.

    Here is the magic: Each qubit can be entangled with the other qubits in the machine. The intertwining of quantum “states” exponentially increases the number of 0s and 1s that can be simultaneously processed by an array of qubits. Machines that can harness the power of quantum logic can deal with exponentially greater levels of complexity than the most powerful classical computer. Problems that would take a state-of-the-art classical computer the age of our universe to solve, can, in theory, be solved by a universal quantum computer in hours.
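    As a toy illustration of the superposition and entanglement Deutsch describes, here is a two-qubit Bell state built with NumPy. The state vector of n qubits holds 2**n amplitudes, which is where the exponential capacity comes from. The gate matrices are standard, but the example itself is ours, not Deutsch's.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],                 # controlled-NOT on
                 [0, 1, 0, 0],                 # basis |00>,|01>,|10>,|11>
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.array([1, 0, 0, 0], dtype=complex)  # start in |00>
state = np.kron(H, I) @ state                  # superpose the first qubit
state = CNOT @ state                           # entangle: (|00>+|11>)/sqrt(2)
print(np.round(state, 3))                      # amplitude 0.707 on |00>, |11>
```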

    What is the quantum computing “nightmare”?

    The same quantum effects that make a quantum computer so blazingly fast also make it incredibly difficult to operate. From the beginning, it has not been clear whether the exponential speed-up provided by a quantum computer would be cancelled out by the exponential complexity needed to protect the system from crashing.

    Is the situation hopeless?

    Not at all. We now know that a universal quantum computer will not require exponential complexity in design. But it is still very hard.

    So what’s the problem, and how do we get around it?

    The hardware problem is that the superposition is so fragile that the random interaction of a single qubit with the molecules composing its immediate surroundings can cause the entire network of entangled qubits to delink or collapse. The ongoing calculation is destroyed as each qubit transforms into a digitized classical bit holding a single value: 0 or 1.

    A test-bed quantum computer that Deutsch is working on with his colleague Poul Jessen at the University of Arizona. Courtesy of Poul Jessen

    In classical computers, we reduce the inevitable loss of information by designing a lot of redundancy into the system. Error-correcting algorithms compare multiple copies of the output. They select the most frequent answer and discard the rest of the data as noise. We cannot do that with a quantum computer, because trying to directly compare qubits will crash the program. But we are gradually learning how to keep systems of entangled qubits from collapsing.
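    The classical redundancy Deutsch describes is just a majority vote over copies of the output; a minimal sketch:

```python
from collections import Counter

# Classical error correction by redundancy: run the computation on
# several copies and keep the most frequent output as the answer.
def majority_vote(outputs):
    """Return the most frequent output; the rest is treated as noise."""
    winner, _ = Counter(outputs).most_common(1)[0]
    return winner

# Three redundant copies, one corrupted by a bit flip:
print(majority_vote(["1011", "1011", "1001"]))  # -> 1011
```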

    The major obstacle, to my mind, is creating error-correcting software that can keep data from being corrupted as the calculation proceeds toward the final readout. The great trick is to design and implement an algorithm that only measures the errors and not the data, thus preserving the superposition that contains the correct answer.

    Will that end the nightmare?

    It turns out that the error correction technique itself introduces errors. One of the most wonderful advances in quantum computing was recognizing that, in theory, we can correct the new errors without requiring 100 percent precision, allowing minor background noise to pollute the calculation as it rolls along. We cannot actually do this — yet. The main reason that we do not have a working universal quantum computer is that we are still experimenting with how to implant such a “fault-tolerant” algorithm into a quantum circuit. Right now we can control 10 qubits reasonably well. But there is no error-correcting technique, to my knowledge, capable of controlling the thousands of qubits needed to construct a universal machine.
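    The "measure the errors, not the data" trick can be previewed classically with the three-bit repetition code: two parity checks locate a single bit flip without ever reading the encoded value. (In the quantum version the parities are extracted with ancilla qubits so the superposition survives; this sketch shows only the classical logic.)

```python
# Three-bit repetition code: the syndrome reveals which bit flipped,
# but says nothing about whether the encoded value is 0 or 1.
def syndrome(bits):
    """Parities of neighboring pairs; identifies the error, not the data."""
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])

def correct(bits):
    s = syndrome(bits)
    flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(s)  # which bit to fix
    bits = list(bits)
    if flip is not None:
        bits[flip] ^= 1
    return tuple(bits)

print(correct((0, 1, 0)))  # middle bit flipped -> (0, 0, 0)
print(correct((1, 1, 1)))  # no error -> (1, 1, 1)
```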

    Is that what you’re working on?

    I study the information processing capabilities of trapped atoms. My colleague Poul Jessen at the University of Arizona and I are pushing the logical power beyond binary-based qubits. For example, what if we can control the superposition of an atom with, say, 16 different energy levels? Using base 16, we can then store what we call a “qudit” in a single atom. That would move us beyond the information processing speed obtainable by a base 2 system, the qubit.
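    The payoff of a base-16 qudit is easy to quantify: a d-level system carries log2(d) bits, so one 16-level atom stores as much as four qubits. A quick illustration (the byte-packing example is our own, not from the interview):

```python
import math

# Information capacity of a single qudit: log2(d) bits per d-level system.
for d in (2, 4, 16):
    print(f"d = {d:2d}: {math.log2(d):.0f} bits per qudit")

# Packing a byte into two base-16 qudit values (two hex digits):
value = 0b10110110                       # 182
digits = [(value >> 4) & 0xF, value & 0xF]
print(digits)                            # [11, 6]
```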

    What other options do we have?

    There may be significant applications available for making non-universal machines: Special purpose, analog quantum simulators designed to solve specific problems, such as how room-temperature superconductors work or how a particular protein folds.

    Are these actually computers?

    They are not universal machines capable of solving any type of question. But say that I want to model global climate change. One way to do this is to write down a mathematical model and then solve the equations on a digital computer. That is typically what climate scientists do. Another way is to try to simulate some aspect of the earth’s climate in a controllable experiment. I can create a simple physical system that obeys the same laws of motion as the system I’m trying to model — mixing nitrogen, oxygen, and hydrogen in a tank, for example. What goes on inside the tank is a real-world computation that tells me something about atmospheric turbulence under certain conditions.

    It is the same with an analog quantum simulator — I use one controllable physical system to simulate another. For example, successfully simulating a superconductor with such a device would reveal the quantum mechanics of high-temperature superconductivity. That could lead to the manufacture of non-brittle superconducting materials for many uses, including building less-fragile quantum circuits. Hopefully, we can learn how to build a robust universal digital computer by experimenting with analog simulators.

    Has anyone built a working analog quantum simulator?

    In 2002, a group at the Max Planck Institute in Germany built an optical lattice — a super-chilled egg carton made of light — and controlled it by pulsing different strengths of laser beams at it. This was a fundamentally analog device designed to obey quantum mechanical equations of motion. The short story is that it successfully simulated how atoms transition between superfluid and insulating phases. That experiment has sparked a lot of research in analog quantum computing with optical lattices and cold atom traps.

    What are the main challenges for these quantum simulators?

    Because the evolution of the analog simulation is not digitized, the software cannot correct the tiny errors that accumulate during the calculation as we could error-correct noise on a universal machine. The analog device must keep a quantum superposition intact long enough for the simulation to run its course without resorting to digital error correction. This is a particular challenge for the analog approach to quantum simulation.

    Is the D-Wave machine a quantum simulator?

    The D-Wave prototype is not a universal quantum computer. It is not digital, nor error-correcting, nor fault tolerant. It is a purely analog machine designed to solve a particular optimization problem. It is unclear if it qualifies as a quantum device.

    Will a scalable quantum computer be deployed during your lifetime?

    We are pushing past the nightmare. Around the world, many university-based labs are working hard to remove or bypass the road block of fault tolerance. Academic researchers are leading the way, intellectually. For example, the groups of Rob Schoelkopf and Michel H. Devoret at Yale are taking superconducting technologies close to fault-tolerance.

    But constructing a working universal digital quantum computer will likely require mobilizing industrial-scale resources. To that end, IBM is exploring quantum computing with superconducting circuits with personnel largely from the Yale groups. Google is working with John Martinis’s lab at the University of California, Santa Barbara. HRL Laboratories is working on silicon-based quantum computing. Lockheed Martin is exploring ion traps. And who knows what the National Security Agency is up to.

    But generally in academic labs, without these industrial-scale resources, scientists are focusing more and more on learning how to control analog quantum simulators. There is short-term fruit to be picked in that arena — both intellectually and in the currency of academics: publishable papers.

    Are you willing to settle for analog?

    I favor pursuing the digital approach full force. Before I die, I would love to see just one universal logical qubit that can be indefinitely error corrected. It would instantly be classified by the government, of course. But I dream on, regardless.

    See the full article here.


    Formerly known as Simons Science News, Quanta Magazine is an editorially independent online publication launched by the Simons Foundation to enhance public understanding of science. Why Quanta? Albert Einstein called photons “quanta of light.” Our goal is to “illuminate science.” At Quanta Magazine, scientific accuracy is every bit as important as telling a good story. All of our articles are meticulously researched, reported, edited, copy-edited and fact-checked.

     
  • richardmitnick 7:23 am on January 9, 2015 Permalink | Reply
    Tags: Quantum Computing

    From MIT: “Toward quantum chips” 


    MIT News

    January 9, 2015
    Larry Hardesty | MIT News Office

    A team of researchers has built an array of light detectors sensitive enough to register the arrival of individual light particles, or photons, and mounted them on a silicon optical chip. Such arrays are crucial components of devices that use photons to perform quantum computations.

    One of the researchers’ new photon detectors, deposited athwart a light channel — or “waveguide” (horizontal black band) — on a silicon optical chip.
    Image courtesy of Nature Communications

    Single-photon detectors are notoriously temperamental: Of 100 deposited on a chip using standard manufacturing techniques, only a handful will generally work. In a paper appearing today in Nature Communications, the researchers at MIT and elsewhere describe a procedure for fabricating and testing the detectors separately and then transferring those that work to an optical chip built using standard manufacturing processes.

    In addition to yielding much denser and larger arrays, the approach also increases the detectors’ sensitivity. In experiments, the researchers found that their detectors were up to 100 times more likely to accurately register the arrival of a single photon than those found in earlier arrays.

    “You make both parts — the detectors and the photonic chip — through their best fabrication process, which is dedicated, and then bring them together,” explains Faraz Najafi, a graduate student in electrical engineering and computer science at MIT and first author on the new paper.

    Thinking small

    According to quantum mechanics, tiny physical particles are, counterintuitively, able to inhabit mutually exclusive states at the same time. A computational element made from such a particle — known as a quantum bit, or qubit — could thus represent zero and one simultaneously. If multiple qubits are “entangled,” meaning that their quantum states depend on each other, then a single quantum computation is, in some sense, like performing many computations in parallel.

    With most particles, entanglement is difficult to maintain, but it’s relatively easy with photons. For that reason, optical systems are a promising approach to quantum computation. But any quantum computer — say, one whose qubits are laser-trapped ions or nitrogen atoms embedded in diamond — would still benefit from using entangled photons to move quantum information around.

    “Because ultimately one will want to make such optical processors with maybe tens or hundreds of photonic qubits, it becomes unwieldy to do this using traditional optical components,” says Dirk Englund, the Jamieson Career Development Assistant Professor in Electrical Engineering and Computer Science at MIT and corresponding author on the new paper. “It’s not only unwieldy but probably impossible, because if you tried to build it on a large optical table, simply the random motion of the table would cause noise on these optical states. So there’s been an effort to miniaturize these optical circuits onto photonic integrated circuits.”

    The project was a collaboration between Englund’s group and the Quantum Nanostructures and Nanofabrication Group, which is led by Karl Berggren, an associate professor of electrical engineering and computer science, and of which Najafi is a member. The MIT researchers were also joined by colleagues at IBM and NASA’s Jet Propulsion Laboratory.

    Relocation

    The researchers’ process begins with a silicon optical chip made using conventional manufacturing techniques. On a separate silicon chip, they grow a thin, flexible film of silicon nitride, upon which they deposit the superconductor niobium nitride in a pattern useful for photon detection. At both ends of the resulting detector, they deposit gold electrodes.

    Then, to one end of the silicon nitride film, they attach a small droplet of polydimethylsiloxane, a type of silicone. They then press a tungsten probe, typically used to measure voltages in experimental chips, against the silicone.

    “It’s almost like Silly Putty,” Englund says. “You put it down, it spreads out and makes high surface-contact area, and when you pick it up quickly, it will maintain that large surface area. And then it relaxes back so that it comes back to one point. It’s like if you try to pick up a coin with your finger. You press on it and pick it up quickly, and shortly after, it will fall off.”

    With the tungsten probe, the researchers peel the film off its substrate and attach it to the optical chip.

    In previous arrays, the detectors registered only 0.2 percent of the single photons directed at them. Even on-chip detectors deposited individually have historically topped out at about 2 percent. But the detectors on the researchers’ new chip got as high as 20 percent. That’s still a long way from the 90 percent or more required for a practical quantum circuit, but it’s a big step in the right direction.
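    The jump from 0.2 percent to 20 percent matters more than it might look, because a multi-photon experiment needs every photon registered, so the success probability scales as the efficiency raised to the photon count. A quick illustration using the figures above (the five-photon count is our arbitrary choice):

```python
# Probability that all n photons of an experiment are detected
# falls off as efficiency ** n. Comparing the quoted efficiencies:
for label, p in [("old array", 0.002), ("individual", 0.02),
                 ("new chip", 0.20), ("practical goal", 0.90)]:
    print(f"{label:>14}: P(all 5 of 5 photons) = {p ** 5:.2e}")
```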

    “This work is a technical tour de force,” says Robert Hadfield, a professor of photonics at the University of Glasgow who was not involved in the research. “There is potential for scale-up to large circuits requiring hundreds of detectors using commercial pick-and-place technology.”

    See the full article here.


     
  • richardmitnick 3:56 pm on December 9, 2014 Permalink | Reply
    Tags: Quantum Computing

    From NOVA: “Is There Anything Beyond Quantum Computing?” 

    PBS NOVA

    Thu, 10 Apr 2014
    Scott Aaronson

    A quantum computer is a device that could exploit the weirdness of the quantum world to solve certain specific problems much faster than we know how to solve them using a conventional computer. Alas, although scientists have been working toward the goal for 20 years, we don’t yet have useful quantum computers. While the theory is now well-developed, and there’s also been spectacular progress on the experimental side, we don’t have any computers that uncontroversially use quantum mechanics to solve a problem faster than we know how to solve the same problem using a conventional computer.

    Credit: Marcin Wichary/Flickr, under a Creative Commons license.

    Yet some physicists are already beginning to theorize about what might lie beyond quantum computers. You might think that this is a little premature, but I disagree. Think of it this way: From the 1950s through the 1970s, the intellectual ingredients for quantum computing were already in place, yet no one broached the idea. It was as if people were afraid to take the known laws of quantum physics and see what they implied about computation. So, now that we know about quantum computing, it’s natural not to want to repeat that mistake! And in any case, I’ll let you in on a secret: Many of us care about quantum computing less for its (real but modest) applications than because it defies our preconceptions about the ultimate limits of computation. And from that standpoint, it’s hard to avoid asking whether quantum computers are “the end of the line.”

    Now, I’m emphatically not asking a philosophical question about whether a computer could be conscious, or “truly know why” it gave the answer it gave, or anything like that. I’m restricting my attention to math problems with definite right answers: e.g., what are the prime factors of a given number? And the question I care about is this: Is there any such problem that couldn’t be solved efficiently by a quantum computer, but could be solved efficiently by some other computer allowed by the laws of physics?

    Here I’d better explain that, when computer scientists say “efficiently,” they mean something very specific: that is, that the amount of time and memory required for the computation grows like the size of the task raised to some fixed power, rather than exponentially. For example, if you want to use a classical computer to find out whether an n-digit number is prime or composite—though not what its prime factors are!—the difficulty of the task grows only like n cubed; this is a problem classical computers can handle efficiently. If that’s too technical, feel free to substitute the everyday meaning of the word “efficiently”! Basically, we want to know which problems computers can solve not only in principle, but in practice, in an amount of time that won’t quickly blow up in our faces and become longer than the age of the universe. We don’t care about the exact speed, e.g., whether a computer can do a trillion steps or “merely” a billion steps per second. What we care about is the scaling behavior: How does the number of steps grow as the number to be factored, the molecule to be simulated, or whatever gets bigger and bigger? Scaling behavior is where we see profound differences between today’s computers and quantum computers; it’s the whole reason why anyone wants to build quantum computers in the first place. So, could there be a physical device whose scaling behavior is better than quantum computers’?
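    The polynomial-versus-exponential distinction above is easy to make concrete (illustrative numbers only):

```python
# Scaling behavior, not raw speed, is what matters: a polynomial
# n**3 step count stays tractable while 2**n blows up.
for n in (10, 50, 100):
    print(f"n={n:3d}: n^3 = {n ** 3:>9,d}   2^n = {2 ** n:.2e}")
```

    At n = 100, a machine doing a trillion steps per second finishes the n-cubed task in a microsecond, while the exponential task outlasts the age of the universe.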

    The Simulation Machine

    A quantum computer, as normally envisioned, would be a very specific kind of quantum system: one built up out of “qubits,” or quantum bits, which exist in “superpositions” of the “0” and “1” states. It’s not immediately obvious that a machine based on qubits could simulate other kinds of quantum-mechanical systems, for example, systems involving particles (like electrons and photons) that can move around in real space. And if there are systems that are hard to simulate on standard, qubit-based quantum computers, then those systems themselves could be thought of as more powerful kinds of quantum computers, which solve at least one problem—the problem of simulating themselves—faster than is otherwise possible.
    “It looks likely that a single device, a quantum computer, would in the future be able to simulate all of quantum chemistry and atomic physics efficiently.”

    So maybe Nature could allow more powerful kinds of quantum computers than the “usual” qubit-based kind? Strong evidence that the answer is “no” comes from work by Richard Feynman in the 1980s, and by Seth Lloyd and many others starting in the 1990s. They showed how to take a wide range of realistic quantum systems and simulate them using nothing but qubits. Thus, just as today’s scientists no longer need wind tunnels, astrolabes, and other analog computers to simulate classical physics, but instead represent airflow, planetary motions, or whatever else they want as zeroes and ones in their digital computers, so too it looks likely that a single device, a quantum computer, would in the future be able to simulate all of quantum chemistry and atomic physics efficiently.

    So far, we’ve been talking about computers that can simulate “standard,” non-relativistic quantum mechanics. If we want to bring special relativity into the picture, we need quantum field theory—the framework for modern particle physics, as studied at colliders like the LHC—which presents a slew of new difficulties. First, many quantum field theories aren’t even rigorously defined: It’s not clear what we should program our quantum computer to simulate. Also, in most quantum field theories, even a vacuum is a complicated object, like an ocean surface filled with currents and waves. In some sense, this complexity is a remnant of processes that took place in the moments after the Big Bang, and it’s not obvious that a quantum computer could efficiently simulate the dynamics of the early universe in order to reproduce that complexity. So, is it possible that a “quantum field theory computer” could solve certain problems more efficiently than a garden-variety quantum computer? If nothing else, then at least the problem of simulating quantum field theory?

    While we don’t yet have full answers to these questions, over the past 15 years we’ve accumulated strong evidence that qubit quantum computers are up to the task of simulating quantum field theory. First, Michael Freedman, Alexei Kitaev, and Zhenghan Wang showed how to simulate a “toy” class of quantum field theories, called topological quantum field theories (TQFTs), efficiently using a standard quantum computer. These theories, which involve only two spatial dimensions instead of the usual three, are called “topological” because in some sense, the only thing that matters in them is the global topology of space. (Interestingly, along with Michael Larsen, these authors also proved the converse: TQFTs can efficiently simulate everything that a standard quantum computer can do.)

    Then, a few years ago, Stephen Jordan, Keith Lee, and John Preskill gave the first detailed, efficient simulation of a “realistic” quantum field theory using a standard quantum computer. (Here, “realistic” means they can simulate a universe containing a specific kind of particle called scalar particles. Hey, it’s a start.) Notably, Jordan and his colleagues solve the problem of creating the complicated vacuum state using an algorithm called “adiabatic state preparation” that, in some sense, mimics the cooling the universe itself underwent shortly after the Big Bang. They haven’t yet extended their work to the full Standard Model of particle physics, but the difficulties in doing so are probably surmountable.

    So, if we’re looking for areas of physics that a quantum computer would have trouble simulating, we’re left with just one: quantum gravity. As you might have heard, quantum gravity has been the white whale of theoretical physicists for almost a century. While there are deep ideas about it (most famously, string theory), no one really knows yet how to combine quantum mechanics with [Albert] Einstein’s general theory of relativity, leaving us free to project our hopes onto quantum gravity—including, if we like, the hope of computational powers beyond those of quantum computers!

    Boot Up Your Time Machine

    But is there anything that could support such a hope? Well, quantum gravity might force us to reckon with breakdowns of causality itself, if closed timelike curves (i.e., time machines to the past) are possible. A time machine is definitely the sort of thing that might let us tackle problems too hard even for a quantum computer, as David Deutsch, John Watrous and I have pointed out. To see why, consider the “Shakespeare paradox,” in which you go back in time and dictate Shakespeare’s plays to him, to save Shakespeare the trouble of writing them. Unlike with the better-known “grandfather paradox,” in which you go back in time and kill your grandfather, here there’s no logical contradiction. The only “paradox,” if you like, is one of “computational effort”: somehow Shakespeare’s plays pop into existence without anyone going to the trouble to write them!
    “A time machine is definitely the sort of thing that might let us tackle problems too hard even for a quantum computer.”

    Using similar arguments, it’s possible to show that, if closed timelike curves exist, then under fairly mild assumptions, one could “force” Nature to solve hard combinatorial problems, just to keep the universe’s history consistent (i.e., to prevent things like the grandfather paradox from arising). Notably, the problems you could solve that way include the NP-complete problems: a class that includes hundreds of problems of practical importance (airline scheduling, chip design, etc.), and that’s believed to require exponential time even on quantum computers.

    Of course, it’s also possible that quantum gravity will simply tell us that closed timelike curves can’t exist—and maybe the computational superpowers they would give us if they did exist are evidence that they must be forbidden!

    Simulating Quantum Gravity

    Going even further out on a limb, the famous mathematical physicist Roger Penrose has speculated that quantum gravity is literally impossible to simulate using either an ordinary computer or a quantum computer, even with unlimited time and memory at your disposal. That would put simulating quantum gravity into a class of problems studied by the logicians Alan Turing and Kurt Gödel in the 1930s, which includes problems way harder than even the NP-complete problems—like determining whether a given computer program will ever stop running (the “halting problem”). Penrose further speculates that the human brain is sensitive to quantum gravity effects, and that this gives humans the ability to solve problems that are fundamentally unsolvable by computers. However, virtually no other expert in the relevant fields agrees with the arguments that lead Penrose to this provocative position.

    What’s more, there are recent developments in quantum gravity that seem to support the opposite conclusion: that is, they hint that a standard quantum computer could efficiently simulate even quantum-gravitational processes, like the formation and evaporation of black holes. Most notably, the AdS/CFT correspondence, which emerged from string theory, posits a “duality” between two extremely different-looking kinds of theories. On one side of the duality is AdS (Anti de Sitter): a theory of quantum gravity for a hypothetical universe that has a negative cosmological constant, effectively causing the whole universe to be surrounded by a reflecting boundary. On the other side is a CFT (Conformal Field Theory): an “ordinary” quantum field theory, without gravity, that lives only on the boundary of the AdS space. The AdS/CFT correspondence, for which there’s now overwhelming evidence (though not yet a proof), says that any question about what happens in the AdS space can be translated into an “equivalent” question about the CFT, and vice versa.
    “Even if a quantum gravity theory seems ‘wild’—even if it involves nonlocality, wormholes, and other exotica—there might be a dual description of the theory that’s more ‘tame,’ and that’s more amenable to simulation by a quantum computer.”

    This suggests that, if we wanted to simulate quantum gravity phenomena in AdS space, we might be able to do so by first translating to the CFT side, then simulating the CFT on our quantum computer, and finally translating the results back to AdS. The key point here is that, since the CFT doesn’t involve gravity, the difficulties of simulating it on a quantum computer are “merely” the relatively prosaic difficulties of simulating quantum field theory on a quantum computer. More broadly, the lesson of AdS/CFT is that, even if a quantum gravity theory seems “wild”—even if it involves nonlocality, wormholes, and other exotica—there might be a dual description of the theory that’s more “tame,” and that’s more amenable to simulation by a quantum computer. (For this to work, the translation between the AdS and CFT descriptions also needs to be computationally efficient—and it’s possible that there are situations where it isn’t.)

    The Black Hole Problem

    So, is there any other hope for doing something in Nature that a quantum computer couldn’t efficiently simulate? Let’s circle back from the abstruse reaches of string theory to some much older ideas about how to speed up computation. For example, wouldn’t it be great if you could program your computer to do the first step of a computation in one second, the second step in half a second, the third step in a quarter second, the fourth step in an eighth second, and so on—halving the amount of time with each additional step? If so, then much like in Zeno’s paradox, your computer would have completed infinitely many steps in a mere two seconds!
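The arithmetic behind this Zeno-style trick is just a geometric series. A quick sketch (my own illustration, not from the article) confirms the two-second total:

```python
# Each step of the "Zeno computer" takes half as long as the one before:
# 1 + 1/2 + 1/4 + ... converges to 2, so even infinitely many steps
# would finish within two seconds.
total = 0.0
step_time = 1.0
for step in range(60):   # 60 terms already exhausts double precision
    total += step_time
    step_time /= 2.0
print(total)  # → 2.0
```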

    Or, what if you could leave your computer on Earth, working on some incredibly hard calculation, then board a spaceship, accelerate to close to the speed of light, then decelerate and return to Earth? If you did this, then Einstein’s special theory of relativity firmly predicts that, depending on just how close you got to the speed of light, millions or even trillions of years would have elapsed in Earth’s frame of reference. Presumably, civilization would have collapsed and all your friends would be long dead. But if, hypothetically, you could find your computer in the ruins and it was still running, then you could learn the answer to your hard problem!
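The dilation factor here is the Lorentz gamma, 1/√(1 − (v/c)²). A small sketch of the calculation (my own example; `eps` is an assumed gap 1 − v/c, not a figure from the article):

```python
import math

def gamma_from_eps(eps):
    """Lorentz factor for a ship at v = (1 - eps) * c.
    Computing 1 - (v/c)^2 as eps * (2 - eps) avoids the floating-point
    cancellation you'd get from squaring a number extremely close to 1."""
    return 1.0 / math.sqrt(eps * (2.0 - eps))

def earth_years_elapsed(ship_years, eps):
    """Years that pass on Earth while `ship_years` pass aboard the ship."""
    return ship_years * gamma_from_eps(eps)

# One year aboard a ship at v = (1 - 1e-13) * c:
print(f"{earth_years_elapsed(1.0, 1e-13):,.0f} Earth years")  # ~2.2 million
```

Pushing `eps` smaller drives the elapsed Earth time toward the trillions of years the text mentions.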

    We’re now faced with a puzzle: What goes wrong if you try to accelerate computation using these sorts of tricks? The key factor is energy. Even in real life, there are hobbyists who “overclock” their computers, or run them faster than the recommended speed; for example, they might run a 1000 MHz chip at 2000 MHz. But the well-known danger in doing this is that your microchip might overheat and melt! Indeed, it’s precisely because of the danger of overheating that your computer has a fan. Now, the faster you run your computer, the more cooling you need—that’s why many supercomputers are cooled using liquid nitrogen. But cooling takes energy. So, is there some fundamental limit here? It turns out that there is. Suppose you wanted to cool your computer so completely that it could perform about 10^43 operations per second—that is, about one operation per Planck time (where a Planck time, ~10^-43 seconds, is the smallest measurable unit of time in quantum gravity). To run your computer that fast, you’d need so much energy concentrated in so small a space that, according to general relativity, your computer would collapse into a black hole!
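The Planck time falls out of combining Planck’s constant, Newton’s G, and the speed of light. A back-of-the-envelope check, using standard CODATA values for the constants:

```python
import math

# Planck time t_P = sqrt(hbar * G / c^5).
hbar = 1.054571817e-34   # reduced Planck constant, J*s
G = 6.67430e-11          # Newton's gravitational constant, m^3 kg^-1 s^-2
c = 2.99792458e8         # speed of light, m/s

t_planck = math.sqrt(hbar * G / c**5)
ops_limit = 1.0 / t_planck   # one operation per Planck time

print(f"Planck time: {t_planck:.2e} s")             # ~5.39e-44 s
print(f"ops/second at that pace: {ops_limit:.1e}")  # ~1.9e+43
```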

    And the story is similar for the “relativity computer.” There, the more you want to speed up your computer, the closer you have to accelerate your spaceship to the speed of light. But the more you accelerate the spaceship, the more energy you need, with the energy diverging to infinity as your speed approaches that of light. At some point, your spaceship will become so energetic that it, too, will collapse into a black hole.

    Now, how do we know that collapse into a black hole is inevitable—that there’s no clever way to avoid it? The calculation combines Newton’s gravitational constant G with Planck’s constant h, the central constant of quantum mechanics. That means one is doing a quantum gravity calculation! I’ll end by letting you savor the irony: Even as some people hope that a quantum theory of gravity might let us surpass the known limits of quantum computers, quantum gravity might play just the opposite role, enforcing those limits.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    NOVA is the highest rated science series on television and the most watched documentary series on public television. It is also one of television’s most acclaimed series, having won every major television award, most of them many times over.

     
  • richardmitnick 8:20 am on October 17, 2014 Permalink | Reply
    Tags: , , Quantum Computing,   

    From MIT: “Superconducting circuits, simplified” 


    MIT News

    October 17, 2014
    Larry Hardesty | MIT News Office

    Computer chips with superconducting circuits — circuits with zero electrical resistance — would be 50 to 100 times as energy-efficient as today’s chips, an attractive trait given the increasing power consumption of the massive data centers that power the Internet’s most popular sites.

    Shown here is a square-centimeter chip containing the nTron adder, which performed the first computation using the researchers’ new superconducting circuit. Photo: Adam N. McCaughan

    Superconducting chips also promise greater processing power: Superconducting circuits that use so-called Josephson junctions have been clocked at 770 gigahertz, or 500 times the speed of the chip in the iPhone 6.

    But Josephson-junction chips are big and hard to make; most problematic of all, they use such minute currents that the results of their computations are difficult to detect. For the most part, they’ve been relegated to a few custom-engineered signal-detection applications.

    In the latest issue of the journal Nano Letters, MIT researchers present a new circuit design that could make simple superconducting devices much cheaper to manufacture. And while the circuits’ speed probably wouldn’t top that of today’s chips, they could solve the problem of reading out the results of calculations performed with Josephson junctions.

    The MIT researchers — Adam McCaughan, a graduate student in electrical engineering, and his advisor, professor of electrical engineering and computer science Karl Berggren — call their device the nanocryotron, after the cryotron, an experimental computing circuit developed in the 1950s by MIT professor Dudley Buck. The cryotron was briefly the object of a great deal of interest — and federal funding — as the possible basis for a new generation of computers, but it was eclipsed by the integrated circuit.

    “The superconducting-electronics community has seen a lot of devices come and go, without any real-world application,” McCaughan says. “But in our paper, we have already applied our device to applications that will be highly relevant to future work in superconducting computing and quantum communications.”

    Superconducting circuits are used in light detectors that can register the arrival of a single light particle, or photon; that’s one of the applications in which the researchers tested the nanocryotron. McCaughan also wired together several of the circuits to produce a fundamental digital-arithmetic component called a half-adder.
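In Boolean terms, a half-adder is simply an XOR (for the sum bit) plus an AND (for the carry bit); a minimal sketch of the logic the nTron circuits realize (the device physics, of course, is the hard part):

```python
def half_adder(a, b):
    """Add two bits: the sum bit is XOR, the carry bit is AND."""
    return a ^ b, a & b

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> sum={s}, carry={c}")
```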

    Resistance is futile

    Superconductors have no electrical resistance, meaning that electrons can travel through them completely unimpeded. Even the best standard conductors — like the copper wires in phone lines or conventional computer chips — have some resistance; overcoming it requires operational voltages much higher than those that can induce current in a superconductor. Once electrons start moving through an ordinary conductor, they still collide occasionally with its atoms, releasing energy as heat.

    Superconductors are ordinary materials cooled to extremely low temperatures, which damps the vibrations of their atoms, letting electrons zip past without collision. Berggren’s lab focuses on superconducting circuits made from niobium nitride, which has the relatively high operating temperature of 16 Kelvin, or minus 257 degrees Celsius. That’s achievable with liquid helium, which, in a superconducting chip, would probably circulate through a system of pipes inside an insulated housing, like Freon in a refrigerator.

    A liquid-helium cooling system would of course increase the power consumption of a superconducting chip. But given that the starting point is about 1 percent of the energy required by a conventional chip, the savings could still be enormous. Moreover, superconducting computation would let data centers dispense with the cooling systems they currently use to keep their banks of servers from overheating.

    Cheap superconducting circuits could also make it much more cost-effective to build single-photon detectors, an essential component of any information system that exploits the computational speedups promised by quantum computing.

    Engineered to a T

    The nanocryotron — or nTron — consists of a single layer of niobium nitride deposited on an insulator in a pattern that looks roughly like a capital “T.” But where the base of the T joins the crossbar, it tapers to only about one-tenth its width. Electrons sailing unimpeded through the base of the T are suddenly crushed together, producing heat, which radiates out into the crossbar and destroys the niobium nitride’s superconductivity.

    A current applied to the base of the T can thus turn off a current flowing through the crossbar. That makes the circuit a switch, the basic component of a digital computer.

    After the current in the base is turned off, the current in the crossbar will resume only after the junction cools back down. Since the superconductor is cooled by liquid helium, that doesn’t take long. But the circuits are unlikely to top the 1 gigahertz typical of today’s chips. Still, they could be useful for some lower-end applications where speed isn’t as important as energy efficiency.

    Their most promising application, however, could be in making calculations performed by Josephson junctions accessible to the outside world. Josephson junctions use tiny currents that until now have required sensitive lab equipment to detect. They’re not strong enough to move data to a local memory chip, let alone to send a visual signal to a computer monitor.

    In experiments, McCaughan demonstrated that currents even smaller than those found in Josephson-junction devices were adequate to switch the nTron from a conductive to a nonconductive state. And while the current in the base of the T can be small, the current passing through the crossbar could be much larger — large enough to carry information to other devices on a computer motherboard.

    “I think this is a great device,” says Oleg Mukhanov, chief technology officer of Hypres, a superconducting-electronics company whose products rely on Josephson junctions. “We are currently looking very seriously at the nTron for use in memory.”

    “There are several attractions of this device,” Mukhanov says. “First, it’s very compact, because after all, it’s a nanowire. One of the problems with Josephson junctions is that they are big. If you compare them with CMOS transistors, they’re just physically bigger. The second is that Josephson junctions are two-terminal devices. Semiconductor transistors are three-terminal, and that’s a big advantage. Similarly, nTrons are three-terminal devices.”

    “As far as memory is concerned,” Mukhanov adds, “one of the features that also attracts us is that we plan to integrate it with magnetoresistive spintronic devices, mRAM, magnetic random-access memories, at room temperature. And one of the features of these devices is that they are high-impedance. They are in the kilo-ohms range, and if you look at Josephson junctions, they are just a few ohms. So there is a big mismatch, which makes it very difficult from an electrical-engineering standpoint to match these two devices. NTrons are nanowire devices, so they’re high-impedance, too. They’re naturally compatible with the magnetoresistive elements.”

    McCaughan and Berggren’s research was funded by the National Science Foundation and by the Director of National Intelligence’s Intelligence Advanced Research Projects Activity.

    See the full article here.

    ScienceSprings relies on technology from

    MAINGEAR computers

    Lenovo

    Dell

     
  • richardmitnick 3:55 pm on October 3, 2014 Permalink | Reply
    Tags: , , , Quantum Computing,   

    From Princeton via Huff Post: “Physicists Observe New Particle That’s Also Its Own ‘Antiparticle'” 

    Princeton University

    Huffington Post
    10/03/2014
    Macrina Cooper-White

    After decades of searching, physicists at Princeton University say they’ve observed an elusive particle that behaves both like matter and antimatter.

    Yes, the discovery is an exciting step forward for particle physics, but it may also help advance the creation of powerful quantum computers.

    In the early 20th century, as quantum theory emerged, scientists predicted that most common particles, like electrons, had mysterious “antimatter” counterparts with the same mass and opposite charge. Scientists even thought that if a particle came in contact with its “antiparticle,” the two would annihilate one another.

    Italian physicist Ettore Majorana first hypothesized in 1937 that one particle — called the “Majorana fermion” — could serve as its very own antimatter particle, and scientists have been searching for that particle ever since.

    An experiment revealing the atomic structure of an iron wire on a lead surface. The zoomed-in portion of the image depicts the probability of the wire containing the Majorana fermion. The image pinpoints the particle to the end of the wire, which is where it had been predicted to be, based on years of theoretical calculations.

    For their study, the Princeton researchers designed a simple experiment to observe what they call “emergent particles” which can be found within a material — rather than in the vacuum of a giant collider, where the Higgs boson was discovered.

    “This is more exciting and can actually be practically beneficial,” Ali Yazdani, a physics professor at the university who led the research team, said in a written statement, “because it allows scientists to manipulate exotic particles for potential applications, such as quantum computing.”

    The researchers placed a thin, long chain of pure magnetic iron atoms on a superconductor made of lead. Then they cooled the materials to -457 degrees Fahrenheit and peered at them through a two-story-tall scanning-tunneling microscope. Just check out the video above.

    What did the researchers find? An electrically neutral signal at the ends of the iron wires, which is considered to be the “key signature” of the elusive Majorana fermion. The researchers say the fermion’s observed properties make it a good candidate for building quantum bits in computers.

    “One of the first steps in making a quantum computer is to make a quantum bit,” Yazdani said in an email to the Huffington Post, “The ideal quantum bit should [be] one that you can control but it does not interact with its environment, so as to be changed.”

    While other scientists find the study intriguing, many believe further research should be conducted to confirm the results.

    “We should keep in mind possible alternative explanations — even if there are no immediately obvious candidates,” Jason Alicea, a physicist at the California Institute of Technology, who did not participate in this research, told Scientific American.

    The research was published online on Oct. 2 in the journal Science.

    See the full article, with video, here.

    About Princeton: Overview

    Princeton University is a vibrant community of scholarship and learning that stands in the nation’s service and in the service of all nations. Chartered in 1746, Princeton is the fourth-oldest college in the United States. Princeton is an independent, coeducational, nondenominational institution that provides undergraduate and graduate instruction in the humanities, social sciences, natural sciences and engineering.

    As a world-renowned research university, Princeton seeks to achieve the highest levels of distinction in the discovery and transmission of knowledge and understanding. At the same time, Princeton is distinctive among research universities in its commitment to undergraduate teaching.

    Today, more than 1,100 faculty members instruct approximately 5,200 undergraduate students and 2,600 graduate students. The University’s generous financial aid program ensures that talented students from all economic backgrounds can afford a Princeton education.


     
  • richardmitnick 7:27 am on September 30, 2014 Permalink | Reply
    Tags: , , Quantum Computing   

    From physicsworld: “Quantum data are compressed for the first time” 

    physicsworld
    physicsworld.com

    Sep 29, 2014
    Jon Cartwright

    A quantum analogue of data compression has been demonstrated for the first time in the lab. Physicists working in Canada and Japan have squeezed quantum information contained in three quantum bits (qubits) into two qubits. The technique could pave the way for a more effective use of quantum memories and offers a new method of testing quantum logic devices.

    Three for two: physicists have compressed quantum data

    Compression of classical data is a simple procedure that allows a string of information to take up less space in a computer’s memory. Given an unadulterated string of, for example, 1000 binary values, a computer could simply record the frequency of the 1s and 0s, which might require just a dozen or so binary values. Recording the information about the order of those 1s and 0s would require a slightly longer string, but it would probably still be shorter than the original sequence.
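The frequency idea above is the heart of classical compression: Shannon’s entropy of the measured bit frequency bounds how few bits are needed. A toy illustration (synthetic biased data, my own example rather than anything from the article):

```python
import math
import random

random.seed(0)
# A biased 1000-bit string: roughly 90% ones.
bits = [1 if random.random() < 0.9 else 0 for _ in range(1000)]

p = sum(bits) / len(bits)  # measured frequency of 1s
# Shannon entropy, in bits per symbol, bounds the compressed size.
entropy = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))
print(f"frequency of 1s: {p:.3f}")
print(f"~{entropy * len(bits):.0f} bits suffice vs. 1000 raw bits")
```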

    Quantum data are rather different, and it is not possible to simply determine the frequencies of 1s and 0s in a string of quantum information. The problem comes down to the peculiar nature of qubits, which, unlike classical bits, can be a 1, a 0 or some “superposition” of both values. A user can indeed perform a measurement to record the “one-ness” of a qubit, but such a measurement would destroy any information about that qubit’s “zero-ness”. What is more, if a user then measures a second qubit prepared in an identical way, he or she might find a different value for its “one-ness” – because qubits do not specify unique values but only the probability of measurement outcomes. This latter trait would seem to preclude the possibility of compressing even identical qubits, because there is no way of predicting what classical values they will ultimately manifest as.
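The point about identically prepared qubits yielding different outcomes can be caricatured with a classical random draw. In this sketch (my own toy model, with an arbitrary example state), each “measurement” returns a single classical bit and destroys the state it probed, so no one draw reveals the underlying probability:

```python
import random

random.seed(1)

def measure(prob_one):
    """Collapse a qubit with P(|1>) = prob_one into a single classical bit,
    destroying the superposition that was measured."""
    return 1 if random.random() < prob_one else 0

# Twenty qubits, all prepared identically with "one-ness" 0.7:
outcomes = [measure(0.7) for _ in range(20)]
print(outcomes)  # a mix of 0s and 1s; no single draw reveals 0.7
```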

    A way forward

    In 2010 physicists Martin Plesch and Vladimír Bužek of the Slovak Academy of Sciences in Bratislava realized that, while it is not possible to compress quantum data to the same extent as classical data, some compression can be achieved. As long as the quantum nature of a string of identically prepared qubits is preserved, they said, it should be possible to feed them through a circuit that records only their probabilistic natures. Such a recording would require exponentially fewer qubits, and would allow a user to easily store the quantum information in a quantum memory, which is currently a limited resource. Then at some later time, the user could decide what type of measurement to perform on the data.

    “This way you can store the qubits until you know what question you’re interested in,” says Aephraim Steinberg of the University of Toronto. “Then you can measure x if you want to know x; and if you want to know z, you can measure z – whereas if you don’t store the qubits, you have to choose which measurements you want to do right now.”

    Now, Steinberg and his colleagues have demonstrated working quantum compression for the first time with photon qubits. Because photon qubits are currently very difficult to process in quantum logic gates, Steinberg’s group resorted to a technique known as measurement-based quantum computing, in which the outcomes of a logic gate are “built in” to qubits that are prepared and entangled at the same source. The details are complex, but the researchers managed to transfer the probabilistic nature of three qubits into two qubits.

    A nice trick

    Plesch says that this is the first time that compression of quantum data has been realized, and believes Steinberg and colleagues have come up with a “nice trick” to make it work. “This approach is, however, hard to scale to a larger number of qubits,” Plesch adds. “Having said that, I consider the presented work as a very nice proof-of-concept for the future.”

    Steinberg thinks that larger-scale quantum compression might be possible with different types of qubits, such as trapped ions, which have so far proved easier to manage in large ensembles. A practical use for the process would be in testing quantum devices using a process known as quantum tomography, in which many identically prepared qubits are sent through a quantum device to check that it is functioning properly. With quantum compression, says Steinberg, one could perform the tomography experiment and then decide later what aspect of the device to test.

    But in the meantime, says Steinberg, the demonstration provides another perspective on the strangeness of the quantum world. “If you had a book filled just with ones, you could simply tell your friend that it’s a book filled with ones,” he says. “But quantum mechanically, that’s already not true. Even if I gave you a billion identically prepared photons, you could get different information from each one. To describe their states completely would require infinite classical information.”

    The research will be described in Physical Review Letters.

    See the full article here.

    PhysicsWorld is a publication of the Institute of Physics. The Institute of Physics is a leading scientific society. We are a charitable organisation with a worldwide membership of more than 50,000, working together to advance physics education, research and application.

    We engage with policymakers and the general public to develop awareness and understanding of the value of physics and, through IOP Publishing, we are world leaders in professional scientific communications.
    IOP Institute of Physics


     
  • richardmitnick 4:41 pm on September 20, 2014 Permalink | Reply
    Tags: , , , Quantum Computing, ,   

    From Princeton: “‘Solid’ light could compute previously unsolvable problems” 

    Princeton University

    Sep 08, 2014
    John Sullivan

    Researchers at Princeton University have begun crystallizing light as part of an effort to answer fundamental questions about the physics of matter.

    The researchers are not shining light through crystal – they are transforming light into crystal. As part of an effort to develop exotic materials such as room-temperature superconductors, the researchers have locked together photons, the basic element of light, so that they become fixed in place.

    “It’s something that we have never seen before,” said Andrew Houck, an associate professor of electrical engineering and one of the researchers. “This is a new behavior for light.”

    The results raise intriguing possibilities for a variety of future materials. But the researchers also intend to use the method to address questions about the fundamental study of matter, a field called condensed matter physics.

    “We are interested in exploring – and ultimately controlling and directing – the flow of energy at the atomic level,” said Hakan Türeci, an assistant professor of electrical engineering and a member of the research team. “The goal is to better understand current materials and processes and to evaluate materials that we cannot yet create.”

    The team’s findings, reported online on Sept. 8 in the journal Physical Review X, are part of an effort to answer fundamental questions about atomic behavior by creating a device that can simulate the behavior of subatomic particles. Such a tool could be an invaluable method for answering questions about atoms and molecules that are not answerable even with today’s most advanced computers.

    In part, that is because current computers operate under the rules of classical mechanics, which is a system that describes the everyday world containing things like bowling balls and planets. But the world of atoms and photons obeys the rules of quantum mechanics, which include a number of strange and very counterintuitive features. One of these odd properties is called “entanglement” in which multiple particles become linked and can affect each other over long distances.

    The difference between the quantum and classical rules limits a standard computer’s ability to efficiently study quantum systems. Because the computer operates under classical rules, it simply cannot grapple with many of the features of the quantum world. Scientists have long believed that a computer based on the rules of quantum mechanics could allow them to crack problems that are currently unsolvable. Such a computer could answer the questions about materials that the Princeton team is pursuing, but building a general-purpose quantum computer has proven to be incredibly difficult and requires further research.

    Another approach, which the Princeton team is taking, is to build a system that directly simulates the desired quantum behavior. Although each machine is limited to a single task, it would allow researchers to answer important questions without having to solve some of the more difficult problems involved in creating a general-purpose quantum computer. In a way, it is like answering questions about airplane design by studying a model airplane in a wind tunnel – solving problems with a physical simulation rather than a digital computer.

    In addition to answering questions about currently existing material, the device also could allow physicists to explore fundamental questions about the behavior of matter by mimicking materials that only exist in physicists’ imaginations.

    To build their machine, the researchers created a structure made of superconducting materials that contains 100 billion atoms engineered to act as a single “artificial atom.” They placed the artificial atom close to a superconducting wire containing photons.

    By the rules of quantum mechanics, the photons on the wire inherit some of the properties of the artificial atom – in a sense linking them. Normally photons do not interact with each other, but in this system the researchers are able to create new behavior in which the photons begin to interact in some ways like particles.

    “We have used this blending together of the photons and the atom to artificially devise strong interactions among the photons,” said Darius Sadri, a postdoctoral researcher and one of the authors. “These interactions then lead to completely new collective behavior for light – akin to the phases of matter, like liquids and crystals, studied in condensed matter physics.”

    Türeci said that scientists have explored the nature of light for centuries, discovering that sometimes light behaves like a wave and other times like a particle. In the lab at Princeton, the researchers have engineered a new behavior.

    “Here we set up a situation where light effectively behaves like a particle in the sense that two photons can interact very strongly,” Türeci said. “In one mode of operation, light sloshes back and forth like a liquid; in the other, it freezes.”

    The current device is relatively small, with only two sites where an artificial atom is paired with a superconducting wire. But the researchers say that by expanding the device and the number of interactions, they can increase their ability to simulate more complex systems – growing from the simulation of a single molecule to that of an entire material. In the future, the team plans to build devices with hundreds of sites with which they hope to observe exotic phases of light such as superfluids and insulators.

    “There is a lot of new physics that can be done even with these small systems,” said James Raftery, a graduate student in electrical engineering and one of the authors. “But as we scale up, we will be able to tackle some really interesting questions.”

    Besides Houck, Türeci, Sadri and Raftery, the research team included Sebastian Schmidt, a senior researcher at the Institute for Theoretical Physics at ETH Zurich, Switzerland. Support for the project was provided by: the Eric and Wendy Schmidt Transformative Technology Fund; the National Science Foundation; the David and Lucile Packard Foundation; the U.S. Army Research Office; and the Swiss National Science Foundation.

    See the full article here.

    About Princeton: Overview

    Princeton University is a vibrant community of scholarship and learning that stands in the nation’s service and in the service of all nations. Chartered in 1746, Princeton is the fourth-oldest college in the United States. Princeton is an independent, coeducational, nondenominational institution that provides undergraduate and graduate instruction in the humanities, social sciences, natural sciences and engineering.

    As a world-renowned research university, Princeton seeks to achieve the highest levels of distinction in the discovery and transmission of knowledge and understanding. At the same time, Princeton is distinctive among research universities in its commitment to undergraduate teaching.

    Today, more than 1,100 faculty members instruct approximately 5,200 undergraduate students and 2,600 graduate students. The University’s generous financial aid program ensures that talented students from all economic backgrounds can afford a Princeton education.

    Princeton Shield
    ScienceSprings relies on technology from

    MAINGEAR computers

    Lenovo

    Dell

     
  • richardmitnick 8:30 pm on September 9, 2014 Permalink | Reply
    Tags: Quantum Computing   

    From Kavli: “Tiny Graphene Drum Could Form Future Quantum Memory” 

    The Kavli Foundation

    09/09/2014
    No Writer Credit

    Scientists from TU Delft’s Kavli Institute of Nanoscience have demonstrated that they can detect extremely small changes in position and forces on very small drums of graphene. Graphene drums have great potential to be used as sensors in devices such as mobile phones. Using their unique mechanical properties, these drums could also act as memory chips in a quantum computer. The researchers present their findings in an article in the August 24th edition of Nature Nanotechnology. The research was funded by the FOM Foundation, the EU Marie-Curie program, and NWO.

    Graphene drums

    Graphene Drum

    Graphene is famous for its special electrical properties, but research on the one-atom-thick form of graphite was recently expanded to explore graphene as a mechanical object. Thanks to their extremely low mass, tiny sheets of graphene can be used in the same way as a musician’s drumhead. In the experiment, scientists use microwave-frequency light to ‘play’ the graphene drums, to listen to their ‘nano sound’, and to explore the way the graphene in these drums moves.

    Optomechanics

    Dr. Vibhor Singh and his colleagues did this by using a 2D crystal membrane as a mirror in an ‘optomechanical cavity’. “In optomechanics you use the interference pattern of light to detect tiny changes in the position of an object. In this experiment, we shot microwave photons at a tiny graphene drum. The drum acts as a mirror: by looking at the interference of the microwave photons bouncing off of the drum, we are able to sense minute changes in the position of the graphene sheet of only 17 femtometers, nearly 1/10,000th of the diameter of an atom,” Singh explains.
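To get a feel for how small that signal is, one can estimate the phase shift a 17-femtometer mirror displacement imprints on a reflected microwave. This is a hedged back-of-envelope sketch: the cavity frequency below is an assumed, typical microwave value, not a figure from the paper, and a real cavity enhances the raw phase shift by its finesse.

```python
import math

# A mirror displacement dx shifts the phase of a reflected signal by
# dphi = 4*pi*dx/lambda (round trip).  f is an assumed microwave
# frequency; dx is the 17 fm resolution quoted in the article.
f = 6e9                     # assumed cavity frequency, Hz
c = 2.998e8                 # speed of light, m/s
lam = c / f                 # wavelength, ~5 cm at microwave frequencies
dx = 17e-15                 # 17 femtometers, from the article
dphi = 4 * math.pi * dx / lam
print(f"single-reflection phase shift: {dphi:.2e} rad")
```

The answer is a few times 10^-12 radians per reflection, which is why the light must bounce many times inside a high-quality cavity before the displacement becomes measurable.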

    Amplifier

    The microwave ‘light’ in the experiment is not only good for detecting the position of the drum, but can also push on the drum with a force. This force from light is extremely small, but the small mass of the graphene sheet and the tiny displacements they can detect mean that the scientists can use these forces to ‘beat the drum’: they can shake the graphene drum with the momentum of light. Using this radiation pressure, they made an amplifier in which microwave signals, such as those in your mobile phone, are amplified by the mechanical motion of the drum.
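The radiation-pressure force itself follows from a one-line formula: a fully reflected beam of power P pushes with F = 2P/c. A hedged sketch with illustrative numbers (the power and drum mass below are order-of-magnitude guesses, not values from the experiment):

```python
# Radiation-pressure sketch.  P and m are assumed, illustrative values
# chosen only to show the scales involved; they are not from the paper.
c = 2.998e8        # speed of light, m/s
P = 1e-12          # assumed microwave power hitting the drum, W (1 pW)
m = 1e-17          # assumed graphene drum mass, kg (~10 femtograms)
F = 2 * P / c      # force from a perfectly reflected beam, N
acc = F / m        # resulting acceleration of the drum, m/s^2
print(F, acc)
```

Even a picowatt produces a force of only ~10^-20 newtons, yet because the drum's mass is so tiny the resulting acceleration is appreciable, which is what makes driving the drum with light feasible.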

    Memory

    The scientists also show you can use these drums as ‘memory chips’ for microwave photons, converting photons into mechanical vibrations and storing them for up to 10 milliseconds. Although that is not long by human standards, it is a long time for a computer chip. “One of the long-term goals of the project is to explore 2D crystal drums to study quantum motion. If you hit a classical drum with a stick, the drumhead will start oscillating, shaking up and down. With a quantum drum, however, you can not only make the drumhead move up and then down, but also make it into a ‘quantum superposition’, in which the drum head is both moving up and moving down at the same time,” says research group leader Dr. Gary Steele. “This ‘strange’ quantum motion is not only of scientific relevance, but also could have very practical applications in a quantum computer as a quantum ‘memory chip’”.

    In a quantum computer, the fact that quantum ‘bits’ can be in the states 0 and 1 at the same time allows it to potentially perform computations much faster than a classical computer like those used today. Quantum graphene drums that are ‘shaking up and down at the same time’ could be used to store quantum information in the same way as RAM chips in your computer, allowing you to store a quantum computation’s result and retrieve it at a later time by listening to its quantum sound.
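The 10-millisecond storage time sounds short, but relative to the drum's vibration period it is enormous. A hedged estimate, assuming a typical mechanical frequency for drums of this kind (the frequency below is an assumption, not a figure from the article):

```python
# How many oscillations fit inside the storage window?
# tau is the 10 ms storage time from the article; f_m is an assumed,
# typical mechanical frequency for nanoscale graphene drums.
tau = 10e-3               # storage time, s (from the article)
f_m = 10e6                # assumed mechanical frequency, 10 MHz
cycles = f_m * tau        # oscillations completed during storage
print(f"{cycles:.0f} mechanical cycles within the storage window")
```

On the order of a hundred thousand coherent oscillations is what makes such a drum plausible as a memory element: the stored vibration survives far longer than any single clock cycle.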

    See the full article, with video, here.

    The Kavli Foundation, based in Oxnard, California, is dedicated to the goals of advancing science for the benefit of humanity and promoting increased public understanding and support for scientists and their work.

    The Foundation’s mission is implemented through an international program of research institutes, professorships, and symposia in the fields of astrophysics, nanoscience, neuroscience, and theoretical physics as well as prizes in the fields of astrophysics, nanoscience, and neuroscience.


     
  • richardmitnick 6:41 pm on April 9, 2014 Permalink | Reply
    Tags: Quantum Computing   

    From M.I.T.: “New ‘switch’ could power quantum computing” 

    April 9, 2014
    Peter Dizikes | MIT News Office

    A light lattice that traps atoms may help scientists build networks of quantum information transmitters.

    Using a laser to place individual rubidium atoms near the surface of a lattice of light, scientists at MIT and Harvard University have developed a new method for connecting particles — one that could help in the development of powerful quantum computing systems.


    The new technique, described in a paper published today in the journal Nature, allows researchers to couple a lone atom of rubidium, a metal, with a single photon, or light particle. This allows both the atom and photon to switch the quantum state of the other particle, providing a mechanism through which quantum-level computing operations could take place.

    Moreover, the scientists believe their technique will allow them to increase the number of useful interactions occurring within a small space, thus scaling up the amount of quantum computing processing available.

    “This is a major advance of this system,” says Vladan Vuletić, a professor in MIT’s Department of Physics and Research Laboratory for Electronics (RLE), and a co-author of the paper. “We have demonstrated basically an atom can switch the phase of a photon. And the photon can switch the phase of an atom.”

    That is, photons can have two polarization states, and interaction with the atom can change the photon from one state to another; conversely, interaction with the photon can change the atom’s phase, which is equivalent to changing the quantum state of the atom from its “ground” state to its “excited” state. In this way the atom-photon coupling can serve as a quantum switch to transmit information — the equivalent of a transistor in a classical computing system. And by placing many atoms within the same field of light, the researchers may be able to build networks that can process quantum information more effectively.
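The atom-photon switch described above is, in its textbook idealization, a controlled-phase gate on the joint state of the two particles. A minimal sketch of that idealization (this is the standard two-qubit model, not the paper's detailed physical Hamiltonian):

```python
import numpy as np

# Controlled-Z gate on the joint (atom, photon) state, basis ordered
# |atom, photon> = |00>, |01>, |10>, |11>.  The phase flips only when
# both the atom is excited and the photon is present -- the switching
# behavior described in the text, in its simplest idealized form.
CZ = np.diag([1, 1, 1, -1]).astype(complex)

state = np.array([0, 0, 0, 1], dtype=complex)  # atom excited, photon present
out = CZ @ state
print(out)                                     # phase flipped on |11> only

# CZ is unitary and its own inverse:
assert np.allclose(CZ @ CZ.conj().T, np.eye(4))
```

Because the phase flip is conditional, either particle can be viewed as switching the other, which is exactly the transistor-like role the researchers describe.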

    “You can now imagine having several atoms placed there, to make several of these devices — which are only a few hundred nanometers thick, 1,000 times thinner than a human hair — and couple them together to make them exchange information,” Vuletić adds.

    Using a photonic cavity

    Quantum computing could enable the rapid performance of calculations by taking advantage of the distinctive quantum-level properties of particles. Some particles can be in a condition of superposition, appearing to exist in two places at the same time. Particles in superposition, known as qubits, could thus contain more information than particles at classical scales, and allow for faster computing.
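The superposition idea above has a compact numerical picture: a qubit is a two-component complex vector, and n qubits together span 2^n amplitudes. A minimal illustration (generic textbook formalism, not specific to this experiment):

```python
import numpy as np

# A single qubit in equal superposition of 0 and 1.
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)
plus = (ket0 + ket1) / np.sqrt(2)        # superposition state
probs = np.abs(plus) ** 2
print(probs)                             # -> [0.5 0.5]

# Three such qubits: the joint state holds 2**3 = 8 amplitudes at once,
# which is the source of a quantum computer's parallelism.
three = np.kron(np.kron(plus, plus), plus)
print(three.size)                        # -> 8
```

The exponential growth of the state vector with qubit count is why researchers care about scaling up the number of controllable atom-photon couplings.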

    However, researchers are in the early stages of determining which materials best allow for quantum-scale computing. The MIT and Harvard researchers have been examining photons as a candidate medium, since photons rarely interact with other particles. For this reason, an optical quantum computing system, using photons, could be harder to knock out of its delicate alignment. But since photons rarely interact with other bits of matter, they are also difficult to manipulate in the first place.

    In this case, the researchers used a laser to place a rubidium atom very close to the surface of a photonic crystal cavity, a structure of light. The atoms were placed no more than 100 or 200 nanometers — less than a wavelength of light — from the edge of the cavity. At such small distances, there is a strong attractive force between the atom and the surface of the light field, which the researchers used to trap the atom in place.

    Other methods of producing a similar outcome have been considered before — such as, in effect, dropping atoms into the light and then finding and trapping them. But the researchers found that they had greater control over the particles this way.

    “In some sense, it was a big surprise how simple this solution was compared to the different techniques you might envision of getting the atoms there,” Vuletić says.

    The result is what he calls a “hybrid quantum system,” where individual atoms are coupled to microscopic fabricated devices, and in which atoms and photons can be controlled in productive ways. The researchers also found that the new device serves as a kind of router separating photons from each other.

    “The idea is to combine different things that have different strengths and weaknesses in such a way to generate something new,” Vuletić says, adding: “This is an advance in technology. Of course, whether this will be the technology remains to be seen.”

    ‘Still amazing’ to hold onto one atom

    The paper, “Nanophotonic quantum phase switch with a single atom,” is co-authored by Vuletić; Tobias Tiecke, a postdoc affiliated with both RLE and Harvard; Harvard professor of physics Mikhail Lukin; Harvard postdoc Nathalie de Leon; and Harvard graduate students Jeff Thompson and Bo Liu.

    The collaboration between the MIT and Harvard researchers is one of two advances in the field described in the current issue of Nature. Researchers at the Max Planck Institute of Quantum Optics in Germany have concurrently developed a new method of producing atom-photon interactions using mirrors, forming quantum gates, which change the direction of motion or polarization of photons.

    “The Harvard/MIT experiment is a masterpiece of quantum nonlinear optics, demonstrating impressively the preponderance of single atoms over many atoms for the control of quantum light fields,” says Gerhard Rempe, a professor at the Max Planck Institute of Quantum Optics who helped lead the German team’s new research, and who has read the paper by the U.S.-based team. “The coherent manipulation of an atom coupled to a photonic crystal resonator constitutes a breakthrough and complements our own work … with an atom in a dielectric mirror resonator.”

    Rempe adds that he thinks both techniques will be regarded as notable “achievements on our way toward a robust quantum technology with stationary atoms and flying photons.”

    If the research techniques seem a bit futuristic, Vuletić says that even as an experienced researcher in the field, he remains slightly awed by the tools at his disposal.

    “For me what is still amazing, after working in this for 20 years,” Vuletić reflects, “is that we can hold onto a single atom, we can see it, we can move it around, we can prepare quantum superpositions of atoms, we can detect them one by one.”

    Funding for the research was provided in part by the National Science Foundation, the MIT-Harvard Center for Ultracold Atoms, the Natural Sciences and Engineering Research Council of Canada, the Air Force Office of Scientific Research, and the Packard Foundation.

    See the full article here.



     