Tagged: Google 54-qubit Sycamore superconducting processor quantum computer

  • richardmitnick 1:59 pm on September 16, 2020
    Tags: "IBM promises 1000-qubit quantum computer—a milestone—by 2023", Google 54-qubit Sycamore superconducting processor quantum computer, IBM is already preparing a jumbo liquid-helium refrigerator or cryostat to hold a quantum computer with 1 million qubits.

    From Science: “IBM promises 1000-qubit quantum computer—a milestone—by 2023” 

    From Science

    Sep. 15, 2020
    Adrian Cho

    IBM researchers have already installed the mounting hardware for a jumbo cryostat big enough to hold a quantum computer with 1 million qubits.
    Credit: Connie Zhou/IBM.

    For 20 years scientists and engineers have been saying that “someday” they’ll build a full-fledged quantum computer able to perform useful calculations that would overwhelm any conventional supercomputer. But current machines contain just a few dozen quantum bits, or qubits, too few to do anything dazzling. Today, IBM made its aspirations more concrete by publicly announcing a “road map” for the development of its quantum computers, including the ambitious goal of building one containing 1000 qubits by 2023. IBM’s current largest quantum computer, revealed this month, contains 65 qubits.

    “We’re very excited,” says Prineha Narang, co-founder and chief technology officer of Aliro Quantum, a startup that specializes in code that helps higher level software efficiently run on different quantum computers. “We didn’t know the specific milestones and numbers that they’ve announced,” she says. The plan includes building intermediate-size machines of 127 and 433 qubits in 2021 and 2022, respectively, and envisions following up with a million-qubit machine at some unspecified date. Dario Gil, IBM’s director of research, says he is confident his team can keep to the schedule. “A road map is more than a plan and a PowerPoint presentation,” he says. “It’s execution.”

    IBM is not the only company with a road map to build a full-fledged quantum computer—a machine that would take advantage of the strange rules of quantum mechanics to breeze through certain computations that just overwhelm conventional computers. At least in terms of public relations, IBM has been playing catch-up to Google, which 1 year ago grabbed headlines when the company announced its researchers had used their 53-qubit quantum computer to solve a particular abstract problem that they claimed would overwhelm any conventional computer—reaching a milestone known as quantum supremacy.

    Judging by the cover of Nature, 24 October 2019 marked a turning point in the decades-long effort to harness the strange laws of quantum mechanics in the service of computing.

    Google 54-qubit Sycamore superconducting processor quantum computer.

    Google has its own plan to build a million-qubit quantum computer within 10 years, as Hartmut Neven, who leads Google’s quantum computing effort, explained in an April interview, although he declined to reveal a specific timeline for advances.

    IBM’s declared timeline comes with an obvious risk that everyone will know if it misses its milestones. But the company decided to reveal its plans so that its clients and collaborators would know what to expect. Dozens of quantum-computing startup companies use IBM’s current machines to develop their own software products, and knowing IBM’s milestones should help developers better tailor their efforts to the hardware, Gil says.

    One company joining those efforts is Q-CTRL, which develops software to optimize the control and performance of the individual qubits. The IBM announcement shows venture capitalists the company is serious about developing the challenging technology, says Michael Biercuk, founder and CEO of Q-CTRL. “It’s relevant to convincing investors that this large hardware manufacturer is pushing hard on this and investing significant resources,” he says.

    A 1000-qubit machine is a particularly important milestone in the development of a full-fledged quantum computer, researchers say. Such a machine would still be 1000 times too small to fulfill quantum computing’s full potential—such as breaking current internet encryption schemes—but it would be big enough to spot and correct the myriad errors that ordinarily plague the finicky quantum bits.

    A bit in an ordinary computer is an electrical switch that can be set to either zero or one. In contrast, a qubit is a quantum device—in IBM’s and Google’s machines, each is a tiny circuit of superconducting metal chilled to nearly absolute zero—that can be set to zero, one, or, thanks to the strange rules of quantum mechanics, zero and one at the same time. But the slightest interaction with the environment tends to distort those delicate two-ways-at-once states, so researchers have developed error-correction protocols to spread information ordinarily encoded in a single physical qubit to many of them in a way that the state of that “logical qubit” can be maintained indefinitely.
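    The two-ways-at-once state described above has a compact numerical picture: a qubit is just a pair of complex amplitudes whose squared magnitudes give the measurement probabilities. A toy sketch in plain Python (the Hadamard gate used here is standard, but the snippet is an illustration, not a model of IBM's or Google's hardware):

    ```python
    import math
    import random

    # A qubit state a|0> + b|1> is just two complex amplitudes.
    a, b = 1.0, 0.0                                   # start in |0>

    # A Hadamard gate rotates |0> into an equal superposition of 0 and 1.
    a, b = (a + b) / math.sqrt(2), (a - b) / math.sqrt(2)

    # Measurement probabilities are the squared amplitude magnitudes.
    p0, p1 = abs(a) ** 2, abs(b) ** 2
    print(p0, p1)                                     # 0.5 each: "zero and one at once"

    # Reading the qubit out collapses it to a definite 0 or 1.
    outcome = 0 if random.random() < p0 else 1
    ```

    Reading the qubit out destroys the superposition, which is why the delicate state must be shielded from the environment until the computation finishes.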

    With their planned 1121-qubit machine, IBM researchers would be able to maintain a handful of logical qubits and make them interact, says Jay Gambetta, a physicist who leads IBM’s quantum computing efforts. That’s exactly what will be required to start to make a full-fledged quantum computer with thousands of logical qubits. Such a machine would mark an “inflection point” in which researchers’ focus would switch from beating down the error rate in the individual qubits to optimizing the architecture and performance of the entire system, Gambetta says.

    IBM is already preparing a jumbo liquid-helium refrigerator, or cryostat, to hold a quantum computer with 1 million qubits. The IBM road map doesn’t specify when such a machine could be built. But if company researchers really can build a 1000-qubit computer in the next 2 years, that ultimate goal will sound far less fantastical than it does now.

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

  • richardmitnick 10:51 am on September 16, 2020
    Tags: "Supreme or Unproven?", A look at optics, Classical shortcuts, Dots, Erring on the side of caution, Google 54-qubit Sycamore superconducting processor quantum computer, ions and photons, Long-sought milestone?, Quantum’s power

    From Optics & Photonics: “Supreme or Unproven?” 

    From Optics & Photonics

    01 March 2020 [Missed this very important article. Making amends here.]
    Edwin Cartlidge

    Despite much recent fanfare, quantum computers still need to show that they can do something useful.

    Google 54-qubit Sycamore superconducting processor quantum computer.

    Judging by the cover of Nature that day, 24 October 2019 marked a turning point in the decades-long effort to harness the strange laws of quantum mechanics in the service of computing.


    The words “quantum supremacy,” emblazoned in large capital letters on the front of the prestigious journal, announced to the world that a quantum computer had, for the first time, performed a computation impossible to carry out on a classical supercomputer in any reasonable amount of time—despite having vastly less in the way of processors, memory and software to draw on.

    The quantum computer in question, Sycamore, comprised a mere 53 superconducting quantum bits, or qubits. It was built by a group of scientists at Google led by physicist John Martinis, who used it to execute an algorithm that generated a semi-random series of numbers. Those researchers then worked out how long they would have needed to simulate that operation on the IBM-built Summit supercomputer at Oak Ridge National Laboratory in Tennessee, USA, the processors of which include tens of trillions of transistors and which has 250,000 terabytes of storage.

    The ORNL IBM AC922 Summit supercomputer was No. 1 on the TOP500. Credit: Carlos Jones, Oak Ridge National Laboratory/U.S. Dept. of Energy.

    The IBM-built Summit supercomputer at the Oak Ridge National Laboratory, USA, contains tens of trillions of transistors and can carry out about 200,000 trillion operations a second. Credit: ORNL.

    Amazingly, Martinis and colleagues concluded that what Sycamore could do in a little over three minutes, Summit would take 10,000 years to simulate.

    Google CEO Sundar Pichai next to the company’s quantum computer. Credit: Google.

    Long-sought milestone?

    For many scientists, Sycamore’s result represents a major milestone on the road to a real-world, general-purpose quantum computer. Having invested millions of dollars in the field over the course of more than 30 years, governments and, increasingly, industry have bet that the exponential speed-up in processing power offered by quantum states in theory can be realized practically.


    Google’s Sycamore processor. Credit: Erik Lucero, Google.
    Sycamore—A quantum chip bearing fruit

    Google’s Sycamore processor consists of a 1-cm^2 piece of aluminum containing a 2D array of 53 qubits—each acting as a tiny superconducting resonator that encodes the values 0 and 1 in its two lowest energy levels, and coupled to its four nearest neighbors. Cooled to below 20 mK to minimize thermal interference, the qubits are subject to “gate” operations—having their coupling turned on and off, as well as absorbing microwaves and experiencing variations in magnetic flux.

    The Google team executed a series of cycles, each involving a random selection of one-qubit gates and a specific two-qubit gate. After completing the last cycle, they then read out the value of each qubit to yield a 53-bit-long string of 0s and 1s. That sequence appears random, but quantum entanglement and interference dictate that some of the 2^53 permutations are much more likely to occur than others. Repeating the process a million times builds up a statistically significant number of bit strings that can be compared with the theoretical distribution calculated using a classical computer.
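    That sampling loop can be mimicked at toy scale with a brute-force statevector simulation. The sketch below uses 5 qubits rather than 53, and an arbitrary gate set (random rotations followed by Hadamards, plus controlled-Z gates), so the circuit is illustrative rather than Google's actual one:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 5                                  # toy register; Sycamore used 53
    psi = np.zeros(2 ** n, dtype=complex)
    psi[0] = 1.0                           # all qubits start in |0>

    def apply_1q(state, gate, q):
        """Apply a 2x2 gate to qubit q of an n-qubit statevector."""
        state = state.reshape([2] * n)
        state = np.moveaxis(
            np.tensordot(gate, np.moveaxis(state, q, 0), axes=1), 0, q)
        return state.reshape(-1)

    def apply_cz(state, q1, q2):
        """Controlled-Z: negate amplitudes where both qubits are 1."""
        idx = np.arange(2 ** n)
        both = ((idx >> (n - 1 - q1)) & 1) & ((idx >> (n - 1 - q2)) & 1)
        out = state.copy()
        out[both == 1] *= -1
        return out

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    for _ in range(8):                     # "cycles" of 1-qubit gates plus CZs
        for q in range(n):
            t = rng.uniform(0, 2 * np.pi)  # a random rotation, then Hadamard
            R = np.array([[np.cos(t), -np.sin(t)], [np.sin(t), np.cos(t)]])
            psi = apply_1q(psi, H @ R, q)
        for q in range(0, n - 1, 2):
            psi = apply_cz(psi, q, q + 1)

    # Interference makes some bit strings much likelier than others; sampling
    # repeatedly builds up the distribution, as in the Google experiment.
    probs = np.abs(psi) ** 2
    probs /= probs.sum()
    samples = rng.choice(2 ** n, size=10_000, p=probs)
    ```

    At 5 qubits this is trivial for a laptop; the point of the supremacy claim is that the same bookkeeping at 53 qubits means tracking 2^53 amplitudes.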

    Measuring Sycamore’s “fidelity” to the theoretical distribution over 14 cycles, Martinis and coworkers found that the figure, 0.8%, agreed with calculations based on the fidelities of individual gates—and used that fact to estimate that after 20 cycles, the fidelity would have been about 0.1% (as the fidelity is gradually eroded by gate errors). At this level of complexity and fidelity, the team calculated, the classical Summit supercomputer would require a whopping 10,000 years to simulate the quantum wave function—whereas Sycamore needed a mere 200 seconds to take its 1 million samples.
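    The arithmetic behind that extrapolation is simple: if fidelity decays multiplicatively with each cycle, the per-cycle figure implied by 0.8% at 14 cycles predicts roughly 0.1% at 20 cycles. A quick check (the per-cycle number below is inferred from the article's quoted figures, not taken from the Google paper):

    ```python
    # Per-cycle fidelity implied by 0.8% total fidelity after 14 cycles.
    per_cycle = 0.008 ** (1 / 14)

    f14 = per_cycle ** 14            # recovers 0.008, i.e. 0.8%
    f20 = per_cycle ** 20            # about 0.001, i.e. the quoted 0.1%
    print(f14, f20)
    ```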

    Winning that bet, however, depends on being able to protect a quantum computer’s delicate superposition states from even the smallest amounts of noise, such as tiny temperature fluctuations or minuscule electric fields. The Google result shows that noise can be controlled sufficiently to enable the execution of a classically difficult algorithm, according to Greg Kuperberg, a mathematician at the University of California, Davis, USA. “This advance is a major blow against arguments that quantum computers are impossible,” he says. “It is a tremendous confidence builder for the future.”

    Not everyone, however, is convinced by the research. A number of experts, including several at IBM, believe that the Google group has seriously underestimated the capacity of traditional digital computers to simulate the kind of algorithms that could be run on Sycamore. More fundamentally, it still remains to be seen whether scientists can develop a quantum algorithm that is resilient to noise and that does something people are willing to pay for—given how little practical utility the current algorithm is likely to have.

    “For me, the biggest value in the Google research is the technical achievement,” says Lieven Vandersypen, who works on rival quantum-dot qubits at the Delft University of Technology in the Netherlands. He points out that the previous best superconducting computer featured just 20 quite poorly controlled qubits. “But what we in the field are after is a computer that can solve useful problems, and we are still far from that.”

    Quantum’s power

    Quantum computers offer the possibility of carrying out certain tasks far more quickly than is possible with classical devices, owing to a number of bizarre properties of the quantum world. Whereas a classical computer processes data sequentially, a quantum computer should operate as a massively parallel processor. It does so thanks to the fact that each qubit—encoded in quantum particles such as atoms, electrons or photons—can exist in a superposition of the “0” and “1” states, rather than simply one or the other, and because the qubits are linked together through entanglement.

    For N qubits, each of the 2^N possible states that can be represented has an associated amplitude. The idea is to carry out a series of operations on the qubits, specified by a quantum algorithm, such that the system’s wave function evolves in a predetermined way, causing the amplitudes to change at each step. When the computer’s output is then obtained by measuring the value of each qubit, the wave function collapses to yield the result.

    The Google experiment, carried out in company labs in Santa Barbara, CA, USA, was designed to execute an algorithm whose answer could only be found classically by simulating the system’s wave function. So while running the algorithm on a quantum computer would only take as long as is needed to execute its limited number of steps, simulating that algorithm classically would involve tracking the 2^N probability amplitudes. Even with just 53 qubits that is an enormous number—9×10^15, or 9,000 trillion.
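    That count is easy to verify, and it also shows the memory a full classical simulation must touch (the 16 bytes per double-precision complex amplitude below is an assumption of this sketch, not a figure from the article):

    ```python
    # Tracking the wave function of n qubits means storing 2**n amplitudes.
    n = 53
    amplitudes = 2 ** n
    print(amplitudes)                    # 9007199254740992, about 9 x 10^15

    # At 16 bytes per double-precision complex amplitude, that is
    # roughly 144 petabytes of state to store or stream.
    bytes_needed = amplitudes * 16
    print(bytes_needed / 1e15)
    ```

    Summit's 250,000 terabytes of storage is 250 petabytes, which is why how aggressively a simulation exploits the supercomputer's disks matters so much to the classical-simulation debate.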

    Sycamore is not the first processor to have harnessed quantum interference to perform a calculation considered very difficult, if not impossible, to do using a classical computer. In 2017, two groups in the U.S. each used about 50 interacting, individually controllable qubits to simulate collections of quantum spins. Christopher Monroe and colleagues at the University of Maryland, College Park, manipulated electrically trapped ions using laser pulses, while OSA Fellow Mikhail Lukin of Harvard University and coworkers used a laser to excite neutral atoms. Both groups used their devices to determine the critical point at which a magnetic-phase transition occurs.

    However, these systems were designed to carry out very specific tasks, somewhat akin to early classical analog computers. Google’s processor, in contrast, is a programmable digital machine. By employing a handful of different logic gates—specific operations applied either to one or two qubits—it in principle can execute many types of quantum algorithms.

    Martinis and colleagues showed that they could use these gates to reliably generate a sample of numbers from the semi-random algorithm. Crucially, they found that they could prevent errors in the gates from building up and generating garbage at the output—leading them to declare that they had achieved quantum supremacy.

    “We are thrilled,” says Martinis, who is also a professor at the University of California, Santa Barbara. “We have been trying to do this for quite a few years and have been talking about it, but of course there is a bit of pressure on you to make good on your claims.”

    Classical shortcuts

    When the Google team published its results—a preliminary version of which had been accidentally posted online at NASA a month earlier—rivals lost little time in criticizing them. In particular, researchers at IBM, which itself works on superconducting qubits, posted a paper on the arXiv server arguing that Summit could in fact simulate Sycamore’s operations in just 2.5 days (and at higher fidelity). Google’s oversight, they said, was to not have considered how much more efficiently the supercomputer could track the system’s wave function if it fully exploited all of its hard disk space.

    Kuperberg argues that Sycamore’s performance still merits the label “supremacy” given the disparity in resources available to the two computers. (In fact, the IBM researchers didn’t actually carry out the simulation, possibly because it would have been too expensive.) Kuperberg adds that with just a dozen or so more qubits, the simulation time would climb from days to centuries. “If this is what passes as refutation, then this is still a quantum David versus a classical Goliath,” he says. “This is supremacy enough as far as I am concerned.”

    Indeed, in their paper Martinis and colleagues write that while they expect classical simulation techniques to improve, they also expect that “they will be consistently outpaced by hardware improvements on larger quantum processors.” Others, however, suggest that quantum computers might struggle to deliver any meaningful speed-up over classical devices. In particular, argue critics, it remains to be seen just how “quantum mechanical” future quantum computers will be—and therefore how easy it might be to imitate them.

    To make classical simulation more competitive, the IBM researchers, as well as counterparts at the Chinese tech company Alibaba, are looking to make better use of supercomputer hardware. But Graeme Smith, a theoretical physicist at the University of Colorado and the JILA research institute in Boulder, USA, thinks that more radical improvement might be possible. He argues that the noise in Google’s gates, low as it is, could still swamp much of the system’s quantum information after multiple cycles. As such, he reckons it may be possible to develop a classical algorithm that sidesteps the need to calculate the 53-qubit wave function. “There is nothing to suggest that you have to do that to sample from [Google’s] circuit,” he says.

    Indeed, Itay Hen, a numerical physicist at the University of Southern California in Los Angeles, USA, is trying to devise a classical algorithm that directly samples from the distribution output by Google’s circuit. Although too early to know whether the scheme will work, he says it would involve calculating easy bits of the wave function and interfering them to generate a succession of individual data strings very quickly. “I am guessing that lots of other people are doing a similar thing,” he adds.

    As Hen explains, Martinis and colleagues had to make a compromise when designing their quantum-supremacy experiment—making the circuit complex enough to be classically hard, but not so complex that its output ended up being pure noise. And he says that the same compromise faces all developers of what is hoped will become the first generation of useful quantum computers—a technology known as “noisy intermediate-scale quantum,” or NISQ.

    Such devices might consist of several hundred qubits, perhaps allowing them to simulate molecules and other small quantum systems. This is how Richard Feynman, back in the early 1980s, originally envisaged quantum computers being used—conceivably allowing scientists to design new materials or develop new drugs. But as their name suggests, these devices, too, would be limited by noise. The question, says Hen, is whether they can be built with enough qubits and processor cycles to do something that a classical computer can’t.

    Dots, ions and photons

    To try to meet the challenge, physicists are working on a number of competing technologies—superconducting circuits, qubits encoded in nuclear or electronic spins, trapped atoms or ions—each of which has its strengths and weaknesses (see OPN, October 2016, Quantum Computing: How Close Are We?). Vandersypen, for instance, is hopeful that spin qubits made from quantum dots—essentially artificial atoms—can be scaled up. He points out that such qubits have been fabricated in an industrial clean room at the U.S. chip giant Intel, which has teamed up with him and his colleagues at the Delft University of Technology to develop the technology. “We have done measurements [on the qubits],” he adds, “but not yet gotten to the point of qubit manipulation.”

    Collaborating scientists from Intel and QuTech at the Delft University of Technology with Intel’s 17-qubit superconducting test chip. [Courtesy of Intel Corp.]

    Trapped-ion qubits, meanwhile, are relatively slow, but have higher fidelities and can operate more cycles than their superconducting rivals. Monroe is confident that by linking up multiple ion traps, perhaps optically, it should be possible to make NISQ devices with hundreds of qubits. Indeed, he cofounded the company IonQ with OSA Fellow Jungsang Kim from Duke University, USA, to commercialize the technology.

    A completely different approach is to encode quantum information in light rather than matter. Photonic qubits are naturally resistant to certain types of noise, but being harder to manipulate they may ultimately be better suited to communication and sensing than to computing (see “A look at optics”).

    Xanadu’s quantum chip. Credit: Xanadu Quantum Technologies Inc.

    A look at optics

    As qubits, photons have several virtues. Because they usually don’t interact with one another they are immune to stray electromagnetic fields, while their high energies at visible wavelengths make them robust against thermal fluctuations—removing the need for refrigeration. But their isolation makes them tricky to manipulate and process.

    Two startups are working to get around this problem—and raising tens of millions of dollars in the process. PsiQuantum in Palo Alto, CA, USA, aims to make a chip with around 1 million qubits. Because photons are bosons and tend to stick together, their paths combine after entering 50-50 beam splitters from opposite sides, effectively interacting. Xanadu in Toronto, Canada, instead relies on the uncertainty principle, generating beams of “squeezed light” that have lower uncertainty in one quantum property at the expense of greater uncertainty in another. In theory, interfering these beams and counting photons at the output might enable quantum computation.
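    The way photon paths "combine" at a beam splitter can be seen in a two-line calculation. For indistinguishable bosons, the amplitude for one photon entering each port of a 50-50 beam splitter to exit with one photon in each output is the permanent of the splitter's unitary, and it cancels to zero: the photons always leave together (the Hong-Ou-Mandel effect). The real, Hadamard-like beam splitter convention below is one common choice, assumed here for illustration:

    ```python
    import itertools
    import math

    # A 50-50 beam splitter in a common real (Hadamard-like) convention.
    U = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
         [1 / math.sqrt(2), -1 / math.sqrt(2)]]

    def permanent(m):
        # Like a determinant but with all plus signs: the bosonic rule.
        size = len(m)
        return sum(math.prod(m[i][p[i]] for i in range(size))
                   for p in itertools.permutations(range(size)))

    # Amplitude for one photon in each input to leave one in each output:
    # the two paths cancel, so the photons always exit the same port.
    amp_coincidence = permanent(U)
    print(amp_coincidence)
    ```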

    Both Xanadu and PsiQuantum have major, if different, technical hurdles to overcome before their computers become reality, according to OSA Fellow Michael Raymer, an optical physicist at the University of Oregon, USA, and a driving force behind the U.S. National Quantum Initiative.

    Raymer adds that photons might also interact not directly, but via matter intermediaries, potentially enabling quantum-logic operations between single photons. Or they might be used to link superconducting processors to slower but longer-lived trapped-ion qubits (acting as memory). Alternatively, photon–matter interactions could be exploited in the quantum repeaters needed to ensure entanglement between distant particles—potentially a boost for both communication and sensing.

    “Whether or not optics will be used to create free-standing quantum computers,” says Raymer, “I will defer prediction on that.”

    Yet turning NISQ computers into practical devices will need more than just improvements in hardware, according to William Oliver, an electrical engineer and physicist at the Massachusetts Institute of Technology, USA. Also essential, he says, will be developing new algorithms that can exploit these devices for commercial ends—be those ends optimizing investment portfolios or simulating new materials. “The most important thing,” Oliver says, “is to find commercial applications that gain advantage from the qubits we have today.”

    According to Hen, though, it remains to be seen whether any suitable algorithms can be found. For simulation of chemical systems, he says, it is not clear if even hundreds of qubits would be enough to reproduce the interactions of just 40 electrons—the current classical limit—given the inaccuracies introduced by noise. Indeed, Smith is pessimistic about NISQ computers being able to do anything useful. “There is a lot of hope,” he says, “but not a lot of good science to substantiate that hope.”

    Erring on the side of caution

    The only realistic aim, Hen argues—and one that all experts see as the ultimate goal of quantum computing—is to build large, fault-tolerant machines. These would rely on error correction, which involves spreading the value of a single “logical qubit” over multiple physical qubits to make computations robust against errors on any specific bit (since quantum information cannot simply be copied). But implementing error correction will require that the error rate on individual qubits and logic gates is low enough that adding the error-correcting qubits doesn’t introduce more noise into the system than it removes.
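    The threshold idea can be illustrated with a classical repetition code. Real quantum codes are subtler, since quantum states cannot simply be copied, but the arithmetic of redundancy beating noise is the same in spirit (the 10% per-bit error rate and five-fold redundancy below are illustrative choices, not parameters from any real device):

    ```python
    import random

    random.seed(1)

    def encode(bit, n=5):
        # Spread one logical bit over n physical bits (the classical
        # analogue of spreading a logical qubit over many physical qubits).
        return [bit] * n

    def add_noise(bits, p):
        # Flip each physical bit independently with probability p.
        return [b ^ (random.random() < p) for b in bits]

    def decode(bits):
        # Majority vote recovers the logical bit unless most bits flipped.
        return int(sum(bits) > len(bits) / 2)

    trials = 100_000
    errors = sum(decode(add_noise(encode(0), p=0.1)) for _ in range(trials))
    rate = errors / trials
    print(rate)   # roughly 0.009: well below the 0.1 per-bit error rate
    ```

    The protection only helps because the per-bit error rate starts below the code's break-even point; push p toward 0.5 and the redundancy adds more noise than it removes, which is exactly the threshold condition the article describes.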

    Vandersypen reckons that this milestone could be achieved in as little as a year or two. The real challenge, he argues, will be scaling up—given how many qubits are likely to be needed for full-scale fault-tolerant computers. Particularly challenging will be making a machine that can find the prime factors of huge numbers, an application put forward by mathematician Peter Shor in 1994 that could famously threaten internet encryption. Martinis himself estimates that a device capable of finding the prime factors of a 2000-bit number in a day would need about 20 million physical qubits, given a two-qubit error probability of about 0.1%.

    Despite the huge challenges that lie ahead, Martinis is optimistic about future progress. He says that he and his colleagues at Google are aiming to get two-qubit error rates down to 0.1% by increasing the coherence time of their qubits—doubling their current value of 10–20 microseconds within six months, and then quadrupling it in two years. They then hope to build a computer with 1,000 logical qubits within 10 years—a device that he says wouldn’t be big enough to threaten internet security but could solve problems in quantum chemistry. “We are putting together a plan and a timeline and we are going to try to stick to that,” he says.

    However, Oliver is skeptical that such an ambitious timeframe can be met, estimating that a full-scale fault-tolerant computer is likely to take “a couple of decades” to build. Indeed, he urges his fellow scientists not to overstate quantum computers’ near-term potential. Otherwise, he fears, the field could enter a “quantum winter” in which enthusiasm gives way to pessimism and the withdrawal of funding. “A better approach,” according to Oliver, “is to be realistic about the promise and the challenges of quantum computing so that progress remains steady.”

    See the full article here.



    Optics and Photonics News (OPN) is The Optical Society’s monthly news magazine. It provides in-depth coverage of recent developments in the field of optics and offers busy professionals the tools they need to succeed in the optics industry, as well as informative pieces on a variety of topics such as science and society, education, technology and business. OPN strives to make the various facets of this diverse field accessible to researchers, engineers, businesspeople and students. Contributors include scientists and journalists who specialize in the field of optics. We welcome your submissions.
