Tagged: Qubits

  • richardmitnick 1:10 pm on March 10, 2019 Permalink | Reply
    Tags: CERN-Future Circular Collider, HL-LHC (High-Luminosity LHC), Qubits

    From WIRED: “Inside the High-Stakes Race to Make Quantum Computers Work” 

    Wired logo

    From WIRED

    03.08.19
    Katia Moskvitch

    View Pictures/Getty Images

    Deep beneath the Franco-Swiss border, the Large Hadron Collider is sleeping.

    LHC

    CERN map


    CERN LHC Tunnel

    CERN LHC particles

    But it won’t be quiet for long. Over the coming years, the world’s largest particle accelerator will be supercharged, increasing the number of proton collisions per second by a factor of two and a half.

    Once the work is complete in 2026, researchers hope to unlock some of the most fundamental questions in the universe. But with the increased power will come a deluge of data the likes of which high-energy physics has never seen before. And, right now, humanity has no way of knowing what the collider might find.

    To understand the scale of the problem, consider this: When it shut down in December 2018, the LHC generated about 300 gigabytes of data every second, adding up to 25 petabytes (PB) annually. For comparison, you’d have to spend 50,000 years listening to music to go through 25 PB of MP3 songs, while the human brain can store memories equivalent to just 2.5 PB of binary data. To make sense of all that information, the LHC data was pumped out to 170 computing centers in 42 countries [http://greybook.cern.ch/]. It was this global collaboration that helped discover the elusive Higgs boson, part of the Higgs field believed to give mass to elementary particles of matter.
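    A back-of-the-envelope check of that MP3 comparison, as a minimal Python sketch (assuming roughly 128 kbps audio, i.e. about 1 MB per minute of music):

```python
# Rough sanity check of the "tens of thousands of years of MP3s" comparison.
# Assumption: ~128 kbps MP3 audio, i.e. roughly 1 MB per minute of music.
PETABYTE = 1e15                              # bytes
MP3_BYTES_PER_MINUTE = 128_000 / 8 * 60      # ~0.96 MB per minute

data_bytes = 25 * PETABYTE                   # one year of stored LHC data
minutes = data_bytes / MP3_BYTES_PER_MINUTE
years = minutes / (60 * 24 * 365)

print(f"~{years:,.0f} years of continuous listening")   # on the order of 50,000 years
```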

    CERN CMS Higgs Event


    CERN ATLAS Higgs Event

    To process the looming data torrent, scientists at the European Organization for Nuclear Research, or CERN, will need 50 to 100 times more computing power than they have at their disposal today. A proposed Future Circular Collider, four times the size of the LHC and 10 times as powerful, would create an impossibly large quantity of data, at least twice as much as the LHC.

    CERN FCC Future Circular Collider map

    In a bid to make sense of the impending data deluge, some at CERN are turning to the emerging field of quantum computing. Powered by the very laws of nature the LHC is probing, such a machine could potentially crunch the expected volume of data in no time at all. What’s more, it would speak the same language as the LHC. While numerous labs around the world are trying to harness the power of quantum computing, it is the prospect of putting it to work at CERN that makes this research particularly exciting. There’s just one problem: Right now, there are only prototypes; nobody knows whether it’s actually possible to build a reliable quantum device.

    Traditional computers—be it an Apple Watch or the most powerful supercomputer—rely on tiny silicon transistors that work like on-off switches to encode bits of data.

    ORNL IBM AC922 SUMMIT supercomputer, No.1 on the TOP500. Credit: Carlos Jones, Oak Ridge National Laboratory/U.S. Dept. of Energy

    Each circuit can have one of two values—either one (on) or zero (off) in binary code; the computer turns the voltage in a circuit on or off to make it work.

    A quantum computer is not limited to this “either/or” way of thinking. Its memory is made up of quantum bits, or qubits—tiny particles of matter like atoms or electrons. And qubits can do “both/and,” meaning that they can be in a superposition of all possible combinations of zeros and ones; they can be all of those states simultaneously.
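    As a minimal illustration of that "both/and" behaviour, a single qubit can be written as a two-component vector of amplitudes; squaring the amplitudes gives the probabilities of reading out 0 or 1. The NumPy sketch below is generic, not tied to any particular lab's hardware or software:

```python
import numpy as np

# A single qubit as a 2-component state vector: one amplitude for |0>, one for |1>.
zero = np.array([1, 0], dtype=complex)
one = np.array([0, 1], dtype=complex)

psi = (zero + one) / np.sqrt(2)      # equal superposition of |0> and |1>
probabilities = np.abs(psi) ** 2     # Born rule: probability = |amplitude|^2
print(probabilities)                 # [0.5 0.5]

# Each measurement collapses the superposition to a definite 0 or 1.
samples = np.random.choice([0, 1], size=10, p=probabilities)
print(samples)
```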

    For CERN, the quantum promise could, for instance, help its scientists find evidence of supersymmetry, or SUSY, which so far has proven elusive.

    Standard Model of Supersymmetry via DESY

    At the moment, researchers spend weeks and months sifting through the debris from proton-proton collisions in the LHC, trying to find exotic, heavy sister-particles to all our known particles of matter. The quest has now lasted decades, and a number of physicists are questioning whether the theory behind SUSY is really valid. A quantum computer would greatly speed up analysis of the collisions, hopefully finding evidence of supersymmetry much sooner—or at least allowing us to ditch the theory and move on.

    A quantum device might also help scientists understand the evolution of the early universe, the first few minutes after the Big Bang. Physicists are pretty confident that back then, our universe was nothing but a strange soup of subatomic particles called quarks and gluons. To understand how this quark-gluon plasma has evolved into the universe we have today, researchers simulate the conditions of the infant universe and then test their models at the LHC, with multiple collisions. Performing a simulation on a quantum computer, governed by the same laws that govern the very particles that the LHC is smashing together, could lead to a much more accurate model to test.

    Beyond pure science, banks, pharmaceutical companies, and governments are also waiting to get their hands on computing power that could be tens or even hundreds of times greater than that of any traditional computer.

    And they’ve been waiting for decades. Google is in the race, as are IBM, Microsoft, Intel and a clutch of startups, academic groups, and the Chinese government. The stakes are incredibly high. Last October, the European Union pledged to give $1 billion to over 5,000 European quantum technology researchers over the next decade, while venture capitalists invested some $250 million in various companies researching quantum computing in 2018 alone. “This is a marathon,” says David Reilly, who leads Microsoft’s quantum lab at the University of Sydney, Australia. “And it’s only 10 minutes into the marathon.”

    Despite the hype surrounding quantum computing and the media frenzy triggered by every announcement of a new qubit record, none of the competing teams have come close to reaching even the first milestone, fancily called quantum supremacy—the moment when a quantum computer performs at least one specific task better than a standard computer. Any kind of task, even if it is totally artificial and pointless. There are plenty of rumors in the quantum community that Google may be close, although if true, it would give the company bragging rights at best, says Michael Biercuk, a physicist at the University of Sydney and founder of quantum startup Q-CTRL. “It would be a bit of a gimmick—an artificial goal,” says Reilly. “It’s like concocting some mathematical problem that really doesn’t have an obvious impact on the world just to say that a quantum computer can solve it.”

    That’s because the first real checkpoint in this race is much further away. Called quantum advantage, it would see a quantum computer outperform normal computers on a truly useful task. (Some researchers use the terms quantum supremacy and quantum advantage interchangeably.) And then there is the finish line, the creation of a universal quantum computer. The hope is that it would deliver a computational nirvana with the ability to perform a broad range of incredibly complex tasks. At stake is the design of new molecules for life-saving drugs, helping banks to adjust the riskiness of their investment portfolios, a way to break all current cryptography and develop new, stronger systems, and for scientists at CERN, a way to glimpse the universe as it was just moments after the Big Bang.

    Slowly but surely, work is already underway. Federico Carminati, a physicist at CERN, admits that today’s quantum computers wouldn’t give researchers anything more than classical machines, but, undeterred, he’s started tinkering with IBM’s prototype quantum device via the cloud while waiting for the technology to mature. It’s the latest baby step in the quantum marathon. The deal between CERN and IBM was struck in November last year at an industry workshop organized by the research organization.

    Set up to exchange ideas and discuss potential collab­orations, the event had CERN’s spacious auditorium packed to the brim with researchers from Google, IBM, Intel, D-Wave, Rigetti, and Microsoft. Google detailed its tests of Bristlecone, a 72-qubit machine. Rigetti was touting its work on a 128-qubit system. Intel showed that it was in close pursuit with 49 qubits. For IBM, physicist Ivano Tavernelli took to the stage to explain the company’s progress.

    IBM has steadily been boosting the number of qubits on its quantum computers, starting with a meagre 5-qubit computer, then 16- and 20-qubit machines, and just recently showing off its 50-qubit processor.

    IBM iconic image of Quantum computer

    Carminati listened to Tavernelli, intrigued, and during a much needed coffee break approached him for a chat. A few minutes later, CERN had added a quantum computer to its impressive technology arsenal. CERN researchers are now starting to develop entirely new algorithms and computing models, aiming to grow together with the device. “A fundamental part of this process is to build a solid relationship with the technology providers,” says Carminati. “These are our first steps in quantum computing, but even if we are coming relatively late into the game, we are bringing unique expertise in many fields. We are experts in quantum mechanics, which is at the base of quantum computing.”

    The attraction of quantum devices is obvious. Take standard computers. The prediction by former Intel CEO Gordon Moore in 1965 that the number of components in an integrated circuit would double roughly every two years has held true for more than half a century. But many believe that Moore’s law is about to hit the limits of physics. Since the 1980s, however, researchers have been pondering an alternative. The idea was popularized by Richard Feynman, an American physicist at Caltech in Pasadena. During a lecture in 1981, he lamented that computers could not really simulate what was happening at a subatomic level, with tricky particles like electrons and photons that behave like waves but also dare to exist in two states at once, a phenomenon known as quantum superposition.

    Feynman proposed to build a machine that could. “I’m not happy with all the analyses that go with just the classical theory, because nature isn’t classical, dammit,” he told the audience back in 1981. “And if you want to make a simulation of nature, you’d better make it quantum mechanical, and by golly it’s a wonderful problem, because it doesn’t look so easy.”

    And so the quantum race began. Qubits can be made in different ways, but the rule is that two qubits can be both in state A, both in state B, one in state A and one in state B, or vice versa, so there are four possible combinations in total. And you won’t know what state a qubit is in until you measure it and the qubit is yanked out of its quantum world of probabilities into our mundane physical reality.

    In theory, a quantum computer would process all the states a qubit can have at once, and with every qubit added to its memory size, its computational power should increase exponentially. So, for three qubits, there are eight states to work with simultaneously, for four, 16; for 10, 1,024; and for 20, a whopping 1,048,576 states. You don’t need a lot of qubits to quickly surpass the memory banks of the world’s most powerful modern supercomputers—meaning that for specific tasks, a quantum computer could find a solution much faster than any regular computer ever would. Add to this another crucial concept of quantum mechanics: entanglement. It means that qubits can be linked into a single quantum system, where operating on one affects the rest of the system. This way, the computer can harness the processing power of both simultaneously, massively increasing its computational ability.
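    The exponential growth is easy to make concrete: an n-qubit register is described by 2^n complex amplitudes, which is also why simulating a few dozen qubits exhausts classical memory. A rough Python sketch (the 16 bytes per amplitude simply assumes double-precision complex numbers):

```python
def human(nbytes):
    """Format a byte count with a sensible unit."""
    for unit in ["B", "KB", "MB", "GB", "TB", "PB"]:
        if nbytes < 1000:
            return f"{nbytes:,.1f} {unit}"
        nbytes /= 1000
    return f"{nbytes:,.1f} EB"

# Number of basis states in an n-qubit register, and the classical memory
# needed to store the full state vector (16 bytes per complex amplitude).
for n in [3, 4, 10, 20, 30, 50]:
    states = 2 ** n
    print(f"{n:2d} qubits -> {states:>20,d} states, {human(states * 16)} to simulate")
```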

    While a number of companies and labs are competing in the quantum marathon, many are running their own races, taking different approaches. One device has even been used by a team of researchers to analyze CERN data, albeit not at CERN. Last year, physicists from the California Institute of Technology in Pasadena and the University of Southern California managed to replicate the discovery of the Higgs boson, found at the LHC in 2012, by sifting through the collider’s troves of data using a quantum computer manufactured by D-Wave, a Canadian firm based in Burnaby, British Columbia. The findings didn’t arrive any quicker than on a traditional computer, but, crucially, the research showed a quantum machine could do the work.

    One of the oldest runners in the quantum race, D-Wave announced back in 2007 that it had built a fully functioning, commercially available 16-qubit quantum computer prototype—a claim that’s controversial to this day. D-Wave focuses on a technology called quantum annealing, based on the natural tendency of real-world quantum systems to find low-energy states (a bit like a spinning top that inevitably will fall over). A D-Wave quantum computer imagines the possible solutions of a problem as a landscape of peaks and valleys; each coordinate represents a possible solution and its elevation represents its energy. Annealing allows you to set up the problem, and then let the system fall into the answer—in about 20 milliseconds. As it does so, it can tunnel through the peaks as it searches for the lowest valleys. It finds the lowest point in the vast landscape of solutions, which corresponds to the best possible outcome—although it does not attempt to fully correct for any errors, inevitable in quantum computation. D-Wave is now working on a prototype of a universal annealing quantum computer, says Alan Baratz, the company’s chief product officer.
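    The "landscape of peaks and valleys" picture can be illustrated with a purely classical toy: a simulated-annealing walker that gradually cools onto the lowest valley of a made-up energy function. This is only an analogy for the idea, not D-Wave's hardware; a real quantum annealer explores the landscape quantum mechanically, including tunneling through barriers, and the landscape itself encodes the user's problem:

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy 1D "landscape" of candidate solutions: each x is a solution,
# energy[x] is how bad it is. The best answer is the lowest valley.
x = np.linspace(0, 10, 1000)
energy = np.sin(3 * x) + 0.3 * (x - 6) ** 2

def anneal(steps=20_000, t_start=5.0, t_end=0.01):
    i = rng.integers(len(x))                                  # random starting solution
    for step in range(steps):
        temp = t_start * (t_end / t_start) ** (step / steps)  # cooling schedule
        j = int(np.clip(i + rng.integers(-5, 6), 0, len(x) - 1))
        # Always accept downhill moves; accept uphill moves with small probability.
        if energy[j] < energy[i] or rng.random() < np.exp((energy[i] - energy[j]) / temp):
            i = j
    return x[i], energy[i]

print(anneal())   # ends up near the global minimum of the landscape
```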

    Apart from D-Wave’s quantum annealing, there are three other main approaches to try and bend the quantum world to our whim: integrated circuits, topological qubits and ions trapped with lasers. CERN is placing high hopes on the first method but is closely watching other efforts too.

    IBM, whose computer Carminati has just started using, as well as Google and Intel, all make quantum chips with integrated circuits—quantum gates—that are superconducting, a state when certain metals conduct electricity with zero resistance. Each quantum gate holds a pair of very fragile qubits. Any noise will disrupt them and introduce errors—and in the quantum world, noise is anything from temperature fluctuations to electromagnetic and sound waves to physical vibrations.

    To isolate the chip from the outside world as much as possible and get the circuits to exhibit quantum mechanical effects, it needs to be supercooled to extremely low temperatures. At the IBM quantum lab in Zurich, the chip is housed in a white tank—a cryostat—suspended from the ceiling. The temperature inside the tank is a steady 10 millikelvin or –273 degrees Celsius, a fraction above absolute zero and colder than outer space. But even this isn’t enough.

    Just working with the quantum chip, when scientists manipulate the qubits, causes noise. “The outside world is continually interacting with our quantum hardware, damaging the information we are trying to process,” says physicist John Preskill at the California Institute of Technology, who in 2012 coined the term quantum supremacy. It’s impossible to get rid of the noise completely, so researchers are trying to suppress it as much as possible, hence the ultracold temperatures to achieve at least some stability and allow more time for quantum computations.

    “My job is to extend the lifetime of qubits, and we’ve got four of them to play with,” says Matthias Mergenthaler, an Oxford University postdoctoral researcher working at IBM’s Zurich lab. That doesn’t sound like a lot, but, he explains, it’s not so much the number of qubits that counts but their quality, meaning qubits with as low a noise level as possible, to ensure they last as long as possible in superposition and allow the machine to compute. And it’s here, in the fiddly world of noise reduction, that quantum computing hits up against one of its biggest challenges. Right now, the device you’re reading this on probably performs at a level similar to that of a quantum computer with 30 noisy qubits. But if you can reduce the noise, then the quantum computer is many times more powerful.

    Once the noise is reduced, researchers try to correct any remaining errors with the help of special error-correcting algorithms, run on a classical computer. The problem is, such error correction works qubit by qubit, so the more qubits there are, the more errors the system has to cope with. Say a computer makes an error once every 1,000 computational steps; it doesn’t sound like much, but after 1,000 or so operations, the program will output incorrect results. To be able to achieve meaningful computations and surpass standard computers, a quantum machine needs about 1,000 qubits that are relatively low noise and whose errors are corrected as far as possible. When you put them all together, these 1,000 qubits will make up what researchers call a logical qubit. None yet exist—so far, the best that prototype quantum devices have achieved is error correction for up to 10 qubits. That’s why these prototypes are called noisy intermediate-scale quantum computers (NISQ), a term also coined by Preskill in 2017.
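    The arithmetic behind "an error once every 1,000 steps" is worth spelling out: if errors strike independently, the chance of a completely error-free run decays exponentially with the number of operations. A small Python check:

```python
# If each operation fails with probability p, the probability that a run of
# N operations finishes with no error at all is (1 - p)**N.
p = 1 / 1000    # an error once every 1,000 computational steps

for steps in [100, 1_000, 10_000]:
    ok = (1 - p) ** steps
    print(f"{steps:>6} operations -> {ok:.1%} chance of an error-free run")
```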

    For Carminati, it’s clear the technology isn’t ready yet. But that isn’t really an issue. At CERN the challenge is to be ready to unlock the power of quantum computers when and if the hardware becomes available. “One exciting possibility will be to perform very, very accurate simulations of quantum systems with a quantum computer—which in itself is a quantum system,” he says. “Other groundbreaking opportunities will come from the blend of quantum computing and artificial intelligence to analyze big data, a very ambitious proposition at the moment, but central to our needs.”

    But some physicists think NISQ machines will stay just that—noisy—forever. Gil Kalai, a professor at Yale University, says that error correcting and noise suppression will never be good enough to allow any kind of useful quantum computation. And it’s not even due to technology, he says, but to the fundamentals of quantum mechanics. Interacting systems have a tendency for errors to be connected, or correlated, he says, meaning errors will affect many qubits simultaneously. Because of that, it simply won’t be possible to create error-correcting codes that keep noise levels low enough for a quantum computer with the required large number of qubits.

    “My analysis shows that noisy quantum computers with a few dozen qubits deliver such primitive computational power that it will simply not be possible to use them as the building blocks we need to build quantum computers on a wider scale,” he says. Among scientists, such skepticism is hotly debated. The blogs of Kalai and fellow quantum skeptics are forums for lively discussion, as was a recent much-shared article titled “The Case Against Quantum Computing”—followed by its rebuttal, “The Case Against the Case Against Quantum Computing.”

    For now, the quantum critics are in a minority. “Provided the qubits we can already correct keep their form and size as we scale, we should be okay,” says Ray Laflamme, a physicist at the University of Waterloo in Ontario, Canada. The crucial thing to watch out for right now is not whether scientists can reach 50, 72, or 128 qubits, but whether scaling quantum computers to this size significantly increases the overall rate of error.

    The Quantum Nano Centre in Canada is one of numerous big-budget research and development labs focussed on quantum computing. James Brittain/Getty Images

    Others believe that the best way to suppress noise and create logical qubits is by making qubits in a different way. At Microsoft, researchers are developing topological qubits—although its array of quantum labs around the world has yet to create a single one. If it succeeds, these qubits would be much more stable than those made with integrated circuits. Microsoft’s idea is to split a particle—for example an electron—in two, creating Majorana fermion quasi-particles. They were theorized back in 1937, and in 2012 researchers at Delft University of Technology in the Netherlands, working at Microsoft’s condensed matter physics lab, obtained the first experimental evidence of their existence.

    “You will only need one of our qubits for every 1,000 of the other qubits on the market today,” says Chetan Nayak, general manager of quantum hardware at Microsoft. In other words, every single topological qubit would be a logical one from the start. Reilly believes that researching these elusive qubits is worth the effort, despite years with little progress, because if one is created, scaling such a device to thousands of logical qubits would be much easier than with a NISQ machine. “It will be extremely important for us to try out our code and algorithms on different quantum simulators and hardware solutions,” says Carminati. “Sure, no machine is ready for prime time quantum production, but neither are we.”

    Another company Carminati is watching closely is IonQ, a US startup that spun out of the University of Maryland. It uses the third main approach to quantum computing: trapping ions. They are naturally quantum, having superposition effects right from the start and at room temperature, meaning that they don’t have to be supercooled like the integrated circuits of NISQ machines. Each ion serves as a single qubit, and researchers trap them in tiny silicon ion traps and then use lasers to run algorithms by varying the times and intensities at which each tiny laser beam hits the qubits. The beams encode data onto the ions and read it out from them by getting each ion to change its electronic state.

    In December, IonQ unveiled its commercial device, capable of hosting 160 ion qubits and performing simple quantum operations on a string of 79 qubits. Still, right now, ion qubits are just as noisy as those made by Google, IBM, and Intel, and neither IonQ nor any other labs around the world experimenting with ions have achieved quantum supremacy.

    As the noise and hype surrounding quantum computers rumbles on, at CERN, the clock is ticking. The collider will wake up in just five years, ever mightier, and all that data will have to be analyzed. A non-noisy, error-corrected quantum computer will then come in quite handy.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 11:49 am on March 1, 2019 Permalink | Reply
    Tags: "Yale researchers create a ‘universal entangler’ for new quantum tech", Potential uses in quantum computing and cryptography and quantum communications, , Qubits, The entangling mechanism is called an exponential-SWAP gate,   

    From Yale University: “Yale researchers create a ‘universal entangler’ for new quantum tech” 

    Yale University bloc

    From Yale University

    February 27, 2019
    Jim Shelton

    One of the key concepts in quantum physics is entanglement, in which two or more quantum systems become so inextricably linked that their collective state can’t be determined by observing each element individually. Now Yale researchers have developed a “universal entangler” that can link a variety of encoded particles on demand.

    The discovery represents a powerful new mechanism with potential uses in quantum computing, cryptography, and quantum communications. The research is led by the Yale laboratory of Robert Schoelkopf and appears in the journal Nature.

    Quantum calculations are accomplished with delicate bits of data called qubits, which are prone to errors. To implement faithful quantum computation, scientists say, they need “logical” qubits whose errors can be detected and rectified using quantum error correction codes.
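    The simplest way to see how redundancy lets errors be detected and fixed is the classical three-bit repetition code: copy one logical bit onto three physical bits and decode by majority vote. The bosonic codes used in the Yale experiment are far more sophisticated, but the underlying idea of trading extra hardware for a lower logical error rate is the same. A hedged NumPy sketch (not the scheme from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)

def logical_error_rate(p, trials=100_000):
    """Encode random bits as (b, b, b), flip each copy with probability p,
    decode by majority vote, and report how often the decoded bit is wrong."""
    logical = rng.integers(0, 2, size=trials)
    physical = np.repeat(logical[:, None], 3, axis=1)       # encode: b -> (b, b, b)
    flips = rng.random(physical.shape) < p                  # independent bit flips
    noisy = physical ^ flips.astype(int)
    decoded = (noisy.sum(axis=1) >= 2).astype(int)          # majority vote
    return np.mean(decoded != logical)

for p in [0.01, 0.05, 0.10]:
    print(f"physical error {p:.0%} -> logical error {logical_error_rate(p):.3%}")
```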

    “We’ve shown a new way of creating gates between logically-encoded qubits that can eventually be error-corrected,” said Schoelkopf, the Sterling Professor of Applied Physics and Physics at Yale and director of the Yale Quantum Institute. “It’s a much more sophisticated operation than what has been performed previously.”

    The entangling mechanism is called an exponential-SWAP gate. In the study, researchers demonstrated the new technology by deterministically entangling encoded states in any chosen configurations or codes, each housed in two otherwise isolated, 3D superconducting microwave cavities.

    Yale researchers have created a way to entangle a variety of encoded particles on demand.

    “This universal entangler is critical for robust quantum computation,” said Yvonne Gao, co-first author of the study. “Scientists have invented a wealth of hardware-efficient, quantum error correction codes — each one cleverly designed with unique characteristics that can be exploited for different applications. However, each of them requires wiring up a new set of tailored operations, introducing a significant hardware overhead and reduced versatility.”

    The universal entangler mitigates this limitation by providing a gate between any desired input states. “We can now choose any desired codes or even change them on the fly without having to re-wire the operation,” said co-first author Brian Lester.

    The discovery is just the latest step in Yale’s quantum research work. Yale scientists are at the forefront of efforts to develop the first fully useful quantum computers and have done pioneering work in quantum computing with superconducting circuits.

    Additional authors of the study are Kevin Chou, Luigi Frunzio, Michel Devoret, Liang Jiang, and Steven Girvin. The research was supported by the U.S. Army Research Office.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Yale University Campus

    Yale University comprises three major academic components: Yale College (the undergraduate program), the Graduate School of Arts and Sciences, and the professional schools. In addition, Yale encompasses a wide array of centers and programs, libraries, museums, and administrative support offices. Approximately 11,250 students attend Yale.

     
  • richardmitnick 10:22 am on February 23, 2019 Permalink | Reply
    Tags: Qubits, Semiconductor quantum dots

    From University of Cambridge: “Physicists get thousands of semiconductor nuclei to do ‘quantum dances’ in unison” 

    U Cambridge bloc

    From University of Cambridge

    22 Feb 2019
    Communications office

    Theoretical ESR spectrum buildup as a function of two-photon detuning δ and drive time τ, for a Rabi frequency of Ω = 3.3 MHz on the central transition. Credit: University of Cambridge.

    A team of Cambridge researchers have found a way to control the sea of nuclei in semiconductor quantum dots so they can operate as a quantum memory device.

    Quantum dots are crystals made up of thousands of atoms, and each of these atoms interacts magnetically with the trapped electron. If left to its own devices, this interaction of the electron with the nuclear spins limits the usefulness of the electron as a quantum bit – a qubit.

    Led by Professor Mete Atatüre from Cambridge’s Cavendish Laboratory, the researchers are exploiting the laws of quantum physics and optics to investigate computing, sensing or communication applications.

    “Quantum dots offer an ideal interface, as mediated by light, to a system where the dynamics of individual interacting spins could be controlled and exploited,” said Atatüre, who is a Fellow of St John’s College. “Because the nuclei randomly ‘steal’ information from the electron they have traditionally been an annoyance, but we have shown we can harness them as a resource.”

    The Cambridge team found a way to exploit the interaction between the electron and the thousands of nuclei using lasers to ‘cool’ the nuclei to less than 1 millikelvin, or a thousandth of a degree above absolute zero. They then showed they can control and manipulate the thousands of nuclei as if they form a single body in unison, like a second qubit. This proves the nuclei in the quantum dot can exchange information with the electron qubit and can be used to store quantum information as a memory device. The results are reported in the journal Science.

    Quantum computing aims to harness fundamental concepts of quantum physics, such as entanglement and the superposition principle, to outperform current approaches to computing and could revolutionise technology, business and research. Just like classical computers, quantum computers need a processor, memory, and a bus to transport the information backwards and forwards. The processor is a qubit, which can be an electron trapped in a quantum dot; the bus is a single photon, which these quantum dots generate and which is ideal for exchanging information. But the missing link for quantum dots is quantum memory.

    Atatüre said: “Instead of talking to individual nuclear spins, we worked on accessing collective spin waves by lasers. This is like a stadium where you don’t need to worry about who raises their hands in the Mexican wave going round, as long as there is one collective wave because they all dance in unison.

    “We then went on to show that these spin waves have quantum coherence. This was the missing piece of the jigsaw and we now have everything needed to build a dedicated quantum memory for every qubit.”

    In quantum technologies, the photon, the qubit and the memory need to interact with each other in a controlled way. This is mostly realised by interfacing different physical systems to form a single hybrid unit which can be inefficient. The researchers have been able to show that in quantum dots, the memory element is automatically there with every single qubit.

    Dr Dorian Gangloff, one of the first authors of the paper [Science] and a Fellow at St John’s, said the discovery will renew interest in these types of semiconductor quantum dots. Dr Gangloff explained: “This is a Holy Grail breakthrough for quantum dot research – both for quantum memory and fundamental research; we now have the tools to study dynamics of complex systems in the spirit of quantum simulation.”

    The long term opportunities of this work could be seen in the field of quantum computing. Last month, IBM launched the world’s first commercial quantum computer, and the Chief Executive of Microsoft has said quantum computing has the potential to ‘radically reshape the world’.

    Gangloff said: “The impact of the qubit could be half a century away but the power of disruptive technology is that it is hard to conceive of the problems we might open up – you can try to think of it as known unknowns but at some point you get into new territory. We don’t yet know the kind of problems it will help to solve which is very exciting.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    U Cambridge Campus

    The University of Cambridge (abbreviated as Cantab in post-nominal letters) is a collegiate public research university in Cambridge, England. Founded in 1209, Cambridge is the second-oldest university in the English-speaking world and the world’s fourth-oldest surviving university. It grew out of an association of scholars who left the University of Oxford after a dispute with townsfolk. The two ancient universities share many common features and are often jointly referred to as “Oxbridge”.

    Cambridge is formed from a variety of institutions which include 31 constituent colleges and over 100 academic departments organised into six schools. The university occupies buildings throughout the town, many of which are of historical importance. The colleges are self-governing institutions founded as integral parts of the university. In the year ended 31 July 2014, the university had a total income of £1.51 billion, of which £371 million was from research grants and contracts. The central university and colleges have a combined endowment of around £4.9 billion, the largest of any university outside the United States. Cambridge is a member of many associations and forms part of the “golden triangle” of leading English universities and Cambridge University Health Partners, an academic health science centre. The university is closely linked with the development of the high-tech business cluster known as “Silicon Fen”.

     
  • richardmitnick 2:32 pm on October 5, 2018 Permalink | Reply
    Tags: DOE Office of High Energy Physics, ORNL researchers advance quantum computing science through six DOE awards, Qubits

    From Oak Ridge National Laboratory: “ORNL researchers advance quantum computing, science through six DOE awards” 


    From Oak Ridge National Laboratory

    October 3, 2018
    Scott Jones, Communications
    jonesg@ornl.gov
    865.241.6491

    Oak Ridge National Laboratory will be working on new projects aimed at accelerating quantum information science. Credit: Andy Sproles/Oak Ridge National Laboratory, U.S. Dept. of Energy.

    ORNL researchers will leverage various microscopy platforms for quantum computing projects. Credit: Genevieve Martin/Oak Ridge National Laboratory, U.S. Dept. of Energy.

    The Department of Energy’s Oak Ridge National Laboratory is the recipient of six awards from DOE’s Office of Science aimed at accelerating quantum information science (QIS), a burgeoning field of research increasingly seen as vital to scientific innovation and national security.

    The awards, which were made in conjunction with the White House Summit on Advancing American Leadership in QIS, will leverage and strengthen ORNL’s established programs in quantum information processing and quantum computing.

    The application of quantum mechanics to computing and the processing of information has enormous potential for innovation across the scientific spectrum. Quantum technologies use units known as qubits to greatly increase the threshold at which information can be transmitted and processed. Whereas traditional “bits” have a value of either 0 or 1, qubits are encoded with values of both 0 and 1, or any combination thereof, at the same time, allowing for a vast number of possibilities for storing data.

    While in its infancy, the technology is being harnessed to develop computers that, when mature, will be exponentially more powerful than today’s leading systems. Beyond computing, however, quantum information science shows great promise to advance a vast array of research domains, from encryption to artificial intelligence to cosmology.

    The ORNL awards represent three Office of Science programs.

    “Software Stack and Algorithms for Automating Quantum-Classical Computing,” a new project supported by the Office of Advanced Scientific Computing Research, will develop methods for programming quantum computers. Led by ORNL’s Pavel Lougovski, the team of researchers from ORNL, Johns Hopkins University Applied Physics Lab, University of Southern California, University of Maryland, Georgetown University, and Microsoft, will tackle translating scientific applications into functional quantum programs that return accurate results when executed on real-world faulty quantum hardware. The team will develop an open-source algorithm and software stack that will automate the process of designing, executing, and analyzing the results of quantum algorithms, thus enabling new discovery across many scientific domains with an emphasis on applications in quantum field theory, nuclear physics, condensed matter, and quantum machine learning.
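    The general shape of such a hybrid quantum-classical workflow is a loop: a classical optimizer repeatedly calls a quantum routine that prepares a parameterized state and returns a measured quantity, then adjusts the parameters. The sketch below is not the ORNL software stack; it stands in for the quantum step with a single-qubit state vector in NumPy and uses SciPy as the classical outer loop, purely to show the pattern:

```python
import numpy as np
from scipy.optimize import minimize

# Toy 2x2 "Hamiltonian" whose lowest eigenvalue we want to estimate.
H = np.array([[1.0, 0.5],
              [0.5, -1.0]])

def energy(theta):
    # "Quantum" subroutine (simulated): prepare cos(t/2)|0> + sin(t/2)|1>
    # and return the expectation value <psi|H|psi>.
    psi = np.array([np.cos(theta[0] / 2), np.sin(theta[0] / 2)])
    return float(psi @ H @ psi)

# Classical outer loop: adjust the circuit parameter to minimize the energy.
result = minimize(energy, x0=[0.3], method="Nelder-Mead")
print(result.x, result.fun)   # optimal angle and estimated ground-state energy (~ -1.118)
```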

    ORNL’s Christopher M. Rouleau will lead the “Thin Film Platform for Rapid Prototyping Novel Materials with Entangled States for Quantum Information Science” project, funded by Basic Energy Sciences. The project aims to establish an agile AI-guided synthesis platform coupling reactive pulsed laser deposition with quick decision-making diagnostics to enable the rapid exploration of a wide spectrum of candidate thin-film materials for QIS; understand the dynamics of photonic states by combining a novel cathodoluminescence scanning electron microscopy platform with ultrafast laser spectroscopy; and enable understanding of entangled spin states for topological quantum computing by developing a novel scanning tunneling microscopy platform.

    ORNL’s Stephen Jesse will lead the “Understanding and Controlling Entangled and Correlated Quantum States in Confined Solid-State Systems Created via Atomic Scale Manipulation,” a new project supported by Basic Energy Sciences that includes collaborators from Harvard and MIT. The goal of the project is to use advanced electron microscopes to engineer novel materials on an atom-by-atom basis for use in QIS. These microscopes, along with other powerful instrumentation, will also be used to assess emerging quantum properties in-situ to aid the assembly process. Collaborators from Harvard will provide theoretical and computational effort to design quantum properties on demand using ORNL’s high-performance computing resources.

    ORNL is also partnering with Pacific Northwest National Laboratory, Berkeley Laboratory, and the University of Michigan on a project funded by the Office of Basic Energy Sciences titled “Embedding Quantum Computing into Many-Body Frameworks for Strongly-Correlated Molecular and Materials Systems.” The research team will develop methods for solving problems in computational chemistry for highly correlated electronic states. ORNL’s contribution, led by Travis Humble, will support this collaboration by translating applications of computational chemistry into the language needed for running on quantum computers and testing these ideas on experimental hardware.

    ORNL will support multiple projects awarded by the Office of High Energy Physics to develop methods for detecting high-energy particles using quantum information science. They include:

    “Quantum-Enhanced Detection of Dark Matter and Neutrinos,” in collaboration with the University of Wisconsin, Tufts, and San Diego State University. This project will use quantum simulation to calculate detector responses to dark matter particles and neutrinos. A new simulation technique under development will require extensive work in error mitigation strategies to correctly evaluate scattering cross sections and other physical quantities. ORNL’s effort, led by Raphael Pooser, will help develop these simulation techniques and error mitigation strategies for the new quantum simulator device, thus ensuring successful detector calculations.

    “Particle Track Pattern Recognition via Content Addressable Memory and Adiabatic Quantum Optimization: OLYMPUS Experiment Revisited,” a collaboration with Johns Hopkins Applied Physics Laboratory aimed at identifying rare events found in the data generated by experiments at particle colliders. ORNL principal investigator Travis Humble will apply new ideas for data analysis using experimental quantum computers that target faster response times and greater memory capacity for tracking signatures of high-energy particles.

    “HEP ML and Optimization Go Quantum,” in collaboration with Fermi National Accelerator Laboratory and Lockheed Martin Corporation, which will investigate how quantum machine learning methods may be applied to solving key challenges in optimization and data analysis. Advances in training machine learning networks using quantum computers promise greater accuracy and faster response times for data analysis. ORNL principal investigators Travis Humble and Alex McCaskey will help to develop these new methods for quantum machine learning for existing quantum computers by using the XACC programming tools, which offer a flexible framework by which to integrate quantum computing into scientific software.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    ORNL is managed by UT-Battelle for the Department of Energy’s Office of Science. DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.


     
  • richardmitnick 11:27 am on September 27, 2018 Permalink | Reply
    Tags: Atom-based quantum computer, Qubits, Rubidium atoms, Rydberg state

    From Science Magazine: “Arrays of atoms emerge as dark horse candidate to power quantum computers” 

    AAAS
    From Science Magazine

    Sep. 26, 2018
    Sophia Chen

    Lasers are used to trap arrays of atoms within glass chambers made by ColdQuanta, a neutral atom quantum computing startup.
    COLDQUANTA INC.

    In a small basement laboratory, Harry Levine, a Harvard University graduate student in physics, can assemble a rudimentary computer in a fraction of a second. There isn’t a processor chip in sight; his computer is powered by 51 rubidium atoms that reside in a glass cell the size of a matchbox. To create his computer, he lines up the atoms in single file, using a laser split into 51 beams. More lasers—six beams per atom—slow the atoms until they are nearly motionless. Then, with yet another set of lasers, he coaxes the atoms to interact with each other, and, in principle, perform calculations.

    It’s a quantum computer, which manipulates “qubits” that can encode zeroes and ones simultaneously in what’s called a superposition state. If scaled up, it might vastly outperform conventional computers at certain tasks. But in the world of quantum computing, Levine’s device is somewhat unusual. In the race to build a practical quantum device, investment has largely gone to qubits that can be built on silicon, such as tiny circuits of superconducting wire and small semiconductor structures known as quantum dots. Now, two recent studies have demonstrated the promise of the qubits Levine works with: neutral atoms. In one study, a group including Levine showed a quantum logic gate made of two neutral atoms could work with far fewer errors than ever before. And in another, researchers built 3D structures of carefully arranged atoms, showing that more qubits can be packed into a small space by taking advantage of the third dimension.

    The advances, along with the arrival of venture capital funding, suggest neutral atoms could be on the upswing, says Dana Anderson, CEO of ColdQuanta, a Boulder, Colorado–based company that is developing an atom-based quantum computer. “We’ve done our homework,” Anderson says. “This is really in the engineering arena now.”

    Because neutral atoms lack electric charge and interact reluctantly with other atoms, they would seem to make poor qubits. But by using specifically timed laser pulses, physicists can excite an atom’s outermost electron and move it away from the nucleus, inflating the atom to billions of times its usual size. Once in this so-called Rydberg state, the atom behaves more like an ion, interacting electromagnetically with neighboring atoms and preventing them from becoming Rydberg atoms themselves.

    Physicists can exploit that behavior to create entanglement—the quantum state of interdependence needed to perform a computation. If two adjacent atoms are excited into superposition, where both are partially in a Rydberg state and partially in their ground state, a measurement will collapse the atoms to one or the other state. But because only one of the atoms can be in its Rydberg state, the atoms are entangled, with the state of one depending on the state of the other.
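    A minimal numerical sketch of that blockade mechanism (illustrative parameters only, not the Harvard setup): drive two two-level atoms while heavily penalizing the doubly excited state, and the pair oscillates from |gg> into the entangled state (|gr> + |rg>)/√2:

```python
import numpy as np
from scipy.linalg import expm

# Two atoms, each with a ground state |g> and a Rydberg state |r>.
# A laser couples g<->r on both atoms (Rabi frequency Omega), while a large
# interaction V penalizes the doubly excited state |rr> (the blockade).
Omega, V = 1.0, 50.0

g, r = np.array([1.0, 0.0]), np.array([0.0, 1.0])
sx = np.array([[0.0, 1.0], [1.0, 0.0]])       # g<->r coupling for one atom
I2 = np.eye(2)

H = 0.5 * Omega * (np.kron(sx, I2) + np.kron(I2, sx)) \
    + V * np.outer(np.kron(r, r), np.kron(r, r))

psi0 = np.kron(g, g)                          # both atoms start in |gg>
t = np.pi / (np.sqrt(2) * Omega)              # first blockaded half-oscillation
psi = expm(-1j * H * t) @ psi0

bell = (np.kron(g, r) + np.kron(r, g)) / np.sqrt(2)
print("overlap with (|gr>+|rg>)/sqrt(2):", abs(bell @ psi) ** 2)          # close to 1
print("double-excitation probability:  ", abs(np.kron(r, r) @ psi) ** 2)  # close to 0
```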

    Once entangled, neutral atoms offer some inherent advantages. Atoms need no quality control: They are by definition identical. They’re much smaller than silicon-based qubits, which means, in theory, more qubits can be packed into a small space. The systems operate at room temperature, whereas superconducting qubits need to be placed inside a bulky freezer. And because neutral atoms don’t interact easily, they are more immune to outside noise and can hold onto quantum information for a relatively long time. “Neutral atoms have great potential,” says Mark Saffman, a physicist at the University of Wisconsin in Madison. “From a physics perspective, [they could offer] easier scalability and ultimately better performance.”

    Entangled atoms

    The two new studies bolster these claims. By engineering better quality lasers, Levine and his colleagues, led by physicist Mikhail Lukin at Harvard, were able to accurately program a two-rubidium atom logic gate 97% of the time, they report in a paper published on 20 September in Physical Review Letters. That puts the method closer to the performance of superconducting qubits, which already achieve fidelity rates above 99%. In a second study, published in Nature on 5 September, Antoine Browaeys of the Charles Fabry Laboratory near Paris and his colleagues demonstrated an unprecedented level of control over a 3D array of 72 atoms. To show off their control, they even arranged the atoms into the shape of the Eiffel Tower. Another popular qubit type, ions, are comparably small. But they can’t be stacked this densely because they repel each other, acknowledges Crystal Senko, a physicist at the University of Waterloo in Canada who works on ion quantum computers.

    Not everyone is convinced. Compared with other qubits, neutral atoms tend not to stay put, says Varun Vaidya, a physicist at Xanadu, a quantum computing company in Toronto, Canada, that builds quantum devices with photon qubits. “The biggest issue is just holding onto the atoms,” he says. If an atom falls out of place, Lukin’s automated laser system can reassemble the atoms in less than a second, but Vaidya says this may still prohibit the devices from performing longer tasks. “Right now, nobody knows what’s going to be the best qubit,” Senko says. “The bottom line is, they all have their problems.”

    Still, ColdQuanta has recently received $6.75 million in venture funding. Another startup, Atom Computing, based in Berkeley, California, has raised $5 million. CEO Ben Bloom says the company will pursue qubits made of atoms with two valence electrons instead of rubidium’s one, such as calcium and strontium. Bloom believes these atoms will allow for longer-lived qubits. Lukin says he’s also interested in commercializing his group’s technology.

    The startups, as well as Saffman’s group, are aiming to build fully programmable quantum computers. For now, Lukin wants his group to focus on building quantum simulators, a more limited kind of computer that specializes in solving specific optimization problems by preparing the qubits a certain way and letting them evolve naturally. Levine says his group’s device could, for example, help telecommunications engineers figure out where to put radio towers to minimize cost and maximize coverage. “We’re going to try to do something useful with these devices,” Levine says. “People still don’t know yet what quantum systems can do.”

    In the next year or two, he and his colleagues think neutral atom devices could deliver an answer.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 9:25 am on September 27, 2018 Permalink | Reply
    Tags: Photonic bandgap, Qubits, Superconducting Metamaterial Traps Quantum Light, Superconducting metamaterials

    From Caltech: “Superconducting Metamaterial Traps Quantum Light” 

    Caltech Logo

    From Caltech

    09/26/2018

    Robert Perkins
    (626) 395-1862
    rperkins@caltech.edu

    A superconducting metamaterial chip mounted into a microwave test package. The purplish-violet reflection in the center is an optical effect that can be seen by the naked eye, and is the result of the diffraction of light by the periodic patterning of the microwave metamaterial. Credit: Oskar Painter/Caltech

    Newly developed material may be key to scaling up quantum circuits.

    Conventional computers store information in a bit, a fundamental unit of logic that can take a value of 0 or 1. Quantum computers rely on quantum bits, also known as “qubits,” as their fundamental building blocks. The state of a qubit, by contrast, can simultaneously have a value of both 0 and 1. This peculiar property, a consequence of the fundamental laws of quantum physics, results in the dramatic complexity of quantum systems.

    Quantum computing is a nascent and rapidly developing field that promises to use this complexity to solve problems that are difficult to tackle with conventional computers. A key challenge for quantum computing, however, is that it requires making large numbers of qubits work together—which is difficult to accomplish while avoiding interactions with the outside environment that would rob the qubits of their quantum properties.

    New research from the lab of Oskar Painter, John G Braun Professor of Applied Physics and Physics in the Division of Engineering and Applied Science, explores the use of superconducting metamaterials to overcome this challenge.

    Metamaterials are specially engineered by combining multiple component materials at a scale smaller than the wavelength of light, giving them the ability to manipulate how particles of light, or photons, behave. Metamaterials can be used to reflect, turn, or focus beams of light in nearly any desired manner. A metamaterial can also create a frequency band where the propagation of photons becomes entirely forbidden, a so-called “photonic bandgap.”
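    A toy model makes the bandgap idea concrete: a chain of identical resonators with alternating coupling strengths supports two bands of allowed frequencies separated by a gap in which no modes exist, so photons in that range cannot propagate. The numbers below are arbitrary and the model is a generic one-dimensional sketch, not the Caltech device:

```python
import numpy as np

N = 60              # number of resonators (even, so every site is paired)
w0 = 5.0            # bare resonator frequency (arbitrary units)
t1, t2 = 1.0, 0.4   # alternating nearest-neighbour couplings

# Coupled-mode matrix of the chain: w0 on the diagonal, alternating couplings
# on the off-diagonals. Its eigenvalues are the allowed mode frequencies.
H = np.diag(np.full(N, w0))
for i in range(N - 1):
    H[i, i + 1] = H[i + 1, i] = t1 if i % 2 == 0 else t2

freqs = np.sort(np.linalg.eigvalsh(H))
gap_low, gap_high = freqs[N // 2 - 1], freqs[N // 2]
print(f"allowed bands: [{freqs[0]:.2f}, {gap_low:.2f}] and [{gap_high:.2f}, {freqs[-1]:.2f}]")
print(f"bandgap width: {gap_high - gap_low:.2f}")   # no propagating modes in this range
```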

    The Caltech team used a photonic bandgap to trap microwave photons in a superconducting quantum circuit, creating a promising technology for building future quantum computers.

    “In principle, this is a scalable and flexible substrate on which to build complex circuits for interconnecting certain types of qubits,” says Painter, leader of the group that conducted the research, which was published in Nature Communications on September 12. “Not only can one play with the spatial arrangement of the connectivity between qubits, but one can also design the connectivity to occur only at certain desired frequencies.”

    Painter and his team created a quantum circuit consisting of thin films of a superconductor—a material that transmits electric current with little to no loss of energy—traced onto a silicon microchip. These superconducting patterns transport microwaves from one part of the microchip to another. What makes the system operate in a quantum regime, however, is the use of a so-called Josephson junction, which consists of an atomically thin non-conductive layer sandwiched between two superconducting electrodes. The Josephson junction creates a source of microwave photons with two distinct and isolated states, like an atom’s ground and excited electronic states, that are involved in the emission of light, or, in the language of quantum computing, a qubit.

    “Superconducting quantum circuits allow one to perform fundamental quantum electrodynamics experiments using a microwave electrical circuit that looks like it could have been yanked directly from your cell phone,” Painter says. “We believe that augmenting these circuits with superconducting metamaterials may enable future quantum computing technologies and further the study of more complex quantum systems that lie beyond our capability to model using even the most powerful classical computer simulations.”

    The paper is titled “Superconducting metamaterials for waveguide quantum electrodynamics.” The team of authors was led by Mohammad Mirhosseini, a Kavli Nanoscience Institute Postdoctoral Scholar at Caltech. Co-authors include postdoctoral scholars Andrew Keller and Alp Sipahigil of the Institute for Quantum Information and Matter (IQIM); and graduate students Eun Jong Kim, Vinicius Ferreira, and Mahmoud Kalaee. The work was performed as part of a pair of Multidisciplinary University Research Initiatives from the Air Force Office of Scientific Research (“Quantum Photonic Matter” and “Wiring Quantum Networks with Mechanical Transducers”), and in conjunction with IQIM, a National Science Foundation Physics Frontiers Center supported by the Gordon and Betty Moore Foundation.

    See the full article here.


    Please help promote STEM in your local schools.


    Stem Education Coalition

    The California Institute of Technology (commonly referred to as Caltech) is a private research university located in Pasadena, California, United States. Caltech has six academic divisions with strong emphases on science and engineering. Its 124-acre (50 ha) primary campus is located approximately 11 mi (18 km) northeast of downtown Los Angeles. “The mission of the California Institute of Technology is to expand human knowledge and benefit society through research integrated with education. We investigate the most challenging, fundamental problems in science and technology in a singularly collegial, interdisciplinary atmosphere, while educating outstanding students to become creative members of society.”

    Caltech campus

     
  • richardmitnick 3:06 pm on September 14, 2018 Permalink | Reply
    Tags: Quantum information science on the verge of a technological revolution, Qubits

    From UC Santa Cruz: “Quantum information science on the verge of a technological revolution” Revised 

    UC Santa Cruz

    From UC Santa Cruz

    September 13, 2018
    Tim Stephens
    stephens@ucsc.edu

    Theorist Yuan Ping is developing computational methods to guide the design of new materials for quantum computing and other quantum information technologies.

    Materials scientist Yuan Ping (center) with graduate student Tyler Smart (left) and postdoctoral fellow Feng Wu (right) at the UCSC supercomputer center. (Photo by C. Lagattuta)

    See https://sciencesprings.wordpress.com/2018/09/10/from-uc-santa-cruz-nsf-funds-powerful-new-supercomputer-for-uc-santa-cruz-researchers/

    Researchers are racing to develop quantum information technologies, in which information will be stored in quantum bits, or qubits. Qubits can be made from any quantum system that has two states, such as the spin states of electrons. (Image credit: National Science Foundation)

    Quantum computers may one day solve problems that are effectively beyond the capacity of conventional supercomputers. Quantum communications may enable highly secure transmission of information across vast distances, and quantum sensors may provide previously unheard-of sensitivities.

    A global race is on to develop these new quantum information technologies, in which information will be stored in quantum bits, or qubits. In conventional digital technologies, a bit is either 0 or 1, whereas a qubit can represent both states at the same time because of a strange phenomenon of quantum physics called superposition. In theory, this will enable a massive increase in computing speed and capacity for certain types of calculations.
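
    As a minimal illustration of that "both states at the same time" idea (a generic textbook example, not tied to any particular hardware), here is a short numpy sketch: a Hadamard gate puts a qubit into an equal superposition, and repeated measurements then come out 0 about half the time and 1 about half the time.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# A classical bit is definitely 0 or 1.
classical_bit = 0

# A qubit is a 2-component complex vector; |0> = (1, 0).
ket0 = np.array([1, 0], dtype=complex)

# The Hadamard gate puts |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
psi = H @ ket0

# Measuring collapses the superposition: ~50% zeros, ~50% ones over many shots.
probs = np.abs(psi) ** 2
shots = rng.choice([0, 1], size=1000, p=probs)
print("P(0), P(1):", probs.round(3), " observed frequency of 1:", shots.mean())
```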

    At UC Santa Cruz, materials scientists are working to develop novel materials that can serve as the foundation for quantum information technology, just as silicon chips paved the way for today’s digital technologies. Several different systems for creating and manipulating qubits have been proposed and implemented, but for now they remain too cumbersome for real-world applications.

    “Our focus as materials scientists is on what material we should use as the fundamental element to carry the information. Other researchers are more concerned with how to wire it up to make a device that can perform calculations, but we’re focused on the material basis of the qubit,” said Yuan Ping, assistant professor of chemistry and biochemistry at UC Santa Cruz.

    2D materials

    In particular, Ping and other UCSC researchers are focusing on defects in extremely thin materials, called two-dimensional (2D) materials. Defects or imperfections in the atomic structure of a material can function as qubits because information can be encoded in the spin states of their electrons. This phenomenon has been well studied in other types of materials, most notably the “nitrogen vacancy” or NV defect in diamond. But according to Ping, 2D materials offer significant advantages.

    “Unlike diamond, 2D materials are relatively cheap and easy to make, they are scalable, and they are easy to integrate into a solid-state device,” she said. “They are also stable at room temperature, which is important because a lot of the qubit systems implemented so far use superconductors that can only operate at very low temperatures.”

    There are a lot of different 2D materials, however, and a lot of ways to put defects into them. The possibilities are almost endless, and it’s not practical to synthesize and test them all experimentally to see which have the best properties for quantum technologies.

    That’s where theorists like Ping come in. She is developing computational methods that can be used to predict the properties of defects in 2D materials reliably and efficiently. In December 2017, her team published a paper in Physical Review Materials establishing the fundamental principles for doing calculations to accurately describe charge defects, electronic states, and spin dynamics in 2D materials. (Her coauthors on the paper include postdoctoral fellow Feng Wu, graduate student Andrew Galatas, and collaborators Dario Rocca at University of Lorraine in France and Ravishankar Sundararaman at Rensselaer Polytechnic Institute.)

    In July, Ping won a $350,000 grant from the National Science Foundation to further develop these computational methods.

    “We’re developing a reliable set of tools to predict the electronic structure, excited-state lifetime, and quantum-state coherence time of defects in 2D materials at a quantum mechanical level,” Ping said. “We do calculations from first principles, meaning we don’t need any input from experiments. Everything is predicted based on quantum mechanics.”

    Quantum weirdness

    The world of quantum mechanics is notoriously counter-intuitive and hard to grasp. Concepts such as superposition and entanglement defy common sense, yet they have been demonstrated conclusively and are fundamental to quantum information technologies. Superposition, when a particle exists in two different states simultaneously, is often compared to a spinning coin, neither heads nor tails until it stops spinning. Entanglement creates a link between the quantum states of two particles or qubits, so it is as if the outcome of one spinning coin determined the outcome of another spinning coin.
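
    The "linked spinning coins" picture of entanglement can also be sketched in a few lines of numpy (again a generic example, not a model of any specific material): sampling measurement outcomes from a two-qubit Bell state always yields matching results for the two qubits.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Bell state (|00> + |11>) / sqrt(2): the canonical example of entanglement.
bell = np.zeros(4, dtype=complex)
bell[0b00] = bell[0b11] = 1 / np.sqrt(2)

# Probabilities of the four joint outcomes 00, 01, 10, 11.
probs = np.abs(bell) ** 2
outcomes = rng.choice(4, size=10, p=probs)

for o in outcomes:
    q0, q1 = (o >> 1) & 1, o & 1
    print(f"qubit A: {q0}  qubit B: {q1}")   # the two "coins" always match
```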

    A major challenge in exploiting these phenomena for quantum information technologies is their inherent fragility. Interaction with the environment causes a superposition to fall into one state or the other. Called decoherence, this can be caused by vibrations of the atoms in the material and other subtle effects.

    “You want qubits to be well insulated from the environment to give longer coherence times,” Ping said.
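
    That fragility can be pictured with a toy dephasing model: the qubit's "quantumness" (the off-diagonal term of its density matrix) decays exponentially with a characteristic coherence time T2. The one-microsecond T2 below is an arbitrary illustrative value, not a measured property of any material discussed here; better isolation from the environment corresponds to a longer T2.

```python
import numpy as np

T2 = 1e-6                     # assumed coherence time, 1 microsecond (illustrative)
times = np.linspace(0, 5e-6, 6)

# Start in the superposition (|0> + |1>)/sqrt(2); its density matrix has
# off-diagonal ("coherence") terms equal to 0.5.
coherence_0 = 0.5

for t in times:
    coherence = coherence_0 * np.exp(-t / T2)   # simple dephasing model
    print(f"t = {t*1e6:4.1f} us   remaining coherence = {coherence:.3f}")
# Longer T2 (better isolation from the environment) keeps the qubit useful longer.
```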

    One 2D material that has shown promise for quantum technologies is ultrathin hexagonal boron nitride. Ping used her computational methods to investigate various defects in this material and identified a promising candidate for scalable quantum applications. This defect (a nitrogen vacancy adjacent to carbon substitution of boron) is predicted to have stable spin states well insulated from the environment and bright optical transitions, making it a good source for single photon emission and a good candidate for qubits.

    “Quantum emitters, which can emit one photon at a time, are important for optically-based quantum information processing, information security, and ultrasensitive sensing,” Ping said.

    She works closely with experimentalists, helping to interpret their results and guide their efforts to create novel materials with desirable properties for quantum technologies. Her group is part of a large collaborative effort, the Quantum Information Science and Engineering Network (QISE-NET), funded by the National Science Foundation. Tyler Smart, a graduate student in Ping’s group, is funded by QISE-NET and is working on a project at Argonne National Laboratory.

    “He will be traveling to Chicago to present his research every few months,” Ping said. “There are about 20 universities as well as national laboratories and industry partners in the network, meeting regularly and sharing ideas, which is important because it’s a fast-moving field.”

    One of the National Science Foundation’s 10 Big Ideas for Future NSF Investments is “The Quantum Leap: Leading the Next Quantum Revolution.”

    The Department of Energy is also investing in this area, as are companies such as Google and Intel, hoping to exploit quantum mechanics to develop next-generation technologies for computing, sensing, and communications.

    “They are all investing in it because it will take a lot of effort to develop this field, and the potential is so great,” Ping said.

    See the full article here .



    Please help promote STEM in your local schools.

    Stem Education Coalition

    UCO Lick Shane Telescope
    UCO Lick Shane Telescope interior
    Shane Telescope at UCO Lick Observatory, UCSC

    Lick Automated Planet Finder telescope, Mount Hamilton, CA, USA


    UC Santa Cruz campus
    The University of California, Santa Cruz, opened in 1965 and grew, one college at a time, to its current (2008-09) enrollment of more than 16,000 students. Undergraduates pursue more than 60 majors supervised by divisional deans of humanities, physical & biological sciences, social sciences, and arts. Graduate students work toward graduate certificates, master’s degrees, or doctoral degrees in more than 30 academic fields under the supervision of the divisional and graduate deans. The dean of the Jack Baskin School of Engineering oversees the campus’s undergraduate and graduate engineering programs.

    UCSC is the home base for the Lick Observatory.

    Lick Observatory’s Great Lick 91-centimeter (36-inch) telescope housed in the South (large) Dome of main building

    Search for extraterrestrial intelligence expands at Lick Observatory
    New instrument scans the sky for pulses of infrared light
    March 23, 2015
    By Hilary Lebow
    The NIROSETI instrument saw first light on the Nickel 1-meter Telescope at Lick Observatory on March 15, 2015. (Photo by Laurie Hatch)

    Astronomers are expanding the search for extraterrestrial intelligence into a new realm with detectors tuned to infrared light at UC’s Lick Observatory. A new instrument, called NIROSETI, will soon scour the sky for messages from other worlds.

    “Infrared light would be an excellent means of interstellar communication,” said Shelley Wright, an assistant professor of physics at UC San Diego who led the development of the new instrument while at the University of Toronto’s Dunlap Institute for Astronomy & Astrophysics.

    Wright worked on an earlier SETI project at Lick Observatory as a UC Santa Cruz undergraduate, when she built an optical instrument designed by UC Berkeley researchers. The infrared project takes advantage of new technology not available for that first optical search.

    Infrared light would be a good way for extraterrestrials to get our attention here on Earth, since pulses from a powerful infrared laser could outshine a star, if only for a billionth of a second. Interstellar gas and dust are almost transparent at near-infrared wavelengths, so these signals can be seen from great distances. It also takes less energy to send information using infrared signals than with visible light.

    Frank Drake, professor emeritus of astronomy and astrophysics at UC Santa Cruz and director emeritus of the SETI Institute, said there are several additional advantages to a search in the infrared realm.

    “The signals are so strong that we only need a small telescope to receive them. Smaller telescopes can offer more observational time, and that is good because we need to search many stars for a chance of success,” said Drake.

    The only downside is that extraterrestrials would need to be transmitting their signals in our direction, Drake said, though he sees this as a positive side to that limitation. “If we get a signal from someone who’s aiming for us, it could mean there’s altruism in the universe. I like that idea. If they want to be friendly, that’s who we will find.”

    Scientists have searched the skies for radio signals for more than 50 years and expanded their search into the optical realm more than a decade ago. The idea of searching in the infrared is not a new one, but instruments capable of capturing pulses of infrared light only recently became available.

    “We had to wait,” Wright said. “I spent eight years waiting and watching as new technology emerged.”

    Now that technology has caught up, the search will extend to stars thousands of light years away, rather than just hundreds. NIROSETI, or Near-Infrared Optical Search for Extraterrestrial Intelligence, could also uncover new information about the physical universe.

    “This is the first time Earthlings have looked at the universe at infrared wavelengths with nanosecond time scales,” said Dan Werthimer, UC Berkeley SETI Project Director. “The instrument could discover new astrophysical phenomena, or perhaps answer the question of whether we are alone.”

    NIROSETI will also gather more information than previous optical detectors by recording levels of light over time so that patterns can be analyzed for potential signs of other civilizations.

    “Searching for intelligent life in the universe is both thrilling and somewhat unorthodox,” said Claire Max, director of UC Observatories and professor of astronomy and astrophysics at UC Santa Cruz. “Lick Observatory has already been the site of several previous SETI searches, so this is a very exciting addition to the current research taking place.”

    NIROSETI will be fully operational by early summer and will scan the skies several times a week on the Nickel 1-meter telescope at Lick Observatory, located on Mt. Hamilton east of San Jose.

    The NIROSETI team also includes Geoffrey Marcy and Andrew Siemion from UC Berkeley; Patrick Dorval, a Dunlap undergraduate, and Elliot Meyer, a Dunlap graduate student; and Richard Treffers of Starman Systems. Funding for the project comes from the generous support of Bill and Susan Bloomfield.

     
  • richardmitnick 12:27 pm on September 6, 2018 Permalink | Reply
    Tags: , For The First Time, , Quantum gates, , Qubits, , Scientists Have Teleported And Measured a Quantum Gate in Real Time, Teleporting a special quantum operation between two locations,   

    From Yale University via Science Alert: “For The First Time, Scientists Have Teleported And Measured a Quantum Gate in Real Time” 

    Yale University bloc

    From Yale University

    via

    Science Alert

    6 SEP 2018
    MIKE MCRAE

    (agsandrew/istock)

    Welcome to the future.

    Around 20 years ago, two computer scientists proposed a technique for teleporting a special quantum operation between two locations with the goal of making quantum computers more reliable.

    Now a team of researchers from Yale University have successfully turned their idea into reality, demonstrating a practical approach to making this incredibly delicate form of technology scalable.

    These physicists have developed a practical method for teleporting a quantum operation – or gate – across a distance and measuring its effect. While this feat has been done before, it’s never been done in real time. This paves the way for developing a process that can make quantum computing modular, and therefore more reliable.

    Unlike regular computers, which perform their calculations with states of reality called bits (on or off, 1 or 0), quantum computers operate with qubits – a strange state of reality we can’t wrap our heads around, but which taps into some incredibly useful mathematics.

    In classical computers, bits interact with operations called logic gates. Like the world’s smallest gladiatorial arena, two bits enter, one bit leaves. Gates come in different forms, selecting a winner depending on their particular rule.

    These bits, channelled through gates, form the basis of just about any calculation you can think of, as far as classical computers are concerned.
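
    For concreteness, here is the "two bits enter, one bit leaves" picture written out as truth tables for a couple of classical gates (a generic illustration, not anything specific to this experiment):

```python
# Classical logic gates: two bits enter, one bit leaves.
def AND(a: int, b: int) -> int:
    return a & b

def NAND(a: int, b: int) -> int:
    return 1 - (a & b)

for a in (0, 1):
    for b in (0, 1):
        print(f"{a} {b}  ->  AND: {AND(a, b)}   NAND: {NAND(a, b)}")
# Every classical computation can be built from enough NAND gates.
```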

    But qubits offer an alternative unit to base algorithms on. More than just a 1 or a 0, they also provide a special blend of the two states. It’s like a coin held in a hand before you see whether it’s heads or tails.

    In conjunction with a quantum version of a logic gate, qubits can do what classical bits can’t. There’s just one problem – that indeterminate state of 1 and 0 turns into a definite 1 or 0 when it becomes part of a measured system.

    Worse still, it doesn’t take much to collapse the qubit’s maybe into a definitely, which means a quantum computer can become an expensive paperweight if those delicate components aren’t adequately hidden from their noisy environment.

    Right now, quantum computer engineers are super excited by devices that can wrangle just over 70 qubits – which is impressive, but quantum computers will really only earn their keep as they stock up on hundreds, if not thousands of qubits all hovering on the brink of reality at the same time.

    To make this kind of scaling a reality, scientists need additional tricks. One option would be to make the technology as modular as possible, networking smaller quantum systems into a bigger one in order to offset errors.

    But for that to work, quantum gates – those special operations that deal with the heavy lifting of qubits – also need to be shared.

    Teleporting information, such as a quantum gate, sounds pretty sci-fi. But we’re obviously not talking about Star Trek transport systems here.

    In reality it simply refers to the fact that objects can have their history entangled so that when one is measured, the other immediately collapses into a related state, no matter how far away it is.

    This has technically been demonstrated experimentally already [Physical Review Letters], but, until now, the process hasn’t been reliably performed and measured in real time, which is crucial if it’s to become part of a practical computer.

    “Our work is the first time that this protocol has been demonstrated where the classical communication occurs in real-time, allowing us to implement a ‘deterministic’ operation that performs the desired operation every time,” says lead author Kevin Chou.

    The researchers used qubits in sapphire chips inside a cutting-edge setup to teleport a type of quantum operation called a controlled-NOT gate. Importantly, by applying error-correctable coding, the process was 79 percent reliable.
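
    The gate in question, the controlled-NOT, is easy to write down in its ideal form. The sketch below (plain numpy, not the Yale teleportation protocol itself) shows it flipping the target qubit only when the control qubit is 1, and producing an entangled Bell state when the control starts in superposition; the 79 percent figure refers to the teleported, error-corrected implementation, not to this ideal matrix.

```python
import numpy as np

# Ideal controlled-NOT: flips the target qubit only when the control is 1.
# Basis ordering |control, target>: 00, 01, 10, 11.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

labels = ["00", "01", "10", "11"]
for i, label in enumerate(labels):
    state = np.zeros(4, dtype=complex)
    state[i] = 1
    out = CNOT @ state
    print(f"|{label}> -> |{labels[int(np.argmax(np.abs(out)))]}>")

# With the control in superposition, CNOT produces an entangled Bell state.
plus_zero = np.kron(np.array([1, 1]) / np.sqrt(2), np.array([1, 0])).astype(complex)
print("CNOT on (|0>+|1>)/sqrt(2) (x) |0> :", np.round(CNOT @ plus_zero, 3))
```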

    “It is a milestone toward quantum information processing using error-correctable qubits,” says principal investigator Robert Schoelkopf.

    It’s a baby step on the road to making quantum modules, but this proof-of-concept shows modules could still be the way to go in growing quantum computers to the scale we need.

    This research was published in Nature.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Yale University Campus

    Yale University comprises three major academic components: Yale College (the undergraduate program), the Graduate School of Arts and Sciences, and the professional schools. In addition, Yale encompasses a wide array of centers and programs, libraries, museums, and administrative support offices. Approximately 11,250 students attend Yale.

     
  • richardmitnick 11:01 am on September 5, 2018 Permalink | Reply
    Tags: , , , , Qubits,   

    From Duke University via The News&Observer: “Look out, IBM. A Duke-led group is also a player in quantum computing” 

    Duke Bloc
    Duke Crest

    From Duke University

    via

    The News&Observer

    August 13, 2018
    Ray Gronberg

    Duke University professors Iman Marvian, Jungsang Kim and Kenneth Brown, gathered here in Kim’s lab in the Chesterfield Building in downtown Durham, are working together to develop a quantum computer that relies on “trapped ion” technology. The National Science Foundation and the federal Intelligence Advanced Research Projects Activity are helping fund the project. Les Todd LKT Photography, Inc.

    There’s a group based at Duke University that thinks it can out-do IBM in the quantum-computing game, and it just got another $15 million in funding from the U.S. government.

    Quantum computing – IBM

    The National Science Foundation grant is helping underwrite a consortium led by professors Jungsang Kim and Ken Brown that’s previously received backing from the federal Intelligence Advanced Research Projects Activity.

    Kim said the group is developing a quantum computer that has “up to a couple dozen qubits” of computational power and reckons it’s a year or so from being operational. The word qubit is the quantum-computing world’s equivalent of normal computing’s “bit” when it comes to gauging processing ability, and each additional qubit represents a doubling of that power.
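
    A quick arithmetic sketch of that doubling (simple counting, not a claim about the Duke machine's architecture): a full description of n qubits requires 2^n complex amplitudes, so "a couple dozen qubits" already corresponds to tens of millions of amplitudes.

```python
# Each added qubit doubles the size of the state description: 2**n amplitudes.
for n in (1, 2, 10, 24, 36):
    print(f"{n:2d} qubits -> {2**n:,} complex amplitudes")
# 24 qubits -> 16,777,216 amplitudes; 36 qubits -> 68,719,476,736.
```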

    “One of the goals of this [grant] is to establish the hardware so we can allow researchers to work on the software and systems optimization,” Kim said of the National Science Foundation grant the agency awarded on Aug. 6.

    Two or three dozen qubits might not sound like a lot when IBM says it has built and tested a 50-qubit machine. But the Duke-led research group is approaching the problem from an entirely different angle.

    The “trapped-ion” design it’s using could hold qubits steady in its internal memory for much longer than superconducting designs like those IBM is working on can manage, Brown said.

    Superconducting designs — which operate at extremely cold temperatures — “are a bit faster” than trapped-ion ones and are the focus of “a much larger industrial effort,” Brown said.

    That speed-versus-resilience tradeoff could matter because IBM says its machines can hold a qubit steady in memory for only up to about 90 microseconds. That means processing runs have to be short, on the order of no more than a couple of seconds total.

    “One thing that’s becoming clear in the community is, the thing we need to scale is not just the number of qubits but also the quality of operations,” said Brown, who in January traded a faculty post at Georgia Tech for a new one at Duke. “If you have a huge number of qubits but the operations are not very good, you effectively have a bad classical computer.”

    Kim added that designers working on quantum computers have to look for the same kind of breakthrough in thinking about the technology that the Wright brothers brought to the development of flight.

    Just as the Wrights and other people working in the field in the late 19th and early 20th centuries figured out that mimicking birds was a developmental dead end, the builders of quantum computers “have to start with something that’s fundamentally quantum and build the right technology to scale it,” Kim said. “You don’t build quantum computers by mimicking classical computers.”

    But for now, the government agencies that are subsidizing the field are backing different approaches and waiting to see what pans out.

    The Aug. 6 grant is the third big one Kim’s lab has secured, building on awards from IARPA in 2010 and 2016 that together brought it about $54.5 million in funding. But in both those rounds of funding, teams from IBM were also among those getting awards from the federal agency, which funds what it calls “high-risk/high-payoff” research for the intelligence community.

    The stakes are so high because quantum computing could become a breakthrough technology. It exploits the physics of subatomic particles in hopes of developing a machine that can process data that exists in multiple states at once, rather than the binary 1 or 0 of traditional computing.

    IBM and the government aren’t the only heavy hitters involved. Google has a quantum-computing project of its own that’s grown with help from IARPA funding.

    Google’s Quantum Dream Machine

    Kim and other people involved in the Duke-led group have also formed a company called IonQ that’s received investment from Google and Amazon.

    The Duke-led group also includes teams from the University of Maryland, the University of Chicago and Tufts University that are working on hardware, software and applications development, respectively, Duke officials say. Researchers from the University of New Mexico, MIT, the National Institute of Standards and Technology and the University of California-Berkeley are also involved.

    Duke doesn’t have quantum computing all to itself in the Triangle: this spring, IBM made N.C. State University part of its Q Network, a group of businesses, universities and government agencies that can use IBM’s quantum machines via the cloud.

    But the big difference between the N.C. State and Duke efforts is that State’s focus is on developing the future workforce and beginning to push software development, while at Duke the effort is more fundamentally about trying to develop the technology itself.

    Not that software is a side issue, mind.

    “If I had a quantum computer with 60 qubits, I know there are algorithms I can run on it that I can’t simulate with my regular computers,” Brown said, explaining that the technology requires new thinking there, too. “That’s a weird place to be.”

    The quantum project is important enough that Duke has backed it with faculty hires. Brown had been collaborating with Kim’s group for a while, but elected to move to Duke from Georgia Tech after Duke officials decided to conduct what Kim termed “a cluster hire” of quantum specialists.

    Brown joined Kim in the Pratt School of Engineering’s electrical and computer engineering department. A search for someone to fill an endowed chair in physics continues.

    Another professor involved, Iman Marvian, also joined the Duke faculty at the start of 2018 thanks to the university’s previously announced “quantitative initiative.” A quantum information theorist, he got a joint appointment in physics and engineering. He came to Duke from MIT after a post-doc stint at the Boston school.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Duke Campus

    Younger than most other prestigious U.S. research universities, Duke University consistently ranks among the very best. Duke’s graduate and professional schools — in business, divinity, engineering, the environment, law, medicine, nursing and public policy — are among the leaders in their fields. Duke’s home campus is situated on nearly 9,000 acres in Durham, N.C, a city of more than 200,000 people. Duke also is active internationally through the Duke-NUS Graduate Medical School in Singapore, Duke Kunshan University in China and numerous research and education programs across the globe. More than 75 percent of Duke students pursue service-learning opportunities in Durham and around the world through DukeEngage and other programs that advance the university’s mission of “knowledge in service to society.”

     
  • richardmitnick 10:30 am on July 30, 2018 Permalink | Reply
    Tags: , , Hello quantum world, , , Qubits   

    From COSMOS Magazine: “Hello quantum world” 

    Cosmos Magazine bloc

    From COSMOS Magazine

    30 July 2018
    Will Knight

    Quantum computing – IBM

    Inside a small laboratory in lush countryside about 80 kilometres north of New York City, an elaborate tangle of tubes and electronics dangles from the ceiling. This mess of equipment is a computer. Not just any computer, but one on the verge of passing what may, perhaps, go down as one of the most important milestones in the history of the field.

    Quantum computers promise to run calculations far beyond the reach of any conventional supercomputer. They might revolutionise the discovery of new materials by making it possible to simulate the behaviour of matter down to the atomic level. Or they could upend cryptography and security by cracking otherwise invincible codes. There is even hope they will supercharge artificial intelligence by crunching through data more efficiently.

    Yet only now, after decades of gradual progress, are researchers finally close to building quantum computers powerful enough to do things that conventional computers cannot. It’s a landmark somewhat theatrically dubbed ‘quantum supremacy’. Google has been leading the charge toward this milestone, while Intel and Microsoft also have significant quantum efforts. And then there are well-funded startups including Rigetti Computing, IonQ and Quantum Circuits.

    No other contender can match IBM’s pedigree in this area, though. Starting 50 years ago, the company produced advances in materials science that laid the foundations for the computer revolution. Which is why, last October, I found myself at IBM’s Thomas J. Watson Research Center to try to answer these questions: What, if anything, will a quantum computer be good for? And can a practical, reliable one even be built?

    Credit: Graham Carlow

    Why we think we need a quantum computer

    The research center, located in Yorktown Heights, looks a bit like a flying saucer as imagined in 1961. It was designed by the neo-futurist architect Eero Saarinen and built during IBM’s heyday as a maker of large mainframe business machines. IBM was the world’s largest computer company, and within a decade of the research centre’s construction it had become the world’s fifth-largest company of any kind, just behind Ford and General Electric.

    While the hallways of the building look out onto the countryside, the design is such that none of the offices inside have any windows. It was in one of these cloistered rooms that I met Charles Bennett. Now in his 70s, he has large white sideburns, wears black socks with sandals and even sports a pocket protector with pens in it.

    Charles Bennett was one of the pioneers who realised quantum computers could solve some problems exponentially faster than conventional computers. Credit: Bartek Sadowski

    Surrounded by old computer monitors, chemistry models and, curiously, a small disco ball, he recalled the birth of quantum computing as if it were yesterday.

    When Bennett joined IBM in 1972, quantum physics was already half a century old, but computing still relied on classical physics and the mathematical theory of information that Claude Shannon had developed at Bell Labs in the late 1940s. It was Shannon who defined the quantity of information in terms of the number of ‘bits’ (a term he popularised but did not coin) required to store it. Those bits, the 0s and 1s of binary code, are the basis of all conventional computing.

    A year after arriving at Yorktown Heights, Bennett helped lay the foundation for a quantum information theory that would challenge all that. It relies on exploiting the peculiar behaviour of objects at the atomic scale. At that size, a particle can exist ‘superposed’ in many states (e.g., many different positions) at once. Two particles can also exhibit ‘entanglement’, so that changing the state of one may instantaneously affect the other.

    Bennett and others realised that some kinds of computations that are exponentially time consuming, or even impossible, could be efficiently performed with the help of quantum phenomena. A quantum computer would store information in quantum bits, or qubits. Qubits can exist in superpositions of 1 and 0, and entanglement and a trick called interference can be used to find the solution to a computation over an exponentially large number of states. It’s annoyingly hard to compare quantum and classical computers, but roughly speaking, a quantum computer with just a few hundred qubits would be able to perform more calculations simultaneously than there are atoms in the known universe.
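
    A back-of-the-envelope check of that comparison, using the commonly quoted order-of-magnitude estimate of roughly 10^80 atoms in the observable universe (an assumed figure, not one from the article):

```python
# Rough comparison: basis states of an n-qubit register vs. ~1e80 atoms
# (a commonly quoted, order-of-magnitude estimate for the observable universe).
import math

atoms_in_universe = 10**80   # assumed order-of-magnitude figure

for n in (100, 200, 300):
    states = 2**n
    print(f"{n} qubits: 2^{n} ~ 10^{math.log10(states):.0f} basis states "
          f"({'more' if states > atoms_in_universe else 'fewer'} than ~10^80 atoms)")
```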

    In the summer of 1981, IBM and MIT organised a landmark event called the First Conference on the Physics of Computation. It took place at Endicott House, a French-style mansion not far from the MIT campus.

    In a photo that Bennett took during the conference, several of the most influential figures from the history of computing and quantum physics can be seen on the lawn, including Konrad Zuse, who developed the first programmable computer, and Richard Feynman, an important contributor to quantum theory. Feynman gave the conference’s keynote speech, in which he raised the idea of computing using quantum effects. “The biggest boost quantum information theory got was from Feynman,” Bennett told me. “He said, ‘Nature is quantum, goddamn it! So if we want to simulate it, we need a quantum computer.’”

    IBM’s quantum computer – one of the most promising in existence – is located just down the hall from Bennett’s office. The machine is designed to create and manipulate the essential element in a quantum computer: the qubits that store information.

    The gap between the dream and the reality

    The IBM machine exploits quantum phenomena that occur in superconducting materials. For instance, sometimes current will flow clockwise and counterclockwise at the same time. IBM’s computer uses superconducting circuits in which two distinct electromagnetic energy states make up a qubit.

    The superconducting approach has key advantages. The hardware can be made using well-established manufacturing methods, and a conventional computer can be used to control the system. The qubits in a superconducting circuit are also easier to manipulate and less delicate than individual photons or ions.

    Inside IBM’s quantum lab, engineers are working on a version of the computer with 50 qubits. You can run a simulation of a simple quantum computer on a normal computer, but at around 50 qubits it becomes nearly impossible.
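
    A rough memory estimate shows why roughly 50 qubits is where brute-force simulation breaks down, assuming the simulator stores one 16-byte complex amplitude per basis state (a standard but simplified accounting):

```python
# Memory needed to hold the full state vector of an n-qubit simulation,
# assuming 16 bytes per complex amplitude (complex128).
BYTES_PER_AMPLITUDE = 16

for n in (30, 40, 50):
    gib = (2**n * BYTES_PER_AMPLITUDE) / 2**30
    print(f"{n} qubits: {gib:,.0f} GiB")
# 30 qubits ~ 16 GiB (a laptop), 40 qubits ~ 16 TiB, 50 qubits ~ 16 PiB.
```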

    That means IBM is theoretically approaching the point where a quantum computer can solve problems a classical computer cannot: in other words, quantum supremacy.

    But as IBM’s researchers will tell you, quantum supremacy is an elusive concept. You would need all 50 qubits to work perfectly, when in reality quantum computers are beset by errors that need to be corrected. It is also devilishly difficult to maintain qubits for any length of time; they tend to ‘decohere’, or lose their delicate quantum nature, much as a smoke ring breaks up at the slightest air current. And the more qubits, the harder both challenges become.

    The cutting-edge science of quantum computing requires nanoscale precision mixed with the tinkering spirit of home electronics. Researcher Jerry Chow is shown here fitting a circuit board in the IBM quantum research lab. Credit: Jon Simon

    “If you had 50 or 100 qubits and they really worked well enough, and were fully error-corrected – you could do unfathomable calculations that can’t be replicated on any classical machine, now or ever,” says Robert Schoelkopf, a Yale professor and founder of a company called Quantum Circuits. “The flip side to quantum computing is that there are exponential ways for it to go wrong.”

    Another reason for caution is that it isn’t obvious how useful even a perfectly functioning quantum computer would be. It doesn’t simply speed up any task you throw at it; in fact, for many calculations, it would actually be slower than classical machines. Only a handful of algorithms have so far been devised where a quantum computer would clearly have an edge. And even for those, that edge might be short-lived. The most famous quantum algorithm, developed by Peter Shor at MIT, is for finding the prime factors of an integer. Many common cryptographic schemes rely on the fact that this is hard for a conventional computer to do. But cryptography could adapt, creating new kinds of codes that don’t rely on factorisation.

    This is why, even as they near the 50-qubit milestone, IBM’s own researchers are keen to dispel the hype around it. At a table in the hallway that looks out onto the lush lawn outside, I encountered Jay Gambetta, a tall, easygoing Australian who researches quantum algorithms and potential applications for IBM’s hardware. “We’re at this unique stage,” he said, choosing his words with care. “We have this device that is more complicated than you can simulate on a classical computer, but it’s not yet controllable to the precision that you could do the algorithms you know how to do.”

    What gives the IBMers hope is that even an imperfect quantum computer might still be a useful one.

    Gambetta and other researchers have zeroed in on an application that Feynman envisioned back in 1981. Chemical reactions and the properties of materials are determined by the interactions between atoms and molecules. Those interactions are governed by quantum phenomena. A quantum computer can – at least in theory – model those in a way a conventional one cannot.

    Last year, Gambetta and colleagues at IBM used a seven-qubit machine to simulate the precise structure of beryllium hydride. At just three atoms, it is the most complex molecule ever modelled with a quantum system. Ultimately, researchers might use quantum computers to design more efficient solar cells, more effective drugs or catalysts that turn sunlight into clean fuels.

    Those goals are a long way off. But, Gambetta says, it may be possible to get valuable results from an error-prone quantum machine paired with a classical computer.

    Credit: Cosmos Magazine

    Physicist’s dream to engineer’s nightmare

    “The thing driving the hype is the realisation that quantum computing is actually real,” says Isaac Chuang, a lean, soft-spoken MIT professor. “It is no longer a physicist’s dream – it is an engineer’s nightmare.”

    Chuang led the development of some of the earliest quantum computers, working at IBM in Almaden, California, during the late 1990s and early 2000s. Though he is no longer working on them, he thinks we are at the beginning of something very big – that quantum computing will eventually even play a role in artificial intelligence.

    But he also suspects that the revolution will not really begin until a new generation of students and hackers get to play with practical machines. Quantum computers require not just different programming languages but a fundamentally different way of thinking about what programming is. As Gambetta puts it: “We don’t really know what the equivalent of ‘Hello, world’ is on a quantum computer.”

    We are beginning to find out. In 2016 IBM connected a small quantum computer to the cloud. Using a programming tool kit called QISKit, you can run simple programs on it; thousands of people, from academic researchers to schoolkids, have built QISKit programs that run basic quantum algorithms. Now Google and other companies are also putting their nascent quantum computers online. You can’t do much with them, but at least they give people outside the leading labs a taste of what may be coming.
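
    For a taste of what such a program looks like, here is a minimal Bell-state circuit written with Qiskit's circuit-building API. Exact execution and back-end submission calls vary across Qiskit versions, so this sketch inspects the circuit's ideal output with the built-in Statevector utility rather than submitting it to hardware.

```python
# A minimal "hello, quantum world" in Qiskit: build a Bell-state circuit.
# (Back-end submission APIs differ across Qiskit versions; here we just
# inspect the ideal state with qiskit.quantum_info.Statevector.)
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(2)
qc.h(0)        # put qubit 0 into superposition
qc.cx(0, 1)    # entangle qubit 1 with qubit 0

state = Statevector.from_instruction(qc)
print(state.probabilities_dict())   # expect {'00': 0.5, '11': 0.5}
```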

    The startup community is also getting excited. A short while after seeing IBM’s quantum computer, I went to the University of Toronto’s business school to sit in on a pitch competition for quantum startups. Teams of entrepreneurs nervously got up and presented their ideas to a group of professors and investors. One company hoped to use quantum computers to model the financial markets. Another planned to have them design new proteins. Yet another wanted to build more advanced AI systems. What went unacknowledged in the room was that each team was proposing a business built on a technology so revolutionary that it barely exists. Few seemed daunted by that fact.

    This enthusiasm could sour if the first quantum computers are slow to find a practical use. The best guess from those who truly know the difficulties – people like Bennett and Chuang – is that the first useful machines are still several years away. And that’s assuming the problem of managing and manipulating a large collection of qubits won’t ultimately prove intractable.

    Still, the experts hold out hope. When I asked him what the world might be like when my two-year-old son grows up, Chuang, who learned to use computers by playing with microchips, responded with a grin. “Maybe your kid will have a kit for building a quantum computer,” he said.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

     