Tagged: Quantum Computing

  • richardmitnick 8:37 am on May 14, 2019
    Tags: "Quantum world-first: researchers can now tell how accurate two-qubit calculations in silicon really are", ...you can only tap into the tremendous power of quantum computing if the qubit operations are near perfect with only tiny errors allowed” Dr Yang says., , “Fidelity is a critical parameter which determines how viable a qubit technology is..., , Quantum Computing, The researchers say the study is further proof that silicon as a technology platform is ideal for scaling up to the large numbers of qubits needed for universal quantum computing., Two-qubit gate,   

    From University of New South Wales: “Quantum world-first: researchers can now tell how accurate two-qubit calculations in silicon really are” 


    From University of New South Wales – Sydney

    14 May 2019

    Isabelle Dubach
    Media and Content Manager
    +61 2 9385 7307, 0432 307 244
    i.dubach@unsw.edu.au

    Scientia Professor Andrew Dzurak
    Electrical Engineering & Telecommunications
    +61 432 405 434
    a.dzurak@unsw.edu.au

    After being the first team to create a two-qubit gate in silicon in 2015, UNSW Sydney engineers are breaking new ground again: they have measured the accuracy of silicon two-qubit operations for the first time – and their results confirm the promise of silicon for quantum computing.

    Wister Huang, a final-year PhD student in Electrical Engineering; Professor Andrew Dzurak; and Dr Henry Yang, a senior research fellow.

    For the first time ever, researchers have measured the fidelity – that is, the accuracy – of two-qubit logic operations in silicon, with highly promising results that will enable scaling up to a full-scale quantum processor.

    The research, carried out by Professor Andrew Dzurak’s team in UNSW Engineering, was published today in the world-renowned journal Nature.

    The experiments were performed by Wister Huang, a final-year PhD student in Electrical Engineering, and Dr Henry Yang, a senior research fellow at UNSW.

    “All quantum computations can be made up of one-qubit operations and two-qubit operations – they’re the central building blocks of quantum computing,” says Professor Dzurak.

    “Once you’ve got those, you can perform any computation you want – but the accuracy of both operations needs to be very high.”

    In 2015 Dzurak’s team was the first to build a quantum logic gate in silicon, making calculations between two qubits of information possible – and thereby clearing a crucial hurdle to making silicon quantum computers a reality.

    A number of groups around the world have since demonstrated two-qubit gates in silicon – but until this landmark paper today, the true accuracy of such a two-qubit gate was unknown.

    Accuracy crucial for quantum success

    “Fidelity is a critical parameter which determines how viable a qubit technology is – you can only tap into the tremendous power of quantum computing if the qubit operations are near perfect, with only tiny errors allowed,” Dr Yang says.

    In this study, the team implemented and performed Clifford-based fidelity benchmarking – a technique that can assess qubit accuracy across all technology platforms – demonstrating an average two-qubit gate fidelity of 98%.

    “We achieved such a high fidelity by characterising and mitigating primary error sources, thus improving gate fidelities to the point where randomised benchmarking sequences of significant length – more than 50 gate operations – could be performed on our two-qubit device,” says Mr Huang, the lead author on the paper.
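    For readers who want to see how a number like 98% is extracted from such sequences: in randomized benchmarking, the survival probability decays exponentially with the number of random Clifford gates, and the average gate fidelity follows from the fitted decay rate. Below is a minimal sketch in Python; the sequence lengths and survival probabilities are invented for illustration and are not the paper’s data.

        import numpy as np
        from scipy.optimize import curve_fit

        # Randomized-benchmarking model: survival probability decays
        # exponentially with the number m of random Clifford gates.
        def rb_decay(m, A, B, p):
            return A * p**m + B

        # Illustrative data chosen to mimic a ~98% two-qubit gate fidelity.
        lengths = np.array([1, 5, 10, 20, 35, 50])
        survival = np.array([0.93, 0.85, 0.77, 0.63, 0.49, 0.40])

        (A, B, p), _ = curve_fit(rb_decay, lengths, survival, p0=[0.7, 0.25, 0.97])

        # For two qubits the Hilbert-space dimension is d = 4; the average
        # gate fidelity follows from the depolarizing parameter p.
        d = 4
        avg_fidelity = p + (1 - p) / d
        print(f"depolarizing parameter p = {p:.4f}")
        print(f"average gate fidelity   = {avg_fidelity:.4f}")   # ~0.98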

    Quantum computers will have a wide range of important applications in the future thanks to their ability to perform far more complex calculations at much greater speeds, including solving problems that are simply beyond the ability of today’s computers.

    “But for most of those important applications, millions of qubits will be needed, and you’re going to have to correct quantum errors, even when they’re small,” Professor Dzurak says.

    “For error correction to be possible, the qubits themselves have to be very accurate in the first place – so it’s crucial to assess their fidelity.”

    “The more accurate your qubits, the fewer you need – and therefore, the sooner we can ramp up the engineering and manufacturing to realise a full-scale quantum computer.”


    Silicon confirmed as the way to go

    The researchers say the study is further proof that silicon as a technology platform is ideal for scaling up to the large numbers of qubits needed for universal quantum computing. Given that silicon has been at the heart of the global computer industry for almost 60 years, its properties are already well understood and existing silicon chip production facilities can readily adapt to the technology.

    “If our fidelity value had been too low, it would have meant serious problems for the future of silicon quantum computing. The fact that it is near 99% puts it in the ballpark we need, and there are excellent prospects for further improvement. Our results immediately show, as we predicted, that silicon is a viable platform for full-scale quantum computing,” Professor Dzurak says.

    “We think that we’ll achieve significantly higher fidelities in the near future, opening the path to full-scale, fault-tolerant quantum computation. We’re now on the verge of a two-qubit accuracy that’s high enough for quantum error correction.”

    In another paper – recently published in Nature Electronics and featured on its cover – on which Dr Yang is lead author, the same team also achieved the record for the world’s most accurate 1-qubit gate in a silicon quantum dot, with a remarkable fidelity of 99.96%.


    “Besides the natural advantages of silicon qubits, one key reason we’ve been able to achieve such impressive results is because of the fantastic team we have here at UNSW. My student Wister and Dr Yang are both incredibly talented. They personally conceived the complex protocols required for this benchmarking experiment,” says Professor Dzurak.

    Other authors on today’s Nature paper are UNSW researchers Tuomo Tanttu, Ross Leon, Fay Hudson, Andrea Morello and Arne Laucht, as well as former Dzurak team members Kok Wai Chan, Bas Hensen, Michael Fogarty and Jason Hwang, while Professor Kohei Itoh from Japan’s Keio University provided isotopically enriched silicon wafers for the project.

    UNSW Dean of Engineering, Professor Mark Hoffman, says the breakthrough is yet another piece of proof that this world-leading team are in the process of taking quantum computing across the threshold from the theoretical to the real.

    “Quantum computing is this century’s space race – and Sydney is leading the charge,” Professor Hoffman says.

    “This milestone is another step towards realising a large-scale quantum computer – and it reinforces the fact that silicon is an extremely attractive approach that we believe will get UNSW there first.”

    Spin qubits based on silicon CMOS technology – the specific method developed by Professor Dzurak’s group – hold great promise for quantum computing because of their long coherence times and the potential to leverage existing integrated circuit technology to manufacture the large numbers of qubits needed for practical applications.

    Professor Dzurak leads a project to advance silicon CMOS qubit technology with Silicon Quantum Computing, Australia’s first quantum computing company.

    “Our latest result brings us closer to commercialising this technology – my group is all about building a quantum chip that can be used for real-world applications,” Professor Dzurak says.

    The silicon qubit device that was used in this study was fabricated entirely at UNSW using a novel silicon-CMOS process line, high-resolution patterning systems, and supporting nanofabrication equipment that are made available by ANFF-NSW.

    A full-scale quantum processor would have major applications in the finance, security and healthcare sectors: it would help identify and develop new medicines by greatly accelerating the computer-aided design of pharmaceutical compounds, it could contribute to developing new, lighter and stronger materials spanning consumer electronics to aircraft, and it would enable faster information searching through large databases.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition


    Welcome to UNSW Australia (The University of New South Wales), one of Australia’s leading research and teaching universities. At UNSW, we take pride in the broad range and high quality of our teaching programs. Our teaching gains strength and currency from our research activities, strong industry links and our international nature; UNSW has a strong regional and global engagement.

    In developing new ideas and promoting lasting knowledge we are creating an academic environment where outstanding students and scholars from around the world can be inspired to excel in their programs of study and research. Partnerships with both local and global communities allow UNSW to share knowledge, debate and research outcomes. UNSW’s public events include concert performances, open days and public forums on issues such as the environment, healthcare and global politics. We encourage you to explore the UNSW website so you can find out more about what we do.

     
  • richardmitnick 10:16 am on May 11, 2019
    Tags: Beginning this summer IBM will host developer boot camps and hackathons for hands-on training of the open source IBM Q Experience cloud services platform, IBM Q Network, Quantum Computing, This new effort will build on Virginia Tech’s ongoing efforts with the IBM Q Hub at Oak Ridge in Tennessee, Virginia Tech has significant expertise in designing control schemes for quantum computing hardware and in developing algorithms for simulating molecular chemistry problems on quantum processors

    From Virginia Tech: “Virginia Tech joining IBM Q Network to accelerate research, educational opportunities in quantum computing” 

    From Virginia Tech

    Left to right, Nick Mayhall, Sophia Economou, and Ed Barnes, all researchers and faculty members in the Virginia Tech College of Science, discuss quantum computing algorithms.

    Virginia Tech has joined the expanding IBM Q Network as a member of the IBM Q Hub at Oak Ridge National Laboratory to accelerate joint research in quantum computing, as well as develop curricula to help prepare students for new careers in science, engineering, and business influenced by the next era of computing.

    The IBM Q Network is the world’s first community of Fortune 500 companies, startups, academic institutions, and research labs working to advance quantum computing. Virginia Tech researchers and students will have direct access to IBM Q’s most-advanced quantum computing systems for research projects that advance quantum science, exploring early uses of quantum computing, and for teaching.


    Faculty and students from Virginia Tech’s College of Science and College of Engineering will collaborate with IBM scientists on research to advance the foundational science, technology, and software required to enable more capable quantum systems.

    “Virginia Tech has significant expertise in designing control schemes for quantum computing hardware and in developing algorithms for simulating molecular chemistry problems on quantum processors,” said Sophia Economou, an associate professor from the Department of Physics in the College of Science. “The collaboration with IBM will allow us to advance our efforts in these directions by directly testing our ideas on IBM hardware. Interactions with IBM researchers and student internships will further accelerate Virginia Tech’s expansion into the burgeoning field of quantum computing.”

    Beginning this summer, IBM will host developer boot camps and hackathons for hands-on training of the open source IBM Q Experience cloud services platform, and Qiskit quantum software platform on the campuses of participating universities.

    For now, quantum computers are “noisy,” error-prone prototypes, much like classical computers were in the 1940s. But the exponential properties of their fundamental processing element, the quantum bit (or qubit), hold promise to solve problems in chemistry, artificial intelligence, and other areas that are intractable for today’s computers. Consider: 300 perfectly stable qubits could represent more values than there are atoms in the observable universe – well beyond the capacity of what a classical computer could ever compute. Today’s research is paving the way toward improving these early devices to develop practical quantum applications, according to IBM.
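    That comparison is easy to check. A quick sketch in Python (the 10^80 figure for atoms in the observable universe is the commonly quoted order-of-magnitude estimate):

        # 300 qubits span a state space of 2**300 basis states.
        n_states = 2**300
        atoms_in_universe = 10**80   # commonly quoted order-of-magnitude estimate

        print(f"2^300 is roughly 10^{len(str(n_states)) - 1}")   # about 10^90
        print(n_states > atoms_in_universe)                      # True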

    Robert McGwier, chief scientist at Virginia Tech’s Hume Center and a research professor in the College of Engineering, said this new effort will build on Virginia Tech’s ongoing efforts with the IBM Q Hub at Oak Ridge in Tennessee on the construction and analysis of noisy-qubit quantum algorithms, and on forthcoming efforts with the Office of the Director of Naval Intelligence’s augmented intelligence with machines program. Also in the College of Engineering, the Department of Computer Science’s Wu Feng is teaching an undergraduate course in quantum computing, and faculty are preparing research projects in the field for funding proposals with the National Science Foundation.

    IBM also has been partnering on efforts in computational chemistry with Daniel Crawford, a professor in the Department of Chemistry in the College of Science and director of the National Science Foundation-funded Molecular Sciences Software Institute. “The growing collaboration between researchers at Virginia Tech and IBM focuses on the development of novel algorithms that bind the well-established field of quantum chemistry and the emerging domain of quantum computing in order to attack larger and more complex molecular problems than those currently in our grasp,” Crawford said.

    Additional Virginia Tech faculty partnering on the IBM quantum computing project include Ed Barnes, an assistant professor of physics, and Nick Mayhall, an assistant professor of chemistry.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Virginia Polytechnic Institute and State University, commonly known as Virginia Tech and by the initialisms VT and VPI,[8] is an American public, land-grant, research university with a main campus in Blacksburg, Virginia, educational facilities in six regions statewide, and a study-abroad site in Lugano, Switzerland. Through its Corps of Cadets ROTC program, Virginia Tech is also designated as one of six senior military colleges in the United States.

    As Virginia’s third-largest university, Virginia Tech offers 225 undergraduate and graduate degree programs to some 30,600 students and manages a research portfolio of $513 million, the largest of any university in Virginia.[9] The university fulfills its land-grant mission of transforming knowledge to practice through technological leadership and by fueling economic growth and job creation locally, regionally, and across Virginia.

    Virginia Polytechnic Institute and State University officially opened on Oct. 1, 1872, as Virginia’s white land-grant institution (Hampton Normal and Industrial Institute, founded in 1868, was designated the commonwealth’s first black land-grant school. This continued until 1920, when the funds were shifted by the legislature to the Virginia Normal and Industrial Institute in Petersburg, which in 1946 was renamed to Virginia State University by the legislature). During its existence, the university has operated under four different legal names. The founding name was Virginia Agricultural and Mechanical College. Following a reorganization of the college in the 1890s, the state legislature changed the name to Virginia Agricultural and Mechanical College and Polytechnic Institute, effective March 5, 1896. Faced with such an unwieldy name, people began calling it Virginia Polytechnic Institute, or simply VPI. On June 23, 1944, the legislature followed suit, officially changing the name to Virginia Polytechnic Institute. At the same time, the commonwealth moved most women’s programs from VPI to nearby Radford College, and that school’s official name became Radford College, Women’s Division of Virginia Polytechnic Institute. The commonwealth dissolved the affiliation between the two colleges in 1964. The state legislature sanctioned university status for VPI and bestowed upon it the present legal name, Virginia Polytechnic Institute and State University, effective June 26, 1970. While some older alumni and other friends of the university continue to call it VPI, its most popular—and its official—nickname today is Virginia Tech.

     
  • richardmitnick 8:53 am on April 26, 2019
    Tags: A promising building block for supercomputers of the future: a two-dimensional platform that could lead to quantum bits that are both stable and able to be mass produced, Center for Quantum Devices (QDev) a Center of Excellence sponsored by the Danish National Research Foundation at the Niels Bohr Institute University of Copenhagen, Our prototype is a significant first step towards using this type of system to make quantum bits that are protected from disturbances, Quantum Computing, The Copenhagen team was able to demonstrate Majorana zero modes in the one-dimensional semiconductor gap between two superconductors forming a spatially extended Josephson junction

    From University of Copenhagen: “University of Copenhagen researchers realize new platform for future quantum computer” 

    From University of Copenhagen


    Niels Bohr Institute

    26 April 2019

    Antonio Fornieri
    Postdoc
    antonio.fornieri@nbi.ku.dk
    http://www.nbi.ku.dk/
    Phone: +45 35 33 48 89

    Michael Skov Jensen
    Press officer
    Faculty of Science
    msj@science.ku.dk
    +45 93 56 58 97

    Quantum physics

    University of Copenhagen physicists, as part of the University and Microsoft collaboration focused on topological quantum computing, may have unloosed a Gordian knot in quantum computer development. In partnership with researchers from University of Chicago, ETH Zürich, Weizmann Institute of Science, and fellow Microsoft Quantum Lab collaborators at Purdue University, they have designed and realized a promising building block for supercomputers of the future: a two-dimensional platform that could lead to quantum bits that are both stable and able to be mass produced.

    Led by two young physicists, Antonio Fornieri and Alex Whiticar, under the supervision of Professor Charles Marcus, Director of Microsoft Quantum Lab Copenhagen, researchers at the Center for Quantum Devices (QDev), a Center of Excellence sponsored by the Danish National Research Foundation at the Niels Bohr Institute, University of Copenhagen, designed, built, and characterized a key component that could cut a Gordian knot in the development of viable quantum computers – specifically, the building block for a quantum bit, or qubit, that is both protected from disturbances and able to be mass produced. Their results have just been published in the scientific journal Nature.

    Together with a back-to-back publication from a team at Harvard University on a related system, the Copenhagen team was able to demonstrate Majorana zero modes in the one-dimensional semiconductor gap between two superconductors forming a spatially extended Josephson junction, an effect predicted theoretically by teams at Harvard-Weizmann, and Niels Bohr Institute-Lund University.

    The wide Josephson junction is part of a complex chip of hybrid superconductor and semiconductor materials grown by Michael Manfra’s Microsoft Quantum Lab group at Purdue. It is anticipated to be an important component in the development of topological quantum information. The discovery unlocks a range of possibilities for researchers. “A major advantage of the discovered component is that it can be mass produced. We can design a large and complex system of quantum bits on a contemporary laptop and have it manufactured using a common production technique for ordinary computer circuits,” says co-lead author Postdoctoral Fellow Antonio Fornieri.

    From handcraft to mass production

    Majorana quantum states are the foundation for the quantum computer being developed by a combination of University students, PhDs and postdocs, and Microsoft employees pursuing collaborative research at Microsoft Quantum Lab Copenhagen at the Niels Bohr Institute. The Majorana quantum state has an important property that protects it from external disturbances, in principle enabling longer periods of quantum processing compared with other types of quantum bits. One of the greatest challenges for researchers worldwide is to develop qubits that are stable enough to allow a computer to perform complicated calculations before the quantum state disappears and the information stored in the bits is lost.

    In the past decade, Majorana particles have been created in the lab using semiconductor nanowires connected to superconductors and placed in a large magnetic field. Nanowires are not well suited for scale-up to a full-blown quantum technology because of the laborious assembly required to manipulate microscopic threads with a needle, move them individually from one substrate to another, and then secure them into a network. Given that a quantum computer will likely require thousands or more bits, this would be an exceptionally difficult process using hand-placed nanowires. Furthermore, nanowires require high magnetic fields to function. The new Josephson junction-based platform replaces the nanowires with a two-dimensional device which requires lower magnetic fields to form the Majorana states.

    Promising structure

    “Our prototype is a significant first step towards using this type of system to make quantum bits that are protected from disturbances. Right now, we still need some fine-tuning – we can improve the design and materials. But it is a potentially perfect structure,” asserts Fornieri.

    The two-dimensional system has another important quality according to research group member Alex Whiticar, a doctoral student: “Our component has an additional control parameter, in the form of the superconducting phase difference across the Josephson junction that makes it possible to simultaneously control the presence of Majorana-states throughout a system of quantum bits. This has never been seen before. Furthermore, this system needs a much lower magnetic field to achieve Majorana states. This will significantly ease the manufacturing of larger quantities of quantum bits.”

    Charles Marcus adds, “Moving from one dimensional nanowires into two-dimensional hybrids opened the field. This device is the first of many advances that can be anticipated once topological structures can be patterned and repeated with precision on the 10nm scale. Stay tuned.”

    Collaborative public-private partnering

    This breakthrough underscores the productiveness of the deepened collaboration established in September 2017 between the University of Copenhagen and Microsoft. The collaboration has only intensified and expanded with the establishment of Microsoft Quantum Materials Lab Copenhagen just one year later, drawing talent from the University of Copenhagen, the Technical University of Denmark, and across Europe.

    As summarized by Michael Manfra, “The close collaboration between the Microsoft Quantum Laboratories has resulted in a promising new platform for the study and control of Majorana zero modes. It is exciting that this approach is potentially scalable.”

    Schematic representation of the device.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition


    The Niels Bohr Institute (Danish: Niels Bohr Institutet) is a research institute of the University of Copenhagen. The research of the institute spans astronomy, geophysics, nanotechnology, particle physics, quantum mechanics and biophysics.

    The Institute was founded in 1921, as the Institute for Theoretical Physics of the University of Copenhagen, by the Danish theoretical physicist Niels Bohr, who had been on the staff of the University of Copenhagen since 1914, and who had been lobbying for its creation since his appointment as professor in 1916. On the 80th anniversary of Niels Bohr’s birth – October 7, 1965 – the Institute officially became The Niels Bohr Institute.[1] Much of its original funding came from the charitable foundation of the Carlsberg brewery, and later from the Rockefeller Foundation.[2]

    During the 1920s, and 1930s, the Institute was the center of the developing disciplines of atomic physics and quantum physics. Physicists from across Europe (and sometimes further abroad) often visited the Institute to confer with Bohr on new theories and discoveries. The Copenhagen interpretation of quantum mechanics is named after work done at the Institute during this time.

    On January 1, 1993 the institute was fused with the Astronomic Observatory, the Ørsted Laboratory and the Geophysical Institute. The new resulting institute retained the name Niels Bohr Institute.

    The University of Copenhagen (UCPH) (Danish: Københavns Universitet) is the oldest university and research institution in Denmark. Founded in 1479 as a studium generale, it is the second oldest institution for higher education in Scandinavia after Uppsala University (1477). The university has 23,473 undergraduate students, 17,398 postgraduate students, 2,968 doctoral students and over 9,000 employees. The university has four campuses located in and around Copenhagen, with the headquarters located in central Copenhagen. Most courses are taught in Danish; however, many courses are also offered in English and a few in German. The university has several thousands of foreign students, about half of whom come from Nordic countries.

    The university is a member of the International Alliance of Research Universities (IARU), along with University of Cambridge, Yale University, The Australian National University, and UC Berkeley, amongst others. The 2016 Academic Ranking of World Universities ranks the University of Copenhagen as the best university in Scandinavia and 30th in the world, the 2016-2017 Times Higher Education World University Rankings as 120th in the world, and the 2016-2017 QS World University Rankings as 68th in the world. The university has had 9 alumni become Nobel laureates and has produced one Turing Award recipient.

     
  • richardmitnick 10:34 am on April 11, 2019
    Tags: Quantum Computing

    From Science Node: “The end of an era” 

    From Science Node

    10 April 2019
    Alisa Alering

    For the last fifty years, computer technology has been getting faster and cheaper. Now that extraordinary progress is coming to an end. What happens next?

    John Shalf, department head for Computer Science at Berkeley Lab, has a few ideas. He’s going to share them in his keynote at ISC High Performance 2019 in Frankfurt, Germany (June 16-20), but he gave Science Node a sneak preview.

    Moore’s Law is based on Gordon Moore’s 1965 prediction that the number of transistors on a microchip doubles every two years, while the cost is halved. His prediction proved true for several decades. What’s different now?

    Double trouble. From 1965 to 2004, the number of transistors on a microchip doubled every two years while cost decreased. Now that you can’t get more transistors on a chip, high-performance computing is in need of a new direction. Data courtesy Dataquest/Intel.

    The end of Dennard scaling happened in 2004, when we couldn’t crank up the clock frequencies anymore on chips, so we moved to exponentially increasing parallelism in order to continue performance scaling. It was not an ideal solution, but it enabled us to continue some semblance of performance scaling. Now we’ve gotten to the point where we can’t squeeze any more transistors onto the chip.

    If you can’t cram any more transistors on the chip, then we can’t continue to scale the number of cores as a means to scale performance. And we’ll get no power improvement: with the end of Moore’s Law, in order to get ten times more performance we would need ten times more power in the future. Capital equipment cost won’t improve either. Meaning that if I spend $100 million and can get a 100 petaflop machine today, then I spend $100 million ten years from now, I’ll get the same machine.

    That sounds fairly dire. Is there anything we can do?

    There are three dimensions we can pursue: one is new architectures and packaging, another is CMOS transistor replacements using new materials, and the third is new models of computation that are not necessarily digital.

    Let’s break it down. Tell me about architectures.

    John Shalf, of Lawrence Berkeley National Laboratory, wants to consider all options—from new materials and specialization to industry partnerships—when it comes to imagining the future of high-performance computing. Courtesy John Shalf.

    We need to change course and learn from our colleagues in other industries. Our friends in the phone business and in mega data centers are already pointing out the solution. Architectural specialization is one of the biggest sources of improvement in the iPhone. The A8 chip, introduced in 2014, had 29 different discrete accelerators. We’re now at the A11, and it has nearly 40 different discrete hardware accelerators. Future generation chips are slowly squeezing out the CPUs and having special function accelerators for different parts of their workload.

    And for the mega-data center, Google is making its own custom chip. They weren’t seeing the kind of performance improvements they needed from Intel or Nvidia, so they’re building their own custom chips tailored to improve the performance for their workloads. So are Facebook and Amazon. The only people absent from this are HPC.

    With Moore’s Law tapering off, the only way to get a leg up in performance is to go back to customization. The embedded systems and ARM ecosystems are an example where, even though the chips are custom, the components—the little circuit designs on those chips—are reusable across many different disciplines. The new commodity is going to be these little IP blocks we arrange on the chip. We may need to add some IP blocks that are useful for scientific applications, but there’s a lot of IP reuse in that embedded ecosystem and we need to learn how to tap into that.

    How do new materials fit in?

    We’ve been using silicon for the past several decades because it is inexpensive and ubiquitous, and has many years of development effort behind it. We have developed an entire scalable manufacturing infrastructure around it, so it continues to be the most cost-effective route for mass-manufacture of digital devices. It’s pretty amazing, to use one material system for that long. But now we need to look at some new transistor that can continue to scale performance beyond what we’re able to wring out of silicon. Silicon is, frankly, not that great of a material when it comes to electron mobility.

    _________________________________________________________
    The Materials Project

    The current pace of innovation is extremely slow because the primary means available for characterizing new materials is to read a lot of papers. One solution might be Kristin Persson’s Materials Project, originally invented to advance the exploration of battery materials.

    By scaling materials computations over supercomputing clusters, research can be targeted to the most promising compounds, helping to remove guesswork from materials design. The hope is that reapplying this technology to also discover better electronic materials will speed the pace of discovery for new electronic devices.
    In 2016, an eight-laboratory consortium was formed to push this idea at the DOE “Big Ideas Summit”, where grass-roots ideas from the labs are presented to the highest levels of DOE leadership. Read the whitepaper and elevator pitch here.

    After the ‘Beyond Moore’s Law’ project was invited back for the 2017 Big Ideas Summit, the DOE created a Microelectronics BRN (Basic Research Needs) Workshop. The initial report from that meeting has been released, and the DOE’s FY20 budget includes a line item for Microelectronics research.
    _________________________________________________________

    The problem is, we know historically that once you demonstrate a new device concept in the laboratory, it takes about ten years to commercialize it; prior experience has shown a fairly consistent timeline from lab to fab. Although there are some promising directions, nobody has demonstrated something that’s clearly superior to silicon transistors in the lab yet. With no CMOS replacement imminent, that means we’re already ten years too late! We need to develop tools and processes to accelerate the pace of discovery of more efficient microelectronic devices to replace CMOS, and the materials that make them possible.

    So, until we find a new material for the perfect chip, can we solve the problem with new models of computing? What about quantum computing?

    New models would include quantum and neuromorphic computing. These models expand computing into new directions, but they’re best at computing problems that are done poorly using digital computing.

    I like to use the example of ‘quantum Excel.’ Say I balance my checkbook by creating a spreadsheet with formulas, and it tells me how balanced my checkbook is. If I were to use a quantum computer for that—and it would be many, many, many years in the future where we’d have enough qubits to do it, but let’s just imagine—quantum Excel would be the superposition of all possible balanced checkbooks.

    And a neuromorphic computer would say, ‘Yes, it looks correct,’ and then you’d ask it again and it would say, ‘It looks correct within an 80% confidence interval.’ Neuromorphic is great at pattern recognition, but it wouldn’t be as good for running partial differential equations and computing exact arithmetic.

    We really need to go back to the basics. We need to go back to ‘What are the application requirements?’

    Clearly there are a lot of challenges. What’s exciting about this time right now?

    The Summit supercomputer at Oak Ridge National Laboratory operates at a top speed of 200 petaflops and is currently the world’s fastest computer. But the end of Moore’s Law means that to get 10x that performance in the future, we also would need 10x more power. Courtesy Carlos Jones/ORNL.

    Computer architecture has become very, very important again. The previous era of exponential scaling created a much narrower space for innovation because the focus was general purpose computing, the universal machine. The problems we now face open the door again for mathematicians and computer architects to collaborate and solve big problems together. And I think that’s very exciting. Those kinds of collaborations lead to really fun, creative, and innovative solutions to scientific problems of worldwide importance.

    The real issue is that our economic model for acquiring supercomputing systems will be deeply disrupted. Originally, systems were designed by mathematicians to solve important mathematical problems. However, the exponential improvement rates of Moore’s law ensured that the most general purpose machines that were designed for the broadest range of problems would have a superior development budget and, over time, would ultimately deliver more cost-effective performance than specialized solutions.

    The end of Moore’s Law spells the end of general purpose computing as we know it. Continuing with this approach dooms us to modest or even non-existent performance improvements. But the cost of customization using current processes is unaffordable.

    We must reconsider our relationship with industry to re-enable specialization targeted at our relatively small HPC market. Developing a self-sustaining business model is paramount. The embedded ecosystem (including the ARM ecosystem) provides one potential path forward, but there is also the possibility of leveraging the emerging open source hardware ecosystem and even packaging technologies such as Chiplets to create cost-effective specialization.

    We must consider all options for business models and all options for partnerships across agencies or countries to ensure an affordable and sustainable path forward for the future of scientific and technical computing.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Science Node is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, Science Node is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read Science Node via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

     
  • richardmitnick 12:50 pm on April 5, 2019
    Tags: "Putting a New Spin on Majorana Fermions", , , , Majorana fermions are particle-like excitations called quasiparticles that emerge as a result of the fractionalization (splitting) of individual electrons into two halves., , , , Quantum Computing, Spin ladders- crystals formed of atoms with a three-dimensional (3-D) structure subdivided into pairs of chains that look like ladders.   

    From Brookhaven National Lab: “Putting a New Spin on Majorana Fermions” 

    From Brookhaven National Lab

    April 1, 2019
    Ariana Tantillo
    atantillo@bnl.gov
    (631) 344-2347

    Peter Genzer
    genzer@bnl.gov
    (631) 344-3174

    Split electrons that emerge at the boundaries between different magnetic states in materials known as spin ladders could act as stable bits of information in next-generation quantum computers.

    Theoretical calculations performed by (left to right) Neil Robinson, Robert Konik, Alexei Tsvelik, and Andreas Weichselbaum of Brookhaven Lab’s Condensed Matter Physics and Materials Science Department suggest that Majorana fermions exist in the boundaries of magnetic materials with different magnetic phases. Majorana fermions are particle-like excitations that emerge when single electrons fractionalize into two halves, and their unique properties are of interest for quantum applications.

    The combination of different phases of water—solid ice, liquid water, and water vapor—would require some effort to achieve experimentally. For instance, if you wanted to place ice next to vapor, you would have to continuously chill the water to maintain the solid phase while heating it to maintain the gas phase.

    For condensed matter physicists, this ability to create different conditions in the same system is desirable because interesting phenomena and properties often emerge at the interfaces between two phases. Of current interest are the conditions under which Majorana fermions might appear near these boundaries.

    Majorana fermions are particle-like excitations called quasiparticles that emerge as a result of the fractionalization (splitting) of individual electrons into two halves. In other words, an electron becomes an entangled (linked) pair of two Majorana quasiparticles, with the link persisting regardless of the distance between them. Scientists hope to use Majorana fermions that are physically separated in a material to reliably store information in the form of qubits, the building blocks of quantum computers. The exotic properties of Majoranas—including their high insensitivity to electromagnetic fields and other environmental “noise”—make them ideal candidates for carrying information over long distances without loss.
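    In the standard operator language (textbook notation, not taken from the paper), the fractionalization can be written as a decomposition of one ordinary fermion operator c into two self-conjugate Majorana operators:

        c = \tfrac{1}{2}(\gamma_1 + i\gamma_2), \qquad
        c^\dagger = \tfrac{1}{2}(\gamma_1 - i\gamma_2), \qquad
        \gamma_i^\dagger = \gamma_i, \qquad \{\gamma_i, \gamma_j\} = 2\delta_{ij}

    Because each γ is its own antiparticle, the two halves of an electron can in principle sit far apart in a material, which is what makes the stored information robust against local noise.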

    However, to date, Majorana fermions have only been realized in materials at extreme conditions, including at frigid temperatures close to absolute zero (−459 degrees Fahrenheit) and under high magnetic fields. And though they are “topologically” protected from local atomic impurities, disorder, and defects that are present in all materials (i.e., their spatial properties remain the same even if the material is bent, twisted, stretched, or otherwise distorted), they do not survive under strong perturbations. In addition, the range of temperatures over which they can operate is very narrow. For these reasons, Majorana fermions are not yet ready for practical technological application.

    Now, a team of physicists led by the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory and including collaborators from China, Germany, and the Netherlands has proposed a novel theoretical method for producing more robust Majorana fermions. According to their calculations, as described in a paper published on Jan. 15 in Physical Review Letters, these Majoranas emerge at higher temperatures (by many orders of magnitude) and are largely unaffected by disorder and noise. Even though they are not topologically protected, they can persist if the perturbations change slowly from one point to another in space.

    “Our numerical and analytical calculations provide evidence that Majorana fermions exist in the boundaries of magnetic materials with different magnetic phases, or directions of electron spins, positioned next to one another,” said co-author Alexei Tsvelik, senior scientist and leader of the Condensed Matter Theory Group in Brookhaven Lab’s Condensed Matter Physics and Materials Science (CMPMS) Department. “We also determined the number of Majorana fermions you should expect to get if you combine certain magnetic phases.”

    For their theoretical study, the scientists focused on magnetic materials called spin ladders, which are crystals formed of atoms with a three-dimensional (3-D) structure subdivided into pairs of chains that look like ladders. Though the scientists have been studying the properties of spin ladder systems for many years and expected that they would produce Majorana fermions, they did not know how many. To perform their calculations, they applied the mathematical framework of quantum field theory for describing the fundamental physics of elementary particles, and a numerical method (density-matrix renormalization group) for simulating quantum systems whose electrons behave in a strongly correlated way.
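    For orientation, a generic Heisenberg spin-ladder Hamiltonian has the form below (a standard textbook form; the couplings and anisotropies in the paper may differ):

        H = J_\parallel \sum_{i}\sum_{a=1,2} \mathbf{S}_{i,a}\cdot\mathbf{S}_{i+1,a}
          + J_\perp \sum_{i} \mathbf{S}_{i,1}\cdot\mathbf{S}_{i,2}

    where a = 1, 2 labels the two legs of the ladder, J∥ couples neighboring spins along each leg, and J⊥ couples the two spins on each rung.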

    “We were surprised to learn that for certain configurations of magnetic phases we can generate more than one Majorana fermion at each boundary,” said co-author and CMPMS Department Chair Robert Konik.

    For Majorana fermions to be practically useful in quantum computing, they need to be generated in large numbers. Computing experts believe that the minimum threshold at which quantum computers will be able to solve problems that classical computers cannot is 100 qubits. The Majorana fermions also have to be moveable in such a way that they can become entangled.

    The team plans to follow up their theoretical study with experiments using engineered systems such as quantum dots (nanosized semiconducting particles) or trapped (confined) ions. Compared to the properties of real materials, those of engineered ones can be more easily tuned and manipulated to introduce the different phase boundaries where Majorana fermions may emerge.

    “What the next generation of quantum computers will be made of is unclear right now,” said Konik. “We’re trying to find better alternatives to the low-temperature superconductors of the current generation, similar to how silicon replaced germanium in transistors. We’re in such early stages that we need to explore every possibility available.”

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition


    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

     
  • richardmitnick 9:48 am on March 30, 2019
    Tags: Qubits, Quantum Computing

    From Pennsylvania State University: “Extremely accurate measurements of atom states for quantum computing” 


    From Pennsylvania State University

    25 March 2019

    David Weiss
    dsweiss@phys.psu.edu
    (814) 863-3076

    Sam Sholtis
    samsholtis@psu.edu
    814-865-1390

    New method allows extremely accurate measurement of the quantum state of atomic qubits—the basic unit of information in quantum computers. Atoms are initially sorted to fill two 5×5 planes (dashed yellow grid marks their initial locations). After the first images are taken, microwaves are used to put the atoms into equal superpositions of two spin states. A shift to the left or right in the final images corresponds to detection in one spin state or the other. Associated square patterns denote atom locations (cyan: initial position, orange and blue: shifted positions). Credit: Weiss Laboratory, Penn State

    A new method allows the quantum state of atomic “qubits”—the basic unit of information in quantum computers—to be measured with twenty times less error than was previously possible, without losing any atoms. Accurately measuring qubit states, which are analogous to the one or zero states of bits in traditional computing, is a vital step in the development of quantum computers. A paper describing the method by researchers at Penn State appears March 25, 2019 in the journal Nature Physics.

    “We are working to develop a quantum computer that uses a three-dimensional array of laser-cooled and trapped cesium atoms as qubits,” said David Weiss, professor of physics at Penn State and the leader of the research team. “Because of how quantum mechanics works, the atomic qubits can exist in a ‘superposition’ of two states, which means they can be, in a sense, in both states simultaneously. To read out the result of a quantum computation, it is necessary to perform a measurement on each atom. Each measurement finds each atom in only one of its two possible states. The relative probability of the two results depends on the superposition state before the measurement.”

    To measure qubit states, the team first uses lasers to cool and trap about 160 atoms in a three-dimensional lattice with X, Y, and Z axes. Initially, the lasers trap all of the atoms identically, regardless of their quantum state. The researchers then rotate the polarization of one of the laser beams that creates the X lattice, which spatially shifts atoms in one qubit state to the left and atoms in the other qubit state to the right. If an atom starts in a superposition of the two qubit states, it ends up in a superposition of having moved to the left and having moved to the right. They then switch to an X lattice with a smaller lattice spacing, which tightly traps the atoms in their new superposition of shifted positions. When light is then scattered from each atom to observe where it is, each atom is either found shifted left or shifted right, with a probability that depends on its initial state. The measurement of each atom’s position is equivalent to a measurement of each atom’s initial qubit state.
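    Schematically, the protocol maps an internal superposition onto a spatial one, so that detecting position implements the qubit measurement (our notation, for illustration):

        (\alpha\,|0\rangle + \beta\,|1\rangle)\,|x_0\rangle \;\longrightarrow\;
        \alpha\,|0\rangle|x_{\mathrm{left}}\rangle + \beta\,|1\rangle|x_{\mathrm{right}}\rangle

    so an atom prepared in a superposition is found shifted left with probability |α|² and shifted right with probability |β|².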

    “Mapping internal states onto spatial locations goes a long way towards making this an ideal measurement,” said Weiss. “Another advantage of our approach is that the measurements do not cause the loss of any of the atoms we are measuring, which is a limiting factor in many previous methods.”

    The team determined the accuracy of their new method by loading their lattices with atoms in either one or the other qubit state and performing the measurement. They were able to measure atom states with a fidelity of 0.9994, meaning that there were only six errors in 10,000 measurements, a twenty-fold improvement on previous methods. Additionally, the error rate was not impacted by the number of qubits that the team measured in each experiment, and because there was no loss of atoms, the atoms could be reused in a quantum computer to perform the next calculation.
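    The quoted numbers are self-consistent: a fidelity of 0.9994 implies (1 − 0.9994) × 10,000 = 6 errors. A short Python sketch, ours for illustration only, also shows the statistical spread one would expect at that error rate:

        import numpy as np

        fidelity = 0.9994
        n_measurements = 10_000

        # Expected error count: (1 - 0.9994) * 10,000 = 6, matching the report.
        print(f"expected errors: {(1 - fidelity) * n_measurements:.0f}")

        # Illustrative spread: repeat the 10,000-shot experiment many times.
        rng = np.random.default_rng(0)
        errors = rng.binomial(n_measurements, 1 - fidelity, size=100_000)
        print(f"simulated errors: {errors.mean():.1f} +/- {errors.std():.1f}")  # ~6 +/- 2.4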

    “Our method is similar to the Stern-Gerlach experiment from 1922—an experiment that is integral to the history of quantum physics,” said Weiss. “In the experiment, a beam of silver atoms was passed through a magnetic field gradient with their north poles aligned perpendicular to the gradient. When Stern and Gerlach saw half the atoms deflect up and half down, it confirmed the idea of quantum superposition, one of the defining aspects of quantum mechanics. In our experiment, we also map the internal quantum states of atoms onto positions, but we can do it on an atom by atom basis. Of course, we do not need to test this aspect of quantum mechanics, we can just use it.”

    In addition to Weiss, the research team at Penn State includes Tsung-Yao Wu, Aishwarya Kumar, and Felipe Giraldo. The research was supported by the U.S. National Science Foundation.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition


    WHAT WE DO BEST

    We teach students that the real measure of success is what you do to improve the lives of others, and they learn to be hard-working leaders with a global perspective. We conduct research to improve lives. We add millions to the economy through projects in our state and beyond. We help communities by sharing our faculty expertise and research.

    Penn State lives close by no matter where you are. Our campuses are located from one side of Pennsylvania to the other. Through Penn State World Campus, students can take courses and work toward degrees online from anywhere on the globe that has Internet service.

    We support students in many ways, including advising and counseling services for school and life; diversity and inclusion services; social media sites; safety services; and emergency assistance.

    Our network of more than a half-million alumni is accessible to students who want advice, job networking and mentoring opportunities, or a sense of what to expect in the future. Through our alumni, Penn State lives all over the world.

    The best part of Penn State is our people. Our students, faculty, staff, alumni, and friends in communities near our campuses and across the globe are dedicated to education and fostering a diverse and inclusive environment.

     
  • richardmitnick 2:47 pm on March 15, 2019
    Tags: Quantum Computing, Quantum information can be stored and exchanged using electron spin states, Size matters in quantum information exchange even on the nanometer scale, The collaboration between researchers with diverse expertise was key to success, Two correlated electron pairs were coherently superposed and entangled over five quantum dots constituting a new world record within the community

    From Niels Bohr Institute: “Long-distance quantum information exchange – success at the nanoscale” 


    From Niels Bohr Institute

    At the Niels Bohr Institute, University of Copenhagen, researchers have realized the swap of electron spins between distant quantum dots. The discovery brings us a step closer to future applications of quantum information, as the tiny dots have to leave enough room on the microchip for delicate control electrodes. The distance between the dots has now become big enough for integration with traditional microelectronics and perhaps, a future quantum computer. The result is achieved via a multinational collaboration with Purdue University and the University of Sydney, Australia, now published in Nature Communications.

    Size matters in quantum information exchange even on the nanometer scale.

    Quantum information can be stored and exchanged using electron spin states. The electrons’ charge can be manipulated by gate-voltage pulses, which also control their spin. It was believed that this method can only be practical if quantum dots touch each other: if squeezed too close together, the spins react too violently; if placed too far apart, they interact far too slowly. This creates a dilemma, because if a quantum computer is ever going to see the light of day, we need both fast spin exchange and enough room around quantum dots to accommodate the pulsed gate electrodes.

    Normally, the left and right dots in the linear array of quantum dots (Illustration 1) are too far apart to exchange quantum information with each other. Frederico Martins, postdoc at UNSW, Sydney, Australia, explains: “We encode quantum information in the electrons’ spin states, which have the desirable property that they don’t interact much with the noisy environment, making them useful as robust and long-lived quantum memories. But when you want to actively process quantum information, the lack of interaction is counterproductive – because now you want the spins to interact!” What to do? You can’t have both long-lived information and information exchange – or so it seems. “We discovered that by placing a large, elongated quantum dot between the left dots and right dots, it can mediate a coherent swap of spin states, within a billionth of a second, without ever moving electrons out of their dots. In other words, we now have both fast interaction and the necessary space for the pulsed gate electrodes,” says Ferdinand Kuemmeth, associate professor at the Niels Bohr Institute.
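    The speed Kuemmeth quotes can be related to the strength of the mediated exchange coupling J by a textbook estimate (our back-of-envelope numbers, not the paper’s): for two spins coupled as

        H = J\,\mathbf{S}_1\cdot\mathbf{S}_2, \qquad t_{\mathrm{SWAP}} = \frac{h}{2J}

    a full swap of the spin states within a billionth of a second corresponds to J/h ≈ 0.5 GHz, or roughly 2 µeV.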

    Researchers at the Niels Bohr Institute cooled a chip containing a large array of spin qubits below -273 Celsius. To manipulate individual electrons within the quantum-dot array, they applied fast voltage pulses to metallic gate electrodes located on the surface of the gallium-arsenide crystal (see scanning electron micrograph). Because each electron also carries a quantum spin, this allows quantum information processing based on the array’s spin states (the arrows on the graphic illustration). During the mediated spin exchange, which only took a billionth of a second, two correlated electron pairs were coherently superposed and entangled over five quantum dots, constituting a new world record within the community.

    Collaborations are an absolute necessity, both internally and externally.

    The collaboration between researchers with diverse expertise was key to success. Internal collaborations constantly advance the reliability of nanofabrication processes and the sophistication of low-temperature techniques. In fact, at the Center for Quantum Devices, major contenders for the implementation of solid-state quantum computers are currently intensely studied, namely semiconducting spin qubits, superconducting gatemon qubits, and topological Majorana qubits.

    All of them are voltage-controlled qubits, allowing researchers to share tricks and solve technical challenges together. But Kuemmeth is quick to add that “all of this would be futile if we didn’t have access to extremely clean semiconducting crystals in the first place”. Michael Manfra, Professor of Materials Engineering, agrees: “Purdue has put a lot of work into understanding the mechanisms that lead to quiet and stable quantum dots. It is fantastic to see this work yield benefits for Copenhagen’s novel qubits”.

The theoretical framework of the discovery is provided by the University of Sydney, Australia. Stephen Bartlett, a professor of quantum physics at the University of Sydney, said: “What I find exciting about this result as a theorist is that it frees us from the constraining geometry of a qubit only relying on its nearest neighbours.” His team performed detailed calculations, providing the quantum mechanical explanation for the counterintuitive discovery.

    Overall, the demonstration of fast spin exchange constitutes not only a remarkable scientific and technical achievement, but may have profound implications for the architecture of solid-state quantum computers. The reason is the distance: “If spins between non-neighboring qubits can be controllably exchanged, this will allow the realization of networks in which the increased qubit-qubit connectivity translates into a significantly increased computational quantum volume”, predicts Kuemmeth.

See the full article here.




    Stem Education Coalition

    Niels Bohr Institute Campus

    Niels Bohr Institute (Danish: Niels Bohr Institutet) is a research institute of the University of Copenhagen. The research of the institute spans astronomy, geophysics, nanotechnology, particle physics, quantum mechanics and biophysics.

    The Institute was founded in 1921, as the Institute for Theoretical Physics of the University of Copenhagen, by the Danish theoretical physicist Niels Bohr, who had been on the staff of the University of Copenhagen since 1914, and who had been lobbying for its creation since his appointment as professor in 1916. On the 80th anniversary of Niels Bohr’s birth – October 7, 1965 – the Institute officially became The Niels Bohr Institute.[1] Much of its original funding came from the charitable foundation of the Carlsberg brewery, and later from the Rockefeller Foundation.[2]

    During the 1920s, and 1930s, the Institute was the center of the developing disciplines of atomic physics and quantum physics. Physicists from across Europe (and sometimes further abroad) often visited the Institute to confer with Bohr on new theories and discoveries. The Copenhagen interpretation of quantum mechanics is named after work done at the Institute during this time.

On January 1, 1993 the institute merged with the Astronomical Observatory, the Ørsted Laboratory and the Geophysical Institute. The resulting institute retained the name Niels Bohr Institute.

The University of Copenhagen (UCPH) (Danish: Københavns Universitet) is the oldest university and research institution in Denmark. Founded in 1479 as a studium generale, it is the second oldest institution for higher education in Scandinavia after Uppsala University (1477). The university has 23,473 undergraduate students, 17,398 postgraduate students, 2,968 doctoral students and over 9,000 employees. The university has four campuses located in and around Copenhagen, with the headquarters located in central Copenhagen. Most courses are taught in Danish; however, many courses are also offered in English and a few in German. The university has several thousand foreign students, about half of whom come from Nordic countries.

The university is a member of the International Alliance of Research Universities (IARU), along with University of Cambridge, Yale University, The Australian National University, and UC Berkeley, amongst others. The 2016 Academic Ranking of World Universities ranks the University of Copenhagen as the best university in Scandinavia and 30th in the world, the 2016-2017 Times Higher Education World University Rankings as 120th in the world, and the 2016-2017 QS World University Rankings as 68th in the world. The university has had 9 alumni become Nobel laureates and has produced one Turing Award recipient.

     
  • richardmitnick 1:10 pm on March 10, 2019 Permalink | Reply
Tags: A quantum computer would greatly speed up analysis of the collisions hopefully finding evidence of supersymmetry much sooner—or at least allowing us to ditch the theory and move on., And they’ve been waiting for decades. Google is in the race as are IBM Microsoft Intel and a clutch of startups academic groups and the Chinese government., , At the moment researchers spend weeks and months sifting through the debris from proton-proton collisions in the LHC trying to find exotic heavy sister-particles to all our known particles of matter., “This is a marathon” says David Reilly who leads Microsoft’s quantum lab at the University of Sydney Australia. “And it's only 10 minutes into the marathon.”, , , CERN-Future Circular Collider, For CERN the quantum promise could for instance help its scientists find evidence of supersymmetry or SUSY which so far has proven elusive., HL-LHC-High-Luminosity LHC, IBM has steadily been boosting the number of qubits on its quantum computers starting with a meagre 5-qubit computer then 16- and 20-qubit machines and just recently showing off its 50-qubit processor, In a bid to make sense of the impending data deluge some at CERN are turning to the emerging field of quantum computing., In a quantum computer each circuit can have one of two values—either one (on) or zero (off) in binary code; the computer turns the voltage in a circuit on or off to make it work., In theory a quantum computer would process all the states a qubit can have at once and with every qubit added to its memory size its computational power should increase exponentially., Last year physicists from the California Institute of Technology in Pasadena and the University of Southern California managed to replicate the discovery of the Higgs boson found at the LHC in 2012, None of the competing teams have come close to reaching even the first milestone., Quantum Computing, , , The quest has now lasted decades and a number of physicists are questioning if the theory behind SUSY is really valid., Traditional computers—be it an Apple Watch or the most powerful supercomputer—rely on tiny silicon transistors that work like on-off switches to encode bits of data., Venture capitalists invested some $250 million in various companies researching quantum computing in 2018 alone.,

    From WIRED: “Inside the High-Stakes Race to Make Quantum Computers Work” 

    Wired logo

    From WIRED

    03.08.19
    Katia Moskvitch

    1
    View Pictures/Getty Images

    Deep beneath the Franco-Swiss border, the Large Hadron Collider is sleeping.

    LHC

    CERN map


    CERN LHC Tunnel

    CERN LHC particles

    But it won’t be quiet for long. Over the coming years, the world’s largest particle accelerator will be supercharged, increasing the number of proton collisions per second by a factor of two and a half.

    Once the work is complete in 2026, researchers hope to unlock some of the most fundamental questions in the universe. But with the increased power will come a deluge of data the likes of which high-energy physics has never seen before. And, right now, humanity has no way of knowing what the collider might find.

To understand the scale of the problem, consider this: before it shut down in December 2018, the LHC generated about 300 gigabytes of data every second, adding up to 25 petabytes (PB) annually. For comparison, you’d have to spend 50,000 years listening to music to go through 25 PB of MP3 songs, while the human brain can store memories equivalent to just 2.5 PB of binary data. To make sense of all that information, the LHC data was pumped out to 170 computing centers in 42 countries [http://greybook.cern.ch/]. It was this global collaboration that helped discover the elusive Higgs boson, part of the Higgs field believed to give mass to elementary particles of matter.
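The MP3 comparison can be sanity-checked with back-of-the-envelope arithmetic. The snippet below assumes a typical MP3 bitrate of roughly 1 MB per minute (an assumption for illustration, not a figure from the article) and lands in the same ballpark as the quoted 50,000 years.

```python
# Back-of-the-envelope check of the 25 PB comparison
PB = 1e15                      # bytes in a petabyte
mp3_bytes_per_minute = 1e6     # ~128 kbit/s MP3, an assumed bitrate
minutes = 25 * PB / mp3_bytes_per_minute
years = minutes / (60 * 24 * 365)
print(f"~{years:,.0f} years of continuous listening")  # roughly 48,000 years
```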

    CERN CMS Higgs Event


    CERN ATLAS Higgs Event

    To process the looming data torrent, scientists at the European Organization for Nuclear Research, or CERN, will need 50 to 100 times more computing power than they have at their disposal today. A proposed Future Circular Collider, four times the size of the LHC and 10 times as powerful, would create an impossibly large quantity of data, at least twice as much as the LHC.

    CERN FCC Future Circular Collider map

In a bid to make sense of the impending data deluge, some at CERN are turning to the emerging field of quantum computing. Powered by the very laws of nature the LHC is probing, such a machine could potentially crunch the expected volume of data in no time at all. What’s more, it would speak the same language as the LHC. While numerous labs around the world are trying to harness the power of quantum computing, it is the prospect of future work at CERN that makes the research particularly exciting. There’s just one problem: right now, there are only prototypes; nobody knows whether it’s actually possible to build a reliable quantum device.

    Traditional computers—be it an Apple Watch or the most powerful supercomputer—rely on tiny silicon transistors that work like on-off switches to encode bits of data.

    ORNL IBM AC922 SUMMIT supercomputer, No.1 on the TOP500. Credit: Carlos Jones, Oak Ridge National Laboratory/U.S. Dept. of Energy

    Each circuit can have one of two values—either one (on) or zero (off) in binary code; the computer turns the voltage in a circuit on or off to make it work.

    A quantum computer is not limited to this “either/or” way of thinking. Its memory is made up of quantum bits, or qubits—tiny particles of matter like atoms or electrons. And qubits can do “both/and,” meaning that they can be in a superposition of all possible combinations of zeros and ones; they can be all of those states simultaneously.
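Numerically, a qubit state is just a vector of complex amplitudes whose squared magnitudes give measurement probabilities. A minimal sketch, illustrative and not tied to any particular hardware:

```python
import numpy as np

# An equal superposition of |0> and |1>: amplitudes 1/sqrt(2) each
psi = np.array([1, 1], dtype=complex) / np.sqrt(2)

# Born rule: outcome probabilities are |amplitude|^2
probs = np.abs(psi) ** 2
print(probs)  # [0.5 0.5]

# Each measurement collapses the superposition to a definite 0 or 1
print(np.random.choice([0, 1], size=10, p=probs))
```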

    For CERN, the quantum promise could, for instance, help its scientists find evidence of supersymmetry, or SUSY, which so far has proven elusive.

    Standard Model of Supersymmetry via DESY

At the moment, researchers spend weeks and months sifting through the debris from proton-proton collisions in the LHC, trying to find exotic, heavy sister-particles to all our known particles of matter. The quest has now lasted decades, and a number of physicists are questioning if the theory behind SUSY is really valid. A quantum computer would greatly speed up analysis of the collisions, hopefully finding evidence of supersymmetry much sooner—or at least allowing us to ditch the theory and move on.

    A quantum device might also help scientists understand the evolution of the early universe, the first few minutes after the Big Bang. Physicists are pretty confident that back then, our universe was nothing but a strange soup of subatomic particles called quarks and gluons. To understand how this quark-gluon plasma has evolved into the universe we have today, researchers simulate the conditions of the infant universe and then test their models at the LHC, with multiple collisions. Performing a simulation on a quantum computer, governed by the same laws that govern the very particles that the LHC is smashing together, could lead to a much more accurate model to test.

    Beyond pure science, banks, pharmaceutical companies, and governments are also waiting to get their hands on computing power that could be tens or even hundreds of times greater than that of any traditional computer.

    And they’ve been waiting for decades. Google is in the race, as are IBM, Microsoft, Intel and a clutch of startups, academic groups, and the Chinese government. The stakes are incredibly high. Last October, the European Union pledged to give $1 billion to over 5,000 European quantum technology researchers over the next decade, while venture capitalists invested some $250 million in various companies researching quantum computing in 2018 alone. “This is a marathon,” says David Reilly, who leads Microsoft’s quantum lab at the University of Sydney, Australia. “And it’s only 10 minutes into the marathon.”

Despite the hype surrounding quantum computing and the media frenzy triggered by every announcement of a new qubit record, none of the competing teams have come close to reaching even the first milestone, fancily called quantum supremacy—the moment when a quantum computer performs at least one specific task better than a standard computer. Any kind of task, even if it is totally artificial and pointless. There are plenty of rumors in the quantum community that Google may be close, although if true, it would give the company bragging rights at best, says Michael Biercuk, a physicist at the University of Sydney and founder of quantum startup Q-CTRL. “It would be a bit of a gimmick—an artificial goal,” says Reilly. “It’s like concocting some mathematical problem that really doesn’t have an obvious impact on the world just to say that a quantum computer can solve it.”

    That’s because the first real checkpoint in this race is much further away. Called quantum advantage, it would see a quantum computer outperform normal computers on a truly useful task. (Some researchers use the terms quantum supremacy and quantum advantage interchangeably.) And then there is the finish line, the creation of a universal quantum computer. The hope is that it would deliver a computational nirvana with the ability to perform a broad range of incredibly complex tasks. At stake is the design of new molecules for life-saving drugs, helping banks to adjust the riskiness of their investment portfolios, a way to break all current cryptography and develop new, stronger systems, and for scientists at CERN, a way to glimpse the universe as it was just moments after the Big Bang.

    Slowly but surely, work is already underway. Federico Carminati, a physicist at CERN, admits that today’s quantum computers wouldn’t give researchers anything more than classical machines, but, undeterred, he’s started tinkering with IBM’s prototype quantum device via the cloud while waiting for the technology to mature. It’s the latest baby step in the quantum marathon. The deal between CERN and IBM was struck in November last year at an industry workshop organized by the research organization.

Set up to exchange ideas and discuss potential collaborations, the event had CERN’s spacious auditorium packed to the brim with researchers from Google, IBM, Intel, D-Wave, Rigetti, and Microsoft. Google detailed its tests of Bristlecone, a 72-qubit machine. Rigetti was touting its work on a 128-qubit system. Intel showed that it was in close pursuit with 49 qubits. For IBM, physicist Ivano Tavernelli took to the stage to explain the company’s progress.

    IBM has steadily been boosting the number of qubits on its quantum computers, starting with a meagre 5-qubit computer, then 16- and 20-qubit machines, and just recently showing off its 50-qubit processor.

    IBM iconic image of Quantum computer

    Carminati listened to Tavernelli, intrigued, and during a much needed coffee break approached him for a chat. A few minutes later, CERN had added a quantum computer to its impressive technology arsenal. CERN researchers are now starting to develop entirely new algorithms and computing models, aiming to grow together with the device. “A fundamental part of this process is to build a solid relationship with the technology providers,” says Carminati. “These are our first steps in quantum computing, but even if we are coming relatively late into the game, we are bringing unique expertise in many fields. We are experts in quantum mechanics, which is at the base of quantum computing.”

    The attraction of quantum devices is obvious. Take standard computers. The prediction by former Intel CEO Gordon Moore in 1965 that the number of components in an integrated circuit would double roughly every two years has held true for more than half a century. But many believe that Moore’s law is about to hit the limits of physics. Since the 1980s, however, researchers have been pondering an alternative. The idea was popularized by Richard Feynman, an American physicist at Caltech in Pasadena. During a lecture in 1981, he lamented that computers could not really simulate what was happening at a subatomic level, with tricky particles like electrons and photons that behave like waves but also dare to exist in two states at once, a phenomenon known as quantum superposition.

    Feynman proposed to build a machine that could. “I’m not happy with all the analyses that go with just the classical theory, because nature isn’t classical, dammit,” he told the audience back in 1981. “And if you want to make a simulation of nature, you’d better make it quantum mechanical, and by golly it’s a wonderful problem, because it doesn’t look so easy.”

And so the quantum race began. Qubits can be made in different ways, but the rule is that two qubits can be both in state A, both in state B, one in state A and one in state B, or vice versa—four possible combinations in total. And you won’t know what state a qubit is in until you measure it, and the qubit is yanked out of its quantum world of probabilities into our mundane physical reality.

    In theory, a quantum computer would process all the states a qubit can have at once, and with every qubit added to its memory size, its computational power should increase exponentially. So, for three qubits, there are eight states to work with simultaneously, for four, 16; for 10, 1,024; and for 20, a whopping 1,048,576 states. You don’t need a lot of qubits to quickly surpass the memory banks of the world’s most powerful modern supercomputers—meaning that for specific tasks, a quantum computer could find a solution much faster than any regular computer ever would. Add to this another crucial concept of quantum mechanics: entanglement. It means that qubits can be linked into a single quantum system, where operating on one affects the rest of the system. This way, the computer can harness the processing power of both simultaneously, massively increasing its computational ability.
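The scaling the article describes is easy to tabulate. The sketch below counts the 2^n basis states for n qubits and, assuming 16 bytes per complex amplitude, the classical memory a full state vector would need.

```python
# 2**n basis states for n qubits; a full classical state vector needs
# one complex amplitude (assume 16 bytes) per basis state
for n in (3, 4, 10, 20, 50):
    states = 2 ** n
    mem_bytes = states * 16
    print(f"{n:>2} qubits: {states:>20,} states, {mem_bytes:>26,} bytes")
```

At 50 qubits the state vector alone would need about 18 petabytes of memory, which is why even modest qubit counts quickly outrun the world's biggest supercomputers for this kind of simulation.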

    While a number of companies and labs are competing in the quantum marathon, many are running their own races, taking different approaches. One device has even been used by a team of researchers to analyze CERN data, albeit not at CERN. Last year, physicists from the California Institute of Technology in Pasadena and the University of Southern California managed to replicate the discovery of the Higgs boson, found at the LHC in 2012, by sifting through the collider’s troves of data using a quantum computer manufactured by D-Wave, a Canadian firm based in Burnaby, British Columbia. The findings didn’t arrive any quicker than on a traditional computer, but, crucially, the research showed a quantum machine could do the work.

    One of the oldest runners in the quantum race, D-Wave announced back in 2007 that it had built a fully functioning, commercially available 16-qubit quantum computer prototype—a claim that’s controversial to this day. D-Wave focuses on a technology called quantum annealing, based on the natural tendency of real-world quantum systems to find low-energy states (a bit like a spinning top that inevitably will fall over). A D-Wave quantum computer imagines the possible solutions of a problem as a landscape of peaks and valleys; each coordinate represents a possible solution and its elevation represents its energy. Annealing allows you to set up the problem, and then let the system fall into the answer—in about 20 milliseconds. As it does so, it can tunnel through the peaks as it searches for the lowest valleys. It finds the lowest point in the vast landscape of solutions, which corresponds to the best possible outcome—although it does not attempt to fully correct for any errors, inevitable in quantum computation. D-Wave is now working on a prototype of a universal annealing quantum computer, says Alan Baratz, the company’s chief product officer.
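For intuition, the classical cousin of this idea, simulated annealing, can be sketched in a few lines; quantum annealing differs in that the system can also tunnel through barriers rather than only hopping over them thermally. The landscape and cooling schedule below are arbitrary illustrative choices.

```python
import math
import random

def energy(x):
    # A bumpy 1-D "landscape" with many local valleys (illustrative)
    return 0.1 * x ** 2 + math.sin(3 * x)

x, temperature = 8.0, 2.0
for _ in range(5000):
    candidate = x + random.uniform(-0.5, 0.5)
    dE = energy(candidate) - energy(x)
    # Always accept downhill moves; accept uphill moves with Boltzmann probability
    if dE < 0 or random.random() < math.exp(-dE / temperature):
        x = candidate
    temperature *= 0.999  # slowly "cool" the system

print(f"settled near x = {x:.2f}, energy = {energy(x):.2f}")
```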

    Apart from D-Wave’s quantum annealing, there are three other main approaches to try and bend the quantum world to our whim: integrated circuits, topological qubits and ions trapped with lasers. CERN is placing high hopes on the first method but is closely watching other efforts too.

IBM, whose computer Carminati has just started using, as well as Google and Intel, all make quantum chips with integrated circuits—quantum gates—that are superconducting, a state in which certain metals conduct electricity with zero resistance. Each quantum gate holds a pair of very fragile qubits. Any noise will disrupt them and introduce errors—and in the quantum world, noise is anything from temperature fluctuations to electromagnetic and sound waves to physical vibrations.

To isolate the chip from the outside world as much as possible and get the circuits to exhibit quantum mechanical effects, it needs to be supercooled to extremely low temperatures. At the IBM quantum lab in Zurich, the chip is housed in a white tank—a cryostat—suspended from the ceiling. The temperature inside the tank is a steady 10 millikelvin, about –273 degrees Celsius, a fraction above absolute zero and colder than outer space. But even this isn’t enough.

    Just working with the quantum chip, when scientists manipulate the qubits, causes noise. “The outside world is continually interacting with our quantum hardware, damaging the information we are trying to process,” says physicist John Preskill at the California Institute of Technology, who in 2012 coined the term quantum supremacy. It’s impossible to get rid of the noise completely, so researchers are trying to suppress it as much as possible, hence the ultracold temperatures to achieve at least some stability and allow more time for quantum computations.

“My job is to extend the lifetime of qubits, and we’ve got four of them to play with,” says Matthias Mergenthaler, an Oxford University postdoc working at IBM’s Zurich lab. That doesn’t sound like a lot, but, he explains, it’s not so much the number of qubits that counts but their quality, meaning qubits with as low a noise level as possible, to ensure they last as long as possible in superposition and allow the machine to compute. And it’s here, in the fiddly world of noise reduction, that quantum computing hits up against one of its biggest challenges. Right now, the device you’re reading this on probably performs at a level similar to that of a quantum computer with 30 noisy qubits. But if you can reduce the noise, then the quantum computer is many times more powerful.

Once the noise is reduced, researchers try to correct any remaining errors with the help of special error-correcting algorithms, run on a classical computer. The problem is, such error correction works qubit by qubit, so the more qubits there are, the more errors the system has to cope with. Say a computer makes an error once every 1,000 computational steps; it doesn’t sound like much, but after 1,000 or so operations, the program will output incorrect results. To be able to achieve meaningful computations and surpass standard computers, a quantum machine needs about 1,000 relatively low-noise qubits whose errors are corrected as far as possible. When you put them all together, these 1,000 qubits will make up what researchers call a logical qubit. None yet exist—so far, the best that prototype quantum devices have achieved is error correction for up to 10 qubits. That’s why these prototypes are called noisy intermediate-scale quantum computers (NISQ), a term also coined by Preskill in 2017.
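The arithmetic behind that example is stark: with an error probability of 1 in 1,000 per step, the chance of getting through 1,000 steps untouched is only about 37 percent, as this short check shows.

```python
p_error_per_step = 0.001   # one error per 1,000 steps, as in the example above
steps = 1000
print(f"P(no error) = {(1 - p_error_per_step) ** steps:.3f}")  # ~0.368
```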

    For Carminati, it’s clear the technology isn’t ready yet. But that isn’t really an issue. At CERN the challenge is to be ready to unlock the power of quantum computers when and if the hardware becomes available. “One exciting possibility will be to perform very, very accurate simulations of quantum systems with a quantum computer—which in itself is a quantum system,” he says. “Other groundbreaking opportunities will come from the blend of quantum computing and artificial intelligence to analyze big data, a very ambitious proposition at the moment, but central to our needs.”

    But some physicists think NISQ machines will stay just that—noisy—forever. Gil Kalai, a professor at Yale University, says that error correcting and noise suppression will never be good enough to allow any kind of useful quantum computation. And it’s not even due to technology, he says, but to the fundamentals of quantum mechanics. Interacting systems have a tendency for errors to be connected, or correlated, he says, meaning errors will affect many qubits simultaneously. Because of that, it simply won’t be possible to create error-correcting codes that keep noise levels low enough for a quantum computer with the required large number of qubits.

“My analysis shows that noisy quantum computers with a few dozen qubits deliver such primitive computational power that it will simply not be possible to use them as the building blocks we need to build quantum computers on a wider scale,” he says. Among scientists, such skepticism is hotly debated. The blogs of Kalai and fellow quantum skeptics are forums for lively discussion, as was a recent much-shared article titled “The Case Against Quantum Computing”—followed by its rebuttal, “The Case Against the Case Against Quantum Computing.”

    For now, the quantum critics are in a minority. “Provided the qubits we can already correct keep their form and size as we scale, we should be okay,” says Ray Laflamme, a physicist at the University of Waterloo in Ontario, Canada. The crucial thing to watch out for right now is not whether scientists can reach 50, 72, or 128 qubits, but whether scaling quantum computers to this size significantly increases the overall rate of error.

    3
    The Quantum Nano Centre in Canada is one of numerous big-budget research and development labs focussed on quantum computing. James Brittain/Getty Images

    Others believe that the best way to suppress noise and create logical qubits is by making qubits in a different way. At Microsoft, researchers are developing topological qubits—although its array of quantum labs around the world has yet to create a single one. If it succeeds, these qubits would be much more stable than those made with integrated circuits. Microsoft’s idea is to split a particle—for example an electron—in two, creating Majorana fermion quasi-particles. They were theorized back in 1937, and in 2012 researchers at Delft University of Technology in the Netherlands, working at Microsoft’s condensed matter physics lab, obtained the first experimental evidence of their existence.

    “You will only need one of our qubits for every 1,000 of the other qubits on the market today,” says Chetan Nayak, general manager of quantum hardware at Microsoft. In other words, every single topological qubit would be a logical one from the start. Reilly believes that researching these elusive qubits is worth the effort, despite years with little progress, because if one is created, scaling such a device to thousands of logical qubits would be much easier than with a NISQ machine. “It will be extremely important for us to try out our code and algorithms on different quantum simulators and hardware solutions,” says Carminati. “Sure, no machine is ready for prime time quantum production, but neither are we.”

    Another company Carminati is watching closely is IonQ, a US startup that spun out of the University of Maryland. It uses the third main approach to quantum computing: trapping ions. They are naturally quantum, having superposition effects right from the start and at room temperature, meaning that they don’t have to be supercooled like the integrated circuits of NISQ machines. Each ion is a singular qubit, and researchers trap them with special tiny silicon ion traps and then use lasers to run algorithms by varying the times and intensities at which each tiny laser beam hits the qubits. The beams encode data to the ions and read it out from them by getting each ion to change its electronic states.

    In December, IonQ unveiled its commercial device, capable of hosting 160 ion qubits and performing simple quantum operations on a string of 79 qubits. Still, right now, ion qubits are just as noisy as those made by Google, IBM, and Intel, and neither IonQ nor any other labs around the world experimenting with ions have achieved quantum supremacy.

    As the noise and hype surrounding quantum computers rumbles on, at CERN, the clock is ticking. The collider will wake up in just five years, ever mightier, and all that data will have to be analyzed. A non-noisy, error-corrected quantum computer will then come in quite handy.

See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 10:16 am on February 28, 2019 Permalink | Reply
    Tags: "Immunising quantum computers against errors", , , Quantum Computing, Researchers at ETH Zürich have used trapped calcium ions to demonstrate a new method for making quantum computers immune to errors   

    From ETH Zürich: “Immunising quantum computers against errors” 

    ETH Zurich bloc

    From ETH Zürich

    Researchers at ETH Zürich have used trapped calcium ions to demonstrate a new method for making quantum computers immune to errors. To do so, they created a periodic oscillatory state of an ion that circumvents the usual limits to measurement accuracy.

    1
    In the ETH experiment, calcium ions are made to oscillate in such a way that their wave functions look like the teeth of a comb. The measurement uncertainty can thus be distributed over many such teeth, which in principle enables precise error detection. (Visualisations: Christa Flühmann / Shutterstock)

    When building a quantum computer, one needs to reckon with errors – in both senses of the word. Quantum bits or “qubits”, which can take on the logical values 0 and 1 at the same time and thus carry out calculations faster, are extremely susceptible to perturbations. A possible remedy for this is quantum error correction, which means that each qubit is represented “redundantly” in several copies, such that errors can be detected and eventually corrected without disturbing the fragile quantum state of the qubit itself. Technically this is very demanding. However, several years ago an alternative suggestion came up in which information isn’t stored in several redundant qubits, but rather in the many oscillatory states of a single quantum harmonic oscillator. The research group of Jonathan Home, professor at the Institute for Quantum Electronics at ETH Zurich, has now realised such a qubit encoded in an oscillator. Their results have been published in the scientific journal Nature.

    Periodic oscillatory states

    In Home’s laboratory, PhD student Christa Flühmann and her colleagues work with electrically charged calcium atoms that are trapped by electric fields. Using appropriately chosen laser beams, these ions are cooled down to very low temperatures at which their oscillations in the electric fields (inside which the ions slosh back and forth like marbles in a bowl) are described by quantum mechanics as so-called wave functions. “At that point things get exciting”, says Flühmann, who is first author of the Nature paper. “We can now manipulate the oscillatory states of the ions in such a way that their position and momentum uncertainties are distributed among many periodically arranged states.”

    Here, “uncertainty” refers to Werner Heisenberg’s famous formula, which states that in quantum physics the product of the measurement uncertainties of the position and velocity (more precisely: the momentum) of a particle can never go below a well-defined minimum. For instance, if one wants to manipulate the particle in order to know its position very well – physicists call this “squeezing” – one automatically makes its momentum less certain.
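The well-defined minimum mentioned here is, in standard notation,

```latex
\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}
```

so reducing the position uncertainty Δx by squeezing necessarily inflates the momentum uncertainty Δp, and vice versa.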

    Reduced uncertainty

    Squeezing a quantum state in this way is, on its own, only of limited value if the aim is to make precise measurements. However, there is a clever way out: if, on top of the squeezing, one prepares an oscillatory state in which the particle’s wave function is distributed over many periodically spaced positions, the measurement uncertainty of each position and of the respective momentum can be smaller than Heisenberg would allow. Such a spatial distribution of the wave function – the particle can be in several places at once, and only a measurement decides where one actually finds it – is reminiscent of Erwin Schrödinger’s famous cat, which is simultaneously dead and alive.

This strongly reduced measurement uncertainty also means that the tiniest change in the wave function, for instance by some external disturbance, can be determined very precisely and – at least in principle – corrected. “Our realisation of those periodic or comb-like oscillatory states of the ion is an important step towards such error detection,” Flühmann explains. “Moreover, we can prepare arbitrary states of the ion and perform all possible logical operations on it. All this is necessary for building a quantum computer. In a next step we want to combine that with error detection and error correction.”

    Applications in quantum sensors

    A few experimental obstacles have to be overcome on the way, Flühmann admits. The calcium ion first needs to be coupled to another ion by electric forces, so that the oscillatory state can be read out without destroying it. Still, even in its present form the method of the ETH researchers is of great interest for applications, Flühmann explains: “Owing to their extreme sensitivity to disturbances, those oscillatory states are a great tool for measuring tiny electric fields or other physical quantities very precisely.”

See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    ETH Zurich campus

    ETH Zürich is one of the leading international universities for technology and the natural sciences. It is well known for its excellent education, ground-breaking fundamental research and for implementing its results directly into practice.

    Founded in 1855, ETH Zürich today has more than 18,500 students from over 110 countries, including 4,000 doctoral students. To researchers, it offers an inspiring working environment, to students, a comprehensive education.

    Twenty-one Nobel Laureates have studied, taught or conducted research at ETH Zürich, underlining the excellent reputation of the university.

     
  • richardmitnick 10:22 am on February 23, 2019 Permalink | Reply
    Tags: , , , Quantum Computing, , , Semiconductor quantum dots,   

    From University of Cambridge: “Physicists get thousands of semiconductor nuclei to do ‘quantum dances’ in unison” 

    U Cambridge bloc

    From University of Cambridge

    22 Feb 2019
    Communications office

    1
    Theoretical ESR spectrum buildup as a function of two-photon detuning δ and drive time τ, for a Rabi frequency of Ω = 3.3 MHz on the central transition. Credit: University of Cambridge.

    A team of Cambridge researchers have found a way to control the sea of nuclei in semiconductor quantum dots so they can operate as a quantum memory device.

Quantum dots are crystals made up of thousands of atoms, and each of these atoms interacts magnetically with the trapped electron. Left to its own devices, this interaction of the electron with the nuclear spins limits the usefulness of the electron as a quantum bit – a qubit.

    Led by Professor Mete Atatüre from Cambridge’s Cavendish Laboratory, the researchers are exploiting the laws of quantum physics and optics to investigate computing, sensing or communication applications.

    “Quantum dots offer an ideal interface, as mediated by light, to a system where the dynamics of individual interacting spins could be controlled and exploited,” said Atatüre, who is a Fellow of St John’s College. “Because the nuclei randomly ‘steal’ information from the electron they have traditionally been an annoyance, but we have shown we can harness them as a resource.”

The Cambridge team found a way to exploit the interaction between the electron and the thousands of nuclei, using lasers to ‘cool’ the nuclei to less than 1 millikelvin – a thousandth of a degree above absolute zero. They then showed they can control and manipulate the thousands of nuclei as if they form a single body in unison, like a second qubit. This proves the nuclei in the quantum dot can exchange information with the electron qubit and can be used to store quantum information as a memory device. The results are reported in the journal Science.
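As a toy illustration of what acting “as a single body” means – four spins instead of the experiment’s thousands, and purely schematic – a collective single-excitation state spreads one nuclear spin flip equally over all the nuclei:

```python
import numpy as np

# Toy "spin wave": one flip shared equally by N nuclear spins (a W state).
# The experiment involves thousands of nuclei; N = 4 is schematic.
N = 4
psi = np.zeros(2 ** N, dtype=complex)
for i in range(N):
    psi[1 << i] = 1 / np.sqrt(N)  # basis index with only spin i flipped

# Four basis states, each holding the flip with probability 1/N
support = np.nonzero(psi)[0]
print(support, np.round(np.abs(psi[support]) ** 2, 2))
```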

Quantum computing aims to harness fundamental concepts of quantum physics, such as entanglement and the superposition principle, to outperform current approaches to computing, and could revolutionise technology, business and research. Just like classical computers, quantum computers need a processor, memory, and a bus to transport the information backwards and forwards. The processor is a qubit, which can be an electron trapped in a quantum dot; the bus is a single photon, which these quantum dots generate and which is ideal for exchanging information. But the missing link for quantum dots is quantum memory.

    Atatüre said: “Instead of talking to individual nuclear spins, we worked on accessing collective spin waves by lasers. This is like a stadium where you don’t need to worry about who raises their hands in the Mexican wave going round, as long as there is one collective wave because they all dance in unison.

    “We then went on to show that these spin waves have quantum coherence. This was the missing piece of the jigsaw and we now have everything needed to build a dedicated quantum memory for every qubit.”

In quantum technologies, the photon, the qubit and the memory need to interact with each other in a controlled way. This is mostly realised by interfacing different physical systems to form a single hybrid unit, which can be inefficient. The researchers have been able to show that in quantum dots, the memory element is automatically there with every single qubit.

    Dr Dorian Gangloff, one of the first authors of the paper [Science] and a Fellow at St John’s, said the discovery will renew interest in these types of semiconductor quantum dots. Dr Gangloff explained: “This is a Holy Grail breakthrough for quantum dot research – both for quantum memory and fundamental research; we now have the tools to study dynamics of complex systems in the spirit of quantum simulation.”

    The long term opportunities of this work could be seen in the field of quantum computing. Last month, IBM launched the world’s first commercial quantum computer, and the Chief Executive of Microsoft has said quantum computing has the potential to ‘radically reshape the world’.

    Gangloff said: “The impact of the qubit could be half a century away but the power of disruptive technology is that it is hard to conceive of the problems we might open up – you can try to think of it as known unknowns but at some point you get into new territory. We don’t yet know the kind of problems it will help to solve which is very exciting.”

See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    U Cambridge Campus

    The University of Cambridge (abbreviated as Cantab in post-nominal letters) is a collegiate public research university in Cambridge, England. Founded in 1209, Cambridge is the second-oldest university in the English-speaking world and the world’s fourth-oldest surviving university. It grew out of an association of scholars who left the University of Oxford after a dispute with townsfolk. The two ancient universities share many common features and are often jointly referred to as “Oxbridge”.

    Cambridge is formed from a variety of institutions which include 31 constituent colleges and over 100 academic departments organised into six schools. The university occupies buildings throughout the town, many of which are of historical importance. The colleges are self-governing institutions founded as integral parts of the university. In the year ended 31 July 2014, the university had a total income of £1.51 billion, of which £371 million was from research grants and contracts. The central university and colleges have a combined endowment of around £4.9 billion, the largest of any university outside the United States. Cambridge is a member of many associations and forms part of the “golden triangle” of leading English universities and Cambridge University Health Partners, an academic health science centre. The university is closely linked with the development of the high-tech business cluster known as “Silicon Fen”.

     