Tagged: Quantum Computing Toggle Comment Threads | Keyboard Shortcuts

  • richardmitnick 8:53 am on July 18, 2019 Permalink | Reply
    Tags: "200 times faster than ever before: the speediest quantum operation yet", , Quantum Computing, Scanning tunnelling microscopy, The first two-qubit gate between atom qubits in silicon,   

    From University of New South Wales: “200 times faster than ever before: the speediest quantum operation yet” 


    From University of New South Wales

    18 Jul 2019
    Isabelle Dubach

    A group of physicists at UNSW Sydney have built a super-fast version of the central building block of a quantum computer. The research is the milestone result of a vision first outlined by scientists 20 years ago.

    From left to right: Professor Michelle Simmons, Dr. Sam Gorman, Postdoc Research Associate, Dr. Yu He, Postdoc Research Associate, Ludwik Kranz, PhD student, Dr. Joris Keizer, Senior Research Fellow, Daniel Keith, PhD student

    A group of scientists led by 2018 Australian of the Year Professor Michelle Simmons have achieved the first two-qubit gate between atom qubits in silicon – a major milestone on the team’s quest to build an atom-scale quantum computer. The pivotal piece of research was published today in world-renowned journal Nature.

    A two-qubit gate is the central building block of any quantum computer – and the UNSW team’s version of it is the fastest that’s ever been demonstrated in silicon, completing an operation in 0.8 nanoseconds, which is ~200 times faster than other existing spin-based two-qubit gates.

    In the Simmons group’s approach, a two-qubit gate is an operation between two electron spins – comparable to the role that classical logic gates play in conventional electronics. For the first time, the team was able to build a two-qubit gate by placing two atom qubits closer together than ever before, and then – in real time – controllably observing and measuring their spin states.
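    As a rough illustration of what such a gate does (a generic state-vector sketch, not a model of the UNSW device), the exchange interaction between two electron spins naturally generates a √SWAP gate, which entangles the spins and composes with itself into a full SWAP:

```python
import numpy as np

# Two-spin basis ordering: |00>, |01>, |10>, |11>.
# A sqrt(SWAP) gate -- natural for exchange-coupled electron spins --
# is entangling: two of them compose to a full SWAP.
s = 0.5 * (1 + 1j)   # (1+i)/2
t = 0.5 * (1 - 1j)   # (1-i)/2
SQRT_SWAP = np.array([
    [1, 0, 0, 0],
    [0, s, t, 0],
    [0, t, s, 0],
    [0, 0, 0, 1],
], dtype=complex)

ket_10 = np.array([0, 0, 1, 0], dtype=complex)  # qubit 1 up, qubit 2 down

out = SQRT_SWAP @ ket_10          # entangled superposition of |01> and |10>
swap = SQRT_SWAP @ SQRT_SWAP      # applying the gate twice gives a full SWAP
print(np.round(out, 3))
print(np.allclose(swap @ ket_10, np.array([0, 1, 0, 0])))  # True
```

    Applying the gate once leaves the two spins in a state that cannot be written as a product of single-spin states – exactly the kind of operation a two-qubit gate must supply.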

    The team’s unique approach to quantum computing requires not only the placement of individual atom qubits in silicon but all the associated circuitry to initialise, control and read-out the qubits at the nanoscale – a concept that requires such exquisite precision it was long thought to be impossible. But with this major milestone, the team is now positioned to translate their technology into scalable processors.

    Professor Simmons, Director of the Centre of Excellence for Quantum Computation and Communication Technology (CQC2T) and founder of Silicon Quantum Computing Pty Ltd., says the past decade of previous results perfectly set the team up to shift the boundaries of what’s thought to be “humanly possible”.

    “Atom qubits hold the world record for the longest coherence times of a qubit in silicon with the highest fidelities,” she says. “Using our unique fabrication technologies, we have already demonstrated the ability to read and initialise single electron spins on atom qubits in silicon with very high accuracy. We’ve also demonstrated that our atomic-scale circuitry has the lowest electrical noise of any system yet devised to connect to a semiconductor qubit.

    “Optimising every aspect of the device design with atomic precision has now allowed us to build a really fast, highly accurate two-qubit gate, which is the fundamental building block of a scalable, silicon-based quantum computer.

    “We’ve really shown that it is possible to control the world at the atomic scale – and that the benefits of the approach are transformational, including the remarkable speed at which our system operates.”

    UNSW Science Dean, Professor Emma Johnston AO, says this key paper further shows just how ground-breaking Professor Simmons’ research is.

    “This was one of Michelle’s team’s final milestones to demonstrate that they can actually make a quantum computer using atom qubits. Their next major goal is building a 10-qubit quantum integrated circuit – and we hope they reach that within 3-4 years.”

    Getting up close with qubits – engineering with a precision of just billionths of a metre

    Using a scanning tunnelling microscope to precision-place and encapsulate phosphorus atoms in silicon, the team first had to work out the optimal distance between two qubits to enable the crucial operation.

    “Our fabrication technique allows us to place the qubits exactly where we want them. This allows us to engineer our two-qubit gate to be as fast as possible,” says study lead co-author Sam Gorman from CQC2T.

    “Not only have we brought the qubits closer together since our last breakthrough, but we have learnt to control every aspect of the device design with sub-nanometer precision to maintain the high fidelities.”

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    U NSW Campus

    Welcome to UNSW Australia (The University of New South Wales), one of Australia’s leading research and teaching universities. At UNSW, we take pride in the broad range and high quality of our teaching programs. Our teaching gains strength and currency from our research activities, strong industry links and our international nature; UNSW has a strong regional and global engagement.

    In developing new ideas and promoting lasting knowledge we are creating an academic environment where outstanding students and scholars from around the world can be inspired to excel in their programs of study and research. Partnerships with both local and global communities allow UNSW to share knowledge, debate and research outcomes. UNSW’s public events include concert performances, open days and public forums on issues such as the environment, healthcare and global politics. We encourage you to explore the UNSW website so you can find out more about what we do.

     
  • richardmitnick 8:53 am on July 17, 2019 Permalink | Reply
    Tags: "Quantum control with light paves way for ultra-fast computers", , , Quantum Computing,   

    From Iowa State University via Futurity: “Quantum control with light paves way for ultra-fast computers” 

    From Iowa State University

    via

    Futurity

    July 16th, 2019
    Mike Krapfl-Iowa State


    Terahertz light can control some of the essential quantum properties of superconducting states, report researchers.

    Jigang Wang patiently explains his latest discovery in quantum control that could lead to superfast computing based on quantum mechanics: He mentions light-induced superconductivity without energy gap. He brings up forbidden supercurrent quantum beats. And he mentions terahertz-speed symmetry breaking.

    Then he backs up and clarifies all that. After all, the quantum world of matter and energy at terahertz and nanometer scales—trillions of cycles per second and billionths of meters—is still a mystery to most of us.

    “I like to study quantum control of superconductivity exceeding the gigahertz, or billions of cycles per second, bottleneck in current state-of-the-art quantum computation applications,” says Wang, a professor of physics and astronomy at Iowa State University. “We’re using terahertz light as a control knob to accelerate supercurrents.”

    A bit more explanation

    Superconductivity is the movement of electricity through certain materials without resistance. It typically occurs at very, very cold temperatures. Think -400 Fahrenheit for “high-temperature” superconductors.

    Terahertz light is light at very, very high frequencies. Think trillions of cycles per second. It’s essentially extremely strong and powerful microwave bursts firing at very short time frames.

    It all sounds esoteric and strange. But the new method could have very practical applications.

    “Light-induced supercurrents chart a path forward for electromagnetic design of emergent materials properties and collective coherent oscillations for quantum engineering applications,” Wang and his coauthors write in a paper in Nature Photonics.

    In other words, the discovery could help physicists “create crazy-fast quantum computers by nudging supercurrents,” Wang writes in a summary of the research team’s findings.

    Controlling quantum physics

    Finding ways to control, access, and manipulate the special characteristics of the quantum world and connect them to real-world problems is a major scientific push these days. The National Science Foundation has included the “Quantum Leap” in its “10 big ideas” for future research and development.

    “By exploiting interactions of these quantum systems, next-generation technologies for sensing, computing, modeling, and communicating will be more accurate and efficient,” says a summary of the science foundation’s support of quantum studies. “To reach these capabilities, researchers need understanding of quantum mechanics to observe, manipulate, and control the behavior of particles and energy at dimensions at least a million times smaller than the width of a human hair.”

    The researchers are advancing the quantum frontier by finding new macroscopic supercurrent flowing states and developing quantum controls for switching and modulating them.

    A summary of the research team’s study says experimental data they obtained from a terahertz spectroscopy instrument indicates terahertz light-wave tuning of supercurrents is a universal tool “and is key for pushing quantum functionalities to reach their ultimate limits in many cross-cutting disciplines” such as those mentioned by the science foundation.

    And so, the researchers write, “We believe that it is fair to say that the present study opens a new arena of light-wave superconducting electronics via terahertz quantum control for many years to come.”

    The Army Research Office supports Wang’s research. Additional researchers from Iowa State, the University of Wisconsin-Madison, and the University of Alabama at Birmingham contributed to the work.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Iowa State University is a public, land-grant university, where students get a great academic start in learning communities and stay active in 800-plus student organizations, undergrad research, internships and study abroad. They learn from world-class scholars who are tackling some of the world’s biggest challenges — feeding the hungry, finding alternative fuels and advancing manufacturing.

    Iowa Agricultural College and Model Farm (now Iowa State University) was officially established on March 22, 1858, by the legislature of the State of Iowa. Story County was selected as a site on June 21, 1859, and the original farm of 648 acres was purchased for a cost of $5,379. The Farm House, the first building on the Iowa State campus, was completed in 1861, and in 1862, the Iowa legislature voted to accept the provision of the Morrill Act, which was awarded to the agricultural college in 1864.

    Iowa State University Knapp-Wilson Farm House. Photo taken between 1911 and 1926.

    Iowa Agricultural College (Iowa State College of Agricultural and Mechanic Arts as of 1898), as a land grant institution, focused on the ideals that higher education should be accessible to all and that the university should teach liberal and practical subjects. These ideals are integral to the land-grant university.

    The first official class entered at Ames in 1869, and the first class (24 men and 2 women) graduated in 1872. Iowa State was and is a leader in agriculture, engineering, extension, and home economics, and it created the nation’s first state veterinary medicine school in 1879.

    In 1959, the college was officially renamed Iowa State University of Science and Technology. The focus on technology has led directly to many research patents and inventions including the first binary computer (the ABC), Maytag blue cheese, the round hay baler, and many more.

    Beginning with a small number of students and Old Main, Iowa State University now has approximately 27,000 students and over 100 buildings with world class programs in agriculture, technology, science, and art.

    Iowa State University is a very special place, full of history. But what truly makes it unique is a rare combination of campus beauty, the opportunity to be a part of the land-grant experiment, and to create a progressive and inventive spirit that we call the Cyclone experience. Appreciate what we have here, for it is indeed, one of a kind.

     
  • richardmitnick 8:10 am on July 12, 2019 Permalink | Reply
    Tags: Quantum Computing

    From University of Oxford: “Oxford to lead quantum computing hub as part of UK’s research and innovation drive” 


    From University of Oxford

    11 Jul 2019

    Oxford to lead quantum computing hub as part of UK’s research and innovation drive.

    Science Minister Chris Skidmore has today announced £94 million of funding for the UK’s Quantum Technologies Research Hubs – including a quantum computing and simulation hub led by Oxford University.

    Hubs centred at Oxford, Birmingham, Glasgow and York will revolutionise computing, sensing and timing, imaging, and communications respectively. The collaborations will involve 26 universities, 138 investigators and over 100 partners.

    Among the developments in quantum research already taking place in the UK are technologies that will allow fire crews to see through smoke and dust, computers to solve previously unsolvable computational problems, construction projects to image unmapped voids like old mine workings, and cameras that will let vehicles ‘see’ around corners.

    The National Quantum Technologies Programme, which began in 2013, has now entered its second phase of funding, part of which will involve the newly announced £94 million investment in four research hubs by the UK government, via UK Research and Innovation’s (UKRI) Engineering and Physical Sciences Research Council (EPSRC).

    Through these hubs, the UK’s world-leading quantum technologies research base will continue to drive the development of new technologies through its network of academic and business partnerships.

    Science Minister Chris Skidmore said: “Harnessing the full potential of emerging technologies is vital as we strive to meet our Industrial Strategy ambition to be the most innovative economy in the world.

    “Our world-leading universities are pioneering ways to apply quantum technologies that could have serious commercial benefits for UK businesses. That’s why I am delighted to be announcing further investment in quantum technology hubs that will bring academics and innovators together and make this once futuristic technology applicable to our everyday lives.”

    UKRI’s chief executive, Professor Sir Mark Walport, said: “The UK is leading the field in developing quantum technologies, and this new investment will help us make the next leap forward in the drive to link discoveries to innovative applications. UKRI is committed to ensuring the best research and researchers are supported in this area.”

    Oxford will lead the UKRI EPSRC Hub in Quantum Computing and Simulation, which will enable the UK to be internationally leading in quantum computing and simulation. It will drive progress towards practical quantum computers and usher in the era where they will have revolutionary impact on real-world challenges in a range of multidisciplinary themes, from the discovery of novel drugs and new materials through to quantum-enhanced machine learning, information security and even carbon reduction through optimised resource usage.

    The hub will bring together leading quantum research teams across 17 universities into a collaboration with more than 25 national and international commercial, governmental and academic entities. It will address critical research challenges and work with partners to accelerate the development of quantum computing in the UK. Hub research will focus on the hardware and software that will be needed for future quantum computers and simulators.

    Professor David Lucas of Oxford’s Department of Physics, principal investigator for the new hub, said: “The quantum computing and simulation hub will drive forward the UK’s progress in developing future quantum computing technology. It will build on the successes of the Oxford-led ‘Phase 1’ NQIT hub, which has delivered world-leading performance in quantum logic and quantum networking, as well as a number of spinout companies to take quantum research out of the lab into the commercial arena.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    U Oxford campus

    Oxford is a collegiate university, consisting of the central University and colleges. The central University is composed of academic departments and research centres, administrative departments, libraries and museums. The 38 colleges are self-governing and financially independent institutions, which are related to the central University in a federal system. There are also six permanent private halls, which were founded by different Christian denominations and which still retain their Christian character.

    The different roles of the colleges and the University have evolved over time.

     
  • richardmitnick 1:06 pm on June 30, 2019 Permalink | Reply
    Tags: Quantum Computing

    From COSMOS Magazine: “Thanks to AI, we know we can teleport qubits in the real world” 


    From COSMOS Magazine

    26 June 2019
    Gabriella Bernardi

    Deep learning shows its worth in the world of quantum computing.

    We’re coming to terms with quantum computing, (qu)bit by (qu)bit.
    MEHAU KULYK/GETTY IMAGES

    Italian researchers have shown that it is possible to teleport a quantum bit (or qubit) in what might be called a real-world situation.

    And they did it by letting artificial intelligence do much of the thinking.

    The phenomenon of qubit transfer is not new, but this work, which was led by Enrico Prati of the Institute of Photonics and Nanotechnologies in Milan, is the first to do it in a situation where the system deviates from ideal conditions.

    Moreover, it is the first time that a class of machine-learning algorithms known as deep reinforcement learning has been applied to a quantum computing problem.

    The findings are published in a paper in the journal Communications Physics.

    One of the basic problems in quantum computing is finding a fast and reliable method to move the qubit – the basic piece of quantum information – in the machine. This piece of information is coded by a single electron that has to be moved between two positions without passing through any of the space in between.

    In the so-called “adiabatic”, or thermodynamic, quantum computing approach, this can be achieved by applying a specific sequence of laser pulses to a chain of an odd number of quantum dots – identical sites in which the electron can be placed.

    It is a purely quantum process, and a solution to the problem was invented by Nikolay Vitanov of the Helsinki Institute of Physics in 1999. Because it runs counter to everyday intuition, this solution is called a “counterintuitive” sequence.

    However, the method applies only in ideal conditions, when the electron state suffers no disturbances or perturbations.
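    The flavour of the counterintuitive sequence can be sketched numerically. The toy model below (illustrative pulse shapes and units, not the parameters of the actual experiment) evolves an electron on a three-site chain and shows that switching on the 2–3 coupling before the 1–2 coupling carries the electron from site 1 to site 3:

```python
import numpy as np
from scipy.linalg import expm

# Toy model of adiabatic transfer along a 3-site chain (sites 1-2-3).
# The "counterintuitive" ordering couples sites 2-3 BEFORE sites 1-2,
# so the electron follows a dark state from site 1 to site 3 without
# appreciably populating site 2. All numbers are illustrative.
T = 1.0                        # total sequence duration (arbitrary units)
omega_max, sigma = 100.0, 0.15

def omega_23(t): return omega_max * np.exp(-((t - 0.4 * T) / sigma) ** 2 / 2)
def omega_12(t): return omega_max * np.exp(-((t - 0.6 * T) / sigma) ** 2 / 2)

psi = np.array([1, 0, 0], dtype=complex)   # electron starts on site 1
dt = 1e-3
for step in range(int(T / dt)):
    t = step * dt
    H = -np.array([[0, omega_12(t), 0],
                   [omega_12(t), 0, omega_23(t)],
                   [0, omega_23(t), 0]], dtype=complex)
    psi = expm(-1j * H * dt) @ psi          # one small unitary time step

populations = np.abs(psi) ** 2
print(np.round(populations, 3))   # nearly all population ends on site 3
```

    Run slowly enough (the adiabatic condition), the transfer is near-perfect; make the pulses faster or noisier and the efficiency drops, which is exactly the ideal-conditions limitation the article goes on to describe.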

    Thus, Prati and colleagues Riccardo Porotti and Dario Tamaschelli of the University of Milan and Marcello Restelli of the Milan Polytechnic, took a different approach.

    “We decided to test the deep learning’s artificial intelligence, which has already been much talked about for having defeated the world champion at the game Go, and for more serious applications such as the recognition of breast cancer, applying it to the field of quantum computers,” Prati says.

    Deep learning techniques are based on artificial neural networks arranged in different layers, each of which calculates the values for the next one so that the information is processed more and more completely.

    Usually, a set of known answers to the problem is used to “train” the network, but when these are not known, another technique called “reinforcement learning” can be used.

    In this approach two neural networks are used: an “actor” has the task of finding new solutions, and a “critic” must assess the quality of those solutions. Provided the researchers can supply a reliable way to judge the results, the two networks can explore the problem independently.
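    A minimal sketch of the actor-critic idea (a toy two-armed bandit, unrelated to the qubit-control task in the paper; all rewards and learning rates are made up for illustration):

```python
import numpy as np

# Minimal actor-critic: the "actor" is a softmax policy over two actions,
# the "critic" is a scalar value estimate used as a baseline.
# Arm 0 pays reward 1.0, arm 1 pays 0.0 (illustrative numbers only).
rng = np.random.default_rng(0)
logits = np.zeros(2)   # actor parameters
value = 0.0            # critic's estimate of expected reward
rewards = np.array([1.0, 0.0])

for _ in range(2000):
    pi = np.exp(logits) / np.exp(logits).sum()   # softmax policy
    a = rng.choice(2, p=pi)                      # actor proposes an action
    r = rewards[a]
    advantage = r - value                        # critic's judgement of the outcome
    grad_log_pi = np.eye(2)[a] - pi              # gradient of log pi(a) w.r.t. logits
    logits += 0.05 * advantage * grad_log_pi     # actor update
    value += 0.05 * (r - value)                  # critic update

pi = np.exp(logits) / np.exp(logits).sum()
print(np.round(pi, 3))   # actor now strongly prefers the rewarding arm
```

    The actor proposes, the critic scores, and neither needs a database of known-good answers – which is why the technique suits problems, like this qubit-control one, where no worked solutions exist in advance.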

    The researchers, then, set up this artificial intelligence method, assigning it the task of discovering alone how to control the qubit.

    “So, we let artificial intelligence find its own solution, without giving it preconceptions or examples,” Prati says. “It found another solution that is faster than the original one, and furthermore it adapts when there are disturbances.”

    In other words, he adds, artificial intelligence “has understood the phenomenon and generalised the result better than us”.

    “It is as if artificial intelligence was able to discover by itself how to teleport qubits regardless of the disturbance in place, even in cases where we do not already have any solution,” he explains.

    “With this work we have shown that the design and control of quantum computers can benefit from the use of artificial intelligence.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 11:38 am on June 7, 2019 Permalink | Reply
    Tags: EU’s €1 billion 10-year Quantum Flagship initiative to kickstart a competitive European industry in quantum technologies, Europe's plans are quickly eclipsing both the U.S. and China, Quantum Computing

    From Horizon The EU Research and Innovation Magazine: “Quantum computers will soon outperform classical machines” 


    From Horizon The EU Research and Innovation Magazine

    04 June 2019
    Joanna Roberts

    As a quantum computer can be in many states at the same time, it enables the calculation of many possibilities at once, says Dr Thomas Monz. Image credit – Flickr/IBM Research, licensed under CC BY-ND 2.0.

    European scientists have spent 100 years developing the field of quantum mechanics – a branch of physics dealing with the atomic and subatomic scale – and we need to reap the profits now that quantum computers and other technologies are becoming a reality, according to Dr Thomas Monz from the University of Innsbruck, Austria.

    He is leading a project to develop a fully scalable quantum computer. The project is part of the EU’s €1 billion, 10-year Quantum Flagship initiative to kickstart a competitive European industry in quantum technologies.

    What is a quantum computer and how does it differ from classical computers?

    ‘The big difference compared to a classical computer is that a quantum computer is following a different rule set. It’s not using zeros and ones like classical computers are – bits and bytes – but it is actually able to work with something called qubits.

    ‘Qubits are quantum bits, and have the special property that at the same time they can be zero and one. The classical computer can only be – like a light switch – either on or off, and the quantum bits can be on and off at the same time.’

    What’s the effect of that?

    ‘This superposition essentially allows it to do things that a classical computer can’t do. Because it’s in many states at the same time, in simplified terms, it allows you to probe many possibilities at the same time. (For example), if you are working in finance and you want to say which portfolio has the largest profit, you need to take many, many different cases into account and then find the best one. And this is something that a quantum computer, because it essentially allows you to calculate many things at the same time, is notably more suitable for.

    ‘(Another) prominent example is energy material design. Think about the power line you get at home. You have friction – ohmic resistance – in the cable. That’s why an electric motor or your hairdryer gets warm. Quite a bit of power is lost from the power plant before it gets to your house. Can we come up with a new material, which doesn’t have ohmic resistance, so we don’t have (energy) losses in the cable? The inherent properties of how friction in materials works – that’s partially governed by quantum mechanics. And a quantum computer (finds it) easy to follow the rules of quantum mechanics.

    ‘It allows you to do material design and check what are good candidates for materials that wouldn’t have, say, ohmic resistance, and suddenly we save a couple of percent on global energy loss from the power plants to the consumer.’
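    The superposition Monz describes can be made concrete with a small state-vector simulation (generic textbook quantum mechanics, not tied to any particular hardware): Hadamard gates put n qubits into an equal superposition of all 2^n bitstrings, after which a single operation acts on every amplitude at once.

```python
import numpy as np
from functools import reduce

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate

n = 3
H_all = reduce(np.kron, [H] * n)               # H applied to every qubit
ket0 = np.zeros(2 ** n)
ket0[0] = 1                                    # start in |000>

psi = H_all @ ket0
print(psi)   # all 8 amplitudes equal 1/sqrt(8): every bitstring represented at once

# Any subsequent gate now acts on all 2**n amplitudes simultaneously --
# the "many possibilities at the same time" Monz alludes to.
```

    With 3 qubits that is 8 simultaneous possibilities; with 50 it would be over a quadrillion, which is where the advantage over classical bits comes from.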

    Where are we now in the development of quantum computers?

    ‘There are currently several proof of concept implementations. (For example) companies are working in the finance area on portfolio optimisation. Certain companies are working towards chemistry – one prominent example is how to generate fertiliser.

    ‘(But) I think the key question is, regardless of what quantum computer we actually talk about, give me one case where it will outperform the best classical computer worldwide.

    ‘And the timescale on that, I would guess it’s in the order of another year.’

    One of the big challenges is going to be writing algorithms to program quantum computers. Where are we with that?

    ‘There are a couple of (algorithms) already, but obviously you want to have more.

    ‘Everything started with Shor’s algorithm (which can find the prime factors of the large numbers on which today’s encryption systems are based). This was an algorithm that could convince (government) agencies to look into quantum computing, because it can break some of the most prominent encryption methods that we currently use. That was the starting point about two decades ago.
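    The classical skeleton of Shor’s algorithm fits in a few lines; only the order-finding step in the middle is what a quantum computer accelerates, and the brute-force loop below merely stands in for it on a tiny example:

```python
from math import gcd

# Shor's algorithm factors N by finding the order r of some a modulo N;
# the exponential quantum speedup lives entirely in that order-finding
# step. Here we find the order by brute force, just to show the reduction.
def factor_via_order(N, a):
    g = gcd(a, N)
    if g != 1:
        return g, N // g                   # lucky guess already shares a factor
    r, x = 1, a % N
    while x != 1:                          # order finding (the quantum part in Shor)
        r, x = r + 1, (x * a) % N
    if r % 2:                              # need an even order to proceed
        return None
    y = pow(a, r // 2, N)
    p, q = gcd(y - 1, N), gcd(y + 1, N)
    return (p, N // p) if 1 < p < N else (q, N // q)

print(factor_via_order(15, 7))   # (3, 5)
```

    For cryptographically sized N the while-loop is hopeless classically, which is exactly why a quantum order-finding routine breaks RSA-style encryption.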

    ‘In the meantime, people have been working on (algorithms for) optimisation calculations. If you want to optimise a (financial) portfolio or make sure no-one gets stuck in a traffic jam – mathematically they are all very similar.’

    Will our use of quantum computers depend on the algorithms that we’re able to develop?

    ‘Sure. Think about your smartphone. Your smartphone is a computer and depending on which app you load, it can be something where you send out a message or you hear some music. A quantum computer is also fully programmable. The more algorithms we have, the more apps we can build with those algorithms, and then you want to have your quantum app store.’

    Is it fair to say that we won’t have quantum computers at home in the future because they’re something very specialised?

    ‘Partially. Say there is a quantum computer available right now, would you buy one? If you say, “I mainly write emails, watch videos and store my pictures,” you wouldn’t need a quantum computer – not for the moment.

    ‘(But) think about your classic computer. For graphics, it has a graphics card. It’s likely there will be a quantum co-processor in the long-run. It will be an add-on to your classical computer to give you some additional capabilities for special computing (or something such as secure communication).

    ‘(Or) it could be that there are quantum computers available and you can have access to them via a simple cloud interface.’

    You run a project called AQTION, which is trying to build a quantum computer. Can you tell us a bit about it?

    ‘IBM, Intel and Google are building on solid-state semiconductors (used to make computer chips) whereas we are using single atoms. In solid-state systems, you start with a lump of material and try to control it so it becomes quantum. Our approach is the opposite. We start with an atom, which is already quantum, and look at how to control it. We already have the quantum properties, so we only – only in quotation marks – have to focus on the classical (engineering) part of it.

    ‘Another aspect is that most of these (solid-state) computers are built in a lab environment. (Our quantum computer) is meant to operate in an office environment. If the air conditioning breaks, it still ought to work. If you want to ship it to a partner, you disassemble it, put the boxes into wooden crates, you ship that, you assemble it, and it ought to work. Rather than this once-in-a-lifetime prototype that only exists in one lab.’

    So the question is still open about the best way to build a quantum computer?

    ‘Yes. I would argue there are probably five or six approaches. The two most promising for the moment are trapped ions – that is what we pursue – and superconducting systems.’

    We always hear about the private companies that are driving forward quantum computing. What effect will the EU’s Quantum Flagship have?

    ‘Intel, IBM, Google all pursue a technology (to build quantum computers) that’s close to their hearts. If you have a hammer, everything looks like a nail; if you are in the semiconductor industry you want to address every new potential application with semiconductor technology. That’s what they do.

    ‘I think what the funding from the EU allows us to do is to compete with these entities because there are not that many companies in Europe (pursuing quantum technologies). The second part is, maybe you want to have a different tool using a new technology (not just semiconductors). So if one technology might be a dead end in the long run, we have in Europe a plan B and a plan C because we don’t put everything into one bucket.

    ‘The Flagship is looking not only at computing, but also clocks, sensors, communications. It has a long-term vision of quantum computing as well.’

    Where does Europe stand in the global race on quantum technologies?

    ‘Think about Schrödinger, Einstein – all of quantum physics was developed 100 years ago in Europe. This is something that we excel at. We have invested in fundamental research for essentially the last 100 years, and now that there is the chance of turning research into technology and applications, we shouldn’t miss that train. We have a very good chance of making that work.’

    See the full article here.


    Please help promote STEM in your local schools.


    Stem Education Coalition

     
  • richardmitnick 8:37 am on May 14, 2019 Permalink | Reply
    Tags: "Quantum world-first: researchers can now tell how accurate two-qubit calculations in silicon really are", ...you can only tap into the tremendous power of quantum computing if the qubit operations are near perfect with only tiny errors allowed” Dr Yang says., , “Fidelity is a critical parameter which determines how viable a qubit technology is..., , Quantum Computing, The researchers say the study is further proof that silicon as a technology platform is ideal for scaling up to the large numbers of qubits needed for universal quantum computing., Two-qubit gate,   

    From University of New South Wales: “Quantum world-first: researchers can now tell how accurate two-qubit calculations in silicon really are” 


    From University of New South Wales – Sydney

    14 May 2019

    Isabelle Dubach
    Media and Content Manager
    +61 2 9385 7307, 0432 307 244
    i.dubach@unsw.edu.au

    Scientia Professor Andrew Dzurak
    Electrical Engineering & Telecommunications
    +61 432 405 434
    a.dzurak@unsw.edu.au

    After being the first team to create a two-qubit gate in silicon in 2015, UNSW Sydney engineers are breaking new ground again: they have measured the accuracy of silicon two-qubit operations for the first time – and their results confirm the promise of silicon for quantum computing.

    Wister Huang, a final-year PhD student in Electrical Engineering; Professor Andrew Dzurak; and Dr Henry Yang, a senior research fellow.

    For the first time ever, researchers have measured the fidelity – that is, the accuracy – of two-qubit logic operations in silicon, with highly promising results that will enable scaling up to a full-scale quantum processor.

    The research, carried out by Professor Andrew Dzurak’s team in UNSW Engineering, was published today in the world-renowned journal Nature.

    The experiments were performed by Wister Huang, a final-year PhD student in Electrical Engineering, and Dr Henry Yang, a senior research fellow at UNSW.

    “All quantum computations can be made up of one-qubit operations and two-qubit operations – they’re the central building blocks of quantum computing,” says Professor Dzurak.

    “Once you’ve got those, you can perform any computation you want – but the accuracy of both operations needs to be very high.”

    In 2015 Dzurak’s team was the first to build a quantum logic gate in silicon, making calculations between two qubits of information possible – and thereby clearing a crucial hurdle to making silicon quantum computers a reality.

    A number of groups around the world have since demonstrated two-qubit gates in silicon – but until this landmark paper today, the true accuracy of such a two-qubit gate was unknown.

    Accuracy crucial for quantum success

    “Fidelity is a critical parameter which determines how viable a qubit technology is – you can only tap into the tremendous power of quantum computing if the qubit operations are near perfect, with only tiny errors allowed,” Dr Yang says.

    In this study, the team implemented and performed Clifford-based fidelity benchmarking – a technique that can assess qubit accuracy across all technology platforms – demonstrating an average two-qubit gate fidelity of 98%.

    “We achieved such a high fidelity by characterising and mitigating primary error sources, thus improving gate fidelities to the point where randomised benchmarking sequences of significant length – more than 50 gate operations – could be performed on our two-qubit device,” says Mr Huang, the lead author on the paper.
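The decay-and-convert arithmetic behind Clifford benchmarking can be sketched numerically. This is a hedged illustration with invented constants, not the team's analysis code:

```python
# Illustrative sketch (constants are invented, not from the paper):
# Clifford randomized benchmarking models the survival probability after
# m random two-qubit Clifford gates as P(m) = A * p**m + B, then converts
# the decay parameter p into an average gate fidelity
# F = 1 - (1 - p) * (d - 1) / d, where d = 2**n = 4 for two qubits.

def rb_fidelity(p, d=4):
    """Average gate fidelity implied by the RB decay parameter p."""
    return 1 - (1 - p) * (d - 1) / d

# Synthetic, noiseless survival data; p_true is chosen so the fidelity
# lands near the ~98% reported in the article.
A, B, p_true = 0.75, 0.25, 0.9733
lengths = range(1, 51)                       # sequences up to 50 Cliffords
survival = [A * p_true**m + B for m in lengths]

# With noiseless data the decay parameter is just the ratio of
# consecutive decaying parts; real data would need a least-squares fit
# with error bars.
decay = [s - B for s in survival]
ratios = [b / a for a, b in zip(decay, decay[1:])]
p_est = sum(ratios) / len(ratios)

print(round(p_est, 4), round(rb_fidelity(p_est), 3))  # → 0.9733 0.98
```

Because the benchmark averages over random Clifford sequences, the extracted fidelity is insensitive to state-preparation and measurement errors, which is why it can compare qubit technologies across platforms.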

    Quantum computers will have a wide range of important applications in the future thanks to their ability to perform far more complex calculations at much greater speeds, including solving problems that are simply beyond the ability of today’s computers.

    “But for most of those important applications, millions of qubits will be needed, and you’re going to have to correct quantum errors, even when they’re small,” Professor Dzurak says.

    “For error correction to be possible, the qubits themselves have to be very accurate in the first place – so it’s crucial to assess their fidelity.”

    “The more accurate your qubits, the fewer you need – and therefore, the sooner we can ramp up the engineering and manufacturing to realise a full-scale quantum computer.”
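Professor Dzurak's point that more accurate qubits mean fewer of them can be made concrete with a standard surface-code estimate. The model and all its constants are an illustrative rule of thumb, not figures from the study:

```python
# Hedged back-of-the-envelope model (assumed constants, not the team's
# numbers): a common surface-code rule of thumb puts the logical error
# rate at roughly 0.1 * (p / p_th)**((d + 1) / 2) for physical error
# rate p, threshold p_th ~ 1%, and code distance d, with about
# 2 * d**2 physical qubits per logical qubit.

def physical_qubits_per_logical(p, target_pl=1e-12, p_th=1e-2):
    """Smallest odd code distance hitting target_pl, and its qubit cost."""
    d = 3
    while 0.1 * (p / p_th) ** ((d + 1) / 2) > target_pl:
        d += 2
    return d, 2 * d * d

# Lower error rates (higher fidelities) shrink the overhead sharply.
for p in (2e-3, 5e-4, 1e-4):
    d, n = physical_qubits_per_logical(p)
    print(f"p = {p:g}: distance {d}, ~{n} physical qubits per logical qubit")
```

In this toy model, pushing the physical error rate from 2e-3 down to 1e-4 cuts the per-logical-qubit overhead by roughly a factor of eight, which is the engineering payoff of higher fidelity.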


Silicon confirmed as the way to go

    The researchers say the study is further proof that silicon as a technology platform is ideal for scaling up to the large numbers of qubits needed for universal quantum computing. Given that silicon has been at the heart of the global computer industry for almost 60 years, its properties are already well understood and existing silicon chip production facilities can readily adapt to the technology.

    “If our fidelity value had been too low, it would have meant serious problems for the future of silicon quantum computing. The fact that it is near 99% puts it in the ballpark we need, and there are excellent prospects for further improvement. Our results immediately show, as we predicted, that silicon is a viable platform for full-scale quantum computing,” Professor Dzurak says.

    “We think that we’ll achieve significantly higher fidelities in the near future, opening the path to full-scale, fault-tolerant quantum computation. We’re now on the verge of a two-qubit accuracy that’s high enough for quantum error correction.”

    In another paper – recently published in Nature Electronics and featured on its cover – on which Dr Yang is lead author, the same team also achieved the record for the world’s most accurate 1-qubit gate in a silicon quantum dot, with a remarkable fidelity of 99.96%.


    “Besides the natural advantages of silicon qubits, one key reason we’ve been able to achieve such impressive results is because of the fantastic team we have here at UNSW. My student Wister and Dr Yang are both incredibly talented. They personally conceived the complex protocols required for this benchmarking experiment,” says Professor Dzurak.

    Other authors on today’s Nature paper are UNSW researchers Tuomo Tanttu, Ross Leon, Fay Hudson, Andrea Morello and Arne Laucht, as well as former Dzurak team members Kok Wai Chan, Bas Hensen, Michael Fogarty and Jason Hwang, while Professor Kohei Itoh from Japan’s Keio University provided isotopically enriched silicon wafers for the project.

UNSW Dean of Engineering, Professor Mark Hoffman, says the breakthrough is yet more proof that this world-leading team is in the process of taking quantum computing across the threshold from the theoretical to the real.

    “Quantum computing is this century’s space race – and Sydney is leading the charge,” Professor Hoffman says.

    “This milestone is another step towards realising a large-scale quantum computer – and it reinforces the fact that silicon is an extremely attractive approach that we believe will get UNSW there first.”

    Spin qubits based on silicon CMOS technology – the specific method developed by Professor Dzurak’s group – hold great promise for quantum computing because of their long coherence times and the potential to leverage existing integrated circuit technology to manufacture the large numbers of qubits needed for practical applications.

    Professor Dzurak leads a project to advance silicon CMOS qubit technology with Silicon Quantum Computing, Australia’s first quantum computing company.

    “Our latest result brings us closer to commercialising this technology – my group is all about building a quantum chip that can be used for real-world applications,” Professor Dzurak says.

The silicon qubit device used in this study was fabricated entirely at UNSW using a novel silicon-CMOS process line, high-resolution patterning systems, and supporting nanofabrication equipment made available by ANFF-NSW.

A full-scale quantum processor would have major applications in the finance, security and healthcare sectors: it would help identify and develop new medicines by greatly accelerating the computer-aided design of pharmaceutical compounds, it could contribute to developing new, lighter and stronger materials for everything from consumer electronics to aircraft, and it would enable faster information searching through large databases.

See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    U NSW Campus

Welcome to UNSW Australia (The University of New South Wales), one of Australia’s leading research and teaching universities. At UNSW, we take pride in the broad range and high quality of our teaching programs. Our teaching gains strength and currency from our research activities, strong industry links and our international nature; UNSW has strong regional and global engagement.

    In developing new ideas and promoting lasting knowledge we are creating an academic environment where outstanding students and scholars from around the world can be inspired to excel in their programs of study and research. Partnerships with both local and global communities allow UNSW to share knowledge, debate and research outcomes. UNSW’s public events include concert performances, open days and public forums on issues such as the environment, healthcare and global politics. We encourage you to explore the UNSW website so you can find out more about what we do.

     
  • richardmitnick 10:16 am on May 11, 2019 Permalink | Reply
    Tags: Beginning this summer IBM will host developer boot camps and hackathons for hands-on training of the open source IBM Q Experience cloud services platform, IBM Q Network, Quantum Computing, This new effort will build on Virginia Tech’s ongoing efforts with the IBM Q Hub at Oak Ridge in Tennessee, , Virginia Tech has significant expertise in designing control schemes for quantum computing hardware and in developing algorithms for simulating molecular chemistry problems on quantum processors   

    From Virginia Tech: “Virginia Tech joining IBM Q Network to accelerate research, educational opportunities in quantum computing” 

    From Virginia Tech

    Left to right, Nick Mayhall, Sophia Economou, and Ed Barnes, all researchers and faculty members in the Virginia Tech College of Science, discuss quantum computing algorithms.

    Virginia Tech has joined the expanding IBM Q Network as a member of the IBM Q Hub at Oak Ridge National Laboratory to accelerate joint research in quantum computing, as well as develop curricula to help prepare students for new careers in science, engineering, and business influenced by the next era of computing.

The IBM Q Network is the world’s first community of Fortune 500 companies, startups, academic institutions, and research labs working to advance quantum computing. Virginia Tech researchers and students will have direct access to IBM Q’s most-advanced quantum computing systems for research projects that advance quantum science, for exploring early uses of quantum computing, and for teaching.

    IBM iconic image of Quantum computer

    Faculty and students from Virginia Tech’s College of Science and College of Engineering will collaborate with IBM scientists on research to advance the foundational science, technology, and software required to enable more capable quantum systems.

    “Virginia Tech has significant expertise in designing control schemes for quantum computing hardware and in developing algorithms for simulating molecular chemistry problems on quantum processors,” said Sophia Economou, an associate professor from the Department of Physics in the College of Science. “The collaboration with IBM will allow us to advance our efforts in these directions by directly testing our ideas on IBM hardware. Interactions with IBM researchers and student internships will further accelerate Virginia Tech’s expansion into the burgeoning field of quantum computing.”

    Beginning this summer, IBM will host developer boot camps and hackathons for hands-on training of the open source IBM Q Experience cloud services platform, and Qiskit quantum software platform on the campuses of participating universities.

For now, quantum computers are “noisy,” error-prone prototypes, much like classical computers were in the 1940s. But the exponential properties of their fundamental processing element, the quantum bit (or qubit), hold promise to solve problems in chemistry, artificial intelligence, and other areas that are intractable for today’s computers. Consider: 300 perfectly stable qubits could represent more values than there are atoms in the observable universe – well beyond the capacity of what a classical computer could ever compute. Today’s research is paving the way toward improving these early devices to develop practical quantum applications, according to IBM.
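The comparison above is easy to check with exact integer arithmetic. Both figures are order-of-magnitude estimates (the usual atom count quoted for the observable universe is around 1e80), not numbers from the article:

```python
# Quick arithmetic check: the state space of 300 qubits, 2**300 basis
# states, versus a commonly quoted ~1e80 atoms in the observable
# universe. Python integers are arbitrary precision, so this is exact.
n_states = 2 ** 300
atoms_estimate = 10 ** 80

print(len(str(n_states)))          # 2**300 is a 91-digit number
print(n_states > atoms_estimate)   # True
```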

Robert McGwier, chief scientist at Virginia Tech’s Hume Center and a research professor in the College of Engineering, said this new effort will build on Virginia Tech’s ongoing work with the IBM Q Hub at Oak Ridge in Tennessee on the construction and analysis of noisy-qubit quantum algorithms, as well as forthcoming efforts with the Office of the Director of Naval Intelligence’s augmented intelligence with machines program. Also in the College of Engineering, the Department of Computer Science’s Wu Feng is teaching an undergraduate course in quantum computing, and faculty are preparing research projects in the field for funding proposals with the National Science Foundation.

    IBM also has been partnering on efforts in computational chemistry with Daniel Crawford, a professor in the Department of Chemistry in the College of Science and director of the National Science Foundation-funded Molecular Sciences Software Institute. “The growing collaboration between researchers at Virginia Tech and IBM focuses on the development of novel algorithms that bind the well-established field of quantum chemistry and the emerging domain of quantum computing in order to attack larger and more complex molecular problems than those currently in our grasp,” Crawford said.

    Additional Virginia Tech faculty partnering on the IBM quantum computing project include Ed Barnes, an assistant professor of physics, and Nick Mayhall, an assistant professor of chemistry.

See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

Virginia Polytechnic Institute and State University, commonly known as Virginia Tech and by the initialisms VT and VPI, is an American public land-grant research university with a main campus in Blacksburg, Virginia, educational facilities in six regions statewide, and a study-abroad site in Lugano, Switzerland. Through its Corps of Cadets ROTC program, Virginia Tech is also designated as one of six senior military colleges in the United States.

As Virginia’s third-largest university, Virginia Tech offers 225 undergraduate and graduate degree programs to some 30,600 students and manages a research portfolio of $513 million, the largest of any university in Virginia. The university fulfills its land-grant mission of transforming knowledge to practice through technological leadership and by fueling economic growth and job creation locally, regionally, and across Virginia.

Virginia Polytechnic Institute and State University officially opened on Oct. 1, 1872, as Virginia’s white land-grant institution (Hampton Normal and Industrial Institute, founded in 1868, was designated the commonwealth’s first black land-grant school. This continued until 1920, when the funds were shifted by the legislature to the Virginia Normal and Industrial Institute in Petersburg, which in 1946 was renamed to Virginia State University by the legislature). During its existence, the university has operated under four different legal names. The founding name was Virginia Agricultural and Mechanical College. Following a reorganization of the college in the 1890s, the state legislature changed the name to Virginia Agricultural and Mechanical College and Polytechnic Institute, effective March 5, 1896. Faced with such an unwieldy name, people began calling it Virginia Polytechnic Institute, or simply VPI. On June 23, 1944, the legislature followed suit, officially changing the name to Virginia Polytechnic Institute. At the same time, the commonwealth moved most women’s programs from VPI to nearby Radford College, and that school’s official name became Radford College, Women’s Division of Virginia Polytechnic Institute. The commonwealth dissolved the affiliation between the two colleges in 1964. The state legislature sanctioned university status for VPI and bestowed upon it the present legal name, Virginia Polytechnic Institute and State University, effective June 26, 1970. While some older alumni and other friends of the university continue to call it VPI, its most popular – and its official – nickname today is Virginia Tech.

     
  • richardmitnick 8:53 am on April 26, 2019 Permalink | Reply
    Tags: A promising building block for supercomputers of the future: a two-dimensional platform for that could lead to quantum bits that are both stable and able to be mass produced., , Center for Quantum Devices (QDev) a Center of Excellence sponsored by the Danish National Research Foundation at the Niels Bohr Institute University of Copenhagen, Our prototype is a significant first step towards using this type of system to make quantum bits that are protected from disturbances., Quantum Computing, The Copenhagen team was able to demonstrate Majorana zero modes in the one-dimensional semiconductor gap between two superconductors forming a spatially extended Josephson junction.,   

    From University of Copenhagen: “University of Copenhagen researchers realize new platform for future quantum computer” 

    From University of Copenhagen

    Niels Bohr Institute bloc

    Niels Bohr Institute

    26 April 2019

    Antonio Fornieri
    Postdoc
    antonio.fornieri@nbi.ku.dk
    http://www.nbi.ku.dk/
    Phone: +45 35 33 48 89

    Michael Skov Jensen
    Press officer
    Faculty of Science
    msj@science.ku.dk
    +45 93 56 58 97

    Quantum physics

University of Copenhagen physicists, as part of the University and Microsoft collaboration focused on topological quantum computing, may have unloosed a Gordian knot in quantum computer development. In partnership with researchers from the University of Chicago, ETH Zürich, the Weizmann Institute of Science, and fellow Microsoft Quantum Lab collaborators at Purdue University, they have designed and realized a promising building block for supercomputers of the future: a two-dimensional platform that could lead to quantum bits that are both stable and able to be mass-produced.

Led by two young physicists, Antonio Fornieri and Alex Whiticar, under the supervision of Professor Charles Marcus, Director of Microsoft Quantum Lab Copenhagen, researchers at the Center for Quantum Devices (QDev), a Center of Excellence sponsored by the Danish National Research Foundation at the Niels Bohr Institute, University of Copenhagen, designed, built, and characterized a key component that could cut a Gordian knot in the development of viable quantum computers – specifically, the building block for a quantum bit, or qubit, that is both protected from disturbances and able to be mass-produced. Their results have just been published in the scientific journal Nature.

    Together with a back-to-back publication from a team at Harvard University on a related system, the Copenhagen team was able to demonstrate Majorana zero modes in the one-dimensional semiconductor gap between two superconductors forming a spatially extended Josephson junction, an effect predicted theoretically by teams at Harvard-Weizmann, and Niels Bohr Institute-Lund University.

    The wide Josephson junction is part of a complex chip of hybrid superconductor and semiconductor materials grown by Michael Manfra’s Microsoft Quantum Lab group at Purdue. It is anticipated to be an important component in the development of topological quantum information. The discovery unlocks a range of possibilities for researchers. “A major advantage of the discovered component is that it can be mass produced. We can design a large and complex system of quantum bits on a contemporary laptop and have it manufactured using a common production technique for ordinary computer circuits,” says co-lead author Postdoctoral Fellow Antonio Fornieri.

    From handcraft to mass production

    Majorana quantum states are the foundation for the quantum computer being developed by a combination of University students, PhDs and postdocs, and Microsoft employees pursuing collaborative research at Microsoft Quantum Lab Copenhagen at the Niels Bohr Institute. The Majorana quantum state has an important property that protects it from external disturbances, in principle enabling longer periods of quantum processing compared with other types of quantum bits. One of the greatest challenges for researchers worldwide is to develop qubits that are stable enough to allow a computer to perform complicated calculations before the quantum state disappears and the information stored in the bits is lost.

    In the past decade, Majorana particles have been created in the lab using semiconductor nanowires connected to superconductors and placed in a large magnetic field. Nanowires are not well suited for scale-up to a full-blown quantum technology because of the laborious assembly required to manipulate microscopic threads with a needle, move them individually from one substrate to another, and then secure them into a network. Given that a quantum computer will likely require thousands or more bits, this would be an exceptionally difficult process using hand-placed nanowires. Furthermore, nanowires require high magnetic fields to function. The new Josephson junction-based platform replaces the nanowires with a two-dimensional device which requires lower magnetic fields to form the Majorana states.

    Promising structure

    “Our prototype is a significant first step towards using this type of system to make quantum bits that are protected from disturbances. Right now, we still need some fine-tuning – we can improve the design and materials. But it is a potentially perfect structure,” asserts Fornieri.

The two-dimensional system has another important quality, according to research group member Alex Whiticar, a doctoral student: “Our component has an additional control parameter, in the form of the superconducting phase difference across the Josephson junction, that makes it possible to simultaneously control the presence of Majorana states throughout a system of quantum bits. This has never been seen before. Furthermore, this system needs a much lower magnetic field to achieve Majorana states. This will significantly ease the manufacturing of larger quantities of quantum bits.”

    Charles Marcus adds, “Moving from one dimensional nanowires into two-dimensional hybrids opened the field. This device is the first of many advances that can be anticipated once topological structures can be patterned and repeated with precision on the 10nm scale. Stay tuned.”

    Collaborative public-private partnering

This breakthrough underscores the productivity of the deepened collaboration established in September 2017 between the University of Copenhagen and Microsoft. The collaboration has only intensified and expanded since the establishment of Microsoft Quantum Materials Lab Copenhagen just one year later, drawing talent from the University of Copenhagen, the Technical University of Denmark, and across Europe.

    As summarized by Michael Manfra, “The close collaboration between the Microsoft Quantum Laboratories has resulted in a promising new platform for the study and control of Majorana zero modes. It is exciting that this approach is potentially scalable.”

    Schematic representation of the device.

See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Niels Bohr Institute Campus

    The Niels Bohr Institute (Danish: Niels Bohr Institutet) is a research institute of the University of Copenhagen. The research of the institute spans astronomy, geophysics, nanotechnology, particle physics, quantum mechanics and biophysics.

The Institute was founded in 1921, as the Institute for Theoretical Physics of the University of Copenhagen, by the Danish theoretical physicist Niels Bohr, who had been on the staff of the University of Copenhagen since 1914, and who had been lobbying for its creation since his appointment as professor in 1916. On the 80th anniversary of Niels Bohr’s birth – October 7, 1965 – the Institute officially became The Niels Bohr Institute. Much of its original funding came from the charitable foundation of the Carlsberg brewery, and later from the Rockefeller Foundation.

    During the 1920s, and 1930s, the Institute was the center of the developing disciplines of atomic physics and quantum physics. Physicists from across Europe (and sometimes further abroad) often visited the Institute to confer with Bohr on new theories and discoveries. The Copenhagen interpretation of quantum mechanics is named after work done at the Institute during this time.

    On January 1, 1993 the institute was fused with the Astronomic Observatory, the Ørsted Laboratory and the Geophysical Institute. The new resulting institute retained the name Niels Bohr Institute.

    The University of Copenhagen (UCPH) (Danish: Københavns Universitet) is the oldest university and research institution in Denmark. Founded in 1479 as a studium generale, it is the second oldest institution for higher education in Scandinavia after Uppsala University (1477). The university has 23,473 undergraduate students, 17,398 postgraduate students, 2,968 doctoral students and over 9,000 employees. The university has four campuses located in and around Copenhagen, with the headquarters located in central Copenhagen. Most courses are taught in Danish; however, many courses are also offered in English and a few in German. The university has several thousands of foreign students, about half of whom come from Nordic countries.

The university is a member of the International Alliance of Research Universities (IARU), along with University of Cambridge, Yale University, The Australian National University, and UC Berkeley, amongst others. The 2016 Academic Ranking of World Universities ranks the University of Copenhagen as the best university in Scandinavia and 30th in the world, the 2016-2017 Times Higher Education World University Rankings as 120th in the world, and the 2016-2017 QS World University Rankings as 68th in the world. The university has had 9 alumni become Nobel laureates and has produced one Turing Award recipient.

     
  • richardmitnick 10:34 am on April 11, 2019 Permalink | Reply
    Tags: , , Quantum Computing, ,   

    From Science Node: “The end of an era” 

    Science Node bloc
    From Science Node

    10 Apr, 2019
    Alisa Alering

    For the last fifty years, computer technology has been getting faster and cheaper. Now that extraordinary progress is coming to an end. What happens next?

    John Shalf, department head for Computer Science at Berkeley Lab, has a few ideas. He’s going to share them in his keynote at ISC High Performance 2019 in Frankfurt, Germany (June 16-20), but he gave Science Node a sneak preview.

    Moore’s Law is based on Gordon Moore’s 1965 prediction that the number of transistors on a microchip doubles every two years, while the cost is halved. His prediction proved true for several decades. What’s different now?

Double trouble. From 1965 to 2004, the number of transistors on a microchip doubled every two years while cost decreased. Now that you can’t get more transistors on a chip, high-performance computing is in need of a new direction. Data courtesy Dataquest/Intel.

    The end of Dennard scaling happened in 2004, when we couldn’t crank up the clock frequencies anymore on chips, so we moved to exponentially increasing parallelism in order to continue performance scaling. It was not an ideal solution, but it enabled us to continue some semblance of performance scaling. Now we’ve gotten to the point where we can’t squeeze any more transistors onto the chip.

If you can’t cram any more transistors on the chip, then we can’t continue to scale the number of cores as a means to scale performance. And we’ll get no power improvement: with the end of Moore’s Law, in order to get ten times more performance we would need ten times more power in the future. Capital equipment cost won’t improve either, meaning that if I spend $100 million and can get a 100 petaflop machine today, then if I spend $100 million ten years from now, I’ll get the same machine.
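The contrast Shalf draws can be put in a two-line toy model. The constants are mine, not his:

```python
import math

# Toy model of the scaling described above (assumed constants): with
# transistor counts doubling every two years, a fixed budget bought
# about 32x more performance per decade; with scaling over, the same
# budget buys roughly the same machine ten years later.

def growth_factor(years, doubling_period=2.0):
    """Performance-per-dollar multiplier after `years`."""
    return 2.0 ** (years / doubling_period)

print(growth_factor(10))                            # 32.0 under Moore's Law
print(growth_factor(10, doubling_period=math.inf))  # 1.0 once scaling stops
```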

    That sounds fairly dire. Is there anything we can do?

There are three dimensions we can pursue: one is new architectures and packaging, the second is CMOS transistor replacements using new materials, and the third is new models of computation that are not necessarily digital.

    Let’s break it down. Tell me about architectures.

    John Shalf, of Lawrence Berkeley National Laboratory, wants to consider all options—from new materials and specialization to industry partnerships–when it comes to imagining the future of high-performance computing. Courtesy John Shalf.

We need to change course and learn from our colleagues in other industries. Our friends in the phone business and in mega data centers are already pointing out the solution. Architectural specialization is one of the biggest sources of improvement in the iPhone. The A8 chip, introduced in 2014, had 29 different discrete accelerators. We’re now at the A11, and it has nearly 40 different discrete hardware accelerators. Future generation chips are slowly squeezing out the CPUs and having special function accelerators for different parts of their workload.

    And for the mega-data center, Google is making its own custom chip. They weren’t seeing the kind of performance improvements they needed from Intel or Nvidia, so they’re building their own custom chips tailored to improve the performance for their workloads. So are Facebook and Amazon. The only people absent from this are HPC.

With Moore’s Law tapering off, the only way to get a leg up in performance is to go back to customization. The embedded-systems and ARM ecosystem is an example where, even though the chips are custom, the components – the little circuit designs on those chips – are reusable across many different disciplines. The new commodity is going to be these little IP blocks we arrange on the chip. We may need to add some IP blocks that are useful for scientific applications, but there’s a lot of IP reuse in that embedded ecosystem and we need to learn how to tap into that.

    How do new materials fit in?

    We’ve been using silicon for the past several decades because it is inexpensive and ubiquitous, and has many years of development effort behind it. We have developed an entire scalable manufacturing infrastructure around it, so it continues to be the most cost-effective route for mass-manufacture of digital devices. It’s pretty amazing, to use one material system for that long. But now we need to look at some new transistor that can continue to scale performance beyond what we’re able to wring out of silicon. Silicon is, frankly, not that great of a material when it comes to electron mobility.

    _________________________________________________________
    The Materials Project

    The current pace of innovation is extremely slow because the primary means available for characterizing new materials is to read a lot of papers. One solution might be Kristin Persson’s Materials Project, originally invented to advance the exploration of battery materials.

By scaling materials computations over supercomputing clusters, research can be targeted to the most promising compounds, helping to remove guesswork from materials design. The hope is that reapplying this technology to also discover better electronic materials will speed the pace of discovery for new electronic devices.

In 2016, an eight-laboratory consortium was formed to push this idea at the DOE “Big Ideas Summit,” where grass-roots ideas from the labs are presented to the highest levels of DOE leadership. Read the whitepaper and elevator pitch here.

After the ‘Beyond Moore’s Law’ project was invited back for the 2017 Big Ideas Summit, the DOE created a Microelectronics BRN (Basic Research Needs) Workshop. The initial report from that meeting has been released, and the DOE’s FY20 budget includes a line item for microelectronics research.
    _________________________________________________________

    The problem is, we know historically that once you demonstrate a new device concept in the laboratory, it takes about ten years to commercialize it. Prior experience has shown a fairly consistent timeline of 10 years from lab to fab. Although there are some promising directions, nobody has demonstrated something that’s clearly superior to silicon transistors in the lab yet. With no CMOS replacement imminent, that means we’re already ten years too late! We need to develop tools and processes to accelerate the pace for discovery of more efficient microelectronic devices to replace CMOS and the materials that make them possible.

    So, until we find a new material for the perfect chip, can we solve the problem with new models of computing? What about quantum computing?

    New models would include quantum and neuromorphic computing. These models expand computing in new directions, but they’re best suited to problems that digital computers handle poorly.

    I like to use the example of ‘quantum Excel.’ Say I balance my checkbook by creating a spreadsheet with formulas, and it tells me how balanced my checkbook is. If I were to use a quantum computer for that—and it would be many, many, many years in the future where we’d have enough qubits to do it, but let’s just imagine—quantum Excel would be the superposition of all possible balanced checkbooks.
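    As a toy illustration of what “the superposition of all possibilities” means (my sketch, not the speaker’s), a few lines of NumPy can build the uniform superposition over every basis state of a small simulated register:

```python
import numpy as np

n_qubits = 3                       # tiny toy register; real applications need far more
dim = 2 ** n_qubits

# A Hadamard gate on each qubit turns |00...0> into an equal-weight
# superposition of all 2^n basis states ("all possible checkbooks").
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
gate = H
for _ in range(n_qubits - 1):
    gate = np.kron(gate, H)        # tensor product: one Hadamard per qubit

state = np.zeros(dim)
state[0] = 1.0                     # start in |000>
state = gate @ state

# Every basis state carries the same amplitude, 1/sqrt(2^n).
print(np.allclose(state, np.full(dim, 1 / np.sqrt(dim))))  # True
```

    Each of the 2^n amplitudes is equal, so measuring the register would return any one “checkbook” with the same probability; the power of a quantum algorithm lies in steering those amplitudes so the right answer becomes likely.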

    And a neuromorphic computer would say, ‘Yes, it looks correct,’ and then you’d ask it again and it would say, ‘It looks correct within an 80% confidence interval.’ Neuromorphic is great at pattern recognition, but it wouldn’t be as good for running partial differential equations and computing exact arithmetic.

    We really need to go back to the basics. We need to go back to ‘What are the application requirements?’

    Clearly there are a lot of challenges. What’s exciting about this time right now?

    The Summit supercomputer at Oak Ridge National Laboratory operates at a top speed of 200 petaflops and is currently the world’s fastest computer. But the end of Moore’s Law means that to get 10x that performance in the future, we would also need 10x more power. Courtesy Carlos Jones/ORNL.

    Computer architecture has become very, very important again. The previous era of exponential scaling created a much narrower space for innovation because the focus was general purpose computing, the universal machine. The problems we now face open the door again for mathematicians and computer architects to collaborate on big problems together. And I think that’s very exciting. Those kinds of collaborations lead to really fun, creative, and innovative solutions to important scientific problems worldwide.

    The real issue is that our economic model for acquiring supercomputing systems will be deeply disrupted. Originally, systems were designed by mathematicians to solve important mathematical problems. However, the exponential improvement rates of Moore’s law ensured that the most general purpose machines that were designed for the broadest range of problems would have a superior development budget and, over time, would ultimately deliver more cost-effective performance than specialized solutions.

    The end of Moore’s Law spells the end of general purpose computing as we know it. Continuing with this approach dooms us to modest or even non-existent performance improvements. But the cost of customization using current processes is unaffordable.

    We must reconsider our relationship with industry to re-enable specialization targeted at our relatively small HPC market. Developing a self-sustaining business model is paramount. The embedded ecosystem (including the ARM ecosystem) provides one potential path forward, but there is also the possibility of leveraging the emerging open source hardware ecosystem and even packaging technologies such as Chiplets to create cost-effective specialization.

    We must consider all options for business models and all options for partnerships across agencies or countries to ensure an affordable and sustainable path forward for the future of scientific and technical computing.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Science Node is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, Science Node is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read Science Node via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

     
  • richardmitnick 12:50 pm on April 5, 2019 Permalink | Reply
    Tags: "Putting a New Spin on Majorana Fermions", Majorana fermions are particle-like excitations called quasiparticles that emerge as a result of the fractionalization (splitting) of individual electrons into two halves., Quantum Computing, Spin ladders- crystals formed of atoms with a three-dimensional (3-D) structure subdivided into pairs of chains that look like ladders.

    From Brookhaven National Lab: “Putting a New Spin on Majorana Fermions” 

    From Brookhaven National Lab

    April 1, 2019
    Ariana Tantillo
    atantillo@bnl.gov
    (631) 344-2347

    Peter Genzer
    genzer@bnl.gov
    (631) 344-3174

    Split electrons that emerge at the boundaries between different magnetic states in materials known as spin ladders could act as stable bits of information in next-generation quantum computers.

    Theoretical calculations performed by (left to right) Neil Robinson, Robert Konik, Alexei Tsvelik, and Andreas Weichselbaum of Brookhaven Lab’s Condensed Matter Physics and Materials Science Department suggest that Majorana fermions exist in the boundaries of magnetic materials with different magnetic phases. Majorana fermions are particle-like excitations that emerge when single electrons fractionalize into two halves, and their unique properties are of interest for quantum applications.

    The combination of different phases of water—solid ice, liquid water, and water vapor—would require some effort to achieve experimentally. For instance, if you wanted to place ice next to vapor, you would have to continuously chill the water to maintain the solid phase while heating it to maintain the gas phase.

    For condensed matter physicists, this ability to create different conditions in the same system is desirable because interesting phenomena and properties often emerge at the interfaces between two phases. Of current interest are the conditions under which Majorana fermions might appear near these boundaries.

    Majorana fermions are particle-like excitations called quasiparticles that emerge as a result of the fractionalization (splitting) of individual electrons into two halves. In other words, an electron becomes an entangled (linked) pair of two Majorana quasiparticles, with the link persisting regardless of the distance between them. Scientists hope to use Majorana fermions that are physically separated in a material to reliably store information in the form of qubits, the building blocks of quantum computers. The exotic properties of Majoranas—including their high insensitivity to electromagnetic fields and other environmental “noise”—make them ideal candidates for carrying information over long distances without loss.
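    The fractionalization described above has a compact standard form in the physics literature; as a sketch (not drawn from the article itself), an ordinary electron operator c can be written in terms of two self-conjugate Majorana operators:

```latex
% One complex (Dirac) fermion splits into two Majorana operators
\gamma_1 = c + c^\dagger, \qquad \gamma_2 = i\,(c^\dagger - c),
% each of which is its own antiparticle and squares to one:
\gamma_j^\dagger = \gamma_j, \qquad \{\gamma_i, \gamma_j\} = 2\,\delta_{ij}.
% The electron occupation n = c^\dagger c is shared between the pair:
n = \tfrac{1}{2}\left(1 + i\,\gamma_1 \gamma_2\right)
```

    When the two Majoranas sit far apart in a material, the stored occupation number cannot be read or corrupted by any local disturbance acting on only one of them, which is the sense in which such a qubit resists environmental noise.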

    However, to date, Majorana fermions have only been realized in materials at extreme conditions, including at frigid temperatures close to absolute zero (−459 degrees Fahrenheit) and under high magnetic fields. And though they are “topologically” protected from local atomic impurities, disorder, and defects that are present in all materials (i.e., their spatial properties remain the same even if the material is bent, twisted, stretched, or otherwise distorted), they do not survive under strong perturbations. In addition, the range of temperatures over which they can operate is very narrow. For these reasons, Majorana fermions are not yet ready for practical technological application.

    Now, a team of physicists led by the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory and including collaborators from China, Germany, and the Netherlands has proposed a novel theoretical method for producing more robust Majorana fermions. According to their calculations, as described in a paper published on Jan. 15 in Physical Review Letters, these Majoranas emerge at higher temperatures (by many orders of magnitude) and are largely unaffected by disorder and noise. Even though they are not topologically protected, they can persist if the perturbations change slowly from one point to another in space.

    “Our numerical and analytical calculations provide evidence that Majorana fermions exist in the boundaries of magnetic materials with different magnetic phases, or directions of electron spins, positioned next to one another,” said co-author Alexei Tsvelik, senior scientist and leader of the Condensed Matter Theory Group in Brookhaven Lab’s Condensed Matter Physics and Materials Science (CMPMS) Department. “We also determined the number of Majorana fermions you should expect to get if you combine certain magnetic phases.”

    For their theoretical study, the scientists focused on magnetic materials called spin ladders, which are crystals formed of atoms with a three-dimensional (3-D) structure subdivided into pairs of chains that look like ladders. Though the scientists have been studying the properties of spin ladder systems for many years and expected that they would produce Majorana fermions, they did not know how many. To perform their calculations, they applied the mathematical framework of quantum field theory for describing the fundamental physics of elementary particles, and a numerical method (density-matrix renormalization group) for simulating quantum systems whose electrons behave in a strongly correlated way.
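    As a rough illustration of the kind of system being modeled (a toy sketch of a two-leg Heisenberg spin ladder, not the density-matrix renormalization group calculation the team actually performed), a small ladder Hamiltonian can be built and diagonalized exactly with NumPy:

```python
import numpy as np

# Spin-1/2 operators (Pauli matrices / 2)
sx = np.array([[0, 1], [1, 0]]) / 2
sy = np.array([[0, -1j], [1j, 0]]) / 2
sz = np.array([[1, 0], [0, -1]]) / 2
I2 = np.eye(2)

def op_on_site(op, site, n_sites):
    """Embed a single-site operator into the full 2^n-dimensional space."""
    mats = [I2] * n_sites
    mats[site] = op
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

def heisenberg_bond(i, j, n_sites):
    """Exchange coupling S_i . S_j as a full-space matrix."""
    return sum(op_on_site(s, i, n_sites) @ op_on_site(s, j, n_sites)
               for s in (sx, sy, sz))

def ladder_hamiltonian(n_rungs, j_rung=1.0, j_leg=1.0):
    """Two-leg spin ladder; site index = 2*rung + leg (leg = 0 or 1)."""
    n = 2 * n_rungs
    H = np.zeros((2**n, 2**n), dtype=complex)
    for r in range(n_rungs):                         # rung bonds
        H += j_rung * heisenberg_bond(2*r, 2*r + 1, n)
    for r in range(n_rungs - 1):                     # leg bonds
        for leg in range(2):
            H += j_leg * heisenberg_bond(2*r + leg, 2*(r + 1) + leg, n)
    return H

H = ladder_hamiltonian(n_rungs=3)                    # 6 sites, 64x64 matrix
e0 = np.linalg.eigvalsh(H).min()
print(f"Ground-state energy for a 3-rung ladder: {e0:.4f}")
```

    Exact diagonalization like this is limited to a handful of rungs because the matrix dimension doubles with every added spin; DMRG is what makes long ladders tractable.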

    “We were surprised to learn that for certain configurations of magnetic phases we can generate more than one Majorana fermion at each boundary,” said co-author and CMPMS Department Chair Robert Konik.

    For Majorana fermions to be practically useful in quantum computing, they need to be generated in large numbers. Computing experts believe that the minimum threshold at which quantum computers will be able to solve problems that classical computers cannot is 100 qubits. The Majorana fermions also have to be moveable in such a way that they can become entangled.

    The team plans to follow up their theoretical study with experiments using engineered systems such as quantum dots (nanosized semiconducting particles) or trapped (confined) ions. Compared to the properties of real materials, those of engineered ones can be more easily tuned and manipulated to introduce the different phase boundaries where Majorana fermions may emerge.

    “What the next generation of quantum computers will be made of is unclear right now,” said Konik. “We’re trying to find better alternatives to the low-temperature superconductors of the current generation, similar to how silicon replaced germanium in transistors. We’re in such early stages that we need to explore every possibility available.”

    See the full article here .




    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

     