Tagged: Quantum Computing

  • richardmitnick 8:24 am on January 15, 2020 Permalink | Reply
    Tags: "How to verify that quantum chips are computing correctly", Quantum Computing

    From MIT News: “How to verify that quantum chips are computing correctly” 

    MIT News

    From MIT News

    January 13, 2020
    Rob Matheson

    Researchers from MIT, Google, and elsewhere have designed a novel method for verifying when quantum processors have accurately performed complex computations that classical computers can’t. They validate their method on a custom system (pictured) that’s able to capture how accurately a photonic chip (“PNP”) computed a notoriously difficult quantum problem. Image: Mihika Prabhu

    A new method determines whether circuits are accurately executing complex operations that classical computers can’t tackle.

    In a step toward practical quantum computing, researchers from MIT, Google, and elsewhere have designed a system that can verify when quantum chips have accurately performed complex computations that classical computers can’t.

    Quantum chips perform computations using quantum bits, called “qubits,” that can represent the two states corresponding to classic binary bits — a 0 or 1 — or a “quantum superposition” of both states simultaneously. The unique superposition state can enable quantum computers to solve problems that are practically impossible for classical computers, potentially spurring breakthroughs in material design, drug discovery, and machine learning, among other applications.

    Full-scale quantum computers will require millions of qubits, which isn’t yet feasible. In the past few years, researchers have started developing “Noisy Intermediate Scale Quantum” (NISQ) chips, which contain around 50 to 100 qubits. That’s just enough to demonstrate “quantum advantage,” meaning the NISQ chip can run certain algorithms that are intractable for classical computers. Verifying that the chips performed operations as expected, however, can be very inefficient. The chip’s outputs can look entirely random, so it takes a long time to simulate steps to determine if everything went according to plan.

    In a paper published today in Nature Physics, the researchers describe a novel protocol to efficiently verify that an NISQ chip has performed all the right quantum operations. They validated their protocol on a notoriously difficult quantum problem running on a custom quantum photonic chip.

    “As rapid advances in industry and academia bring us to the cusp of quantum machines that can outperform classical machines, the task of quantum verification becomes time critical,” says first author Jacques Carolan, a postdoc in the Department of Electrical Engineering and Computer Science (EECS) and the Research Laboratory of Electronics (RLE). “Our technique provides an important tool for verifying a broad class of quantum systems. Because if I invest billions of dollars to build a quantum chip, it sure better do something interesting.”

    Joining Carolan on the paper are researchers from EECS and RLE at MIT, as well as from the Google Quantum AI Laboratory, Elenion Technologies, Lightmatter, and Zapata Computing.

    Divide and conquer

    The researchers’ work essentially traces an output quantum state generated by the quantum circuit back to a known input state. Doing so reveals which circuit operations were performed on the input to produce the output. Those operations should always match what researchers programmed. If not, the researchers can use the information to pinpoint where things went wrong on the chip.

    At the core of the new protocol, called “Variational Quantum Unsampling,” lies a “divide and conquer” approach, Carolan says, that breaks the output quantum state into chunks. “Instead of doing the whole thing in one shot, which takes a very long time, we do this unscrambling layer by layer. This allows us to break the problem up to tackle it in a more efficient way,” Carolan says.

    For this, the researchers took inspiration from neural networks — which solve problems through many layers of computation — to build a novel “quantum neural network” (QNN), where each layer represents a set of quantum operations.

    To run the QNN, they used traditional silicon fabrication techniques to build a 2-by-5-millimeter NISQ chip with more than 170 control parameters — tunable circuit components that make manipulating the photon path easier. Pairs of photons are generated at specific wavelengths from an external component and injected into the chip. The photons travel through the chip’s phase shifters — which change the path of the photons — interfering with each other. This produces a random quantum output state — which represents what would happen during computation. The output is measured by an array of external photodetector sensors.

    That output is sent to the QNN. The first layer uses complex optimization techniques to dig through the noisy output to pinpoint the signature of a single photon among all those scrambled together. Then, it “unscrambles” that single photon from the group to identify what circuit operations return it to its known input state. Those operations should match exactly the circuit’s specific design for the task. All subsequent layers do the same computation — removing from the equation any previously unscrambled photons — until all photons are unscrambled.

    As an example, say the input state of qubits fed into the processor was all zeroes. The NISQ chip executes a bunch of operations on the qubits to generate a massive, seemingly randomly changing number as output. (An output number will constantly be changing as it’s in a quantum superposition.) The QNN selects chunks of that massive number. Then, layer by layer, it determines which operations revert each qubit back down to its input state of zero. If any operations are different from the original planned operations, then something has gone awry. Researchers can inspect any mismatches between the expected and recovered operations, and use that information to tweak the circuit design.
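    The verification idea in the example above can be sketched in a few lines of classical linear algebra. This is a toy illustration, not the authors' actual protocol: if the chip really applied the programmed circuit U to the all-zeros input, then applying the inverse of U to the output should return the state to |00…0⟩, and any shortfall in the overlap flags an error.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_unitary(dim, rng):
    """Draw a Haar-random unitary via QR decomposition (toy stand-in for a circuit)."""
    z = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    q, r = np.linalg.qr(z)
    # fix column phases so the distribution is uniform over unitaries
    return q * (np.diagonal(r) / np.abs(np.diagonal(r)))

n_qubits = 3
dim = 2 ** n_qubits
U = random_unitary(dim, rng)           # the "programmed" circuit
zero = np.zeros(dim); zero[0] = 1.0    # the all-zeros input state |000>

output = U @ zero                      # looks random when measured
recovered = U.conj().T @ output        # undo the known operations

# overlap with the input should be 1 if the circuit ran as programmed
fidelity = abs(np.vdot(zero, recovered)) ** 2
print(round(fidelity, 6))  # → 1.0
```

    On real hardware the inverse operations are not known in advance; the paper's contribution is learning them layer by layer, which this sketch does not attempt.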

    Boson “unsampling”

    In experiments, the team successfully ran a popular computational task used to demonstrate quantum advantage, called “boson sampling,” which is usually performed on photonic chips. In this exercise, phase shifters and other optical components manipulate and convert a set of input photons into a different quantum superposition of output photons. Ultimately, the task is to calculate the probability that a certain input state will match a certain output state. That probability is essentially a sample from some probability distribution.
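    A hedged sketch of why those probabilities are classically hard: in boson sampling, the amplitude for a given input/output photon configuration is (up to normalization) the permanent of a submatrix of the interferometer's unitary, and computing permanents is #P-hard. Ryser's formula below is the standard classical method, already exponential in the matrix size.

```python
import numpy as np
from itertools import combinations

def permanent(M):
    """Ryser's formula for the matrix permanent: O(2^n) terms."""
    n = M.shape[0]
    total = 0.0
    for r in range(1, n + 1):
        for cols in combinations(range(n), r):
            total += (-1) ** (n - r) * np.prod(M[:, list(cols)].sum(axis=1))
    return total

# sanity check: the permanent of the all-ones n x n matrix is n!
print(float(permanent(np.ones((3, 3)))))  # → 6.0
```

    Unlike the determinant, no sign cancellations help here, which is the mathematical root of the task's intractability on classical machines.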

    But it’s nearly impossible for classical computers to compute those samples, due to the unpredictable behavior of photons. It’s been theorized that NISQ chips can compute them fairly quickly. Until now, however, there’s been no way to verify that quickly and easily, because of the complexity involved with the NISQ operations and the task itself.

    “The very same properties which give these chips quantum computational power make them nearly impossible to verify,” Carolan says.

    In experiments, the researchers were able to “unsample” two photons that had run through the boson sampling problem on their custom NISQ chip — and in a fraction of the time it would take traditional verification approaches.

    “This is an excellent paper that employs a nonlinear quantum neural network to learn the unknown unitary operation performed by a black box,” says Stefano Pirandola, a professor of computer science who specializes in quantum technologies at the University of York. “It is clear that this scheme could be very useful to verify the actual gates that are performed by a quantum circuit — [for example] by a NISQ processor. From this point of view, the scheme serves as an important benchmarking tool for future quantum engineers. The idea was remarkably implemented on a photonic quantum chip.”

    While the method was designed for quantum verification purposes, it could also help capture useful physical properties, Carolan says. For instance, certain molecules when excited will vibrate, then emit photons based on these vibrations. By injecting these photons into a photonic chip, Carolan says, the unscrambling technique could be used to discover information about the quantum dynamics of those molecules to aid in bioengineering molecular design. It could also be used to unscramble photons carrying quantum information that have accumulated noise by passing through turbulent spaces or materials.

    “The dream is to apply this to interesting problems in the physical world,” Carolan says.

    See the full article here.


    Please help promote STEM in your local schools.


    Stem Education Coalition

    MIT Seal

    The mission of MIT is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the MIT community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

    MIT Campus

     
  • richardmitnick 4:17 pm on January 9, 2020 Permalink | Reply
    Tags: New Quantum Algorithm, Quantum Computing, Quantum imaginary time evolution, The Hamiltonian represents the energy of the system

    From Caltech: “Caltech Researchers Develop New Quantum Algorithm” 

    Caltech Logo

    From Caltech

    December 18, 2019
    Emily Velasco
    626‑395‑6487
    evelasco@caltech.edu


    Quantum computers, just like classical computers, are only as good as the instructions that we give them. And although quantum computing is one of the hottest topics in science these days, the instructions, or algorithms, for quantum computers still have a long way to go to become useful. Garnet Chan, Caltech’s Bren Professor of Chemistry, is tackling this problem. In a new paper [Nature Physics], he describes how he, together with Fernando Brandao, Bren Professor of Theoretical Physics, and Austin Minnich, professor of mechanical engineering and applied physics, developed an algorithm for quantum computers that will help them find use in simulations in the physical sciences.

    The algorithm is derived from one already in use in classical computing called imaginary time evolution. Chan’s new algorithm, tailored to run on quantum computers, has been fittingly dubbed quantum imaginary time evolution and allows a user to find the lowest energy of a given molecule or material.
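    As a rough classical analogue (not the quantum algorithm itself), imaginary time evolution can be sketched in a few lines: repeatedly applying a small step of exp(-τH) to a trial state and renormalizing suppresses the excited states, leaving the ground state. The 2×2 Hamiltonian below is an arbitrary toy example.

```python
import numpy as np

H = np.array([[1.0, 0.5],
              [0.5, -1.0]])                # toy two-level Hamiltonian

psi = np.array([1.0, 1.0]) / np.sqrt(2)   # arbitrary starting state
tau = 0.1                                  # imaginary time step
for _ in range(200):
    psi = psi - tau * (H @ psi)            # first-order step toward exp(-tau*H) psi
    psi /= np.linalg.norm(psi)             # renormalize (the evolution is not unitary)

energy = psi @ H @ psi                     # converges to the lowest eigenvalue
print(np.isclose(energy, np.linalg.eigvalsh(H).min()))  # → True
```

    The difficulty Chan's group addresses is that this non-unitary filtering has no direct counterpart among the unitary gates a quantum computer can run, which is what the quantum version of the algorithm works around.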

    We sat down with Chan to talk about his research and what it means for quantum computing.

    In lay terms, what have you achieved with your new research?

    There has been a lot of interest in what kind of problems a quantum computer can potentially help to solve in the physical sciences. One problem that many people are interested in is how to simulate the ground states of molecules and materials. Our new paper proposes a way to calculate ground states of Hamiltonians that runs on near-term quantum computers with very few resources.

    What is a Hamiltonian, and why would you want to know its ground state?

    The Hamiltonian represents the energy of the system, and the ground state of the Hamiltonian is the most stable state of the problem. Most physical systems, under ordinary conditions, are not too excited, and thus live close to their ground states.

    For example, if we want to do a simulation of water, we could look at how water behaves after it has been blasted into a plasma—an electrically charged gas—but that’s not the state water is usually found in; it is not the ground state of water. Ground states are of special interest in understanding the world under ordinary conditions.

    Why is it challenging to perform these calculations on a quantum computer?

    Quantum devices currently decohere after a short period of time, which means that the computer needs to be recalibrated and cannot be used for calculations until it is set up again. That means we need to find a way to perform calculations on them very efficiently so we solve our problem before decoherence occurs.

    What does your algorithm do?

    There have been many proposals for how to obtain ground states on quantum computers. One of the first was by Alexei Kitaev [Caltech’s Ronald and Maxine Linde Professor of Theoretical Physics and Mathematics], but unfortunately that algorithm, known as phase estimation, requires too many instructions and cannot be implemented before current quantum computers decohere. Another way, called the variational approach, is very simple to implement but in practice turns out not to be so accurate. We wanted to find a way that could be potentially as accurate as phase estimation but which could also be practically programmed on today’s quantum computers.

    What does the development of this algorithm mean for quantum computing?

    Quantum computers are still very new, and we still need to learn what they will be useful for. Because we can barely use them right now, part of the answer lies in developing efficient programs that can be run on them in very little time. Our work provides a basis for assessing the capabilities of quantum computers as they are now, which will help tell us what we can expect in the future.

    See the full article here.



    The California Institute of Technology (commonly referred to as Caltech) is a private research university located in Pasadena, California, United States. Caltech has six academic divisions with strong emphases on science and engineering. Its 124-acre (50 ha) primary campus is located approximately 11 mi (18 km) northeast of downtown Los Angeles. “The mission of the California Institute of Technology is to expand human knowledge and benefit society through research integrated with education. We investigate the most challenging, fundamental problems in science and technology in a singularly collegial, interdisciplinary atmosphere, while educating outstanding students to become creative members of society.”

    Caltech campus

     
  • richardmitnick 8:48 am on January 5, 2020 Permalink | Reply
    Tags: "Immortal quantum particles", Quantum Computing, Quantum mechanical wave-particle duality, Technische Universität München

    From Technische Universität München: “Immortal quantum particles” 

    Technische Universität München

    From Technische Universität München

    June 14, 2019

    Strong quantum interactions prevent quasiparticles from decaying. Image: K. Verresen / TUM

    Oscillating Quasiparticles: the cycle of decay and rebirth

    Immortal quantum particles

    Decay is relentless in the macroscopic world: broken objects do not fit themselves back together again. However, other laws are valid in the quantum world: new research [Nature Physics] shows that so-called quasiparticles can decay and reorganize themselves again, and thus become virtually immortal. These are good prospects for the development of durable data memories.

    As the saying goes, nothing lasts forever. The laws of physics confirm this: on our planet, all processes increase entropy, thus molecular disorder. For example, a broken glass would never put itself back together again.

    Theoretical physicists at the Technische Universität München (TUM) and the Max Planck Institute for the Physics of Complex Systems have discovered that things which seem inconceivable in the everyday world are possible on a microscopic level.


    “Until now, the assumption was that quasiparticles in interacting quantum systems decay after a certain time. We now know that the opposite is the case: strong interactions can even stop decay entirely,” explains Frank Pollmann, Professor for Theoretical Solid-State Physics at the TUM. Collective lattice vibrations in crystals, so-called phonons, are one example of such quasiparticles.

    The concept of quasiparticles was coined by the physicist and Nobel prize winner Lev Davidovich Landau. He used it to describe collective states of many particles, or rather the interactions between them due to electrical or magnetic forces. Because of this interaction, several particles act like one single particle.

    Numeric methods open up new perspectives

    “Up until now, it wasn’t known in detail which processes influence the fate of these quasiparticles in interacting systems,” says Pollmann. “It is only now that we possess numerical methods with which we can calculate complex interactions, as well as computers with a performance high enough to solve these equations.”

    “The result of the elaborate simulation: admittedly, quasiparticles do decay; however, new, identical particle entities emerge from the debris,” says the lead author, Ruben Verresen. “If this decay proceeds very quickly, an inverse reaction will occur after a certain time and the debris will converge again. This process can recur endlessly, and a sustained oscillation between decay and rebirth emerges.”

    From a physical point of view, this oscillation is a wave which is transformed into matter, which, according to quantum mechanical wave-particle duality, is possible. Therefore, the immortal quasiparticles do not transgress the second law of thermodynamics. Their entropy remains constant; decay has been stopped.

    The reality check

    The discovery also explains phenomena which were baffling until now. Experimental physicists had measured that the magnetic compound Ba₃CoSb₂O₉ is astonishingly stable. Magnetic quasiparticles, magnons, are responsible for this. Other quasiparticles, rotons, ensure that helium, which is a gas at the earth’s surface, becomes a liquid at absolute zero that can flow without restriction.

    “Our work is purely basic research,” emphasizes Pollmann. However, it is perfectly possible that one day the results will even allow for applications, for example the construction of durable data memories for future quantum computers.

    See the full article here.


    Technische Universität München Campus

    Technische Universität München is one of Europe’s top universities. It is committed to excellence in research and teaching, interdisciplinary education and the active promotion of promising young scientists. The university also forges strong links with companies and scientific institutions across the world. TUM was one of the first universities in Germany to be named a University of Excellence. Moreover, TUM regularly ranks among the best European universities in international rankings.

     
  • richardmitnick 11:20 am on January 1, 2020 Permalink | Reply
    Tags: Quantum Computing

    From Princeton University: “In leap for quantum computing, silicon quantum bits establish a long-distance relationship” 

    Princeton University
    From Princeton University

    Dec. 30, 2019
    Catherine Zandonella, Office of the Dean for Research

    Imagine a world where people could only talk to their next-door neighbor, and messages must be passed house to house to reach far destinations.

    Until now, this has been the situation for the bits of hardware that make up a silicon quantum computer, a type of quantum computer with the potential to be cheaper and more versatile than today’s versions.

    Now a team based at Princeton University has overcome this limitation and demonstrated that two quantum-computing components, known as silicon “spin” qubits, can interact even when spaced relatively far apart on a computer chip. The study was published in the journal Nature.

    Researchers at Princeton University have made an important step forward in the quest to build a quantum computer using silicon components, which are prized for their low cost and versatility compared to the hardware in today’s quantum computers. The team showed that a silicon-spin quantum bit (shown in the box) can communicate with another quantum bit located a significant distance away on a computer chip. The feat could enable connections between multiple quantum bits to perform complex calculations. Image by Felix Borjans

    “The ability to transmit messages across this distance on a silicon chip unlocks new capabilities for our quantum hardware,” said Jason Petta, the Eugene Higgins Professor of Physics at Princeton and leader of the study. “The eventual goal is to have multiple quantum bits arranged in a two-dimensional grid that can perform even more complex calculations. The study should help in the long term to improve communication of qubits on a chip as well as from one chip to another.”

    Quantum computers have the potential to tackle challenges beyond the capabilities of everyday computers, such as factoring large numbers. A quantum bit, or qubit, can process far more information than an everyday computer bit because, whereas each classical computer bit can have a value of 0 or 1, a quantum bit can represent a range of values between 0 and 1 simultaneously.
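    The "range of values between 0 and 1" can be made concrete with a small illustration (a simplification of the full quantum formalism): a qubit state is a pair of complex amplitudes (a, b) with |a|² + |b|² = 1, and measurement yields 0 with probability |a|² and 1 with probability |b|².

```python
import numpy as np

# equal superposition of 0 and 1, the textbook example
a, b = 1 / np.sqrt(2), 1 / np.sqrt(2)

p0, p1 = abs(a) ** 2, abs(b) ** 2      # measurement probabilities
print(np.isclose(p0, 0.5), np.isclose(p1, 0.5), np.isclose(p0 + p1, 1.0))
# → True True True
```

    Any normalized pair (a, b) is allowed, which is the continuum of states the article alludes to; a classical bit permits only (1, 0) or (0, 1).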

    To realize quantum computing’s promise, these futuristic computers will require tens of thousands of qubits that can communicate with each other. Today’s prototype quantum computers from Google, IBM and other companies contain tens of qubits made from a technology involving superconducting circuits, but many technologists view silicon-based qubits as more promising in the long run.

    Silicon spin qubits have several advantages over superconducting qubits. The silicon spin qubits retain their quantum state longer than competing qubit technologies. The widespread use of silicon for everyday computers means that silicon-based qubits could be manufactured at low cost.

    The challenge stems in part from the fact that silicon spin qubits are made from single electrons and are extremely small.

    “The wiring or ‘interconnects’ between multiple qubits is the biggest challenge towards a large scale quantum computer,” said James Clarke, director of quantum hardware at Intel, whose team is building silicon qubits using Intel’s advanced manufacturing line, and who was not involved in the study. “Jason Petta’s team has done great work toward proving that spin qubits can be coupled at long distances.”

    To accomplish this, the Princeton team connected the qubits via a “wire” that carries light in a manner analogous to the fiber optic wires that deliver internet signals to homes. In this case, however, the wire is actually a narrow cavity containing a single particle of light, or photon, that picks up the message from one qubit and transmits it to the next qubit.

    The two qubits were located about half a centimeter, or about the length of a grain of rice, apart. To put that in perspective, if each qubit were the size of a house, the qubit would be able to send a message to another qubit located 750 miles away.

    The key step forward was finding a way to get the qubits and the photon to speak the same language by tuning all three to vibrate at the same frequency. The team succeeded in tuning both qubits independently of each other while still coupling them to the photon. Previously the device’s architecture permitted coupling of only one qubit to the photon at a time.

    “You have to balance the qubit energies on both sides of the chip with the photon energy to make all three elements talk to each other,” said Felix Borjans, a graduate student and first author on the study. “This was the really challenging part of the work.”

    Each qubit is composed of a single electron trapped in a tiny chamber called a double quantum dot. Electrons possess a property known as spin, which can point up or down in a manner analogous to a compass needle that points north or south. By zapping the electron with a microwave field, the researchers can flip the spin up or down to assign the qubit a quantum state of 1 or 0.

    “This is the first demonstration of entangling electron spins in silicon separated by distances much larger than the devices housing those spins,” said Thaddeus Ladd, senior scientist at HRL Laboratories and a collaborator on the project. “Not too long ago, there was doubt as to whether this was possible, due to the conflicting requirements of coupling spins to microwaves and avoiding the effects of noisy charges moving in silicon-based devices. This is an important proof-of-possibility for silicon qubits because it adds substantial flexibility in how to wire those qubits and how to lay them out geometrically in future silicon-based ‘quantum microchips.’”

    The communication between two distant silicon-based qubit devices builds on previous work by the Petta research team. In a 2010 paper in the journal Science, the team showed it is possible to trap single electrons in quantum wells. In the journal Nature in 2012, the team reported the transfer of quantum information from electron spins in nanowires to microwave-frequency photons, and in 2016 in Science they demonstrated the ability to transmit information from a silicon-based charge qubit to a photon. They demonstrated nearest-neighbor trading of information in qubits in 2017 in Science. And the team showed in 2018 in Nature that a silicon spin qubit could exchange information with a photon.

    Jelena Vuckovic, professor of electrical engineering and the Jensen Huang Professor in Global Leadership at Stanford University, who was not involved in the study, commented: “Demonstration of long-range interactions between qubits is crucial for further development of quantum technologies such as modular quantum computers and quantum networks. This exciting result from Jason Petta’s team is an important milestone towards this goal, as it demonstrates non-local interaction between two electron spins separated by more than 4 millimeters, mediated by a microwave photon. Moreover, to build this quantum circuit, the team employed silicon and germanium – materials heavily used in the semiconductor industry.”

    In addition to Borjans and Petta, the following contributed to the study: Xanthe Croot, a Dicke postdoctoral fellow; associate research scholar Michael Gullans; and Xiao Mi, who earned his Ph.D. at Princeton in Petta’s group and is now a research scientist at Google.

    The study was funded by the Army Research Office (grant W911NF-15-1-0149) and the Gordon and Betty Moore Foundation’s EPiQS Initiative (grant GBMF4535).

    See the full article here.


    Princeton University Campus

    About Princeton: Overview

    Princeton University is a vibrant community of scholarship and learning that stands in the nation’s service and in the service of all nations. Chartered in 1746, Princeton is the fourth-oldest college in the United States. Princeton is an independent, coeducational, nondenominational institution that provides undergraduate and graduate instruction in the humanities, social sciences, natural sciences and engineering.

    As a world-renowned research university, Princeton seeks to achieve the highest levels of distinction in the discovery and transmission of knowledge and understanding. At the same time, Princeton is distinctive among research universities in its commitment to undergraduate teaching.

    Today, more than 1,100 faculty members instruct approximately 5,200 undergraduate students and 2,600 graduate students. The University’s generous financial aid program ensures that talented students from all economic backgrounds can afford a Princeton education.

    Princeton Shield

     
  • richardmitnick 1:27 pm on December 28, 2019 Permalink | Reply
    Tags: Quantum Computing

    From Discover Magazine: “Quantum Computers Finally Beat Supercomputers in 2019” 

    DiscoverMag

    From Discover Magazine

    December 28, 2019
    Stephen Ornes

    LSU physicist Jonathan Dowling (right), shown with alumnus Todd Moulder, has pushed the growth rate in quantum computing. (Credit: LSU)

    In his 2013 book, Schrödinger’s Killer App, Louisiana State University theoretical physicist Jonathan Dowling predicted what he called “super exponential growth.” He was right. Back in May, during Google’s Quantum Spring Symposium, computer engineer Hartmut Neven reported the company’s quantum computing chip had been gaining power at breakneck speed.

    Google’s Sycamore chip is kept cool inside their quantum cryostat. (Image: © Eric Lucero/Google, Inc.)

    The subtext: We are venturing into an age of quantum supremacy — the point at which quantum computers outperform the best classical supercomputers in solving a well-defined problem.

    Engineers test the accuracy of quantum computing chips by using them to solve a problem, and then verifying the work with a classical machine. But in early 2019, that process became problematic, reported Neven, who runs Google’s Quantum Artificial Intelligence Lab. Google’s quantum chip was improving so quickly that his group had to commandeer increasingly large computers — and then clusters of computers — to check its work. It’s become clear that eventually, they’ll run out of machines.

    Case in point: Google announced in October that its 53-qubit quantum processor had needed only 200 seconds to complete a problem that would have required 10,000 years on a supercomputer.

    Neven’s group observed a “double exponential” growth rate in the chip’s computing power over a few months. Plain old exponential growth is already really fast: It means that from one step to the next, the value of something multiplies. Bacterial growth can be exponential if the number of organisms doubles during an observed time interval. So can computing power of classical computers under Moore’s Law, the idea that it doubles roughly every year or two. But under double exponential growth, the exponents have exponents. That makes a world of difference: Instead of a progression from 2 to 4 to 8 to 16 to 32 bacteria, for example, a double-exponentially growing colony in the same time would grow from 2 to 4 to 16 to 256 to 65,536.
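    The two growth rates in the paragraph above are easy to check numerically: plain exponential growth multiplies by 2 each step, while double exponential growth squares the running value each step.

```python
# plain exponential: 2^k, i.e. doubling each step
exponential = [2 ** (k + 1) for k in range(5)]

# double exponential: 2^(2^k), i.e. squaring each step
double_exp = [2 ** (2 ** k) for k in range(5)]

print(exponential)  # → [2, 4, 8, 16, 32]
print(double_exp)   # → [2, 4, 16, 256, 65536]
```

    Five steps further, the doubling sequence reaches only 1,024 while the squaring sequence exceeds 10³⁰⁸, which is why checking the quantum chip's work outran any classical cluster so quickly.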

    Neven credits the growth rate to two factors: the predicted way that quantum computers improve on the computational power of classical ones, and quick improvement of quantum chips themselves. Some began referring to this growth rate as “Neven’s Law.” Some theorists say such growth was unavoidable.

    We talked to Dowling (who suggests a more fitting moniker: the “Dowling-Neven Law”) about double exponential growth, his prediction and his underappreciated Beer Theory of Quantum Mechanics.

    Q: You saw double exponential growth on the horizon long before it showed up in a lab. How?

    A: Anytime there’s a new technology, if it is worthwhile, eventually it kicks into exponential growth in something. We see this with the internet, we saw this with classical computers. You eventually hit a point where all of the engineers figure out how to make this work, miniaturize it and then you suddenly run into exponential growth in terms of the hardware. If it doesn’t happen, that hardware falls off the face of the Earth as a nonviable technology.

    Q: So you weren’t surprised to see Google’s chip improving so quickly?

    A: I’m only surprised that it happened earlier than I expected. In my book, I said within the next 50 to 80 years. I guessed a little too conservatively.

    Q: You’re a theoretical physicist. Are you typically conservative in your predictions?

    A: People say I’m fracking nuts when I publish this stuff. I like to think that I’m the crazy guy that always makes the least conservative prediction. I thought this was far-out wacky stuff, and I was making the most outrageous prediction. That’s why it’s taking everybody by surprise. Nobody expected double exponential growth in processing power to happen this soon.

    Q: Given that quantum chips are getting so fast, can I buy my own quantum computer now?

    A: Most people think the quantum computer is a solved problem. That we can just wait, and Google will sell you one that can do whatever you want. But no. We’re in the [prototype] era. The number of qubits is doubling every six months, but the qubits are not perfect. They fail a lot and have imperfections and so forth. But Intel and Google and IBM aren’t going to wait for perfect qubits. The people who made the [first computers] didn’t say, “We’re going to stop making bigger computers until we figure out how to make perfect vacuum tubes.”

    Q: What’s the big deal about doing problems with quantum mechanics instead of classical physics?

    A: If you have 32 qubits, it’s like you have 2^32 parallel universes that are working on parts of your computation. Or like you have a parallel processor with 2^32 processors. But you only pay the electric bill in our universe.
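The scale of that metaphor is concrete: describing an n-qubit state classically takes 2^n complex amplitudes. A quick sketch (the 16-bytes-per-amplitude figure assumes complex128 storage):

```python
# An n-qubit state is a superposition over 2**n basis states; simulating it
# classically means tracking 2**n complex amplitudes.

def num_amplitudes(n_qubits):
    return 2 ** n_qubits

def memory_bytes(n_qubits, bytes_per_amplitude=16):  # complex128
    return num_amplitudes(n_qubits) * bytes_per_amplitude

print(num_amplitudes(32))        # 4294967296 amplitudes
print(memory_bytes(32) / 2**30)  # 64.0 GiB just to store the state
```

At roughly 50 qubits the amplitude count outgrows any conceivable classical memory, which is the regime the NISQ-era supremacy experiments target.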

    Q: Quantum mechanics gets really difficult, really fast. How do you deal with that?

    A: Everybody has their own interpretation of quantum mechanics. Mine is the Many Beers Interpretation of Quantum Mechanics. With no beer, quantum mechanics doesn’t make any sense. After one, two or three beers, it makes perfect sense. But once you get to six or 10, it doesn’t make any sense again. I’m on my first bottle, so I’m in the zone.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 1:10 pm on December 6, 2019 Permalink | Reply
    Tags: "A platform for stable quantum computing, a playground for exotic physics", , , , Quantum Computing, , Topological insulators are materials that can conduct electricity on their surface or edge but not in the middle.   

    From Harvard Gazette: “A platform for stable quantum computing, a playground for exotic physics” 

    Harvard University


    From Harvard Gazette

    December 5, 2019
    Leah Burrows

    1
    A close-up view of a quantum computer. Courtesy of Harvard SEAS

    Recent research settles a long-standing debate.

    Move over Godzilla vs. King Kong. This is the crossover event you’ve been waiting for — at least if you’re a condensed-matter physicist. Harvard University researchers have demonstrated the first material that can have both strongly correlated electron interactions and topological properties.

    Not sure what that means? Don’t worry, we’ll walk you through it. But the important thing to know is that this discovery not only paves the way for more stable quantum computing, but also creates an entirely new platform to explore the wild world of exotic physics.

    The research was published in Nature Physics.

    Let’s start with the basics. Topological insulators are materials that can conduct electricity on their surface or edge, but not in the middle. The strange thing about these materials is that no matter how you cut them, the surface will always be conducting and the middle always insulating. These materials offer a playground for fundamental physics, and are also promising for a number of applications in special types of electronics and quantum computing.

    Since the discovery of topological insulators, researchers around the world have been working to identify materials with these powerful properties.

    “A recent boom in condensed-matter physics has come from discovering materials with topologically protected properties,” said Harris Pirie, a graduate student in the Department of Physics and first author of the paper.

    One potential material, samarium hexaboride, has been at the center of a fierce debate among condensed-matter physicists for more than a decade. At issue: Is it or isn’t it a topological insulator?

    “Over the last 10 years, a bunch of papers have come out saying yes and a bunch of papers have come out saying no,” said Pirie. “The crux of the issue is that most topological materials don’t have strongly interacting electrons, meaning the electrons move too quickly to feel each other. But samarium hexaboride does, meaning that electrons inside this material slow down enough to interact strongly. In this realm, the theory gets fairly speculative and it’s been unclear whether or not it’s possible for materials with strongly interacting properties to also be topological. As experimentalists, we’ve been largely operating blind with materials like this.”

    In order to settle the debate and figure out, once and for all, whether it’s possible to have both strongly interacting and topological properties, the researchers first needed to find a well-ordered patch of samarium hexaboride surface on which to perform the experiment.

    2
    A simulation of electrons scattering off atomic defects in samarium hexaboride. By observing the waves, the researchers could figure out the momentum of the electrons in relation to their energy. Video courtesy of Harris Pirie/Harvard University

    It was no easy task, considering the majority of the material surface is a craggy, disordered mess. The researchers used ultrahigh precision measurement tools developed in the lab of Jenny Hoffman, the Clowes Professor of Science and senior author of the paper, to find a suitable, atomic-scale patch of samarium hexaboride.

    Next, the team set out to determine if the material was topologically insulating by sending waves of electrons through the material and scattering them off of atomic defects — like dropping a pebble into a pond. By observing the waves, the researchers could figure out the momentum of the electrons in relation to their energy.

    “We found that the momentum of the electrons is directly proportional to their energy, which is the smoking gun of a topological insulator,” said Pirie. “It’s really exciting to be finally moving into this intersection of interacting physics and topological physics. We don’t know what we’ll find here.”
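The “smoking gun” Pirie describes can be written down directly: topological surface states obey a linear (Dirac-like) dispersion, E = ħv|k|, while ordinary massive electrons obey the quadratic E = ħ²k²/2m. A schematic comparison in Python; the velocity value here is illustrative, not a fitted samarium hexaboride parameter:

```python
# Dispersion relations distinguishing topological surface states (linear,
# Dirac-like) from ordinary massive electrons (quadratic in k).
# Constants are illustrative, not measured values for samarium hexaboride.

HBAR = 1.0545718e-34   # J*s
M_E = 9.1093837e-31    # electron mass, kg
V_DIRAC = 3.0e5        # m/s, an assumed surface-state velocity

def dirac_energy(k):
    return HBAR * V_DIRAC * abs(k)          # linear: E proportional to k

def massive_energy(k):
    return HBAR ** 2 * k ** 2 / (2 * M_E)   # quadratic: E proportional to k**2

for k in (1e8, 2e8, 4e8):  # wavevectors in 1/m
    print(f"k={k:.0e}: Dirac {dirac_energy(k):.2e} J, massive {massive_energy(k):.2e} J")
```

Doubling the momentum doubles the Dirac energy but quadruples the massive one; a measured energy directly proportional to momentum is what identifies the topological surface state.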

    As it relates to quantum computing, strongly interacting topological materials may be able to protect qubits from forgetting their quantum state, a process called decoherence.

    “If we could encode the quantum information in a topologically protected state, it is less susceptible to external noise that can accidentally switch the qubit,” said Hoffman. “Microsoft already has a large team pursuing topological quantum computation in composite materials and nanostructures. Our work demonstrates a first in a single topological material that harnesses strong electron interactions that might eventually be used for topological quantum computing.”

    “The next step will be to use the combination of topologically protected quantum states and strong interactions to engineer novel quantum states of matter, such as topological superconductors,” said Dirk Morr, professor of physics at the University of Illinois, Chicago, and the senior theorist on the paper. “Their extraordinary properties could open unprecedented possibilities for the implementation of topological quantum bits.”

    This research was co-authored by Yu Liu, Anjan Soumyanarayanan, Pengcheng Chen, Yang He, M.M. Yee, P.F.S. Rosa, J.D. Thompson, Dae-Jeong Kim, Z. Fisk, Xiangfeng Wang, Johnpierre Paglione, and M.H. Hamidian.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Harvard University campus
    Harvard University is the oldest institution of higher education in the United States, established in 1636 by vote of the Great and General Court of the Massachusetts Bay Colony. It was named after the College’s first benefactor, the young minister John Harvard of Charlestown, who upon his death in 1638 left his library and half his estate to the institution. A statue of John Harvard stands today in front of University Hall in Harvard Yard, and is perhaps the University’s best known landmark.

    Harvard University has 12 degree-granting Schools in addition to the Radcliffe Institute for Advanced Study. The University has grown from nine students with a single master to an enrollment of more than 20,000 degree candidates including undergraduate, graduate, and professional students. There are more than 360,000 living alumni in the U.S. and over 190 other countries.

     
  • richardmitnick 5:59 pm on November 30, 2019 Permalink | Reply
    Tags: Black phosphorus, Chromium triiodide, Phosphorene, Quantum Computing

    From Discover Magazine: “Move over Graphene: Next-Gen 2D Materials Could Revolutionize Technology” 

    DiscoverMag

    From Discover Magazine

    November 29, 2019
    John Wenz

    Move over, flat carbon. Meet borophene, phosphorene and the rest of the next generation of “atomically thin” super-materials.

    1
    An illustration of graphene’s hexagonal molecular structure. (Credit: OliveTree/Shutterstock)

    The wonder material graphene — an array of interlinked carbon atoms arranged in a sheet just one atom thick — promised a world of applications, including super-fast electronics, ultra-sensitive sensors and incredibly durable materials. After a few false starts, that promise is close to realization. And a suite of other extremely thin substances is following in its wake.

    Graphene got its beginnings in 2003, when scientists at the University of Manchester found they could peel off a gossamer film of the material just by touching a piece of ordinary sticky tape to a block of purified graphite — the solid form of carbon that’s mixed with clay and used as the “lead” in most pencils. Graphene proved stronger than steel but extremely flexible, and electrons could zip through it at high speeds. It earned its discoverers the Nobel Prize in 2010, but researchers spent years struggling to manufacture it on larger scales and figuring out how its remarkable properties could best be used.

    They didn’t get it right straight out of the gate, says Todd Krauss, a chemist at the University of Rochester. “Scientists are pretty bad at predicting what’s going to be useful in applications,” he says.

    With its atom-thin sheets layered into tiny particles known as quantum dots, graphene was tried as a microscopic medical sensor, but it didn’t perform as desired, Krauss says. With its sheets rolled up into straw-like nanotubes, graphene was built into items like hockey sticks and baseball bats in the hopes that its strength and durability could outperform existing carbon fiber. But Krauss notes that there has since been a trend away from using nanotubes in consumer products. (Some also worry that long carbon nanotubes could harm the lungs, since they have been shown to physically resemble asbestos fibers.)

    Today graphene is finding its way into different types of products. “Graphene is here,” says Mark Hersam of Northwestern University. Layered over zinc, graphene oxide is actively being developed as a replacement, with higher storage capacity, for the sometimes unreliable graphite now used in battery anodes. And nanotubes were recently used as transistors to build a microprocessor, replacing silicon (unlike flat graphene, nanotubes can be coaxed into acting like a semiconductor). Though the microprocessor was primitive by modern computing standards, akin to the processing level of a Sega Genesis, materials scientists think it could ultimately pave the way for more efficient, faster and smaller carbon components for computer processors.

    At the same time, a new generation of two-dimensional materials is emerging. The success of graphene further fueled the ongoing effort to find useful atomically thin materials, working with a range of different chemicals, so as to exploit the physical properties that emerge in such super-thin substances. The newcomers include an insulator more efficient than conventional ones at stopping the movement of electrons, and another that allows electrons to glide across it at a good percent of the speed of light, with little friction. Researchers think some of these may one day replace silicon in computer chips, among other potential uses.

    Other materials now in development have even higher aspirations, such as advancing scientists toward one of the most tantalizing goals in chemistry — the creation of high-temperature superconductors.

    Speedy Electrons

    In graphene, carbon atoms link up in an orderly honeycomb pattern, each atom sharing electrons with three neighboring carbon atoms. That structure allows any added electrons to move speedily across its surface. Ordinarily, a single electron might move through a conducting metal like copper at 1.2 inches per minute (given a 12-gauge wire with 10 amps of electricity). But in early experiments on graphene, electrons zipped along at 2.34 billion inches per minute — which could make for electronics that charge in just a few minutes and eventually in a matter of seconds.
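Those two speeds convert to SI units as follows. This is a quick sanity check: the copper figure is a drift velocity, while the graphene figure comes out close to the material’s commonly cited Fermi velocity of about 10^6 m/s:

```python
# Convert the article's inches-per-minute figures to meters per second.
# 1 inch = 0.0254 m; 1 minute = 60 s.

def inches_per_minute_to_m_per_s(v):
    return v * 0.0254 / 60

copper_drift = inches_per_minute_to_m_per_s(1.2)     # ~5.1e-04 m/s
graphene = inches_per_minute_to_m_per_s(2.34e9)      # ~9.9e+05 m/s
print(f"copper drift: {copper_drift:.2e} m/s")
print(f"graphene:     {graphene:.2e} m/s")
print(f"ratio ~ {graphene / copper_drift:.1e}")      # roughly 2e9
```

The roughly two-billion-fold gap between the two figures is what motivates the charging-time claims in the paragraph above.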

    Graphene’s physical properties have inspired many potential applications, including in medicine. A variant of graphene, graphene oxide, is being studied as an experimental drug delivery vehicle. Seen here through a microscope, this chunk of graphene oxide is about 80 nanometers high. A single sheet of graphene is just 0.34 nanometers thick.

    Graphene conducts heat just as well as it conducts electricity. It’s also one of the strongest materials ever studied — stronger than steel, it can stop a bullet — but oddly stretchy too, meaning it’s both flexible and tough.

    Other 2D materials under exploration may have similar attributes as well as novel qualities all their own, but chemical impurities have until recently kept them hidden, says Angela Hight Walker, a project leader at the National Institute of Standards and Technology in Gaithersburg, Maryland. “We’re now getting to the point where we can see the new physics that’s been covered up by poor sample quality,” she says.

    One of the newcomers is black phosphorus, explored by Hersam and his coauthor Vinod Sangwan in the 2018 Annual Review of Physical Chemistry. When white phosphorus — a caustic, highly reactive chemical — is super-heated under high pressure, it becomes a flaky, conductive material with graphite-like behavior. Peeling off an atom-thin layer of this black phosphorus with sticky tape produces a material called phosphorene. First fabricated in 2014, phosphorene rivals graphene in terms of strength and ability to efficiently move electrons. But at the atomic level, it isn’t as perfectly flat as graphene — and that has intriguing consequences.

    Phosphorene interacts with electrons and photons in quirky ways, pointing to potential uses in future computer chips and fiber optics.

    In graphene, carbon atoms lie side by side, hence its flatness. But phosphorene’s 2D configuration looks a bit like a pleat, with two atoms at a lower level connected to two at a higher level. This puckered structure opens up an energy gap, called a bandgap, that affects the flow of electrons in a way that makes phosphorene a “semiconductor,” meaning that it’s very easy to switch the flow of electrons on or off. Phosphorene, like silicon, could find application in computer chips, where the toggled electrons represent 1s or 0s.

    Phosphorene also is especially good at emitting or absorbing photons at infrared wavelengths. This optical trick gives phosphorene huge potential for use in fiber-optic communication, Hersam says, because the bandgap matches the energy of infrared light near-exactly. It could also prove very useful in solar cells.

    Working with phosphorene is not easy, however. It is highly unstable and rapidly oxidizes unless stored correctly. “Literally, it will decompose if it is sitting out in the room,” Hight Walker says, typically in less than a minute. Layering it with other 2D materials could help protect the fragile chemical.

    Two Sides of Boron

    Boron would seem an odd fit for electronic applications. It’s better known as a fertilizer, an ingredient in fiberglass or (combined with salt) a laundry-detergent additive. But make it very thin and very flat, and boron begins to act more like a metal, conducting electricity easily. Two-dimensional boron, called borophene, is also ultra-flexible and transparent. Combined with its conductive properties, borophene’s flexibility and transparency could eventually make it a go-to material for new gadgets, including ultra-thin, foldable touch screens.

    Like graphene, borophene’s structure allows electrons to fly through it. It’s such a good conductor that it’s now being studied as a way to boost energy storage in lithium-ion batteries. Some researchers even think it might be coaxed into superconducting states at relatively high temperatures — though that’s still very cold (initial tests show the effect between minus-415 and minus-425 degrees Fahrenheit). Most current superconductors work close to absolute zero, or nearly minus-460 degrees Fahrenheit. A superconducting material allows electrons to move through it without any resistance, creating the potential for a device that accomplishes robust electronic feats while using only a small amount of power.
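Since the temperatures are quoted in Fahrenheit, a quick conversion to kelvin (the unit used in the superconductivity literature) helps place them: minus-415 degrees Fahrenheit is roughly 25 K, still very cold, but well above the few-kelvin operating points of most conventional superconductors. A one-line converter:

```python
def fahrenheit_to_kelvin(f):
    # K = (F - 32) * 5/9 + 273.15
    return (f - 32) * 5 / 9 + 273.15

for f in (-415, -425, -459.67):  # -459.67 F is absolute zero
    print(f"{f} F = {fahrenheit_to_kelvin(f):.2f} K")
# -415 F comes out near 24.8 K; absolute zero comes out to 0 K
```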

    3
    Emerging 2D materials phosphorene, borophene and boron nitride form thin films. Their atomic arrangements are viewed here from above and in profile. (Credit: Modified from V.K. Sangwan and M.C. Hersam/AR Physical Chemistry 2018)

    In the form of borophene, boron can conduct electrons like a metal. Yet, as part of a 2D-film of boron nitride, it can block the flow of electrons quite effectively. “In other words, 2D boron and [2D] boron nitride are on opposite ends of the electrical conductivity spectrum,” Hersam says.

    Boron nitride’s insulative property has come in handy for research on other 2D materials. Take that ephemeral black phosphorus: One way scientists have managed to keep it stable enough to study is by sandwiching it between two sheets of boron nitride.

    Even as it is blocking electrons, however, boron nitride will allow photons to pass, says physicist Milos Toth of the University of Technology Sydney, who coauthored an article about the potential of boron nitride, and other 2D materials, in the 2019 Annual Review of Physical Chemistry. That’s ideal for creating things called single-photon sources, which can emit a single particle of light at a time and are used in quantum computing, quantum information processing and physics experiments.

    Magnetic Material

    Another atomically thin material creating quite a buzz in materials science circles is a compound of chromium and iodine called chromium triiodide. It’s the first 2D material that naturally generates a magnetic field. Scientists working on chromium triiodide propose the material could eventually find uses in computer memory and storage, as well as in more research-focused purposes such as controlling how an electron spins.

    There’s a hitch, Hersam says: “This material is extremely hard to work with,” because it is both tough to synthesize and unstable once it’s made. Right now the only way to work with it is at extremely low temperatures, at minus-375 degrees Fahrenheit and below. But boron nitride might again come to the rescue: Some chromium triiodide samples have been preserved for months on end inside boron nitride sandwiches.

    Because of its finicky properties, chromium triiodide may not itself end up built into devices, Hight Walker says. “But when we understand the physics of what’s happening, we can go look for this 2D magnetic behavior in other materials.” A number of 2D magnetic materials are now being explored — single-layer manganese crystals woven into an insulating material is one possibility.

    Thin Sandwiches

    Wrangling any of these thin layers into something usable may ultimately depend — literally — on how they stack up. Different super-thin materials would be layered together so that the properties inherent in each material can complement one another. “We have insulators, semiconductors, metals and now magnets,” Hight Walker says. “Those are the pieces that you need to make almost anything you want.”

    One potential application especially exciting to Hight Walker is in quantum computing. Unlike traditional computing, in which bits of information are either ones or zeroes, quantum computing allows each “qubit” of information to be both one and zero at once. In principle, this would allow quantum computers to quickly solve problems that would take an impossibly long time with conventional machines.

    Right now, though, most qubits are made of superconductors that have to be kept freezing cold, limiting their real-world use and motivating the search for new types of superconducting materials. For this reason, researchers are eager to explore borophene’s ability to superconduct. (Graphene, layered a certain way, also has shown potential superconducting properties.)

    But a stacked material involving several superconducting layers separated by strong insulators could enable smaller, more stable qubits that don’t require quite as low temperatures — which could reduce the overall size of quantum computers. Right now, these are room-sized affairs, much like early computers were. Reducing their size is going to require novel approaches and, possibly, very thin materials — layered sheet by little sheet.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 7:48 am on October 28, 2019 Permalink | Reply
    Tags: Controlling spin dynamics — the movement of electron spins — is key to improving the performance of nanomagnet-based applications., Quantum Computing

    From UC Riverside: “Small magnets reveal big secrets” 

    UC Riverside bloc

    From UC Riverside

    October 24, 2019
    Iqbal Pittalwala

    Work by international research team could have wide-ranging impact on information technology applications.

    2

    An international research team led by a physicist at the University of California, Riverside, has identified a microscopic process of electron spin dynamics in nanoparticles that could impact the design of applications in medicine, quantum computation, and spintronics.

    Magnetic nanoparticles and nanodevices have several applications in medicine — such as drug delivery and MRI — and information technology. Controlling spin dynamics — the movement of electron spins — is key to improving the performance of such nanomagnet-based applications.

    “This work advances our understanding of spin dynamics in nanomagnets,” said Igor Barsukov, an assistant professor in the Department of Physics and Astronomy and lead author of the study that appears today in Science Advances.

    1
    Physicist Igor Barsukov is an assistant professor at UC Riverside. (UCR/Barsukov lab)

    Electron spins, which precess like spinning tops, are linked to each other. When one spin begins to precess, the precession propagates to neighboring spins, which sets a wave going. Spin waves, which are thus collective excitations of spins, behave differently in nanoscale magnets than they do in large or extended magnets. In nanomagnets, the spin waves are confined by the size of the magnet, typically around 50 nanometers, and therefore present unusual phenomena.

    In particular, one spin wave can transform into another through a process called “three magnon scattering,” a magnon being a quantum unit of a spin wave. In nanomagnets, this process is resonantly enhanced, meaning it is amplified for specific magnetic fields.

    In collaboration with researchers at UC Irvine and Western Digital in San Jose, as well as theory colleagues in Ukraine and Chile, Barsukov demonstrated how three magnon scattering, and thus the dimensions of nanomagnets, determines how these magnets respond to spin currents. This development could lead to paradigm-shifting advancements.

    “Spintronics is leading the way for faster and energy-efficient information technology,” Barsukov said. “For such technology, nanomagnets are the building blocks, which need to be controlled by spin currents.”

    Barsukov explained that despite its technological importance, a fundamental understanding of energy dissipation in nanomagnets has been elusive. The research team’s work provides insights into the principles of energy dissipation in nanomagnets and could enable engineers who work on spintronics and information technology to build better devices.

    “Microscopic processes explored in our study may also be of significance in the context of quantum computation where researchers currently are attempting to address individual magnons,” Barsukov said. “Our work can potentially impact multiple areas of research.”

    Barsukov was joined in the research by H. K. Lee, A. A. Jara, Y.-J. Chen, A. M. Gonçalves, C. Sha, and I. N. Krivorotov of UC Irvine; J. A. Katine of Western Digital in San Jose; R. E. Arias of the University of Chile in Santiago; and B. A. Ivanov of the National Academy of Sciences of Ukraine and the National University of Science and Technology in Russia.

    The collaborative study was primarily funded by the U.S. Army Research Office, Defense Threat Reduction Agency, and National Science Foundation, or NSF, as well as by agencies in Chile, Brazil, Ukraine, and Russia. Barsukov was funded by the NSF.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    UC Riverside Campus

    The University of California, Riverside is one of 10 universities within the prestigious University of California system, and the only UC located in Inland Southern California.

    Widely recognized as one of the most ethnically diverse research universities in the nation, UCR’s current enrollment is more than 21,000 students, with a goal of 25,000 students by 2020. The campus is in the midst of a tremendous growth spurt with new and remodeled facilities coming on-line on a regular basis.

    We are located approximately 50 miles east of downtown Los Angeles. UCR is also within easy driving distance of dozens of major cultural and recreational sites, as well as desert, mountain and coastal destinations.

     
  • richardmitnick 1:39 pm on October 23, 2019 Permalink | Reply
    Tags: Quantum Computing

    From UC Santa Barbara: “Achieving Quantum Supremacy” 

    UC Santa Barbara Name bloc
    From UC Santa Barbara

    October 23, 2019
    Sonia Fernandez

    1
    Google’s quantum supreme cryostat with Sycamore inside. Photo Credit: Eric Lucero/Google, Inc.

    Researchers in UC Santa Barbara/Google scientist John Martinis’ group have made good on their claim to quantum supremacy. Using 53 entangled quantum bits (“qubits”), their Sycamore computer has taken on — and solved — a problem considered intractable for classical computers.

    “A computation that would take 10,000 years on a classical supercomputer took 200 seconds on our quantum computer,” said Brooks Foxen, a graduate student researcher in the Martinis Group. “It is likely that the classical simulation time, currently estimated at 10,000 years, will be reduced by improved classical hardware and algorithms, but, since we are currently 1.5 billion times faster, we feel comfortable laying claim to this achievement.”
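The speedup figure is simple arithmetic on the two runtimes (using a 365.25-day year):

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600          # ~3.156e7 seconds

classical_seconds = 10_000 * SECONDS_PER_YEAR  # estimated supercomputer runtime
quantum_seconds = 200                          # Sycamore's measured runtime

speedup = classical_seconds / quantum_seconds
print(f"speedup ~ {speedup:.3g}")              # ~1.58e9, about 1.5 billion
```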

    The feat is outlined in a paper in the journal Nature.

    The milestone comes after roughly two decades of quantum computing research conducted by Martinis and his group, from the development of a single superconducting qubit to systems including architectures of 72 and, with Sycamore, 54 qubits (one didn’t perform) that take advantage of the awe-inspiring and bizarre properties of quantum mechanics.

    “The algorithm was chosen to emphasize the strengths of the quantum computer by leveraging the natural dynamics of the device,” said Ben Chiaro, another graduate student researcher in the Martinis Group. That is, the researchers wanted to test the computer’s ability to hold and rapidly manipulate a vast amount of complex, unstructured data.

    “We basically wanted to produce an entangled state involving all of our qubits as quickly as we can,” Foxen said, “and so we settled on a sequence of operations that produced a complicated superposition state that, when measured, returned output (“bitstring”) with a probability determined by the specific sequence of operations used to prepare that particular superposition.” The exercise, which was to verify that the circuit’s output corresponds to the sequence used to prepare the state, sampled the quantum circuit a million times in just a few minutes, exploring all possibilities — before the system could lose its quantum coherence.

    ‘A complex superposition state’

    “We performed a fixed set of operations that entangles 53 qubits into a complex superposition state,” Chiaro explained. “This superposition state encodes the probability distribution. For the quantum computer, preparing this superposition state is accomplished by applying a sequence of tens of control pulses to each qubit in a matter of microseconds. We can prepare and then sample from this distribution by measuring the qubits a million times in 200 seconds.”

    “For classical computers, it is much more difficult to compute the outcome of these operations because it requires computing the probability of being in any one of the 2^53 possible states, where the 53 comes from the number of qubits — the exponential scaling is why people are interested in quantum computing to begin with,” Foxen said. “This is done by matrix multiplication, which is expensive for classical computers as the matrices become large.”
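Foxen’s point can be made concrete with a toy state-vector simulator (a generic NumPy sketch, not the team’s actual code): applying even a single one-qubit gate means updating all 2^n amplitudes, which is the exponential wall at n = 53.

```python
import numpy as np

# Minimal state-vector simulation. Applying a single-qubit gate to an
# n-qubit state updates all 2**n amplitudes, which is why classical
# simulation cost grows exponentially with n (2**53 states for Sycamore).

def apply_single_qubit_gate(state, gate, target, n_qubits):
    # View the 2**n amplitudes as an n-dimensional array (one axis per qubit),
    # contract the target axis with the 2x2 gate, then restore axis order.
    psi = state.reshape([2] * n_qubits)
    psi = np.tensordot(gate, psi, axes=([1], [target]))
    psi = np.moveaxis(psi, 0, target)
    return psi.reshape(-1)

n = 3
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0                                       # start in |000>
hadamard = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
state = apply_single_qubit_gate(state, hadamard, 0, n)
print(np.round(state.real, 3))  # amplitude ~0.707 on |000> and |100>
```

Storage alone is prohibitive: 30 qubits need about 16 GiB of complex128 amplitudes, while 53 qubits need roughly 128 PiB.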

    According to the new paper, the researchers used a method called cross-entropy benchmarking to compare the quantum circuit’s bitstring to its “corresponding ideal probability computed via simulation on a classical computer” to ascertain that the quantum computer was working correctly.
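The check can be sketched numerically. This is the linear variant of cross-entropy benchmarking as described in the literature: F = 2^n · mean(P_ideal(x_i)) − 1 over the device’s sampled bitstrings, giving roughly 0 for fully decohered (uniform) output and a positive value when the sampling tracks the ideal distribution. The 4-outcome distribution below is invented toy data, not experimental results:

```python
import numpy as np

# Linear cross-entropy benchmarking (XEB): score a device by the *ideal*
# probabilities of the bitstrings it actually produced.
#   F = 2**n * mean(P_ideal(x_i)) - 1
# F ~ 0 for uniform (decohered) output; F > 0 when samples follow the ideal.

def linear_xeb_fidelity(sampled_probs, n_qubits):
    # sampled_probs: ideal probabilities (from classical simulation) of the
    # bitstrings the device emitted.
    return (2 ** n_qubits) * np.mean(sampled_probs) - 1.0

n = 2
ideal = np.array([0.4, 0.3, 0.2, 0.1])       # toy ideal output distribution

rng = np.random.default_rng(0)
noisy = rng.integers(0, 2 ** n, 200_000)     # decohered device: uniform bitstrings
good = rng.choice(2 ** n, 200_000, p=ideal)  # faithful device: samples from ideal

print(round(linear_xeb_fidelity(ideal[noisy], n), 2))  # near 0.0
print(round(linear_xeb_fidelity(ideal[good], n), 2))   # near 2**2 * 0.30 - 1 = 0.2
```

In the published experiment the measured fidelities at 53 qubits were small but statistically significant and positive, which is what certified that the processor was sampling the intended distribution rather than noise.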

    “We made a lot of design choices in the development of our processor that are really advantageous,” said Chiaro. Among these advantages, he said, are the ability to experimentally tune the parameters of the individual qubits as well as their interactions.

    While the experiment was chosen as a proof-of-concept for the computer, the research has resulted in a very real and valuable tool: a certified random number generator. Useful in a variety of fields, random numbers can ensure that encrypted keys can’t be guessed, or that a sample from a larger population is truly representative, leading to optimal solutions for complex problems and more robust machine learning applications. The speed with which the quantum circuit can produce its randomized bitstring is so great that there is no time to analyze and “cheat” the system.

    “Quantum mechanical states do things that go beyond our day-to-day experience and so have the potential to provide capabilities and application that would otherwise be unattainable,” commented Joe Incandela, UC Santa Barbara’s vice chancellor for research. “The team has demonstrated the ability to reliably create and repeatedly sample complicated quantum states involving 53 entangled elements to carry out an exercise that would take millennia to do with a classical supercomputer. This is a major accomplishment. We are at the threshold of a new era of knowledge acquisition.”

    Looking ahead

    With an achievement like “quantum supremacy,” it’s tempting to think that the UC Santa Barbara/Google researchers will plant their flag and rest easy. But for Foxen, Chiaro, Martinis and the rest of the UCSB/Google AI Quantum group, this is just the beginning.

    “It’s kind of a continuous improvement mindset,” Foxen said. “There are always projects in the works.” In the near term, further improvements to these “noisy” qubits may enable the simulation of interesting quantum-mechanical phenomena, such as thermalization, and open up the vast space of possibilities in materials science and chemistry.

    In the longer term, the scientists are working to improve coherence times and, at the other end, to detect and correct errors, which would require many additional physical qubits for each qubit being checked. These efforts have been running in parallel with the design and construction of the quantum computer itself, and they ensure the researchers have plenty of work ahead of their next milestone.
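    The scale of that error-correction overhead can be made concrete with a common textbook estimate. This is an assumption-laden sketch, not a figure from the paper: it assumes a distance-d surface code, for which one logical qubit needs roughly d² data qubits plus d² − 1 measurement qubits.

    ```python
    def surface_code_physical_qubits(distance):
        """Rough physical-qubit count for ONE logical qubit in a
        distance-d surface code: d^2 data qubits plus d^2 - 1
        measurement (ancilla) qubits, a standard textbook estimate."""
        return 2 * distance ** 2 - 1

    for d in (3, 5, 11):
        print(f"distance {d}: ~{surface_code_physical_qubits(d)} physical qubits per logical qubit")
    ```

    Even the smallest useful code distances multiply the qubit count by an order of magnitude or more, which is why error correction dwarfs today’s 53-qubit devices.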

    “It’s been an honor and a pleasure to be associated with this team,” Chiaro said. “It’s a great collection of strong technical contributors with great leadership and the whole team really synergizes well.”

    [In spite of the work and paper in Nature, this claim is considered highly doubtful.]

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    UC Santa Barbara Seal
    The University of California, Santa Barbara (commonly referred to as UC Santa Barbara or UCSB) is a public research university and one of the 10 general campuses of the University of California system. Founded in 1891 as an independent teachers’ college, UCSB joined the University of California system in 1944 and is the third-oldest general-education campus in the system. The university is a comprehensive doctoral university and is organized into five colleges offering 87 undergraduate degrees and 55 graduate degrees. In 2012, UCSB was ranked 41st among “National Universities” and 10th among public universities by U.S. News & World Report. UCSB houses twelve national research centers, including the renowned Kavli Institute for Theoretical Physics.

     
  • richardmitnick 9:39 am on October 23, 2019 Permalink | Reply
    Tags: , , Quantum Computing   

    From National Science Foundation: “NSF statement: New development in quantum computing” 

    From National Science Foundation

    In this rendering, a trefoil knot, an iconic topological object, is shown coming out of a tunnel with an image of superconducting qubit chips reflected on its surface. Credit: P. Roushan/Martinis lab/UC Santa Barbara

    October 23, 2019
    Public Affairs, NSF
    (703) 292-7090
    media@nsf.gov

    In “Quantum supremacy using a programmable superconducting processor,” published in the Oct. 24 issue of the journal Nature, a team of researchers led by Google presents evidence that its quantum computer has accomplished a task that existing computers built from silicon chips cannot. When verified, the result will add credence to the broader promise of quantum computing. In addition to funding a broad portfolio of quantum research, including support for other quantum computing systems and approaches, NSF has provided research support to four of the Nature paper’s co-authors: John Martinis of the University of California, Santa Barbara; Fernando Brandao of Caltech; Edward Farhi of the Massachusetts Institute of Technology; and Dave Bacon of the University of Washington.

    Today, Google announced that a quantum computer has accomplished a task not yet possible on a classical device. When verified, this may prove to be a milestone moment, one that builds on more than three decades of continuous NSF investment in the fundamental physics, computer science, materials science, and engineering that underlies many of today’s quantum computing developments — and the researchers behind them — including four of the co-authors who helped create Google’s system. As quantum research continues bridging theory to practice across a range of experimental platforms, it is equally important that NSF, other agencies, and industry invest in the workforce developing quantum technologies and the countless applications that will benefit all of society. Together, we will ensure continuing U.S. leadership in quantum computing.

    See the full article here.


    The National Science Foundation (NSF) is an independent federal agency created by Congress in 1950 “to promote the progress of science; to advance the national health, prosperity, and welfare; and to secure the national defense.” NSF is the funding source for approximately 24 percent of all federally supported basic research conducted by America’s colleges and universities. In many fields such as mathematics, computer science and the social sciences, NSF is the major source of federal backing.

    We fulfill our mission chiefly by issuing limited-term grants — currently about 12,000 new awards per year, with an average duration of three years — to fund specific research proposals that have been judged the most promising by a rigorous and objective merit-review system. Most of these awards go to individuals or small groups of investigators. Others provide funding for research centers, instruments and facilities that allow scientists, engineers and students to work at the outermost frontiers of knowledge.

    NSF’s goals — discovery, learning, research infrastructure and stewardship — provide an integrated strategy to advance the frontiers of knowledge, cultivate a world-class, broadly inclusive science and engineering workforce and expand the scientific literacy of all citizens, build the nation’s research capability through investments in advanced instrumentation and facilities, and support excellence in science and engineering research and education through a capable and responsive organization. We like to say that NSF is “where discoveries begin.”

    Many of the discoveries and technological advances have been truly revolutionary. In the past few decades, NSF-funded researchers have won some 236 Nobel Prizes as well as other honors too numerous to list. These pioneers have included the scientists or teams that discovered many of the fundamental particles of matter, analyzed the cosmic microwaves left over from the earliest epoch of the universe, developed carbon-14 dating of ancient artifacts, decoded the genetics of viruses, and created an entirely new state of matter called a Bose-Einstein condensate.

    NSF also funds equipment that is needed by scientists and engineers but is often too expensive for any one group or researcher to afford. Examples of such major research equipment include giant optical and radio telescopes, Antarctic research sites, high-end computer facilities and ultra-high-speed connections, ships for ocean research, sensitive detectors of very subtle physical phenomena and gravitational wave observatories.

    Another essential element in NSF’s mission is support for science and engineering education, from pre-K through graduate school and beyond. The research we fund is thoroughly integrated with education to help ensure that there will always be plenty of skilled people available to work in new and emerging scientific, engineering and technological fields, and plenty of capable teachers to educate the next generation.

    No single factor is more important to the intellectual and economic progress of society, and to the enhanced well-being of its citizens, than the continuous acquisition of new knowledge. NSF is proud to be a major part of that process.

    Specifically, the Foundation’s organic legislation authorizes us to engage in the following activities:

    Initiate and support, through grants and contracts, scientific and engineering research and programs to strengthen scientific and engineering research potential, and education programs at all levels, and appraise the impact of research upon industrial development and the general welfare.
    Award graduate fellowships in the sciences and in engineering.
    Foster the interchange of scientific information among scientists and engineers in the United States and foreign countries.
    Foster and support the development and use of computers and other scientific methods and technologies, primarily for research and education in the sciences.
    Evaluate the status and needs of the various sciences and engineering and take into consideration the results of this evaluation in correlating our research and educational programs with other federal and non-federal programs.
    Provide a central clearinghouse for the collection, interpretation and analysis of data on scientific and technical resources in the United States, and provide a source of information for policy formulation by other federal agencies.
    Determine the total amount of federal money received by universities and appropriate organizations for the conduct of scientific and engineering research, including both basic and applied, and construction of facilities where such research is conducted, but excluding development, and report annually thereon to the President and the Congress.
    Initiate and support specific scientific and engineering activities in connection with matters relating to international cooperation, national security and the effects of scientific and technological applications upon society.
    Initiate and support scientific and engineering research, including applied research, at academic and other nonprofit institutions and, at the direction of the President, support applied research at other organizations.
    Recommend and encourage the pursuit of national policies for the promotion of basic research and education in the sciences and engineering. Strengthen research and education innovation in the sciences and engineering, including independent research by individuals, throughout the United States.
    Support activities designed to increase the participation of women and minorities and others underrepresented in science and technology.

    At present, NSF has a total workforce of about 2,100 at its Alexandria, VA, headquarters, including approximately 1,400 career employees, 200 scientists from research institutions on temporary duty, 450 contract workers and the staff of the NSB office and the Office of the Inspector General.

    NSF is divided into the following seven directorates that support science and engineering research and education: Biological Sciences, Computer and Information Science and Engineering, Engineering, Geosciences, Mathematical and Physical Sciences, Social, Behavioral and Economic Sciences, and Education and Human Resources. Each is headed by an assistant director and each is further subdivided into divisions like materials research, ocean sciences and behavioral and cognitive sciences.

    Within NSF’s Office of the Director, the Office of Integrative Activities also supports research and researchers. Other sections of NSF are devoted to financial management, award processing and monitoring, legal affairs, outreach and other functions. The Office of the Inspector General examines the foundation’s work and reports to the NSB and Congress.

    Each year, NSF supports an average of about 200,000 scientists, engineers, educators and students at universities, laboratories and field sites all over the United States and throughout the world, from Alaska to Alabama to Africa to Antarctica. You could say that NSF support goes “to the ends of the earth” to learn more about the planet and its inhabitants, and to produce fundamental discoveries that further the progress of research and lead to products and services that boost the economy and improve general health and well-being.

    As described in our strategic plan, NSF is the only federal agency whose mission includes support for all fields of fundamental science and engineering, except for medical sciences. NSF is tasked with keeping the United States at the leading edge of discovery in a wide range of scientific areas, from astronomy to geology to zoology. So, in addition to funding research in the traditional academic areas, the agency also supports “high risk, high pay off” ideas, novel collaborations and numerous projects that may seem like science fiction today, but which the public will take for granted tomorrow. And in every case, we ensure that research is fully integrated with education so that today’s revolutionary work will also be training tomorrow’s top scientists and engineers.

    Unlike many other federal agencies, NSF does not hire researchers or directly operate our own laboratories or similar facilities. Instead, we support scientists, engineers and educators directly through their own home institutions (typically universities and colleges). Similarly, we fund facilities and equipment such as telescopes, through cooperative agreements with research consortia that have competed successfully for limited-term management contracts.

    NSF’s job is to determine where the frontiers are, identify the leading U.S. pioneers in these fields and provide money and equipment to help them continue. The results can be transformative. For example, years before most people had heard of “nanotechnology,” NSF was supporting scientists and engineers who were learning how to detect, record and manipulate activity at the scale of individual atoms — the nanoscale. Today, scientists are adept at moving atoms around to create devices and materials with properties that are often more useful than those found in nature.

    Dozens of companies are gearing up to produce nanoscale products. NSF is funding the research projects, state-of-the-art facilities and educational opportunities that will teach new skills to the science and engineering students who will make up the nanotechnology workforce of tomorrow.

    At the same time, we are looking for the next frontier.

    NSF’s task of identifying and funding work at the frontiers of science and engineering is not a “top-down” process. NSF operates from the “bottom up,” keeping close track of research around the United States and the world, maintaining constant contact with the research community to identify ever-moving horizons of inquiry, monitoring which areas are most likely to result in spectacular progress and choosing the most promising people to conduct the research.

    NSF funds research and education in most fields of science and engineering. We do this through grants and cooperative agreements to more than 2,000 colleges, universities, K-12 school systems, businesses, informal science organizations and other research organizations throughout the U.S. The Foundation considers proposals submitted by organizations on behalf of individuals or groups for support in most fields of research. Interdisciplinary proposals also are eligible for consideration. Awardees are chosen from those who send us proposals asking for a specific amount of support for a specific project.

    Proposals may be submitted in response to the various funding opportunities that are announced on the NSF website. These funding opportunities fall into three categories — program descriptions, program announcements and program solicitations — and are the mechanisms NSF uses to generate funding requests. At any time, scientists and engineers are also welcome to send in unsolicited proposals for research and education projects, in any existing or emerging field. The Proposal and Award Policies and Procedures Guide (PAPPG) provides guidance on proposal preparation and submission and award management. At present, NSF receives more than 42,000 proposals per year.

    To ensure that proposals are evaluated in a fair, competitive, transparent and in-depth manner, we use a rigorous system of merit review. Nearly every proposal is evaluated by a minimum of three independent reviewers consisting of scientists, engineers and educators who do not work at NSF or for the institution that employs the proposing researchers. NSF selects the reviewers from among the national pool of experts in each field and their evaluations are confidential. On average, approximately 40,000 experts, knowledgeable about the current state of their field, give their time to serve as reviewers each year.

    The reviewer’s job is to decide which projects are of the very highest caliber. NSF’s merit review process, considered by some to be the “gold standard” of scientific review, ensures that many voices are heard and that only the best projects make it to the funding stage. An enormous amount of research, deliberation, thought and discussion goes into award decisions.

    The NSF program officer reviews the proposal and analyzes the input received from the external reviewers. After scientific, technical and programmatic review and consideration of appropriate factors, the program officer makes an “award” or “decline” recommendation to the division director. Final programmatic approval for a proposal is generally completed at NSF’s division level. A principal investigator (PI) whose proposal for NSF support has been declined will receive information and an explanation of the reason(s) for declination, along with copies of the reviews considered in making the decision. If that explanation does not satisfy the PI, he/she may request additional information from the cognizant NSF program officer or division director.

    If the program officer makes an award recommendation and the division director concurs, the recommendation is submitted to NSF’s Division of Grants and Agreements (DGA) for award processing. A DGA officer reviews the recommendation from the program division/office for business, financial and policy implications, and the processing and issuance of a grant or cooperative agreement. DGA generally makes awards to academic institutions within 30 days after the program division/office makes its recommendation.

     