Tagged: Superposition

  • richardmitnick 12:03 pm on May 27, 2022
    Tags: "Constructor theory", "Maxwell’s demon", "Physicists Rewrite the Fundamental Law That Leads to Disorder", Hilbert’s Problem, Quantum information theory, Quantum resource theories, Superposition, The informational perspective on the second law is now being recast as a quantum problem., The Second Law of Thermodynamics, The universe began — for reasons not fully understood or agreed on — in a low-entropy state and is heading toward one of ever higher entropy.

    From “Quanta Magazine”: “Physicists Rewrite the Fundamental Law That Leads to Disorder” 

    May 26, 2022
    Philip Ball

    Is the rise of entropy merely probabilistic, or can it be straightened out by use of clear quantum axioms? Maggie Chiang for Quanta Magazine

    The Second Law of Thermodynamics is among the most sacred in all of science, but it has always rested on 19th-century arguments about probability. New arguments trace its true source to the flows of quantum information.

    In all of physical law, there’s arguably no principle more sacrosanct than the Second Law of Thermodynamics — the notion that entropy, a measure of disorder, will always stay the same or increase. “If someone points out to you that your pet theory of the universe is in disagreement with Maxwell’s equations — then so much the worse for Maxwell’s equations,” wrote the British astrophysicist Arthur Eddington in his 1928 book The Nature of the Physical World. “If it is found to be contradicted by observation — well, these experimentalists do bungle things sometimes. But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation.” No violation of this law has ever been observed, nor is any expected.

    But something about the second law troubles physicists. Some are not convinced that we understand it properly or that its foundations are firm. Although it’s called a law, it’s usually regarded as merely probabilistic: It stipulates that the outcome of any process will be the most probable one (which effectively means the outcome is inevitable given the numbers involved).

    Yet physicists don’t just want descriptions of what will probably happen. “We like laws of physics to be exact,” said the physicist Chiara Marletto of the University of Oxford. Can the second law be tightened up into more than just a statement of likelihoods?

    A number of independent groups appear to have done just that. They may have woven the second law out of the fundamental principles of quantum mechanics — which, some suspect, have directionality and irreversibility built into them at the deepest level. According to this view, the second law comes about not because of classical probabilities but because of quantum effects such as entanglement. It arises from the ways in which quantum systems share information, and from cornerstone quantum principles that decree what is allowed to happen and what is not. In this telling, an increase in entropy is not just the most likely outcome of change. It is a logical consequence of the most fundamental resource that we know of — the quantum resource of information.

    Quantum Inevitability

    Thermodynamics was conceived in the early 19th century to describe the flow of heat and the production of work. The need for such a theory was urgently felt as steam power drove the Industrial Revolution, and engineers wanted to make their devices as efficient as possible.

    In the end, thermodynamics wasn’t much help in making better engines and machinery. Instead, it became one of the central pillars of modern physics, providing criteria that govern all processes of change.

    Classical thermodynamics has only a handful of laws, of which the most fundamental are the first and second. The first says that energy is always conserved; the second says that heat always flows from hot to cold. More commonly this is expressed in terms of entropy, which must increase overall in any process of change. Entropy is loosely equated with disorder, but the Austrian physicist Ludwig Boltzmann formulated it more rigorously as a quantity related to the total number of microstates a system has: how many equivalent ways its particles can be arranged.
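    Boltzmann's formulation can be made concrete in a few lines. The sketch below is purely illustrative (the toy system of two-level particles is my own choice, not from the article): it computes S = k_B ln W, where W is the number of equivalent arrangements.

```python
import math

def boltzmann_entropy(num_microstates):
    """Boltzmann entropy S = k_B * ln(W), in joules per kelvin."""
    k_B = 1.380649e-23  # Boltzmann constant, J/K
    return k_B * math.log(num_microstates)

# Toy system: N two-level particles, n of them excited.
# The number of equivalent arrangements is the binomial coefficient C(N, n).
N, n = 100, 50
W = math.comb(N, n)
S = boltzmann_entropy(W)
print(f"W = {W:.3e} microstates, S = {S:.3e} J/K")
```

    Note that the half-and-half arrangement maximizes W, which is why it is the equilibrium macrostate in this toy model.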

    The second law appears to show why change happens in the first place. At the level of individual particles, the classical laws of motion can be reversed in time. But the second law implies that change must happen in a way that increases entropy. This directionality is widely considered to impose an arrow of time. In this view, time seems to flow from past to future because the universe began — for reasons not fully understood or agreed on — in a low-entropy state and is heading toward one of ever higher entropy. The implication is that eventually heat will be spread completely uniformly and there will be no driving force for further change — a depressing prospect that scientists of the mid-19th century called the heat death of the universe.

    Boltzmann’s microscopic description of entropy seems to explain this directionality. Many-particle systems that are more disordered and have higher entropy vastly outnumber ordered, lower-entropy states, so molecular interactions are much more likely to end up producing them. The second law seems then to be just about statistics: It’s a law of large numbers. In this view, there’s no fundamental reason why entropy can’t decrease — why, for example, all the air molecules in your room can’t congregate by chance in one corner. It’s just extremely unlikely.
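    "Extremely unlikely" can be quantified. In a rough sketch (the molecule count below is an arbitrary illustrative figure), each molecule independently has probability 1/2 of being in one half of the room, so the chance that all N are there simultaneously is (1/2)^N:

```python
import math

# Chance that all N independent molecules sit in one half of the room: (1/2)**N.
N = 1e22  # illustrative molecule count for a small volume of air
log10_p = -N * math.log10(2)
print(f"P is roughly 10 to the power {log10_p:.2e}")
```

    An exponent around minus three sextillion: allowed in principle, never seen in practice.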

    Yet this probabilistic statistical physics leaves some questions hanging. It directs us toward the most probable microstates in a whole ensemble of possible states and forces us to be content with taking averages across that ensemble.

    But the laws of classical physics are deterministic — they allow only a single outcome for any starting point. Where, then, can that hypothetical ensemble of states enter the picture at all, if only one outcome is ever possible?

    David Deutsch, a physicist at Oxford, has for several years been seeking to avoid this dilemma by developing a theory of (as he puts it) “a world in which probability and randomness are totally absent from physical processes.” His project, on which Marletto is now collaborating, is called “Constructor theory”. It aims to establish not just which processes probably can and can’t happen, but which are possible and which are forbidden outright.

    Constructor theory aims to express all of physics in terms of statements about possible and impossible transformations. It echoes the way thermodynamics itself began, in that it considers change in the world as something produced by “machines” (constructors) that work in a cyclic fashion, following a pattern like that of the famous Carnot cycle, proposed in the 19th century to describe how engines perform work. The constructor is rather like a catalyst, facilitating a process and being returned to its original state at the end.

    “Say you have a transformation like building a house out of bricks,” said Marletto. “You can think of a number of different machines that can achieve this, to different accuracies. All of these machines are constructors, working in a cycle” — they return to their original state when the house is built.

    But just because a machine for conducting a certain task might exist, that doesn’t mean it can also undo the task. A machine for building a house might not be capable of dismantling it. This makes the operation of the constructor different from the operation of the dynamical laws of motion describing the movements of the bricks, which are reversible.

    The reason for the irreversibility, said Marletto, is that for most complex tasks, a constructor is geared to a given environment. It requires some specific information from the environment relevant to completing that task. But the reverse task will begin with a different environment, so the same constructor won’t necessarily work. “The machine is specific to the environment it is working on,” she said.

    Recently, Marletto, working with the quantum theorist Vlatko Vedral at Oxford and colleagues in Italy, showed that constructor theory does identify processes that are irreversible in this sense — even though everything happens according to quantum mechanical laws that are themselves perfectly reversible. “We show that there are some transformations for which you can find a constructor for one direction but not the other,” she said.

    The researchers considered a transformation involving the states of quantum bits (qubits), which can exist in one of two states or in a combination, or superposition, of both. In their model, a single qubit B may be transformed from some initial, perfectly known state B1 to a target state B2 when it interacts with other qubits by moving past a row of them one qubit at a time. This interaction entangles the qubits: Their properties become interdependent, so that you can’t fully characterize one of the qubits unless you look at all the others too.

    As the number of qubits in the row gets very large, it becomes possible to bring B into state B2 as accurately as you like, said Marletto. The process of sequential interactions of B with the row of qubits constitutes a constructor-like machine that transforms B1 to B2. In principle you can also undo the process, turning B2 back to B1, by sending B back along the row.

    But what if, having done the transformation once, you try to reuse the array of qubits for the same process with a fresh B? Marletto and colleagues showed that if the number of qubits in the row is not very large and you use the same row repeatedly, the array becomes less and less able to produce the transformation from B1 to B2. But crucially, the theory also predicts that the row becomes even less able to do the reverse transformation from B2 to B1. The researchers have confirmed this prediction experimentally using photons for B and a fiber optic circuit to simulate a row of three qubits.

    “You can approximate the constructor arbitrarily well in one direction but not the other,” Marletto said. There’s an asymmetry to the transformation, just like the one imposed by the second law. This is because the transformation takes the system from a so-called pure quantum state (B1) to a mixed one (B2, which is entangled with the row). A pure state is one for which we know all there is to be known about it. But when two objects are entangled, you can’t fully specify one of them without knowing everything about the other too. The fact is that it’s easier to go from a pure quantum state to a mixed state than vice versa — because the information in the pure state gets spread out by entanglement and is hard to recover. It’s comparable to trying to re-form a droplet of ink once it has dispersed in water, a process in which the irreversibility is imposed by the second law.
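    The pure-to-mixed transition can be seen in a minimal two-qubit simulation (a generic NumPy illustration, not the authors' actual protocol): entangling B with a single environment qubit and tracing the environment out leaves B with purity Tr(ρ²) = 1/2 instead of 1.

```python
import numpy as np

# Qubit B starts pure; a CNOT entangles it with an environment qubit in |0>.
plus = np.array([1.0, 1.0]) / np.sqrt(2)   # B in (|0> + |1>)/sqrt(2)
zero = np.array([1.0, 0.0])
joint = np.kron(plus, zero)                # uncorrelated joint state

CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)
entangled = CNOT @ joint                   # (|00> + |11>)/sqrt(2)

# Partial trace over the environment leaves B alone in a mixed state.
rho = np.outer(entangled, entangled)
rho_B = rho.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)

purity = np.trace(rho_B @ rho_B)
print(purity)  # 0.5: maximally mixed, down from 1.0 for the pure state
```

    The joint evolution (a single CNOT) is perfectly reversible, yet the state of B alone has irreversibly lost information to the correlations, which is exactly the asymmetry described above.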

    So here the irreversibility is “just a consequence of the way the system dynamically evolves,” said Marletto. There’s no statistical aspect to it. Irreversibility is not just the most probable outcome but the inevitable one, governed by the quantum interactions of the components. “Our conjecture,” said Marletto, “is that thermodynamic irreversibility might stem from this.”

    Demon in the Machine

    There’s another way of thinking about the second law, though, that was first devised by James Clerk Maxwell, the Scottish scientist who pioneered the statistical view of thermodynamics along with Boltzmann. Without quite realizing it, Maxwell connected the thermodynamic law to the issue of information.

    Maxwell was troubled by the theological implications of a cosmic heat death and of an inexorable rule of change that seemed to undermine free will. So in 1867 he sought a way to “pick a hole” in the second law. In his hypothetical scenario, a microscopic being (later, to his annoyance, called a demon) turns “useless” heat back into a resource for doing work. Maxwell had previously shown that in a gas at thermal equilibrium there is a distribution of molecular energies. Some molecules are “hotter” than others — they are moving faster and have more energy. But they are all mixed at random so there appears to be no way to make use of those differences.

    Enter Maxwell’s demon. It divides the compartment of gas in two, then installs a frictionless trapdoor between them. The demon lets the hot molecules moving about the compartments pass through the trapdoor in one direction but not the other. Eventually the demon has a hot gas on one side and a cooler one on the other, and it can exploit the temperature gradient to drive some machine.

    The demon has used information about the motions of molecules to apparently undermine the second law. Information is thus a resource that, just like a barrel of oil, can be used to do work. But as this information is hidden from us at the macroscopic scale, we can’t exploit it. It’s this ignorance of the microstates that compels classical thermodynamics to speak of averages and ensembles.

    Almost a century later, physicists proved that Maxwell’s demon doesn’t subvert the second law in the long term, because the information it gathers must be stored somewhere, and any finite memory must eventually be wiped to make room for more. In 1961 the physicist Rolf Landauer showed that this erasure of information can never be accomplished without dissipating some minimal amount of heat, thus raising the entropy of the surroundings. So the second law is only postponed, not broken.
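    Landauer's bound is tiny but nonzero, and easy to evaluate at room temperature:

```python
import math

# Landauer's bound: erasing one bit dissipates at least k_B * T * ln(2) of heat.
k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature, K
E_min = k_B * T * math.log(2)
print(f"Minimum heat per erased bit at {T} K: {E_min:.3e} J")  # ~2.87e-21 J
```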

    The informational perspective on the second law is now being recast as a quantum problem. That’s partly because of the perception that quantum mechanics is a more fundamental description — Maxwell’s demon treats the gas particles as classical billiard balls, essentially. But it also reflects the burgeoning interest in quantum information theory itself. We can do things with information using quantum principles that we can’t do classically. In particular, entanglement of particles enables information about them to be spread around and manipulated in nonclassical ways.

    Crucially, the quantum informational approach suggests a way of getting rid of the troublesome statistical picture that bedevils the classical view of thermodynamics, where you have to take averages over ensembles of many different microstates. “The true novelty with quantum information came with the understanding that one can replace ensembles with entanglement with the environment,” said Carlo Maria Scandolo of the University of Calgary.

    Taking recourse in an ensemble, he said, reflects the fact that we have only partial information about the state — it could be this microstate or that one, with different probabilities, and so we have to average over a probability distribution. But quantum theory offers another way to generate states of partial information: through entanglement. When a quantum system gets entangled with its environment, about which we can’t know everything, some information about the system itself is inevitably lost: It ends up in a mixed state, where you can’t know everything about it even in principle by focusing on just the system.

    Then you are forced to speak in terms of probabilities not because there are things about the system you don’t know, but because some of that information is fundamentally unknowable. In this way, “probabilities arise naturally from entanglement,” said Scandolo. “The whole idea of getting thermodynamic behavior by considering the role of the environment works only as long as there is entanglement.”

    Those ideas have now been made precise. Working with Giulio Chiribella of the University of Hong Kong, Scandolo has proposed four axioms about quantum information that are required to obtain a “sensible thermodynamics” — that is, one not based on probabilities. The axioms describe constraints on the information in a quantum system that becomes entangled with its environment. In particular, everything that happens to the system plus environment is in principle reversible, just as is implied by the standard mathematical formulation of how a quantum system evolves in time.

    As a consequence of these axioms, Scandolo and Chiribella show, uncorrelated systems always grow more correlated through reversible interactions. Correlations are what connect entangled objects: The properties of one are correlated with those of the other. They are measured by “mutual information,” a quantity that’s related to entropy. So a constraint on how correlations can change is also a constraint on entropy. If the entropy of the system decreases, the entropy of the environment must increase such that the sum of the two entropies can only increase or stay the same, but never decrease. In this way, Scandolo said, their approach derives the existence of entropy from the underlying axioms, rather than postulating it at the outset.
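    Mutual information's link to entropy can be checked directly for a maximally entangled pair (a standard textbook computation, not taken from Scandolo and Chiribella's paper): each qubit alone carries one bit of entropy, the joint state carries none, so I(A:B) = S(A) + S(B) - S(AB) = 2 bits.

```python
import numpy as np

def von_neumann_entropy(rho):
    """Entropy in bits: S = -Tr(rho log2 rho), skipping zero eigenvalues."""
    eigenvalues = np.linalg.eigvalsh(rho)
    eigenvalues = eigenvalues[eigenvalues > 1e-12]
    return float(-np.sum(eigenvalues * np.log2(eigenvalues)))

# Bell state (|00> + |11>)/sqrt(2): maximally entangled two-qubit state.
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho_AB = np.outer(bell, bell)
rho_A = rho_AB.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)  # trace out B
rho_B = rho_AB.reshape(2, 2, 2, 2).trace(axis1=0, axis2=2)  # trace out A

# Mutual information I(A:B) = S(A) + S(B) - S(AB).
I = von_neumann_entropy(rho_A) + von_neumann_entropy(rho_B) - von_neumann_entropy(rho_AB)
print(I)  # 2.0 bits
```

    Two bits of mutual information for two qubits is twice the classical maximum, a distinctly quantum excess of correlation.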

    Redefining Thermodynamics

    One of the most versatile ways to understand this new quantum version of thermodynamics invokes so-called resource theories — which again speak about which transformations are possible and which are not. “A resource theory is a simple model for any situation in which the actions you can perform and the systems you can access are restricted for some reason,” said the physicist Nicole Yunger Halpern of the National Institute of Standards and Technology. (Scandolo has incorporated resource theories into his work too.)

    Quantum resource theories adopt the picture of the physical world suggested by quantum information theory, in which there are fundamental limitations on which physical processes are possible. In quantum information theory these limitations are typically expressed as “no-go theorems”: statements that say “You can’t do that!” For example, it is fundamentally impossible to make a copy of an unknown quantum state, an idea called quantum no-cloning.

    Resource theories have a few main ingredients. The operations that are allowed are called free operations. “Once you specify the free operations, you have defined the theory — and then you can start reasoning about which transformations are possible or not, and ask what are the optimal efficiencies with which we can perform these tasks,” said Yunger Halpern. A resource, meanwhile, is something that an agent can access to do something useful — it could be a pile of coal to fire up a furnace and power a steam engine. Or it could be extra memory that will allow a Maxwellian demon to subvert the second law for a little longer.

    Quantum resource theories allow a kind of zooming in on the fine-grained details of the classical second law. We don’t need to think about huge numbers of particles; we can make statements about what is allowed among just a few of them. When we do this, said Yunger Halpern, it becomes clear that the classical second law (final entropy must be equal to or greater than initial entropy) is just a kind of coarse-grained sum of a whole family of inequality relationships. For instance, classically the second law says that you can transform a nonequilibrium state into one that is closer to thermal equilibrium. But “asking which of these states is closer to thermal is not a simple question,” said Yunger Halpern. To answer it, “we have to check a whole bunch of inequalities.”
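    One well-known member of such a family of inequalities is majorization (used here as a generic illustration of the idea, not a claim about Yunger Halpern's specific results): a distribution p can be degraded into q by mixing operations only if every partial sum of p's sorted probabilities dominates the corresponding partial sum of q's.

```python
import numpy as np

def majorizes(p, q):
    """Check whether distribution p majorizes q: each partial sum of p's
    probabilities, sorted in decreasing order, dominates q's."""
    p_sorted = np.sort(p)[::-1]
    q_sorted = np.sort(q)[::-1]
    return bool(np.all(np.cumsum(p_sorted) >= np.cumsum(q_sorted) - 1e-12))

sharp = np.array([0.7, 0.2, 0.1])     # lower-entropy, far from equilibrium
flat = np.array([0.4, 0.35, 0.25])    # closer to uniform (equilibrium)
print(majorizes(sharp, flat), majorizes(flat, sharp))  # True False
```

    One number (entropy) is replaced by a whole set of partial-sum checks, which is why a single coarse-grained second law can hide finer-grained restrictions.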

    In other words, in resource theories there seem to be a whole bunch of mini-second laws. “So there could be some transformations allowed by the conventional second law but forbidden by this more detailed family of inequalities,” said Yunger Halpern. For that reason, she adds, “sometimes I feel like everyone [in this field] has their own second law.”

    The resource-theory approach, said physicist Markus Müller of the University of Vienna, “admits a fully mathematically rigorous derivation, without any conceptual or mathematical loose ends, of the thermodynamic laws and more.” He said that this approach involves “a reconsideration of what one really means by thermodynamics” — it is not so much about the average properties of large ensembles of moving particles, but about a game that an agent plays against nature to conduct a task efficiently with the available resources. In the end, though, it is still about information. The discarding of information — or the inability to keep track of it — is really the reason why the second law holds, Yunger Halpern said.

    Hilbert’s Problem

    All these efforts to rebuild thermodynamics and the second law recall a challenge laid down by the German mathematician David Hilbert. In 1900 he posed 23 outstanding problems in mathematics that he wanted to see solved. Item six in that list was “to treat, by means of axioms, those physical sciences in which already today mathematics plays an important part.” Hilbert was concerned that the physics of his day seemed to rest on rather arbitrary assumptions, and he wanted to see them made rigorous in the same way that mathematicians were attempting to derive fundamental axioms for their own discipline.

    Some physicists today are still working on Hilbert’s sixth problem, attempting in particular to reformulate quantum mechanics and its more abstract version, quantum field theory, using axioms that are simpler and more physically transparent than the traditional ones. But Hilbert evidently had thermodynamics in mind too, referring to aspects of physics that use “the theory of probabilities” as among those ripe for reinvention.

    Whether Hilbert’s sixth problem has yet been cracked for the second law seems to be a matter of taste. “I think Hilbert’s sixth problem is far from being completely solved, and I personally find it a very intriguing and important research direction in the foundations of physics,” said Scandolo. “There are still open problems, but I think they will be solved in the foreseeable future, provided enough time and energy are devoted to them.”

    Maybe, though, the real value of re-deriving the second law lies not in satisfying Hilbert’s ghost but just in deepening our understanding of the law itself. As Einstein said, “A theory is the more impressive the greater the simplicity of its premises.” Yunger Halpern compares the motivation for working on the law to the reason literary scholars still reanalyze the plays and poems of Shakespeare: not because such new analysis is “more correct,” but because works this profound are an endless source of inspiration and insight.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Formerly known as Simons Science News, Quanta Magazine is an editorially independent online publication launched by the Simons Foundation to enhance public understanding of science. Why Quanta? Albert Einstein called photons “quanta of light.” Our goal is to “illuminate science.” At Quanta Magazine, scientific accuracy is every bit as important as telling a good story. All of our articles are meticulously researched, reported, edited, copy-edited and fact-checked.

  • richardmitnick 10:00 pm on April 5, 2021
    Tags: "New computing algorithms expand the boundaries of a quantum future", New amplification algorithms expand the utility of quantum computers to handle non-Boolean scenarios., Qubits can be in a superposition of 0 and 1 while classical bits can be only one or the other., Scientists developed an algorithm 25 years ago that will perform a series of operations on a superposition to amplify the probabilities of certain individual states and suppress others., Standard techniques are able to assess only Boolean scenarios: ones that can be answered with a yes or no output., Superposition

    From DOE’s Fermi National Accelerator Laboratory(US): “New computing algorithms expand the boundaries of a quantum future” 

    FNAL Art Image by Angela Gonzales

    From DOE’s Fermi National Accelerator Laboratory(US), an enduring source of strength for the US contribution to scientific research worldwide.

    April 5, 2021
    Katrina Miller

    Quantum computing promises to harness the strange properties of quantum mechanics in machines that will outperform even the most powerful supercomputers of today. But the extent of their application, it turns out, isn’t entirely clear.

    To fully realize the potential of quantum computing, scientists must start with the basics: developing step-by-step procedures, or algorithms, for quantum computers to perform simple tasks, like the factoring of a number. These simple algorithms can then be used as building blocks for more complicated calculations.

    Prasanth Shyamsundar, a postdoctoral research associate at the Department of Energy’s Fermilab Quantum Institute (US), has done just that. In a preprint paper released in February [Non-Boolean Quantum Amplitude Amplification and Quantum Mean Estimation], he announced two new algorithms that build upon existing work in the field to further diversify the types of problems quantum computers can solve.

    “There are specific tasks that can be done faster using quantum computers, and I’m interested in understanding what those are,” Shyamsundar said. “These new algorithms perform generic tasks, and I am hoping they will inspire people to design even more algorithms around them.”

    Shyamsundar’s quantum algorithms, in particular, are useful when searching for a specific entry in an unsorted collection of data. Consider a toy example: Suppose we have a stack of 100 vinyl records, and we task a computer with finding the one jazz album in the stack.

    Classically, a computer would need to examine each individual record and make a yes-or-no decision about whether it is the album we are searching for, based on a given set of search criteria.

    “You have a query, and the computer gives you an output,” Shyamsundar said. “In this case, the query is: Does this record satisfy my set of criteria? And the output is yes or no.”

    Finding the record in question could take only a few queries if it is near the top of the stack, or closer to 100 queries if the record is near the bottom. On average, a classical computer would locate the correct record with 50 queries, or half the total number in the stack.

    A quantum computer, on the other hand, would locate the jazz album much faster. This is because it has the ability to analyze all of the records at once, using a quantum effect called superposition.

    With this property, the number of queries needed to locate the jazz album is only about 10, the square root of the number of records in the stack. This phenomenon is known as quantum speedup and is a result of the unique way quantum computers store information.
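    The two scalings can be compared with a quick simulation of the classical case (a toy sketch; record positions are drawn at random):

```python
import random

def classical_queries(num_records, trials=10000, seed=0):
    """Average number of records a classical search examines one by one."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        target = rng.randrange(num_records)   # position of the jazz album
        total += target + 1                   # queries needed for this position
    return total / trials

N = 100
avg = classical_queries(N)
grover = round(N ** 0.5)  # Grover-style search scales as sqrt(N) queries
print(f"classical average about {avg:.1f} queries, quantum about {grover}")
```

    For a mere hundred records the gap is 50 versus 10; for a billion records it would be 500 million versus roughly 32,000.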

    The quantum advantage

    Classical computers use units of storage called bits to save and analyze data. A bit can be assigned one of two values: 0 or 1.

    The quantum version of this is called a qubit. Qubits can be either 0 or 1 as well, but unlike their classical counterparts, they can also be a combination of both values at the same time. This is known as superposition, and allows quantum computers to assess multiple records, or states, simultaneously.
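    In a statevector picture this is just a normalized vector of amplitudes: one qubit holds two amplitudes, and n qubits in superposition hold 2^n at once. A minimal NumPy sketch:

```python
import numpy as np

# One qubit: two amplitudes. n qubits in superposition: 2**n amplitudes at once.
plus = np.array([1.0, 1.0]) / np.sqrt(2)    # equal superposition of 0 and 1
state = plus
for _ in range(2):                           # build a 3-qubit register
    state = np.kron(state, plus)
probabilities = np.abs(state) ** 2
print(len(state), probabilities)             # 8 basis states, each with p = 1/8
```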

    Qubits can be in a superposition of 0 and 1 while classical bits can be only one or the other. Credit: Jerald Pinson.

    Amplifying the probabilities of correct states

    Luckily, scientists developed an algorithm nearly 25 years ago that will perform a series of operations on a superposition to amplify the probabilities of certain individual states and suppress others, depending on a given set of search criteria. That means when it comes time to measure, the superposition will most likely collapse into the state they are searching for.
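    The algorithm alluded to here is Grover's amplitude amplification, and its core loop, a sign flip on the marked state followed by a reflection of all amplitudes about their mean, fits in a few lines of statevector arithmetic (a toy simulation, not a circuit implementation):

```python
import numpy as np

# Toy statevector Grover search over N = 16 items, one of which is "good".
N, marked = 16, 3
state = np.full(N, 1 / np.sqrt(N))          # uniform superposition

def oracle(s):
    s = s.copy()
    s[marked] *= -1                          # flip the sign of the marked state
    return s

def diffusion(s):
    return 2 * s.mean() - s                  # reflect amplitudes about their mean

iterations = int(round(np.pi / 4 * np.sqrt(N)))  # ~3 iterations for N = 16
for _ in range(iterations):
    state = diffusion(oracle(state))

print(np.argmax(np.abs(state) ** 2))  # 3: the marked state now dominates
probability = abs(state[marked]) ** 2
```

    After about (π/4)√N rounds the marked state carries nearly all the probability, so a single measurement almost certainly finds it.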

    But the limitation of this algorithm is that it can be applied only to Boolean situations, or ones that can be queried with a yes or no output, like searching for a jazz album in a stack of several records.

    A quantum computer can amplify the probabilities of certain individual records and suppress others, as indicated by the size and color of the disks in the output superposition. Standard techniques are able to assess only Boolean scenarios: ones that can be answered with a yes or no output. Credit: Prasanth Shyamsundar.

    Scenarios with non-Boolean outputs present a challenge. Music genres aren’t precisely defined, so a better approach to the jazz record problem might be to ask the computer to rate the albums by how “jazzy” they are. This could look like assigning each record a score on a scale from 1 to 10.

    New amplification algorithms expand the utility of quantum computers to handle non-Boolean scenarios, allowing for an extended range of values to characterize individual records, such as the scores assigned to each disk in the output superposition above. Credit: Prasanth Shyamsundar.

    Previously, scientists would have to convert non-Boolean problems such as this into ones with Boolean outputs.

    “You’d set a threshold and say any state below this threshold is bad, and any state above this threshold is good,” Shyamsundar said. In our jazz record example, that would be the equivalent of saying anything rated between 1 and 5 isn’t jazz, while anything between 5 and 10 is.
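    That thresholding step is trivial to express, and it shows exactly what gets discarded (the scores below are made up for illustration):

```python
# Hypothetical "jazziness" scores from 1 to 10 for a stack of records.
scores = [2.0, 7.5, 4.0, 9.1, 5.5]

# Boolean conversion: a threshold collapses each score to a yes/no label,
# discarding the information carried by the score itself.
threshold = 5
labels = [score > threshold for score in scores]
print(labels)  # [False, True, False, True, True]
```

    A 5.5 and a 9.1 both become plain "yes", which is the information loss the non-Boolean algorithm avoids.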

    But Shyamsundar has extended this computation such that a Boolean conversion is no longer necessary. He calls this new technique the non-Boolean quantum amplitude amplification algorithm.

    “If a problem requires a yes-or-no answer, the new algorithm is identical to the previous one,” Shyamsundar said. “But this now becomes open to more tasks; there are a lot of problems that can be solved more naturally in terms of a score rather than a yes-or-no output.”

    A second algorithm introduced in the paper, dubbed the quantum mean estimation algorithm, allows scientists to estimate the average rating of all the records. In other words, it can assess how “jazzy” the stack is as a whole.

    Both algorithms do away with having to reduce scenarios into computations with only two types of output, and instead allow for a range of outputs to more accurately characterize information with a quantum speedup over classical computing methods.

    Procedures like these may seem primitive and abstract, but they build an essential foundation for more complex and useful tasks in the quantum future. Within physics, the newly introduced algorithms may eventually allow scientists to reach target sensitivities faster in certain experiments. Shyamsundar is also planning to leverage these algorithms for use in quantum machine learning.

    And outside the realm of science? The possibilities are yet to be discovered.

    “We’re still in the early days of quantum computing,” Shyamsundar said, noting that curiosity often drives innovation. “These algorithms are going to have an impact on how we use quantum computers in the future.”

    This work is supported by the Department of Energy’s Office of Science Office of High Energy Physics QuantISED program.

    The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit science.energy.gov.

    See the full article here.



    Fermi National Accelerator Laboratory(US), located just outside Batavia, Illinois, near Chicago, is a United States Department of Energy national laboratory specializing in high-energy particle physics. Since 2007, Fermilab has been operated by the Fermi Research Alliance, a joint venture of the University of Chicago (US) and the Universities Research Association (URA) (US). Fermilab is a part of the Illinois Technology and Research Corridor.

    Fermilab’s Tevatron was a landmark particle accelerator; until the startup in 2008 of the Large Hadron Collider(CH) near Geneva, Switzerland, it was the most powerful particle accelerator in the world, accelerating protons and antiprotons to energies of 980 GeV, and producing proton-antiproton collisions with energies of up to 1.96 TeV, the first accelerator to reach one “tera-electron-volt” energy. At 3.9 miles (6.3 km), it was the world’s fourth-largest particle accelerator in circumference. One of its most important achievements was the 1995 discovery of the top quark, announced by research teams using the Tevatron’s CDF and DØ detectors. It was shut down in 2011.

    In addition to high-energy collider physics, Fermilab hosts fixed-target and neutrino experiments, such as MicroBooNE (Micro Booster Neutrino Experiment), NOνA (NuMI Off-Axis νe Appearance) and SeaQuest. Completed neutrino experiments include MINOS (Main Injector Neutrino Oscillation Search), MINOS+, MiniBooNE and SciBooNE (SciBar Booster Neutrino Experiment). The MiniBooNE detector was a 40-foot (12 m) diameter sphere containing 800 tons of mineral oil lined with 1,520 phototube detectors. An estimated 1 million neutrino events were recorded each year. SciBooNE sat in the same neutrino beam as MiniBooNE but had fine-grained tracking capabilities. The NOνA experiment uses, and the MINOS experiment used, Fermilab’s NuMI (Neutrinos at the Main Injector) beam, which is an intense beam of neutrinos that travels 455 miles (732 km) through the Earth to the Soudan Mine in Minnesota and the Ash River, Minnesota, site of the NOνA far detector. In 2017, the ICARUS neutrino experiment was moved from CERN to Fermilab.
    In the public realm, Fermilab is home to a native prairie ecosystem restoration project and hosts many cultural events: public science lectures and symposia, classical and contemporary music concerts, folk dancing and arts galleries. The site is open from dawn to dusk to visitors who present valid photo identification.

    Asteroid 11998 Fermilab is named in honor of the laboratory.

    Weston, Illinois, was a community next to Batavia voted out of existence by its village board in 1966 to provide a site for Fermilab.

    The laboratory was founded in 1969 as the National Accelerator Laboratory; it was renamed in honor of Enrico Fermi in 1974. The laboratory’s first director was Robert Rathbun Wilson, under whom the laboratory opened ahead of schedule and under budget. Many of the sculptures on the site are of his creation. He is the namesake of the site’s high-rise laboratory building, whose unique shape has become the symbol for Fermilab and which is the center of activity on the campus.

    After Wilson stepped down in 1978 to protest the lack of funding for the lab, Leon M. Lederman took on the job. It was under his guidance that the original accelerator was replaced with the Tevatron, an accelerator capable of colliding protons and antiprotons at a combined energy of 1.96 TeV. Lederman stepped down in 1989. The science education center at the site was named in his honor.

    The later directors include:

    John Peoples, 1989 to 1996
    Michael S. Witherell, July 1999 to June 2005
    Piermaria Oddone, July 2005 to July 2013
    Nigel Lockyer, September 2013 to the present

    Fermilab continues to participate in the work at the Large Hadron Collider (LHC); it serves as a Tier 1 site in the Worldwide LHC Computing Grid.

    FNAL Icon

  • richardmitnick 3:08 pm on January 19, 2021 Permalink | Reply
    Tags: "Rethinking Spin Chemistry from a Quantum Perspective", “Superposition” lets algorithms represent two variables at once which then allows scientists to focus on the relationship between these variables without any need to determine their individual states, Bayesian inference, , , , Superposition

    From Osaka City University (大阪市立大学: Ōsaka shiritsu daigaku) (JP): “Rethinking Spin Chemistry from a Quantum Perspective” 

    From Osaka City University (大阪市立大学: Ōsaka shiritsu daigaku) (JP)

    Jan 18, 2021
    James Gracey
    Global Exchange Office

    Researchers at Osaka City University use quantum superposition states and Bayesian inference to create a quantum algorithm, easily executable on quantum computers, that accurately and directly calculates energy differences between the electronic ground and excited spin states of molecular systems in polynomial time.

    A quantum circuit that enables the maximum probability of P(0) in the measurement of the parameter J.

    Understanding how the natural world works enables us to mimic it for the benefit of humankind. Think of how much we rely on batteries. At the core is understanding molecular structures and the behavior of electrons within them. Calculating the energy differences between a molecule’s electronic ground and excited spin states helps us understand how to better use that molecule in a variety of chemical, biomedical and industrial applications.

    We have made much progress in molecules with closed-shell systems, in which electrons are paired up and stable. Open-shell systems, on the other hand, are less stable, and their underlying electronic behavior is complex and thus more difficult to understand. They have unpaired electrons in their ground state, which cause their energy to vary due to the intrinsic nature of electron spins and make measurements difficult, especially as the molecules increase in size and complexity.

    Although such molecules are abundant in nature, there is a lack of algorithms that can handle this complexity. One hurdle has been dealing with what is called the exponential explosion of computational time. Using a conventional computer to calculate how the unpaired spins influence the energy of an open-shell molecule would take hundreds of millions of years, time humans do not have.

    Quantum computers are in development to help reduce this to what is called “polynomial time”. However, the process scientists have been using to calculate the energy differences of open-shell molecules has essentially been the same for both conventional and quantum computers. This hampers the practical use of quantum computing in chemical and industrial applications.

    “Approaches that invoke true quantum algorithms help us treat open-shell systems much more efficiently than by utilizing classical computers”, state Kenji Sugisaki and Takeji Takui from Osaka City University. With their colleagues, they developed a quantum algorithm executable on quantum computers, which can, for the first time, accurately calculate energy differences between the electronic ground and excited spin states of open-shell molecular systems. Their findings were published in the journal Chemical Science on 24 Dec 2020.

    The energy difference between molecular spin states is characterized by the value of the exchange interaction parameter J. Conventional quantum algorithms have been able to accurately calculate energies for closed-shell molecules “but they have not been able to handle systems with a strong multi-configurational character”, states the group. Until now, scientists have assumed that to obtain the parameter J one must first calculate the total energy of each spin state. In open-shell molecules this is difficult because the total energy of each spin state varies greatly as the molecule changes in activity and size. However, “the energy difference itself is not greatly dependent on the system size”, notes the research team. This led them to create an algorithm with calculations that focused on the spin difference, not the individual spin states. Creating such an algorithm required that they let go of assumptions developed from years of using conventional computers and focus on the unique characteristics of quantum computing – namely “quantum superposition states”.

    “Superposition” lets algorithms represent two variables at once, which then allows scientists to focus on the relationship between these variables without any need to determine their individual states first. The research team used something called a broken-symmetry wave function as a superposition of wave functions with different spin states and rewrote it into the Hamiltonian equation for the parameter J. By running this new quantum circuit, the team was able to focus on deviations from their target, and by applying Bayesian inference, a machine learning technique, they narrowed these deviations down to determine the exchange interaction parameter J. “Numerical simulations based on this method were performed for the covalent dissociation of molecular hydrogen (H2), the triple bond dissociation of molecular nitrogen (N2), and the ground states of C, O, Si atoms and NH, OH+, CH2, NF and O2 molecules with an error of less than 1 kcal/mol”, adds the research team.
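    The Bayesian step can be illustrated with a toy grid-based inference. This is emphatically not the authors' BxB algorithm: the likelihood model cos²(Jt/2) for measuring outcome 0, and all the numbers below, are illustrative assumptions chosen only to show how repeated measurements narrow down an unknown parameter J.

```python
import math
import random

# Toy Bayesian inference for an unknown coupling parameter J.
# Assumed (hypothetical) measurement model: outcome 0 occurs with
# probability cos^2(J*t/2) after evolving for time t.
def posterior_over_J(outcomes, times, grid):
    # Start from a flat prior and multiply in the likelihood of each outcome.
    post = [1.0] * len(grid)
    for outcome, t in zip(outcomes, times):
        for i, J in enumerate(grid):
            p0 = math.cos(J * t / 2) ** 2  # probability of measuring 0
            post[i] *= p0 if outcome == 0 else (1 - p0)
    norm = sum(post)
    return [p / norm for p in post]

# Simulate measurements for a "true" J, then recover it from the posterior.
rng = random.Random(1)
true_J = 0.8
times = [0.5 * k for k in range(1, 60)]
outcomes = [0 if rng.random() < math.cos(true_J * t / 2) ** 2 else 1
            for t in times]
grid = [0.01 * k for k in range(1, 200)]  # candidate J values in (0, 2)
post = posterior_over_J(outcomes, times, grid)
best_J = grid[post.index(max(post))]      # posterior peak lands near true_J
```

    The design mirrors the idea in the article: rather than computing absolute energies, the inference targets the single quantity J directly, so the posterior sharpens as measurement data accumulate.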

    “We plan on installing our Bayesian eXchange coupling parameter calculator with Broken-symmetry wave functions (BxB) software on near-term quantum computers equipped with noisy (no quantum error correction) intermediate-scale (several hundreds of qubits) quantum devices (NISQ devices), testing the usefulness for quantum chemical calculations of actual sizable molecular systems.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Osaka City University (OCU) (大阪市立大学: Ōsaka shiritsu daigaku) (JP), is a public university in Japan. It is located in Sumiyoshi-ku, Osaka.

    OCU’s predecessor was founded in 1880, as Osaka Commercial Training Institute (大阪商業講習所) with donations by local merchants. It became Osaka Commercial School in 1885, then was municipalized in 1889. Osaka City was defeated in a bid to draw the Second National Commercial College (the winner was Kobe City), so the city authorities decided to establish a municipal commercial college without any aid from the national budget.

    In 1901, the school was reorganized to become Osaka City Commercial College (市立大阪高等商業学校), later authorized under Specialized School Order in 1904. The college had grand brick buildings around the Taishō period.

    In 1928, the college became Osaka University of Commerce (大阪商科大学), the first municipal university in Japan. The city mayor, Hajime Seki (関 一, Seki Hajime, 1873–1935) declared the spirit of the municipal university, that it should not simply copy the national universities and that it should become a place for research with a background of urban activities in Osaka. But, contrary to his words, the university was removed to the most rural part of the city by 1935. The first president of the university was a liberalist, so the campus gradually became what was thought to be “a den of the Reds (Marxists)”. During World War II, the Marxists and the socialists in the university were arrested (about 50 to 80 members) soon after the liberal president died. The campus was evacuated and used by the Japanese Navy.

    After the war, the campus was occupied by the U.S. Army (named “Camp Sakai”), and a number of students became anti-American fighters and “worshipers” of the Soviet Union. The campus was returned to the university, partly in 1952, and fully in 1955. In 1949, during the allied occupation, the university was merged (with other two municipal colleges) into Osaka City University, under Japan’s new educational system.

  • richardmitnick 10:48 am on January 19, 2021 Permalink | Reply
    Tags: "Transforming quantum computing’s promise into practice" William Oliver, , , Decoherence, , , MIT’s Lincoln Laboratory, , , Superposition   

    From MIT: “Transforming quantum computing’s promise into practice” William Oliver 

    MIT News

    From MIT News

    January 19, 2021
    Daniel Ackerman

    Electrical engineer William Oliver develops technology to enable reliable quantum computing at scale.

    MIT electrical engineer William D. Oliver develops the fundamental technology to enable reliable quantum computers at scale.
    Credit: Adam Glanzman.

    It was music that sparked William Oliver’s lifelong passion for computers.

    Growing up in the Finger Lakes region of New York, he was an avid keyboard player. “But I got into music school on voice,” says Oliver, “because it was a little bit easier.”

    But once in school, first at the State University of New York at Fredonia, then the University of Rochester, he hardly shied away from a challenge. “I was studying sound recording technology, which led me to digital signal processing,” explains Oliver. “And that led me to computers.” Twenty-five years later, he’s still stuck on them.

    Oliver, a recently tenured associate professor in MIT’s Department of Electrical Engineering and Computer Science, is building a new class of computer — the quantum computer — with the potential to radically improve how we process information and simulate complex systems. Quantum computing is still in its early days, and Oliver aims to help usher the field out of the laboratory and into the real world. “Our mission is to build the fundamental technologies that are necessary to scale up quantum computing,” he says.

    Coast to coast and back again

    Oliver’s first stop at MIT was as a master’s student in the Media Lab with adviser Tod Machover. Their interactive Brain Opera project paired Oliver’s love for both music and computing. Oliver orchestrated users’ voices with a computer-generated “angelic arpeggiation of strings and a chorus.” The project was installed at the Haus der Musik museum in Vienna. “It was a fantastic master’s project. I really loved it,” says Oliver. “But the question was ‘okay, what do I do next?’”

    Eager for a new challenge, Oliver chose to explore more fundamental research. “I found quantum mechanics to be really puzzling and interesting,” says Oliver. So he traveled to Stanford University to earn a PhD studying quantum optics using free electrons. “I feel very fortunate that I could do those experiments, which have almost no practical application, but that allowed me to think really deeply about quantum mechanics,” he says.

    Oliver’s timing was fortunate too. He was delving into quantum mechanics just as the field of quantum computing was emerging. A classical computer, like the one you’re using to read this story, stores information in binary bits, each of which holds a value of 0 or 1. In contrast, a quantum computer stores information in qubits, each of which can hold a 0, 1, or any simultaneous combination of 0 and 1, thanks to a quantum mechanical phenomenon called superposition. That means quantum computers can process information far faster than classical computers, in some cases completing tasks in minutes where a classical computer would take millennia — at least in theory. When Oliver was completing his PhD, quantum computing was a field in its infancy, more idea than reality. But Oliver grasped the potential of quantum computing, so he returned to MIT to help it grow.
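    The superposition idea above can be made concrete in a few lines of plain Python (a minimal statevector sketch, not any particular quantum SDK):

```python
import math
import random

# A single-qubit state is a pair of amplitudes (a, b) with |a|^2 + |b|^2 = 1.
# Measurement yields 0 with probability |a|^2 and 1 with probability |b|^2.
def measure(state, rng):
    a, b = state
    return 0 if rng.random() < abs(a) ** 2 else 1

# An equal superposition of 0 and 1 (the state a Hadamard gate produces).
plus = (1 / math.sqrt(2), 1 / math.sqrt(2))

rng = random.Random(42)
counts = [0, 0]
for _ in range(10_000):
    counts[measure(plus, rng)] += 1
# Roughly half the shots give 0 and half give 1.
```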

    The qubit quandary

    Quantum computers are frustratingly inconsistent. That’s in part because those qubit superposition states are fragile. In a process called decoherence, qubits can err and lose their quantum information from the slightest disturbance or material defect. In 2003, Oliver took a staff position at MIT’s Lincoln Laboratory to help solve problems like decoherence. His goal, with colleagues Terry Orlando, Leonya Levitov, and Seth Lloyd, was to engineer reliable quantum computing systems that can be scaled up for practical use. “Quantum computing is transitioning from scientific curiosity to technical reality,” says Oliver. “We know that it works at small scale. And we’re now trying to increase the size of the systems so we can do problems that are actually meaningful.”

    Even background levels of radiation can trigger decoherence in mere milliseconds. In a recent Nature paper, Oliver and his colleagues, including professor of physics Joe Formaggio, described this problem and proposed ways to shelter qubits from damaging radiation, like shielding them with lead.

    He is quick to emphasize the role of collaboration in solving these complex challenges. “Engineering these quantum systems into useful, larger scale machines is going to require almost every department at the Institute,” says Oliver. In his own research, he builds qubits from electrical circuits in aluminum that are supercooled to just a smidge warmer than absolute zero. At that temperature, the system loses electrical resistance and can be used as an anharmonic oscillator that stores quantum information. Engineering such an intricate system to reliably process information means “we need to bring in a lot of people with their own talents,” says Oliver.

    “For example, materials scientists will have a lot to say about the materials and the defects on the surfaces,” he adds. “Electrical engineers will have something to say about how to fabricate and control the qubits. Computer scientists and applied mathematicians will have something to say about the algorithms. Chemists and biologists know the hard problems to solve. And so on.” When he first joined Lincoln Laboratory, Oliver says just two Lincoln staff were focused on quantum technologies. That number now exceeds 100.

    In 2015, Oliver founded the Engineering Quantum Systems (EQuS) group to focus specifically on superconducting qubit technology. He is also a Lincoln Laboratory Fellow, director of MIT’s Center for Quantum Engineering, and associate director of the Research Laboratory of Electronics.

    A quantum future

    Oliver envisions a steadily growing role for quantum computing. Already, Google has demonstrated that for a particular task, a 53-qubit quantum computer can far outpace even the world’s largest supercomputer, which features quadrillions of transistors. “That was like the flight at Kitty Hawk,” says Oliver. “It got off the ground.”

    Google quantum computer.

    In the near-term, Oliver thinks quantum and classical computers could work as partners. The classical machine would churn through an algorithm, dispatching specific calculations for the quantum computer to run before its qubits decohere. In the longer term, Oliver says that error-correcting codes could enable quantum computers to function indefinitely, even as some individual components remain faulty. “And that’s when quantum computers will basically be universal,” says Oliver. “They’ll be able to run any quantum algorithm at large scale.” That could enable vastly improved simulations of complex systems in fields like molecular biology, quantum chemistry, and climatology.
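    That near-term partnership can be sketched as a variational loop, a common pattern for hybrid quantum-classical algorithms. This is a generic illustration, not Oliver's specific proposal: the one-parameter "circuit" whose expectation value is cos(theta) is a hypothetical stand-in for a real quantum processor run.

```python
import math
import random

# Stand-in for a short quantum circuit run: for a one-parameter circuit,
# assume the measured expectation value is cos(theta). On real hardware,
# each call would be a batch of shots completed before qubits decohere.
def quantum_evaluate(theta):
    return math.cos(theta)

def hybrid_minimize(steps=200, lr=0.1, seed=0):
    rng = random.Random(seed)
    theta = rng.uniform(0, 2 * math.pi)
    for _ in range(steps):
        # Parameter-shift style gradient estimate from two circuit runs;
        # the classical side does the update, the "quantum" side evaluates.
        grad = (quantum_evaluate(theta + math.pi / 2)
                - quantum_evaluate(theta - math.pi / 2)) / 2
        theta -= lr * grad
    return theta, quantum_evaluate(theta)

theta, energy = hybrid_minimize()
# The loop settles near theta = pi, where the energy cos(theta) is minimal.
```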

    Oliver will continue to push quantum computing toward that reality. “There are real accomplishments that have been happening,” he says. “At the same time, on the theoretical side, there are real problems we could solve if we just had a quantum computer big enough.” While focused on his mission to scale up quantum computing, Oliver hasn’t lost his passion for music. Although, he says he rarely sings these days: “Only in the shower.”

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    MIT Seal

    The mission of MIT is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the MIT community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

    USPS “Forever” postage stamps celebrating Innovation at MIT.

    MIT Campus

  • richardmitnick 2:02 pm on December 29, 2020 Permalink | Reply
    Tags: "New Views of Quantum Jumps Challenge Core Tenets of Physics", , , By “noncatchable” the researchers mean that the jump back to the ground state will not always be smooth and predictable., , , , Superconducting qubits, Superposition, The most fundamental breakthrough arguably came in 1986 when researchers for the first time experimentally verified that quantum jumps are actual physical events that can be observed and studied., The new study shows that the predictable “catchable” quantum jumps must have a noncatchable counterpart., The scientists behind the work called catchable jumps “islands of predictability in a sea of uncertainty.”

    From Scientific American: “New Views of Quantum Jumps Challenge Core Tenets of Physics” 

    From Scientific American

    December 29, 2020
    Eleni Petrakou

    One of the most basic processes in all of nature—a subatomic particle’s transition between discrete energy states—is surprisingly complex and sometimes predictable, recent work shows.

    Credit: Getty Images.

    Quantum mechanics, the theory that describes the physics of the universe at very small scales, is notorious for defying common sense. Consider, for instance, the way that standard interpretations of the theory suggest change occurs in the quantum turf: shifts from one state to another supposedly happen unpredictably and instantaneously. Put another way, if events in our familiar world unfolded similarly to those within atoms, we would expect to routinely see batter becoming a fully baked cake without passing through any intermediate steps. Everyday experience, of course, tells us this is not the case, but for the less accessible microscopic realm, the true nature of such “quantum jumps” has been a major unsolved problem in physics.

    In recent decades, however, technological advancements have allowed physicists to probe the issue more closely in carefully arranged laboratory settings. The most fundamental breakthrough arguably came in 1986, when researchers for the first time experimentally verified that quantum jumps are actual physical events that can be observed and studied. Ever since, steady technical progress has opened deeper vistas upon the mysterious phenomenon. Notably, an experiment published in 2019 [Nature] overturned the traditional view of quantum jumps by demonstrating that they move predictably and gradually once they start—and can even be stopped midway.

    That experiment, performed at Yale University, used a setup that let the researchers monitor the transitions with minimal intrusion. Each jump took place between two energy values of a superconducting qubit, a tiny circuit built to mimic the properties of atoms. The research team used measurements of “side activity” taking place in the circuit when the system had the lower energy. This is a bit like knowing which show is playing on a television in another room by only listening for certain key words. This indirect probe evaded one of the top concerns in quantum experiments—namely, how to avoid influencing the very system that one is observing. Known as “clicks” (from the sound that old Geiger counters made when detecting radioactivity), these measurements revealed an important property: jumps to the higher energy were always preceded by a halt in the “key words,” a pause in the side activity. This eventually permitted the team to predict the jumps’ unfolding and even to stop them at will.

    Now a new theoretical study delves deeper into what can be said about the jumps and when. And it finds that this seemingly simple and fundamental phenomenon is actually quite complex.


    The new study, published in Physical Review Research, models the step-by-step, cradle-to-grave evolution of quantum jumps—from the initial lower-energy state of the system, known as the ground state, through a second state where it has higher energy, called the excited state, and finally the transition back to the ground state. This modeling shows that the predictable, “catchable” quantum jumps must have a noncatchable counterpart, says author Kyrylo Snizhko, a postdoctoral researcher now at Karlsruhe Institute of Technology in Germany, who was formerly at the Weizmann Institute of Science in Israel, where the study was performed.

    Specifically, by “noncatchable” the researchers mean that the jump back to the ground state will not always be smooth and predictable. Instead the study’s results show that such an event’s evolution depends on how “connected” the measuring device is to the system (another peculiarity of the quantum realm, which, in this case, relates to the timescale of the measurements, compared with that of the transitions). The connection can be weak, in which case a quantum jump can also be predictable through the pause in clicks from the qubit’s side activity, in the way used by the Yale experiment.

    The system transitions by passing through a mixture of the excited state and ground state, a quantum phenomenon known as superposition. But sometimes, when the connection exceeds a certain threshold, this superposition will shift toward a specific value of the mixture and tend to stay at that state until it moves to the ground unannounced. In that special case, “this probabilistic quantum jump cannot be predicted and reversed midflight,” explains Parveen Kumar, a postdoctoral researcher at the Weizmann Institute and co-author of the most recent study. In other words, even jumps for which timing was initially predictable would be followed by inherently unpredictable ones.

    But there is yet more nuance when examining the originally catchable jumps. Snizhko says that even these possess an unpredictable element. A catchable quantum jump will always proceed on a “trajectory” through the superposition of the excited and ground states, but there can be no guarantee that the jump will ever finish. “At each point in the trajectory, there is a probability that the jump continues and a probability that it is projected back to the ground state,” Snizhko says. “So the jump may start happening and then abruptly get canceled. The trajectory is totally deterministic—but whether the system will complete the trajectory or not is unpredictable.”

    This behavior appeared in the Yale experiment’s results. The scientists behind that work called such catchable jumps “islands of predictability in a sea of uncertainty.” Ricardo Gutiérrez-Jáuregui, a postdoctoral researcher at Columbia University and one of the authors of the corresponding study, notes that “the beauty of that work was to show that in the absence of clicks, the system followed a predetermined path to reach the excited state in a short but nonzero time. The device, however, still has a chance to ‘click’ as the system transitions through this path, thus interrupting its transition.”


    Zlatko Minev, a researcher at the IBM Thomas J. Watson Research Center and lead author of the earlier Yale study, notes that the new theoretical paper “derives a very nice, simple model and explanation of the quantum jump phenomenon in the context of a qubit as a function of the parameters of the experiment.” Taken together with the experiment at Yale, the results “show that there is more to the story of discreteness, randomness and predictability in quantum mechanics than commonly thought.” Specifically, the surprisingly nuanced behavior of quantum jumps—the way a leap from the ground state to the excited state can be foretold—suggests a degree of predictability inherent to the quantum world that has never before been observed. Some would even consider it forbidden, had it not already been validated by experiment. When Minev first discussed the possibility of predictable quantum jumps with others in his group, a colleague responded by shouting back, “If this is true, then quantum physics is broken!”

    “In the end, our experiment worked, and from it one can infer that quantum jumps are random and discrete,” Minev says. “Yet on a finer timescale, their evolution is coherent and continuous. These two seemingly opposed viewpoints coexist.”

    As to whether such processes can apply to the material world at large—for instance, to atoms outside a quantum lab—Kumar is undecided, in large part because of how carefully specific the study’s conditions were. “It would be interesting to generalize our results,” he says. If the results turn out similar for different measurement setups, then this behavior—events that are in some sense both random and predictable, discrete yet continuous—could reflect more general properties of the quantum world.

    Meanwhile the predictions of the study could get checked soon. According to Serge Rosenblum, a researcher at the Weizmann Institute who did not participate in either study, these effects can be observed with today’s state-of-the-art superconducting quantum systems and are high on the list of experiments for the institute’s new qubits lab. “It was quite amazing to me that a deceptively simple system such as a single qubit can still hide such surprises when we measure it,” he adds.

    For a long time, quantum jumps—the most basic processes underlying everything in nature—were considered nearly impossible to probe. But technological progress is changing that. Kater Murch, an associate professor at Washington University in St. Louis, who did not participate in the two studies, remarks, “I like how the Yale experiment seems to have motivated this theory paper, which is uncovering new aspects of a physics problem that has been studied for decades. In my mind, experiments really help drive the ways that theorists think about things, and this leads to new discoveries.”

    The mystery might not just be going away, though. As Snizhko says, “I do not think that the quantum jumps problem will be resolved completely any time soon; it is too deeply ingrained in quantum theory. But by playing with different measurements and jumps, we might stumble upon something practically useful.”

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

  • richardmitnick 11:51 am on November 24, 2020 Permalink | Reply
    Tags: "GPU Clusters Accelerate Quantum Computer Simulator", A group of researchers at the U.S. Department of Energy’s PNNL have invented a quantum computer simulator called DM-SIM that is 10 times faster than existing methods., , GPUs devote most of their chip area to compute units and have high-throughput memory., New method improves error investigations in deep quantum circuits., , Superposition, The team achieved the increase in speed by harnessing the power of graphical processing units (GPUs).   

    From DOE’s Pacific Northwest National Laboratory: “GPU Clusters Accelerate Quantum Computer Simulator” 

    From DOE’s Pacific Northwest National Laboratory

    November 13, 2020
    Rebekah Orton

    Artist’s rendering of a quantum computer. Credit: Jeffrey London/PNNL.

    New method improves error investigations in deep quantum circuits

    Before quantum computers begin to be deployed, how will we know if they work? The answer: quantum computer simulators. These important tools, now under development, run on the world’s most powerful supercomputing resources and still take days or weeks to complete quantum computing scenarios.

    Now, a group of researchers at the U.S. Department of Energy’s Pacific Northwest National Laboratory (PNNL) have invented a quantum computer simulator, called DM-SIM, that is 10 times faster than existing methods. The feat was detailed in one of only a handful of articles nominated for “Best Paper” at the annual international conference for high-performance computing, SC20 [Paper: Density Matrix Quantum Circuit Simulation via the BSP Machine on Modern GPU Clusters].

    The team, led by computer scientist Ang Li, achieved the increase in speed by harnessing the power of graphical processing units (GPUs), the lightning-quick processors originally designed for images and videos.

    Simulating qubits: the basic unit of quantum computing

    The basic unit of quantum programming—the quantum bit, or qubit—is strikingly different from its counterpart, the bit, in a classical computer. Unlike bits, the binary units that represent ones and zeros in a classical computer, a quantum computer’s qubits can represent the possibility of both one and zero at the same time.

    Qubits, the basic unit of quantum computing, can represent the possibility of both one and zero at the same time. Credit: Jeffrey London/PNNL.

    A reliable quantum computer simulator needs to capture the complexity of superposition, but that’s not all. Multiple qubits in a quantum computer can also exhibit quantum entanglement: when qubits are entangled, if a single qubit collapses into a definite one or zero, the rest also collapse, like a house of cards.

    Superposition and entanglement are the reasons quantum computers are more useful than classical computers for certain problems. Researchers need powerful quantum computer simulators that can accurately mimic qubits’ billions of possibilities—and errors.
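    The superposition and entanglement just described can be made concrete with a few lines of linear algebra. The sketch below is purely illustrative (plain NumPy, not part of DM-Sim): amplitudes for a superposed qubit, an entangled two-qubit Bell state, and why the classical memory cost of simulation doubles with every added qubit.

    ```python
    import numpy as np

    # A single qubit in equal superposition: amplitudes for |0> and |1>,
    # not a definite value. Measurement probabilities are |amplitude|^2.
    plus = np.array([1, 1]) / np.sqrt(2)
    print(np.abs(plus) ** 2)  # 50% zero, 50% one

    # A two-qubit Bell state: entangled. Basis order: |00>, |01>, |10>, |11>.
    bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
    probs = np.abs(bell) ** 2
    print(probs)  # 50% |00>, 50% |11> -- the qubits always agree

    # The state vector doubles with every added qubit, which is why
    # classical simulation of many qubits gets expensive so quickly.
    for n in (10, 20, 30):
        print(n, "qubits ->", 2 ** n, "complex amplitudes")
    ```

    Thirty qubits already require over a billion complex amplitudes, which is why simulators like DM-Sim need supercomputing-class hardware.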

    “Physical qubits currently aren’t perfect or logical,” said Li. “They are more like nature where everything responds to its environment, so you have to find a way to represent noise and errors to create a more realistic simulator.”

    But realism takes more time to calculate—and Li wanted to see if GPUs could hurry things up.

    Layering virtual quantum circuits onto multiple GPUs

    GPUs have been sold for decades to move images quickly across screens, but using them for general-purpose computation, like scientific simulation, emerged in 2007. Unlike a cache-heavy central processing unit (CPU), GPUs devote most of their chip area to compute units and have high-throughput memory. This makes their computations much faster.

    Li started working with GPUs in 2009 before they were as widely used as they are today. He was partway through his PhD research in high-performance computing by the time researchers began to use GPUs to accelerate deep learning in 2013. He’s seen the value ever since.

    “Many major computational problems will move to GPU-centric computations, and this work is part of that trend,” said Li. “Big applications need GPUs’ faster delivery to expand their expected performance.”

    Connecting multiple graphical processing units (GPUs) amplifies their swift computing power as they simulate qubits. Credit: Jeffrey London/PNNL.

    Quantum circuits aren’t images. But because GPUs rely on a large number of compute units to deliver high performance, Li suspected they could more quickly perform the heavy computations that represent quantum gates—the building blocks of a quantum circuit.

    Creating deeper gates in a quantum computer simulator

    Quantum circuits are made of operations that change a qubit’s state. These operations are called gates. At the beginning of the circuit, each qubit is like an arrow, or vector, pointing in the “0” direction. After the circuit ends and is measured, the qubits collapse to classical one or zero states. The statistical breakdown of ones and zeros indicates the results of the computation.

    An accurate quantum computer simulator needs to describe both pure and mixed quantum states within each circuit. That’s why Li and his team used a density matrix, a method of describing the statistical state of a quantum system. Unlike the widely used state-vector, which can represent only pure states, a density matrix contains all the information about a quantum state, including the effects of noise.
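    The density-matrix idea can be sketched in a few lines. This is a hypothetical illustration, not DM-Sim’s implementation: a pure state written as rho = |psi><psi|, a mixed (noisy) state that no state-vector can express, and a gate acting on a density matrix as rho -> U rho U^dagger.

    ```python
    import numpy as np

    # Pure state |+> as a state vector, and its density matrix rho = |psi><psi|.
    psi = np.array([1, 1]) / np.sqrt(2)
    rho_pure = np.outer(psi, psi.conj())

    # A mixed state (e.g. after noise) has no state-vector description:
    # here, a 50/50 classical mixture of |0> and |1>.
    rho_mixed = 0.5 * np.diag([1.0, 0.0]) + 0.5 * np.diag([0.0, 1.0])

    # A gate U acts on a density matrix as rho -> U rho U^dagger.
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate
    rho_after = H @ rho_pure @ H.conj().T         # H|+> = |0>, so this is |0><0|

    # Purity tr(rho^2) distinguishes the two cases: 1 for pure, < 1 for mixed.
    print(np.trace(rho_pure @ rho_pure).real)    # ~1.0 (pure)
    print(np.trace(rho_mixed @ rho_mixed).real)  # 0.5 (mixed)
    ```

    The cost of this extra expressiveness is size: for n qubits the density matrix is 2^n by 2^n, squaring the memory a state-vector needs, which is part of why distributing it across GPUs is hard.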

    While researchers have used density matrices to represent qubits before, no one before Li’s team had combined an efficient density matrix quantum circuit simulator with a GPU-accelerated high-performance computing cluster. Because of the complexity of operating on the large density matrix, it isn’t easy to manage the massive number of threads and the communication interwoven between cooperating GPUs. And the researchers’ efforts could fail if they didn’t synchronize communication across the GPUs and the GPU nodes holding parts of the density matrix.

    But Li and his teammates Omer Subasi, Xiu Yang, and Sriram Krishnamoorthy were up for the challenge. After linking the GPUs, they proposed a mathematical reformulation that largely avoids expensive communication between GPUs. More importantly, it reduces the communication overhead while preserving the natural errors expected from noisy, real-world quantum gates.

    Faster and the future

    With the new method, the team ran a density matrix simulation with one million arbitrary gates in only 94 minutes. This run was far deeper and quicker than had been demonstrated before—10 times faster than simpler simulators, which represent quantum states in a state-vector.

    The PNNL team applied their GPU-centered method to help investigate how errors occur in quantum circuits, but the approach could be more broadly applicable. Until a perfect quantum computer is available, PNNL’s DM-Sim simulator can be used to help develop quantum algorithms that provide understanding of molecules for medical advances, explain complex chemistry problems, analyze big-data graphs, and perform quantum-based machine learning. In the meantime, PNNL’s DM-Sim quantum computing simulator will help make quantum computers work in practical terms.

    The research was funded by the Quantum Science, Advanced Accelerator (QUASAR) laboratory-directed research and development initiative. QUASAR research contributes to the National Quantum Initiative and is part of PNNL efforts to create the science and algorithms that advance hardware development and prepare for the future of quantum technology. The complete paper and a free download of DM-Sim are available on GitHub.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    DOE’s Pacific Northwest National Laboratory (PNNL) is one of the United States Department of Energy National Laboratories, managed by the Department of Energy’s Office of Science. The main campus of the laboratory is in Richland, Washington.

    PNNL scientists conduct basic and applied research and development to strengthen U.S. scientific foundations for fundamental research and innovation; prevent and counter acts of terrorism through applied research in information analysis, cyber security, and the nonproliferation of weapons of mass destruction; increase the U.S. energy capacity and reduce dependence on imported oil; and reduce the effects of human activity on the environment. PNNL has been operated by Battelle Memorial Institute since 1965.

  • richardmitnick 12:31 pm on August 27, 2020 Permalink | Reply
    Tags: "UArizona Scientists to Build What Einstein Wrote off as Science Fiction", A first-of-its-kind campuswide quantum networking testbed will be built at the University of Arizona., , , Quantum systems will provide a level of privacy security and computational clout that is impossible to achieve with today's internet., Superposition, The center will develop a quantum networking applications roadmap., The quantum internet will allow for applications that will never be possible on the internet as we know it., The quantum internet will rely on a global network of quantum processors speaking to one another via "quantum bits"., This is the third National Science Foundation Engineering Research Center led by the University of Arizona., UA Center for Quantum Networks,   

    From University of Arizona: “UArizona Scientists to Build What Einstein Wrote off as Science Fiction” 

    From University of Arizona

    Media contacts:
    Daniel Stolte
    University Communications

    Brianna Moreno
    James C. Wyant College of Optical Sciences

    With $26 million in federal funding, UArizona is charged with developing the internet of the future, ruled by quantum mechanical properties instead of conventional 0s and 1s. On Aug. 26, Arizona Gov. Doug Ducey, UArizona President Robert C. Robbins and others discussed the impact that the Center for Quantum Networks is expected to have on the way the world computes and communicates.

    Training tomorrow’s quantum engineer workforce is just one of the declared goals of the new Center for Quantum Networks, funded by the National Science Foundation to create a socially responsible internet of the future. Narang Lab/Harvard University.

    Arizona Gov. Doug Ducey today joined University of Arizona President Robert C. Robbins and leading scientists from the new University of Arizona-based Center for Quantum Networks to talk about how the center will help develop the “internet of the future.”

    The National Science Foundation has awarded UArizona a five-year, $26 million grant – with an additional $24 million, five-year option – to lead the Center for Quantum Networks, or CQN, which is a National Science Foundation Engineering Research Center. The award has placed Arizona at the forefront of quantum networking technologies, which are expected to transform areas such as medicine, finance, data security, artificial intelligence, autonomous systems and smart devices, which together are often referred to as “the internet of things.”

    “Arizona continues to lead the nation in innovation. Establishing the Center for Quantum Networks will position the state as a global leader in advancing this technology and developing the workforce of the future,” Gov. Doug Ducey said. “We’re proud of the work the University of Arizona has done to secure this grant and look forward to the scientific achievements that will result from it.”

    The CQN will take center stage in a burgeoning field. Companies like IBM, Microsoft and Google are racing to build reliable quantum computers, and China has invested billions of dollars in quantum technology research. The U.S. has begun a serious push to exceed China’s investment and to “win” the global race to harness quantum technologies.

    “Less than a year ago, a quantum computer for the first time performed certain calculations that are no longer feasible for even the largest conventional supercomputers,” said Saikat Guha, CQN director and principal investigator and associate professor in the UArizona James C. Wyant College of Optical Sciences, who joined Ducey and Robbins for the virtual event. “The quantum internet will allow for applications that will never be possible on the internet as we know it.”

    Unlike the existing internet – in which computers around the globe exchange data encoded in the familiar 0s and 1s – the quantum internet will rely on a global network of quantum processors speaking to one another via “quantum bits,” or qubits.

    Qubits offer dramatic increases in processing capacity over conventional bits because they can exist in not just one state, but two at the same time. Known as superposition, this difficult-to-grasp principle was first popularized by “Schrödinger’s Cat” – the famous thought experiment in which an imaginative cat inside a box is neither dead nor alive until an equally imaginative observer opens the box and checks.

    The key new resource that the quantum network enables – by being able to communicate qubits from one point to another – is the ability to create “entanglement” across distant users of the network. Entanglement – another hallmark of quantum mechanics so strange that even Einstein was reluctant to accept it at first – allows a pair of particles, including qubits, to stay strongly correlated despite being separated by large physical distances. Entanglement enables communication among parties that is impossible to hack.
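    One way to see why entanglement helps secure communication is the perfectly correlated randomness it produces. The toy sketch below only samples the measurement statistics classically (it is not a real quantum link or any CQN protocol): two parties measuring a shared Bell state in the same basis each get a random bit, yet the bits always agree, giving raw material for a shared secret key.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Measurement statistics of the Bell state (|00> + |11>)/sqrt(2),
    # both parties measuring in the same basis.
    # Outcome probabilities, in order: P(00), P(01), P(10), P(11).
    bell_probs = np.array([0.5, 0.0, 0.0, 0.5])

    # Draw 1000 joint outcomes, encoded 0..3 as two-bit values.
    outcomes = rng.choice(4, size=1000, p=bell_probs)
    alice = outcomes // 2  # first qubit's bit
    bob = outcomes % 2     # second qubit's bit

    # Random for each party individually, but perfectly correlated.
    print((alice == bob).all())  # True
    ```

    In a genuine quantum network, an eavesdropper’s measurement would disturb these correlations, which is what makes the tampering detectable.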

    One of the center’s goals is to develop technologies that will put the entanglement principle to use in real-world applications – for example, to stitch together far-apart sensors, such as the radio telescopes that glimpsed the first image of a black hole in space, into one giant instrument that is far more capable than the sum of the individual sensors. Similar far-reaching implications are expected in the autonomous vehicles industry and in medicine.

    “Who knows, 50 years from now, your internet service provider may send a technician to your house to install a CQN-patented quantum-enabled router that does everything your current router does, but more,” Guha said. “It lets you hook up your quantum gadgets to what we are beginning to build today – the new internet of the future.”

    A first-of-its-kind campuswide quantum networking testbed will be built at the University of Arizona, connecting laboratories across the UArizona campus, initially spanning the College of Optical Sciences, Department of Electrical and Computer Engineering, Department of Materials Science and Engineering and the BIO5 Institute.

    “The next few years will be very exciting, as we are at a time when the community puts emerging quantum computers, processors, sensors and other gadgets to real use,” Guha said. “We are just beginning to connect small quantum computers, sensors and other gadgets into quantum networks that transmit quantum bits.”

    According to Guha, quantum-enabled sensors will be more sensitive than classical ones, and will dramatically improve technologies such as microscopes used in biomedical research to look for cancer cells, sensors on low-Earth-orbit satellites, and magnetic field sensors used for positioning and navigation.

    Guha says today’s internet is a playground for hackers, due to insecure communication links to inadequately guarded data in the cloud. Quantum systems will provide a level of privacy, security and computational clout that is impossible to achieve with today’s internet.

    “The Center for Quantum Networking stands as an example for the core priorities of our university-wide strategic plan,” said UArizona President Robert C. Robbins. “As a leading international research university bringing the Fourth Industrial Revolution to life, we are deeply committed to (our strategic plan to) advance amazing new information technologies like quantum networking to benefit humankind. And we are equally committed to examining the complex, social, legal, economic and policy questions raised by these new technologies.

    “In addition to bringing researchers together from intellectually and culturally diverse disciplines, the CQN will provide future quantum engineers and social scientists with incredible learning opportunities and the chance to work side by side with the world’s leading experts.”

    The center will bring together scientists, engineers and social scientists working on quantum information science and engineering and its societal impacts. UArizona has teamed up with core partners Harvard University, the Massachusetts Institute of Technology and Yale University to work on the core hardware technologies for quantum networks and create an entrepreneurial ecosystem for quantum network technology transfer.

    In addition to creating a diverse quantum engineer workforce, the center will develop a quantum networking applications roadmap, developed cooperatively with industry partners, to help prioritize CQN’s research investments as new application concepts are developed.

    Jane Bambauer, CQN co-deputy director and professor in the James E. Rogers College of Law, who also spoke about the center, said that “the classical internet changed our relationship to computers and each other.”

    “While we build the technical foundations for the quantum internet, we are also building the foundation for a socially responsible rollout of the new technology,” Bambauer said. “We are embedding policy and social science expertise into our center’s core research activities. We’re also creating effective and inclusive education programs to make sure that the opportunities for jobs and for invention are shared broadly.”

    This is the third National Science Foundation Engineering Research Center led by the University of Arizona. The other two are the ERC for Environmentally Benign Semiconductor Manufacturing, led by the College of Engineering, and the Center for Integrated Access Networks, led by the Wyant College of Optical Sciences. CQN will be bolstered by the Wyant College’s recent endowments – including the largest faculty endowment gift in the history of the University of Arizona – and the planned construction of the new Grand Challenges Research Building, supported by the state of Arizona.

    Additional speakers at today’s event included:

    Dirk Englund, CQN Deputy Director for Engineering Research, MIT Electrical Engineering & Computer Science
    Charlie Tahan, Assistant Director for Quantum Information Science and Director, National Quantum Coordination Office, White House Office of Science and Technology Policy
    Linda Blevins, Deputy Assistant Director of the Engineering Directorate, National Science Foundation
    Kon-Well Wang, Division Director, Division of Engineering Education and Centers, Directorate for Engineering, National Science Foundation

    Center for Quantum Networks Briefing

    See the full article here .

    Please help promote STEM in your local schools.

    Stem Education Coalition

    The University of Arizona (UA) is a place without limits – where teaching, research, service and innovation merge to improve lives in Arizona and beyond. We aren’t afraid to ask big questions, and find even better answers.

    In 1885, establishing Arizona’s first university in the middle of the Sonoran Desert was a bold move. But our founders were fearless, and we have never lost that spirit. To this day, we’re revolutionizing the fields of space sciences, optics, biosciences, medicine, arts and humanities, business, technology transfer and many others. Since it was founded, the UA has grown to cover more than 380 acres in central Tucson, a rich breeding ground for discovery.

    U Arizona mirror lab-Where else in the world can you find an astronomical observatory mirror lab under a football stadium?

    University of Arizona’s Biosphere 2, located in the Sonoran desert. An entire ecosystem under a glass dome? Visit our campus, just once, and you’ll quickly understand why the UA is a university unlike any other.

  • richardmitnick 12:10 pm on August 18, 2020 Permalink | Reply
    Tags: "Quantum paradox points to shaky foundations of reality", Nearly 60 years ago the Nobel Prize–winning physicist Eugene Wigner captured one of the many oddities of quantum mechanics in a thought experiment., , , Researchers in Australia and Taiwan offer perhaps the sharpest demonstration that Wigner’s paradox is real., , Superposition, The team also tested the theorem with an experiment using photons as proxies for the humans., The team transformed the thought experiment into a mathematical theorem that confirms the irreconcilable contradiction at the heart of the scenario., Wigner’s thought experiment has seen renewed attention in recent years.   

    From Science Magazine: “Quantum paradox points to shaky foundations of reality” 

    From Science Magazine

    Aug. 17, 2020
    George Musser

    Not just “philosophical mumbo-jumbo”: An experiment shows how facts may depend on the observer.
    Davide Bonazzi/Salzman Art.

    Nearly 60 years ago, the Nobel Prize–winning physicist Eugene Wigner captured one of the many oddities of quantum mechanics in a thought experiment. He imagined a friend of his, sealed in a lab, measuring a particle such as an atom while Wigner stood outside. Quantum mechanics famously allows particles to occupy many locations at once—a so-called superposition—but the friend’s observation “collapses” the particle to just one spot. Yet for Wigner, the superposition remains: The collapse occurs only when he makes a measurement sometime later. Worse, Wigner also sees the friend in a superposition. Their experiences directly conflict.

    Now, researchers in Australia and Taiwan offer perhaps the sharpest demonstration that Wigner’s paradox is real. In a study published this week in Nature Physics, they transform the thought experiment into a mathematical theorem that confirms the irreconcilable contradiction at the heart of the scenario. The team also tests the theorem with an experiment, using photons as proxies for the humans. Whereas Wigner believed resolving the paradox requires quantum mechanics to break down for large systems such as human observers, some of the new study’s authors believe something just as fundamental is on thin ice: objectivity. It could mean there is no such thing as an absolute fact, one that is as true for me as it is for you.

    “It’s a bit disconcerting,” says co-author Nora Tischler of Griffith University. “A measurement outcome is what science is based on. If somehow that’s not absolute, it’s hard to imagine.”

    For physicists who have dismissed thought experiments like Wigner’s as interpretive navel gazing, the study shows the contradictions can emerge in actual experiments, says Dustin Lazarovici, a physicist and philosopher at the University of Lausanne who was not part of the team. “The paper goes to great lengths to speak the language of those who have tried to merely discuss foundational issues away and may thus compel at least some to face up to them,” he says.

    Wigner’s thought experiment has seen renewed attention in recent years. In 2015, Časlav Brukner of the University of Vienna tested the most intuitive way around the paradox: that the friend inside the lab has, in fact, seen the particle in one place or another, and Wigner just doesn’t know what it is yet. In the jargon of quantum theory, the friend’s result is a hidden variable.

    Brukner ruled out that conclusion in a thought experiment of his own, using a trick—based on quantum entanglement—to bring the hidden variable out into the open. He imagined setting up two friend-Wigner pairs and giving each a particle, entangled with its partner in such a way that their attributes, upon measurement, are correlated. Each friend measures the particle, each Wigner measures the friend measuring the particle, and the two Wigners compare notes. The process repeats. If the friends saw definite results—as you might suspect—the Wigners’ own findings would show only weak correlations. But instead, they find a pattern of strong correlations. “You run into contradictions,” Brukner says. His experiment and a similar one in 2016 by Daniela Frauchiger and Renato Renner of ETH Zürich led to an outpouring of papers and heated discussion at conferences.

    But in 2018, Richard Healey, a philosopher of physics at the University of Arizona, pointed out a loophole in Brukner’s thought experiment, which Tischler and her colleagues have now closed. In their new scenario they make four assumptions. One is that the results the friends obtain are real: They can be combined with other measurements to form a shared corpus of knowledge. They also assume quantum mechanics is universal, and as valid for observers as for particles; that the choices the observers make are free of peculiar biases induced by a godlike superdeterminism; and that physics is local, free of all but the most limited form of “spooky action” at a distance.

    Yet their analysis shows the contradictions of Wigner’s paradox persist. The team’s tabletop experiment, in which they created entangled photons, also backs up the paradox. Optical elements steered each photon onto a path that depended on its polarization: the equivalent of the friends’ observations. The photon then entered a second set of elements and detectors that played the role of the Wigners. The team found, again, an irreconcilable mismatch between the friends and the Wigners. What is more, they varied exactly how entangled the particles were and showed that the mismatch occurs for different conditions than in Brukner’s scenario. “That shows that we really have something new here,” Tischler says.

    It also indicates that one of the four assumptions has to give. Few physicists believe superdeterminism could be to blame. Some see locality as the weak point, but its failure would be stark: One observer’s actions would affect another’s results even across great distances—a stronger kind of nonlocality than the type quantum theorists often consider. So some are questioning the tenet that observers can pool their measurements empirically. “It could be that there are facts for one observer, and facts for another; they need not mesh,” says study co-author and Griffith physicist Howard Wiseman. It is a radical relativism, still jarring to many. “From a classical perspective, what everyone sees is considered objective, independent of what anyone else sees,” says Olimpia Lombardi, a philosopher of physics at the University of Buenos Aires.

    And then there is Wigner’s conclusion that quantum mechanics itself breaks down. Of the assumptions, it is the most directly testable, by experiments that are probing quantum mechanics on ever larger scales. But the one position that doesn’t survive the analysis is to have no position, says another co-author at Griffith, Eric Cavalcanti. “Most physicists, they think: ‘That’s just philosophical mumbo-jumbo,’” he says. “They will have a hard time.”

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

  • richardmitnick 11:00 am on August 14, 2020 Permalink | Reply
    Tags: "Yale quantum researchers create an error-correcting cat", , , , , , Superposition,   

    From Yale University: “Yale quantum researchers create an error-correcting cat” 

    From Yale University

    August 12, 2020

    Media Contact
    Fred Mamoun

    By Jim Shelton

    (Illustration by Michael S. Helfenbein)

    Yale physicists have developed an error-correcting cat — a new device that combines the Schrödinger’s cat concept of superposition (a physical system existing in two states at once) with the ability to fix some of the trickiest errors in a quantum computation.

    It is Yale’s latest breakthrough in the effort to master and manipulate the physics necessary for a useful quantum computer: correcting the stream of errors that crop up among fragile bits of quantum information, called qubits, while performing a task.

    A new study reporting on the discovery appears in the journal Nature. The senior author is Michel Devoret, Yale’s F.W. Beinecke Professor of Applied Physics and Physics. The study’s co-first authors are Alexander Grimm, a former postdoctoral associate in Devoret’s lab who is now a tenure-track scientist at the Paul Scherrer Institute in Switzerland, and Nicholas Frattini, a graduate student in Devoret’s lab.

    Quantum computers have the potential to transform an array of industries, from pharmaceuticals to financial services, by enabling calculations that are orders of magnitude faster than today’s supercomputers.

    Yale — led by Devoret, Robert Schoelkopf, and Steven Girvin — continues to build upon two decades of groundbreaking quantum research. Yale’s approach to building a quantum computer is called “circuit QED” and employs particles of microwave light (photons) in a superconducting microwave resonator.

    In a traditional computer, information is encoded as either 0 or 1. The only errors that crop up during calculations are “bit-flips,” when a bit of information accidentally flips from 0 to 1 or vice versa. The way to correct it is by building in redundancy: using three “physical” bits of information to ensure one “effective” — or accurate — bit.
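    The three-bit redundancy scheme just described is the classical repetition code. Here is a minimal sketch (illustrative only, and purely classical – not the Yale team’s quantum scheme): one logical bit is stored as three physical bits, and any single bit-flip is corrected by majority vote.

    ```python
    # Classical repetition code: one "effective" bit held in three physical bits.
    def encode(bit):
        return [bit, bit, bit]

    def flip(bits, i):
        # Simulate a bit-flip error on physical bit i.
        bits = list(bits)
        bits[i] ^= 1
        return bits

    def decode(bits):
        # Majority vote recovers the logical bit despite one flipped copy.
        return 1 if sum(bits) >= 2 else 0

    noisy = flip(encode(1), 0)  # one copy flipped: [0, 1, 1]
    print(decode(noisy))        # 1 -- the error is corrected
    ```

    Two simultaneous flips would defeat the vote, which is why greater redundancy (more physical bits, or qubits) is the usual route to more protection.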

    In contrast, quantum information bits — qubits — are subject to both bit-flips and “phase-flips,” in which a qubit randomly flips between quantum superpositions (when two opposite states exist simultaneously).

    Until now, quantum researchers have tried to fix errors by adding greater redundancy, requiring an abundance of physical qubits for each effective qubit.

    Enter the cat qubit — named for Schrödinger’s cat, the famous paradox used to illustrate the concept of superposition.

    The idea is that a cat is placed in a sealed box with a radioactive source and a poison that will be triggered if an atom of the radioactive substance decays. The superposition theory of quantum physics suggests that until someone opens the box, the cat is both alive and dead, a superposition of states. Opening the box to observe the cat causes it to abruptly change its quantum state randomly, forcing it to be either alive or dead.

    “Our work flows from a new idea. Why not use a clever way to encode information in a single physical system so that one type of error is directly suppressed?” Devoret asked.

    Unlike the multiple physical qubits needed to maintain one effective qubit, a single cat qubit can prevent phase flips all by itself. The cat qubit encodes an effective qubit into superpositions of two states within a single electronic circuit — in this case a superconducting microwave resonator whose oscillations correspond to the two states of the cat qubit.
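    A rough sketch of the cat-state idea follows (illustrative only, not the Yale device or its error-correction protocol): superposing two coherent states of the resonator, |alpha> and |-alpha>, leaves amplitude only in even photon numbers. That photon-number-parity symmetry is the kind of structure a cat code can exploit to suppress one error type.

    ```python
    import numpy as np
    from math import factorial

    def coherent(alpha, dim=20):
        # Fock-basis amplitudes of a coherent state:
        # c_n = exp(-|alpha|^2 / 2) * alpha^n / sqrt(n!)
        n = np.arange(dim)
        amps = alpha ** n / np.sqrt([factorial(k) for k in n])
        return np.exp(-abs(alpha) ** 2 / 2) * amps

    alpha = 2.0
    # "Even cat": superposition of the two coherent states, renormalized.
    cat = coherent(alpha) + coherent(-alpha)
    cat /= np.linalg.norm(cat)

    # Odd photon-number amplitudes cancel exactly, so the even cat
    # lives entirely in even Fock states.
    print(np.allclose(cat[1::2], 0))  # True
    ```

    A single photon loss flips the cat’s parity from even to odd, so monitoring parity signals errors without destroying the encoded information.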

    “We achieve all of this by applying microwave frequency signals to a device that is not significantly more complicated than a traditional superconducting qubit,” Grimm said.

    The researchers said they are able to change their cat qubit from any one of its superposition states to any other superposition state, on command. In addition, the researchers developed a new way of reading out — or identifying — the information encoded into the qubit.

    “This makes the system we have developed a versatile new element that will hopefully find its use in many aspects of quantum computation with superconducting circuits,” Devoret said.

    Co-authors of the study are Girvin, Shruti Puri, Shantanu Mundhada, and Steven Touzard, all of Yale; Mazyar Mirrahimi of Inria Paris; and Shyam Shankar of the University of Texas-Austin.

    The United States Department of Defense, the United States Army Research Office, and the National Science Foundation funded the research.


    Physicists can predict the jumps of Schrödinger’s cat (and finally save it)
    In quantum computing, doubling down on Schrödinger’s cat

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Yale University comprises three major academic components: Yale College (the undergraduate program), the Graduate School of Arts and Sciences, and the professional schools. In addition, Yale encompasses a wide array of centers and programs, libraries, museums, and administrative support offices. Approximately 11,250 students attend Yale.

  • richardmitnick 4:19 pm on March 6, 2019 Permalink | Reply
    Tags: An atom-defect hybrid quantum system, , , Coherence in quantum behavior, If you can see things on smaller scales with better sensitivity than anybody else you’re going to find new physics, In the experiment we will have an atom on the diamond surface that couples to a shallow subsurface NV center inside the material in a highly controlled cryogenic and ultra-high vacuum environment, Key to this technology is the nitrogen-vacancy (NV) center in diamond an extensively studied point defect in diamond’s carbon atom lattice, , , , Superposition, The physical and materials knowledge gained by mastering the interface of such a hybrid system would contribute to the development of quantum computing systems, The technique is reminiscent of molecular beam epitaxy (MBE) a method of “growing” a material atom-by-atom on a substrate, This project is a “natural fit” for UC Santa Barbara say the researchers due to the campus’s strengths in both physics and materials sciences, To Hold Without Touching,   

    From UC Santa Barbara: “Sensing Disturbances in the Force” 

    UC Santa Barbara Name bloc
    From UC Santa Barbara

    March 5, 2019
    Sonia Fernandez

    UC Santa Barbara researchers receive U.S. Department of Energy grant to build atom-defect hybrid quantum sensor.


    It will be a feat of engineering and physics at the smallest scales, but it could open the biggest doors — to new science and more advanced technologies. UC Santa Barbara physicists Ania Jayich and David Weld, and materials scientist Kunal Mukherjee, are teaming up to build an atom-defect hybrid quantum system — a sensor technology that would use the power of quantum science to unlock the mysteries of the atomic and subatomic world.

    “We’re at this tipping point where we know there’s a lot of impactful and fundamentally exciting things we can do,” said Jayich, whose research investigates quantum effects at the nanoscale. The $1.5 million grant from the Department of Energy’s Office of Basic Energy Sciences will kickstart the development of a system that will allow researchers an unusually high level of control over atoms while simultaneously leaving their “quantumness” untouched.

    “In this whole field of quantum technology, that has been the big challenge,” Jayich said. In the quirky and highly unintuitive world of quantum mechanics, she explained, objects can exist in a superposition of many places at once, and entangled elements separated by thousands of miles can be inextricably linked — phenomena which, in turn, have opened up new and powerful possibilities for areas such as sensing, computing and the deepest investigations of nature.
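
    The superposition idea Jayich describes can be made concrete with a toy numerical sketch. The following is a generic textbook illustration (not code from the UCSB project): a single qubit in an equal superposition of two states, where each measurement yields one definite outcome and only the statistics over many measurements reveal the underlying 50/50 amplitudes.

    ```python
    import math
    import random
    from collections import Counter

    # Toy illustration of superposition (standard textbook qubit, not the
    # researchers' system): the state (|0> + |1>) / sqrt(2) has equal
    # amplitudes, and the Born rule turns |amplitude|^2 into a probability.
    amp0 = amp1 = 1 / math.sqrt(2)
    probs = {0: amp0**2, 1: amp1**2}   # each ~0.5

    # Each simulated "measurement" collapses to a single definite outcome;
    # the superposition only shows up in the statistics of many runs.
    random.seed(0)
    counts = Counter(random.choices([0, 1],
                                    weights=[probs[0], probs[1]],
                                    k=10_000))
    print(probs)
    print(counts)   # roughly 5000 each
    ```
    
    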

    However, the coherence that is the signature of these quantum behaviors — a state of information that is the foundation of quantum technology — is exceedingly fragile and fleeting.

    “Quantum coherence is such a delicate phenomenon,” Jayich said. “Any uncontrolled interaction with the environment will kill it. And that’s the whole challenge behind advancing this field — how do we preserve the very delicate quantumness of an atom or defect, or anything?” To study a quantum element such as an atom, one would have to interrogate it, she explained, but the act of measuring can also destroy its quantum nature.

    To Hold Without Touching

    Fortunately, Jayich and colleagues see a way around this conundrum.

    “It’s a hybrid atomic- and solid-state system,” Jayich said. Key to this technology is the nitrogen-vacancy (NV) center in diamond, an extensively studied point defect in diamond’s carbon lattice. The NV center consists of a lattice vacancy (a missing carbon atom) adjacent to a nitrogen atom that substitutes for a carbon atom. With its unpaired electrons, it is highly sensitive to external perturbations, such as the minute magnetic or electric fields produced by individual atoms of interest.
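
    The NV center’s sensitivity to tiny magnetic fields can be sketched with standard textbook numbers: the ground-state spin has a zero-field splitting of roughly 2.87 GHz, and a field along the NV axis shifts the two spin resonances apart at roughly 28 GHz per tesla. The back-of-envelope calculation below is a generic illustration of that physics, not the team’s analysis or code.

    ```python
    # Generic NV magnetometry sketch (textbook values, assumed here for
    # illustration): a magnetic field B along the NV symmetry axis shifts
    # the ms = +1 and ms = -1 spin levels by +/- gamma * B around the
    # zero-field splitting D, so the two resonance lines split by 2*gamma*B.

    D_GHZ = 2.87            # zero-field splitting, GHz (approx.)
    GAMMA_GHZ_PER_T = 28.0  # electron gyromagnetic ratio, GHz/T (approx.)

    def nv_resonances_ghz(b_tesla):
        """Return (f_minus, f_plus): the two ODMR resonance frequencies in
        GHz for a field b_tesla applied along the NV axis."""
        shift = GAMMA_GHZ_PER_T * b_tesla
        return D_GHZ - shift, D_GHZ + shift

    # Even a 1-microtesla field splits the two lines by ~56 kHz, a shift
    # that spin-resonance readout can resolve -- the basis of sensing
    # minute fields from nearby atoms.
    f_minus, f_plus = nv_resonances_ghz(1e-6)
    splitting_khz = (f_plus - f_minus) * 1e6   # GHz -> kHz
    print(f_minus, f_plus, splitting_khz)
    ```

    In practice the splitting also depends on the angle between the field and the NV axis; the on-axis case above is the simplest limit.
    
    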

    “In the proposed experiment, we would have an atom on the diamond surface that couples to a shallow, subsurface NV center inside the material, in a highly controlled, cryogenic and ultra-high vacuum environment,” Jayich explained. The diamond surface provides a natural trapping that allows researchers to more easily hold the atom in place — a challenge for many quantum scientists who want to trap individual atoms. Further, upon reading the state of the defect, one could understand the quantum properties of the atom under interrogation — without touching the atom itself and destroying its coherence.

    Previous methods aimed at interrogating individual adatoms (adsorbed atoms) relied on passing current through the atoms and necessitated metal surfaces, both of which, according to Jayich, reduce quantum coherence times.

    “The past several decades of work in atomic physics have resulted in tools that allow exquisite quantum control of all degrees of freedom of atomic ensembles, but typically only when the atoms are gently held in a vacuum far away from all other matter,” added Weld. “This experiment seeks to extend this level of control into a much messier but also much more technologically relevant regime, by manipulating and sensing individual atoms that are chemically bonded to a solid surface.”

    With the hybrid system, Jayich said, it would be “very easy to talk to the NV center defect with light, and the atoms have the benefit of retaining quantum information for very long periods of time. So we have a system where we leverage the best of both worlds — the best of the atom and the best of the defect — and put them together in a way that’s functional.”

    A Foundation for Future Quantum Tech

    Looking forward, the state-of-the-art spatial resolution and sensitivity of this atom-defect hybrid quantum system could offer researchers the deepest look yet at the workings of individual atoms, or the structures of molecules, at nanometer and angstrom scales.

    “If you can see things on smaller scales with better sensitivity than anybody else, you’re going to find new physics,” Jayich said. The connections of microscopic structure to macroscopic behavior in materials synthesis could be elucidated. Quantum phenomena in condensed matter systems could be probed. Proteins that have evaded structural determination — such as membrane proteins — could be studied.

    This project is a “natural fit” for UC Santa Barbara, say the researchers, due to the campus’s strengths in both physics and materials sciences. The technique is reminiscent of molecular beam epitaxy (MBE), a method of “growing” a material atom-by-atom on a substrate.

    “There is a strong tradition of materials deposition at UCSB, ranging from metals and semiconductors to novel electronic materials,” Mukherjee said of the campus’s long record of materials growth and world-class MBE facilities. Among the first atoms they intend to study are rare-earth types such as holmium or dysprosium, “as they have unpaired electrons which are protected from environmental interactions by the atomic structure,” noted Mukherjee, adding that he is “particularly excited” about the challenge of removing the atoms from and resetting the diamond surface without breaking vacuum.

    Additionally, the physical and materials knowledge gained by mastering the interface of such a hybrid system would contribute to the development of quantum computing systems. According to Jayich, future practicable quantum computers would likely be a hybrid of several elements, similar to how conventional computers are a mix of magnetic, electronic and solid-state components.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    UC Santa Barbara Seal
    The University of California, Santa Barbara (commonly referred to as UC Santa Barbara or UCSB) is a public research university and one of the 10 general campuses of the University of California system. Founded in 1891 as an independent teachers’ college, UCSB joined the University of California system in 1944 and is the third-oldest general-education campus in the system. The university is a comprehensive doctoral university and is organized into five colleges offering 87 undergraduate degrees and 55 graduate degrees. In 2012, UCSB was ranked 41st among “National Universities” and 10th among public universities by U.S. News & World Report. UCSB houses twelve national research centers, including the renowned Kavli Institute for Theoretical Physics.
