Tagged: Quantum theory Toggle Comment Threads | Keyboard Shortcuts

  • richardmitnick 4:44 am on June 2, 2022 Permalink | Reply
    Tags: "GPTs": generalized probabilistic theories, operational theories in which classical and quantum physics are special cases., "Physicists wonder what comes after quantum?", A new approach allows data to inform an interpretation theory., A quantum bit or qubit can be both 0 and 1 and is a two-level system., In this experiment the team investigated a three-level system where the bits have three degrees of freedom rather than two. The quantum analog of a three-level system is called a qutrit., Quantum theory, This research identified quantitative boundaries on the scope of possible deviations from quantum theory for three-level systems.

    From The University of Waterloo (CA): “Physicists wonder what comes after quantum?” 

    From The University of Waterloo (CA)

    May 18, 2022 [Just today in social media.]

    Quantum theory, the physics of the very small, helps us to understand nature and our world by explaining and predicting the behaviour of atoms and molecules. Researchers at the Institute for Quantum Computing (IQC) are interested in what comes after quantum theory, specifically the possibility of a broader theory that replaces quantum theory as a more complete description of nature.

    In 1900, while studying radiation, Max Planck observed that energy could behave in a way not consistent with classical physics. Twenty years later a fuller understanding of matter emerged. Based on the research of physicists like Bohr, Schrödinger, and Heisenberg, this new theory, quantum theory, accounted for the unpredictable behaviour Planck had observed two decades before. In the same way that quantum physics built on our understanding of classical physics, a novel, post-quantum theory may build on our current understanding of quantum physics.

    As a master’s student with the Department of Physics and Astronomy and IQC, Michael Grabowecky was interested in exploring any potential deviations from quantum theory and identifying restrictions on any new potential theories.

    To test quantum theory against possible alternatives, a neutral, theory-agnostic approach was needed: one that lets the data inform the theory. The team designed an experiment to collect a large amount of data from a three-level system, then worked out a theory directly from the data obtained.

    “We do not assume any particular theory to be true before conducting the experiment. We want to make as little assumptions as possible, and we definitely don’t want to assume that quantum mechanics is true,” said Grabowecky. “The whole purpose of these kind of experiments is to let the statistics and the photons speak for themselves.”

    To minimize the experimental assumptions and take a theory-agnostic approach, the team used the framework of generalized probabilistic theories (GPTs). GPTs are operational theories in which classical and quantum physics are special cases. The team used GPTs because they require minimal assumptions and can be used to avoid any inherent quantum biases when conducting an experiment.
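
    The GPT framework can be stated compactly: a preparation is represented by a state vector, a measurement outcome by an effect vector, and the predicted probability of that outcome is their inner product. A minimal numerical sketch of this rule (the vectors below are made up for illustration, not data from the experiment):

    ```python
    import numpy as np

    # In a generalized probabilistic theory (GPT), a preparation is a state
    # vector s and a measurement outcome is an effect vector e; the predicted
    # probability of the outcome is the inner product p = e . s.
    # These particular vectors are illustrative only.
    s = np.array([1.0, 0.6, 0.0, 0.3])                # a GPT state (s[0] = 1 normalizes it)
    e_outcome = np.array([0.5, 0.25, 0.0, 0.5])       # effect for one outcome
    e_complement = np.array([0.5, -0.25, 0.0, -0.5])  # the "unit effect" minus e_outcome

    p = float(e_outcome @ s)
    p_rest = float(e_complement @ s)
    print(p, p_rest, p + p_rest)  # the two outcome probabilities sum to 1
    ```

    Quantum theory is recovered when the states and effects come from density matrices and measurement operators; the point of the experiment is that the data alone, fitted in this language, need not single out the quantum case.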

    A digital computer stores and processes information using bits, which can be either 0 or 1. A quantum bit, or qubit, is a two-level system that can be in a superposition of both 0 and 1. In this experiment the team investigated a three-level system, in which each unit of information has three levels rather than two. The quantum analog of a three-level system is called a qutrit.
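
    The jump from two to three levels can be made concrete with state vectors. A minimal NumPy sketch (the states are illustrative, not the preparations used in the experiment):

    ```python
    import numpy as np

    # A qubit is a vector in a 2-dimensional complex space; a qutrit lives
    # in a 3-dimensional one. Any normalized vector is a valid pure state.
    qubit = np.array([1, 1]) / np.sqrt(2)      # superposition of 0 and 1
    qutrit = np.array([1, 1, 1]) / np.sqrt(3)  # superposition of 0, 1 and 2

    # Born rule: measurement probabilities are the squared amplitudes.
    p_qubit = np.abs(qubit) ** 2    # 0 and 1 each with probability 1/2
    p_qutrit = np.abs(qutrit) ** 2  # 0, 1 and 2 each with probability 1/3
    print(p_qubit, p_qutrit)
    ```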

    “We prepared a three-level system in a wide variety of ways and on each of those preparations, we performed a large number of measurements. The statistics associated with these random preparations and measurements were used to construct a physical theory describing our system,” said Grabowecky.

    The experiment found that quantum theory works well in describing the obtained data, but a broader theory beyond quantum may be possible. Furthermore, this research identified quantitative boundaries on the scope of possible deviations from quantum theory for three-level systems.

    Grabowecky, now the Quantum Technology Lab Coordinator at IQC, is excited by the potential of this research.

    The experimental data sets from this research can be used to test future theories that may supersede quantum theory and advance fundamental research.

    The science paper was accepted on February 17, 2022, and published in Physical Review A on March 10, 2022.

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

    U Waterloo campus

    In just half a century, The University of Waterloo (CA), located at the heart of Canada’s technology hub, has become a leading comprehensive university with nearly 36,000 full- and part-time students in undergraduate and graduate programs.

    Consistently ranked Canada’s most innovative university, Waterloo is home to advanced research and teaching in science and engineering, mathematics and computer science, health, environment, arts and social sciences. From quantum computing and nanotechnology to clinical psychology and health sciences research, Waterloo brings ideas and brilliant minds together, inspiring innovations with real impact today and in the future.

    As home to the world’s largest post-secondary co-operative education program, Waterloo embraces its connections to the world and encourages enterprising partnerships in learning, research, and commercialization. With campuses and education centres on four continents, and academic partnerships spanning the globe, Waterloo is shaping the future of the planet.

  • richardmitnick 1:47 pm on October 8, 2021 Permalink | Reply
    Tags: "Fermilab boasts new Theory Division", Astrophysics Theory, Fermilab experts on perturbative QCD use high-performance computing to tackle the complexity of simulations for experiments at the Large Hadron Collider., Muon g-2 Theory Initiative and the Muon g-2 experiment, Particle Theory, Quantum theory, Superconducting Systems

    From DOE’s Fermi National Accelerator Laboratory (US): “Fermilab boasts new Theory Division”

    FNAL Art Image by Angela Gonzales

    From DOE’s Fermi National Accelerator Laboratory (US), an enduring source of strength for the US contribution to scientific research worldwide.

    October 8, 2021

    Theoretical physics research at Fermi National Accelerator Laboratory has always sparked new ideas and scientific opportunities, while at the same time supporting the large experimental group that conducts research at Fermilab. In recent years, the Theoretical Physics Department has further strengthened its position worldwide as a hub for the high-energy physics theory community. The department has now become Fermilab’s newest division, the Theory Division, which officially launched early this year with strong support from HEP.

    This new division seeks to:

    support strategic theory leadership;
    promote new initiatives, as well as strengthen existing ones;
    and leverage U.S. Department of Energy support through partnerships with universities and more.

    “Creating the Theory Division increases the lab’s abilities to stimulate and develop new pathways to discovery,” said Fermilab Director Nigel Lockyer.

    Led by Marcela Carena and her deputy Patrick Fox, this new division features three departments: Particle Theory, Astrophysics Theory and Quantum Theory. “This structure will help us focus our scientific efforts in each area and will allow for impactful contributions to existing and developing programs for the theory community,” said Carena.

    Particle Theory Department

    At the helm of the Particle Theory Department is Andreas Kronfeld. This department studies all aspects of theoretical particle physics, especially those areas inspired by the experimental program—at Fermilab and elsewhere. It coordinates leading national efforts, including the Neutrino Theory Network, and the migration of the lattice gauge theory program to Exascale computing platforms. Lattice quantum chromodynamics, or QCD, experts support the Muon g-2 Theory Initiative, providing a solid theory foundation for the recently announced results of the Muon g-2 experiment.

    Fermilab particle theorists, working with DOE’s Argonne National Laboratory (US) nuclear theorists, are using machine learning to develop novel event generators that precisely model neutrino-nucleus interactions, and employing lattice QCD to model multi-nucleon interactions; both are important for achieving the science goals of DUNE.

    Fermilab experts on perturbative QCD use high-performance computing to tackle the complexity of simulations for experiments at the Large Hadron Collider. Fermilab theorists are strongly involved in the exploration of physics beyond the Standard Model, through model-building, particle physics phenomenology, and formal aspects of quantum field theory.

    Astrophysics Theory Department

    Astrophysics Theory, led by Dan Hooper, consists of researchers who work at the confluence of astrophysics, cosmology and particle physics. Fermilab’s scientists have played a key role in the development of this exciting field worldwide and continue to be deeply involved in supporting the Fermilab cosmic frontier program.

    Key areas of research include dark matter, dark energy, the cosmic microwave background, large-scale structure, neutrino astronomy and axion astrophysics. A large portion of the department’s research involves numerical cosmological simulations of galaxy formation, large-scale structures and gravitational lensing. The department is developing machine-learning tools to help solve these challenging problems.

    Quantum Theory Department

    Led by Roni Harnik, the Quantum Theory Department has researchers working at the interface of quantum information science and high-energy physics. Fermilab theorists are working to harness the developing power of unique quantum information capabilities to address important physics questions, such as the simulation of QCD processes, dynamics in the early universe, and more generally simulating quantum field theories. Quantum-enhanced capabilities also open new opportunities to explore the universe and test theories of new particles, dark matter, gravitational waves and other new physics.

    Scientists in the Quantum Theory Department are developing new algorithms for quantum simulations, and they are proposing novel methods to search for new phenomena using quantum technology, including quantum optics, atomic physics, optomechanical sensors and superconducting systems. The department works in close collaboration with both the Fermilab Superconducting Quantum Materials and Systems Center and the Fermilab Quantum Institute, and leads a national QuantISED theory consortium.

    Looking ahead

    The new Theory Division also intends to play a strong role in attracting and inspiring the next generation of theorists, training them in a data-rich environment, as well as promoting an inclusive culture that values diversity.

    “The best part about being a Fermilab theorist,” said Marcela Carena, “is working with brilliant junior scientists and sharing their excitement about exploring new ideas.”

    See the full article here.



    Fermi National Accelerator Laboratory (US), located just outside Batavia, Illinois, near Chicago, is a United States Department of Energy national laboratory specializing in high-energy particle physics. Since 2007, Fermilab has been operated by the Fermi Research Alliance, a joint venture of the University of Chicago and the Universities Research Association (URA). Fermilab is a part of the Illinois Technology and Research Corridor.

    Fermilab’s Tevatron was a landmark particle accelerator; until the startup in 2008 of the Large Hadron Collider (CH) near Geneva, Switzerland, it was the most powerful particle accelerator in the world, accelerating protons and antiprotons to energies of up to 980 GeV and producing proton-antiproton collisions with energies of up to 1.96 TeV. It was the first accelerator to reach one “tera-electron-volt” energy. At 3.9 miles (6.3 km), it was the world’s fourth-largest particle accelerator in circumference. One of its most important achievements was the 1995 discovery of the top quark, announced by research teams using the Tevatron’s CDF and DØ detectors. It was shut down in 2011.

    In addition to high-energy collider physics, Fermilab hosts fixed-target and neutrino experiments, such as MicroBooNE (Micro Booster Neutrino Experiment), NOνA (NuMI Off-Axis νe Appearance) and SeaQuest. Completed neutrino experiments include MINOS (Main Injector Neutrino Oscillation Search), MINOS+, MiniBooNE and SciBooNE (SciBar Booster Neutrino Experiment). The MiniBooNE detector was a 40-foot (12 m) diameter sphere containing 800 tons of mineral oil lined with 1,520 phototube detectors. An estimated 1 million neutrino events were recorded each year. SciBooNE sat in the same neutrino beam as MiniBooNE but had fine-grained tracking capabilities. The NOνA experiment uses, and the MINOS experiment used, Fermilab’s NuMI (Neutrinos at the Main Injector) beam, which is an intense beam of neutrinos that travels 455 miles (732 km) through the Earth to the Soudan Mine in Minnesota and the Ash River, Minnesota, site of the NOνA far detector. In 2017, the ICARUS neutrino experiment was moved from CERN to Fermilab.

    In the public realm, Fermilab is home to a native prairie ecosystem restoration project and hosts many cultural events: public science lectures and symposia, classical and contemporary music concerts, folk dancing and arts galleries. The site is open from dawn to dusk to visitors who present valid photo identification.

    Asteroid 11998 Fermilab is named in honor of the laboratory.

    Weston, Illinois, was a community next to Batavia voted out of existence by its village board in 1966 to provide a site for Fermilab.

    The laboratory was founded in 1969 as the National Accelerator Laboratory; it was renamed in honor of Enrico Fermi in 1974. The laboratory’s first director was Robert Rathbun Wilson, under whom the laboratory opened ahead of time and under budget. Many of the sculptures on the site are of his creation. He is the namesake of the site’s high-rise laboratory building, whose unique shape has become the symbol for Fermilab and which is the center of activity on the campus.

    After Wilson stepped down in 1978 to protest the lack of funding for the lab, Leon M. Lederman took on the job. It was under his guidance that the original accelerator was replaced with the Tevatron, an accelerator capable of colliding protons and antiprotons at a combined energy of 1.96 TeV. Lederman stepped down in 1989. The science education center at the site was named in his honor.

    The later directors include:

    John Peoples, 1989 to 1996
    Michael S. Witherell, July 1999 to June 2005
    Piermaria Oddone, July 2005 to July 2013
    Nigel Lockyer, September 2013 to the present

    Fermilab continues to participate in the work at the Large Hadron Collider (LHC); it serves as a Tier 1 site in the Worldwide LHC Computing Grid.

    DOE’s Fermi National Accelerator Laboratory (US)/MINERvA. Reidar Hahn.

    FNAL. Don Lincoln.

  • richardmitnick 1:52 pm on June 18, 2021 Permalink | Reply
    Tags: "Mathematicians Prove 2D Version of Quantum Gravity Really Works", A trilogy of landmark publications, "Liouville field" - see the description in the full blog post., DOZZ formula: a finding of Harald Dorn; Hans-Jörg Otto; Alexei Zamolodchikov; Alexander Zamolodchikov, Fields are central to quantum physics too; however the situation here is more complicated due to the deep randomness of quantum theory., In classical physics for example a single field tells you everything about how a force pushes objects around., In physics today the main actors in the most successful theories are fields., QFT: Quantum Field Theory - a model of how one or more quantum fields each with their infinite variations act and interact., Quantum theory

    From Quanta Magazine: “Mathematicians Prove 2D Version of Quantum Gravity Really Works”

    From Quanta Magazine

    June 17, 2021
    Charlie Wood

    In three towering papers, a team of mathematicians has worked out the details of Liouville quantum field theory, a two-dimensional model of quantum gravity.

    Credit: Olena Shmahalo/Quanta Magazine.

    Alexander Polyakov, a theoretical physicist now at Princeton University (US), caught a glimpse of the future of quantum theory in 1981. A range of mysteries, from the wiggling of strings to the binding of quarks into protons, demanded a new mathematical tool whose silhouette he could just make out.

    “There are methods and formulae in science which serve as master keys to many apparently different problems,” he wrote in the introduction to a now famous four-page letter in Physics Letters B. “At the present time we have to develop an art of handling sums over random surfaces.”

    Polyakov’s proposal proved powerful. In his paper he sketched out a formula that roughly described how to calculate averages of a wildly chaotic type of surface, the “Liouville field.” His work brought physicists into a new mathematical arena, one essential for unlocking the behavior of theoretical objects called strings and building a simplified model of quantum gravity.

    Years of toil would lead Polyakov to breakthrough solutions for other theories in physics, but he never fully understood the mathematics behind the Liouville field.

    Over the last seven years, however, a group of mathematicians has done what many researchers thought impossible. In a trilogy of landmark publications, they have recast Polyakov’s formula using fully rigorous mathematical language and proved that the Liouville field flawlessly models the phenomena Polyakov thought it would.

    Vincent Vargas of the National Centre for Scientific Research [Centre national de la recherche scientifique] (CNRS) (FR) and his collaborators have achieved a rare feat: a strongly interacting quantum field theory perfectly described by a brief mathematical formula.

    “It took us 40 years in math to make sense of four pages,” said Vincent Vargas, a mathematician at the French National Centre for Scientific Research and co-author of the research with Rémi Rhodes of Aix-Marseille University [Aix-Marseille Université] (FR), Antti Kupiainen of the University of Helsinki [Helsingin yliopisto; Helsingfors universitet] (FI), François David of the French National Centre for Scientific Research [Centre national de la recherche scientifique] (CNRS) (FR), and Colin Guillarmou of Paris-Saclay University [Université Paris-Saclay] (FR).

    The three papers forge a bridge between the pristine world of mathematics and the messy reality of physics — and they do so by breaking new ground in the mathematical field of probability theory. The work also touches on philosophical questions regarding the objects that take center stage in the leading theories of fundamental physics: quantum fields.

    “This is a masterpiece in mathematical physics,” said Xin Sun, a mathematician at the University of Pennsylvania (US).

    Infinite Fields

    In physics today the main actors in the most successful theories are fields — objects that fill space, taking on different values from place to place.

    In classical physics for example a single field tells you everything about how a force pushes objects around. Take Earth’s magnetic field: The twitches of a compass needle reveal the field’s influence (its strength and direction) at every point on the planet.

    Fields are central to quantum physics too; however the situation here is more complicated due to the deep randomness of quantum theory. From the quantum perspective, Earth doesn’t generate one magnetic field, but rather an infinite number of different ones. Some look almost like the field we observe in classical physics, but others are wildly different.

    But physicists still want to make predictions — predictions that ideally match, in this case, what a mountaineer reads on a compass. Assimilating the infinite forms of a quantum field into a single prediction is the formidable task of a “quantum field theory,” or QFT. This is a model of how one or more quantum fields each with their infinite variations act and interact.

    Driven by immense experimental support, QFTs have become the basic language of particle physics. The Standard Model is one such QFT, depicting fundamental particles like electrons as fuzzy bumps that emerge from an infinitude of electron fields. It has passed every experimental test to date (although various groups may be on the verge of finding the first holes).

    Physicists play with many different QFTs. Some, like the Standard Model, aspire to model real particles moving through the four dimensions of our universe (three spatial dimensions plus one dimension of time). Others describe exotic particles in strange universes, from two-dimensional flatlands to six-dimensional uber-worlds. Their connection to reality is remote, but physicists study them in the hopes of gaining insights they can carry back into our own world.

    Polyakov’s Liouville field theory is one such example.


    Gravity’s Field

    The Liouville field, which is based on an equation from complex analysis developed in the 1800s by the French mathematician Joseph Liouville, describes a completely random two-dimensional surface — that is, a surface, like Earth’s crust, but one in which the height of every point is chosen randomly. Such a planet would erupt with mountain ranges of infinitely tall peaks, each assigned by rolling a die with infinite faces.

    Such an object might not seem like an informative model for physics, but randomness is not devoid of patterns. The bell curve, for example, tells you how likely you are to randomly pass a seven-foot basketball player on the street. Similarly, bulbous clouds and crinkly coastlines follow random patterns, but it’s nevertheless possible to discern consistent relationships between their large-scale and small-scale features.
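
    The bell-curve example can be made quantitative. Assuming adult heights are roughly normally distributed with a mean of 175 cm and a standard deviation of 7.5 cm (illustrative figures, not from the article), the chance of a seven-footer falls out of the Gaussian tail:

    ```python
    import math

    def normal_tail(x, mu, sigma):
        """P(X > x) for a normal distribution, via the complementary error function."""
        return 0.5 * math.erfc((x - mu) / (sigma * math.sqrt(2)))

    # Seven feet is about 213 cm; the mean and spread below are assumptions
    # chosen for illustration.
    p_seven_footer = normal_tail(213.0, 175.0, 7.5)
    print(p_seven_footer)  # a few in ten million: tall outliers are very rare
    ```

    The same logic, applied to heights of points on a random surface rather than heights of people, is what a theory like Liouville’s has to deliver.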

    Liouville theory can be used to identify patterns in the endless landscape of all possible random, jagged surfaces. Polyakov realized this chaotic topography was essential for modeling strings, which trace out surfaces as they move. The theory has also been applied to describe quantum gravity in a two-dimensional world. Einstein defined gravity as space-time’s curvature, but translating his description into the language of quantum field theory creates an infinite number of space-times — much as the Earth produces an infinite collection of magnetic fields. Liouville theory packages all those surfaces together into one object. It gives physicists the tools to measure the curvature — and hence, gravitation — at every location on a random 2D surface.

    “Quantum gravity basically means random geometry, because quantum means random and gravity means geometry,” said Sun.

    Polyakov’s first step in exploring the world of random surfaces was to write down an expression defining the odds of finding a particular spiky planet, much as the bell curve defines the odds of meeting someone of a particular height. But his formula did not lead to useful numerical predictions.

    To solve a quantum field theory is to be able to use the field to predict observations. In practice, this means calculating a field’s “correlation functions,” which capture the field’s behavior by describing the extent to which a measurement of the field at one point relates, or correlates, to a measurement at another point. Calculating correlation functions in the photon field, for instance, can give you the textbook laws of quantum electromagnetism.
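
    The idea of a two-point correlation function can be illustrated numerically: draw many configurations of a simple random field and average the product of its values at two points. A toy sketch (the sine-wave ensemble is invented for illustration; it is not the Liouville field):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy random "field": a sine wave on a 1D grid with random Gaussian
    # amplitude and random phase. Each draw is one field configuration.
    x = np.linspace(0, 2 * np.pi, 64, endpoint=False)

    def sample_field():
        return rng.normal() * np.sin(x + rng.uniform(0, 2 * np.pi))

    # Two-point correlation function: average field(x_0) * field(x_j)
    # over many sampled configurations.
    samples = np.stack([sample_field() for _ in range(20000)])
    two_point = (samples[:, :1] * samples).mean(axis=0)

    # For this ensemble the exact answer is 0.5 * cos(x_j - x_0); the
    # Monte Carlo estimate comes out close to it.
    print(np.max(np.abs(two_point - 0.5 * np.cos(x))))
    ```

    Solving a QFT means being able to produce such functions exactly, for any set of points, rather than by brute-force sampling.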

    Polyakov was after something more abstract: the essence of random surfaces, similar to the statistical relationships that make a cloud a cloud or a coastline a coastline. He needed the correlations between the haphazard heights of the Liouville field. Over the decades he tried two different ways of calculating them. He started with a technique called the Feynman path integral and ended up developing a workaround known as the bootstrap. Both methods came up short in different ways, until the mathematicians behind the new work united them in a more precise formulation.

    Add ’Em Up

    You might imagine that accounting for the infinitely many forms a quantum field can take is next to impossible. And you would be right. In the 1940s Richard Feynman, a quantum physics pioneer, developed one prescription for dealing with this bewildering situation, but the method proved severely limited.

    Take, again, Earth’s magnetic field. Your goal is to use quantum field theory to predict what you’ll observe when you take a compass reading at a particular location. To do this, Feynman proposed summing all the field’s forms together. He argued that your reading will represent some average of all the field’s possible forms. The procedure for adding up these infinite field configurations with the proper weighting is known as the Feynman path integral.

    It’s an elegant idea that yields concrete answers only for select quantum fields. No known mathematical procedure can meaningfully average an infinite number of objects covering an infinite expanse of space in general. The path integral is more of a physics philosophy than an exact mathematical recipe. Mathematicians question its very existence as a valid operation and are bothered by the way physicists rely on it.

    “I’m disturbed as a mathematician by something which is not defined,” said Eveliina Peltola, a mathematician at the University of Bonn [Rheinische Friedrich-Wilhelms-Universität Bonn] (DE) in Germany.

    Physicists can harness Feynman’s path integral to calculate exact correlation functions for only the most boring of fields — free fields, which do not interact with other fields or even with themselves. Otherwise, they have to fudge it, pretending the fields are free and adding in mild interactions, or “perturbations.” This procedure, known as perturbation theory, gets them correlation functions for most of the fields in the Standard Model, because nature’s forces happen to be quite feeble.

    But it didn’t work for Polyakov. Although he initially speculated that the Liouville field might be amenable to the standard hack of adding mild perturbations, he found that it interacted with itself too strongly. Compared to a free field, the Liouville field seemed mathematically inscrutable, and its correlation functions appeared unattainable.

    Up by the Bootstraps

    Polyakov soon began looking for a workaround. In 1984, he teamed up with Alexander Belavin and Alexander Zamolodchikov to develop a technique called the bootstrap — a mathematical ladder that gradually leads to a field’s correlation functions.

    To start climbing the ladder, you need a function which expresses the correlations between measurements at a mere three points in the field. This “three-point correlation function,” plus some additional information about the energies a particle of the field can take, forms the bottom rung of the bootstrap ladder.

    From there you climb one point at a time: Use the three-point function to construct the four-point function, use the four-point function to construct the five-point function, and so on. But the procedure generates conflicting results if you start with the wrong three-point correlation function in the first rung.

    Polyakov, Belavin and Zamolodchikov used the bootstrap to successfully solve a variety of simple QFT theories, but just as with the Feynman path integral, they couldn’t make it work for the Liouville field.

    Then in the 1990s two pairs of physicists — Harald Dorn and Hans-Jörg Otto, and Zamolodchikov and his brother Alexei — managed to hit on the three-point correlation function that made it possible to scale the ladder, completely solving the Liouville field (and its simple description of quantum gravity). Their result, known by their initials as the DOZZ formula, let physicists make any prediction involving the Liouville field. But even the authors knew they had arrived at it partially by chance, not through sound mathematics.

    “They were these kind of geniuses who guessed formulas,” said Vargas.

    Educated guesses are useful in physics, but they don’t satisfy mathematicians, who afterward wanted to know where the DOZZ formula came from. The equation that solved the Liouville field should have come from some description of the field itself, even if no one had the faintest idea how to get it.

    “It looked to me like science fiction,” said Kupiainen. “This is never going to be proven by anybody.”

    Taming Wild Surfaces

    In the early 2010s, Vargas and Kupiainen joined forces with the probability theorist Rémi Rhodes and the physicist François David. Their goal was to tie up the mathematical loose ends of the Liouville field — to formalize the Feynman path integral that Polyakov had abandoned and, just maybe, demystify the DOZZ formula.

    As they began, they realized that a French mathematician named Jean-Pierre Kahane had discovered, decades earlier, what would turn out to be the key to Polyakov’s master theory.

    “In some sense it’s completely crazy that Liouville was not defined before us,” Vargas said. “All the ingredients were there.”

    The insight led to three milestone papers in mathematical physics completed between 2014 and 2020.


    They first polished off the path integral, which had failed Polyakov because the Liouville field interacts strongly with itself, making it incompatible with Feynman’s perturbative tools. So instead, the mathematicians used Kahane’s ideas to recast the wild Liouville field as a somewhat milder random object known as the Gaussian free field. The peaks in the Gaussian free field don’t fluctuate to the same random extremes as the peaks in the Liouville field, making it possible for the mathematicians to calculate averages and other statistical measures in sensible ways.
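
    The Gaussian free field they used is concrete enough to sample on a computer: its Fourier modes are independent Gaussians whose variances come from the inverse Laplacian. A short sketch on a small periodic lattice (a standard textbook construction, not the authors' code):

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    N = 64  # lattice size

    # Eigenvalues of the discrete Laplacian on an N x N periodic lattice.
    k = 2 * np.pi * np.fft.fftfreq(N)
    lam = (2 - 2 * np.cos(k))[:, None] + (2 - 2 * np.cos(k))[None, :]

    # Gaussian free field: independent Gaussian Fourier modes with
    # variance proportional to 1/lambda_k (inverse Laplacian covariance).
    noise = rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))
    lam[0, 0] = 1.0    # placeholder to avoid dividing by zero
    coeff = noise / np.sqrt(lam)
    coeff[0, 0] = 0.0  # drop the zero mode (the surface's overall height)

    # Up to overall normalization, the real part is one random surface.
    field = np.real(np.fft.ifft2(coeff)) * N
    print(field.shape)  # heights are correlated across the grid, mean zero
    ```

    Each run produces one rough random surface; the Liouville field is obtained from such samples by the exponential reweighting that Kahane's theory makes rigorous.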

    “Somehow it’s all just using the Gaussian free field,” Peltola said. “From that they can construct everything in the theory.”

    In 2014, they unveiled their result: a new and improved version of the path integral Polyakov had written down in 1981, but fully defined in terms of the trusted Gaussian free field. It’s a rare instance in which Feynman’s path integral philosophy has found a solid mathematical execution.

    “Path integrals can exist, do exist,” said Jörg Teschner, a physicist at the German Electron Synchrotron.

    With a rigorously defined path integral in hand, the researchers then tried to see if they could use it to get answers from the Liouville field and to derive its correlation functions. The target was the mythical DOZZ formula — but the gulf between it and the path integral seemed vast.

    “We’d write in our papers, just for propaganda reasons, that we want to understand the DOZZ formula,” said Kupiainen.

    The team spent years prodding their probabilistic path integral, confirming that it truly had all the features needed to make the bootstrap work. As they did so, they built on earlier work by Teschner. Eventually, Vargas, Kupiainen and Rhodes succeeded with a paper posted in 2017 [Annals of Mathematics] and another in October 2020, with Colin Guillarmou. They derived DOZZ and other correlation functions from the path integral and showed that these formulas perfectly matched the equations physicists had reached using the bootstrap.

    “Now we’re done,” Vargas said. “Both objects are the same.”

    The work explains the origins of the DOZZ formula and connects the bootstrap procedure — which mathematicians had considered sketchy — with verified mathematical objects. Altogether, it resolves the final mysteries of the Liouville field.

    “It’s somehow the end of an era,” said Peltola. “But I hope it’s also the beginning of some new, interesting things.”

    New Hope for QFTs

    Vargas and his collaborators now have a unicorn on their hands, a strongly interacting QFT perfectly described in a nonperturbative way by a brief mathematical formula that also makes numerical predictions.

    Now the literal million-dollar question is: How far can these probabilistic methods go? Can they generate tidy formulas for all QFTs? Vargas is quick to dash such hopes, insisting that their tools are specific to the two-dimensional environment of Liouville theory. In higher dimensions, even free fields are too irregular, so he doubts the group’s methods will ever be able to handle the quantum behavior of gravitational fields in our universe.

    But the fresh minting of Polyakov’s “master key” will open other doors. Its effects are already being felt in probability theory, where mathematicians can now wield previously dodgy physics formulas with impunity. Emboldened by the Liouville work, Sun and his collaborators have already imported equations from physics to solve two problems regarding random curves.

    Physicists await tangible benefits too, further down the road. The rigorous construction of the Liouville field could inspire mathematicians to try their hand at proving features of other seemingly intractable QFTs — not just toy theories of gravity but descriptions of real particles and forces that bear directly on the deepest physical secrets of reality.

    “[Mathematicians] will do things that we can’t even imagine,” said Davide Gaiotto, a theoretical physicist at the Perimeter Institute for Theoretical Physics (CA).

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Formerly known as Simons Science News, Quanta Magazine is an editorially independent online publication launched by the Simons Foundation to enhance public understanding of science. Why Quanta? Albert Einstein called photons “quanta of light.” Our goal is to “illuminate science.” At Quanta Magazine, scientific accuracy is every bit as important as telling a good story. All of our articles are meticulously researched, reported, edited, copy-edited and fact-checked.

  • richardmitnick 12:28 pm on February 24, 2021 Permalink | Reply
    Tags: "Lack of symmetry in qubits can’t fix errors in quantum computing but might explain matter/antimatter imbalance", A new way to separate isotopes, , Hobbled by decoherence, Kibble-Zurek theory, , Quantum annealing computers, Quantum theory, The adiabatic theorem   

    From DOE’s Los Alamos National Laboratory(US): “Lack of symmetry in qubits can’t fix errors in quantum computing but might explain matter/antimatter imbalance” 

    LANL bloc

    From DOE’s Los Alamos National Laboratory(US)

    February 22, 2021

    A new paper seeking to cure a time restriction in quantum annealing computers instead opened up a class of new physics problems that can now be studied with quantum annealers without requiring they be too slow.

    A team of quantum theorists seeking to cure a basic problem with quantum annealing computers—they have to run at a relatively slow pace to operate properly—found something intriguing instead. While probing how quantum annealers perform when operated faster than desired, the team unexpectedly discovered a new effect that may account for the imbalanced distribution of matter and antimatter in the universe and a novel approach to separating isotopes.

    “Although our discovery did not cure the annealing time restriction, it brought a class of new physics problems that can now be studied with quantum annealers without requiring they be too slow,” said Nikolai Sinitsyn, a theoretical physicist at Los Alamos National Laboratory. Sinitsyn is author of the paper published Feb. 19 in Physical Review Letters, with coauthors Bin Yan and Wojciech Zurek, both also of Los Alamos, and Vladimir Chernyak of Wayne State University(US).

    Significantly, this finding hints at how at least two famous scientific problems may be resolved in the future. The first one is the apparent asymmetry between matter and antimatter in the universe.

    “We believe that small modifications to recent experiments with quantum annealing of interacting qubits made of ultracold atoms across phase transitions will be sufficient to demonstrate our effect,” Sinitsyn said.

    Explaining the matter/antimatter discrepancy

    Both matter and antimatter resulted from the energy excitations that were produced at the birth of the universe. The symmetry between how matter and antimatter interact was broken, but only very weakly. It is still not completely clear how this subtle difference could lead to the large observed dominance of matter over antimatter at the cosmological scale.

    The newly discovered effect demonstrates that such an asymmetry is physically possible. It happens when a large quantum system passes through a phase transition, that is, a very sharp rearrangement of quantum state. In such circumstances, strong but symmetric interactions roughly compensate each other. Then subtle, lingering differences can play the decisive role.

    Making quantum annealers slow enough

    Quantum annealing computers are built to solve complex optimization problems by associating variables with quantum states or qubits. Unlike a classical computer’s binary bits, which can only take a value of 0 or 1, qubits can exist in a quantum superposition of both values at once. That’s where all quantum computers derive their awesome, if still largely unexploited, powers.

    In a quantum annealing computer, the qubits are initially prepared in a simple lowest energy state by applying a strong external magnetic field. This field is then slowly switched off, while the interactions between the qubits are slowly switched on.
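    This schedule can be sketched numerically. The toy two-qubit example below is illustrative only (it is not the paper's chain model): it interpolates from a transverse-field driver Hamiltonian to a diagonal problem Hamiltonian and checks that a slow anneal leaves the system near the problem's ground state.

```python
import numpy as np

# Pauli operators on two qubits
I2 = np.eye(2)
sx = np.array([[0.0, 1.0], [1.0, 0.0]])
sz = np.diag([1.0, -1.0])
X1, X2 = np.kron(sx, I2), np.kron(I2, sx)
Z1, Z2 = np.kron(sz, I2), np.kron(I2, sz)

# Driver ("strong external field") and a toy problem Hamiltonian whose
# diagonal encodes the cost function to be minimized
H_driver = -(X1 + X2)
H_problem = Z1 + 0.5 * Z2 + Z1 @ Z2

psi = np.ones(4) / 2.0          # |++>, the driver's ground state
T, steps = 50.0, 2000           # total anneal time; larger T = more adiabatic
dt = T / steps

for j in range(steps):
    s = (j + 0.5) / steps                  # anneal parameter, 0 -> 1
    H = (1 - s) * H_driver + s * H_problem # field off, interactions on
    w, V = np.linalg.eigh(H)               # exact step U = exp(-i H dt)
    psi = V @ (np.exp(-1j * w * dt) * (V.conj().T @ psi))

gs = int(np.argmin(np.diag(H_problem)))    # H_problem is diagonal
print(f"ground-state probability after anneal: {abs(psi[gs])**2:.3f}")
```

    Rerunning with a much smaller T shows the adiabatic theorem failing: the final state leaks into excited configurations, which is precisely the error mechanism the article goes on to describe.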

    “Ideally an annealer runs slow enough to run with minimal errors, but because of decoherence, one has to run the annealer faster,” Yan explained. The team studied the effects that emerge when annealers are operated at this faster speed, which limits them to a finite operation time.

    “According to the adiabatic theorem in quantum mechanics, if all changes are very slow, so-called adiabatically slow, then the qubits must always remain in their lowest energy state,” Sinitsyn said. “Hence, when we finally measure them, we find the desired configuration of 0s and 1s that minimizes the function of interest, which would be impossible to get with a modern classical computer.”

    Hobbled by decoherence

    However, currently available quantum annealers, like all quantum computers so far, are hobbled by their qubits’ interactions with the surrounding environment, which causes decoherence. Those interactions restrict the purely quantum behavior of qubits to about one millionth of a second. In that timeframe, computations have to be fast—nonadiabatic—and unwanted energy excitations alter the quantum state, introducing inevitable computational mistakes.

    The Kibble-Zurek theory, co-developed by Wojciech Zurek, predicts that the most errors occur when the qubits encounter a phase transition, that is, a very sharp rearrangement of their collective quantum state.
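    The textbook minimal model of such excitation at a crossing is the Landau-Zener two-level sweep (used here as an illustrative stand-in, not the solvable chain model the team studied). Sweeping a two-level system through an avoided crossing at speed v with gap Δ leaves behind an excited population that approaches the Landau-Zener formula exp(-πΔ²/2v) (with ħ = 1): the faster the sweep, the more errors.

```python
import math

# Landau-Zener sweep: H(t) = (v*t/2)*sigma_z + (delta/2)*sigma_x, hbar = 1
delta, v = 1.0, 2.0
T, dt = 50.0, 1e-3
p0, p1 = 1.0 + 0j, 0.0 + 0j   # start in the ground state at t = -T

steps = int(2 * T / dt)
for j in range(steps):
    t = -T + (j + 0.5) * dt
    az, ax = v * t / 2.0, delta / 2.0
    a = math.hypot(az, ax)
    # Exact 2x2 step: exp(-i H dt) = cos(a dt) I - i sin(a dt) (H / a)
    c, s = math.cos(a * dt), math.sin(a * dt) / a
    q0 = (c - 1j * s * az) * p0 - 1j * s * ax * p1
    q1 = -1j * s * ax * p0 + (c + 1j * s * az) * p1
    p0, p1 = q0, q1

p_excited = abs(p0) ** 2   # population left on the branch that crossed
print(p_excited, math.exp(-math.pi * delta**2 / (2 * v)))
```

    The simulated excitation probability lands close to the analytic formula; the Kibble-Zurek theory generalizes this kind of bookkeeping to many interacting qubits crossing a phase transition together.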

    For this paper, the team studied a known solvable model where identical qubits interact only with their neighbors along a chain; the model verifies the Kibble-Zurek theory analytically. In the theorists’ quest to cure limited operation time in quantum annealing computers, they increased the complexity of that model by assuming that the qubits could be partitioned into two groups with identical interactions within each group but slightly different interactions for qubits from the different groups.

    In such a mixture, they discovered an unusual effect: One group still produced a large amount of energy excitations during the passage through a phase transition, but the other group remained in the energy minimum as if the system did not experience a phase transition at all.

    “The model we used is highly symmetric in order to be solvable, and we found a way to extend the model, breaking this symmetry and still solving it,” Sinitsyn explained. “Then we found that the Kibble-Zurek theory survived but with a twist—half of the qubits did not dissipate energy and behaved ‘nicely.’ In other words, they maintained their ground states.”

    Unfortunately, the other half of the qubits did produce many computational errors—thus, no cure so far for a passage through a phase transition in quantum annealing computers.

    A new way to separate isotopes

    Another long-standing problem that can benefit from this effect is isotope separation. For instance, natural uranium must often be separated into enriched and depleted isotopes so that the enriched uranium can be used for nuclear power or national security purposes. The current separation process is costly and energy intensive. The discovered effect means that by making a mixture of interacting ultra-cold atoms pass dynamically through a quantum phase transition, different isotopes can be selectively excited or left unexcited and then separated using available magnetic deflection techniques.

    The funding: This work was carried out under the support of the U.S. Department of Energy, Office of Science, Basic Energy Sciences, Materials Sciences and Engineering Division, Condensed Matter Theory Program. Bin Yan also acknowledges support from the Center for Nonlinear Studies at LANL.

    See the full article here .



    DOE’s Los Alamos National Laboratory(US) mission is to solve national security challenges through scientific excellence.

    LANL campus

    DOE’s Los Alamos National Laboratory(US), a multidisciplinary research institution engaged in strategic science on behalf of national security, is operated by Los Alamos National Security, LLC, a team composed of Bechtel National, the University of California, The Babcock & Wilcox Company, and URS for the Department of Energy’s National Nuclear Security Administration.
    Los Alamos enhances national security by ensuring the safety and reliability of the U.S. nuclear stockpile, developing technologies to reduce threats from weapons of mass destruction, and solving problems related to energy, environment, infrastructure, health, and global security concerns.

    Operated by Los Alamos National Security, LLC for the U.S. Dept. of Energy’s NNSA

  • richardmitnick 2:44 pm on February 10, 2021 Permalink | Reply
    Tags: Confirming a 50-year-old theory and could boost the development of silicon-based quantum computers., "First-ever observation of multi-photon Fano effect could lead to boost in quantum computing", Photoelectric effect, Quantum theory,   

    From University of Surrey (UK): “First-ever observation of multi-photon Fano effect could lead to boost in quantum computing” 

    From University of Surrey (UK)

    10 February 2021

    External Communications and PR team
    Phone: +44 (0)1483 684380 / 688914 / 684378
    Out of hours: +44 (0)7773 479911

    Dr Konstantin (Constantine) Litvinenko
    Research Fellow and Teaching Fellow in Physics
    +44 (0)1483 689867

    A breakthrough study has confirmed a 50-year-old theory and could boost the development of silicon-based quantum computers.


    In the first study of its kind, published by Nature Communications, an international team of researchers led by the University of Surrey has proven the existence of the fabled multi-photon Fano effect in an experiment.

    Ionisation occurs when electrons absorb photons to gain enough energy to escape the nucleus’s electrical force. Einstein explained in his Nobel Prize-winning theory of the photoelectric effect that there is a threshold for the photon energy required to cause an escape. If a single photon’s energy is not enough, there might be a convenient half-way step: ionisation can occur with two photons starting from the lowest energy state.

    However, according to the counter-intuitive world of quantum theory, the existence of this half-way step is not necessary for an electron to break free. All the electron needs to do is gain enough energy from multiple photons, which can be achieved through “ghostly” so-called virtual states. This multi-photon absorption only happens in extremely intense conditions where there are enough photons available.
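    As a back-of-the-envelope check of the threshold argument (the numbers below are illustrative, not the experiment's actual parameters), one can compare single- and two-photon energies against the roughly 45.7 meV binding energy of a phosphorus donor in silicon:

```python
# Check whether one photon clears the ionisation threshold E = h*c/lambda,
# and if not, whether two photons do (illustrative parameters only).
H_PLANCK_EV = 4.135667696e-15  # Planck constant, eV*s
C = 2.99792458e8               # speed of light, m/s

def photon_energy_ev(wavelength_m):
    """Energy of one photon, E = h*c/lambda, in eV."""
    return H_PLANCK_EV * C / wavelength_m

binding_energy_ev = 0.0457  # phosphorus donor in silicon, ~45.7 meV
wavelength = 50e-6          # a 50-micron far-infrared photon

e1 = photon_energy_ev(wavelength)
print(f"one photon:  {e1:.4f} eV -> ionises: {e1 >= binding_energy_ev}")
print(f"two photons: {2*e1:.4f} eV -> ionises: {2*e1 >= binding_energy_ev}")
```

    For this wavelength a single photon falls short of the threshold while two photons together clear it, which is the regime where the two pathways can compete and interfere.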

    When there is a half-way step and enough photons around, both options are available for ionisation. However, the wave-like nature of atoms presents another obstacle: interference. Altering photon energy can cause the two different waves to crash into one another, leading either to enhancement or to complete annihilation of their effect on the absorption event.
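    This interference is captured by the standard Fano profile, σ(ε) = (q + ε)² / (1 + ε²), where ε is the reduced detuning from resonance and the parameter q sets the asymmetry. A quick sketch (generic formula, not the paper's fitted parameters):

```python
import numpy as np

# Fano line shape: sigma(eps) = (q + eps)^2 / (1 + eps^2)
def fano(eps, q):
    return (q + eps) ** 2 / (1 + eps ** 2)

eps = np.linspace(-10, 10, 2001)
for q in (0.0, 1.0, 5.0):
    profile = fano(eps, q)
    # Complete destructive interference always sits at eps = -q
    dip = eps[np.argmin(profile)]
    print(f"q = {q}: zero at eps = {dip:.2f}, peak value = {profile.max():.2f}")
```

    Varying q slides the interference zero relative to the peak, producing exactly the "different skewness" of the line shape that the experiment maps out by sweeping the photon energy.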

    This Fano effect was theoretically predicted nearly 50 years ago but has remained elusive for decades because of the high intensity needed; manufacturing a stable laser that produces an electric field large enough to apply this effect to isolated atoms was not – and still is not – technically possible.

    The team led by the University of Surrey overcame this complication by using impurity atoms where, due to the influence of the semiconductor host material, the electric field that determines the outer electron orbits is significantly reduced and, consequently, much less laser intensity is required to demonstrate the Fano effect. The team used ordinary computer chips that contain phosphorous atoms embedded in a silicon crystal.

    The team then used powerful laser beams at the free-electron laser facility (FELIX) at Radboud University (NL) to ionise phosphorus atoms.

    Free-electron laser facility (FELIX) at Radboud University (NL)

    The outcome of the ionisation was detected via the absorption of a weak beam of light. By sweeping the photon energy of the laser radiation, the authors observed the varying skewness of the Fano line shape.

    Dr Konstantin Litvinenko, co-author and Research Fellow at the University of Surrey, said: “We believe we have taken a very important step towards the implementation of novel and promising applications of ultrafast readout of silicon-based quantum computers; selective isotope-specific ionization; and a variety of new atomic and molecular physics spectroscopies.”

    See the full article here .



    About the University of Surrey (UK)

    The University of Surrey is a global community of ideas and people, dedicated to life-changing education and research. With a beautiful and vibrant campus, we provide exceptional teaching and practical learning to inspire and empower our students for personal and professional success.

    Through our world-class research and innovation, we deliver transformational impact on society and shape future digital economy through agile collaboration and partnership with businesses, governments and communities.
