Tagged: Quantum Computing

  • richardmitnick 6:03 pm on January 21, 2022 Permalink | Reply
    Tags: "BQL" and "BQuL", "Computer Scientists Eliminate Pesky Quantum Computations", 28 years ago computer scientists established that for quantum algorithms you can wait until the end of a computation to make intermediate measurements without changing the final result., If at any point in a calculation you need to access the information contained in a qubit and you measure it the qubit collapses., Instead of encoding information in the 0s and 1s of typical bits quantum computers encode information in higher-dimensional combinations of bits called qubits., Proof that any quantum algorithm can be rearranged to move measurements performed in the middle of the calculation to the end of the process., Quantum Computing, The basic difference between quantum computers and the computers we have at home is the way each stores information., This collapse possibly affects all the other qubits in the system., Virtually all algorithms require knowing the value of a computation as it’s in progress.

    From Quanta Magazine (US): “Computer Scientists Eliminate Pesky Quantum Computations” 


    January 19, 2022
    Nick Thieme

    Credit: Samuel Velasco/Quanta Magazine.

    As quantum computers have become more functional, our understanding of them has remained muddled. Work by a pair of computer scientists [Symposium on Theory of Computing] has clarified part of the picture, providing insight into what can be computed with these futuristic machines.

    “It’s a really nice result that has implications for quantum computation,” said John Watrous of The University of Waterloo (CA).

    The research, posted in June 2020 by Bill Fefferman and Zachary Remscrim of The University of Chicago (US), proves that any quantum algorithm can be rearranged to move measurements performed in the middle of the calculation to the end of the process, without changing the final result or drastically increasing the amount of memory required to carry out the task. Previously, computer scientists thought that the timing of those measurements affected memory requirements, creating a bifurcated view of the complexity of quantum algorithms.

    “This has been quite annoying,” said Fefferman. “We’ve had to talk about two complexity classes — one with intermediate measurements and one without.”

    This issue applies exclusively to quantum computers because of the unique way they work. The basic difference between quantum computers and the computers we have at home is the way each stores information. Instead of encoding information in the 0s and 1s of typical bits, quantum computers encode information in higher-dimensional combinations of bits called qubits.

    This approach enables denser information storage and sometimes faster calculations. But it also presents a problem. If at any point in a calculation you need to access the information contained in a qubit and you measure it, the qubit collapses from a delicate combination of simultaneously possible bits into a single definite one, possibly affecting all the other qubits in the system.
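    A toy numpy sketch (not tied to any quantum library) makes the collapse concrete: before measurement the qubit holds two amplitudes at once; sampling an outcome by the Born rule replaces them with a single definite bit.

    ```python
    import numpy as np

    rng = np.random.default_rng(seed=7)

    # A single qubit in an equal superposition: amplitudes for |0> and |1>.
    psi = np.array([1, 1]) / np.sqrt(2)

    def measure(state, rng):
        """Collapse a qubit: sample an outcome with Born-rule probabilities,
        then replace the state with the matching basis vector."""
        probs = np.abs(state) ** 2
        outcome = rng.choice(len(state), p=probs)
        collapsed = np.zeros_like(state)
        collapsed[outcome] = 1.0
        return outcome, collapsed

    outcome, psi_after = measure(psi, rng)
    # After measurement the superposition is gone: the state is definitely 0 or 1.
    print(outcome, psi_after)
    ```

    Running `measure` again on `psi_after` always returns the same outcome: the delicate combination is destroyed by the first look.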

    This can be a problem because virtually all algorithms require knowing the value of a computation as it’s in progress. For instance, an algorithm may contain a statement like “If the variable x is a number, multiply it by 10; if not, leave it alone.” Performing these steps would seem to require knowing what x is at that moment in the computation — a potential challenge for quantum computers, where measuring the state of a particle (to determine what x is) inherently changes it.

    But 28 years ago, computer scientists proved it’s possible to avoid this kind of no-win situation. They established that for quantum algorithms, you can wait until the end of a computation to make intermediate measurements without changing the final result.

    An essential part of that result showed that you can push intermediate measurements to the end of a computation without drastically increasing the total running time. These features of quantum algorithms — that measurements can be delayed without affecting the answer or the runtime — came to be called the principle of deferred measurement.

    This principle fortifies quantum algorithms, but at a cost. Deferring measurements uses a great deal of extra memory space, essentially one extra qubit per deferred measurement. While one bit per measurement might take only a tiny toll on a classical computer with 4 trillion bits, it’s prohibitive given the limited number of qubits currently in the largest quantum computers.
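    The principle, and its qubit cost, can be checked in a toy simulation. The sketch below (plain numpy, an illustration rather than the paper's construction) runs the same small circuit twice: once with a mid-circuit measurement that classically controls a later gate, and once with the measurement deferred by copying the qubit onto a fresh ancilla with a CNOT (the one extra qubit per deferred measurement mentioned above). The final outcome distributions agree.

    ```python
    import numpy as np

    # Single-qubit gates.
    X = np.array([[0.0, 1.0], [1.0, 0.0]])

    def ry(theta):
        c, s = np.cos(theta / 2), np.sin(theta / 2)
        return np.array([[c, -s], [s, c]])

    ket0 = np.array([1.0, 0.0])
    ket1 = np.array([0.0, 1.0])

    # --- Version 1: measure in the middle ------------------------------------
    # Prepare a biased qubit, measure it mid-circuit, apply X when the outcome
    # is 1, then apply one more rotation. Track the exact outcome distribution
    # over both measurement branches.
    psi = ry(1.1) @ ket0
    p0, p1 = np.abs(psi) ** 2
    branch0 = ry(0.7) @ ket0            # outcome 0: collapsed to |0>, no correction
    branch1 = ry(0.7) @ (X @ ket1)      # outcome 1: collapsed to |1>, X applied
    final_intermediate = p0 * branch0 ** 2 + p1 * branch1 ** 2

    # --- Version 2: defer the measurement to the end --------------------------
    # Copy the data qubit onto a fresh ancilla with a CNOT (the extra qubit the
    # deferral costs), replace the classically controlled X with an
    # ancilla-controlled X, and measure only at the end.
    CNOT_data_to_anc = np.array([[1, 0, 0, 0],   # data (first factor) controls ancilla
                                 [0, 1, 0, 0],
                                 [0, 0, 0, 1],
                                 [0, 0, 1, 0]], dtype=float)
    CX_anc_to_data = np.array([[1, 0, 0, 0],     # ancilla (second factor) controls data
                               [0, 0, 0, 1],
                               [0, 0, 1, 0],
                               [0, 1, 0, 0]], dtype=float)

    state = np.kron(ry(1.1) @ ket0, ket0)        # data qubit tensor fresh ancilla
    state = CX_anc_to_data @ (CNOT_data_to_anc @ state)
    state = np.kron(ry(0.7), np.eye(2)) @ state
    probs = state ** 2
    # Marginal distribution of the data qubit (summing over ancilla outcomes).
    final_deferred = np.array([probs[0] + probs[1], probs[2] + probs[3]])

    print(final_intermediate, final_deferred)    # the two distributions agree
    ```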

    Google 53-qubit “Sycamore” superconducting processor quantum computer.

    IBM Unveils Breakthrough 127-Qubit Quantum Processor. Credit: IBM Corp.

    Fefferman and Remscrim’s work resolves this issue in a surprising way. With an abstract proof, they show that subject to a few caveats, anything calculable with intermediate measurements can be calculated without them. Their proof offers a memory-efficient way to defer intermediate measurements — circumventing the memory problems that such measurements created.


    “In the most standard scenario, you don’t need intermediate measurements,” Fefferman said.

    Fefferman and Remscrim achieved their result by showing that a representative problem called “well-conditioned matrix powering” is, in a way, equivalent to a different kind of problem with important properties.

    The “well-conditioned matrix powering” problem effectively asks you to find the values for particular entries in a type of matrix (an array of numbers), given some conditions. Fefferman and Remscrim proved that matrix powering is just as hard as any other quantum computing problem that allows for intermediate measurements. This set of problems is called “BQL” (for bounded-error quantum logarithmic space), and the team’s work meant that matrix powering could serve as a representative for all other problems in that class — so anything they proved about matrix powering would be true for all other problems involving intermediate measurements.
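    As a classical illustration of what the problem asks (the actual BQL-complete version fixes precise conditions on the matrix, hence "well-conditioned"), here is matrix powering in numpy on a hypothetical 2-by-2 instance:

    ```python
    import numpy as np

    # A small stochastic matrix standing in for an instance; the real
    # BQL-complete problem constrains the matrix's conditioning.
    A = np.array([[0.9, 0.1],
                  [0.2, 0.8]])

    k = 5
    A_k = np.linalg.matrix_power(A, k)

    # "Matrix powering" asks for (an approximation of) one particular entry
    # of the k-th power.
    entry = A_k[0, 1]
    print(entry)
    ```

    Classically this is easy for small matrices; the interesting question is doing it with only logarithmic memory, which is where the quantum class BQL enters.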

    At this point, the researchers took advantage of some of their earlier work. In 2016, Fefferman and Cedric Lin proved that a related problem called “well-conditioned matrix inversion” was equivalent to the hardest problem in a very similar class of problems called “BQuL”. This class is like BQL’s little sibling. It’s identical to BQL, except that it comes with the requirement that every problem in the class must also be reversible.

    In quantum computing, the distinction between reversible and irreversible measurements is essential. If a calculation measures a qubit, it collapses the state of the qubit, making the initial information impossible to recover. As a result, all measurements in quantum algorithms are innately irreversible.

    That means that BQuL is not just the reversible version of BQL; it’s also BQL without any intermediate measurements (because intermediate measurements, like all quantum measurements, would be irreversible, violating the defining condition of the class). The 2016 work proved that matrix inversion is a prototypical quantum calculation without intermediate measurements — that is, a fully representative problem for BQuL.

    The new paper builds on that by connecting the two, proving that well-conditioned matrix powering, which represents all problems with intermediate measurements, can be reduced to well-conditioned matrix inversion, which represents all problems that cannot feature intermediate measurements. In other words, any quantum computing problem with intermediate measurements can be reduced to a quantum computing problem without intermediate measurements.
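    A classical analogue of this kind of reduction is easy to demonstrate: every power of a matrix can be read off from the inverse of a single larger matrix, because the inverse of the identity minus a nilpotent block-shift sums the whole geometric series of powers. The numpy sketch below illustrates the idea only; the actual space-bounded quantum reduction in the paper is far more delicate.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n, k = 2, 3                            # matrix size, highest power wanted
    A = rng.uniform(-0.4, 0.4, (n, n))     # keep the norm small: "well-conditioned"

    # Build the block-bidiagonal matrix M = I - (shift ⊗ A).  Because
    # N = shift ⊗ A is nilpotent, (I - N)^(-1) = I + N + N^2 + ... exactly,
    # so inverting M recovers every power of A at once.
    shift = np.eye(k + 1, k=-1)            # subdiagonal shift matrix
    M = np.eye(n * (k + 1)) - np.kron(shift, A)
    Minv = np.linalg.inv(M)

    # The bottom-left n-by-n block of M^(-1) is exactly A^k.
    bottom_left = Minv[-n:, :n]
    print(np.allclose(bottom_left, np.linalg.matrix_power(A, k)))  # True
    ```

    In words: to compute one entry of a matrix power, it suffices to compute one entry of the inverse of a (larger but still well-conditioned) matrix — powering reduces to inversion.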

    This means that for quantum computers with limited memory, researchers no longer need to worry about intermediate measurements when classifying the memory needs of different types of quantum algorithms.

    In 2020, a group of researchers at Princeton University (US) — Ran Raz, Uma Girish and Wei Zhan — independently proved a slightly weaker but nearly identical result that they posted three days after Fefferman and Remscrim’s work. Raz and Girish later extended the result, proving that intermediate measurements can be deferred in both a time-efficient and space-efficient way for a more limited class of computers.

    Altogether, the recent work provides a much better understanding of how limited-memory quantum computation works. With this theoretical guarantee, researchers have a road map for translating their theory into applied algorithms. Quantum algorithms are now free, in a sense, to proceed without the prohibitive costs of deferred measurements.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Formerly known as Simons Science News, Quanta Magazine (US) is an editorially independent online publication launched by the Simons Foundation to enhance public understanding of science. Why Quanta? Albert Einstein called photons “quanta of light.” Our goal is to “illuminate science.” At Quanta Magazine, scientific accuracy is every bit as important as telling a good story. All of our articles are meticulously researched, reported, edited, copy-edited and fact-checked.

  • richardmitnick 4:07 pm on January 20, 2022 Permalink | Reply
    Tags: "Going beyond the exascale", , , Classical computers have been central to physics research for decades., , , , Fermilab has used classical computing to simulate lattice quantum chromodynamics., , , , Planning for a future that is still decades out., Quantum computers could enable physicists to tackle questions even the most powerful computers cannot handle., Quantum Computing, Quantum computing is here—sort of., , Solving equations on a quantum computer requires completely new ways of thinking about programming and algorithms., , , The biggest place where quantum simulators will have an impact is in discovery science.   

    From Symmetry: “Going beyond the exascale” 


    Emily Ayshford

    Illustration by Sandbox Studio, Chicago with Ana Kova.

    Quantum computers could enable physicists to tackle questions even the most powerful computers cannot handle.

    After years of speculation, quantum computing is here—sort of.

    Physicists are beginning to consider how quantum computing could provide answers to the deepest questions in the field. But most aren’t getting caught up in the hype. Instead, they are taking what for them is a familiar tack—planning for a future that is still decades out, while making room for pivots, turns and potential breakthroughs along the way.

    “When we’re working on building a new particle collider, that sort of project can take 40 years,” says Hank Lamm, an associate scientist at The DOE’s Fermi National Accelerator Laboratory (US). “This is on the same timeline. I hope to start seeing quantum computing provide big answers for particle physics before I die. But that doesn’t mean there isn’t interesting physics to do along the way.”

    Equations that overpower even supercomputers.

    Classical computers have been central to physics research for decades, and simulations that run on classical computers have guided many breakthroughs. Fermilab, for example, has used classical computing to simulate lattice quantum chromodynamics. Lattice QCD is a set of equations that describe the interactions of quarks and gluons via the strong force.

    Theorists developed lattice QCD in the 1970s. But applying its equations proved extremely difficult. “Even back in the 1980s, many people said that even if they had an exascale computer [a computer that can perform a billion billion calculations per second], they still couldn’t calculate lattice QCD,” Lamm says.

    Depiction of ANL ALCF Cray Intel SC18 Shasta Aurora exascale supercomputer, to be built at DOE’s Argonne National Laboratory (US).

    Depiction of ORNL Cray Frontier Shasta based Exascale supercomputer with Slingshot interconnect featuring high-performance AMD EPYC CPU and AMD Radeon Instinct GPU technology, being built at DOE’s Oak Ridge National Laboratory (US).

    But that turned out not to be true.

    Within the past 10 to 15 years, researchers have discovered the algorithms needed to make their calculations more manageable, while learning to understand theoretical errors and how to ameliorate them. These advances have allowed them to use lattice simulations, which substitute a finite grid of points in space and time for the continuous vastness of reality.
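    The idea behind a lattice simulation can be illustrated with something far simpler than QCD: sample a field at discrete grid points and approximate its derivative with finite differences. (A generic illustration, not lattice QCD; the grid and field below are made up.)

    ```python
    import numpy as np

    # A lattice trades the continuum for a finite grid: here, a field sampled
    # at points with spacing a, and a finite-difference stand-in for its
    # derivative. Lattice QCD does something analogous in four dimensions for
    # quark and gluon fields, with errors that shrink as the spacing a shrinks.
    a = 0.1
    x = np.arange(0.0, 2 * np.pi, a)       # the lattice sites
    phi = np.sin(x)                        # a field sampled on the lattice

    # Central-difference "lattice derivative".
    dphi = (np.roll(phi, -1) - np.roll(phi, 1)) / (2 * a)
    exact = np.cos(x)

    # Away from the wrap-around edges, the lattice answer tracks the continuum.
    err = np.max(np.abs(dphi[2:-2] - exact[2:-2]))
    print(err)   # O(a^2) discretization error
    ```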

    Lattice simulations have allowed physicists to calculate the mass of the proton—a particle made up of quarks and gluons all interacting via the strong force—and find that the theoretical prediction lines up well with the experimental result. The simulations have also allowed them to accurately predict the temperature at which quarks should detach from one another in a quark-gluon plasma.

    Quark-Gluon Plasma from BNL Relative Heavy Ion Collider (US).

    DOE’s Brookhaven National Laboratory(US) RHIC Campus

    The limit of these calculations? Along with being approximate, based as they are on a confined, hypothetical region of space, they can compute only certain properties efficiently. Try to look at more than that, and even the biggest high-performance computer cannot handle all of the possibilities.

    Enter quantum computers.

    Quantum computers are all about possibilities. Classical computers don’t have the memory to compute the many possible outcomes of lattice QCD problems, but quantum computers take advantage of quantum mechanics to calculate differently.

    Quantum computing isn’t an easy answer, though. Solving equations on a quantum computer requires completely new ways of thinking about programming and algorithms.

    When you program a classical computer, you can look at its state at all times. You can check a classical computer’s work before it’s done and troubleshoot if things go wrong. But under the laws of quantum mechanics, you cannot observe any intermediate step of a quantum computation without corrupting the computation; you can observe only the final state.

    That means you can’t store any information in an intermediate state and bring it back later, and you cannot clone information from one set of qubits into another, making error correction difficult.

    “It can be a nightmare designing an algorithm for quantum computation,” says Lamm, who spends his days trying to figure out how to do quantum simulations for high-energy physics. “Everything has to be redesigned from the ground up. We are right at the beginning of understanding how to do this.”

    Just getting started

    Quantum computers have already proved useful in basic research. Condensed matter physicists—whose research relates to phases of matter—have spent much more time than particle physicists thinking about how quantum computers and simulators can help them. They have used quantum simulators to explore quantum spin liquid states [Science] and to observe a previously unobserved phase of matter called a prethermal time crystal [Science].

    “The biggest place where quantum simulators will have an impact is in discovery science, in discovering new phenomena like this that exist in nature,” says Norman Yao, an assistant professor at The University of California-Berkeley (US) and co-author on the time crystal paper.

    Quantum computers are showing promise in particle physics and astrophysics. Many physics and astrophysics researchers are using quantum computers to simulate “toy problems”—small, simple versions of much more complicated problems. They have, for example, used quantum computing to test parts of theories of quantum gravity [npj Quantum Information] or create proof-of-principle models, like models of the parton showers that emit from particle colliders [Physical Review Letters] such as the Large Hadron Collider.

    The European Organization for Nuclear Research [Organización Europea para la Investigación Nuclear][Organisation européenne pour la recherche nucléaire] [Europäische Organisation für Kernforschung](CH) [CERN].

    The European Organization for Nuclear Research [Organización Europea para la Investigación Nuclear][Organisation européenne pour la recherche nucléaire] [Europäische Organisation für Kernforschung](CH)[CERN] map.

    CERN LHC tube in the tunnel. Credit: Maximilien Brice and Julien Marius Ordan.

    SixTRack CERN LHC particles.

    “Physicists are taking on the small problems, ones that they can solve with other ways, to try to understand how quantum computing can have an advantage,” says Roni Harnik, a scientist at Fermilab. “Learning from this, they can build a ladder of simulations, through trial and error, to more difficult problems.”

    But just which approaches will succeed, and which will lead to dead ends, remains to be seen. Estimates of how many qubits will be needed to simulate big enough problems in physics to get breakthroughs range from thousands to (more likely) millions. Many in the field expect this to be possible in the 2030s or 2040s.

    “In high-energy physics, problems like these are clearly a regime in which quantum computers will have an advantage,” says Ning Bao, associate computational scientist at DOE’s Brookhaven National Laboratory (US). “The problem is that quantum computers are still too limited in what they can do.”

    Starting with physics

    Some physicists are coming at things from a different perspective: They’re looking to physics to better understand quantum computing.

    John Preskill is a physics professor at The California Institute of Technology (US) and an early leader in the field of quantum computing. A few years ago, he and Patrick Hayden, professor of physics at Stanford University (US), showed that if you entangled two photons and threw one into a black hole, decoding the information that eventually came back out via Hawking radiation would be significantly easier than if you had used non-entangled particles. Physicists Beni Yoshida and Alexei Kitaev then came up with an explicit protocol for such decoding, and Yao went a step further, showing that the protocol could also be a powerful tool in characterizing quantum computers.

    “We took something that was thought about in terms of high-energy physics and quantum information science, then thought of it as a tool that could be used in quantum computing,” Yao says.

    That sort of cross-disciplinary thinking will be key to moving the field forward, physicists say.

    “Everyone is coming into this field with different expertise,” Bao says. “From computing, or physics, or quantum information theory—everyone gets together to bring different perspectives and figure out problems. There are probably many ways of using quantum computing to study physics that we can’t predict right now, and it will just be a matter of getting the right two people in a room together.”

    See the full article here.



    Symmetry is a joint Fermilab/SLAC publication.

  • richardmitnick 11:48 am on January 20, 2022 Permalink | Reply
    Tags: Quantum Computing, "Global leaders unveil responsible quantum computing guidelines", Quantum computing is set to step out of the shadows., The "next big thing" in tech., A hot topic among investors and research and development (R&D) communities., Confused? Don’t worry, quantum computing is inherently confusing., Quantum computing makes it possible to process vast amounts of information very quickly., Such things are not feasible with classical computers., A (quantum) leap into the unknown, Unlocking innovation for decades to come.

    From CSIROscope (AU): “Global leaders unveil responsible quantum computing guidelines” 



    CSIRO (AU)-Commonwealth Scientific and Industrial Research Organisation

    20 Jan, 2022
    Sophie Schmidt

    We’ve joined forces with The World Economic Forum to contribute to best-practice governance principles for quantum technologies.

    Quantum computing is promising to transform the way we think about and understand the world around us. Credit: Shutterstock.

    As 2022 arrives full of uncertainty, one thing remains guaranteed: quantum computing is set to step out of the shadows.

    Many are announcing quantum technology as the “next big thing” in tech. And it has become a hot topic among investors and research and development (R&D) communities. This is largely because quantum computing has the potential to solve problems that wouldn’t be possible using conventional ‘classical’ computers.

    Harder, better, faster, stronger – quantum computing is promising to transform the way we think about and understand the world around us.

    Right now, the technology is still at an early stage (as in, no one has built the first practical quantum computer). Even so, excitement is at an all-time high. Health care (pharmaceuticals), climate modelling, machine learning and cybersecurity are just a few examples of where quantum might deliver significant value.

    Confused? Don’t worry, quantum computing is inherently confusing.

    Hype aside, many of us are still struggling to understand how quantum computing works – and what makes it so ‘new’. And rightly so, according to Professor Jim Rabeau, Director of our Quantum Technologies Future Science Platform (FSP).

    “Understanding the power of quantum computing requires us to think differently about how information is processed,” Jim explains.

    Quantum computers use ‘qubits’, which can be electrons, photons or other small particles. Only the very non-intuitive science of quantum mechanics can explain their behaviour.

    But what’s more useful to focus on is how using quantum mechanics enables us to conduct multiple operations all at once.

    Jim compares classical versus quantum computing using the analogy of old-school style phone directories.

    “Rather than searching line by line, page by page, imagine being able to instantly find the name and number you are looking for by looking at all pages and all lines at once,” he says.

    Quantum computing makes it possible to process vast amounts of information very quickly. This is simply because it is looking at all ‘possibilities’ (or in this case, names and numbers) simultaneously.

    “In the case of pharmaceuticals, it means we would be able to very quickly look at all possible structural combinations of atoms and molecules to form the perfect drug to address a particular disease,” Jim says.

    “Such things are not feasible with classical computers.”
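    The directory analogy maps loosely onto Grover's search, the textbook quantum algorithm for unstructured lookup. Rather than literally reading every page at once, it amplifies the amplitude of the marked entry over roughly sqrt(N) iterations, versus the N/2 lookups a classical scan needs on average. A small numpy simulation of that amplitude amplification (a sketch, with a made-up directory of 64 entries):

    ```python
    import numpy as np

    # Unstructured search over N "directory entries" for one marked index.
    N = 64
    marked = 42

    state = np.full(N, 1 / np.sqrt(N))      # uniform superposition over entries

    oracle = np.eye(N)
    oracle[marked, marked] = -1             # flip the sign of the marked entry

    # Grover diffusion operator: inversion about the mean amplitude.
    diffusion = 2 * np.full((N, N), 1 / N) - np.eye(N)

    # About (pi/4) * sqrt(N) iterations maximize the success probability.
    for _ in range(int(np.round(np.pi / 4 * np.sqrt(N)))):
        state = diffusion @ (oracle @ state)

    probs = state ** 2
    print(int(np.argmax(probs)), float(probs[marked]))  # the marked entry dominates
    ```

    With N = 64 this takes 6 iterations instead of dozens of classical lookups, and the marked entry ends up holding nearly all of the probability.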

    The bottom line is that the quest to harness the potential of quantum computing is on. We are rapidly making progress to close in on gaps between research and real-world applications.

    A (quantum) leap into the unknown

    Quantum technology has been on our minds a lot lately. For starters, our researchers are exploring how we could use quantum computers to outperform today’s computers. For example, quantum computers could crack the cryptography protocols that keep our data private, making current security measures virtually useless.

    It’s not just post-quantum cryptography we’re exploring. The ethical challenges associated with quantum computing also need to be assessed: for example, providing fair and secure data storage and communication systems.

    As with any new technology, the assumption is that humans will automatically know to do the right thing – or even that we will agree on what that might be. This is where developing and applying ethical standards and responsible governance guidelines can help.

    It’s not entirely about stopping CEOs in tech companies from using the technology for nefarious purposes. That is, as Responsible Innovation FSP Director Dr Justine Lacey says, “a little too simplistic.”

    “It suggests that all bad outcomes are merely the result of bad actors,” Justine says.

    There will always be a risk of an individual using technology in an unethical way.

    “But what’s easier to lose sight of is whether or not a technology is used to generate broad societal benefit. And this also means ensuring it is not used to inadvertently create harmful outcomes, by for example, overlooking certain socioeconomic groups, or undermining cybersecurity measures,” she says.

    To help ensure quantum technology benefits everyone, we recently joined forces with the World Economic Forum and contributed to their latest Insight Report released at the World Economic Forum Annual Meeting 2022.

    This report outlines a set of governance principles. They are the result of an extensive international multi-sector collaboration with stakeholders from across the globe. Their aim is to help guide responsible design and adoption of quantum technology by applying best-practice governance principles.

    New guidelines will help guide responsible quantum

    It’s a familiar discussion that has ramped up over the last 10 years around ethics and artificial intelligence (AI). Except this time around, according to Justine, we are getting on the front foot.

    “The development of global governance principles for quantum technology presents a rare opportunity to embed responsible innovation practices from a very early stage and well before we have seen wide application, uptake and commercialisation of the technology,” Justine explains.

    “It also comes at a time when those in the quantum technology community are starting to consider how the application of this technology may broadly impact our lives and society, and how we can steer its application toward producing more desirable societal outcomes.

    “If we look to similar discussions on responsible AI, it is clear a major stumbling block was not the development of high-level ethical principles to guide the development of responsible AI systems. In fact, hundreds of such frameworks and guidelines exist.

    “The real and persistent challenge has been in how to effectively operationalise those principles to transform the practice and deployment of those AI systems,” she says.

    Recognising this, the Forum has designed quantum technology governance guidelines specifically for adoption by quantum technology stakeholders. They differ from other ethical guidelines for new technologies by providing directed guidance and practical ‘off-the-shelf’ applicability.

    The World Economic Forum’s latest Insight report outlines a set of governance principles for quantum computing.

    Unlocking innovation for decades to come

    The guidelines drew on a diverse array of thinking around quantum technology from all over the world.

    Justine and Jim are excited to see the guidelines embedded not only in the research and development stage of quantum technology, but through early-stage translation, commercialisation and application.

    “It’s an ideal time to be embracing this,” Jim says.

    “I am really glad to have people like Justine to work alongside as we ramp up the effort to translate quantum technology research into viable industry applications, with active consideration and implementation of Responsible Innovation from the get-go.”

    See the full article here.



    CSIRO campus

    CSIRO (AU)-Commonwealth Scientific and Industrial Research Organisation is Australia’s national science agency and one of the largest and most diverse research agencies in the world.

    CSIRO works with leading organisations around the world. From its headquarters in Canberra, CSIRO maintains more than 50 sites across Australia and in France, Chile and the United States, employing about 5,500 people.

    Federally funded scientific research began in Australia 104 years ago. The Advisory Council of Science and Industry was established in 1916 but was hampered by insufficient available finance. In 1926 the research effort was reinvigorated by establishment of the Council for Scientific and Industrial Research (CSIR), which strengthened national science leadership and increased research funding. CSIR grew rapidly and achieved significant early successes. In 1949 further legislated changes included renaming the organisation as CSIRO.

    Notable developments by CSIRO have included the invention of atomic absorption spectroscopy; essential components of Wi-Fi technology; development of the first commercially successful polymer banknote; the invention of the insect repellent in Aerogard and the introduction of a series of biological controls into Australia, such as the introduction of myxomatosis and rabbit calicivirus for the control of rabbit populations.

    Research and focus areas

    Research Business Units

    As at 2019, CSIRO’s research areas are identified as “Impact science” and organised into the following Business Units:

    Agriculture and Food
    Health and Biosecurity
    Data61
    Land and Water
    Mineral Resources
    Oceans and Atmosphere

    National Facilities

    CSIRO manages national research facilities and scientific infrastructure on behalf of the nation to assist with the delivery of research. The national facilities and specialized laboratories are available to both international and Australian users from industry and research. As at 2019, the following National Facilities are listed:

    Australian Animal Health Laboratory (AAHL)
    Australia Telescope National Facility – radio telescopes included in the Facility are the Australia Telescope Compact Array, the Parkes Observatory, the Mopra Observatory and the Australian Square Kilometre Array Pathfinder.

    STCA CSIRO Australia Compact Array (AU), six radio telescopes at the Paul Wild Observatory, is an array of six 22-m antennas located about twenty-five kilometres (16 mi) west of the town of Narrabri in Australia.

    CSIRO-Commonwealth Scientific and Industrial Research Organisation (AU) Parkes Observatory [Murriyang, the traditional Indigenous name], located 20 kilometres north of the town of Parkes, New South Wales, Australia, 414.80m above sea level.

    CSIRO-Commonwealth Scientific and Industrial Research Organisation (AU) Mopra radio telescope

    Australian Square Kilometre Array Pathfinder

    NASA Canberra Deep Space Communication Complex, AU, Deep Space Network. Credit: The National Aeronautics and Space Agency (US)

    CSIRO Canberra campus

    ESA DSA 1, hosts a 35-metre deep-space antenna with transmission and reception in both S- and X-band and is located 140 kilometres north of Perth, Western Australia, near the town of New Norcia

    CSIRO-Commonwealth Scientific and Industrial Research Organisation (AU) CSIRO R/V Investigator.

    UK Space NovaSAR-1 satellite (UK) synthetic aperture radar satellite.

    CSIRO Pawsey Supercomputing Centre (AU)

    Magnus Cray XC40 supercomputer at Pawsey Supercomputer Centre Perth Australia

    Galaxy Cray XC30 Series Supercomputer at Pawsey Supercomputer Centre Perth Australia

    Pawsey Supercomputer Centre CSIRO Zeus SGI Linux cluster

    Others not shown


    SKA – Square Kilometre Array

    SKA Square Kilometre Array low frequency at Murchison Widefield Array, Boolardy station in outback Western Australia on the traditional lands of the Wajarri peoples.

    EDGES telescope in a radio quiet zone at the Murchison Radio-astronomy Observatory in Western Australia, on the traditional lands of the Wajarri peoples.

  • richardmitnick 1:14 pm on January 19, 2022 Permalink | Reply
    Tags: "How Sandia Labs is revealing the inner workings of quantum computers", Gate set tomography, Gate set tomography even detects unexpected errors, Gate set tomography is Sandia’s flagship technique for measuring the performance of qubits and quantum logic operations, also known as “gates.”, Quantum Computing, Quantum processors with many more qubits could enable users working in national security, science and industry to perform some tasks faster than they could with a conventional computer.

    From DOE’s Sandia National Laboratories (US): “How Sandia Labs is revealing the inner workings of quantum computers” 

    From DOE’s Sandia National Laboratories (US)

    January 19, 2022

    Troy Rummler

    Gate set tomography used to discover and validate 2 innovations published in Nature.

    Sandia National Laboratories researchers Andrew Baczewski, left, and Erik Nielsen use gate set tomography to analyze problems in a quantum processor. Photo by Rebecca Gustaf.

    A precision diagnostic developed at the Department of Energy’s Sandia National Laboratories is emerging as a gold standard for detecting and describing problems inside quantum computing hardware.

    Two papers published today in the scientific journal Nature describe how separate research teams — one including Sandia researchers — used a Sandia technique called gate set tomography to develop and validate highly reliable quantum processors. Sandia has been developing gate set tomography since 2012, with funding from the DOE Office of Science through the Advanced Scientific Computing Research program.

    Sandia scientists collaborated with Australian researchers at The University of New South Wales (AU), led by Professor Andrea Morello, to publish one of today’s papers in Nature. Together, they used GST to show that a sophisticated, three-qubit system comprising two atomic nuclei and one electron in a silicon chip could be manipulated reliably with 99%-plus accuracy.

    In another Nature article appearing today, a group led by Professor Lieven Vandersypen at The Delft University of Technology [Technische Universiteit Delft](NL) used gate set tomography, implemented using Sandia software, to demonstrate the important milestone of 99%-plus accuracy but with a different approach, controlling electrons trapped within quantum dots instead of isolated atomic nuclei.

    “We want researchers everywhere to know they have access to a powerful, cutting-edge tool that will help them make their breakthroughs,” said Sandia scientist Robin Blume-Kohout.

    Future quantum processors with many more qubits, or quantum bits, could enable users working in national security, science and industry to perform some tasks faster than they ever could with a conventional computer. But flaws in current system controls cause computational errors. A quantum computer can correct some errors, but the more errors it must correct, the larger and more expensive that computer becomes to build.

    So, scientists need diagnostic tools to calculate how precisely they can control single atoms and electrons that store qubits and learn how to prevent errors instead of correcting them. This increases the reliability of their system while keeping costs down.

    Gate set tomography is Sandia’s flagship technique for measuring the performance of qubits and quantum logic operations, also known as “gates.” It combines results from many kinds of measurements to generate a detailed report describing every error occurring in the qubits. Experimental scientists like Morello can use the diagnostic results to deduce what they need to fix.

    “The Quantum Performance Laboratory at Sandia National Labs, led by Robin Blume-Kohout, has developed the most accurate method to identify the nature of the errors occurring in a quantum computer,” Morello said.

    Gate set tomography even detects unexpected errors

    The Sandia team maintains free, open-source GST software called pyGSTi (pronounced “pigsty,” which stands for Python Gate Set Tomography Implementation). Publicly available at http://www.pygsti.info, it was used by both research groups publishing in Nature today.

    While the Delft team used the pyGSTi software without assistance from the Sandia team, the UNSW-Sandia collaboration used a new, customized form of gate set tomography developed by the Sandia researchers. The new techniques enabled the team to rule out more potential error modes and focus on a few dominant error mechanisms.

    But when the Sandia team studied the GST analysis of the UNSW experimental data, they discovered a surprising kind of error that Morello’s group did not expect. The nuclear-spin qubits were interacting when they should have been isolated. Concerned that this error might indicate a flaw in the qubits, the team turned to Sandia’s Andrew Baczewski, an expert in silicon qubit physics and a researcher at the Quantum Systems Accelerator, a National Quantum Information Science Research Center, to help find its source.

    “It came to occupy a lot of my free time,” Baczewski said. “I would be out for a walk on a Saturday morning and, out of the blue, something would occur to me and I would run home and do math for an hour.”

    Eventually, Baczewski and the rest of the team tracked the error to a signal generator that was leaking microwaves into the system. This can be easily fixed in future experiments, now that the cause is known.

    Blume-Kohout said, “It was really fulfilling to see confirmation that GST even detected the errors that nobody expected.”

    “The collaboration with Sandia National Laboratories has been crucial to achieve the milestone of high-fidelity quantum operations in silicon,” Morello said. “The theoretical and computational methods developed at Sandia have enabled the rigorous demonstration of quantum computing with better than 99% fidelity and have provided precious insights into the microscopic causes of the residual errors. We plan to expand this strategic collaboration in years to come.”

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Sandia Campus.

    DOE’s Sandia National Laboratories (US), managed and operated by the National Technology and Engineering Solutions of Sandia (a wholly owned subsidiary of Honeywell International), is one of three National Nuclear Security Administration (US) research and development laboratories in the United States. Their primary mission is to develop, engineer, and test the non-nuclear components of nuclear weapons and high technology. Headquartered in Central New Mexico near the Sandia Mountains, on Kirtland Air Force Base in Albuquerque, Sandia also has a campus in Livermore, California, next to DOE’s Lawrence Livermore National Laboratory (US), and a test facility in Waimea, Kauai, Hawaii.

    It is Sandia’s mission to maintain the reliability and surety of nuclear weapon systems, conduct research and development in arms control and nonproliferation technologies, and investigate methods for the disposal of the United States’ nuclear weapons program’s hazardous waste.

    Other missions include research and development in energy and environmental programs, as well as the surety of critical national infrastructures. In addition, Sandia is home to a wide variety of research including computational biology; mathematics (through its Computer Science Research Institute); materials science; alternative energy; psychology; MEMS; and cognitive science initiatives.

    Sandia formerly hosted ASCI Red, one of the world’s fastest supercomputers until its decommissioning, and now hosts the ASCI Red Storm supercomputer, originally known as Thor’s Hammer.

    Sandia is also home to the Z Machine.

    The Z Machine is the largest X-ray generator in the world and is designed to test materials in conditions of extreme temperature and pressure. It is operated by Sandia National Laboratories to gather data to aid in computer modeling of nuclear weapons. In December 2016, it was announced that National Technology and Engineering Solutions of Sandia, under the direction of Honeywell International, would take over the management of Sandia National Laboratories starting on May 1, 2017.

  • richardmitnick 12:07 pm on January 15, 2022 Permalink | Reply
    Tags: , , , , Quantum Computing, , , , , "From bits to qubits"   

    From Symmetry: “From bits to qubits” 

    Symmetry Mag

    From Symmetry

    Sarah Charley

    Illustration by Sandbox Studio, Chicago with Ana Kova.

    Quantum computers go beyond the binary.

    The first desktop computer was invented in the 1960s. But computing technology has been around for centuries, says Irfan Siddiqi, director of the Quantum Nanoelectronics Laboratory at The University of California-Berkeley (US).

    “An abacus is an ancient computer,” he says. “The materials science revolution made bits smaller, but the fundamental architecture hasn’t changed.”

    Both modern computers and abaci use basic units of information that have two possible states. In a classical computer, a binary digit (called a bit) is a 1 or a 0, represented by on-off switches in the hardware. On an abacus, a sliding bead can also be thought of as being “on” or “off,” based on its position (left or right on an abacus with horizontal rods, or up or down on an abacus with vertical ones). Bits and beads can form patterns that represent other numbers and, in the case of computers, letters and symbols.

    But what if there were even more possibilities? What if the beads of an abacus could sit in between two positions? What if the switches in a computer could consult each other before outputting a calculation?

    This is the fundamental idea behind quantum computers, which embrace the oddities of quantum mechanics to encode and process information.

    “Information in quantum mechanics is stored in very different ways than in classical mechanics, and that’s where the power comes from,” says Heather Gray, an assistant professor and particle physicist at UC Berkeley.

    Classical computer; classical mechanics

    Computing devices break down numbers into discrete components. A simple abacus could be made up of three rows: one with beads representing 100s, one with beads representing 10s, and one with beads representing 1s. In this case, the number 514 could be indicated by sliding to the right 5 beads in the 100s row, 1 bead in the 10s row, and 4 beads in the 1s row.

    The computer you may be using to read this article does something similar, counting by powers of two instead of 10s. In binary, the number 514 becomes 1000000010.
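    The conversion described above is easy to check. Here is a minimal Python sketch (an illustration added for clarity, not part of the original article) showing the same number 514 broken into decimal "abacus" digits and into binary:

```python
# Represent 514 on a decimal "abacus" (powers of 10) and in binary (powers of 2).
n = 514

# Decimal digits: 5 hundreds, 1 ten, 4 ones.
decimal_digits = [int(d) for d in str(n)]

# Binary digits: bin() returns a string like '0b1000000010'; [2:] strips the prefix.
binary_digits = bin(n)[2:]

print(decimal_digits)              # [5, 1, 4]
print(binary_digits)               # 1000000010
print(int(binary_digits, 2) == n)  # True: 512 + 2 = 514
```

    Reading the binary string back with `int(binary_digits, 2)` confirms the round trip: 514 is one 512 (2^9) plus one 2 (2^1).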

    The more complex the task, the more bits or time a computer needs to perform the calculation. To speed things up, scientists have over the years found ways to fit more and more bits into a computer. “You can now have one trillion transistors on a small silicon chip, which is a far cry from the ancient Chinese abacus,” Siddiqi says.

    But as engineers make transistors smaller and smaller, they’ve started to notice some funny effects.

    The quantum twist on computing

    Bits that behave classically are determinate: A 1 is a 1. But at very small scales, an entirely new set of physical rules comes into play.

    “We are hitting the quantum limits,” says Alberto Di Meglio, the head of CERN’s Quantum Technology Initiative. “As the scale of classic computing technology becomes smaller and smaller, quantum mechanics’ effects are not negligible anymore, and we do not want this in classic computers.”

    But quantum computers use quantum mechanics to their benefit. Rather than offering decisive answers, quantum bits, called qubits, behave like a distribution of probable values.

    Di Meglio likens qubits to undecided voters in an election. “You might know how a particular person is likely to vote, but until you actually ask them to vote, you won’t have a definite answer,” Di Meglio says.

    Qubits can be made from subatomic particles, such as electrons. Like other, similar particles, electrons have a property called spin that can exist in one of two possible states (spin-up or spin-down).

    If we think of these electrons as undecided voters, the question they are voting on is their direction of spin. Quantum computers process information while the qubits are still undecided—somewhere in between spin-up and spin-down.
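    This "undecided until asked" behavior can be sketched numerically. The following Python toy model (the amplitudes 0.6 and 0.8 are arbitrary illustrative values, not from the article) shows that a single measurement yields a definite up-or-down answer, while many repeated preparations reveal the underlying probabilities:

```python
import random

# A qubit state is a pair of amplitudes (a, b) with |a|^2 + |b|^2 = 1.
# |a|^2 is the probability of measuring spin-up, |b|^2 spin-down.
a, b = 0.6, 0.8            # arbitrary normalized superposition: 0.36 + 0.64 = 1
p_up = abs(a) ** 2

def measure(p_up):
    """Measuring collapses the superposition to one definite outcome."""
    return "up" if random.random() < p_up else "down"

# One measurement gives a single, definite "vote" ...
random.seed(0)
print(measure(p_up))

# ... but only repeated preparations expose the probabilities.
counts = {"up": 0, "down": 0}
for _ in range(10_000):
    counts[measure(p_up)] += 1
print(counts)  # roughly 36% up, 64% down
```

    Like Di Meglio's undecided voter, the qubit has no definite value before measurement; only the statistics of many trials are predictable.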

    The situation becomes even more complicated when the “voters” can influence one another. This happens when two qubits are entangled. “For example, if one person votes yes, then an entangled ‘undecided’ voter will automatically vote no,” Di Meglio says. “The relationships become important, and the more voters you put together, the more chaotic it becomes.”

    When the qubits start talking to each other, each qubit can find itself in many different configurations, Siddiqi says. “An entangled array of qubits—with ‘n’ number of qubits—can exist in 2^n configurations. A quantum computer with 300 good qubits would have 2^300 possible configurations, which is more than the number of particles in the known universe.”
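    Siddiqi's 2^n counting argument can be verified directly, since Python handles arbitrarily large integers (the ~10^80 particle estimate below is the commonly quoted figure, used here only for comparison):

```python
# Number of basis configurations for n entangled qubits: 2**n.
for n in (1, 2, 10, 300):
    print(n, "qubits ->", 2 ** n, "configurations")

# Commonly quoted estimate for particles in the observable universe: ~10**80.
particles_in_universe = 10 ** 80
print(2 ** 300 > particles_in_universe)  # True
```

    Even 300 qubits already outstrip that estimate by roughly ten orders of magnitude, which is why classical machines cannot brute-force-simulate modestly sized entangled systems.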

    With great power comes great… noise

    Entanglement allows a quantum computer to perform a complex task in a fraction of the time it would take a classical computer. But entanglement is also the quantum computer’s greatest weakness.

    “A qubit can get entangled with something else that you don’t have access to,” Siddiqi says. “Information can leave the system.”

    An electron from the computer’s power supply or a stray photon can entangle with a qubit and make it go rogue.

    “Quantum computing is not just about the number of qubits,” Di Meglio says. “You might have a quantum computer with thousands of qubits, but only a fraction are reliable.”

    Because of the problem of rogue qubits, today’s quantum computers are classified as noisy intermediate-scale quantum, or NISQ, devices. “Most quantum computers look like a physics experiment,” Gray says. “We’re very far from having one you could use at home.”

    But scientists are trying. In the future, scientists hope that they can use quantum computers to quickly search through large databases and calculate complex mathematical matrices.

    Today, physicists are already experimenting with quantum computers to simulate quantum processes, such as how particles interact with each other inside the detectors at the Large Hadron Collider. “You can do all sorts of cool things with entangled qubits,” Gray says.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.

  • richardmitnick 10:38 am on January 12, 2022 Permalink | Reply
    Tags: , At the dawn of the 20th century a new theory of matter and energy was emerging., , Could a quantum worldview prove useful outside the lab?, Information Theory: a blend of math and computer science, , One of the main questions quantum mechanics addressed was the nature of light-particle or wave, , Peter Shor: a fast-factoring algorithm for a quantum computer-a computer whose bits exist in superposition and can be entangled., Physicists developed a new system of mechanics to describe what seemed to be a quantized and uncertain probabilistic world-Heisenberg's Uncertainty Principle, , Quantum Computing, , , , , Shor’s algorithm is of particular interest in encryption because of the difficulty of identifying the prime factors of large numbers., Shor’s algorithm was designed to quickly divide large numbers into their prime factors., The second quantum revolution also relies on and encompasses new ways of using technology to manipulate matter at the quantum level., Today’s quantum computers are not yet advanced enough to implement Shor’s algorithm., , Vacuum tubes, What changed was Shor’s introduction of error-correcting codes.   

    From Symmetry: “The second quantum revolution” 

    Symmetry Mag

    From Symmetry

    Daniel Garisto

    Illustration by Ana Kova / Sandbox Studio, Chicago.

    Inventions like the transistor and laser changed the world. What changes will the second quantum revolution bring?

    For physicists trying to harness the power of electricity, no tool was more important than the vacuum tube. This lightbulb-like device controlled the flow of electricity and could amplify signals. In the early 20th century, vacuum tubes were used in radios, televisions and long-distance telephone networks.

    But vacuum tubes had significant drawbacks: They generated heat; they were bulky; and they had a propensity to burn out. Physicists at Bell Labs, a spin-off of AT&T, were interested in finding a replacement.

    Applying their knowledge of quantum mechanics—specifically how electrons flowed between materials with electrical conductivity—they found a way to mimic the function of vacuum tubes without those shortcomings.

    They had invented the transistor. At the time, the invention did not grace the front page of any major news publications. Even the scientists themselves couldn’t have appreciated just how important their device would be.

    First came the transistor radio, popularized in large part by the new Japanese company Sony. Spreading portable access to radio broadcasts changed music and connected disparate corners of the world.

    Transistors then paved the way for NASA’s Apollo Project, which first took humans to the moon. And perhaps most importantly, transistors were made smaller and smaller, shrinking room-sized computers and magnifying their power to eventually create laptops and smartphones.

    These quantum-inspired devices are central to every single modern electronic application that uses some computing power, such as cars, cellphones and digital cameras. You would not be reading this sentence without transistors, which are an important part of what is now called the First Quantum Revolution.

    Quantum physicists Jonathan Dowling and Gerard Milburn coined the term “quantum revolution” in a 2002 paper [The Royal Society]. In it, they argue that we have now entered a new era, a Second Quantum Revolution. “It just dawned on me that actually there was a whole new technological frontier opening up,” says Milburn, professor emeritus at The University of Queensland (AU).

    This second quantum revolution is defined by developments in technologies like quantum computing and quantum sensing, brought on by a deeper understanding of the quantum world and precision control down to the level of individual particles.

    A quantum understanding

    At the dawn of the 20th century a new theory of matter and energy was emerging. Unsatisfied with classical explanations about the strange behavior of particles, physicists developed a new system of mechanics to describe what seemed to be a quantized, uncertain, probabilistic world.

    One of the main questions quantum mechanics addressed was the nature of light. Eighteenth-century physicists believed light was a particle. Nineteenth-century physicists proved it had to be a wave. Twentieth-century physicists resolved the problem by redefining particles using the principles of quantum mechanics. They proposed that particles of light, now called photons, had some probability of existing in a given location—a probability that could be represented as a wave and even experience interference like one.

    This newfound picture of the world helped make sense of results such as those of the double-slit experiment, which showed that particles like electrons and photons could behave as if they were waves.

    But could a quantum worldview prove useful outside the lab?

    At first, “quantum was usually seen as just a source of mystery and confusion and all sorts of strange paradoxes,” Milburn says.

    But after World War II, people began figuring out how to use those paradoxes to get things done. Building on new quantum ideas about the behavior of electrons in metals and other materials, Bell Labs researchers William Shockley, John Bardeen and Walter Brattain created the first transistors. They realized that sandwiching semiconductors together could create a device that would allow electrical current to flow in one direction, but not another. Other technologies, such as atomic clocks and the nuclear magnetic resonance used for MRI scans, were also products of the first quantum revolution.

    Another important and, well, visible quantum invention was the laser.

    In the 1950s, optical physicists knew that hitting certain kinds of atoms with a few photons at the right energy could lead them to emit more photons with the same energy and direction as the initial photons. This effect would cause a cascade of photons, creating a stable, straight beam of light unlike anything seen in nature. Today, lasers are ubiquitous, used in applications from laser pointers to barcode scanners to life-saving medical techniques.

    All of these devices were made possible by studies of the quantum world. Both the laser and transistor rely on an understanding of quantized atomic energy levels. Milburn and Dowling suggest that the technologies of the first quantum revolution are unified by “the idea that matter particles sometimes behaved like waves, and that light waves sometimes acted like particles.”

    For the first time, scientists were using their understanding of quantum mechanics to create new tools that could be used in the classical world.

    The second quantum revolution

    Many of these developments were described to the public without resorting to the word “quantum,” as this Bell Labs video about the laser attests.

    One reason for the disconnect was that the first quantum revolution didn’t make full use of quantum mechanics. “The systems were too noisy. In a sense, the full richness of quantum mechanics wasn’t really accessible,” says Ivan Deutsch, a quantum physicist at The University of New Mexico (US). “You can get by with a fairly classical picture.”

    The stage for the second quantum revolution was set in the 1960s, when the Northern Irish physicist John Stewart Bell [B.Sc. The Queen’s University of Belfast (NIR); Ph.D. The University of Birmingham (UK); The European Organization for Nuclear Research [Organisation européenne pour la recherche nucléaire] [Europäische Organisation für Kernforschung] (CH) [CERN]; Stanford University (US)] shook the foundations of quantum mechanics. Bell proposed that entangled particles were correlated in strange quantum ways that could not be explained with so-called “hidden variables.” Tests performed in the ’70s and ’80s confirmed that measuring one entangled particle really did seem to determine the state of the other, faster than any signal could travel between the two.

    The other critical ingredient for the second quantum revolution was information theory, a blend of math and computer science developed by pioneers like Claude Shannon and Alan Turing. In 1994, combining new insight into the foundations of quantum mechanics with information theory led the mathematician Peter Shor to introduce a fast-factoring algorithm for a quantum computer, a computer whose bits exist in superposition and can be entangled.

    Shor’s algorithm was designed to quickly divide large numbers into their prime factors. Using the algorithm, a quantum computer could solve the problem much more efficiently than a classical one. It was the clearest early demonstration of the worth of quantum computing.

    “It really made the whole idea of quantum information a new concept, one that those of us who had been working in related areas instantly appreciated,” Deutsch says. “Shor’s algorithm suggested the possibilities new quantum tech could have over existing classical tech, galvanizing research across the board.”

    Shor’s algorithm is of particular interest in encryption because the difficulty of identifying the prime factors of large numbers is precisely what keeps data private online. To unlock encrypted information, a computer must know the prime factors of a large number associated with it. Use a large enough number, and the puzzle of guessing its prime factors can take a classical computer thousands of years. With Shor’s algorithm, the guessing game can take just moments.
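    To make the classical difficulty concrete, here is naive trial-division factoring in Python (a sketch only; the toy semiprime below is invented for illustration, and real RSA moduli are hundreds of digits, far beyond any trial-division approach):

```python
def trial_factor(n):
    """Return the smallest prime factor of n by trial division.

    The loop runs up to sqrt(n) steps, and sqrt(n) grows
    exponentially with the number of digits of n -- this blow-up
    is what keeps classically encrypted data private.
    """
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return n  # n itself is prime

# A toy "RSA modulus": the product of two primes.
n = 101 * 103
p = trial_factor(n)
print(p, n // p)  # 101 103
```

    A quantum computer running Shor's algorithm would find such factors in time polynomial in the number of digits, turning the thousands-of-years guessing game into moments.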

    Today’s quantum computers are not yet advanced enough to implement Shor’s algorithm. But as Deutsch points out, skeptics once doubted a quantum computer was even possible.

    “Because there was a kind of trade-off,” he says. “The kind of exponential increase in computational power that might come from quantum superpositions would be counteracted exactly, by exponential sensitivity to noise.”

    While inventions like the transistor required knowledge of quantum mechanics, the device itself wasn’t in a delicate quantum state, so it could be described semi-classically. Quantum computers, on the other hand, require delicate quantum connections.

    What changed was Shor’s introduction of error-correcting codes. By combining concepts from classical information theory with quantum mechanics, Shor showed that, in theory, even the delicate state of a quantum computer could be preserved.

    Beyond quantum computing, the second quantum revolution also relies on and encompasses new ways of using technology to manipulate matter at the quantum level.

    Using lasers, researchers have learned to sap the energy of atoms and cool them. Like a soccer player dribbling a ball up field with a series of taps, lasers can cool atoms to billionths of a degree above absolute zero—far colder than conventional cooling techniques. In 1995, scientists used laser cooling to observe a long-predicted state of matter: the Bose-Einstein condensate.

    Other quantum optical techniques have been developed to make ultra-precise measurements.

    Classical interferometers, like the type used in the famous Michelson-Morley experiment that measured the speed of light in different directions to search for signs of a hypothetical aether, looked at the interference pattern of light. New matter-wave interferometers exploit the principle that everything—not just light—has a wavefunction. Measuring changes in the phase of atoms, which have far shorter wavelengths than light, could give unprecedented control to experiments that attempt to measure the smallest effects, like those of gravity.
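    The "far shorter wavelengths" claim follows from the de Broglie relation, lambda = h / (m v). A rough Python estimate (the cesium mass and the 1 m/s speed of a laser-cooled atom are assumed round numbers for illustration):

```python
h = 6.626e-34    # Planck constant, J*s

# de Broglie wavelength: lambda = h / (m * v)
m_cs = 2.21e-25  # mass of a cesium atom, kg (approximate)
v = 1.0          # speed of a slow, laser-cooled atom, m/s (assumed)
atom_wavelength = h / (m_cs * v)

photon_wavelength = 500e-9  # visible light, m

print(f"atom:   {atom_wavelength:.1e} m")    # ~3e-9 m
print(f"photon: {photon_wavelength:.1e} m")
print(f"ratio:  {photon_wavelength / atom_wavelength:.0f}")
```

    Even at this gentle speed the atom's wavelength is over a hundred times shorter than visible light, which is what gives matter-wave interferometers their extra phase sensitivity.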

    With laboratories and companies around the world focused on advancements in quantum science and applications, the second quantum revolution has only begun. As Bardeen put it in his Nobel lecture, we may be at another “particularly opportune time … to add another small step in the control of nature for the benefit of [hu]mankind.”

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.

  • richardmitnick 10:51 am on January 11, 2022 Permalink | Reply
    Tags: "Looking at a new quantum revolution", , , , , , Quantum Computing,   

    From Symmetry: “Looking at a new quantum revolution” 

    Symmetry Mag

    From Symmetry

    Kathryn Jepsen

    Illustration by Sandbox Studio, Chicago with Ana Kova.

    This month, Symmetry presents a series of articles on the past, present and future of quantum research—and its many connections to particle physics, astrophysics and computing.

    On July 25, 2018, a group of scientists from Microsoft, Google and IBM sat on a stage at the Computer History Museum in Mountain View, California. Matthias Troyer, John Martinis and Pat Gumann were all working on research into quantum computing, which takes advantage of our knowledge of quantum mechanics, the physics of how the world operates at the smallest level.

    The evening was billed as a night to ask the experts

    Quantum Questions.
    CHM Live | Quantum Questions

    About an hour into the event, moderator and historian David Brock asked the scientists one last thing: “What do you think—for us as, you know, citizens of the world—what are the most important things for us to know about and keep in mind about quantum computing, as it is today?”

    Troyer called attention to the museum displays around them. “When you look back at the history of computing… the abacus works on the same principle of the most modern, fastest classical CPU. It’s discrete, digital logic. There’s been no change in the way we compute for the last 5,000 years.

    “And now is the time when this is changing,” he said, “because with quantum computing we are radically changing the way we use nature to compute.”

    Scientists have called this moment a second quantum revolution. The first quantum revolution brought us developments like the transistor, which enabled the creation of powerful, portable modern electronic devices.

    It’s not yet clear what this new revolution will bring. But plenty of computer scientists, physicists and engineers are hard at work to find out. Around the world, research institutions, universities and businesses have been ramping up their investments in quantum science.

    At the end of 2018, the United States passed the National Quantum Initiative Act, which led to the establishment of five new Department of Energy Quantum Information Science Research Centers; five new National Science Foundation Quantum Leap Challenge Institutes; and the National Institute of Standards and Technology’s Quantum Economic Development Consortium.

    Efforts to develop quantum computers, quantum sensors and quantum networks have the potential to change our lives. And some of the first applications of these developments could be in particle physics and astrophysics research.

    Throughout the month of January, Symmetry will publish a series of articles meant to give readers a better understanding of this quantum ecosystem—the physics ideas it’s based on, the ways this knowledge can be applied, and what will determine the shape of our quantum future.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.

  • richardmitnick 3:34 pm on January 8, 2022 Permalink | Reply
    Tags: "Materials theorist Yuan Ping wins NSF CAREER Award", , , , Critical properties of spin qubits include quantum coherence which determines how long the spin state will last., , Ping’s first-principles approach will eliminate the need for prior input parameters., Ping’s group has developed computational tools for predicting spin dynamics in solid-state materials which they will use to study the properties of spin qubits., Quantum Computing, , The funding for this project also includes support for a range of education and outreach activities., , Understanding kinetics of excited states and spin qubit relaxation and decoherence is the core issue of spin-based quantum information science., Yuan Ping   

    From The University of California-Santa Cruz (US) : “Materials theorist Yuan Ping wins NSF CAREER Award” 

    From The University of California-Santa Cruz (US)

    January 05, 2022
    Tim Stephens

    Yuan Ping

    Yuan Ping, assistant professor of chemistry and biochemistry at UC Santa Cruz, has received a Faculty Early Career Development (CAREER) Award from The National Science Foundation (US) to support her work developing computational platforms to investigate the physics of new materials for quantum computers and other applications of quantum information science.

    In quantum computers, information is encoded in quantum bits, or qubits, which can be made from any quantum system that has two states. One promising approach is based on the spin states of electrons. Ping’s group has developed a theoretical framework and computational tools for predicting spin dynamics in solid-state materials, which they will use to study the properties of spin qubits.

    Critical properties of spin qubits include quantum coherence, which determines how long the spin state will last (or how long the encoded information will remain intact); readout efficiency, which determines the fidelity with which information can be extracted from a qubit; and quantum transduction, which determines whether quantum information can be transferred and communicated among qubits over a long range.
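    Coherence is commonly summarized by a decay time T2. A minimal model sketched in Python (the simple exponential decay and the 1 ms T2 value are assumptions for illustration, not results from Ping's work):

```python
import math

def coherence(t, t2):
    """Fraction of phase coherence remaining after time t,
    under a simple exponential-decay model with coherence time T2."""
    return math.exp(-t / t2)

t2 = 1.0e-3  # an assumed spin coherence time of 1 ms
for t in (0.0, 0.5e-3, 1.0e-3, 3.0e-3):
    print(f"t = {t:.1e} s -> coherence = {coherence(t, t2):.3f}")
```

    In this picture, after one T2 only about 37% (1/e) of the phase information survives, which is why materials-specific predictions of T2 matter so much for qubit design.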

    “Understanding kinetics of excited states and spin qubit relaxation and decoherence is the core issue of spin-based quantum information science,” Ping said. “In this project, we will develop a computational platform to tackle these issues for spin qubits.”

    All of these properties are materials-specific, and previous efforts have relied mostly on simplified models which require inputs from prior experiments. Ping’s first-principles approach will eliminate the need for prior input parameters and will open the path for designing novel quantum materials with the potential to enable unprecedented performance for applications in quantum information science.

    “Stable, scalable, and reliable quantum information science has the potential to transform and advance knowledge across a large number of critical fields through next-generation technologies for sensing, computing, modeling, and communicating,” Ping said.

    The funding for this project also includes support for a range of education and outreach activities. These include strengthening undergraduate education in physical chemistry through a summer bootcamp; developing computational materials research through new courses and undergraduate research programs; and supporting women and underrepresented groups through UCSC’s Women in Science and Engineering program.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    UC Santa Cruz (US) Lick Observatory Since 1888 Mt Hamilton, in San Jose, California, Altitude 1,283 m (4,209 ft)

    UC Observatories Lick Automated Planet Finder fully robotic 2.4-meter optical telescope at Lick Observatory, situated on the summit of Mount Hamilton, east of San Jose, California, USA.

    The UCO Lick C. Donald Shane telescope is a 120-inch (3.0-meter) reflecting telescope located at the Lick Observatory, Mt Hamilton, in San Jose, California, Altitude 1,283 m (4,209 ft).
    UC Santa Cruz (US) campus.

    The University of California-Santa Cruz (US) , opened in 1965 and grew, one college at a time, to its current (2008-09) enrollment of more than 16,000 students. Undergraduates pursue more than 60 majors supervised by divisional deans of humanities, physical & biological sciences, social sciences, and arts. Graduate students work toward graduate certificates, master’s degrees, or doctoral degrees in more than 30 academic fields under the supervision of the divisional and graduate deans. The dean of the Jack Baskin School of Engineering oversees the campus’s undergraduate and graduate engineering programs.

    UCSC is the home base for the Lick Observatory.

    UCO Lick Observatory’s 36-inch Great Refractor telescope housed in the South (large) Dome of main building.

    Search for extraterrestrial intelligence expands at Lick Observatory
    New instrument scans the sky for pulses of infrared light
    March 23, 2015
    By Hilary Lebow

    Astronomers are expanding the search for extraterrestrial intelligence into a new realm with detectors tuned to infrared light at UC’s Lick Observatory. A new instrument, called NIROSETI, will soon scour the sky for messages from other worlds.

    “Infrared light would be an excellent means of interstellar communication,” said Shelley Wright, an assistant professor of physics at UC San Diego (US) who led the development of the new instrument while at the U Toronto Dunlap Institute for Astronomy and Astrophysics (CA).

    Shelley Wright of UC San Diego (US) with NIROSETI, developed at U Toronto Dunlap Institute for Astronomy and Astrophysics (CA), at the 1-meter Nickel Telescope at Lick Observatory at UC Santa Cruz.

    Wright worked on an earlier SETI project at Lick Observatory as a UC Santa Cruz undergraduate, when she built an optical instrument designed by University of California-Berkeley (US) researchers. The infrared project takes advantage of new technology not available for that first optical search.

    Infrared light would be a good way for extraterrestrials to get our attention here on Earth, since pulses from a powerful infrared laser could outshine a star, if only for a billionth of a second. Interstellar gas and dust are almost transparent to near infrared, so these signals can be seen from great distances. It also takes less energy to send information using infrared signals than with visible light.

    Frank Drake, professor emeritus of astronomy and astrophysics at UC Santa Cruz and director emeritus of the SETI Institute, said there are several additional advantages to a search in the infrared realm.

    Frank Drake with his Drake Equation. Credit Frank Drake.

    “The signals are so strong that we only need a small telescope to receive them. Smaller telescopes can offer more observational time, and that is good because we need to search many stars for a chance of success,” said Drake.

    The only downside is that extraterrestrials would need to be transmitting their signals in our direction, Drake said, though he sees this as a positive side to that limitation. “If we get a signal from someone who’s aiming for us, it could mean there’s altruism in the universe. I like that idea. If they want to be friendly, that’s who we will find.”

    Scientists have searched the skies for radio signals for more than 50 years and expanded their search into the optical realm more than a decade ago. The idea of searching in the infrared is not a new one, but instruments capable of capturing pulses of infrared light only recently became available.

    “We had to wait,” Wright said. “I spent eight years waiting and watching as new technology emerged.”

    Now that technology has caught up, the search will extend to stars thousands of light years away, rather than just hundreds. NIROSETI, or Near-Infrared Optical Search for Extraterrestrial Intelligence, could also uncover new information about the physical universe.

    “This is the first time Earthlings have looked at the universe at infrared wavelengths with nanosecond time scales,” said Dan Werthimer, UC Berkeley SETI Project Director. “The instrument could discover new astrophysical phenomena, or perhaps answer the question of whether we are alone.”

    NIROSETI will also gather more information than previous optical detectors by recording levels of light over time so that patterns can be analyzed for potential signs of other civilizations.

    “Searching for intelligent life in the universe is both thrilling and somewhat unorthodox,” said Claire Max, director of UC Observatories and professor of astronomy and astrophysics at UC Santa Cruz. “Lick Observatory has already been the site of several previous SETI searches, so this is a very exciting addition to the current research taking place.”

    NIROSETI will scan the skies several times a week on the Nickel 1-meter telescope at Lick Observatory, located on Mt. Hamilton east of San Jose.

  • richardmitnick 5:40 pm on January 7, 2022 Permalink | Reply
    Tags: "Magic-angle" graphene becomes a powerful ferromagnet., "Magic-angle" graphene has caused quite a stir in physics in recent years., "Magnetic surprise revealed in ‘magic-angle’ graphene", “Magnetism and superconductivity are usually at opposite ends of the spectrum., Changing the angle of the sheets with respect to each other changes the interactions., Computer memory, Electrons begin to interact not only with other electrons within a graphene sheet but also with those in the adjacent sheet., Exciting new possibilities for quantum science research, Magnets are generally destructive to superconductivity., Quantum Computing, Spin-orbit coupling is a state of electron in which each electron’s spin (its tiny magnetic moment that points either up or down) becomes linked to its orbit around the atomic nucleus., Things get interesting when graphene sheets are stacked., When “magic-angle” graphene is cooled to near absolute zero it suddenly becomes a superconductor meaning it conducts electricity with zero resistance.

    From Brown University (US) : “Magnetic surprise revealed in ‘magic-angle’ graphene” 

    From Brown University (US)

    January 6, 2022
    Kevin Stacey

    Magnets and superconductors don’t normally get along, but a new study shows that “magic-angle” graphene is capable of producing both superconductivity and ferromagnetism, which could be useful in quantum computing.

    When layers of “magic-angle” graphene (bottom) come in contact with layers of certain transition metals, a phenomenon called spin-orbit coupling is induced in the graphene layers. That phenomenon gives rise to surprising physics, including ferromagnetism. Credit: Li lab/Brown University.

    When two sheets of the carbon nanomaterial graphene are stacked together at a particular angle with respect to each other, it gives rise to some fascinating physics. For instance, when this so-called “magic-angle” graphene is cooled to near absolute zero, it suddenly becomes a superconductor, meaning it conducts electricity with zero resistance.

    Now, a research team from Brown University has found a surprising new phenomenon that can arise in “magic-angle” graphene. In research published in the journal Science, the team showed that by inducing a phenomenon known as spin-orbit coupling, “magic-angle” graphene becomes a powerful ferromagnet.

    “Magnetism and superconductivity are usually at opposite ends of the spectrum in condensed matter physics, and it’s rare for them to appear in the same material platform,” said Jia Li, an assistant professor of physics at Brown and senior author of the research. “Yet we’ve shown that we can create magnetism in a system that originally hosts superconductivity. This gives us a new way to study the interplay between superconductivity and magnetism, and provides exciting new possibilities for quantum science research.”

    “Magic-angle” graphene has caused quite a stir in physics in recent years. Graphene is a two-dimensional material made of carbon atoms arranged in a honeycomb-like pattern. Single sheets of graphene are interesting on their own — displaying remarkable material strength and extremely efficient electrical conductance. But things get even more interesting when graphene sheets are stacked. Electrons begin to interact not only with other electrons within a graphene sheet but also with those in the adjacent sheet. Changing the angle of the sheets with respect to each other changes those interactions, giving rise to interesting quantum phenomena like superconductivity.

    This new research adds a new wrinkle — spin-orbit coupling — to this already interesting system. Spin-orbit coupling is a state of electron behavior in certain materials in which each electron’s spin — its tiny magnetic moment that points either up or down — becomes linked to its orbit around the atomic nucleus.

    “We know that spin-orbit coupling gives rise to a wide range of interesting quantum phenomena, but it’s not normally present in ‘magic-angle’ graphene,” said Jiang-Xiazi Lin, a postdoctoral researcher at Brown and the study’s lead author. “We wanted to introduce spin-orbit coupling, and then see what effect it had on the system.”

    To do that, Li and his team interfaced “magic-angle” graphene with a block of tungsten diselenide, a material that has strong spin-orbit coupling. Aligning the stack precisely induces spin-orbit coupling in the graphene. From there, the team probed the system with external electrical currents and magnetic fields.

    The experiments showed that an electric current flowing in one direction across the material in the presence of an external magnetic field produces a voltage in the direction perpendicular to the current. That voltage, known as the Hall effect, is the tell-tale signature of an intrinsic magnetic field in the material.

    Much to the research team’s surprise, they showed that the magnetic state could be controlled using an external magnetic field, which is oriented either in the plane of the graphene or out-of-plane. This is in contrast with magnetic materials without spin-orbit coupling, where the intrinsic magnetism can be controlled only when the external magnetic field is aligned along the direction of the magnetism.

    “This observation is an indication that spin-orbit coupling is indeed present and provided the clue for building a theoretical model to understand the influence of the atomic interface,” said Yahui Zhang, a theoretical physicist from Harvard University (US) who worked with the team at Brown to understand the physics associated with the observed magnetism.

    “The unique influence of spin-orbit coupling gives scientists a new experimental knob to turn in the effort to understand the behavior of ‘magic-angle’ graphene,” said Erin Morrissette, a Brown graduate student who performed some of the experimental work. “The findings also have the potential for new device applications.”

    One possible application is in computer memory. The team found that the magnetic properties of “magic-angle” graphene can be controlled with both external magnetic fields and electric fields. That would make this two-dimensional system an ideal candidate for a magnetic memory device with flexible read/write options.

    Another potential application is in quantum computing, the researchers say. An interface between a ferromagnet and a superconductor has been proposed as a potential building block for quantum computers. The problem, however, is that such an interface is difficult to create because magnets are generally destructive to superconductivity. But a material that’s capable of both ferromagnetism and superconductivity could provide a way to create that interface.

    “We are working on using the atomic interface to stabilize superconductivity and ferromagnetism at the same time,” Li said. “The coexistence of these two phenomena is rare in physics, and it will certainly unlock more excitement.”

    The research was primarily supported by Brown University. Additional co-authors are Ya-Hui Zhang, Zhi Wang, Song Liu, Daniel Rhodes, Kenji Watanabe, Takashi Taniguchi and James Hone.

    See the full article here.


    Welcome to Brown

    Brown U Robinson Hall

    Brown University (US) is a private Ivy League research university in Providence, Rhode Island. Founded in 1764 as the College in the English Colony of Rhode Island and Providence Plantations, Brown is the seventh-oldest institution of higher education in the United States and one of the nine colonial colleges chartered before the American Revolution.

    At its foundation, Brown University was the first college in North America to accept students regardless of their religious affiliation. The university is home to the oldest applied mathematics program in the United States, the oldest engineering program in the Ivy League, and the third-oldest medical program in New England. The university was one of the early doctoral-granting U.S. institutions in the late 19th century, adding masters and doctoral studies in 1887. In 1969, Brown adopted its “Open Curriculum” after a period of student lobbying. The new curriculum eliminated mandatory “general education” distribution requirements, made students “the architects of their own syllabus” and allowed them to take any course for a grade of satisfactory (Pass) or no-credit (Fail) which is unrecorded on external transcripts. In 1971, Brown’s coordinate women’s institution, Pembroke College (US), was fully merged into the university.

    Admission is among the most selective in the United States; in 2021, the university reported an acceptance rate of 5.4%.

    The university comprises the College; the Graduate School; Alpert Medical School; the School of Engineering; the School of Public Health and the School of Professional Studies. Brown’s international programs are organized through The Watson Institute for International and Public Affairs at Brown University (US), and the university is academically affiliated with the UChicago Marine Biological Laboratory in Woods Hole, Massachusetts (US) and The Rhode Island School of Design (US). In conjunction with the Rhode Island School of Design, Brown offers undergraduate and graduate dual degree programs.

    Brown’s main campus is located in the College Hill neighborhood of Providence, Rhode Island. The university is surrounded by a federally listed architectural district with a dense concentration of Colonial-era buildings. Benefit Street, which runs along the western edge of the campus, contains one of the richest concentrations of 17th and 18th century architecture in the United States.

    As of November 2019, nine Nobel Prize winners have been affiliated with Brown as alumni, faculty, or researchers, as well as seven National Humanities Medalists and ten National Medal of Science laureates. Other notable alumni include 26 Pulitzer Prize winners, 18 billionaires, one U.S. Supreme Court Chief Justice, four U.S. Secretaries of State, 99 members of the United States Congress, 57 Rhodes Scholars, 21 MacArthur Genius Fellows, and 37 Olympic medalists.

    The foundation and the charter

    In 1761, three residents of Newport, Rhode Island, drafted a petition to the colony’s General Assembly:

    “That your Petitioners propose to open a literary institution or School for instructing young Gentlemen in the Languages, Mathematics, Geography & History, & such other branches of Knowledge as shall be desired. That for this End… it will be necessary… to erect a public Building or Buildings for the boarding of the youth & the Residence of the Professors.”

    The three petitioners were Ezra Stiles, pastor of Newport’s Second Congregational Church and future president of Yale University (US); William Ellery, Jr., future signer of the United States Declaration of Independence; and Josias Lyndon, future governor of the colony. Stiles and Ellery later served as co-authors of the college’s charter two years later. The editor of Stiles’s papers observes, “This draft of a petition connects itself with other evidence of Dr. Stiles’s project for a Collegiate Institution in Rhode Island, before the charter of what became Brown University.”

    The Philadelphia Association of Baptist Churches were also interested in establishing a college in Rhode Island—home of the mother church of their denomination. At the time, the Baptists were unrepresented among the colonial colleges; the Congregationalists had Harvard University (US) and Yale, the Presbyterians had the College of New Jersey (later Princeton University (US)), and the Episcopalians had The William & Mary College (US) and King’s College (later Columbia University(US)). Isaac Backus, a historian of the New England Baptists and an inaugural trustee of Brown, wrote of the October 1762 resolution taken at Philadelphia:

    “The Philadelphia Association obtained such an acquaintance with our affairs, as to bring them to an apprehension that it was practicable and expedient to erect a college in the Colony of Rhode-Island, under the chief direction of the Baptists; … Mr. James Manning, who took his first degree in New-Jersey college in September, 1762, was esteemed a suitable leader in this important work.”

    James Manning arrived at Newport in July 1763 and was introduced to Stiles, who agreed to write the charter for the college. Stiles’ first draft was read to the General Assembly in August 1763 and rejected by Baptist members who worried that their denomination would be underrepresented in the College Board of Fellows. A revised charter written by Stiles and Ellery was adopted by the Rhode Island General Assembly on March 3, 1764, in East Greenwich.

    In September 1764, the inaugural meeting of the corporation—the college’s governing body—was held in Newport’s Old Colony House. Governor Stephen Hopkins was chosen chancellor, former and future governor Samuel Ward vice chancellor, John Tillinghast treasurer, and Thomas Eyres secretary. The charter stipulated that the board of trustees should be composed of 22 Baptists, five Quakers, five Episcopalians, and four Congregationalists. Of the 12 Fellows, eight should be Baptists—including the college president—”and the rest indifferently of any or all Denominations.”

    At the time of its creation, Brown’s charter was a uniquely progressive document. Other colleges had curricular strictures against opposing doctrines, while Brown’s charter asserted, “Sectarian differences of opinions, shall not make any Part of the Public and Classical Instruction.” The document additionally “recognized more broadly and fundamentally than any other [university charter] the principle of denominational cooperation.” The oft-repeated statement that Brown’s charter alone prohibited a religious test for College membership is inaccurate; other college charters were similarly liberal in that particular.

    The college was founded as Rhode Island College, at the site of the First Baptist Church in Warren, Rhode Island. James Manning was sworn in as the college’s first president in 1765 and remained in the role until 1791. In 1766, the college authorized Rev. Morgan Edwards to travel to Europe to “solicit Benefactions for this Institution.” During his year-and-a-half stay in the British Isles, the reverend secured funding from benefactors including Thomas Penn and Benjamin Franklin.

    In 1770, the college moved from Warren to Providence. To establish a campus, John and Moses Brown purchased a four-acre lot on the crest of College Hill on behalf of the school. The majority of the property fell within the bounds of the original home lot of Chad Brown, an ancestor of the Browns and one of the original proprietors of Providence Plantations. After the college was relocated to the city, work began on constructing its first building.

    A building committee, organized by the corporation, developed plans for the college’s first purpose-built edifice, finalizing a design on February 9, 1770. The subsequent structure, referred to as “The College Edifice” and later as University Hall, may have been modeled on Nassau Hall, built 14 years prior at the College of New Jersey. President Manning, an active member of the building process, was educated at Princeton and might have suggested that Brown’s first building resemble that of his alma mater.

    The College

    Founded in 1764, the college is Brown’s oldest school. About 7,200 undergraduate students are enrolled in the college, and 81 concentrations are offered. For the graduating class of 2020 the most popular concentrations were Computer Science; Economics; Biology; History; Applied Mathematics; International Relations and Political Science. A quarter of Brown undergraduates complete more than one concentration before graduating. If the existing programs do not align with their intended curricular interests, undergraduates may design and pursue independent concentrations.

    35 percent of undergraduates pursue graduate or professional study immediately, 60 percent within 5 years, and 80 percent within 10 years. For the Class of 2009, 56 percent of all undergraduate alumni have since earned graduate degrees. Among undergraduate alumni who go on to receive graduate degrees, the most common degrees earned are J.D. (16%), M.D. (14%), M.A. (14%), M.Sc. (14%), and Ph.D. (11%). The most common institutions from which undergraduate alumni earn graduate degrees are Brown University, Columbia University, and Harvard University.

    The highest fields of employment for undergraduate alumni ten years after graduation are education and higher education (15%), medicine (9%), business and finance (9%), law (8%), and computing and technology (7%).

    Brown and RISD

    Since its 1893 relocation to College Hill, Rhode Island School of Design (RISD) has bordered Brown to its west. Since 1900, Brown and RISD students have been able to cross-register at the two institutions, with Brown students permitted to take as many as four courses at RISD to count towards their Brown degree. The two institutions partner to provide various student-life services and the two student bodies compose a synergy in the College Hill cultural scene.


    Brown University is accredited by the New England Commission of Higher Education. For their 2021 rankings, The Wall Street Journal/Times Higher Education ranked Brown 5th in the Best Colleges 2021 edition.

    The Forbes Magazine annual ranking of America’s Top Colleges 2021—which ranked 600 research universities, liberal arts colleges and service academies—ranked Brown 26th overall and 23rd among universities.

    U.S. News & World Report ranked Brown 14th among national universities in its 2021 edition. The 2021 edition also ranked Brown 1st for undergraduate teaching, 20th in Most Innovative Schools, and 18th in Best Value Schools.

    Washington Monthly ranked Brown 37th in 2020 among 389 national universities in the U.S. based on its contribution to the public good, as measured by social mobility, research, and promoting public service.

    For 2020, U.S. News & World Report ranks Brown 102nd globally.

    In 2014, Forbes Magazine ranked Brown 7th on its list of “America’s Most Entrepreneurial Universities.” The Forbes analysis looked at the ratio of “alumni and students who have identified themselves as founders and business owners on LinkedIn” and the total number of alumni and students.

    LinkedIn particularized the Forbes rankings, placing Brown third (between The Massachusetts Institute of Technology (US) and Princeton) among “Best Undergraduate Universities for Software Developers at Startups.” LinkedIn’s methodology involved a career-path examination of “millions of alumni profiles” in its membership database.

    In 2020, U.S. News ranked Brown’s Warren Alpert Medical School the 9th most selective in the country, with an acceptance rate of 2.8 percent.

    According to 2020 data from The Department of Education (US), the median starting salary of Brown computer science graduates was the highest in the United States.

    In 2020, Brown produced the second-highest number of Fulbright winners. For the three years prior, the university produced the most Fulbright winners of any university in the nation.


    Brown has been a member of The Association of American Universities (US) since 1933 and is classified among “R1: Doctoral Universities – Very High Research Activity”. In FY 2017, Brown spent $212.3 million on research and was ranked 103rd in the United States by total R&D expenditure by The National Science Foundation (US).

  • richardmitnick 2:36 pm on December 27, 2021 Permalink | Reply
    Tags: "Quantum marbles in a bowl of light", Atoms can be described quantum mechanically as matter waves., Even for quantum computers fundamental limits apply to the amount of data they can process in a given time., In the quantum world every measurement of the atom's position inevitably changes the matter wave in an unpredictable way., Physicists at the University of Bonn and The Technion have now investigated this time limit for the first time with an experiment on a complex quantum system., Quantum Computing, Quantum gates resemble their traditional relatives in another respect: gates do not work infinitely fast., Quantum Interference: allows differences in waves to be detected very precisely., The information in quantum computers is stored in quantum bits (qubits) which resemble a wave rather than a series of discrete values.

    From The University of Bonn [Rheinische Friedrich-Wilhelms-Universität Bonn] (DE) and The Technion-Israel Institute of Technology [הטכניון – מכון טכנולוגי לישראל] (IL) : “Quantum marbles in a bowl of light” 

    From The University of Bonn [Rheinische Friedrich-Wilhelms-Universität Bonn] (DE)



    The Technion-Israel Institute of Technology [הטכניון – מכון טכנולוגי לישראל] (IL)

    December 22, 2021

    Dr. Andrea Alberti
    Institut für Angewandte Physik (IAP)
    The Rhenish Friedrich Wilhelm University of Bonn [Rheinische Friedrich-Wilhelms-Universität Bonn](DE)
    Tel. +49-228/73-3471
    E-Mail: alberti@iap.uni-bonn.de

    Quantum marbles in action – an artistic illustration of a matter wave rolling down a steep potential hill. © Image: Enrique Sahagún – Scixel

    An international study shows which factors determine the speed limit for quantum computations.

    Which factors determine how fast a quantum computer can perform its calculations? Physicists at the University of Bonn and the Technion – Israel Institute of Technology have devised an elegant experiment to answer this question. The results of the study are published in the journal Science Advances.

    Quantum computers are highly sophisticated machines that rely on the principles of quantum mechanics to process information. This should enable them to handle certain problems in the future that are completely unsolvable for conventional computers. But even for quantum computers, fundamental limits apply to the amount of data they can process in a given time.

    Quantum gates require a minimum time

    The information stored in conventional computers can be thought of as a long sequence of zeros and ones, the bits. In quantum mechanics it is different: The information is stored in quantum bits (qubits), which resemble a wave rather than a series of discrete values. Physicists also speak of wave functions when they want to precisely represent the information contained in qubits.

    In a traditional computer, information is linked together by so-called gates. Combining several gates allows elementary calculations, such as the addition of two bits. Information is processed in a very similar way in quantum computers, where quantum gates change the wave function according to certain rules.
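The way a gate transforms a wave function can be sketched in plain Python by representing a one-qubit state as a two-component complex vector and applying a 2×2 unitary matrix to it. This is a minimal, generic illustration using the standard Hadamard gate, not a description of the apparatus in this study:

```python
import math

# A qubit's wave function as a length-2 complex vector:
# the amplitudes of the basis states |0> and |1>.
ket0 = [1.0 + 0j, 0.0 + 0j]

# The Hadamard gate, a standard one-qubit quantum gate.
H = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

def apply_gate(gate, state):
    """Apply a 2x2 unitary gate to a one-qubit state (matrix-vector product)."""
    return [sum(gate[i][j] * state[j] for j in range(2)) for i in range(2)]

superposition = apply_gate(H, ket0)          # equal superposition of |0> and |1>
probs = [abs(a) ** 2 for a in superposition]  # measurement probabilities
print(probs)  # each outcome occurs with probability 1/2
```

In a real device, each such transformation takes a finite amount of time, which is exactly the limit the study quantifies.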

    Quantum gates resemble their traditional relatives in another respect: “Even in the quantum world, gates do not work infinitely fast,” explains Dr. Andrea Alberti of the Institute of Applied Physics at the University of Bonn. “They require a minimum amount of time to transform the wave function and the information this contains.”

    More than 70 years ago, Soviet physicists Leonid Mandelstam and Igor Tamm theoretically deduced this minimum time for transforming the wave function. Physicists at the University of Bonn and the Technion have now investigated this Mandelstam-Tamm limit for the first time with an experiment on a complex quantum system. To do this, they used cesium atoms that moved in a highly controlled manner. “In the experiment, we let individual atoms roll down like marbles in a light bowl and observe their motion,” explains Alberti, who led the experimental study.

    Atoms can be described quantum mechanically as matter waves. During the journey to the bottom of the light bowl, their quantum information changes. The researchers wanted to know when this “deformation” could be identified at the earliest; this time would then be the experimental proof of the Mandelstam-Tamm limit. The problem, however, is that in the quantum world every measurement of the atom’s position inevitably changes the matter wave in an unpredictable way. So it always looks as if the marble has deformed, no matter how quickly the measurement is made. “We therefore devised a different method to detect the deviation from the initial state,” Alberti says.

    For this purpose, the researchers began by producing a clone of the matter wave, in other words an almost exact twin. “We used fast light pulses to create a so-called quantum superposition of two states of the atom,” explains Gal Ness, a doctoral student at the Technion and first author of the study. “Figuratively speaking, the atom behaves as if it had two different colors at the same time.” Depending on the color, each atom twin takes a different position in the light bowl: One is high up on the edge and “rolls” down from there. The other, conversely, is already at the bottom of the bowl. This twin does not move – after all, it cannot roll up the walls and so does not change its wave function.

    The physicists compared the two clones at regular intervals. They did this using a technique called quantum interference, which allows differences in waves to be detected very precisely. This enabled them to determine how much time elapsed before a significant deformation of the matter wave first occurred.

    Two factors determine the speed limit

    By varying the height above the bottom of the bowl at the start of the experiment, the physicists were also able to control the average energy of the atom. “Average” because, in principle, this quantity cannot be determined exactly. The “position energy” of the atom is therefore always uncertain. “We were able to demonstrate that the minimum time for the matter wave to change depends on this energy uncertainty,” says Professor Yoav Sagi, who led the partner team at Technion: “The greater the uncertainty, the shorter the Mandelstam-Tamm time.”

    This is exactly what the two Soviet physicists had predicted. But there was also a second effect: If the energy uncertainty was increased more and more until it exceeded the average energy of the atom, then the minimum time did not decrease further – contrary to what the Mandelstam-Tamm limit would actually suggest. The physicists thus proved a second speed limit, which was theoretically discovered about 20 years ago. The ultimate speed limit in the quantum world is therefore determined not only by the energy uncertainty, but also by the mean energy.
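The second limit described above, set by the mean energy, is commonly identified in the literature as the Margolus-Levitin bound. A minimal sketch of the interplay between the two bounds (the values and names below are illustrative, not from the study):

```python
# Sketch of the two quantum speed limits: the binding minimum evolution
# time is the larger of the Mandelstam-Tamm and Margolus-Levitin bounds.
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J*s


def min_evolution_time(mean_energy, energy_uncertainty):
    """Minimum time (seconds) to evolve into a distinguishable state."""
    tau_mt = math.pi * HBAR / (2 * energy_uncertainty)  # Mandelstam-Tamm
    tau_ml = math.pi * HBAR / (2 * mean_energy)  # Margolus-Levitin (mean energy above ground state)
    return max(tau_mt, tau_ml)


# Once the energy uncertainty exceeds the mean energy, the Margolus-Levitin
# term dominates: pushing the uncertainty higher no longer shortens the time,
# exactly the saturation effect observed in the experiment.
E = 1.0e-30  # J, illustrative mean energy
```

With `energy_uncertainty` below `E`, the Mandelstam-Tamm term sets the limit; above `E`, the time plateaus at the Margolus-Levitin value.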

    “It is the first time that both quantum speed boundaries could be measured for a complex quantum system, and even in a single experiment,” Alberti enthuses. Future quantum computers may be able to solve problems rapidly, but they too will be constrained by these fundamental limits.


    The study was funded by the Reinhard Frank Foundation (in collaboration with the German Technion Society), the German Research Foundation (DFG), the Helen Diller Quantum Center at the Technion, and the German Academic Exchange Service (DAAD).

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Technion Campus

    A science and technology research university, among the world’s top ten, The Technion-Israel Institute of Technology [הטכניון – מכון טכנולוגי לישראל](IL), is dedicated to the creation of knowledge and the development of human capital and leadership, for the advancement of the State of Israel and all humanity.

    The Technion-Israel Institute of Technology [הטכניון – מכון טכנולוגי לישראל](IL) is a public research university in Haifa, Israel. Established in 1912 under the rule of the Ottoman Empire (and more than 35 years before the establishment of the State of Israel), the Technion is the oldest university in the country.

    The Technion is ranked as the top university in both Israel and the Middle East, and in the top 100 universities in the world in the Academic Ranking of World Universities of 2019. The university offers degrees in science and engineering, and related fields such as architecture, medicine, industrial management, and education. It has 19 academic departments, 60 research centers, and 12 affiliated teaching hospitals. Since its founding, it has awarded more than 100,000 degrees and its graduates are cited for providing the skills and education behind the creation and protection of the State of Israel.

    Technion’s 565 faculty members currently include three Nobel Laureates in chemistry. Four Nobel Laureates have been associated with the university.
    The selection of Hebrew as the language of instruction, defeating German in the War of the Languages, was an important milestone in Hebrew’s consolidation as Israel’s official language. The Technion is also a major factor behind the growth of Israel’s high-tech industry and innovation, including the country’s technical cluster in Silicon Wadi.


    The Technikum was conceived in the early 1900s by the German-Jewish fund Ezrah as a school of engineering and sciences. It was to be the only institution of higher learning in then-Ottoman Palestine, other than the Bezalel Academy of Arts and Design (IL) (founded in 1907). In October 1913, the board of trustees selected German as the language of instruction, provoking a major controversy known as the War of the Languages. After opposition from American and Russian Jews to the use of German, the board of trustees reversed itself in February 1914 and selected Hebrew as the language of instruction. The German name Technikum was also replaced by the Hebrew name Technion.

    Technion’s cornerstone was laid in 1912, and studies began 12 years later, in 1924. In 1923 Albert Einstein visited and planted the now-famous first palm tree, which still stands today in front of the old Technion building (now the MadaTech museum) in the Hadar neighborhood. Einstein founded the first Technion Society, and served as its president upon his return to Germany.

    In 1924, Arthur Blok became the Technion’s first president.

    In the early 1950s, under the administration of Yaakov Dori, who had served as the Israel Defense Forces’ first chief of staff, the Technion launched a campaign to recruit Jewish and pro-Israel scientists from abroad to establish research laboratories and teaching departments in the natural and exact sciences.

    Cornell Tech

    On 19 December 2011, a bid by a consortium of Cornell University (US) and the Technion won a competition to establish a new high-tier applied science and engineering institution in New York City. The competition was established by New York City Mayor Michael Bloomberg in order to increase entrepreneurship and job growth in the city’s technology sector. The winning bid consisted of a 2,100,000-square-foot (200,000 m2) state-of-the-art tech campus built on Roosevelt Island, with its first phase completed by 2017 and a temporary off-site campus opening in 2013 at the Google New York City headquarters building at 111 Eighth Avenue. The new ‘School of Genius’ in New York City has been named The Jacobs Technion-Cornell Institute (US). Its Founding Director was Craig Gotsman, Technion’s Hewlett-Packard Professor of Computer Engineering.

    In 2015, AOL announced an investment of $5 million in a video research project at the institute. Media coverage has been largely positive, apart from some small-scale protests by political and environmental activists.

    The University of Bonn [Rheinische Friedrich-Wilhelms-Universität Bonn] (DE) is a public research university located in Bonn, North Rhine-Westphalia, Germany. It was founded in its present form as the Rhein-Universität (English: Rhine University) on 18 October 1818 by Frederick William III, as the linear successor of the Kurkölnische Akademie Bonn (English: Academy of the Prince-elector of Cologne) which was founded in 1777. The University of Bonn offers many undergraduate and graduate programs in a range of subjects and has 544 professors. Its library holds more than five million volumes.

    As of October 2020, its notable alumni, faculty and researchers include 11 Nobel Laureates, 4 Fields Medalists and 12 Gottfried Wilhelm Leibniz Prize winners, as well as some of the most gifted minds in natural science, e.g. August Kekulé, Heinrich Hertz and Justus von Liebig; major philosophers, such as Friedrich Nietzsche, Karl Marx and Jürgen Habermas; famous German poets and writers, for example Heinrich Heine, Paul Heyse and Thomas Mann; painters like Max Ernst; political theorists, for instance Carl Schmitt and Otto Kirchheimer; statesmen such as Konrad Adenauer and Robert Schuman; famous economists like Walter Eucken, Ferdinand Tönnies and Joseph Schumpeter; and furthermore Prince Albert, Pope Benedict XVI and Wilhelm II.

    The University of Bonn has been conferred the title of “University of Excellence” under the German Universities Excellence Initiative.

    Research institutes

    The Franz Joseph Dölger-Institute studies late antiquity, in particular the confrontation and interaction of Christians, Jews and pagans in that period. The institute edits the Reallexikon für Antike und Christentum, a German-language encyclopedia treating the history of early Christians in late antiquity. The institute is named after the church historian Franz Joseph Dölger, who was a professor of theology at the university from 1929 to 1940.

    The Research Institute for Discrete Mathematics focuses on discrete mathematics and its applications, in particular combinatorial optimization and the design of computer chips. The institute cooperates with IBM and Deutsche Post. Researchers of the institute optimized the chess computer IBM Deep Blue.

    The Bethe Center for Theoretical Physics “is a joint enterprise of theoretical physicists and mathematicians at various institutes of or connected with the University of Bonn. In the spirit of Hans Bethe it fosters research activities over a wide range of theoretical and mathematical physics.” Activities of the Bethe Center include short and long term visitors program, workshops on dedicated research topics, regular Bethe Seminar Series, lectures and seminars for graduate students.

    The German Reference Center for Ethics in the Life Sciences (German: Deutsches Referenzzentrum für Ethik in den Biowissenschaften) was founded in 1999 and is modeled after the National Reference Center for Bioethics Literature at Georgetown University. The center provides access to scientific information to academics and professionals in the fields of life science and is the only one of its kind in Germany.

    After the German Government’s decision in 1991 to move the capital of Germany from Bonn to Berlin, the city of Bonn received generous compensation from the Federal Government. This led to the foundation of three research institutes in 1995, of which two are affiliated with the university:

    The Center for European Integration Studies (German: Zentrum für Europäische Integrationsforschung) studies the legal, economic and social implications of the European integration process. The institute offers several graduate programs and organizes summer schools for students.

    The Center for Development Research (German: Zentrum für Entwicklungsforschung) studies global development from an interdisciplinary perspective and offers a doctoral program in international development.

    The Center of Advanced European Studies and Research (CAESAR) is an interdisciplinary applied research institute. Research is conducted in the fields of nanotechnology, biotechnology and medical technology. The institute is a private foundation, but collaborates closely with the university.

    The Institute for the Study of Labor (German: Forschungsinstitut zur Zukunft der Arbeit) is a private research institute funded by Deutsche Post. The institute concentrates on research in labor economics, but also offers policy advice on labor market issues, and awards the annual IZA Prize in Labor Economics. The institute cooperates closely with the department of economics of the University of Bonn.

    The MPG Institute for Mathematics [MPG Institut für Mathematik](DE) is part of the MPG Society for the Advancement of Science [MPG Gesellschaft zur Förderung der Wissenschaften e. V.] (DE), a network of scientific research institutes in Germany. The institute was founded in 1980 by Friedrich Hirzebruch.

    The MPG Institute for Radio Astronomy [MPG Institut für Radioastronomie](DE) was founded in 1966 as an institute of the MPG Society for the Advancement of Science [MPG Gesellschaft zur Förderung der Wissenschaften e. V.] (DE). It operates the radio telescope in Effelsberg.

    Effelsberg Radio Telescope – a radio telescope in the Ahr Hills (part of the Eifel) near Bad Münstereifel (DE).

    The MPG Institute for Research on Collective Goods [MPG Institut zur Erforschung von Gemeinschaftsgütern] (DE) started as a research group in 1997 and was founded as an institute of the Max-Planck-Gesellschaft in 2003. The institute studies collective goods from a legal and economic perspective.

    The Center for Economics and Neuroscience, founded in 2009 by Christian Elger, Gottfried Wilhelm Leibniz Prize winner Armin Falk, Martin Reuter and Bernd Weber, provides an international platform for interdisciplinary work in neuroeconomics. It includes the Laboratory for Experimental Economics, which can carry out computer-based behavioral experiments with up to 24 participants simultaneously, two magnetic resonance imaging (MRI) scanners for interactive behavioral experiments and functional imaging, as well as a biomolecular laboratory for genotyping different polymorphisms.


    University of Bonn researchers made fundamental contributions in the sciences and the humanities. In physics, researchers developed the quadrupole ion trap and the Geissler tube, discovered radio waves, were instrumental in describing cathode rays and developed the variable star designation. In chemistry, researchers made significant contributions to the understanding of alicyclic compounds and benzene. In materials science, researchers have been instrumental in describing the lotus effect. In mathematics, University of Bonn faculty made fundamental contributions to modern topology and algebraic geometry. The Hirzebruch–Riemann–Roch theorem, Lipschitz continuity, the Petri net, the Schönhage–Strassen algorithm, Faltings’s theorem and the Toeplitz matrix are all named after University of Bonn mathematicians. University of Bonn economists made fundamental contributions to game theory and experimental economics. Famous thinkers that were faculty at the University of Bonn include the poet August Wilhelm Schlegel, the historian Barthold Georg Niebuhr, the theologians Karl Barth and Joseph Ratzinger and the poet Ernst Moritz Arndt.

    The university has nine collaborative research centres and five research units funded by the German Research Foundation (DFG) and attracts more than 75 million Euros in external research funding annually.

    The Excellence Initiative of the German government in 2006 resulted in the foundation of the Hausdorff Center for Mathematics as one of the seventeen national Clusters of Excellence that were part of the initiative, and in the expansion of the already existing Bonn Graduate School of Economics (BGSE). The Excellence Initiative also resulted in the founding of the Bonn-Cologne Graduate School of Physics and Astronomy (an honors Masters and PhD program, jointly with the University of Cologne). The Bethe Center for Theoretical Physics was founded in November 2008 to foster closer interaction between mathematicians and theoretical physicists at Bonn. The center also arranges for regular visitors and seminars (on topics including string theory, nuclear physics, condensed matter, etc.).
