Tagged: Qubits Toggle Comment Threads | Keyboard Shortcuts

  • richardmitnick 1:22 pm on March 30, 2022 Permalink | Reply
    Tags: "Using two different elements creates new possibilities in hybrid atomic quantum computers", In a hybrid array made of atoms of two different elements any atom’s nearest neighbors can be atoms of the other element., One way to make a qubit is to trap a single neutral atom in place using a focused laser - a technique that won the Nobel Prize in 2018., Qubits, The hybrid array created by the group contains 512 lasers: 256 loaded with cesium atoms and 256 with rubidium atoms., To make a quantum computer out of neutral atom qubits many individual atoms must be trapped in place by many laser beams.

    From The University of Chicago: “Using two different elements creates new possibilities in hybrid atomic quantum computers” 


    From The University of Chicago

    Mar 29, 2022
    Meredith Fore

    Left: A hybrid array of cesium atoms (yellow) and rubidium atoms (blue). Right: The customizability of the researchers’ technique enables them to place the atoms anywhere, allowing them to create this image of Chicago landmarks Willis Tower and the Cloud Gate. The scale bar in both images is 10 micrometers. Image by Bernien Lab.

    New technique allows researchers to measure a single atom without disturbing its neighbors.

    Qubits, the building blocks of quantum computers, can be made from many different technologies. One way to make a qubit is to trap a single neutral atom in place using a focused laser, a technique that won the Nobel Prize in 2018.

    But to make a quantum computer out of neutral-atom qubits, many individual atoms must be trapped in place by many laser beams. So far, these arrays have only been constructed from atoms of a single element, out of concern that making an array out of two elements would be prohibitively complex.

    But for the first time, University of Chicago researchers have created a hybrid array of neutral atoms from two different elements, significantly broadening the system’s potential applications in quantum technology. The results were funded in part by the NSF Quantum Leap Challenge Institute Hybrid Quantum Architectures and Networks (HQAN), and published in Physical Review X.

    “There have been many examples of quantum technology that have taken a hybrid approach,” said Hannes Bernien, lead researcher of the project and assistant professor in University of Chicago’s Pritzker School of Molecular Engineering. “But they have not been developed yet for these neutral atom platforms. We are very excited to see that our results have triggered a very positive response from the community, and that new protocols using our hybrid techniques are being developed.”

    Double the potential

    While human-made qubits such as superconducting circuits require careful quality control to behave consistently, neutral atoms of a single element all have exactly the same properties, making them naturally consistent candidates for qubits.

    But since every atom in the array has the same properties, it’s extremely difficult to measure a single atom without disturbing its neighbors—they’re all on the same frequency, so to speak.

    “There have been quite a few milestone experiments over the last few years showing that atomic array platforms are extremely well suited for quantum simulation and also quantum computation,” Bernien said. “But measurements on these systems tend to be destructive, since all the atoms have the same resonances. This new hybrid approach can be really useful in this case.”

    In a hybrid array made of atoms of two different elements, any atom’s nearest neighbors can be atoms of the other element, with completely different frequencies. This makes it much easier for researchers to measure and manipulate a single atom without any interference from the atoms around it.

    It also allows researchers to sidestep a standard complication of atomic arrays: it is very difficult to hold an atom in one place for very long.

    “When you do these experiments with the single atoms, at some point, you lose the atoms,” Bernien said. “And then you always have to re-initialize your system by first making a new, cold cloud of atoms and waiting for individual ones to get trapped by the lasers again. But because of this hybrid design, we can do experiments with these species separately. We can be doing an experiment with atoms of one element, while we refresh the other atoms, and then switch so we always have qubits available.”

    Making a bigger quantum computer

    The hybrid array created by Bernien’s group contains 512 lasers: 256 loaded with cesium atoms and 256 with rubidium atoms. As quantum computers go, this is a lot of qubits: Google and IBM, whose quantum computers are made of superconducting circuits rather than trapped atoms, have only gotten up to about 130 qubits. Though Bernien’s device is not yet a quantum computer, quantum computers made from atomic arrays are much easier to scale up, which could lead to some important new insights.

    “We actually don’t know what happens when you scale up a very coherent system that you can isolate very well from the environment,” Bernien said. “This trapped atom approach can be a wonderful tool to explore large-system quantum effects in unknown regimes.”

    The hybrid nature of this array also opens the door to many applications that wouldn’t be possible with a single species of atom. Since the two species are independently controllable, the atoms of one element can be used as quantum memory while the other can be used to make quantum computations, taking on the respective roles of RAM and a CPU on a typical computer.

    “Our work has already inspired theoreticians to think about new protocols for it, which is exactly what I hoped,” Bernien said. “I hope it will inspire people to think about how these tools can be used for measurements and state control. We have already seen really cool protocols that we are very interested in implementing on these arrays.”

    The first author on the paper was postdoctoral researcher Kevin Singh. Other authors were UChicago graduate students Shraddha Anand, Andrew Pocklington and Jordan Kemp.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    U Chicago Campus

    The University of Chicago is an urban research university that has driven new ways of thinking since 1890. Our commitment to free and open inquiry draws inspired scholars to our global campuses, where ideas are born that challenge and change the world.

    We empower individuals to challenge conventional thinking in pursuit of original ideas. Students in the College develop critical, analytic, and writing skills in our rigorous, interdisciplinary core curriculum. Through graduate programs, students test their ideas with University of Chicago scholars, and become the next generation of leaders in academia, industry, nonprofits, and government.

    University of Chicago research has led to such breakthroughs as discovering the link between cancer and genetics, establishing revolutionary theories of economics, and developing tools to produce reliably excellent urban schooling. We generate new insights for the benefit of present and future generations with our national and affiliated laboratories: DOE’s Argonne National Laboratory, DOE’s Fermi National Accelerator Laboratory, and the Marine Biological Laboratory in Woods Hole, Massachusetts.
    The University of Chicago is enriched by the city we call home. In partnership with our neighbors, we invest in Chicago’s mid-South Side across such areas as health, education, economic growth, and the arts. Together with our medical center, we are the largest private employer on the South Side.

    In all we do, we are driven to dig deeper, push further, and ask bigger questions—and to leverage our knowledge to enrich all human life. Our diverse and creative students and alumni drive innovation, lead international conversations, and make masterpieces. Alumni and faculty, lecturers and postdocs go on to become Nobel laureates, CEOs, university presidents, attorneys general, literary giants, and astronauts. The University of Chicago is a private research university in Chicago, Illinois. Founded in 1890, its main campus is located in Chicago’s Hyde Park neighborhood. It enrolled 16,445 students in Fall 2019, including 6,286 undergraduates and 10,159 graduate students. The University of Chicago is ranked among the top universities in the world by major education publications, and it is among the most selective in the United States.

    The university is composed of one undergraduate college and five graduate research divisions, which contain all of the university’s graduate programs and interdisciplinary committees. Chicago has eight professional schools: the Law School, the Booth School of Business, the Pritzker School of Medicine, the School of Social Service Administration, the Harris School of Public Policy, the Divinity School, the Graham School of Continuing Liberal and Professional Studies, and the Pritzker School of Molecular Engineering. The university has additional campuses and centers in London, Paris, Beijing, Delhi, and Hong Kong, as well as in downtown Chicago.

    University of Chicago scholars have played a major role in the development of many academic disciplines, including economics, law, literary criticism, mathematics, religion, sociology, and the behavioralism school of political science, establishing the Chicago schools in various fields. Chicago’s Metallurgical Laboratory produced the world’s first man-made, self-sustaining nuclear reaction in Chicago Pile-1 beneath the viewing stands of the university’s Stagg Field. Advances in chemistry led to the “radiocarbon revolution” in the carbon-14 dating of ancient life and objects. The university’s research efforts include administration of DOE’s Fermi National Accelerator Laboratory and DOE’s Argonne National Laboratory, as well as the U Chicago Marine Biological Laboratory in Woods Hole, Massachusetts (MBL). The university is also home to the University of Chicago Press, the largest university press in the United States. The Barack Obama Presidential Center is expected to be housed at the university and will include both the Obama presidential library and offices of the Obama Foundation.

    The University of Chicago’s students, faculty, and staff have included 100 Nobel laureates as of 2020, giving it the fourth-most affiliated Nobel laureates of any university in the world. The university’s faculty members and alumni also include 10 Fields Medalists, 4 Turing Award winners, 52 MacArthur Fellows, 26 Marshall Scholars, 27 Pulitzer Prize winners, 20 National Humanities Medalists, 29 living billionaire graduates, and have won eight Olympic medals.



    According to the National Science Foundation, the University of Chicago spent $423.9 million on research and development in 2018, ranking it 60th in the nation. It is classified among “R1: Doctoral Universities – Very high research activity” and is a founding member of the Association of American Universities. It was a member of the Committee on Institutional Cooperation from 1946 until June 29, 2016, when the group was renamed the Big Ten Academic Alliance; the University of Chicago is not a member of the rebranded consortium but continues to collaborate with it.

    The university operates more than 140 research centers and institutes on campus. Among these are the Oriental Institute—a museum and research center for Near Eastern studies owned and operated by the university—and a number of National Resource Centers, including the Center for Middle Eastern Studies. Chicago also operates or is affiliated with several research institutions apart from the university proper. The university manages DOE’s Argonne National Laboratory, part of the United States Department of Energy’s national laboratory system, and co-manages DOE’s Fermi National Accelerator Laboratory, a nearby particle physics laboratory, as well as a stake in the Apache Point Observatory in Sunspot, New Mexico.

    SDSS Telescope at Apache Point Observatory, near Sunspot NM, USA, Altitude 2,788 meters (9,147 ft).


    Faculty and students at the adjacent Toyota Technological Institute at Chicago collaborate with the university. In 2013, the university formed an affiliation with the formerly independent Marine Biological Laboratory in Woods Hole, Mass. Although formally unrelated, the National Opinion Research Center is located on Chicago’s campus.

  • richardmitnick 5:59 pm on January 24, 2022 Permalink | Reply
    Tags: "Complex" numbers, "Complex" numbers are widely exploited in classical and relativistic physics., "Physics(US)", "Quantum Mechanics Must Be Complex", A basic starting point for quantum theory is to represent a particle state by a vector in a "complex"-valued space called a Hilbert space., Early on the pioneers of quantum mechanics abandoned the attempt to develop a quantum theory based on real numbers because they thought it impractical., Polarization-entangled photons generated by parametric down-conversion and detected in superconducting nanowire single-photon detectors., Qubits, Recent theoretical results suggested that a real-valued quantum theory could describe an unexpectedly broad range of quantum systems., Superconducting quantum processors in which the qubits have individual control and readout., The lack of a general proof left open some paths for refuting the equivalence between “complex” and “real” quantum theories., The possibility of using real numbers was never formally ruled out., This real-number approach has now been squashed by two independent experiments., Two teams show that within a standard formulation of quantum mechanics "complex" numbers are indispensable for describing experiments carried out on simple quantum networks.

    From Physics(US): “Quantum Mechanics Must Be Complex” 


    From Physics(US)

    January 24, 2022

    Alessio Avella, The National Institute of Metrological Research [Istituto Nazionale di Ricerca Metrologica](IT)

    Two independent studies demonstrate that a formulation of quantum mechanics involving “complex” rather than real numbers is necessary to reproduce experimental results.

    Credit: Carin Cain/American Physical Society(US)
    Figure 1: Conceptual sketch of the three-party game used by [Chen and colleagues] and [Li and colleagues] to demonstrate that a real quantum theory cannot describe certain measurements on small quantum networks. The game involves two sources distributing entangled qubits to three observers, who calculate a “score” from measurements performed on the qubits. In both experiments, the obtained score isn’t compatible with a real-valued, traditional formulation of quantum mechanics.

    “Complex” numbers are widely exploited in classical and relativistic physics. In electromagnetism, for instance, they tremendously simplify the description of wave-like phenomena. However, in these physical theories, “complex” numbers aren’t strictly needed, as all meaningful observables can be expressed in terms of real numbers. Thus, “complex” analysis is just a powerful computational tool. But are “complex” numbers essential in quantum physics—where the mathematics (the Schrödinger equation, the Hilbert space, etc.) is intrinsically “complex”-valued? This simple question has accompanied the development of quantum mechanics since its origins, when Schrödinger, Lorentz, and Planck debated it in their correspondence [1]. But early on, the pioneers of quantum mechanics abandoned the attempt to develop a quantum theory based on real numbers because they thought it impractical. However, the possibility of using real numbers was never formally ruled out, and recent theoretical results suggested that a real-valued quantum theory could describe an unexpectedly broad range of quantum systems [2]. But this real-number approach has now been squashed by two independent experiments, performed by Ming-Cheng Chen of The University of Science and Technology [中国科学技术大学](CN) at Chinese Academy of Sciences [中国科学院](CN) [3] and by Zheng-Da Li of The Southern University of Science and Technology[南方科技大學](CN) [4]. The two teams show that within a standard formulation of quantum mechanics “complex” numbers are indispensable for describing experiments carried out on simple quantum networks.

    A basic starting point for quantum theory is to represent a particle state by a vector in a “complex”-valued space called a Hilbert space. However, for a single, isolated quantum system, finding a description based purely on real numbers is straightforward: It can simply be obtained by doubling the dimension of the Hilbert space, as the space of complex numbers is equivalent, or “isomorphic,” to a two-dimensional, real plane, with the two dimensions representing the real and imaginary part of “complex” numbers, respectively. The problem becomes less trivial when we consider the unique quantum correlations, such as entanglement, that arise in quantum mechanics. These correlations can violate the principle of local realism, as proven by so-called Bell inequality tests [5]. Violations of Bell tests may appear to require “complex” values for their description [6]. But in 2009, a theoretical work demonstrated that, using real numbers, it is possible to reproduce the statistics of any standard Bell experiment, even those involving multiple quantum systems [2]. The result reinforced the conjecture that “complex” numbers aren’t necessary, but the lack of a general proof left open some paths for refuting the equivalence between “complex” and “real” quantum theories.
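    The "doubling" trick described above is easy to see concretely. Here is a minimal, self-contained Python sketch (illustrative names, not from the paper): each complex number a+bi is mapped to the 2x2 real matrix [[a, -b], [b, a]], and complex multiplication then corresponds exactly to real matrix multiplication.

    ```python
    # Sketch of the isomorphism between complex numbers and a real plane:
    # a + bi  <->  [[a, -b], [b, a]]. Complex multiplication becomes
    # real matrix multiplication, so one complex dimension costs two real ones.

    def to_real(z):
        """Map a complex number to its 2x2 real-matrix representation."""
        a, b = z.real, z.imag
        return [[a, -b], [b, a]]

    def matmul2(m, n):
        """Multiply two 2x2 real matrices."""
        return [[sum(m[i][k] * n[k][j] for k in range(2)) for j in range(2)]
                for i in range(2)]

    z1, z2 = 1 + 2j, 3 - 1j
    product = z1 * z2  # complex arithmetic: (1+2j)(3-1j) = 5+5j

    # The real-matrix picture reproduces the same product.
    assert matmul2(to_real(z1), to_real(z2)) == to_real(product)
    ```

    This is exactly why a single isolated system poses no obstacle to a real-valued theory; the difficulty only appears once entanglement between subsystems enters the picture, as the rest of the paragraph explains.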

    One such path was identified in 2021 through the brilliant theoretical work of Marc-Olivier Renou of The Institute of Photonic Sciences [Instituto de Ciencias Fotónicas](ES) and co-workers [7]. The researchers considered two theories that are both based on the postulates of quantum mechanics, but one uses a “complex” Hilbert space, as in the traditional formulation, while the other uses a real space. They then devised Bell-like experiments that could prove the inadequacy of the real theory. In their theorized experiments, two independent sources distribute entangled qubits in a quantum network configuration, while causally independent measurements on the nodes can reveal quantum correlations that do not admit any real quantum representation.

    Chen and colleagues and Li and colleagues now provide the experimental demonstration of Renou and co-workers’ proposal in two different physical platforms. The experiments are conceptually based on a “game” in which three parties (Alice, Bob, and Charlie) perform a Bell-like experiment (Fig. 1). In this game, two sources distribute entangled qubits between Alice and Bob and between Bob and Charlie, respectively. Each party independently chooses, from a set of possibilities, the measurements to perform on their qubit(s). Since the sources are independent, the qubits sent to Alice and Charlie are originally uncorrelated. Bob receives a qubit from both sources and, by performing a Bell-state measurement, he generates entanglement between Alice’s and Charlie’s qubits even though these qubits never interacted (a procedure called “entanglement swapping” [8]). Finally, a “score” is calculated from the statistical distribution of measurement outcomes. As demonstrated by Renou and co-workers, a “complex” quantum theory can produce a larger score than the one produced by a real quantum theory.

    The two groups follow different approaches to implement the quantum game. Chen and colleagues use a superconducting quantum processor in which the qubits have individual control and readout. The main challenge of this approach is making the qubits, which sit on the same circuit, truly independent and decoupled—a stringent requirement for the Bell-like tests. Li and colleagues instead choose a photonic implementation that more easily achieves this independence. Specifically, they use polarization-entangled photons generated by parametric down-conversion and detected in superconducting nanowire single-photon detectors. The optical implementation comes, however, with a different challenge: The protocol proposed by Renou and co-workers requires a complete Bell-state measurement, which can be directly implemented using superconducting qubits but is not achievable exploiting linear optical phenomena. Therefore, Li and colleagues had to rely on a so-called “partial” Bell-state measurement.

    Despite the difficulties inherent in each implementation, both experiments deliver compelling results. Impressively, they beat the score of real theory by many standard deviations (by 43 σ and 4.5 σ for Chen’s and Li’s experiments, respectively), providing convincing proof that complex numbers are needed to describe the experiments.
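    For intuition, a "k sigma" deviation can be translated into a two-sided Gaussian p-value with the complementary error function. This is a sketch under a simple Gaussian assumption; the papers' own statistical treatments are more involved.

    ```python
    import math

    def p_value(k_sigma):
        """Two-sided Gaussian p-value for a k-sigma deviation: erfc(k / sqrt(2))."""
        return math.erfc(k_sigma / math.sqrt(2.0))

    p_li = p_value(4.5)     # Li's experiment: on the order of 1e-6
    p_chen = p_value(43.0)  # Chen's experiment: underflows to 0.0 in double precision

    assert p_li < 1e-5      # already far beyond the usual 5-sigma discovery bar
    assert p_chen == 0.0    # smaller than the smallest representable double
    ```

    Even the smaller of the two deviations leaves essentially no room for a real-valued explanation of the data under this model.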

    Interestingly, both experiments are based on a minimal quantum network scheme (two sources and three nodes), which is a promising building block for a future quantum internet. The results thus offer one more demonstration that the availability of new quantum technologies is closely linked to the possibility of testing foundational aspects of quantum mechanics. Conversely, these new fundamental insights on quantum mechanics could have unexpected implications on the development of new quantum information technologies.

    We must be careful, however, in assessing the implications of these results. One might be tempted to conclude that “complex” numbers are indispensable to describe the physical reality of the Universe. However, this conclusion is true only if we accept the standard framework of quantum mechanics, which is based on several postulates. As Renou and his co-workers point out, these results would not be applicable to alternative formulations of quantum mechanics, such as Bohmian mechanics, which are based on different postulates. Therefore, these results could stimulate attempts to go beyond the standard formalism of quantum mechanics, which, despite great successes in predicting experimental results, is often considered inadequate from an interpretative point of view [9].


    1. C. N. Yang, “Square root of minus one, complex phases and Erwin Schrödinger,” Selected Papers II with Commentary (World Scientific, Hackensack, 2013).
    2. M. McKague et al., “Simulating quantum systems using real Hilbert spaces,” Phys. Rev. Lett. 102, 020505 (2009).
    3. M.-C. Chen et al., “Ruling out real-valued standard formalism of quantum theory,” Phys. Rev. Lett. 128, 040403 (2022).
    4. Z.-D. Li et al., “Testing real quantum theory in an optical quantum network,” Phys. Rev. Lett. 128, 040402 (2022).
    5. A. Aspect, “Closing the door on Einstein and Bohr’s quantum debate,” Physics 8, 123 (2015).
    6. N. Gisin, “Bell Inequalities: Many Questions, a Few Answers,” in Quantum Reality, Relativistic Causality, and Closing the Epistemic Circle, edited by W. C. Myrvold et al., The Western Ontario Series in Philosophy of Science, Vol. 73 (Springer, Dordrecht, 2009).
    7. M.-O. Renou et al., “Quantum theory based on real numbers can be experimentally falsified,” Nature 600, 625 (2021).
    8. J.-W. Pan et al., “Experimental entanglement swapping: Entangling photons that never interacted,” Phys. Rev. Lett. 80, 3891 (1998).
    9. T. Norsen, Foundations of Quantum Mechanics: An Exploration of the Physical Meaning of Quantum Theory, Undergraduate Lecture Notes in Physics (Springer, Cham, 2017).

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Physicists are drowning in a flood of research papers in their own fields and coping with an even larger deluge in other areas of physics. How can an active researcher stay informed about the most important developments in physics? Physics (US) highlights a selection of papers from the Physical Review journals. In consultation with expert scientists, the editors choose these papers for their importance and/or intrinsic interest. To highlight these papers, Physics features three kinds of articles: Viewpoints are commentaries written by active researchers, who are asked to explain the results to physicists in other subfields. Focus stories are written by professional science writers in a journalistic style and are intended to be accessible to students and non-experts. Synopses are brief editor-written summaries. Physics provides a much-needed guide to the best in physics, and we welcome your comments.

  • richardmitnick 12:07 pm on January 15, 2022 Permalink | Reply
    Tags: "From bits to qubits", Qubits

    From Symmetry: “From bits to qubits” 


    From Symmetry

    Sarah Charley

    Illustration by Sandbox Studio, Chicago with Ana Kova.

    Quantum computers go beyond the binary.

    The first desktop computer was invented in the 1960s. But computing technology has been around for centuries, says Irfan Siddiqi, director of the Quantum Nanoelectronics Laboratory at The University of California-Berkeley (US).

    “An abacus is an ancient computer,” he says. “The materials science revolution made bits smaller, but the fundamental architecture hasn’t changed.”

    Both modern computers and abaci use basic units of information that have two possible states. In a classical computer, a binary digit (called a bit) is a 1 or a 0, represented by on-off switches in the hardware. On an abacus, a sliding bead can also be thought of as being “on” or “off,” based on its position (left or right on an abacus with horizontal rods, or up or down on an abacus with vertical ones). Bits and beads can form patterns that represent other numbers and, in the case of computers, letters and symbols.

    But what if there were even more possibilities? What if the beads of an abacus could sit in between two positions? What if the switches in a computer could consult each other before outputting a calculation?

    This is the fundamental idea behind quantum computers, which embrace the oddities of quantum mechanics to encode and process information.

    “Information in quantum mechanics is stored in very different ways than in classical mechanics, and that’s where the power comes from,” says Heather Gray, an assistant professor and particle physicist at UC Berkeley.

    Classical computer; classical mechanics

    Computing devices break down numbers into discrete components. A simple abacus could be made up of three rows: one with beads representing 100s, one with beads representing 10s, and one with beads representing 1s. In this case, the number 514 could be indicated by sliding to the right 5 beads in the 100s row, 1 bead in the 10s row, and 4 beads in the 1s row.

    The computer you may be using to read this article does something similar, counting by powers of two instead of 10s. In binary, the number 514 becomes 1000000010.
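    The positional bookkeeping described above can be checked in a few lines of Python (a quick illustration, not part of the article):

    ```python
    # The abacus reading of 514 in base 10: five 100-beads, one 10-bead, four 1-beads.
    assert 5 * 100 + 1 * 10 + 4 * 1 == 514

    # The same number in base 2: one 512 and one 2, giving the bit pattern 1000000010.
    assert format(514, "b") == "1000000010"
    assert int("1000000010", 2) == 514
    ```

    The only difference between the abacus and the chip is the base: powers of ten in one case, powers of two in the other.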

    The more complex the task, the more bits or time a computer needs to perform the calculation. To speed things up, scientists have over the years found ways to fit more and more bits into a computer. “You can now have one trillion transistors on a small silicon chip, which is a far cry from the ancient Chinese abacus,” Siddiqi says.

    But as engineers make transistors smaller and smaller, they’ve started to notice some funny effects.

    The quantum twist on computing

    Bits that behave classically are determinate: A 1 is a 1. But at very small scales, an entirely new set of physical rules comes into play.

    “We are hitting the quantum limits,” says Alberto Di Meglio, the head of CERN’s Quantum Technology Initiative. “As the scale of classic computing technology becomes smaller and smaller, quantum mechanics’ effects are not negligible anymore, and we do not want this in classic computers.”

    But quantum computers use quantum mechanics to their benefit. Rather than offering decisive answers, quantum bits, called qubits, behave like a distribution of probable values.

    Di Meglio likens qubits to undecided voters in an election. “You might know how a particular person is likely to vote, but until you actually ask them to vote, you won’t have a definite answer,” Di Meglio says.

    Qubits can be made from subatomic particles, such as electrons. Like other, similar particles, electrons have a property called spin that can exist in one of two possible states (spin-up or spin-down).

    If we think of these electrons as undecided voters, the question they are voting on is their direction of spin. Quantum computers process information while the qubits are still undecided—somewhere in between spin-up and spin-down.
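    The "undecided voter" picture can be sketched in plain Python. The amplitude values below are arbitrary illustrative choices, not tied to any experiment: a qubit state is a pair of complex amplitudes, and the squared magnitudes give the odds of each measurement outcome.

    ```python
    import random

    # A toy qubit: complex amplitudes for spin-up and spin-down.
    # Measurement outcomes are random, with probabilities |alpha|^2 and |beta|^2.
    alpha = complex(3 / 5, 0)   # |alpha|^2 = 0.36 chance of "up"
    beta = complex(0, 4 / 5)    # |beta|^2  = 0.64 chance of "down"

    p_up = abs(alpha) ** 2
    p_down = abs(beta) ** 2
    assert abs(p_up + p_down - 1.0) < 1e-9  # a valid state is normalized

    def measure():
        """Ask the 'undecided voter' to vote: collapse to a definite outcome."""
        return "up" if random.random() < p_up else "down"

    votes = [measure() for _ in range(10_000)]
    frequency_up = votes.count("up") / len(votes)  # hovers near 0.36
    ```

    Any single measurement still returns a definite up or down; only the statistics over many repeated runs reveal the underlying amplitudes.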

    The situation becomes even more complicated when the “voters” can influence one another. This happens when two qubits are entangled. “For example, if one person votes yes, then an entangled ‘undecided’ voter will automatically vote no,” Di Meglio says. “The relationships become important, and the more voters you put together, the more chaotic it becomes.”

    When the qubits start talking to each other, each qubit can find itself in many different configurations, Siddiqi says. “An entangled array of qubits—with ‘n’ number of qubits—can exist in 2^n configurations. A quantum computer with 300 good qubits would have 2^300 possible configurations, which is more than the number of particles in the known universe.”
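    Siddiqi's counting claim is straightforward integer arithmetic. The 10^80 figure used for comparison below is a commonly quoted order-of-magnitude estimate for the particle count of the observable universe (an assumption here, not a number from the article):

    ```python
    # An entangled register of n qubits spans 2**n configurations.
    n = 300
    configurations = 2 ** n          # an exact 91-digit integer in Python

    # Commonly quoted order-of-magnitude estimate (assumed figure).
    particles_estimate = 10 ** 80

    assert configurations > particles_estimate
    ```

    2**300 is roughly 2 x 10^90, so the register's configuration space exceeds the particle estimate by about ten orders of magnitude.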

    With great power comes great… noise

    Entanglement allows a quantum computer to perform a complex task in a fraction of the time it would take a classical computer. But entanglement is also the quantum computer’s greatest weakness.

    “A qubit can get entangled with something else that you don’t have access to,” Siddiqi says. “Information can leave the system.”

    An electron from the computer’s power supply or a stray photon can entangle with a qubit and make it go rogue.

    “Quantum computing is not just about the number of qubits,” Di Meglio says. “You might have a quantum computer with thousands of qubits, but only a fraction are reliable.”

    Because of the problem of rogue qubits, today’s quantum computers are classified as noisy intermediate-scale quantum, or NISQ, devices. “Most quantum computers look like a physics experiment,” Gray says. “We’re very far from having one you could use at home.”

    But scientists are trying. In the future, scientists hope that they can use quantum computers to quickly search through large databases and calculate complex mathematical matrices.

    Today, physicists are already experimenting with quantum computers to simulate quantum processes, such as how particles interact with each other inside the detectors at the Large Hadron Collider. “You can do all sorts of cool things with entangled qubits,” Gray says.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.

  • richardmitnick 7:57 pm on August 3, 2021 Permalink | Reply
    Tags: A tedious hours-long process has been cut down to seconds and LFET is the first scalable transport and on-demand assembly technology of its kind., LFET: low frequency electrothermoplasmonic tweezer, Quantum photonics applications, Qubits, The scientists set out to make trapping and manipulating nanodiamonds simpler by using an interdisciplinary approach., The tweezer - a low frequency electrothermoplasmonic tweezer (LFET) - combines a fraction of a laser beam with a low-frequency alternating current electric field., This is an entirely new mechanism to trap and move nanodiamonds.

    From Vanderbilt University (US) : “Research Snapshot: Vanderbilt engineer is the first to introduce low-power dynamic manipulation of single nanoscale quantum objects” 

    Vanderbilt U Bloc

    From Vanderbilt University (US)

    Jul. 30, 2021
    Marissa Shapiro

    Low frequency electrothermoplasmonic tweezer device rendering. (Ndukaife.)


    Led by Justus Ndukaife, assistant professor of electrical engineering, Vanderbilt researchers are the first to introduce an approach for trapping and moving a nanomaterial known as a single colloidal nanodiamond with a nitrogen-vacancy center using a low-power laser beam. The width of a single human hair is approximately 90,000 nanometers; nanodiamonds are less than 100 nanometers across. These carbon-based materials are among the few that can release the basic unit of all light—a single photon—a building block for future quantum photonics applications, Ndukaife explains.

    Currently, it is possible to trap nanodiamonds using light fields focused near nano-sized metallic surfaces, but it is not possible to move them that way because laser beam spots are simply too big. Using an atomic force microscope, it takes scientists hours to push nanodiamonds into place one at a time near an emission-enhancing environment to form a useful structure. Further, to create entangled sources and qubits—key elements that improve the processing speeds of quantum computers—several nanodiamond emitters are needed close together so that they can interact, Ndukaife said.

    “We set out to make trapping and manipulating nanodiamonds simpler by using an interdisciplinary approach,” Ndukaife said. “Our tweezer, a low-frequency electrothermoplasmonic tweezer (LFET), combines a fraction of a laser beam with a low-frequency alternating current electric field. This is an entirely new mechanism to trap and move nanodiamonds.” A tedious hours-long process has been cut down to seconds, and LFET is the first scalable transport and on-demand assembly technology of its kind.


    Ndukaife’s work is a key ingredient for quantum computing, a technology that will soon enable a huge number of applications, from high-resolution imaging to the creation of unhackable systems and ever smaller devices and computer chips. In 2019, the Department of Energy invested $60.7 million in funding to advance the development of quantum computing and networking.

    “Controlling nanodiamonds to make efficient single photon sources that can be used for these kinds of technologies will shape the future,” Ndukaife said. “To enhance quantum properties, it is essential to couple quantum emitters such as nanodiamonds with nitrogen-vacancy centers to nanophotonic structures.”


    Ndukaife intends to further explore nanodiamonds, arranging them onto nanophotonic structures designed to enhance their emission performance. With them in place, his lab will explore the possibilities for ultrabright single photon sources and entanglement in an on-chip platform for information processing and imaging.

    “There are so many things we can use this research to build upon,” Ndukaife said. “This is the first technique that allows us to dynamically manipulate single nanoscale objects in two dimensions using a low power laser beam.”

    Science paper:
    Nano Letters

    Coauthored by graduate students in Ndukaife’s lab, Chuchuan Hong and Sen Yang, as well as their collaborator, Ivan Kravchenko at DOE’s Oak Ridge National Laboratory (US).

    See the full article here.



    Commodore Cornelius Vanderbilt was in his 79th year when he decided to make the gift that founded Vanderbilt University (US) in the spring of 1873.
    The $1 million that he gave to endow and build the university was the commodore’s only major philanthropy. Methodist Bishop Holland N. McTyeire of Nashville, husband of Amelia Townsend who was a cousin of the commodore’s young second wife Frank Crawford, went to New York for medical treatment early in 1873 and spent time recovering in the Vanderbilt mansion. He won the commodore’s admiration and support for the project of building a university in the South that would “contribute to strengthening the ties which should exist between all sections of our common country.”

    McTyeire chose the site for the campus, supervised the construction of buildings and personally planted many of the trees that today make Vanderbilt a national arboretum. At the outset, the university consisted of one Main Building (now Kirkland Hall), an astronomical observatory and houses for professors. Landon C. Garland was Vanderbilt’s first chancellor, serving from 1875 to 1893. He advised McTyeire in selecting the faculty, arranged the curriculum and set the policies of the university.

    For the first 40 years of its existence, Vanderbilt was under the auspices of the Methodist Episcopal Church, South. The Vanderbilt Board of Trust severed its ties with the church in June 1914 as a result of a dispute with the bishops over who would appoint university trustees.

    From the outset, Vanderbilt met two definitions of a university: It offered work in the liberal arts and sciences beyond the baccalaureate degree and it embraced several professional schools in addition to its college. James H. Kirkland, the longest serving chancellor in university history (1893-1937), followed Chancellor Garland. He guided Vanderbilt to rebuild after a fire in 1905 that consumed the main building, which was renamed in Kirkland’s honor, and all its contents. He also navigated the university through the separation from the Methodist Church. Notable advances in graduate studies were made under the third chancellor, Oliver Cromwell Carmichael (1937-46). He also created the Joint University Library, brought about by a coalition of Vanderbilt, Peabody College and Scarritt College.

    Remarkable continuity has characterized the government of Vanderbilt. The original charter, issued in 1872, was amended in 1873 to make the legal name of the corporation “The Vanderbilt University.” The charter has not been altered since.

    The university is self-governing under a Board of Trust that, since the beginning, has elected its own members and officers. The university’s general government is vested in the Board of Trust. The immediate government of the university is committed to the chancellor, who is elected by the Board of Trust.

    The original Vanderbilt campus consisted of 75 acres. By 1960, the campus had spread to about 260 acres of land. When George Peabody College for Teachers merged with Vanderbilt in 1979, about 53 acres were added.

    Vanderbilt’s student enrollment tended to double itself each 25 years during the first century of the university’s history: 307 in the fall of 1875; 754 in 1900; 1,377 in 1925; 3,529 in 1950; 7,034 in 1975. In the fall of 1999 the enrollment was 10,127.

    In the planning of Vanderbilt, the assumption seemed to be that it would be an all-male institution. Yet the board never enacted rules prohibiting women. At least one woman attended Vanderbilt classes every year from 1875 on. Most came to classes by courtesy of professors or as special or irregular (non-degree) students. From 1892 to 1901 women at Vanderbilt gained full legal equality except in one respect — access to dorms. In 1894 the faculty and board allowed women to compete for academic prizes. By 1897, four or five women entered with each freshman class. By 1913 the student body contained 78 women, or just more than 20 percent of the academic enrollment.

    National recognition of the university’s status came in 1949 with election of Vanderbilt to membership in the select Association of American Universities (US). In the 1950s Vanderbilt began to outgrow its provincial roots and to measure its achievements by national standards under the leadership of Chancellor Harvie Branscomb. By its 90th anniversary in 1963, Vanderbilt for the first time ranked in the top 20 private universities in the United States.

    Vanderbilt continued to excel in research, and the number of university buildings more than doubled under the leadership of Chancellors Alexander Heard (1963-1982) and Joe B. Wyatt (1982-2000), only the fifth and sixth chancellors in Vanderbilt’s long and distinguished history. Heard added three schools (Blair, the Owen Graduate School of Management and Peabody College) to the seven already existing and constructed three dozen buildings. During Wyatt’s tenure, Vanderbilt acquired or built one-third of the campus buildings and made great strides in diversity, volunteerism and technology.

    The university grew and changed significantly under its seventh chancellor, Gordon Gee, who served from 2000 to 2007. Vanderbilt led the country in the rate of growth for academic research funding, which increased to more than $450 million, and became one of the most selective undergraduate institutions in the country.

    On March 1, 2008, Nicholas S. Zeppos was named Vanderbilt’s eighth chancellor after serving as interim chancellor beginning Aug. 1, 2007. Prior to that, he spent 2002-2008 as Vanderbilt’s provost, overseeing undergraduate, graduate and professional education programs as well as development, alumni relations and research efforts in liberal arts and sciences, engineering, music, education, business, law and divinity. He first came to Vanderbilt in 1987 as an assistant professor in the law school. In his first five years, Zeppos led the university through the most challenging economic times since the Great Depression, while continuing to attract the best students and faculty from across the country and around the world. Vanderbilt got through the economic crisis notably less scathed than many of its peers and began and remained committed to its much-praised enhanced financial aid policy for all undergraduates during the same timespan. The Martha Rivers Ingram Commons for first-year students opened in 2008 and College Halls, the next phase in the residential education system at Vanderbilt, is on track to open in the fall of 2014. During Zeppos’ first five years, Vanderbilt has drawn robust support from federal funding agencies, and the Medical Center entered into agreements with regional hospitals and health care systems in middle and east Tennessee that will bring Vanderbilt care to patients across the state.

    Today, Vanderbilt University is a private research university of about 6,500 undergraduates and 5,300 graduate and professional students. The university comprises 10 schools, a public policy center and The Freedom Forum First Amendment Center. Vanderbilt offers undergraduate programs in the liberal arts and sciences, engineering, music, education and human development as well as a full range of graduate and professional degrees. The university is consistently ranked as one of the nation’s top 20 universities by publications such as U.S. News & World Report, with several programs and disciplines ranking in the top 10.

    Cutting-edge research and liberal arts, combined with strong ties to a distinguished medical center, create an invigorating atmosphere where students tailor their education to meet their goals and researchers collaborate to solve complex questions affecting our health, culture and society.

    Vanderbilt, an independent, privately supported university, and the separate, non-profit Vanderbilt University Medical Center share a respected name and enjoy close collaboration through education and research. Together, the number of people employed by these two organizations exceeds that of the largest private employer in the Middle Tennessee region.

  • richardmitnick 11:24 am on June 2, 2021 Permalink | Reply
    Tags: "UArizona Engineers Demonstrate a Quantum Advantage", How (and When) Quantum Works, Quantum computing and quantum sensing have the potential to be vastly more powerful than their classical counterparts., Qubits, The technology isn't quite there yet, UArizona College of Engineering, UArizona College of Optical Sciences

    From University of Arizona (US) : “UArizona Engineers Demonstrate a Quantum Advantage” 

    From University of Arizona (US)


    Emily Dieckman
    College of Engineering

    In a new paper, researchers in the College of Engineering and James C. Wyant College of Optical Sciences experimentally demonstrate how quantum resources aren’t just dreams for the distant future – they can improve the technology of today.


    Quantum computing and quantum sensing have the potential to be vastly more powerful than their classical counterparts. Not only could a fully realized quantum computer take just seconds to solve equations that would take a classical computer thousands of years, but it could have incalculable impacts on areas ranging from biomedical imaging to autonomous driving.

    However, the technology isn’t quite there yet.

    In fact, despite widespread theories about the far-reaching impact of quantum technologies, very few researchers have been able to demonstrate, using the technology available now, that quantum methods have an advantage over their classical counterparts.

    In a paper published on June 1 in the journal Physical Review X, University of Arizona researchers experimentally demonstrate such a quantum advantage over classical systems.

    Quntao Zhuang (left), PI of the Quantum Information Theory Group, and Zheshen Zhang, PI of the Quantum Information and Materials Group, are both assistant professors in the College of Engineering.

    “Demonstrating a quantum advantage is a long-sought-after goal in the community, and very few experiments have been able to show it,” said paper co-author Zheshen Zhang, assistant professor of materials science and engineering and principal investigator of the UArizona Quantum Information and Materials Group. “We are seeking to demonstrate how we can leverage the quantum technology that already exists to benefit real-world applications.”

    How (and When) Quantum Works

    Quantum computing and other quantum processes rely on tiny, powerful units of information called qubits. The classical computers we use today work with units of information called bits, which exist as either 0s or 1s, but qubits are capable of existing in both states at the same time. This ability to hold both values at once makes them both powerful and fragile. The delicate qubits are prone to collapse without warning, making a process called error correction – which addresses such problems as they happen – very important.
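    That description of bits versus qubits can be made concrete with a few lines of arithmetic (a schematic illustration, not how real quantum hardware is programmed; the helper names are our own): a qubit is modeled as a pair of amplitudes, and a Hadamard gate turns a definite 0 into an equal superposition of 0 and 1.

```python
import math

# A qubit as a pair of amplitudes (alpha, beta): |psi> = alpha|0> + beta|1>.
# Measuring yields 0 with probability |alpha|^2 and 1 with probability |beta|^2.
ZERO = (1.0, 0.0)  # a classical-style bit holding a definite 0

def hadamard(state):
    """Hadamard gate: maps a definite 0 into an equal superposition of 0 and 1."""
    a, b = state
    s = 1.0 / math.sqrt(2.0)
    return (s * (a + b), s * (a - b))

def probabilities(state):
    """Measurement probabilities for outcomes 0 and 1."""
    a, b = state
    return (abs(a) ** 2, abs(b) ** 2)

print(probabilities(ZERO))            # definite:      (1.0, 0.0)
print(probabilities(hadamard(ZERO)))  # superposition: ~(0.5, 0.5)
```

    A bit can only ever print (1.0, 0.0) or (0.0, 1.0); the superposed qubit carries both outcomes at once until it is measured.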

    The quantum field is now in an era that John Preskill, a renowned physicist from the California Institute of Technology (US), termed “noisy intermediate-scale quantum,” or NISQ. In the NISQ era, quantum computers can perform tasks that only require about 50 to a few hundred qubits, though with a significant amount of noise, or interference. Any more than that and the noisiness overpowers the usefulness, causing everything to collapse. It is widely believed that 10,000 to several million qubits would be needed to carry out practically useful quantum applications.

    Imagine inventing a system that guarantees every meal you cook will turn out perfectly, and then giving that system to a group of children who don’t have the right ingredients. It will be great in a few years, once the kids become adults and can buy what they need. But until then, the usefulness of the system is limited. Similarly, until researchers advance the field of error correction, which can reduce noise levels, quantum computations are limited to a small scale.

    Entanglement Advantages

    The experiment described in the paper used a mix of both classical and quantum techniques. Specifically, it used three sensors to classify the average amplitude and angle of radio frequency signals.

    The sensors were equipped with another quantum resource called entanglement, which allows them to share information with one another and provides two major benefits: First, it improves the sensitivity of the sensors and reduces errors. Second, because they are entangled, the sensors evaluate global properties rather than gathering data about specific parts of a system. This is useful for applications that only need a binary answer; for example, in medical imaging, researchers don’t need to know about every single cell in a tissue sample that isn’t cancerous – just whether there’s one cell that is cancerous. The same concept applies to detecting hazardous chemicals in drinking water.

    The experiment demonstrated that equipping the sensors with quantum entanglement gave them an advantage over classical sensors, reducing the likelihood of errors by a small but critical margin.
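    The flavor of that margin can be illustrated with a toy Monte Carlo of the textbook scaling laws: independent sensors average noise down as 1/√N (the standard quantum limit), while idealized entangled sensors can approach 1/N (the Heisenberg limit). This is our generic illustration of those scalings, not a simulation of the UArizona experiment; the function and its parameters are assumptions.

```python
import math
import random

def estimation_error(n_sensors, scaling, trials=20_000, noise=1.0):
    """Empirical spread of a signal estimate whose effective noise shrinks
    as noise / n_sensors**scaling.
    scaling = 0.5: independent sensors (standard quantum limit).
    scaling = 1.0: idealized entangled sensors (Heisenberg limit)."""
    sigma = noise / n_sensors ** scaling
    samples = [random.gauss(0.0, sigma) for _ in range(trials)]
    mean = sum(samples) / trials
    return math.sqrt(sum((x - mean) ** 2 for x in samples) / trials)

# With 16 sensors, the entangled scaling gives a 4x smaller error bar:
print(estimation_error(16, 0.5))  # ~0.25
print(estimation_error(16, 1.0))  # ~0.0625
```

    In practice real entangled sensors land somewhere between the two limits, but even a small, reliable reduction in error can be the critical margin the article describes.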

    “This idea of using entanglement to improve sensors is not limited to a specific type of sensor, so it could be used for a range of different applications, as long as you have the equipment to entangle the sensors,” said study co-author Quntao Zhuang, assistant professor of electrical and computer engineering and principal investigator of the Quantum Information Theory Group. “In theory, you could consider applications like lidar (Light Detection and Ranging) for self-driving cars, for example.”

    Zhuang and Zhang developed the theory behind the experiment and described it in a 2019 Physical Review X paper. They co-authored the new paper with lead author Yi Xia, a doctoral student in the James C. Wyant College of Optical Sciences, and Wei Li, a postdoctoral researcher in materials science and engineering.

    Qubit Classifiers

    There are existing applications that use a mix of quantum and classical processing in the NISQ era, but they rely on preexisting classical datasets that must be converted and classified in the quantum realm. Imagine taking a series of photos of cats and dogs, then uploading the photos into a system that uses quantum methods to label the photos as either “cat” or “dog.”

    The team is tackling the labeling process from a different angle, by using quantum sensors to gather their own data in the first place. It’s more like using a specialized quantum camera that labels the photos as either “dog” or “cat” as the photos are taken.

    “A lot of algorithms consider data stored on a computer disk, and then convert that into a quantum system, which takes time and effort,” Zhuang said. “Our system works on a different problem by evaluating physical processes that are happening in real time.”

    The team is excited for future applications of their work at the intersection of quantum sensing and quantum computing. They even envision one day integrating their entire experimental setup onto a chip that could be dipped into a biomaterial or water sample to identify disease or harmful chemicals.

    “We think it’s a new paradigm for quantum computing, quantum machine learning and quantum sensors, because it really creates a bridge to interconnect all these different domains,” Zhang said.

    See the full article here.


    As of 2019, the University of Arizona (US) enrolled 45,918 students in 19 separate colleges/schools, including the UArizona College of Medicine in Tucson and Phoenix and the James E. Rogers College of Law, and is affiliated with two academic medical centers (Banner – University Medical Center Tucson and Banner – University Medical Center Phoenix). UArizona is one of three universities governed by the Arizona Board of Regents. The university is part of the Association of American Universities and is the only member from Arizona, and also part of the Universities Research Association(US). The university is classified among “R1: Doctoral Universities – Very High Research Activity”.

    Known as the Arizona Wildcats (often shortened to “Cats”), the UArizona’s intercollegiate athletic teams are members of the Pac-12 Conference of the NCAA. UArizona athletes have won national titles in several sports, most notably men’s basketball, baseball, and softball. The official colors of the university and its athletic teams are cardinal red and navy blue.

    After the passage of the Morrill Land-Grant Act of 1862, the push for a university in Arizona grew. The Arizona Territory’s “Thieving Thirteenth” Legislature approved the UArizona in 1885 and selected the city of Tucson to receive the appropriation to build the university. Tucson hoped to receive the appropriation for the territory’s mental hospital, which carried a $100,000 allocation instead of the $25,000 allotted to the territory’s only university (Arizona State University(US) was also chartered in 1885, but it was created as Arizona’s normal school, and not a university). Flooding on the Salt River delayed Tucson’s legislators, and by the time they reached Prescott, back-room deals allocating the most desirable territorial institutions had been made. Tucson was largely disappointed with receiving what was viewed as an inferior prize.

    With no parties willing to provide land for the new institution, the citizens of Tucson prepared to return the money to the Territorial Legislature until two gamblers and a saloon keeper decided to donate the land to build the school. Construction of Old Main, the first building on campus, began on October 27, 1887, and classes met for the first time in 1891 with 32 students in Old Main, which is still in use today. Because there were no high schools in Arizona Territory, the university maintained separate preparatory classes for the first 23 years of operation.


    UArizona is classified among “R1: Doctoral Universities – Very high research activity”. UArizona is the fourth most awarded public university by National Aeronautics and Space Administration(US) for research. UArizona was awarded over $325 million for its Lunar and Planetary Laboratory (LPL) to lead NASA’s 2007–08 mission to Mars to explore the Martian Arctic, and $800 million for its OSIRIS-REx mission, the first in U.S. history to sample an asteroid.

    The LPL’s role in the Cassini mission orbiting Saturn is larger than that of any other university globally. The UArizona laboratory designed and operated the atmospheric radiation investigations and imaging on the probe. UArizona operates the HiRISE camera, a part of the Mars Reconnaissance Orbiter. While using the HiRISE camera in 2011, UArizona alumnus Lujendra Ojha and his team discovered proof of liquid water on the surface of Mars—a discovery confirmed by NASA in 2015. UArizona receives more NASA grants annually than the next nine top NASA/JPL-Caltech(US)-funded universities combined. As of March 2016, the UArizona’s Lunar and Planetary Laboratory is actively involved in ten spacecraft missions: Cassini VIMS; Grail; the HiRISE camera orbiting Mars; the Juno mission orbiting Jupiter; Lunar Reconnaissance Orbiter (LRO); Maven, which will explore Mars’ upper atmosphere and interactions with the sun; Solar Probe Plus, a historic mission into the Sun’s atmosphere for the first time; Rosetta’s VIRTIS; WISE; and OSIRIS-REx, the first U.S. sample-return mission to a near-earth asteroid, which launched on September 8, 2016.

    UArizona students have been selected as Truman, Rhodes, Goldwater, and Fulbright Scholars. According to The Chronicle of Higher Education, UArizona is among the top 25 producers of Fulbright awards in the U.S.

    UArizona is a member of the Association of Universities for Research in Astronomy(US), a consortium of institutions pursuing research in astronomy. The association operates observatories and telescopes, notably Kitt Peak National Observatory(US) just outside Tucson. Led by Roger Angel, researchers in the Steward Observatory Mirror Lab at UArizona are working in concert to build the world’s most advanced telescope. Known as the Giant Magellan Telescope(CL), it will produce images 10 times sharper than those from the Earth-orbiting Hubble Telescope.

    Giant Magellan Telescope, 21 meters, to be at the NOIRLab(US) National Optical Astronomy Observatory(US) Carnegie Institution for Science’s(US) Las Campanas Observatory(CL), some 115 km (71 mi) north-northeast of La Serena, Chile, over 2,500 m (8,200 ft) high.

    The telescope is set to be completed in 2021. GMT will ultimately cost $1 billion. Researchers from at least nine institutions are working to secure the funding for the project. The telescope will include seven 18-ton mirrors capable of providing clear images of volcanoes and riverbeds on Mars and mountains on the moon at a rate 40 times faster than the world’s current large telescopes. The mirrors of the Giant Magellan Telescope will be built at UArizona and transported to a permanent mountaintop site in the Chilean Andes where the telescope will be constructed.

    Reaching Mars in March 2006, the Mars Reconnaissance Orbiter contained the HiRISE camera, with Principal Investigator Alfred McEwen as the lead on the project. This National Aeronautics and Space Administration(US) mission to Mars carrying the UArizona-designed camera is capturing the highest-resolution images of the planet ever seen. The journey of the orbiter was 300 million miles. In August 2007, the UArizona, under the charge of Scientist Peter Smith, led the Phoenix Mars Mission, the first mission completely controlled by a university. Reaching the planet’s surface in May 2008, the mission’s purpose was to improve knowledge of the Martian Arctic. The Arizona Radio Observatory(US), a part of UArizona Department of Astronomy Steward Observatory(US), operates the Submillimeter Telescope on Mount Graham.

    The National Science Foundation(US) funded the iPlant Collaborative in 2008 with a $50 million grant. In 2013, iPlant Collaborative received a $50 million renewal grant. Rebranded in late 2015 as “CyVerse”, the collaborative cloud-based data management platform is moving beyond life sciences to provide cloud-computing access across all scientific disciplines.

    In June 2011, the university announced it would assume full ownership of the Biosphere 2 scientific research facility in Oracle, Arizona, north of Tucson, effective July 1. Biosphere 2 was constructed by private developers (funded mainly by Texas businessman and philanthropist Ed Bass) with its first closed system experiment commencing in 1991. The university had been the official management partner of the facility for research purposes since 2007.

    U Arizona mirror lab-Where else in the world can you find an astronomical observatory mirror lab under a football stadium?

    University of Arizona’s Biosphere 2, located in the Sonoran desert. An entire ecosystem under a glass dome? Visit our campus, just once, and you’ll quickly understand why the UA is a university unlike any other.

  • richardmitnick 9:42 am on March 19, 2021 Permalink | Reply
    Tags: "Magnetism Meets Topology on a Superconductor's Surface", Dirac point, Electrons in a solid occupy distinct energy bands separated by gaps., Energy band gaps are an electronic “no man’s land”-an energy range where no electrons are allowed., One of the ways to break time-reversal symmetry is by developing magnetism—specifically ferromagnetism., Qubits, Theory predicts that Majorana fermions (sought-after quasiparticles) existing in superconducting topological surface states are immune to environmental disturbances., This unusual electronic energy structure could be harnessed for technologies of interest in quantum information science and electronics., Time-reversal symmetry means that the laws of physics are the same whether you look at a system going forward or backward., When a gap opens up at the Dirac point it’s evidence that time-reversal symmetry has been broken.

    From DOE’s Brookhaven National Laboratory (US): “Magnetism Meets Topology on a Superconductor’s Surface” 

    From DOE’s Brookhaven National Laboratory (US)

    March 17, 2021

    Ariana Manglaviti
    (631) 344-2347

    Peter Genzer
    (631) 344-3174

    This unusual electronic energy structure could be harnessed for technologies of interest in quantum information science and electronics.

    An illustration depicting a topological surface state with an energy band gap (an energy range where electrons are forbidden) between the apices of the top and corresponding bottom cones (allowed energy bands, or the range of energies electrons are allowed to have). A topological surface state is a unique electronic state, only existing at the surface of a material, that reflects strong interactions between an electron’s spin (red arrow) and its orbital motion around an atom’s nucleus. When the electron spins align parallel to each another, as they do here, the material has a type of magnetism called ferromagnetism. Credit: Dan Nevola, DOE’s Brookhaven National Laboratory(US).

    Electrons in a solid occupy distinct energy bands separated by gaps. Energy band gaps are an electronic “no man’s land,” an energy range where no electrons are allowed. Now, scientists studying a compound containing iron, tellurium, and selenium have found that an energy band gap opens at a point where two allowed energy bands intersect on the material’s surface. They observed this unexpected electronic behavior when they cooled the material and probed its electronic structure with laser light. Their findings, reported in PNAS, could have implications for future quantum information science and electronics.

    The particular compound belongs to the family of iron-based high-temperature superconductors, which were initially discovered in 2008. These materials not only conduct electricity without resistance at relatively higher temperatures (but still very cold ones) than other classes of superconductors but also show magnetic properties.

    “For a while, people thought that superconductivity and magnetism would work against each other,” said first author Nader Zaki, a scientific associate in the Electron Spectroscopy Group of the Condensed Matter Physics and Materials Science (CMPMS) Division at the DOE’s Brookhaven National Laboratory(US). “We have explored a material where both develop at the same time.”

    Aside from superconductivity and magnetism, some iron-based superconductors have the right conditions to host “topological” surface states. The existence of these unique electronic states, localized at the surface (they do not exist in the bulk of the material), reflects strong interactions between an electron’s spin and its orbital motion around the nucleus of an atom.

    “When you have a superconductor with topological surface properties, you’re excited by the possibility of topological superconductivity,” said corresponding author Peter Johnson, leader of the Electron Spectroscopy Group. “Topological superconductivity is potentially capable of supporting Majorana fermions, which could serve as qubits, the information-storing building blocks of quantum computers.”

    Quantum computers promise tremendous speedups for calculations that would take an impractical amount of time or be impossible on traditional computers. One of the challenges to realizing practical quantum computing is that qubits are highly sensitive to their environment. Small interactions cause them to lose their quantum state, and the stored information is lost. Theory predicts that Majorana fermions (sought-after quasiparticles) existing in superconducting topological surface states are immune to environmental disturbances, making them an ideal platform for robust qubits.

    Seeing the iron-based superconductors as a platform for a range of exotic and potentially important phenomena, Zaki, Johnson, and their colleagues set out to understand the roles of topology, superconductivity and magnetism.

    CMPMS Division senior physicist Genda Gu first grew high-quality single crystals of the iron-based compound. Then, Zaki mapped the electronic band structure of the material via laser-based photoemission spectroscopy. When light from a laser is focused onto a small spot on the material, electrons from the surface are “kicked out” (i.e., photoemitted). The energy and momentum of these electrons can then be measured.

    When they lowered the temperature, something surprising happened.

    “The material went superconducting, as we expected, and we saw a superconducting gap associated with that,” said Zaki. “But what we didn’t expect was the topological surface state opening up a second gap at the Dirac point. You can picture the energy band structure of this surface state as an hourglass or two cones attached at their apex. Where these cones intersect is called the Dirac point.”

    As Johnson and Zaki explained, when a gap opens up at the Dirac point it’s evidence that time-reversal symmetry has been broken. Time-reversal symmetry means that the laws of physics are the same whether you look at a system going forward or backward in time—akin to rewinding a video and seeing the same sequence of events playing in reverse. But under time reversal, electron spins change their direction and break this symmetry. Thus, one of the ways to break time-reversal symmetry is by developing magnetism—specifically, ferromagnetism, a type of magnetism where all electron spins align in a parallel fashion.
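    The hourglass picture and the gap at its waist can be captured in the generic two-band surface-state model (standard textbook form, not the specific Hamiltonian fitted in this study):

    ```latex
    % Dirac surface state with a time-reversal-breaking mass term m:
    H(\mathbf{k}) = \hbar v_F \left( k_x \sigma_y - k_y \sigma_x \right) + m\,\sigma_z,
    \qquad
    E_\pm(\mathbf{k}) = \pm\sqrt{\hbar^2 v_F^2 k^2 + m^2}.
    % With m = 0 the two cones touch at the Dirac point (E = 0).
    % A ferromagnetic exchange term m\,\sigma_z breaks time-reversal
    % symmetry and opens a gap of size 2|m| at the Dirac point.
    ```

    In this sketch the observed second gap corresponds to a nonzero mass term, which is why its appearance signals broken time-reversal symmetry.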

    “The system is going into the superconducting state and seemingly magnetism is developing,” said Johnson. “We have to assume the magnetism is in the surface region because in this form it cannot coexist in the bulk. This discovery is exciting because the material has a lot of different physics in it: superconductivity, topology, and now magnetism. I like to say it’s one-stop shopping. Understanding how these phenomena arise in the material could provide a basis for many new and exciting technological directions.”

    As previously noted, the material’s superconductivity and strong spin-orbit effects could be harnessed for quantum information technologies. Alternatively, the material’s magnetism and strong spin-orbit interactions could enable dissipationless (no energy loss) transport of electrical current in electronics. This capability could be leveraged to develop electronic devices that consume low amounts of power.

    Coauthors Alexei Tsvelik, senior scientist and group leader of the CMPMS Division Condensed Matter Theory Group, and Congjun Wu, a professor of physics at the University of California San Diego(US), provided theoretical insights on how time reversal symmetry is broken and magnetism originates in the surface region.

    “This discovery not only reveals deep connections between topological superconducting states and spontaneous magnetization but also provides important insights into the nature of superconducting gap functions in iron-based superconductors—an outstanding problem in the investigation of strongly correlated unconventional superconductors,” said Wu.

    In a separate study with other collaborators in the CMPMS Division, the experimental team is examining how different concentrations of the three elements in the sample contribute to the observed phenomena. Seemingly, tellurium is needed for the topological effects, too much iron kills superconductivity, and selenium enhances superconductivity.

    In follow-on experiments, the team hopes to verify the time-reversal symmetry breaking with other methods and explore how substituting elements in the compound modifies its electronic behavior.

    “As materials scientists, we like to alter the ingredients in the mixture to see what happens,” said Johnson. “The goal is to figure out how superconductivity, topology, and magnetism interact in these complex materials.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    One of ten national laboratories overseen and primarily funded by the DOE(US) Office of Science, DOE’s Brookhaven National Laboratory(US) conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University (US), the largest academic user of Laboratory facilities, and Battelle(US), a nonprofit, applied science and technology organization.

    Research at BNL specializes in nuclear and high energy physics, energy science and technology, environmental and bioscience, nanoscience and national security. The 5,300 acre campus contains several large research facilities, including the Relativistic Heavy Ion Collider [below] and National Synchrotron Light Source II [below]. Seven Nobel prizes have been awarded for work conducted at Brookhaven lab.

    BNL is staffed by approximately 2,750 scientists, engineers, technicians, and support personnel, and hosts 4,000 guest investigators every year. The laboratory has its own police station, fire department, and ZIP code (11973). In total, the lab spans a 5,265-acre (21 km^2) area that is mostly coterminous with the hamlet of Upton, New York. BNL is served by a rail spur operated as-needed by the New York and Atlantic Railway. Co-located with the laboratory is the Upton, New York, forecast office of the National Weather Service.

    Major programs

    Although originally conceived as a nuclear research facility, Brookhaven Lab’s mission has greatly expanded. Its foci are now:

    Nuclear and high-energy physics
    Physics and chemistry of materials
    Environmental and climate research
    Energy research
    Structural biology
    Accelerator physics


    Brookhaven National Lab was originally owned by the Atomic Energy Commission(US) and is now owned by that agency’s successor, the United States Department of Energy (DOE). DOE subcontracts the research and operation to universities and research organizations. It is currently operated by Brookhaven Science Associates LLC, which is an equal partnership of Stony Brook University(US) and Battelle Memorial Institute(US). From 1947 to 1998, it was operated by Associated Universities, Inc. (AUI), but AUI lost its contract in the wake of two incidents: a 1994 fire at the facility’s High Flux Beam Reactor that exposed several workers to radiation, and reports in 1997 of a tritium leak into the groundwater of the Long Island Central Pine Barrens on which the facility sits.


    Following World War II, the US Atomic Energy Commission was created to support government-sponsored peacetime research on atomic energy. The effort to build a nuclear reactor in the American northeast was fostered largely by physicists Isidor Isaac Rabi and Norman Foster Ramsey Jr., who during the war witnessed many of their colleagues at Columbia University leave for new remote research sites following the departure of the Manhattan Project from its campus. Their effort to house this reactor near New York City was rivalled by a similar effort at the Massachusetts Institute of Technology(US) to have a facility near Boston, Massachusetts(US). Involvement was quickly solicited from representatives of northeastern universities to the south and west of New York City such that this city would be at their geographic center. In March 1946 a nonprofit corporation was established that consisted of representatives from nine major research universities — Columbia(US), Cornell(US), Harvard(US), Johns Hopkins(US), MIT, Princeton University(US), University of Pennsylvania(US), University of Rochester(US), and Yale University(US).

    Out of 17 considered sites in the Boston-Washington corridor, Camp Upton on Long Island was eventually chosen as the most suitable in consideration of space, transportation, and availability. The camp had been a training center for the US Army during both World War I and World War II. After the latter war, Camp Upton was deemed no longer necessary and became available for reuse. A plan was conceived to convert the military camp into a research facility.

    On March 21, 1947, the Camp Upton site was officially transferred from the U.S. War Department to the new U.S. Atomic Energy Commission (AEC), predecessor to the U.S. Department of Energy (DOE).

    Research and facilities

    Reactor history

    In 1947 construction began on the first nuclear reactor at Brookhaven, the Brookhaven Graphite Research Reactor. This reactor, which opened in 1950, was the first reactor to be constructed in the United States after World War II. The High Flux Beam Reactor operated from 1965 to 1999. In 1959 Brookhaven built the first US reactor specifically tailored to medical research, the Brookhaven Medical Research Reactor, which operated until 2000.

    Accelerator history

    In 1952 Brookhaven began using its first particle accelerator, the Cosmotron. At the time the Cosmotron was the world’s highest energy accelerator, being the first to impart more than 1 GeV of energy to a particle.

    BNL Cosmotron 1952-1966

    The Cosmotron was retired in 1966, after it was superseded in 1960 by the new Alternating Gradient Synchrotron (AGS).

    BNL Alternating Gradient Synchrotron (AGS)

    The AGS was used in research that resulted in 3 Nobel prizes, including the discovery of the muon neutrino, the charm quark, and CP violation.

    In 1970 BNL started the ISABELLE project to develop and build two proton intersecting storage rings.

    The groundbreaking for the project was in October 1978. In 1981, with the tunnel for the accelerator already excavated, problems with the superconducting magnets needed for the ISABELLE accelerator brought the project to a halt, and the project was eventually cancelled in 1983.

    The National Synchrotron Light Source operated from 1982 to 2014 and was involved with two Nobel Prize-winning discoveries. It has since been replaced by the National Synchrotron Light Source II [below].


    After ISABELLE’s cancellation, physicists at BNL proposed that the excavated tunnel and parts of the magnet assembly be used in another accelerator. In 1984 the first proposal for the accelerator now known as the Relativistic Heavy Ion Collider (RHIC) [below] was put forward. Construction was funded in 1991 and RHIC has been operational since 2000. One of the world’s only two operating heavy-ion colliders, RHIC is as of 2010 the second-highest-energy collider after the Large Hadron Collider(CH). RHIC is housed in a tunnel 2.4 miles (3.9 km) long and is visible from space.

    On January 9, 2020, it was announced by Paul Dabbar, undersecretary of the US Department of Energy Office of Science, that the BNL eRHIC design had been selected over the conceptual design put forward by DOE’s Thomas Jefferson National Accelerator Facility [JLab] as the future Electron–Ion Collider (EIC) in the United States.

    Electron-Ion Collider (EIC) at BNL, to be built inside the tunnel that currently houses the RHIC.

    In addition to the site selection, it was announced that the BNL EIC had acquired CD-0 (mission need) status from the Department of Energy. BNL’s eRHIC design proposes upgrading the existing Relativistic Heavy Ion Collider, which collides beams of ions from light to heavy, including polarized protons, with a polarized electron facility, to be housed in the same tunnel.

    Other discoveries

    In 1958, Brookhaven scientists created one of the world’s first video games, Tennis for Two. In 1968 Brookhaven scientists patented Maglev, a transportation technology that utilizes magnetic levitation.

    Major facilities

    Relativistic Heavy Ion Collider (RHIC), which was designed to research quark–gluon plasma and the sources of proton spin. Until 2009 it was the world’s most powerful heavy ion collider. It is the only collider of spin-polarized protons.
    Center for Functional Nanomaterials (CFN), used for the study of nanoscale materials.
    National Synchrotron Light Source II (NSLS-II), Brookhaven’s newest user facility, opened in 2015 to replace the National Synchrotron Light Source (NSLS), which had operated for 30 years. NSLS was involved in the work that won the 2003 and 2009 Nobel Prizes in Chemistry.
    Alternating Gradient Synchrotron, a particle accelerator that was used in three of the lab’s Nobel prizes.
    Accelerator Test Facility, generates, accelerates and monitors particle beams.
    Tandem Van de Graaff, once the world’s largest electrostatic accelerator.
    Computational Science resources, including access to a massively parallel Blue Gene series supercomputer that is among the fastest in the world for scientific research, run jointly by Brookhaven National Laboratory and Stony Brook University.
    Interdisciplinary Science Building, with unique laboratories for studying high-temperature superconductors and other materials important for addressing energy challenges.
    NASA Space Radiation Laboratory, where scientists use beams of ions to simulate cosmic rays and assess the risks of space radiation to human space travelers and equipment.

    Off-site contributions

    It is a contributing partner to ATLAS experiment, one of the four detectors located at the Large Hadron Collider (LHC).

    CERN map

    Iconic view of the CERN (CH) ATLAS detector.

    The ATLAS detector is currently operating at CERN near Geneva, Switzerland.

    Brookhaven was also responsible for the design of the accumulator ring for the Spallation Neutron Source at DOE’s Oak Ridge National Laboratory, Tennessee.

    ORNL Spallation Neutron Source annotated.

    Brookhaven plays a role in a range of neutrino research projects around the world, including the Daya Bay Reactor Neutrino Experiment in China and the Deep Underground Neutrino Experiment at DOE’s Fermi National Accelerator Laboratory(US).

    Daya Bay, nuclear power plant, approximately 52 kilometers northeast of Hong Kong and 45 kilometers east of Shenzhen, China

    FNAL LBNF/DUNE from FNAL to SURF, Lead, South Dakota, USA.

    Brookhaven Campus.

    BNL Center for Functional Nanomaterials.



    BNL RHIC Campus.

    BNL/RHIC Star Detector.

    BNL/RHIC Phenix.

  • richardmitnick 11:52 am on February 21, 2021 Permalink | Reply
    Tags: "Technologies for More Powerful Quantum Computers", , Development Will Be Made Available to Innovative First Users, , Fraunhofer Society [Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e. V.](DE), Friedrich–Alexander University Erlangen–Nürnberg [Friedrich-Alexander-Universität Erlangen-Nürnberg](DE), Important Step towards the Development of Superconducting Quantum Circuits in Germany, Infineon, Novel Materials for Higher Quality of Qubits, Qubits, The collaboration project “German Quantum Computer Based on Superconducting Qubits” GeQCoS for short., , , Walther Meißner Institute of the Bavarian Academy of Sciences(DE)   

    From The Karlsruhe Institute of Technology-KIT [Karlsruher Institut für Technologie] (DE): “Technologies for More Powerful Quantum Computers” 


    From The Karlsruhe Institute of Technology-KIT [Karlsruher Institut für Technologie] (DE)

    29.01.2021 [Just now in social media.]

    Monika Landgraf
    Head of Corporate Communications, Chief Press Officer
    Phone: +49 721 608-41150
    Fax: +49 721 608-43658
    presse∂kit edu

    Contact for this press release:
    Johannes Wagner
    Phone: +49 721 608-41175
    johannes wagner∂kit edu

    Visualization of a quantum processor: Its core contains a chip on which superconducting qubits are arranged in a checkered pattern. Credit: Christoph Hohmann.

    Quantum computers will efficiently solve problems that could not be solved in the past. Examples are calculations of the properties of complex molecules for the pharmaceutical industry, or solutions of optimization problems for manufacturing processes in the automotive industry or for calculations in the financial sector. Within the framework of the “GeQCoS” collaboration project, Germany’s leading researchers in the area of superconducting quantum circuits are working on innovative concepts for designing better quantum processors. Researchers from Karlsruhe Institute of Technology (KIT) play an important role in the project.

    The collaboration project “German Quantum Computer Based on Superconducting Qubits,” GeQCoS for short, is aimed at developing a prototype quantum processor consisting of a few superconducting qubits with fundamentally improved components. The main components of a quantum computer, the quantum bits or qubits, will be implemented by zero-resistance currents in superconducting circuits. These currents are relatively robust against external disturbances and can preserve quantum states during operation.

    Novel Materials for Higher Quality of Qubits

    The planned improvements include an increase in connectivity (the number of connections among the qubits) as well as in the quality of the qubits (the ability to rapidly and efficiently produce the desired quantum states). “Currently, this is a big challenge,” says Dr. Ioan Pop from KIT’s Institute for Quantum Materials and Technologies. “Use of novel materials for the production of qubits is expected to result in better reproducibility and higher quality of the qubits.”

    Important Step towards the Development of Superconducting Quantum Circuits in Germany

    To achieve improvement, researchers collaborate closely in the areas of alternative components, change of architecture, coupling mechanisms, and higher precision of calculations. “This is a very important step towards the development of superconducting quantum circuits in Germany. This technology is preferred and pursued by IT managers in the area of quantum computers,” Professor Alexey Ustinov, Head of the research group at KIT’s Physikalisches Institut, emphasizes. “Localization and diagnosis of errors is rather challenging work. We have to improve fabrication methods to prevent faults that sustainably influence the quality of the qubits.”

    Today, quantum computers already are able to manage small specific problems and to exhibit basic functions, the experts say. In the long term, work is aimed at developing a so-called universal quantum computer that calculates important problems exponentially faster than a classical computer. An architecture suited for the calculation of practically relevant problems requires substantial improvement of both hardware and software.

    Development Will Be Made Available to Innovative First Users

    To reach this goal, scalable fabrication processes and optimized chip housings will be developed within the project. Eventually, the prototype quantum processor will be installed at the Walther Meißner Institute of the Bavarian Academy of Sciences. The technologies developed are not only expected to lead to new scientific findings. Close interconnection with companies will strengthen the quantum ecosystem in Germany and Europe. On both the hardware and software level, the quantum processor will be made available to innovative first users as early as possible.

    Apart from KIT, the Friedrich–Alexander University Erlangen–Nürnberg [Friedrich-Alexander-Universität Erlangen-Nürnberg](DE), Forschungszentrum Jülich Research Centre [Forschungszentrum Jülich] (FZJ)(DE), the Walther Meißner Institute of the Bavarian Academy of Sciences(DE), The Technical University of Munich [Technische Universität München](DE), Infineon, and the Fraunhofer Society [Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e. V.](DE) are involved in the project. The “GeQCoS” project is funded by the Federal Ministry of Education and Research with EUR 14.5 million, of which more than EUR 3 million goes to KIT.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition


    Mission Statement of KIT


    The Karlsruhe Institute of Technology-KIT [Karlsruher Institut für Technologie] (DE), briefly referred to as KIT, was established by the merger of the Forschungszentrum Karlsruhe GmbH and the Universität Karlsruhe (TH) on October 01, 2009. KIT combines the tasks of a university of the state of Baden-Württemberg with those of a research center of the Helmholtz Association of German Research Centres [Helmholtz-Gemeinschaft Deutscher Forschungszentren] (DE) in the areas of research, teaching, and innovation.

    The KIT merger represents the consistent continuation of a long-standing close cooperation of two research and education institutions rich in tradition. The University of Karlsruhe was founded in 1825 as a Polytechnical School and has developed to a modern location of research and education in natural sciences, engineering, economics, social sciences, and the humanities, which is organized in eleven departments. The Karlsruhe Research Center was founded in 1956 as the Nuclear Reactor Construction and Operation Company and has turned into a multidisciplinary large-scale research center of the Helmholtz Association, which conducts research under eleven scientific and engineering programs.

    Being “The Research University in the Helmholtz Association”, KIT creates and imparts knowledge for the society and the environment. It is the objective to make significant contributions to the global challenges in the fields of energy, mobility, and information. For this, about 9,300 employees cooperate in a broad range of disciplines in natural sciences, engineering sciences, economics, and the humanities and social sciences. KIT prepares its 24,400 students for responsible tasks in society, industry, and science by offering research-based study programs. Innovation efforts at KIT build a bridge between important scientific findings and their application for the benefit of society, economic prosperity, and the preservation of our natural basis of life. KIT is one of the German universities of excellence.

    In 2014/15, the KIT concentrated on an overarching strategy process to further develop its corporate strategy. This mission statement as the result of a participative process was the first element to be incorporated in the strategy process.

    Mission Statement of KIT

    KIT combines the traditions of a renowned technical university and a major large-scale research institution in a very unique way. In research and education, KIT assumes responsibility for contributing to the sustainable solution of the grand challenges that face the society, industry, and the environment. For this purpose, KIT uses its financial and human resources with maximum efficiency. The scientists of KIT communicate the contents and results of their work to society.

    Engineering sciences, natural sciences, the humanities, and social sciences make up the scope of subjects covered by KIT. In high interdisciplinary interaction, scientists of these disciplines study topics extending from the fundamentals to application and from the development of new technologies to the reflection of the relationship between man and technology. For this to be accomplished in the best possible way, KIT’s research covers the complete range from fundamental research to close-to-industry, applied research and from small research partnerships to long-term large-scale research projects. Scientific sincerity and the striving for excellence are the basic principles of our activities.

    Worldwide exchange of knowledge, large-scale international research projects, numerous global cooperative ventures, and cultural diversity characterize and enrich the life and work at KIT. Academic education at KIT is guided by the principle of research-oriented teaching. Early integration into interdisciplinary research projects and international teams and the possibility of using unique research facilities open up exceptional development perspectives for our students.

    The development of viable technologies and their use in industry and the society are the cornerstones of KIT’s activities. KIT supports innovativeness and entrepreneurial culture in various ways. Moreover, KIT supports a culture of creativity, in which employees and students have time and space to develop new ideas.

    Cooperation of KIT employees, students, and members is characterized by mutual respect and trust. Achievements of every individual are highly appreciated. Employees and students of KIT are offered equal opportunities irrespective of the person. Family-friendliness is a major objective of KIT as an employer. KIT supports the compatibility of job and family. As a consequence, the leadership culture of KIT is also characterized by respect and cooperation. Personal responsibility and self-motivation of KIT employees and members are fostered by transparent and participative decisions, open communication, and various options for life-long learning.

    The structure of KIT is tailored to its objectives in research, education, and innovation. It supports flexible, synergy-based cooperation beyond disciplines, organizations, and hierarchies. Efficient services are rendered to support KIT employees and members in their work.

    Young people are our future. Reliable offers and career options excellently support KIT’s young scientists and professionals in their professional and personal development.

  • richardmitnick 10:46 am on February 20, 2021 Permalink | Reply
    Tags: "Physicists Propose a 'Force Field' to Protect Sensitive Quantum Computers From Noise", "Synthetic magnetic field", A promising method for ensuring a qubit stays fuzzy long enough to be useful is to entangle it with other qubits located elsewhere., , Back in 2001 a trio of researchers - Daniel Gottesman; Alexei Kitaev; and John Preskill - formulated a way to encode this kind of protection into a space as an intrinsic feature of the circuitry., , One way to reduce the risk of “noise” is to build in checks and balances that help to shield the blurred state of reality at the core of quantum computers., , Qubits, RWTH Aachen University [Rheinisch-Westfälische Technische Hochschule Aachen](DE), , The basis for the design is a concept that's nearly 20 years old., This "noise" only gets worse as we grow devices to include more qubits., Too much 'noise' and the delicate state of the system collapses leaving you with a very expensive paperweight.   

    From RWTH Aachen University [Rheinisch-Westfälische Technische Hochschule Aachen](DE) via Science Alert(AU): “Physicists Propose a ‘Force Field’ to Protect Sensitive Quantum Computers From Noise” 

    From RWTH Aachen University [Rheinisch-Westfälische Technische Hochschule Aachen](DE)



    Science Alert(AU)

    19 FEBRUARY 2021

    Credit: oxygen/Moment/Getty Images.

    Creating a quantum computer requires an ability to stroke the edges of reality with the quietest of touches. Too much ‘noise’ and the delicate state of the system collapses, leaving you with a very expensive paperweight.

    One way to reduce the risk of this occurring is to build in checks and balances that help to shield the blurred state of reality at the core of quantum computers – and now scientists have proposed a new way to do just that.

    Theoretical physicists from RWTH Aachen University [Rheinisch-Westfälische Technische Hochschule Aachen](DE) have proposed what’s known as a “synthetic magnetic field”, which they think could help protect the fragile qubits needed in a quantum computer.

    “We have designed a circuit composed of state-of-the-art superconducting circuit elements and a nonreciprocal device, that can be used to passively implement the GKP quantum error-correcting code,” the team writes in Physical Review X.

    The basis for the design is a concept that’s nearly 20 years old (we’ll get to that in a moment), one that simply isn’t feasible based on its requirement of impossibly strong magnetic fields. The new approach attempts to get around this issue.

    Instead of the solid, bit-based language of 1s and 0s that informs the operations of your smartphone or desktop, quantum computing relies on a less binary, far less definitive approach to crunching numbers.

    Quantum bits, or qubits, are the individual units of its language, based on the probabilities of quantum mechanics. String enough together and their seemingly random tumbling sets the foundation for a unique approach to problem solving.

    A qubit is an odd creature though, something that has no real equivalent in our day-to-day experience. Unobserved, it could be simultaneously in the position of 1, 0, or both. But as soon as you look at it, the qubit settles into a single, more mundane state.
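    The standard formalism makes this concrete: a qubit state assigns an amplitude to each outcome, and measurement turns those amplitudes into probabilities. A minimal sketch (toy illustration of the textbook rule, not tied to any specific hardware):

    ```python
    import math

    # A qubit |psi> = alpha|0> + beta|1>. Before measurement it carries
    # both amplitudes; measuring it yields 0 with probability |alpha|^2
    # and 1 with probability |beta|^2, after which the state "settles".

    alpha = 1 / math.sqrt(2)   # equal superposition of 0 and 1
    beta = 1 / math.sqrt(2)

    p0 = abs(alpha) ** 2   # probability of reading out 0
    p1 = abs(beta) ** 2    # probability of reading out 1

    assert abs(p0 + p1 - 1) < 1e-12  # probabilities always sum to 1
    print(p0, p1)  # roughly 0.5 each for this equal superposition
    ```

    The "looking" described above is any interaction that effectively performs this measurement, wanted or not.
    
    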

    In physics, this act of looking doesn’t even need to be an intentional stare. The buzz of electromagnetic radiation, a stray bump of a neighbouring particle… and that qubit can quickly find itself part of the scenery, losing its essential powers of probability.

    This ‘noise’ only gets worse as we grow devices to include more qubits, something that is necessary to make quantum computers powerful enough to be capable of the high-level processing we expect of them.

    A promising method for ensuring a qubit stays fuzzy long enough to be useful is to entangle it with other qubits located elsewhere, meaning its probabilities are now dependent on other, equally fuzzy particles sitting in zones unlikely to be slammed by the same noise.

    If that’s done right, engineers can ensure a level of quantum error correction – an insurance scheme that allows the qubit to cope with the occasional shake, rattle, and roll of surrounding noise.
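    The redundancy idea has a simple classical analogue: store one logical bit in several physical copies and recover it by majority vote. Quantum codes cannot copy states outright, but the same redundancy-plus-vote logic underlies the repetition code for qubits. A minimal classical sketch:

    ```python
    from collections import Counter

    # Classical error correction by redundancy: one logical bit is stored
    # in three physical copies; a single flipped copy is outvoted.

    def encode(bit):
        """Spread one logical bit across three physical copies."""
        return [bit] * 3

    def recover(copies):
        """Majority vote over the physical copies."""
        return Counter(copies).most_common(1)[0][0]

    noisy = encode(1)
    noisy[0] ^= 1          # noise flips one of the three copies
    print(recover(noisy))  # the logical bit survives: 1
    ```

    The quantum versions (including the GKP code below in spirit) detect and undo errors without ever reading out the logical information directly.
    
    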

    And this is where we return to the new paper. Back in 2001, a trio of researchers – Daniel Gottesman, Alexei Kitaev, and John Preskill – formulated a way to encode this kind of protection into a space as an intrinsic feature of the circuitry holding the qubits, potentially allowing for slimmer hardware.

    It became known as the Gottesman-Kitaev-Preskill (GKP) code. There was just one problem – the GKP code relied on confining an electron to just two dimensions using intense, large magnetic fields in a way that just isn’t practical. What’s more, processes for detecting and recovering from errors are also fairly complicated, demanding even more chunks of hardware.
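    For readers curious what the GKP code actually pins down, its standard square-lattice form is textbook material (this is the generic construction, not a result from the new paper):

    ```latex
    % The GKP code stores a qubit in an oscillator whose position \hat{q}
    % and momentum \hat{p} are pinned to a grid: codewords are +1
    % eigenstates of two commuting displacement ("stabilizer") operators
    S_q = e^{\,2i\sqrt{\pi}\,\hat{q}}, \qquad S_p = e^{-2i\sqrt{\pi}\,\hat{p}} .
    % Small drifts in \hat{q} or \hat{p} show up as phases of S_q and S_p
    % and can be measured and undone before they grow into a shift of
    % size \sqrt{\pi}, which would flip the logical qubit.
    ```

    The passive scheme proposed here aims to build this error-detecting structure directly into the circuit rather than running the measure-and-correct cycle by hand.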

    To really get the most out of the GKP code’s benefits, quantum engineers would need a more passive, hands-off approach for shielding and recovering a qubit’s information from noise.

    So in this innovative new proposal, physicists suggest replacing the impossibly large magnetic field with a superconducting circuit comprising components that serve much the same purpose, ironing out the noise.

    The technicalities of the setup aren’t for general reading, but Anja Metelmann at APS Physics does a top job of going through them step-by-step for those eager for details.

    For it to work, there would need to be a way for photons – effectively ripples in the electromagnetic field that carry the electron’s forces – to be manipulated by that very field. Given the photon’s neutrality, this just isn’t a possibility.

    There is a workaround, though. In recent years physicists have found a way to control photons so they can be channelled like electrons, by manipulating the optics of a space so it takes on certain magnetic-like characteristics.

    So-called synthetic magnetic fields permit photons to be directed, giving engineers a way to craft devices in which light waves can be forced to behave more like a current.

    The new paper lays out a way to use this synthetic magnetic field to protect a theoretical single electron in a crystal, confined to a 2D plane. When they ran calculations to see how it would react when subjected to a strong, real magnetic field, which usually would interfere with the system, they showed that their new set-up could protect it.

    “We find that the circuit is naturally protected against the common noise channels in superconducting circuits, such as charge and flux noise, implying that it can be used for passive quantum error correction,” the team explains in their paper.

    Before we get a working prototype of this quantum error-correcting machinery, there are plenty of kinks to work out experimentally. It all looks good on paper, but it remains to be seen whether the technology will cooperate as expected.

    In time, we might have a relatively simple device that turns an impractical – but otherwise efficient – concept for scaling up quantum computers into a real possibility, opening the way for error tolerant technology that has until now been mostly theoretical.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    RWTH Aachen University [Rheinisch-Westfälische Technische Hochschule Aachen] (DE) is a public research university located in Aachen, North Rhine-Westphalia, Germany. With more than 45,000 students enrolled in 144 study programs, it is the largest technical university in Germany.

    In 2007, RWTH Aachen was chosen by the DFG as one of nine German Universities of Excellence for its future concept RWTH 2020: Meeting Global Challenges and additionally won funding for one graduate school and three clusters of excellence.

    RWTH Aachen is a founding member of IDEA League, a strategic alliance of five leading universities of technology in Europe. The university is also a member of TU9, DFG (Deutsche Forschungsgemeinschaft) and the Top Industrial Managers for Europe network.

    On 25 January 1858, Prince Frederick William of Prussia (later German Emperor) was given a donation of 5,000 talers from the Aachener und Münchener Feuer-Versicherungs-Gesellschaft, the precursor of the AachenMünchener insurance company, for charity. In March, the prince chose to use the donation to found the first Prussian institute of technology somewhere in the Rhine province. The seat of the institution remained undecided for years; while the prince initially favored Koblenz, the cities of Aachen, Bonn, Cologne and Düsseldorf also applied, with Aachen and Cologne being the main competitors. Aachen finally won with a financing concept backed by the insurance company and by local banks.

    Groundbreaking for the new Polytechnikum took place on 15 May 1865 and lectures started during the Franco-Prussian War on 10 October 1870 with 223 students and 32 teachers. The new institution had as its primary purpose the education of engineers, especially for the mining industry in the Ruhr area; there were schools of chemistry, electrical and mechanical engineering as well as an introductory general school that taught mathematics and natural sciences and some social sciences.
    Main Building of the RWTH Aachen. It was built in 1870.

    The unclear position of the new Prussian polytechnika (which officially were not universities) affected the first years. Polytechnics lacked prestige in society and the number of students decreased. This began to change in 1880 when the early RWTH, amongst others, was reorganized as a Royal Technical University, gained a seat in the Prussian House of Lords and finally won the right to bestow doctoral degrees (1899) and Diplom titles (introduced in 1902). In the same year, over 800 male students enrolled. In 1909 the first women were admitted, and the artist August von Brandis succeeded Alexander Frenz at the Faculty of Architecture as a “professor of figure and landscape painting”; Brandis became dean in 1929.

    World War I, however, proved a serious setback for the university. Many students voluntarily joined up and died in the war, and parts of the university were briefly occupied or confiscated.

    While the (by then no longer royal) TH Aachen (Technische Hochschule Aachen) flourished in the 1920s with the introduction of more independent faculties, of several new institutes and of the general students’ committee, the first signs of nationalist radicalization also became visible within the university. The Third Reich’s Gleichschaltung of the TH in 1933 met with relatively low resistance from both students and faculty. Beginning in September 1933, Jewish and (alleged) Communist professors (and from 1937 on also students) were systematically persecuted and excluded from the university. Vacant chairs were increasingly given to NSDAP party members or sympathizers. The freedom of research and teaching became severely limited, institutes important for the regime’s plans were systematically established, and existing chairs promoted. Briefly closed in 1939, the TH continued courses in 1940, although with a low number of students. On 21 October 1944, when Aachen capitulated, more than 70% of all buildings of the university were destroyed or heavily damaged.

    After World War II ended in 1945, the university recovered and expanded quickly. In the 1950s, many professors who had been removed because of their alleged affiliation with the Nazi party were allowed to return, and a multitude of new institutes were founded. By the late 1960s, the TH had 10,000 students, making it the largest of all German technical universities. With the foundation of philosophical and medical faculties in 1965 and 1966, respectively, the university became more “universal”. The newly founded faculties in particular began attracting new students, and the number of students almost doubled twice, from 1970 (10,000) to 1980 (more than 25,000) and from 1980 to 1990 (more than 37,000). Now, the number of students is around 42,000, with about one third of all students being women. In relative terms, the most popular study programs are engineering (57%), natural science (23%), economics and humanities (13%) and medicine (7%).

    Recent developments

    “Red lecture hall” at the central campus

    In December 2006, RWTH Aachen and the Sultanate of Oman signed an agreement to establish a private German University of Technology in Muscat. Professors from Aachen aided in developing the curricula for the currently five study programs, and scientific staff took over some of the first courses.

    In 2007, RWTH Aachen was chosen as one of nine German Universities of Excellence for its future concept RWTH 2020: Meeting Global Challenges, earning it the designation of “University of Excellence”. However, although the list of universities honored for their future concepts mostly consists of large and already respected institutions, the Federal Ministry of Education and Research claimed that the initiative aimed at promoting universities with a dedicated future concept so they could continue researching on an international level. Having won funds in all three lines of funding, the process brought RWTH Aachen University an additional total funding of €180 million from 2007–2011. The other two lines of funding were graduate schools, where the Aachen Institute for Advanced Study in Computational Engineering Science received funding, and so-called “clusters of excellence”, where RWTH Aachen managed to win funding for three clusters: Ultra High-Speed Mobile Information and Communication (UMIC), Integrative Production Technology for High-wage Countries and Tailor-Made Fuels from Biomass (TMFB).

    RWTH was selected to receive funding from the German federal and state governments for the third Universities of Excellence funding line starting 2019. RWTH’s proposal was called “The Integrated Interdisciplinary University of Science and Technology – Knowledge. Impact. Networks.” and has secured funding for a seven-year period.

    2019 Clusters of Excellence

    The Fuel Science Center (FSC) Adaptive Conversion Systems for Renewable Energy and Carbon Sources
    Internet of Production
    ML4Q – Matter and Light for Quantum Computing

    RWTH was already awarded funding in the first and second Universities of Excellence funding lines, in 2007 and 2012 respectively.

  • richardmitnick 11:56 am on February 10, 2021 Permalink | Reply
    Tags: "Quantum Photons", Another approach developed more recently is to use a photon as an optical qubit to encode quantum information., , Most efforts to build quantum computers have relied on qubits created in superconducting wires chilled to near absolute zero or on trapped ions held in place by lasers., , Qubits, To place an entire quantum photonics system onto a chip measuring about one square centimeter would be a tremendous achievement.,   

    From UC Santa Barbara: “Quantum Photons” 

    UC Santa Barbara Name bloc
    From UC Santa Barbara

    February 9, 2021

    Contact Info:
    Shelly Leachman
    (805) 893-8726

    Written by James Badham

    Galan Moody receives a new grant to develop a testbed for photonic-based quantum computing.

    Concept illustration depicting an integrated photonic quantum processor: Laser light coupled into the channels interacts with the rings (foreground) to create pairs of entangled photons (red). The entangled photons split and travel throughout the photonic circuit (background), which controls effective interactions between them, enabling optical quantum computations. Credit: Lillian McKinney.

    Classical computing is built on the power of the bit, which is, in essence, a micro transistor on a chip that can be either on or off, representing a 1 or a 0 in binary code. The quantum computing equivalent is the qubit. Unlike bits, qubits can exist in more than one “state” at a time, enabling quantum computers to perform computational functions exponentially faster than classical computers can.
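    The bit-versus-qubit distinction can be made concrete with a few lines of NumPy. The state vector, gate, and variable names below are generic textbook illustrations, not tied to any particular hardware:

```python
import numpy as np

# A classical bit is either 0 or 1. A qubit is a normalized complex
# 2-vector alpha|0> + beta|1>, measured as 0 with probability |alpha|^2
# and as 1 with probability |beta|^2.
ket0 = np.array([1.0, 0.0])

# The Hadamard gate turns a definite |0> into an equal superposition.
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)

state = H @ ket0              # (|0> + |1>) / sqrt(2)
probs = np.abs(state) ** 2    # measurement probabilities

print(probs)  # [0.5 0.5] -- equal chance of reading out 0 or 1
```

A classical bit in the same situation would have to commit to one value; the qubit carries both amplitudes until it is measured.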

    To date, most efforts to build quantum computers have relied on qubits created in superconducting wires chilled to near absolute zero or on trapped ions held in place by lasers. But those approaches face certain challenges, most notably that the qubits are highly sensitive to environmental factors. As the number of qubits increases, those factors are more likely to compound and interrupt the entanglement of qubits required for a quantum computer to work.

    Another approach, developed more recently, is to use a photon as an optical qubit to encode quantum information and to integrate the components necessary for that process into a photonic integrated circuit (PIC). Galan Moody, an assistant professor in the UC Santa Barbara College of Engineering’s Department of Electrical and Computer Engineering (ECE), has received a Defense University Research Instrumentation Program (DURIP) Award from the U.S. Department of Defense and the Air Force Office of Scientific Research to build a quantum photonic computing testbed. He will conduct his research in a lab set aside for such activity in recently completed Henley Hall, the new home of the College of Engineering’s Institute for Energy Efficiency (IEE).

    The grant supports the development or acquisition of new instrumentation to be used in fundamental and applied research across all areas of science and engineering. “My field is quantum photonics, so we’re working to develop new types of quantum light sources and ways to manipulate and detect quantum states of light for use in such applications as quantum photonic computing and quantum communications,” Moody said.

    “At a high level,” he explained, the concept of quantum photonic computing is “exactly the same as what Google is doing with superconducting qubits or what other companies are doing with trapped ions. There are a lot of different platforms for computing, and one of them is to use photonic integrated circuits to generate entangled photons, entanglement being the foundation for many different quantum applications.”

    To place an entire quantum photonics system onto a chip measuring about one square centimeter would be a tremendous achievement. Fortunately, the well-developed photonics infrastructure — including AIM Photonics, which has a center at UCSB led by ECE professor and photonics pioneer John Bowers, also director of the IEE — lends itself to that pursuit and to scaling up whatever quantum photonics platform is most promising. Photonics for classical applications is a mature technology industry that, Moody said, “has basically mastered large-scale and wafer-scale fabrication of devices.”

    It is reliable, so whatever Moody and his team design, they can fabricate themselves or even order from foundries, knowing they will get exactly what they want.

    The Photonic Edge

    The process of creating photonic qubits begins with generating high-quality single photons or pairs of entangled photons. A qubit can then be defined in several different ways, most often in the photon’s polarization (the orientation of the optical wave) or in the path that the photons travel. Moody and his team can create PICs that control these aspects of the photons, which become the carriers of quantum information and can be manipulated to perform logic operations.
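    Path encoding in particular is easy to sketch numerically: the qubit’s two basis states are “photon in the top waveguide” and “photon in the bottom waveguide”, and an on-chip directional coupler acts as a single-qubit gate. The 50/50 coupler matrix below is a standard textbook form; the names are illustrative and this is not a model of the Moody lab’s specific circuits:

```python
import numpy as np

# Path encoding: |0> = photon in the top waveguide,
#                |1> = photon in the bottom waveguide.
top = np.array([1.0, 0.0], dtype=complex)

# A 50/50 directional coupler mixes the two paths -- the photonic
# analogue of a single-qubit gate (a unitary 2x2 matrix).
coupler = np.array([[1, 1j],
                    [1j, 1]], dtype=complex) / np.sqrt(2)

out = coupler @ top
probs = np.abs(out) ** 2   # detection probability in each waveguide

print(probs.round(3))  # [0.5 0.5] -- the photon exits in a superposition of paths
```

Chaining such couplers and phase shifters is how a photonic circuit implements the logic operations mentioned above.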

    The approach has several advantages over other methods of creating qubits. For instance, the aforementioned environmental effects that can cause qubits to lose their coherence do not affect coherence in photons, which, Moody says, “can maintain that entanglement for a very long time. The challenge is not coherence but, rather, getting the photons to become entangled in the first place.”

    “That,” Moody notes, “is because photons don’t naturally interact; rather, they pass right through each other and go their separate ways. But they have to interact in some way to create an entangled state. We’re working on how to create PIC-based quantum light sources that produce high-quality photons as efficiently as possible and then how to get all the photons to interact in a way that allows us to build a scalable quantum processor or new devices for long-distance quantum communications.”
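    What an entangled two-photon state looks like, and why it cannot be split into two independent photons, can be checked numerically with a Schmidt-rank test (a standard linear-algebra criterion; the specific state below is a generic example, not the output of the sources described here):

```python
import numpy as np

# A path-entangled photon pair: (|00> + |11>) / sqrt(2).
ket00 = np.kron([1, 0], [1, 0])
ket11 = np.kron([0, 1], [0, 1])
pair = (ket00 + ket11) / np.sqrt(2)

# Schmidt rank: reshape the two-photon amplitudes into a 2x2 matrix and
# count its nonzero singular values. Rank 1 = product state (the photons
# are independent); rank > 1 = entangled.
schmidt_rank = np.linalg.matrix_rank(pair.reshape(2, 2))
print(schmidt_rank)  # 2 -> the photons are entangled
```

Producing states like this on-chip, efficiently, is exactly the interaction problem Moody describes.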

    Quantum computers are extremely efficient, and the photonics approach to quantum technologies is even more so. When Google “demonstrated quantum supremacy” in fall 2019 using the quantum computer built in its Goleta laboratory under the leadership of UCSB physics professor John Martinis, the company claimed that its machine, named Sycamore, could do a series of test calculations in 200 seconds that a supercomputer would need closer to 10,000 years to complete. Recently, a Chinese team using a laboratory-scale table-top experiment claimed that, with a photon-based quantum processor, “You could do in two hundred seconds what would take a supercomputer 2.5 billion years to accomplish,” Moody said.

    Another advantage is that photonics is naturally scalable to thousands and, eventually, millions of components, which can be done by leveraging the wafer-scale fabrication technologies developed for classical photonics. Today, the most advanced PICs comprise nearly five thousand components and could be expanded by a factor of two or four with existing fabrication technologies, a stage of development comparable to that of digital electronics in the 1960s and 1970s. “Even a few hundred components are enough to perform important quantum computing operations with light, at least on a small scale between a few qubits,” said Moody. With further development, quantum photonic chips can be scaled to tens or hundreds of qubits using the existing photonics infrastructure.

    Moody’s team is developing a new materials platform, based on gallium arsenide and silicon dioxide, to generate single and entangled photons, and it promises to be much more efficient than comparable systems. In fact, they have a forthcoming paper showing that their new quantum light source is nearly a thousand times more efficient than any other on-chip light source.

    In terms of the process, Moody says, “At the macro level, we work on making better light sources and integrating many of them onto a chip. Then, we combine these with on-chip programmable processors, analogous to electronic transistors used for classical logic operations, and with arrays of single-photon detectors to try to implement quantum logic operations with photons as efficiently as possible.”

    For more accessible applications, like communications, no computing need occur. “It involves taking a great light source and manipulating a property of the photon states (such as polarization), then sending those off to some other chip that’s up in a satellite or in some other part of the world, which can measure the photons and send a signal back that you can collect,” Moody said.

    One catch, for now, is that the single-photon detectors, which are used to signal whether the logic operations were performed, work with very high efficiency when they are on the chip; however, some of them work only if the chip is cooled to cryogenic temperatures.

    “If we want to integrate everything on chip and put detectors on chip as well, then we’re going to need to cool the whole thing down,” Moody said. “We’re going to build a setup to be able to do that and test the various quantum photonic components designed and fabricated for this. The DURIP award enables exactly this: developing the instrumentation to be able to test large-scale quantum photonic chips from cryogenic temperatures all the way up to room temperature.”

    There are also challenges associated with cooling the chip to cryogenic temperatures. Said Moody, “It’s getting this whole platform up and running, interfacing the instrumentation, and making all the custom parts we need to be able to look at large-scale photonic chips for quantum applications at cryogenic temperatures.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition
    The University of California, Santa Barbara (commonly referred to as UC Santa Barbara or UCSB) is a public research university and one of the 10 general campuses of the University of California system. Founded in 1891 as an independent teachers’ college, UCSB joined the University of California system in 1944 and is the third-oldest general-education campus in the system. The university is a comprehensive doctoral university and is organized into five colleges offering 87 undergraduate degrees and 55 graduate degrees. In 2012, UCSB was ranked 41st among “National Universities” and 10th among public universities by U.S. News & World Report. UCSB houses twelve national research centers, including the renowned Kavli Institute for Theoretical Physics.

  • richardmitnick 9:44 pm on February 9, 2021 Permalink | Reply
    Tags: A Monte Carlo simulation technique, D-Wave processors are now being used to simulate magnetic systems of practical interest., Fractional magnetization plateaus, Magnetic structure, Quantum annealing-a form of quantum computing, Quantum Science Center-a DOE Quantum Information Science Research Center established at ORNL in 2020., Qubits, Shastry-Sutherland Ising model, Their novel simulations will serve as a foundation to streamline future efforts on next-generation quantum computers.

    From DOE’s Oak Ridge National Laboratory: “Quantum computing enables simulations to unravel mysteries of magnetic materials” 

    From DOE’s Oak Ridge National Laboratory

    February 9, 2021
    Scott S Jones

    The researchers embedded a programmable model into a D-Wave quantum computer chip. Credit: D-Wave.

    A multi-institutional team became the first to generate accurate results from materials science simulations on a quantum computer that can be verified with neutron scattering experiments and other practical techniques.

    Researchers from the Department of Energy’s Oak Ridge National Laboratory; the University of Tennessee, Knoxville; Purdue University and D-Wave Systems harnessed the power of quantum annealing, a form of quantum computing, by embedding an existing model into a quantum computer.

    Characterizing materials has long been a hallmark of classical supercomputers, which encode information using a binary system of bits that are each assigned a value of either 0 or 1. But quantum computers – in this case, D-Wave’s 2000Q – rely on qubits, which can be valued at 0, 1 or both simultaneously because of a quantum mechanical capability known as superposition.

    “The underlying method behind solving materials science problems on quantum computers had already been developed, but it was all theoretical,” said Paul Kairys, a student at UT Knoxville’s Bredesen Center for Interdisciplinary Research and Graduate Education who led ORNL’s contributions to the project. “We developed new solutions to enable materials simulations on real-world quantum devices.”

    This unique approach proved that quantum resources are capable of studying the magnetic structure and properties of these materials, which could lead to a better understanding of spin liquids, spin ices and other novel phases of matter useful for data storage and spintronics applications. The researchers published the results of their simulations — which matched theoretical predictions and strongly resembled experimental data — in PRX Quantum.

    Eventually, the power and robustness of quantum computers could enable these systems to outperform their classical counterparts in terms of both accuracy and complexity, providing precise answers to materials science questions instead of approximations. However, quantum hardware limitations previously made such studies difficult or impossible to complete.

    To overcome these limitations, the researchers programmed various parameters into the Shastry-Sutherland Ising model. Because it shares striking similarities with the rare earth tetraborides, a class of magnetic materials, subsequent simulations using this model could provide substantial insights into the behavior of these tangible substances.

    “We are encouraged that the novel quantum annealing platform can directly help us understand materials with complicated magnetic phases, even those that have multiple defects,” said co-corresponding author Arnab Banerjee, an assistant professor at Purdue. “This capability will help us make sense of real material data from a variety of neutron scattering, magnetic susceptibility and heat capacity experiments, which can be very difficult otherwise.”

    Using the D-Wave chip (foreground), the team simulated the experimental signature of a sample material (background), producing results that are directly comparable to the output from real-world experiments. Credit: Paul Kairys/UT Knoxville.

    Magnetic materials can be described in terms of magnetic particles called spins. Each spin has a preferred orientation based on the behavior of its neighboring spins, but rare earth tetraborides are frustrated, meaning these orientations are incompatible with each other. As a result, the spins are forced to compromise on a collective configuration, leading to exotic behavior such as fractional magnetization plateaus. This peculiar behavior occurs when an applied magnetic field, which normally causes all spins to point in one direction, affects only some spins in the usual way while others point in the opposite direction instead.

    Using a Monte Carlo simulation technique powered by the quantum evolution of the Ising model, the team evaluated this phenomenon in microscopic detail.
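    A full quantum-annealing embedding is beyond a few lines, but the Monte Carlo side of the story can be caricatured classically: even a single antiferromagnetic Ising triangle in a field is frustrated and settles on a fractional magnetization plateau. The couplings, field and temperature below are hypothetical toy values, and this triangle is a deliberately tiny stand-in for the Shastry-Sutherland lattice:

```python
import math
import random

# Metropolis Monte Carlo for a frustrated Ising unit: three spins on a
# triangle with antiferromagnetic bonds, in an applied field.
J, H_FIELD, T = 1.0, 1.0, 0.05          # coupling, field, temperature (toy values)
BONDS = [(0, 1), (1, 2), (0, 2)]

def energy(s):
    return J * sum(s[i] * s[j] for i, j in BONDS) - H_FIELD * sum(s)

random.seed(1)
spins = [random.choice([-1, 1]) for _ in range(3)]
mags = []
for step in range(20000):
    i = random.randrange(3)
    old = energy(spins)
    spins[i] *= -1                       # propose flipping one spin
    # Metropolis rule: always accept downhill moves, accept uphill
    # moves with probability exp(-dE / T).
    if random.random() >= math.exp(min(0.0, (old - energy(spins)) / T)):
        spins[i] *= -1                   # reject: undo the flip
    if step > 5000:                      # measure after burn-in
        mags.append(sum(spins) / 3)

# The frustrated triangle compromises on two spins up, one down,
# giving the fractional magnetization plateau m = 1/3.
print(round(sum(mags) / len(mags), 2))
```

The quantum-annealing study does the analogous sampling on thousands of spins at once, with the D-Wave hardware supplying the quantum evolution.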

    “We came up with new ways to represent the boundaries, or edges, of the material to trick the quantum computer into thinking that the material was effectively infinite, and that turned out to be crucial for correctly answering materials science questions,” said co-corresponding author Travis Humble. Humble is an ORNL researcher and deputy director of the Quantum Science Center, or QSC, a DOE Quantum Information Science Research Center established at ORNL in 2020. The individuals and institutions involved in this research are QSC members.

    Quantum resources have previously simulated small molecules to examine chemical or material systems. Yet, studying magnetic materials that contain thousands of atoms is possible because of the size and versatility of D-Wave’s quantum device.

    “D-Wave processors are now being used to simulate magnetic systems of practical interest, resembling real compounds. This is a big deal and takes us from the notepad to the lab,” said Andrew King, director of performance research at D-Wave. “The ultimate goal is to study phenomena that are intractable for classical computing and outside the reach of known experimental methods.”

    The researchers anticipate that their novel simulations will serve as a foundation to streamline future efforts on next-generation quantum computers. In the meantime, they plan to conduct related research through the QSC, from testing different models and materials to performing experimental measurements to validate the results.

    “We completed the largest simulation possible for this model on the largest quantum computer available at the time, and the results demonstrated the significant promise of using these techniques for materials science studies going forward,” Kairys said.

    This work was funded by the DOE Office of Science Early Career Research Program. Access to the D-Wave 2000Q system was provided through the Quantum Computing User Program managed by the Oak Ridge Leadership Computing Facility, a DOE Office of Science user facility located at ORNL.

    Research performed at ORNL’s Spallation Neutron Source, also a DOE Office of Science user facility located at ORNL, was supported by the DOE Office of Science.

    ORNL Spallation Neutron Source.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    ORNL is managed by UT-Battelle for the Department of Energy’s Office of Science. DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.

