Tagged: Qubits

  • richardmitnick 7:57 pm on August 3, 2021 Permalink | Reply
    Tags: A tedious hours-long process has been cut down to seconds and LFET is the first scalable transport and on-demand assembly technology of its kind., LFET: low frequency electrothermoplasmonic tweezer, Quantum photonics applications, Qubits, The scientists set out to make trapping and manipulating nanodiamonds simpler by using an interdisciplinary approach., The tweezer-a low frequency electrothermoplasmonic tweezer (LFET)-combines a fraction of a laser beam with a low-frequency alternating current electric field., This is an entirely new mechanism to trap and move nanodiamonds.

    From Vanderbilt University (US) : “Research Snapshot: Vanderbilt engineer the first to introduce low-power dynamic manipulation of single nanoscale quantum objects” 

    From Vanderbilt University (US)

    Jul. 30, 2021
    Marissa Shapiro

    Low frequency electrothermoplasmonic tweezer device rendering. (Ndukaife.)

    THE IDEA

    Led by Justus Ndukaife, assistant professor of electrical engineering, Vanderbilt researchers are the first to introduce an approach for trapping and moving a nanomaterial known as a single colloidal nanodiamond with a nitrogen-vacancy center using a low-power laser beam. The width of a single human hair is approximately 90,000 nanometers; nanodiamonds are less than 100 nanometers. These carbon-based materials are among the few that can release the basic unit of all light (a single photon), a building block for future quantum photonics applications, Ndukaife explains.

    Currently it is possible to trap nanodiamonds using light fields focused near nano-sized metallic surfaces, but it is not possible to move them that way because laser beam spots are simply too big. Using an atomic force microscope, it takes scientists hours to push nanodiamonds into place one at a time near an emission enhancing environment to form a useful structure. Further, to create entangled sources and qubits—key elements that improve the processing speeds of quantum computers—several nanodiamond emitters are needed close together so that they can interact to make qubits, Ndukaife said.

    “We set out to make trapping and manipulating nanodiamonds simpler by using an interdisciplinary approach,” Ndukaife said. “Our tweezer, a low-frequency electrothermoplasmonic tweezer (LFET), combines a fraction of a laser beam with a low-frequency alternating current electric field. This is an entirely new mechanism to trap and move nanodiamonds.” A tedious, hours-long process has been cut down to seconds, and LFET is the first scalable transport and on-demand assembly technology of its kind.

    WHY IT MATTERS

    Ndukaife’s work is a key ingredient for quantum computing, a technology that will soon enable a huge number of applications from high resolution imaging to the creation of unhackable systems and ever smaller devices and computer chips. In 2019, the Department of Energy invested $60.7 million in funding to advance the development of quantum computing and networking.

    “Controlling nanodiamonds to make efficient single photon sources that can be used for these kinds of technologies will shape the future,” Ndukaife said. “To enhance quantum properties, it is essential to couple quantum emitters such as nanodiamonds with nitrogen-vacancy centers to nanophotonic structures.”

    WHAT’S NEXT

    Ndukaife intends to further explore nanodiamonds, arranging them onto nanophotonic structures designed to enhance their emission performance. With them in place, his lab will explore the possibilities for ultrabright single photon sources and entanglement in an on-chip platform for information processing and imaging.

    “There are so many things we can use this research to build upon,” Ndukaife said. “This is the first technique that allows us to dynamically manipulate single nanoscale objects in two dimensions using a low power laser beam.”

    Science paper:
    Nano Letters

    Coauthored by graduate students in Ndukaife’s lab, Chuchuan Hong and Sen Yang, as well as their collaborator, Ivan Kravchenko at DOE’s Oak Ridge National Laboratory (US).

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Commodore Cornelius Vanderbilt was in his 79th year when he decided to make the gift that founded Vanderbilt University (US) in the spring of 1873.
    The $1 million that he gave to endow and build the university was the commodore’s only major philanthropy. Methodist Bishop Holland N. McTyeire of Nashville, husband of Amelia Townsend who was a cousin of the commodore’s young second wife Frank Crawford, went to New York for medical treatment early in 1873 and spent time recovering in the Vanderbilt mansion. He won the commodore’s admiration and support for the project of building a university in the South that would “contribute to strengthening the ties which should exist between all sections of our common country.”

    McTyeire chose the site for the campus, supervised the construction of buildings and personally planted many of the trees that today make Vanderbilt a national arboretum. At the outset, the university consisted of one Main Building (now Kirkland Hall), an astronomical observatory and houses for professors. Landon C. Garland was Vanderbilt’s first chancellor, serving from 1875 to 1893. He advised McTyeire in selecting the faculty, arranged the curriculum and set the policies of the university.

    For the first 40 years of its existence, Vanderbilt was under the auspices of the Methodist Episcopal Church, South. The Vanderbilt Board of Trust severed its ties with the church in June 1914 as a result of a dispute with the bishops over who would appoint university trustees.

    From the outset, Vanderbilt met two definitions of a university: It offered work in the liberal arts and sciences beyond the baccalaureate degree and it embraced several professional schools in addition to its college. James H. Kirkland, the longest serving chancellor in university history (1893-1937), followed Chancellor Garland. He guided Vanderbilt to rebuild after a fire in 1905 that consumed the main building, which was renamed in Kirkland’s honor, and all its contents. He also navigated the university through the separation from the Methodist Church. Notable advances in graduate studies were made under the third chancellor, Oliver Cromwell Carmichael (1937-46). He also created the Joint University Library, brought about by a coalition of Vanderbilt, Peabody College and Scarritt College.

    Remarkable continuity has characterized the government of Vanderbilt. The original charter, issued in 1872, was amended in 1873 to make the legal name of the corporation “The Vanderbilt University.” The charter has not been altered since.

    The university is self-governing under a Board of Trust that, since the beginning, has elected its own members and officers. The university’s general government is vested in the Board of Trust. The immediate government of the university is committed to the chancellor, who is elected by the Board of Trust.

    The original Vanderbilt campus consisted of 75 acres. By 1960, the campus had spread to about 260 acres of land. When George Peabody College for Teachers merged with Vanderbilt in 1979, about 53 acres were added.

    Vanderbilt’s student enrollment tended to double itself each 25 years during the first century of the university’s history: 307 in the fall of 1875; 754 in 1900; 1,377 in 1925; 3,529 in 1950; 7,034 in 1975. In the fall of 1999 the enrollment was 10,127.

    In the planning of Vanderbilt, the assumption seemed to be that it would be an all-male institution. Yet the board never enacted rules prohibiting women. At least one woman attended Vanderbilt classes every year from 1875 on. Most came to classes by courtesy of professors or as special or irregular (non-degree) students. From 1892 to 1901 women at Vanderbilt gained full legal equality except in one respect — access to dorms. In 1894 the faculty and board allowed women to compete for academic prizes. By 1897, four or five women entered with each freshman class. By 1913 the student body contained 78 women, or just more than 20 percent of the academic enrollment.

    National recognition of the university’s status came in 1949 with election of Vanderbilt to membership in the select Association of American Universities (US). In the 1950s Vanderbilt began to outgrow its provincial roots and to measure its achievements by national standards under the leadership of Chancellor Harvie Branscomb. By its 90th anniversary in 1963, Vanderbilt for the first time ranked in the top 20 private universities in the United States.

    Vanderbilt continued to excel in research, and the number of university buildings more than doubled under the leadership of Chancellors Alexander Heard (1963-1982) and Joe B. Wyatt (1982-2000), only the fifth and sixth chancellors in Vanderbilt’s long and distinguished history. Heard added three schools (Blair, the Owen Graduate School of Management and Peabody College) to the seven already existing and constructed three dozen buildings. During Wyatt’s tenure, Vanderbilt acquired or built one-third of the campus buildings and made great strides in diversity, volunteerism and technology.

    The university grew and changed significantly under its seventh chancellor, Gordon Gee, who served from 2000 to 2007. Vanderbilt led the country in the rate of growth for academic research funding, which increased to more than $450 million, and became one of the most selective undergraduate institutions in the country.

    On March 1, 2008, Nicholas S. Zeppos was named Vanderbilt’s eighth chancellor after serving as interim chancellor beginning Aug. 1, 2007. Prior to that, he spent 2002-2008 as Vanderbilt’s provost, overseeing undergraduate, graduate and professional education programs as well as development, alumni relations and research efforts in liberal arts and sciences, engineering, music, education, business, law and divinity. He first came to Vanderbilt in 1987 as an assistant professor in the law school. In his first five years, Zeppos led the university through the most challenging economic times since the Great Depression, while continuing to attract the best students and faculty from across the country and around the world. Vanderbilt got through the economic crisis notably less scathed than many of its peers and began and remained committed to its much-praised enhanced financial aid policy for all undergraduates during the same timespan. The Martha Rivers Ingram Commons for first-year students opened in 2008 and College Halls, the next phase in the residential education system at Vanderbilt, is on track to open in the fall of 2014. During Zeppos’ first five years, Vanderbilt has drawn robust support from federal funding agencies, and the Medical Center entered into agreements with regional hospitals and health care systems in middle and east Tennessee that will bring Vanderbilt care to patients across the state.

    Today, Vanderbilt University is a private research university of about 6,500 undergraduates and 5,300 graduate and professional students. The university comprises 10 schools, a public policy center and The Freedom Forum First Amendment Center. Vanderbilt offers undergraduate programs in the liberal arts and sciences, engineering, music, education and human development as well as a full range of graduate and professional degrees. The university is consistently ranked as one of the nation’s top 20 universities by publications such as U.S. News & World Report, with several programs and disciplines ranking in the top 10.

    Cutting-edge research and liberal arts, combined with strong ties to a distinguished medical center, creates an invigorating atmosphere where students tailor their education to meet their goals and researchers collaborate to solve complex questions affecting our health, culture and society.

    Vanderbilt, an independent, privately supported university, and the separate, non-profit Vanderbilt University Medical Center share a respected name and enjoy close collaboration through education and research. Together, the number of people employed by these two organizations exceeds that of the largest private employer in the Middle Tennessee region.

     
  • richardmitnick 11:24 am on June 2, 2021 Permalink | Reply
    Tags: "UArizona Engineers Demonstrate a Quantum Advantage", How (and When) Quantum Works, Quantum computing and quantum sensing have the potential to be vastly more powerful than their classical counterparts., Qubits, The technology isn't quite there yet, UArizona College of Engineering, UArizona College of Optical Sciences

    From University of Arizona (US) : “UArizona Engineers Demonstrate a Quantum Advantage” 

    From University of Arizona (US)

    6.1.21

    Emily Dieckman
    College of Engineering
    edieckman@email.arizona.edu
    520-621-1992
    760-981-8808

    In a new paper, researchers in the College of Engineering and James C. Wyant College of Optical Sciences experimentally demonstrate how quantum resources aren’t just dreams for the distant future – they can improve the technology of today.

    Quantum computing and quantum sensing have the potential to be vastly more powerful than their classical counterparts. Not only could a fully realized quantum computer take just seconds to solve equations that would take a classical computer thousands of years, but it could have incalculable impacts on areas ranging from biomedical imaging to autonomous driving.

    However, the technology isn’t quite there yet.

    In fact, despite widespread theories about the far-reaching impact of quantum technologies, very few researchers have been able to demonstrate, using the technology available now, that quantum methods have an advantage over their classical counterparts.

    In a paper published on June 1 in the journal Physical Review X, University of Arizona researchers experimentally show that quantum methods have an advantage over their classical counterparts.

    Quntao Zhuang (left), PI of the Quantum Information Theory Group, and Zheshen Zhang, PI of the Quantum Information and Materials Group, are both assistant professors in the College of Engineering.

    “Demonstrating a quantum advantage is a long-sought-after goal in the community, and very few experiments have been able to show it,” said paper co-author Zheshen Zhang, assistant professor of materials science and engineering and principal investigator of the UArizona Quantum Information and Materials Group. “We are seeking to demonstrate how we can leverage the quantum technology that already exists to benefit real-world applications.”

    How (and When) Quantum Works

    Quantum computing and other quantum processes rely on tiny, powerful units of information called qubits. The classical computers we use today work with units of information called bits, which exist as either 0s or 1s, but qubits are capable of existing in both states at the same time. This duality makes them both powerful and fragile. The delicate qubits are prone to collapse without warning, making a process called error correction – which addresses such problems as they happen – very important.
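    The superposition described above can be sketched numerically. This is an illustrative model only (plain Python, no quantum hardware or quantum library): a qubit is represented as two complex amplitudes, and measurement yields 0 or 1 with probabilities given by the squared magnitudes of those amplitudes.

```python
import math

# Illustrative sketch: a qubit as a normalized 2-component state vector
# [amplitude of |0>, amplitude of |1>]. Measuring collapses the state,
# returning 0 or 1 with probabilities |a|^2 and |b|^2 respectively.

def measurement_probs(state):
    """Return (P(measure 0), P(measure 1)) for a qubit state [a, b]."""
    a, b = state
    return abs(a) ** 2, abs(b) ** 2

# Equal superposition (|0> + |1>)/sqrt(2): both outcomes equally likely.
plus = [1 / math.sqrt(2), 1 / math.sqrt(2)]
p0, p1 = measurement_probs(plus)
print(p0, p1)  # each is 0.5, up to floating-point rounding
```

    A classical bit is restricted to the states [1, 0] or [0, 1]; any other normalized pair of amplitudes is a superposition with no classical counterpart, which is what makes qubits both powerful and, as the article notes, fragile.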

    The quantum field is now in an era that John Preskill, a renowned physicist from the California Institute of Technology (US), termed “noisy intermediate scale quantum,” or NISQ. In the NISQ era, quantum computers can perform tasks that only require about 50 to a few hundred qubits, though with a significant amount of noise, or interference. Any more than that and the noisiness overpowers the usefulness, causing everything to collapse. It is widely believed that 10,000 to several million qubits would be needed to carry out practically useful quantum applications.

    Imagine inventing a system that guarantees every meal you cook will turn out perfectly, and then giving that system to a group of children who don’t have the right ingredients. It will be great in a few years, once the kids become adults and can buy what they need. But until then, the usefulness of the system is limited. Similarly, until researchers advance the field of error correction, which can reduce noise levels, quantum computations are limited to a small scale.

    Entanglement Advantages

    The experiment described in the paper used a mix of both classical and quantum techniques. Specifically, it used three sensors to classify the average amplitude and angle of radio frequency signals.

    The sensors were equipped with another quantum resource called entanglement, which allows them to share information with one another and provides two major benefits: First, it improves the sensitivity of the sensors and reduces errors. Second, because they are entangled, the sensors evaluate global properties rather than gathering data about specific parts of a system. This is useful for applications that only need a binary answer; for example, in medical imaging, researchers don’t need to know about every single cell in a tissue sample that isn’t cancerous – just whether there’s one cell that is cancerous. The same concept applies to detecting hazardous chemicals in drinking water.

    The experiment demonstrated that equipping the sensors with quantum entanglement gave them an advantage over classical sensors, reducing the likelihood of errors by a small but critical margin.
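    The size of that advantage can be illustrated with the textbook scaling laws for distributed sensing. This toy calculation is not the paper's protocol or its measured margin: it only contrasts the standard quantum limit for N independent classical sensors (error falling as 1/sqrt(N)) with the Heisenberg-limit scaling (error falling as 1/N) that entangled sensors can in principle approach.

```python
import math

# Toy comparison (not the experiment's actual numbers): how estimation
# error shrinks with N sensors under the two textbook scaling laws.

def classical_error(n, base_err=1.0):
    """Standard quantum limit: N independent sensors average down as 1/sqrt(N)."""
    return base_err / math.sqrt(n)

def entangled_error(n, base_err=1.0):
    """Heisenberg-limit scaling that entangled sensors can approach: 1/N."""
    return base_err / n

for n in (1, 3, 9):
    print(n, round(classical_error(n), 3), round(entangled_error(n), 3))
```

    Even at three sensors, the number used in the experiment, the ideal entangled scaling predicts a noticeably smaller error than three independent sensors, consistent with the small but critical margin described above.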

    “This idea of using entanglement to improve sensors is not limited to a specific type of sensor, so it could be used for a range of different applications, as long as you have the equipment to entangle the sensors,” said study co-author Quntao Zhuang, assistant professor of electrical and computer engineering and principal investigator of the Quantum Information Theory Group. “In theory, you could consider applications like lidar (Light Detection and Ranging) for self-driving cars, for example.”

    Zhuang and Zhang developed the theory behind the experiment and described it in a 2019 Physical Review X paper. They co-authored the new paper with lead author Yi Xia, a doctoral student in the James C. Wyant College of Optical Sciences, and Wei Li, a postdoctoral researcher in materials science and engineering.

    Qubit Classifiers

    There are existing applications that use a mix of quantum and classical processing in the NISQ era, but they rely on preexisting classical datasets that must be converted and classified in the quantum realm. Imagine taking a series of photos of cats and dogs, then uploading the photos into a system that uses quantum methods to label the photos as either “cat” or “dog.”

    The team is tackling the labeling process from a different angle, by using quantum sensors to gather their own data in the first place. It’s more like using a specialized quantum camera that labels the photos as either “dog” or “cat” as the photos are taken.

    “A lot of algorithms consider data stored on a computer disk, and then convert that into a quantum system, which takes time and effort,” Zhuang said. “Our system works on a different problem by evaluating physical processes that are happening in real time.”

    The team is excited for future applications of their work at the intersection of quantum sensing and quantum computing. They even envision one day integrating their entire experimental setup onto a chip that could be dipped into a biomaterial or water sample to identify disease or harmful chemicals.

    “We think it’s a new paradigm for both quantum computing, quantum machine learning and quantum sensors, because it really creates a bridge to interconnect all these different domains,” Zhang said.

    See the full article here.


    Please help promote STEM in your local schools.


    Stem Education Coalition

    As of 2019, the University of Arizona (US) enrolled 45,918 students in 19 separate colleges/schools, including the UArizona College of Medicine in Tucson and Phoenix and the James E. Rogers College of Law, and is affiliated with two academic medical centers (Banner – University Medical Center Tucson and Banner – University Medical Center Phoenix). UArizona is one of three universities governed by the Arizona Board of Regents. The university is part of the Association of American Universities and is the only member from Arizona, and also part of the Universities Research Association(US). The university is classified among “R1: Doctoral Universities – Very High Research Activity”.

    Known as the Arizona Wildcats (often shortened to “Cats”), the UArizona’s intercollegiate athletic teams are members of the Pac-12 Conference of the NCAA. UArizona athletes have won national titles in several sports, most notably men’s basketball, baseball, and softball. The official colors of the university and its athletic teams are cardinal red and navy blue.

    After the passage of the Morrill Land-Grant Act of 1862, the push for a university in Arizona grew. The Arizona Territory’s “Thieving Thirteenth” Legislature approved the UArizona in 1885 and selected the city of Tucson to receive the appropriation to build the university. Tucson hoped to receive the appropriation for the territory’s mental hospital, which carried a $100,000 allocation instead of the $25,000 allotted to the territory’s only university (Arizona State University(US) was also chartered in 1885, but it was created as Arizona’s normal school, and not a university). Flooding on the Salt River delayed Tucson’s legislators, and by the time they reached Prescott, back-room deals allocating the most desirable territorial institutions had been made. Tucson was largely disappointed with receiving what was viewed as an inferior prize.

    With no parties willing to provide land for the new institution, the citizens of Tucson prepared to return the money to the Territorial Legislature until two gamblers and a saloon keeper decided to donate the land to build the school. Construction of Old Main, the first building on campus, began on October 27, 1887, and classes met for the first time in 1891 with 32 students in Old Main, which is still in use today. Because there were no high schools in Arizona Territory, the university maintained separate preparatory classes for the first 23 years of operation.

    Research

    UArizona is classified among “R1: Doctoral Universities – Very high research activity”. UArizona is the fourth most awarded public university by National Aeronautics and Space Administration(US) for research. UArizona was awarded over $325 million for its Lunar and Planetary Laboratory (LPL) to lead NASA’s 2007–08 mission to Mars to explore the Martian Arctic, and $800 million for its OSIRIS-REx mission, the first in U.S. history to sample an asteroid.

    The LPL’s work in the Cassini spacecraft orbit around Saturn is larger than any other university globally. The UArizona laboratory designed and operated the atmospheric radiation investigations and imaging on the probe. UArizona operates the HiRISE camera, a part of the Mars Reconnaissance Orbiter. While using the HiRISE camera in 2011, UArizona alumnus Lujendra Ojha and his team discovered proof of liquid water on the surface of Mars—a discovery confirmed by NASA in 2015. UArizona receives more NASA grants annually than the next nine top NASA/JPL-Caltech(US)-funded universities combined. As of March 2016, the UArizona’s Lunar and Planetary Laboratory is actively involved in ten spacecraft missions: Cassini VIMS; Grail; the HiRISE camera orbiting Mars; the Juno mission orbiting Jupiter; Lunar Reconnaissance Orbiter (LRO); Maven, which will explore Mars’ upper atmosphere and interactions with the sun; Solar Probe Plus, a historic mission into the Sun’s atmosphere for the first time; Rosetta’s VIRTIS; WISE; and OSIRIS-REx, the first U.S. sample-return mission to a near-earth asteroid, which launched on September 8, 2016.

    UArizona students have been selected as Truman, Rhodes, Goldwater, and Fulbright Scholars. According to The Chronicle of Higher Education, UArizona is among the top 25 producers of Fulbright awards in the U.S.

    UArizona is a member of the Association of Universities for Research in Astronomy(US), a consortium of institutions pursuing research in astronomy. The association operates observatories and telescopes, notably Kitt Peak National Observatory(US) just outside Tucson. Led by Roger Angel, researchers in the Steward Observatory Mirror Lab at UArizona are working in concert to build the world’s most advanced telescope. Known as the Giant Magellan Telescope(CL), it will produce images 10 times sharper than those from the Earth-orbiting Hubble Telescope.

    Giant Magellan Telescope, 21 meters, to be at the NOIRLab(US) National Optical Astronomy Observatory(US) Carnegie Institution for Science’s(US) Las Campanas Observatory(CL), some 115 km (71 mi) north-northeast of La Serena, Chile, over 2,500 m (8,200 ft) high.


    The telescope is set to be completed in 2021. GMT will ultimately cost $1 billion. Researchers from at least nine institutions are working to secure the funding for the project. The telescope will include seven 18-ton mirrors capable of providing clear images of volcanoes and riverbeds on Mars and mountains on the moon at a rate 40 times faster than the world’s current large telescopes. The mirrors of the Giant Magellan Telescope will be built at UArizona and transported to a permanent mountaintop site in the Chilean Andes where the telescope will be constructed.

    Reaching Mars in March 2006, the Mars Reconnaissance Orbiter contained the HiRISE camera, with Principal Investigator Alfred McEwen as the lead on the project. This National Aeronautics and Space Administration(US) mission to Mars carrying the UArizona-designed camera is capturing the highest-resolution images of the planet ever seen. The journey of the orbiter was 300 million miles. In August 2007, the UArizona, under the charge of Scientist Peter Smith, led the Phoenix Mars Mission, the first mission completely controlled by a university. Reaching the planet’s surface in May 2008, the mission’s purpose was to improve knowledge of the Martian Arctic. The Arizona Radio Observatory(US), a part of UArizona Department of Astronomy Steward Observatory(US), operates the Submillimeter Telescope on Mount Graham.

    The National Science Foundation(US) funded the iPlant Collaborative in 2008 with a $50 million grant. In 2013, iPlant Collaborative received a $50 million renewal grant. Rebranded in late 2015 as “CyVerse”, the collaborative cloud-based data management platform is moving beyond life sciences to provide cloud-computing access across all scientific disciplines.

    In June 2011, the university announced it would assume full ownership of the Biosphere 2 scientific research facility in Oracle, Arizona, north of Tucson, effective July 1. Biosphere 2 was constructed by private developers (funded mainly by Texas businessman and philanthropist Ed Bass) with its first closed system experiment commencing in 1991. The university had been the official management partner of the facility for research purposes since 2007.

    U Arizona mirror lab. Where else in the world can you find an astronomical observatory mirror lab under a football stadium?

    University of Arizona’s Biosphere 2, located in the Sonoran desert. An entire ecosystem under a glass dome? Visit our campus, just once, and you’ll quickly understand why the UA is a university unlike any other.

     
  • richardmitnick 9:42 am on March 19, 2021 Permalink | Reply
    Tags: "Magnetism Meets Topology on a Superconductor's Surface", Dirac point, Electrons in a solid occupy distinct energy bands separated by gaps., Energy band gaps are an electronic “no man’s land”: an energy range where no electrons are allowed., One of the ways to break time-reversal symmetry is by developing magnetism—specifically ferromagnetism., Qubits, Theory predicts that Majorana fermions (sought-after quasiparticles) existing in superconducting topological surface states are immune to environmental disturbances., This unusual electronic energy structure could be harnessed for technologies of interest in quantum information science and electronics., Time-reversal symmetry means that the laws of physics are the same whether you look at a system going forward or backward., When a gap opens up at the Dirac point it’s evidence that time-reversal symmetry has been broken.

    From DOE’s Brookhaven National Laboratory (US): “Magnetism Meets Topology on a Superconductor’s Surface” 

    From DOE’s Brookhaven National Laboratory (US)

    March 17, 2021

    Ariana Manglaviti
    amanglaviti@bnl.gov
    (631) 344-2347

    Peter Genzer
    genzer@bnl.gov
    (631) 344-3174

    This unusual electronic energy structure could be harnessed for technologies of interest in quantum information science and electronics.

    An illustration depicting a topological surface state with an energy band gap (an energy range where electrons are forbidden) between the apices of the top and corresponding bottom cones (allowed energy bands, or the range of energies electrons are allowed to have). A topological surface state is a unique electronic state, only existing at the surface of a material, that reflects strong interactions between an electron’s spin (red arrow) and its orbital motion around an atom’s nucleus. When the electron spins align parallel to each another, as they do here, the material has a type of magnetism called ferromagnetism. Credit: Dan Nevola, DOE’s Brookhaven National Laboratory(US).

    Electrons in a solid occupy distinct energy bands separated by gaps. Energy band gaps are an electronic “no man’s land”: an energy range where no electrons are allowed. Now, scientists studying a compound containing iron, tellurium, and selenium have found that an energy band gap opens at a point where two allowed energy bands intersect on the material’s surface. They observed this unexpected electronic behavior when they cooled the material and probed its electronic structure with laser light. Their findings, reported in PNAS, could have implications for future quantum information science and electronics.

    The particular compound belongs to the family of iron-based high-temperature superconductors, which were initially discovered in 2008. These materials not only conduct electricity without resistance at relatively higher temperatures (but still very cold ones) than other classes of superconductors but also show magnetic properties.

    “For a while, people thought that superconductivity and magnetism would work against each other,” said first author Nader Zaki, a scientific associate in the Electron Spectroscopy Group of the Condensed Matter Physics and Materials Science (CMPMS) Division at the DOE’s Brookhaven National Laboratory(US). “We have explored a material where both develop at the same time.”

    Aside from superconductivity and magnetism, some iron-based superconductors have the right conditions to host “topological” surface states. The existence of these unique electronic states, localized at the surface (they do not exist in the bulk of the material), reflects strong interactions between an electron’s spin and its orbital motion around the nucleus of an atom.

    “When you have a superconductor with topological surface properties, you’re excited by the possibility of topological superconductivity,” said corresponding author Peter Johnson, leader of the Electron Spectroscopy Group. “Topological superconductivity is potentially capable of supporting Majorana fermions, which could serve as qubits, the information-storing building blocks of quantum computers.”

    Quantum computers promise tremendous speedups for calculations that would take an impractical amount of time, or be outright impossible, on traditional computers. One of the challenges to realizing practical quantum computing is that qubits are highly sensitive to their environment: small interactions cause them to lose their quantum state, and with it the stored information. Theory predicts that Majorana fermions (sought-after quasiparticles) existing in superconducting topological surface states are immune to such environmental disturbances, making them an ideal platform for robust qubits.

    Seeing the iron-based superconductors as a platform for a range of exotic and potentially important phenomena, Zaki, Johnson, and their colleagues set out to understand the roles of topology, superconductivity and magnetism.

    CMPMS Division senior physicist Genda Gu first grew high-quality single crystals of the iron-based compound. Then, Zaki mapped the electronic band structure of the material via laser-based photoemission spectroscopy. When light from a laser is focused onto a small spot on the material, electrons from the surface are “kicked out” (i.e., photoemitted). The energy and momentum of these electrons can then be measured.
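    The relations behind this technique are standard photoemission kinematics (textbook formulas, not spelled out in the article): the kinetic energy of a photoemitted electron encodes its binding energy in the solid, and its emission angle encodes its momentum parallel to the surface,

```latex
E_{\mathrm{kin}} = h\nu - \phi - |E_B|, \qquad
k_{\parallel} = \frac{\sqrt{2 m E_{\mathrm{kin}}}}{\hbar}\,\sin\theta
```

    where \(h\nu\) is the photon energy, \(\phi\) the work function, \(E_B\) the binding energy, and \(\theta\) the emission angle. Scanning over energies and angles maps out the band structure.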

    When they lowered the temperature, something surprising happened.

    “The material went superconducting, as we expected, and we saw a superconducting gap associated with that,” said Zaki. “But what we didn’t expect was the topological surface state opening up a second gap at the Dirac point. You can picture the energy band structure of this surface state as an hourglass or two cones attached at their apex. Where these cones intersect is called the Dirac point.”
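    The hourglass picture can be captured by a minimal model (an illustration of the geometry, not the fitted dispersion from the paper): near the Dirac point the surface state disperses as

```latex
E_{\pm}(k) = \pm\sqrt{(\hbar v_F |k|)^2 + \Delta^2}
```

    With \(\Delta = 0\) the two cones touch at the Dirac point (the hourglass’s waist); a symmetry-breaking term \(\Delta \neq 0\) pushes them apart, opening a gap of size \(2\Delta\) exactly where the cones used to intersect.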

    As Johnson and Zaki explained, a gap opening at the Dirac point is evidence that time-reversal symmetry has been broken. Time-reversal symmetry means that the laws of physics are the same whether you look at a system going forward or backward in time, akin to rewinding a video and seeing the same sequence of events playing in reverse. Electron spins flip their direction under time reversal, so a state with aligned spins breaks this symmetry. Thus, one way to break time-reversal symmetry is to develop magnetism, specifically ferromagnetism, a type of magnetism in which all electron spins align in parallel.

    “The system is going into the superconducting state and seemingly magnetism is developing,” said Johnson. “We have to assume the magnetism is in the surface region because in this form it cannot coexist in the bulk. This discovery is exciting because the material has a lot of different physics in it: superconductivity, topology, and now magnetism. I like to say it’s one-stop shopping. Understanding how these phenomena arise in the material could provide a basis for many new and exciting technological directions.”

    As previously noted, the material’s superconductivity and strong spin-orbit effects could be harnessed for quantum information technologies. Alternatively, the material’s magnetism and strong spin-orbit interactions could enable dissipationless (no energy loss) transport of electrical current in electronics. This capability could be leveraged to develop electronic devices that consume low amounts of power.

    Coauthors Alexei Tsvelik, senior scientist and group leader of the CMPMS Division Condensed Matter Theory Group, and Congjun Wu, a professor of physics at the University of California San Diego(US), provided theoretical insights on how time reversal symmetry is broken and magnetism originates in the surface region.

    “This discovery not only reveals deep connections between topological superconducting states and spontaneous magnetization but also provides important insights into the nature of superconducting gap functions in iron-based superconductors—an outstanding problem in the investigation of strongly correlated unconventional superconductors,” said Wu.

    In a separate study with other collaborators in the CMPMS Division, the experimental team is examining how different concentrations of the three elements in the sample contribute to the observed phenomena. Seemingly, tellurium is needed for the topological effects, too much iron kills superconductivity, and selenium enhances superconductivity.

    In follow-on experiments, the team hopes to verify the time-reversal symmetry breaking with other methods and explore how substituting elements in the compound modifies its electronic behavior.

    “As materials scientists, we like to alter the ingredients in the mixture to see what happens,” said Johnson. “The goal is to figure out how superconductivity, topology, and magnetism interact in these complex materials.”

    See the full article here .



    Please help promote STEM in your local schools.

    Stem Education Coalition

    One of ten national laboratories overseen and primarily funded by the DOE(US) Office of Science, DOE’s Brookhaven National Laboratory(US) conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University (US) the largest academic user of Laboratory facilities, and Battelle(US), a nonprofit, applied science and technology organization.

    Research at BNL specializes in nuclear and high-energy physics, energy science and technology, environmental science and bioscience, nanoscience, and national security. The 5,300-acre campus contains several large research facilities, including the Relativistic Heavy Ion Collider [below] and the National Synchrotron Light Source II [below]. Seven Nobel Prizes have been awarded for work conducted at Brookhaven Lab.

    BNL is staffed by approximately 2,750 scientists, engineers, technicians, and support personnel, and hosts 4,000 guest investigators every year. The laboratory has its own police station, fire department, and ZIP code (11973). In total, the lab spans a 5,265-acre (21 km^2) area that is mostly coterminous with the hamlet of Upton, New York. BNL is served by a rail spur operated as-needed by the New York and Atlantic Railway. Co-located with the laboratory is the Upton, New York, forecast office of the National Weather Service.

    Major programs

    Although originally conceived as a nuclear research facility, Brookhaven Lab’s mission has greatly expanded. Its foci are now:

    Nuclear and high-energy physics
    Physics and chemistry of materials
    Environmental and climate research
    Nanomaterials
    Energy research
    Nonproliferation
    Structural biology
    Accelerator physics

    Operation

    Brookhaven National Lab was originally owned by the Atomic Energy Commission (US) and is now owned by that agency’s successor, the United States Department of Energy (DOE). DOE subcontracts the research and operation to universities and research organizations. It is currently operated by Brookhaven Science Associates LLC, an equal partnership of Stony Brook University (US) and Battelle Memorial Institute (US). From 1947 to 1998, it was operated by Associated Universities, Inc. (AUI), but AUI lost its contract in the wake of two incidents: a 1994 fire at the facility’s High Flux Beam Reactor that exposed several workers to radiation, and reports in 1997 of a tritium leak into the groundwater of the Long Island Central Pine Barrens on which the facility sits.

    Foundations

    Following World War II, the US Atomic Energy Commission was created to support government-sponsored peacetime research on atomic energy. The effort to build a nuclear reactor in the American northeast was fostered largely by physicists Isidor Isaac Rabi and Norman Foster Ramsey Jr., who during the war witnessed many of their colleagues at Columbia University leave for new remote research sites following the departure of the Manhattan Project from its campus. Their effort to house this reactor near New York City was rivalled by a similar effort at the Massachusetts Institute of Technology (US) to have a facility near Boston, Massachusetts. Involvement was quickly solicited from representatives of northeastern universities to the south and west of New York City, so that the city would sit at their geographic center. In March 1946 a nonprofit corporation was established that consisted of representatives from nine major research universities — Columbia (US), Cornell (US), Harvard (US), Johns Hopkins (US), MIT, Princeton University (US), University of Pennsylvania (US), University of Rochester (US), and Yale University (US).

    Out of 17 sites considered in the Boston-Washington corridor, Camp Upton on Long Island was eventually chosen as the most suitable in terms of space, transportation, and availability. The camp had been a training center for the US Army during both World War I and World War II. After the latter war, Camp Upton was deemed no longer necessary and became available for reuse. A plan was conceived to convert the military camp into a research facility.

    On March 21, 1947, the Camp Upton site was officially transferred from the U.S. War Department to the new U.S. Atomic Energy Commission (AEC), predecessor to the U.S. Department of Energy (DOE).

    Research and facilities

    Reactor history

    In 1947 construction began on the first nuclear reactor at Brookhaven, the Brookhaven Graphite Research Reactor. This reactor, which opened in 1950, was the first reactor to be constructed in the United States after World War II. The High Flux Beam Reactor operated from 1965 to 1999. In 1959 Brookhaven built the first US reactor specifically tailored to medical research, the Brookhaven Medical Research Reactor, which operated until 2000.

    Accelerator history

    In 1952 Brookhaven began using its first particle accelerator, the Cosmotron. At the time the Cosmotron was the world’s highest energy accelerator, being the first to impart more than 1 GeV of energy to a particle.

    BNL Cosmotron 1952-1966

    The Cosmotron was retired in 1966, after it was superseded in 1960 by the new Alternating Gradient Synchrotron (AGS).

    BNL Alternating Gradient Synchrotron (AGS)

    The AGS was used in research that resulted in three Nobel Prizes, including the discoveries of the muon neutrino, the charm quark, and CP violation.

    In 1970 BNL started the ISABELLE project to develop and build two intersecting proton storage rings.

    The groundbreaking for the project was in October 1978. In 1981, with the tunnel for the accelerator already excavated, problems with the superconducting magnets needed for the ISABELLE accelerator brought the project to a halt, and the project was eventually cancelled in 1983.

    The National Synchrotron Light Source operated from 1982 to 2014 and was involved with two Nobel Prize-winning discoveries. It has since been replaced by the National Synchrotron Light Source II [below].

    BNL NSLS.

    After ISABELLE’s cancellation, physicists at BNL proposed that the excavated tunnel and parts of the magnet assembly be used in another accelerator. In 1984 the first proposal for the accelerator now known as the Relativistic Heavy Ion Collider (RHIC) [below] was put forward. Construction was funded in 1991, and RHIC has been operational since 2000. One of the world’s only two operating heavy-ion colliders, RHIC is, as of 2010, the second-highest-energy collider after the Large Hadron Collider (CH). RHIC is housed in a tunnel 2.4 miles (3.9 km) long and is visible from space.

    On January 9, 2020, Paul Dabbar, undersecretary of the US Department of Energy Office of Science, announced that the BNL eRHIC design had been selected over the conceptual design put forward by DOE’s Thomas Jefferson National Accelerator Facility (JLab) as the future Electron–Ion Collider (EIC) in the United States.

    Electron-Ion Collider (EIC) at BNL, to be built inside the tunnel that currently houses the RHIC.

    In addition to the site selection, it was announced that the BNL EIC had acquired CD-0 (mission need) status from the Department of Energy. BNL’s eRHIC design proposes upgrading the existing Relativistic Heavy Ion Collider, which collides beams of ions from light to heavy (including polarized protons), with a polarized electron facility to be housed in the same tunnel.

    Other discoveries

    In 1958, Brookhaven scientists created one of the world’s first video games, Tennis for Two. In 1968 Brookhaven scientists patented Maglev, a transportation technology that utilizes magnetic levitation.

    Major facilities

    Relativistic Heavy Ion Collider (RHIC), which was designed to research quark–gluon plasma and the sources of proton spin. Until 2009 it was the world’s most powerful heavy-ion collider. It is the only collider of spin-polarized protons.
    Center for Functional Nanomaterials (CFN), used for the study of nanoscale materials.
    National Synchrotron Light Source II (NSLS-II), Brookhaven’s newest user facility, opened in 2015 to replace the National Synchrotron Light Source (NSLS), which had operated for 30 years. NSLS was involved in the work that won the 2003 and 2009 Nobel Prizes in Chemistry.
    Alternating Gradient Synchrotron, a particle accelerator that was used in three of the lab’s Nobel prizes.
    Accelerator Test Facility, generates, accelerates and monitors particle beams.
    Tandem Van de Graaff, once the world’s largest electrostatic accelerator.
    Computational Science resources, including access to a massively parallel Blue Gene series supercomputer that is among the fastest in the world for scientific research, run jointly by Brookhaven National Laboratory and Stony Brook University.
    Interdisciplinary Science Building, with unique laboratories for studying high-temperature superconductors and other materials important for addressing energy challenges.
    NASA Space Radiation Laboratory, where scientists use beams of ions to simulate cosmic rays and assess the risks of space radiation to human space travelers and equipment.

    Off-site contributions

    It is a contributing partner to the ATLAS experiment, one of the four detectors located at the Large Hadron Collider (LHC) at CERN near Geneva, Switzerland.

    CERN map

    Iconic view of the CERN (CH) ATLAS detector.

    Brookhaven was also responsible for the design of the accumulator ring of the Spallation Neutron Source, in partnership with DOE’s Oak Ridge National Laboratory in Tennessee.

    ORNL Spallation Neutron Source annotated.

    Brookhaven plays a role in a range of neutrino research projects around the world, including the Daya Bay Reactor Neutrino Experiment in China and the Deep Underground Neutrino Experiment at DOE’s Fermi National Accelerator Laboratory(US).

    Daya Bay, nuclear power plant, approximately 52 kilometers northeast of Hong Kong and 45 kilometers east of Shenzhen, China

    FNAL LBNF/DUNE from FNAL to SURF, Lead, South Dakota, USA.

    Brookhaven Campus.

    BNL Center for Functional Nanomaterials.

    BNL NSLS-II.

    BNL NSLS II.

    BNL RHIC Campus.

    BNL/RHIC Star Detector.

    BNL/RHIC Phenix.

     
  • richardmitnick 11:52 am on February 21, 2021 Permalink | Reply
    Tags: "Technologies for More Powerful Quantum Computers", , Development Will Be Made Available to Innovative First Users, , Fraunhofer Society [Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e. V.](DE), Friedrich–Alexander University Erlangen–Nürnberg [Friedrich-Alexander-Universität Erlangen-Nürnberg](DE), Important Step towards the Development of Superconducting Quantum Circuits in Germany, Infineon, Novel Materials for Higher Quality of Qubits, Qubits, The collaboration project “German Quantum Computer Based on Superconducting Qubits” GeQCoS for short., , The Technical University of Munich [Technische Universität München](DE), Walther Meißner Institute of the Bavarian Academy of Sciences(DE)   

    From The Karlsruhe Institute of Technology-KIT [Karlsruher Institut für Technologie] (DE): “Technologies for More Powerful Quantum Computers” 


    From The Karlsruhe Institute of Technology-KIT [Karlsruher Institut für Technologie] (DE)

    29.01.2021 [Just now in social media.]

    Contact:
    Monika Landgraf
    Head of Corporate Communications, Chief Press Officer
    Phone: +49 721 608-41150
    Fax: +49 721 608-43658
    presse∂kit edu

    Contact for this press release:
    Johannes Wagner
    Pressereferent
    Phone: +49 721 608-41175
    johannes wagner∂kit edu

    1
    Visualization of a quantum processor: Its core contains a chip on which superconducting qubits are arranged in a checkered pattern. Credit: Christoph Hohmann.

    Quantum computers will efficiently solve problems that could not be solved in the past. Examples are calculations of the properties of complex molecules for the pharmaceutical industry, solutions of optimization problems for manufacturing processes in the automotive industry, and calculations in the financial sector. Within the framework of the “GeQCoS“ collaboration project, Germany’s leading researchers in the area of superconducting quantum circuits are working on innovative concepts for designing better quantum processors. Researchers from the Karlsruhe Institute of Technology (KIT) play an important role in the project.

    The collaboration project “German Quantum Computer Based on Superconducting Qubits,” GeQCoS for short, is aimed at developing a prototype quantum processor consisting of a few superconducting qubits with fundamentally improved components. The main components of a quantum computer, the quantum bits or qubits, will be implemented by zero-resistance currents in superconducting circuits. These currents are relatively robust against external disturbances and can preserve quantum states during operation.

    Novel Materials for Higher Quality of Qubits

    The planned improvements concern both connectivity (the number of connections among the qubits) and the quality of the qubits (the ability to rapidly and efficiently produce the desired quantum states). “Currently, this is a big challenge,” says Dr. Ioan Pop from KIT’s Institute for Quantum Materials and Technologies. “Use of novel materials for the production of qubits is expected to result in better reproducibility and higher quality of the qubits.”

    Important Step towards the Development of Superconducting Quantum Circuits in Germany

    To achieve improvement, researchers collaborate closely in the areas of alternative components, change of architecture, coupling mechanisms, and higher precision of calculations. “This is a very important step towards the development of superconducting quantum circuits in Germany. This technology is preferred and pursued by IT managers in the area of quantum computers,” Professor Alexey Ustinov, Head of the research group at KIT’s Physikalisches Institut, emphasizes. “Localization and diagnosis of errors is rather challenging work. We have to improve fabrication methods to prevent faults that sustainably influence the quality of the qubits.”

    Today, quantum computers are already able to handle small, specific problems and to demonstrate basic functions, the experts say. In the long term, the work is aimed at developing a so-called universal quantum computer that solves important problems exponentially faster than a classical computer. An architecture suited for the calculation of practically relevant problems requires substantial improvement of both hardware and software.

    Development Will Be Made Available to Innovative First Users

    To reach this goal, scalable fabrication processes and optimized chip housings will be developed within the project. Eventually, the prototype quantum processor will be installed at the Walther Meißner Institute of the Bavarian Academy of Sciences. The technologies developed are not only expected to lead to new scientific findings. Close interconnection with companies will strengthen the quantum ecosystem in Germany and Europe. On both the hardware and software level, the quantum processor will be made available to innovative first users as early as possible.

    Apart from KIT, Friedrich–Alexander University Erlangen–Nürnberg [Friedrich-Alexander-Universität Erlangen-Nürnberg] (DE), the Forschungszentrum Jülich Research Centre [Forschungszentrum Jülich] (FZJ) (DE), the Walther Meißner Institute of the Bavarian Academy of Sciences (DE), the Technical University of Munich [Technische Universität München] (DE), Infineon, and the Fraunhofer Society [Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e. V.] (DE) are involved in the project. The “GeQCoS“ project is funded by the Federal Ministry of Education and Research with EUR 14.5 million, of which more than EUR 3 million goes to KIT.

    See the full article here .




    Mission Statement of KIT

    Preamble

    The Karlsruhe Institute of Technology-KIT [Karlsruher Institut für Technologie] (DE), briefly referred to as KIT, was established by the merger of Forschungszentrum Karlsruhe GmbH and the Universität Karlsruhe (TH) on October 01, 2009. KIT combines the tasks of a university of the state of Baden-Württemberg with those of a research center of the Helmholtz Association of German Research Centres [Helmholtz-Gemeinschaft Deutscher Forschungszentren] (DE) in the areas of research, teaching, and innovation.

    The KIT merger represents the consistent continuation of a long-standing close cooperation of two research and education institutions rich in tradition. The University of Karlsruhe was founded in 1825 as a Polytechnical School and has developed into a modern center of research and education in the natural sciences, engineering, economics, social sciences, and the humanities, organized in eleven departments. The Karlsruhe Research Center was founded in 1956 as the Nuclear Reactor Construction and Operation Company and has grown into a multidisciplinary large-scale research center of the Helmholtz Association, conducting research under eleven scientific and engineering programs.

    Being “The Research University in the Helmholtz Association”, KIT creates and imparts knowledge for the society and the environment. It is the objective to make significant contributions to the global challenges in the fields of energy, mobility, and information. For this, about 9,300 employees cooperate in a broad range of disciplines in natural sciences, engineering sciences, economics, and the humanities and social sciences. KIT prepares its 24,400 students for responsible tasks in society, industry, and science by offering research-based study programs. Innovation efforts at KIT build a bridge between important scientific findings and their application for the benefit of society, economic prosperity, and the preservation of our natural basis of life. KIT is one of the German universities of excellence.

    In 2014/15, the KIT concentrated on an overarching strategy process to further develop its corporate strategy. This mission statement as the result of a participative process was the first element to be incorporated in the strategy process.

    Mission Statement of KIT

    KIT combines the traditions of a renowned technical university and a major large-scale research institution in a very unique way. In research and education, KIT assumes responsibility for contributing to the sustainable solution of the grand challenges that face the society, industry, and the environment. For this purpose, KIT uses its financial and human resources with maximum efficiency. The scientists of KIT communicate the contents and results of their work to society.

    Engineering sciences, natural sciences, the humanities, and social sciences make up the scope of subjects covered by KIT. In high interdisciplinary interaction, scientists of these disciplines study topics extending from the fundamentals to application and from the development of new technologies to the reflection of the relationship between man and technology. For this to be accomplished in the best possible way, KIT’s research covers the complete range from fundamental research to close-to-industry, applied research and from small research partnerships to long-term large-scale research projects. Scientific sincerity and the striving for excellence are the basic principles of our activities.

    Worldwide exchange of knowledge, large-scale international research projects, numerous global cooperative ventures, and cultural diversity characterize and enrich the life and work at KIT. Academic education at KIT is guided by the principle of research-oriented teaching. Early integration into interdisciplinary research projects and international teams and the possibility of using unique research facilities open up exceptional development perspectives for our students.

    The development of viable technologies and their use in industry and the society are the cornerstones of KIT’s activities. KIT supports innovativeness and entrepreneurial culture in various ways. Moreover, KIT supports a culture of creativity, in which employees and students have time and space to develop new ideas.

    Cooperation of KIT employees, students, and members is characterized by mutual respect and trust. Achievements of every individual are highly appreciated. Employees and students of KIT are offered equal opportunities irrespective of the person. Family-friendliness is a major objective of KIT as an employer. KIT supports the compatibility of job and family. As a consequence, the leadership culture of KIT is also characterized by respect and cooperation. Personal responsibility and self-motivation of KIT employees and members are fostered by transparent and participative decisions, open communication, and various options for life-long learning.

    The structure of KIT is tailored to its objectives in research, education, and innovation. It supports flexible, synergy-based cooperation beyond disciplines, organizations, and hierarchies. Efficient services are rendered to support KIT employees and members in their work.

    Young people are our future. Reliable offers and career options excellently support KIT’s young scientists and professionals in their professional and personal development.

     
  • richardmitnick 10:46 am on February 20, 2021 Permalink | Reply
    Tags: "Physicists Propose a 'Force Field' to Protect Sensitive Quantum Computers From Noise", "Synthetic magnetic field", A promising method for ensuring a qubit stays fuzzy long enough to be useful is to entangle it with other qubits located elsewhere., , Back in 2001 a trio of researchers - Daniel Gottesman; Alexeir Kitaev; and John Preskill - formulated a way to encode this kind of protection into a space as an intrinsic feature of the circuitry., , One way to reduce the risk of “noise” is to build in checks and balances that help to shield the blurred state of reality at the core of quantum computers., , Qubits, RWTH Aachen University [ Rheinisch-Westfälische Technische Hochschule Aache](DE), , The basis for the design is a concept that's nearly 20 years old., This "noise" only gets worse as we grow devices to include more qubits., Too much 'noise' and the delicate state of the system collapses leaving you with a very expensive paperweight.   

    From RWTH Aachen University [Rheinisch-Westfälische Technische Hochschule Aachen] (DE) via Science Alert (AU): “Physicists Propose a ‘Force Field’ to Protect Sensitive Quantum Computers From Noise” 

    From RWTH Aachen University [Rheinisch-Westfälische Technische Hochschule Aachen] (DE)

    via

    ScienceAlert

    Science Alert(AU)

    19 FEBRUARY 2021
    MIKE MCRAE

    1
    Credit: oxygen/Moment/Getty Images.

    Creating a quantum computer requires an ability to stroke the edges of reality with the quietest of touches. Too much ‘noise’ and the delicate state of the system collapses, leaving you with a very expensive paperweight.

    One way to reduce the risk of this occurring is to build in checks and balances that help to shield the blurred state of reality at the core of quantum computers – and now scientists have proposed a new way to do just that.

    Theoretical physicists from RWTH Aachen University [Rheinisch-Westfälische Technische Hochschule Aachen] (DE) have proposed what’s known as a “synthetic magnetic field”, which they think could help protect the fragile qubits needed in a quantum computer.

    “We have designed a circuit composed of state-of-the-art superconducting circuit elements and a nonreciprocal device, that can be used to passively implement the GKP quantum error-correcting code,” the team writes in Physical Review X.

    The basis for the design is a concept that’s nearly 20 years old (we’ll get to that in a moment), one that simply isn’t feasible based on its requirement of impossibly strong magnetic fields. The new approach attempts to get around this issue.

    Instead of the solid, bit-based language of 1s and 0s that informs the operations of your smartphone or desktop, quantum computing relies on a less binary, and far less definitive approach to crunching numbers.

    Quantum bits, or qubits, are the individual units of its language, based on the probabilities of quantum mechanics. String enough together and their seemingly random tumbling sets the foundation for a fundamentally different approach to problem solving.

    A qubit is an odd creature, though, something with no real equivalent in our day-to-day experience. Unobserved, it can sit in a superposition of 1 and 0, effectively both at once. But as soon as you look at it, the qubit settles into a single, more mundane state.
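    This “settling on observation” behavior can be caricatured in a few lines of code (purely illustrative; a real qubit is not a classical object whose amplitudes you can store and inspect like this):

```python
import math
import random

class Qubit:
    """Toy qubit: two amplitudes (alpha for |0>, beta for |1>) with
    |alpha|^2 + |beta|^2 = 1. Illustrative only -- not a real qubit."""

    def __init__(self, alpha, beta):
        norm = math.sqrt(abs(alpha) ** 2 + abs(beta) ** 2)
        self.alpha = alpha / norm
        self.beta = beta / norm

    def measure(self):
        """'Looking' at the qubit: it settles into 0 or 1 with
        probabilities |alpha|^2 and |beta|^2, and stays settled."""
        outcome = 0 if random.random() < abs(self.alpha) ** 2 else 1
        # Collapse: the superposition is gone after the measurement.
        self.alpha, self.beta = (1.0, 0.0) if outcome == 0 else (0.0, 1.0)
        return outcome

q = Qubit(1, 1)              # equal superposition of 0 and 1
first = q.measure()          # randomly 0 or 1
assert q.measure() == first  # looking again gives the same settled answer
```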

    In physics, this act of looking doesn’t even need to be an intentional stare. The buzz of electromagnetic radiation, a stray bump of a neighbouring particle… and that qubit can quickly find itself part of the scenery, losing its essential powers of probability.

    This ‘noise’ only gets worse as we grow devices to include more qubits, something that is necessary to make quantum computers powerful enough to be capable of the high-level processing we expect of them.

    A promising method for ensuring a qubit stays fuzzy long enough to be useful is to entangle it with other qubits located elsewhere, meaning its probabilities are now dependent on other, equally fuzzy particles sitting in zones unlikely to be slammed by the same noise.
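    The “dependent probabilities” of entanglement can be sketched with a Bell pair: each qubit’s outcome is random on its own, yet the two always agree (a toy model of the correlation, not a simulation of real entangled hardware):

```python
import random

def measure_bell_pair(rng=random):
    """Toy Bell pair (|00> + |11>)/sqrt(2): a joint measurement yields
    00 or 11 with equal probability, so each outcome is individually
    random but the two qubits are perfectly correlated."""
    outcome = rng.choice([0, 1])
    return outcome, outcome

a, b = measure_bell_pair()
assert a == b  # the two qubits always agree, wherever they sit
```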

    If that’s done right, engineers can ensure a level of quantum error correction – an insurance scheme that allows the qubit to cope with the occasional shake, rattle, and roll of surrounding noise.
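    The flavor of this insurance scheme is easiest to see in the classical three-bit repetition code (quantum codes are subtler, since qubits cannot be copied, but the redundancy-plus-voting intuition carries over; a sketch, not the scheme used in any real device):

```python
def encode(bit):
    """Spread one logical bit across three physical bits."""
    return [bit, bit, bit]

def apply_noise(bits, flip_index):
    """One environmental 'shake': flip a single physical bit."""
    noisy = list(bits)
    noisy[flip_index] ^= 1
    return noisy

def decode(bits):
    """Majority vote: a single flipped bit gets outvoted."""
    return 1 if sum(bits) >= 2 else 0

codeword = encode(1)                  # logical 1 -> [1, 1, 1]
corrupted = apply_noise(codeword, 0)  # noise flips the first bit
assert decode(corrupted) == 1         # the logical bit survives
```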

    And this is where we return to the new paper. Back in 2001, a trio of researchers (Daniel Gottesman, Alexei Kitaev, and John Preskill) formulated a way to encode this kind of protection into a space as an intrinsic feature of the circuitry holding the qubits, potentially allowing for slimmer hardware.

    It became known as the Gottesman-Kitaev-Preskill (GKP) code. There was just one problem: the GKP code relied on confining an electron to two dimensions using magnetic fields too intense to be practical. What’s more, the processes for detecting and recovering from errors are fairly complicated, demanding even more chunks of hardware.
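    For reference, the standard GKP construction (textbook definitions, not detailed in this article) stabilizes a qubit encoded in an oscillator with two commuting phase-space displacement operators,

```latex
\hat{S}_q = e^{\,2i\sqrt{\pi}\,\hat{q}}, \qquad
\hat{S}_p = e^{-2i\sqrt{\pi}\,\hat{p}}
```

    Code states form grid-like combs spaced \(\sqrt{\pi}\) apart in position and momentum, so any displacement error smaller than \(\sqrt{\pi}/2\) can be detected and shifted back without disturbing the logical information.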

    To really get the most out of the GKP code’s benefits, quantum engineers would need a more passive, hands-off approach for shielding and recovering a qubit’s information from noise.

    So in this innovative new proposal, physicists suggest replacing the impossibly large magnetic field with a superconducting circuit whose components serve much the same purpose, ironing out the noise.

    The technicalities of the setup aren’t for general reading, but Anja Metelmann at APS Physics does a top job of going through them step-by-step for those eager for details.

    For it to work, there would need to be a way for photons – effectively ripples in the electromagnetic field that carries the electromagnetic force – to be manipulated by that very field. Given the photon's electrical neutrality, this just isn't possible.

    There is a workaround, though. In recent years physicists have found a way to control photons so they can be channelled like electrons, by manipulating the optics of a space so it takes on certain magnetic-like characteristics.

    So-called synthetic magnetic fields permit photons to be directed, giving engineers a way to craft devices in which light waves can be forced to behave more like a current.

    The new paper lays out a way to use this synthetic magnetic field to protect a theoretical single electron in a crystal, confined to a 2D plane. When the researchers ran calculations to see how the system would react when subjected to a strong, real magnetic field – which would usually interfere with it – they showed that their new set-up could protect the qubit.

    “We find that the circuit is naturally protected against the common noise channels in superconducting circuits, such as charge and flux noise, implying that it can be used for passive quantum error correction,” the team explains in their paper.

    Before we get a working prototype of this quantum error-correcting machinery, there are plenty of kinks to work out experimentally. It all works on paper, but it remains to be seen whether the technology cooperates as expected.

    In time, we might have a relatively simple device that turns an impractical – but otherwise efficient – concept for scaling up quantum computers into a real possibility, opening the way for error tolerant technology that has until now been mostly theoretical.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    RWTH Aachen University [Rheinisch-Westfälische Technische Hochschule Aachen] (DE) is a public research university located in Aachen, North Rhine-Westphalia, Germany. With more than 45,000 students enrolled in 144 study programs, it is the largest technical university in Germany.

    In 2007, RWTH Aachen was chosen by the DFG as one of nine German Universities of Excellence for its future concept RWTH 2020: Meeting Global Challenges and additionally won funding for one graduate school and three clusters of excellence.

    RWTH Aachen is a founding member of IDEA League, a strategic alliance of five leading universities of technology in Europe. The university is also a member of TU9, DFG (Deutsche Forschungsgemeinschaft) and the Top Industrial Managers for Europe network.

    On 25 January 1858, Prince Frederick William of Prussia (later German Emperor) was given a donation of 5,000 talers from the Aachener und Münchener Feuer-Versicherungs-Gesellschaft, the precursor of the AachenMünchener insurance company, for charity. In March, the prince chose to use the donation to found the first Prussian institute of technology somewhere in the Rhine province. The seat of the institution remained undecided for years; while the prince initially favored Koblenz, the cities of Aachen, Bonn, Cologne and Düsseldorf also applied, with Aachen and Cologne being the main competitors. Aachen finally won with a financing concept backed by the insurance company and by local banks.

    Groundbreaking for the new Polytechnikum took place on 15 May 1865 and lectures started during the Franco-Prussian War on 10 October 1870 with 223 students and 32 teachers. The new institution had as its primary purpose the education of engineers, especially for the mining industry in the Ruhr area; there were schools of chemistry, electrical and mechanical engineering as well as an introductory general school that taught mathematics and natural sciences and some social sciences.
    Main Building of the RWTH Aachen. It was built in 1870.

    The unclear position of the new Prussian polytechnika (which officially were not universities) affected the first years. Polytechnics lacked prestige in society and the number of students decreased. This began to change in 1880 when the early RWTH, amongst others, was reorganized as a Royal Technical University, gained a seat in the Prussian House of Lords and finally won the right to bestow doctoral degrees (1899) and Diplom titles (introduced in 1902). In the same year, over 800 male students enrolled. In 1909 the first women were admitted, and the artist August von Brandis succeeded Alexander Frenz at the Faculty of Architecture as a “professor of figure and landscape painting”; Brandis became dean in 1929.

    World War I, however, proved a serious setback for the university. Many students voluntarily joined up and died in the war, and parts of the university were briefly occupied or confiscated.

    While the (by then no longer royal) TH Aachen (Technische Hochschule Aachen) flourished in the 1920s with the introduction of more independent faculties, of several new institutes and of the general students’ committee, the first signs of nationalist radicalization also became visible within the university. The Third Reich’s Gleichschaltung of the TH in 1933 met with relatively low resistance from both students and faculty. Beginning in September 1933, Jewish and (alleged) Communist professors (and from 1937 on also students) were systematically persecuted and excluded from the university. Vacant chairs were increasingly given to NSDAP party members or sympathizers. The freedom of research and teaching became severely limited, institutes important for the regime’s plans were systematically established, and existing chairs promoted. Briefly closed in 1939, the TH continued courses in 1940, although with a low number of students. On 21 October 1944, when Aachen capitulated, more than 70% of all buildings of the university were destroyed or heavily damaged.

    After World War II ended in 1945 the university recovered and expanded quickly. In the 1950s, many professors who had been removed because of their alleged affiliation with the Nazi party were allowed to return and a multitude of new institutes were founded. By the late 1960s, the TH had 10,000 students, making it the foremost of all German technical universities. With the foundation of philosophical and medical faculties in 1965 and 1966, respectively, the university became more “universal”. The newly founded faculties in particular began attracting new students, and the number of students almost doubled twice from 1970 (10,000) to 1980 (more than 25,000) and from 1980 to 1990 (more than 37,000). Now, the average number of students is around 42,000, with about one third of all students being women. In relative terms, the most popular study programs are engineering (57%), natural science (23%), economics and humanities (13%) and medicine (7%).

    Recent developments

    “Red lecture hall” at the central campus

    In December 2006, RWTH Aachen and the Sultanate of Oman signed an agreement to establish a private German University of Technology in Muscat. Professors from Aachen aided in developing the curricula for the currently five study-programs and scientific staff took over some of the first courses.

    In 2007, RWTH Aachen was chosen as one of nine German Universities of Excellence for its future concept RWTH 2020: Meeting Global Challenges, earning it the designation “University of Excellence”. However, although the list of universities honored for their future concepts mostly consists of large and already respected institutions, the Federal Ministry of Education and Research claimed that the initiative aimed at promoting universities with a dedicated future concept so they could continue researching on an international level. Having won funds in all three lines of funding, the process brought RWTH Aachen University an additional total funding of € 180 million from 2007–2011. The other two lines of funding were graduate schools, where the Aachen Institute for Advanced Study in Computational Engineering Science received funding, and so-called “clusters of excellence”, where RWTH Aachen managed to win funding for three clusters: Ultra High-Speed Mobile Information and Communication (UMIC), Integrative Production Technology for High-wage Countries and Tailor-Made Fuels from Biomass (TMFB).

    RWTH was selected to receive funding from the German federal and state governments for the third Universities of Excellence funding line starting 2019. RWTH’s proposal was called “The Integrated Interdisciplinary University of Science and Technology – Knowledge. Impact. Networks.” and has secured funding for a seven-year period.

    2019 Clusters of Excellence

    The Fuel Science Center (FSC) Adaptive Conversion Systems for Renewable Energy and Carbon Sources
    Internet of Production
    ML4Q – Matter and Light for Quantum Computing

    RWTH was already awarded funding in the first and second Universities of Excellence funding lines, in 2007 and 2012 respectively.

     
  • richardmitnick 11:56 am on February 10, 2021 Permalink | Reply
    Tags: "Quantum Photons", Another approach developed more recently is to use a photon as an optical qubit to encode quantum information., , Most efforts to build quantum computers have relied on qubits created in superconducting wires chilled to near absolute zero or on trapped ions held in place by lasers., , Qubits, To place an entire quantum photonics system onto a chip measuring about one square centimeter would be a tremendous achievement.,   

    From UC Santa Barbara: “Quantum Photons” 

    UC Santa Barbara Name bloc
    From UC Santa Barbara

    February 9, 2021

    Contact Info:
    Shelly Leachman
    (805) 893-8726
    shelly.leachman@ucsb.edu

    Written by James Badham

    Galan Moody receives a new grant to develop a testbed for photonic-based quantum computing.

    1
    Concept illustration depicting an integrated photonic quantum processor: Laser light coupled into the channels interacts with the rings (foreground) to create pairs of entangled photons (red). The entangled photons split and travel throughout the photonic circuit (background), which controls effective interactions between them, enabling optical quantum computations. Credit: Lillian McKinney.

    Classical computing is built on the power of the bit, which is, in essence, a micro transistor on a chip that can be either on or off, representing a 1 or a 0 in binary code. The quantum computing equivalent is the qubit. Unlike bits, qubits can exist in more than one “state” at a time, enabling quantum computers to perform computational functions exponentially faster than can classical computers.

    To date, most efforts to build quantum computers have relied on qubits created in superconducting wires chilled to near absolute zero or on trapped ions held in place by lasers. But those approaches face certain challenges, most notably that the qubits are highly sensitive to environmental factors. As the number of qubits increases, those factors are more likely to compound and interrupt the entanglement of qubits required for a quantum computer to work.

    Another approach, developed more recently, is to use a photon as an optical qubit to encode quantum information and to integrate the components necessary for that process into a photonic integrated circuit (PIC). Galan Moody, an assistant professor in the UC Santa Barbara College of Engineering’s Department of Electrical and Computer Engineering (ECE), has received a Defense University Research Instrumentation Program (DURIP) Award from the U.S. Department of Defense and the Air Force Office of Scientific Research to build a quantum photonic computing testbed. He will conduct his research in a lab set aside for such activity in recently completed Henley Hall, the new home of the College of Engineering’s Institute for Energy Efficiency (IEE).

    The grant supports the development or acquisition of new instrumentation to be used in fundamental and applied research across all areas of science and engineering. “My field is quantum photonics, so we’re working to develop new types of quantum light sources and ways to manipulate and detect quantum states of light for use in such applications as quantum photonic computing and quantum communications,” Moody said.

    “At a high level,” he explained, the concept of quantum photonic computing is “exactly the same as what Google is doing with superconducting qubits or what other companies are doing with trapped ions. There are a lot of different platforms for computing, and one of them is to use photonic integrated circuits to generate entangled photons, entanglement being the foundation for many different quantum applications.”

    To place an entire quantum photonics system onto a chip measuring about one square centimeter would be a tremendous achievement. Fortunately, the well-developed photonics infrastructure — including AIM Photonics, which has a center at UCSB led by ECE professor and photonics pioneer John Bowers, also director of the IEE — lends itself to that pursuit and to scaling up whatever quantum photonics platform is most promising. Photonics for classical applications is a mature technology industry that, Moody said, “has basically mastered large-scale and wafer-scale fabrication of devices.”

    It is reliable, so whatever Moody and his team design, they can fabricate themselves or even order from foundries, knowing they will get exactly what they want.

    The Photonic Edge

    The process of creating photonic qubits begins with generating high-quality single photons or pairs of entangled photons. A qubit can then be defined in several different ways, most often in the photon’s polarization (the orientation of the optical wave) or in the path that the photons travel. Moody and his team can create PICs that control these aspects of the photons, which become the carriers of quantum information and can be manipulated to perform logic operations.
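    A polarization-encoded qubit can be sketched with standard Jones calculus – this is textbook optics used for illustration, not a description of Moody's specific devices. A half-wave plate at 22.5° plays the role of a Hadamard-like gate on the polarization qubit:

    ```python
    import numpy as np

    # Polarization qubit: |H> = [1, 0], |V> = [0, 1].
    H = np.array([1.0, 0.0])

    def half_wave_plate(theta):
        """Jones matrix of a half-wave plate with its fast axis at angle theta."""
        c, s = np.cos(2 * theta), np.sin(2 * theta)
        return np.array([[c, s], [s, -c]])

    # A half-wave plate at 22.5 degrees maps |H> to the diagonal state (|H>+|V>)/sqrt(2),
    # the polarization analogue of a Hadamard gate: an equal superposition.
    hadamard_like = half_wave_plate(np.pi / 8)
    diag = hadamard_like @ H
    ```

    The same bookkeeping carries over to path encoding, where a beamsplitter takes the place of the wave plate; in both cases the photonic circuit is just a sequence of small unitary matrices acting on the qubit state.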

    The approach has several advantages over other methods of creating qubits. For instance, the aforementioned environmental effects that can cause qubits to lose their coherence do not affect coherence in photons, which, Moody says, “can maintain that entanglement for a very long time. The challenge is not coherence but, rather, getting the photons to become entangled in the first place.”

    “That,” Moody notes, “is because photons don’t naturally interact; rather, they pass right through each other and go their separate ways. But they have to interact in some way to create an entangled state. We’re working on how to create PIC-based quantum light sources that produce high-quality photons as efficiently as possible and then how to get all the photons to interact in a way that allows us to build a scalable quantum processor or new devices for long-distance quantum communications.”

    Quantum computers are extraordinarily efficient, and the photonics approach promises to be even more so. When Google “demonstrated quantum supremacy” in fall 2019 using the quantum computer built in its Goleta laboratory under the leadership of UCSB physics professor John Martinis, the company claimed that its machine, named Sycamore, could do a series of test calculations in 200 seconds that a supercomputer would need closer to 10,000 years to complete. Recently, a Chinese team using a laboratory-scale tabletop experiment claimed that, with a photon-based quantum processor, “You could do in two hundred seconds what would take a supercomputer 2.5 billion years to accomplish,” Moody said.

    Another advantage is that photonics is naturally scalable to thousands and, eventually, millions of components, which can be done by leveraging the wafer-scale fabrication technologies developed for classical photonics. Today, the most advanced PICs comprise nearly five thousand components and could be expanded by a factor of two or four with existing fabrication technologies, a stage of development comparable to that of digital electronics in the 1960s and 1970s. “Even a few hundred components are enough to perform important quantum computing operations with light, at least on a small scale between a few qubits,” said Moody. With further development, quantum photonic chips can be scaled to tens or hundreds of qubits using the existing photonics infrastructure.

    Moody’s team is developing a new materials platform, based on gallium arsenide and silicon dioxide, to generate single and entangled photons, and it promises to be much more efficient than comparable systems. In fact, they have a forthcoming paper showing that their new quantum light source is nearly a thousand times more efficient than any other on-chip light source.

    In terms of the process, Moody says, “At the macro level, we work on making better light sources and integrating many of them onto a chip. Then, we combine these with on-chip programmable processors, analogous to electronic transistors used for classical logic operations, and with arrays of single-photon detectors to try to implement quantum logic operations with photons as efficiently as possible.”

    For more accessible applications, like communications, no computing need occur. “It involves taking a great light source and manipulating a property of the photon states (such as polarization), then sending those off to some other chip that’s up in a satellite or in some other part of the world, which can measure the photons and send a signal back that you can collect,” Moody said.

    One catch, for now, is that the single-photon detectors, which are used to signal whether the logic operations were performed, work with very high efficiency when they are on the chip; however, some of them work only if the chip is cooled to cryogenic temperatures.

    “If we want to integrate everything on chip and put detectors on chip as well, then we’re going to need to cool the whole thing down,” Moody said. “We’re going to build a setup to be able to do that and test the various quantum photonic components designed and fabricated for this. The DURIP award enables exactly this: developing the instrumentation to be able to test large-scale quantum photonic chips from cryogenic temperatures all the way up to room temperature.”

    There are also challenges associated with cooling the chip to cryogenic temperatures. Said Moody, “It’s getting this whole platform up and running, interfacing the instrumentation, and making all the custom parts we need to be able to look at large-scale photonic chips for quantum applications at cryogenic temperatures.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition
    The University of California, Santa Barbara (commonly referred to as UC Santa Barbara or UCSB) is a public research university and one of the 10 general campuses of the University of California system. Founded in 1891 as an independent teachers’ college, UCSB joined the University of California system in 1944 and is the third-oldest general-education campus in the system. The university is a comprehensive doctoral university and is organized into five colleges offering 87 undergraduate degrees and 55 graduate degrees. In 2012, UCSB was ranked 41st among “National Universities” and 10th among public universities by U.S. News & World Report. UCSB houses twelve national research centers, including the renowned Kavli Institute for Theoretical Physics.

     
  • richardmitnick 9:44 pm on February 9, 2021 Permalink | Reply
    Tags: A Monte Carlo simulation technique, , D-Wave processors are now being used to simulate magnetic systems of practical interest., , , Fractional magnetization plateaus, Magnetic structure, , Quantum annealing-a form of quantum computing, , Quantum Science Center-a DOE Quantum Information Science Research Center established at ORNL in 2020., Qubits, Shastry-Sutherland Ising model, , Their novel simulations will serve as a foundation to streamline future efforts on next-generation quantum computers.   

    From DOE’s Oak Ridge National Laboratory: “Quantum computing enables simulations to unravel mysteries of magnetic materials” 

    From DOE’s Oak Ridge National Laboratory

    February 9, 2021
    Scott S Jones
    jonesg@ornl.gov
    865.241.6491

    1
    The researchers embedded a programmable model into a D-Wave quantum computer chip. Credit: D-Wave.

    A multi-institutional team became the first to generate accurate results from materials science simulations on a quantum computer that can be verified with neutron scattering experiments and other practical techniques.

    Researchers from the Department of Energy’s Oak Ridge National Laboratory; the University of Tennessee, Knoxville; Purdue University and D-Wave Systems harnessed the power of quantum annealing, a form of quantum computing, by embedding an existing model into a quantum computer.

    Characterizing materials has long been a hallmark of classical supercomputers, which encode information using a binary system of bits that are each assigned a value of either 0 or 1. But quantum computers — in this case, D-Wave’s 2000Q – rely on qubits, which can be valued at 0, 1 or both simultaneously because of a quantum mechanical capability known as superposition.

    “The underlying method behind solving materials science problems on quantum computers had already been developed, but it was all theoretical,” said Paul Kairys, a student at UT Knoxville’s Bredesen Center for Interdisciplinary Research and Graduate Education who led ORNL’s contributions to the project. “We developed new solutions to enable materials simulations on real-world quantum devices.”

    This unique approach proved that quantum resources are capable of studying the magnetic structure and properties of these materials, which could lead to a better understanding of spin liquids, spin ices and other novel phases of matter useful for data storage and spintronics applications. The researchers published the results of their simulations — which matched theoretical predictions and strongly resembled experimental data — in PRX Quantum.

    Eventually, the power and robustness of quantum computers could enable these systems to outperform their classical counterparts in terms of both accuracy and complexity, providing precise answers to materials science questions instead of approximations. However, quantum hardware limitations previously made such studies difficult or impossible to complete.

    To overcome these limitations, the researchers programmed various parameters into the Shastry-Sutherland Ising model. Because it shares striking similarities with the rare earth tetraborides, a class of magnetic materials, subsequent simulations using this model could provide substantial insights into the behavior of these tangible substances.

    “We are encouraged that the novel quantum annealing platform can directly help us understand materials with complicated magnetic phases, even those that have multiple defects,” said co-corresponding author Arnab Banerjee, an assistant professor at Purdue. “This capability will help us make sense of real material data from a variety of neutron scattering, magnetic susceptibility and heat capacity experiments, which can be very difficult otherwise.”

    2
    Using the D-Wave chip (foreground), the team simulated the experimental signature of a sample material(background), producing results that are directly comparable to the output from real-world experiments. Credit: Paul Kairys/UT Knoxville.

    Magnetic materials can be described in terms of magnetic particles called spins. Each spin has a preferred orientation based on the behavior of its neighboring spins, but rare earth tetraborides are frustrated, meaning these orientations are incompatible with each other. As a result, the spins are forced to compromise on a collective configuration, leading to exotic behavior such as fractional magnetization plateaus. This peculiar behavior occurs when an applied magnetic field, which normally causes all spins to point in one direction, affects only some spins in the usual way while others point in the opposite direction instead.

    Using a Monte Carlo simulation technique powered by the quantum evolution of the Ising model, the team evaluated this phenomenon in microscopic detail.
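    To get a feel for how Monte Carlo sampling probes a magnetic model, here is a minimal classical Metropolis simulation of a square-lattice Ising ferromagnet. This is deliberately simpler than the team's setup: it is classical rather than quantum annealing, and a plain square lattice rather than the frustrated Shastry-Sutherland geometry; the lattice size, coupling, temperature, and step count are arbitrary choices for the sketch. The `% N` wrap-around implements periodic boundaries, in the spirit of making a finite lattice look effectively infinite:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    N, J, T, steps = 16, 1.0, 1.5, 100000   # lattice size, coupling, temperature, MC steps

    spins = rng.choice([-1, 1], size=(N, N))

    def site_energy(s, i, j):
        # Periodic boundaries (% N): every site sees four neighbours, with no edges.
        nb = s[(i + 1) % N, j] + s[(i - 1) % N, j] + s[i, (j + 1) % N] + s[i, (j - 1) % N]
        return -J * s[i, j] * nb

    for _ in range(steps):
        i, j = rng.integers(N, size=2)
        dE = -2 * site_energy(spins, i, j)      # energy change if spin (i, j) flips
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            spins[i, j] *= -1                   # Metropolis acceptance rule
    ```

    Below the critical temperature (about 2.27 J for this lattice) the spins tend to align, driving the average magnetization away from zero; a frustrated model like Shastry-Sutherland instead settles into the compromise configurations that produce fractional magnetization plateaus.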

    “We came up with new ways to represent the boundaries, or edges, of the material to trick the quantum computer into thinking that the material was effectively infinite, and that turned out to be crucial for correctly answering materials science questions,” said co-corresponding author Travis Humble. Humble is an ORNL researcher and deputy director of the Quantum Science Center, or QSC, a DOE Quantum Information Science Research Center established at ORNL in 2020. The individuals and institutions involved in this research are QSC members.

    Quantum resources have previously simulated small molecules to examine chemical or material systems. Yet, studying magnetic materials that contain thousands of atoms is possible because of the size and versatility of D-Wave’s quantum device.

    “D-Wave processors are now being used to simulate magnetic systems of practical interest, resembling real compounds. This is a big deal and takes us from the notepad to the lab,” said Andrew King, director of performance research at D-Wave. “The ultimate goal is to study phenomena that are intractable for classical computing and outside the reach of known experimental methods.”

    The researchers anticipate that their novel simulations will serve as a foundation to streamline future efforts on next-generation quantum computers. In the meantime, they plan to conduct related research through the QSC, from testing different models and materials to performing experimental measurements to validate the results.

    “We completed the largest simulation possible for this model on the largest quantum computer available at the time, and the results demonstrated the significant promise of using these techniques for materials science studies going forward,” Kairys said.

    This work was funded by the DOE Office of Science Early Career Research Program. Access to the D-Wave 2000Q system was provided through the Quantum Computing User Program managed by the Oak Ridge Leadership Computing Facility, a DOE Office of Science user facility located at ORNL.

    Research performed at ORNL’s Spallation Neutron Source, also a DOE Office of Science user facility located at ORNL, was supported by the DOE Office of Science.

    ORNL Spallation Neutron Source.


    ORNL Spallation Neutron Source annotated.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    ORNL is managed by UT-Battelle for the Department of Energy’s Office of Science. DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.


     
  • richardmitnick 1:35 pm on February 5, 2021 Permalink | Reply
    Tags: "Quantum systems learn joint computing", , MPG Institute for Quantum Optics [Max Planck Institut für Quantenoptik] (DE), MPQ researchers realize the first quantum-logic computer operation between two separate quantum modules in different laboratories., , , , Qubits, They realized a quantum-logic gate .   

    From MPG Institute for Quantum Optics [Max Planck Institut für Quantenoptik] (DE): “Quantum systems learn joint computing” 

    Max Planck Institut für Quantenoptik (DE)

    From MPG Institute for Quantum Optics [Max Planck Institut für Quantenoptik] (DE)

    February 05, 2021

    Contacts

    Severin Daiss
    PhD Candidate
    +49 89 32905-670
    +49 89 32905-372
    severin.daiss@mpq.mpg.de

    Gerhard Rempe
    Director
    +49 89 32905-701
    Gerhard.Rempe@mpq.mpg.de

    Katharina Jarrah
    PR and Communications
    +49 89 32905-213
    katharina.jarrah@mpq.mpg.de

    MPQ researchers realize the first quantum-logic computer operation between two separate quantum modules in different laboratories.

    Today’s quantum computers contain up to several dozen memory and processing units, the so-called qubits. Severin Daiss, Stefan Langenfeld, and colleagues from the Max Planck Institute of Quantum Optics in Garching have successfully interconnected two such qubits, located in different labs, into a distributed quantum computer by linking them with a 60-meter-long optical fiber. Over this distance they realized a quantum-logic gate – the basic building block of a quantum computer. This makes the system the world’s first prototype of a distributed quantum computer.

    1
    The two qubit modules (red atom between two blue mirrors) that have been interconnected to implement a basic quantum computation (depicted as light blue symbol) over a distance of 60 meters. The modules reside in different laboratories of the same building and are connected by an optical fiber. The computation operation is mediated by a single photon (flying red sphere) that interacts successively with the two modules. Credit: Stephan Welte/Severin Daiss, MPQ.

    The limitations of previous qubit architectures

    Quantum computers are considerably different from traditional “binary” computers: Future realizations of them are expected to easily perform specific calculations for which traditional computers would take months or even years – for example in the field of data encryption and decryption. While the performance of binary computers results from large memories and fast computing cycles, the success of the quantum computer rests on the fact that one single memory unit – a quantum bit, also called “qubit” – can contain superpositions of different possible values at the same time. Therefore, a quantum computer does not only calculate one result at a time, but instead many possible results in parallel. The more qubits are interconnected in a quantum computer, the more complex the calculations it can perform.

    The basic computing operations of a quantum computer are quantum-logic gates between two qubits. Such an operation changes – depending on the initial state of the qubits – their quantum mechanical states. For a quantum computer to be superior to a normal computer for various calculations, it would have to reliably interconnect many dozens, or even thousands of qubits for equally thousands of quantum operations. Despite great successes, all current laboratories are still struggling to build such a large and reliable quantum computer, since every additionally required qubit makes it much harder to build a quantum computer in just one single set-up. The qubits are implemented, for instance, with single atoms, superconductive elements, or light particles, all of which need to be isolated perfectly from each other and the environment. The more qubits are arranged next to one another, the harder it is to both isolate and control them from outside at the same time.

    Data line and processing unit combined

    One way to overcome the technical difficulties in the construction of quantum computers is presented in a new study in the journal Science by Severin Daiss, Stefan Langenfeld, and colleagues from the research group of Gerhard Rempe at the Max Planck Institute of Quantum Optics in Garching. In this work, supported by the Institute of Photonic Sciences (Castelldefels, Spain), the team succeeded in connecting two qubit modules across a 60-meter distance in such a way that they effectively form a basic quantum computer with two qubits. “Across this distance, we perform a quantum computing operation between two independent qubit setups in different laboratories,” Daiss emphasizes. This opens up the possibility of merging smaller quantum computers into a joint processing unit.

    First author Severin Daiss in front of one part of the distributed quantum computer.

    Simply coupling distant qubits to generate entanglement between them has been achieved in the past, but now the connection can additionally be used for quantum computations. For this purpose, the researchers employed modules consisting of a single atom as a qubit, positioned between two mirrors. Between these modules they send a single light quantum, a photon, transported through the optical fiber. This photon becomes entangled with the quantum states of the qubits in the two modules. Subsequently, the state of one of the qubits is changed according to the measured state of the “ancilla photon”, realizing a quantum-mechanical CNOT operation with a fidelity of 80 percent. A next step would be to connect more than two modules and to host more qubits in the individual modules.

    Higher performance quantum computers through distributed computing

    Team leader and institute director Gerhard Rempe believes the result will allow the technology to advance further: “Our scheme opens up a new development path for distributed quantum computing.” It could, for instance, enable the construction of a distributed quantum computer consisting of many modules with few qubits each, interconnected with the newly introduced method. This approach could circumvent the limitation of existing quantum computers, which must integrate ever more qubits into a single setup, and could therefore allow more powerful systems.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition


    Research at the MPG Institute for Quantum Optics [Max Planck Institut für Quantenoptik ] (DE)
    Light can behave as an electromagnetic wave or a shower of particles that have no mass, called photons, depending on the conditions under which it is studied or used. Matter, on the other hand, is composed of particles, but it can actually exhibit wave-like properties, giving rise to many astonishing phenomena in the microcosm.

    At our institute we explore the interaction of light and quantum systems, exploiting the two extreme regimes of the wave-particle duality of light and matter. On the one hand we handle light at the single photon level where wave-interference phenomena differ from those of intense light beams. On the other hand, when cooling ensembles of massive particles down to extremely low temperatures we suddenly observe phenomena that go back to their wave-like nature. Furthermore, when dealing with ultrashort and highly intense light pulses comprising trillions of photons we can completely neglect the particle properties of light. We take advantage of the large force that the rapidly oscillating electromagnetic field exerts on electrons to steer their motion within molecules or accelerate them to relativistic energies.

     
  • richardmitnick 1:46 pm on January 30, 2021 Permalink | Reply
    Tags: "With superconducting qubits on the way to the quantum computer", , GeQCoS ('German Quantum Computer based on Superconducting Qubits'), Jülich Research Centre [Forschungszentrum Jülichs] (FZJ)(DE), Quantum computers hold the promise to efficiently solve problems that are intractable with conventional computers., Qubits, The processor performance will eventually be demonstrated using a specifically developed quantum algorithm at theWalther-Meißner-Institute.   

    From Jülich Research Centre [Forschungszentrum Jülichs] (FZJ)(DE): “With superconducting qubits on the way to the quantum computer” 

    From Jülich Research Centre [Forschungszentrum Jülichs] (FZJ)(DE)

    29 January 2021

    Prof. Dr. Frank Wilhelm-Mauch
    Peter Grünberg Institute, Quantum Computing Analytics (PGI-12)
    Tel.: +49 2461 61-6106
    f.wilhelm-mauch@fz-juelich.de

    Building a quantum processor with novel properties based on superconducting qubits – this is the aim of the four-year project GeQCoS (‘German Quantum Computer based on Superconducting Qubits’), funded by the BMBF.

    Visualisation of a quantum processor. Copyright: Christoph Hohmann.

    In this joint project, Germany’s leading scientists in the field of superconducting quantum circuits have teamed up to develop innovative concepts for the construction of an improved quantum processor. They aim to realize a quantum processor of improved quality based on new materials and manufacturing methods from the Karlsruhe Institute of Technology (KIT), tailor-made theoretical concepts from Friedrich-Alexander University Erlangen-Nürnberg (FAU), optimized control methods from Forschungszentrum Jülich (FZJ), and concepts for new architectures with higher connectivity from the Walther-Meißner-Institute (WMI – Bavarian Academy of Sciences and Technical University of Munich). To achieve this goal, semiconductor manufacturer Infineon will develop scalable manufacturing processes, while the Fraunhofer Institute for Applied Solid State Physics (IAF) in Freiburg is advancing the development of optimized chip packages. The processor performance will eventually be demonstrated at the WMI using a specifically developed quantum algorithm.

    Improved technology for more powerful quantum computers

    Quantum computers hold the promise of efficiently solving problems that are intractable with conventional computers. This includes, for example, calculating the properties of complex molecules for the chemical and pharmaceutical industries, as well as solving optimization tasks, e.g. for manufacturing processes in the automotive industry or for calculations in the financial world. Already today, quantum computers have demonstrated their basic functionality by mastering small, specific problems. The long-term goal of a quantum computer that calculates exponentially faster than a classical computer, however, still lies in the future. A suitable architecture for calculating practical problems can only be realized through fundamental improvements in both the hardware and the software.

    Within the GeQCoS project, a quantum processor prototype is to be developed that consists of a few superconducting qubits with fundamentally improved components. In this technology the basic building blocks of a quantum computer, the quantum bits or qubits, are implemented by means of currents flowing without resistance in superconducting circuits. These currents are relatively robust to external interference and can retain their quantum properties over relatively long time scales. Together with reliable and scalable manufacturing methods, this has resulted in one of the leading quantum technologies that is already successfully used to build the first quantum processors.

    The planned improvements concern, on the one hand, the qubit connectivity – the number of connections between the individual qubits – and, on the other, the quality of the qubits, enhancing the capability to quickly and efficiently produce the desired quantum states. “By using new types of materials, we expect better reproducibility and a higher quality of the qubits,” says Prof. Ioan Pop (KIT). “We will also improve the manufacturing methods in order to avoid imperfections that affect the quality of the qubits,” adds Prof. Alexey Ustinov (KIT).

    The researchers pay special attention to the interplay between hardware and software, developing algorithms that are ideally matched to the hardware, i.e. the type of qubits and operations as well as the existing connections between the qubits. “This is the only way to make optimal use of the hardware resources available now and in the near future,” says Prof. Hartmann (FAU). “In particular, we will also develop more efficient and precise methods for characterizing the qubits and modeling the overall system,” adds Prof. Wilhelm-Mauch, who recently moved to Forschungszentrum Jülich and is working there with Prof. DiVincenzo and Dr. Bushev on setting up a quantum computing center.

    Ultimately, however, it is also important to lay the foundations for the rapid industrialization and commercialization of quantum technology. This includes reproducible production of scalable quantum circuits according to industrial standards. “With its many years of experience in the manufacture of special semiconductor chips, Infineon can make a significant contribution to improving superconducting circuits. To achieve this goal, we can also draw on our quantum technology expertise in the field of ion traps, a second very promising quantum computer platform,” says Sebastian Luber from Infineon. In order to control the highly sensitive quantum circuits accurately while at the same time shielding them from the environment, optimized processor housings are being developed in the project. “Scaling to a large number of qubits and operating them at low temperatures also poses great challenges to the packaging technology. Here, however, we can very well adapt the existing tools from traditional fields and apply them to quantum technologies,” says Sébastien Chartier (IAF).

    A nucleus for future quantum computer development

    The technologies developed within GeQCoS will not only lead to new scientific knowledge, but also strengthen the quantum ecosystem in Germany and Europe through close links with companies. A specific goal is to make the quantum processor available to first-time users both on the hardware and on the software level as early as possible. Thanks to numerous companies with strong research and development departments, Germany is in an ideal starting position to become a leading center for users and beneficiaries of quantum computing. With access to the processor developed in the project, companies in the quantum technology sector should be strengthened and new start-ups should be promoted.

    In addition, the project may serve as the nucleus of the current federal initiative to build a quantum computer “made in Germany”. The close association between science and industry is a clear commitment to the promotion of technology transfer and to the establishment of a Germany-wide network based on superconducting qubits. The orientation of the project at the interface between engineering, computer science and physics takes into account the interdisciplinary nature of the field of quantum information processing and serves as an important component of the German technology landscape for the training and further education of highly qualified scientists.

    GeQCoS “German Quantum Computer based on Superconducting Qubits”

    Funded by the German Federal Ministry of Education and Research
    Funding program quantum technologies – from basic research to market
    Contract number: 13N15685

    Project partners:

    The Walther Meißner Institute (WMI) of the Bavarian Academy of Sciences has been doing pioneering work in the field of quantum sciences and quantum technologies (QWT) with superconducting circuits in close collaboration with the Technical University of Munich for almost 20 years and is involved in a large number of quantum initiatives in the Munich area in a leading role.

    Forschungszentrum Jülich (FZJ) addresses quantum computing in quantum materials, quantum computing devices and with the quantum computing user facility JUNIQ. It covers both fundamental research and applications in quantum computing. It also hosts the central laboratory of the European flagship project OpenSuperQ.

    At the Karlsruhe Institute of Technology (KIT), experimental pioneering work on multiplexed qubit readout, two-level defects, quantum simulators and quantum metamaterials has been carried out and the development of quantum circuits has been advanced.

    The University of Erlangen-Nuremberg (FAU) is one of the most innovative universities in the world. In the group of Prof. Hartmann, besides the development of coupling circuits and qubits, the development of algorithms for near-term quantum computers is being advanced.

    Infineon Technologies AG is a leading global provider of semiconductor solutions with one of the broadest product portfolios in the industry. The company has a high level of expertise in the conception, design and manufacturing of special technologies and is involved in several consortia on quantum technologies, including PIEDMONS on ion trap-based and QUASAR on silicon-based quantum computers.

    The Fraunhofer Institute for Applied Solid State Physics IAF offers the entire value chain in the field of III/V semiconductors and has many years of experience in the realization of microwave and submillimeter wave modules both in waveguides and on printed circuit boards. In the field of quantum computing, Fraunhofer IAF participates, for example, in the EU project “SEQUENCE” (development of cryogenic electronics) and coordinates the Competence Center Quantum Computing Baden-Württemberg.

    See the full article here.


    Jülich Research Centre [Forschungszentrum Jülich] is a member of the Helmholtz Association of German Research Centres [Helmholtz-Gemeinschaft Deutscher Forschungszentren] (DE) and is one of the largest interdisciplinary research centres in Europe. It was founded on 11 December 1956 by the state of North Rhine-Westphalia as a registered association, before it became “Kernforschungsanlage Jülich GmbH” or Nuclear Research Centre Jülich in 1967. In 1990, the name of the association was changed to “Forschungszentrum Jülich GmbH”. It has close collaborations with RWTH Aachen in the form of the Jülich-Aachen Research Alliance (JARA).

    Forschungszentrum Jülich is situated in the middle of the Stetternich Forest in Jülich (Kreis Düren, Rheinland) and covers an area of 2.2 square kilometres.

    Forschungszentrum Jülich employs more than 5,700 members of staff (2015) and works within the framework of the disciplines physics, chemistry, biology, medicine and engineering on the basic principles and applications in the areas of health, information, environment and energy. Amongst the members of staff, there are approx. 1,500 scientists including 400 PhD students and 130 diploma students. Around 600 people work in the administration and service areas, 500 work for project management agencies, and there are 1,600 technical staff members, while around 330 trainees are completing their training in more than 20 professions.

    More than 800 visiting scientists come to Forschungszentrum Jülich every year from about 50 countries.

     
  • richardmitnick 9:07 am on December 31, 2020 Permalink | Reply
    Tags: "Opinion-The unhackable computers that could revolutionize the future", , , , , , , Qubits   

    From CNN: “Opinion-The unhackable computers that could revolutionize the future” 

    From CNN

    December 29, 2020

    Don Lincoln, Fermi National Accelerator Laboratory.

    Modern science often makes some unbelievable claims, but the discipline that seems to be the most outrageous may well be quantum physics. Commonly mentioned quantum ideas include cats simultaneously being alive and dead until someone looks at them and multiple worlds where not only are all things possible, but all things actually happen — at least in a parallel reality.

    With such mind-bending predictions, it is entirely reasonable to imagine that no useful technology could arise from quantum physics. However, this is not true. Physicists, engineers and computer scientists are trying to harness the counterintuitive behavior of quantum mechanics to build quantum computers, leading eventually to a quantum internet.
    And this effort isn’t just an abstract goal of academics; it has been identified by the US government as an important national initiative.

    Federal agencies have begun to lay out the framework for an American quantum infrastructure. The Department of Energy, for example, plans to link together its laboratories with a quantum internet.

    A quantum internet is both similar to and different from the ordinary internet. It is similar in that it connects computers, although only quantum ones.

    Google 54-qubit Sycamore superconducting processor quantum computer.

    IBM iconic image of Quantum computer.

    It is different because the way these computers interact is essentially unhackable.
    Any attempt to intercept a message tells the intended recipient that someone read it before it was delivered. To comply with the new initiative, the country’s universities and national laboratories have begun to develop the capabilities necessary to make a successful quantum internet.

    One such advance was the successful transmission of quantum information announced this month by a consortium of universities, national labs and private industry.

    This is an important step toward building a quantum internet. (Disclosure: One of the affiliated laboratories is Fermi National Accelerator Laboratory, the facility at which I am a scientist, although I am not involved with this particular achievement.)

    Quantum computing differs from ordinary computing in many ways. First, ordinary computers are built around the concept of the bit, which is effectively a switch that can be flipped on or off — what computer professionals call a 1 or 0. In contrast, quantum computers use the qubit, short for “quantum bit.”

    Qubits are somewhat like ordinary bits, in that they are measured as 0s and 1s, but in between measurements, they are an indeterminate mix of 0s and 1s. This bit of quantum magic is exactly the same (and just as confusing) as Schrodinger’s cat.

    In 1935, Austrian physicist Erwin Schrodinger devised a thought experiment to illustrate the absurdity of a quantum theory called the Copenhagen interpretation of quantum mechanics [Stanford Encyclopedia of Philosophy]. The Copenhagen interpretation, named for the city in which it was invented, said that a quantum system could simultaneously be two opposite things until a measurement was performed.

    An example of a quantum system is a radioactive atom which, according to the Copenhagen interpretation, is both decayed and not decayed until someone measures it. Schrodinger imagined some radioactive material in a sealed box which included a radiation detector, a hammer, a vial of poisonous gas and a cat. If an atom of the radioactive substance decayed, the detector would record it and release a hammer to break the vial of poison, which would, in turn, kill the cat.

    According to the laws of quantum mechanics, until the box is opened, the atom is simultaneously both decayed and not decayed, meaning the cat was both alive and dead. Schrodinger felt this was absurd and claimed that his thought experiment invalidated the Copenhagen interpretation of quantum mechanics.

    Yet the idea that quantum mechanics allows an object to simultaneously be in two opposite configurations is actually true. A qubit in a quantum computer is both a 0 and a 1 in a single moment. That sounds impossible, but it’s one of the things that distinguishes the quantum world from our familiar one. Subatomic particles like electrons can be in two places at once or can simultaneously spin in opposite directions. It’s these opposite spins that make up qubits. A clockwise-spinning electron is a 0 and a counterclockwise one is a 1 (or vice versa).

    The first working quantum computer was demonstrated in 1998. It was very primitive, but it was a baby step. Quantum computing has strengths and weaknesses. For most problems, a quantum computer isn’t really faster than high-end ordinary computers. However, for certain problems — like code breaking — quantum computing leaves regular computers in the dust.

    When advanced quantum computers are a reality, they will be able to break codes incomparably faster than currently possible. What would take trillions of years using ordinary computers would take a few seconds with a quantum computer. For example, Google has announced an algorithm that runs a hundred million times faster on quantum computers than ordinary ones.

    What’s more, quantum computers not only excel at decryption; they also excel at encryption. Quantum algorithms have been developed that are thought to be unbreakable. It is these cryptographic capabilities that interest both nations and corporations involved in e-commerce.
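
    The best-known quantum-secured encryption scheme is quantum key distribution. The following toy sketch of the BB84 basis-sifting step (an illustration of the general idea, not something described in this article, and omitting noise and eavesdroppers) shows how two parties end up with a shared secret key:

```python
import random

# Toy BB84 sifting: assumes a perfect, noiseless channel and no
# eavesdropper; real QKD needs true single photons and an
# authenticated classical channel.
random.seed(7)
n = 16
alice_bits  = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.choice("+x") for _ in range(n)]
bob_bases   = [random.choice("+x") for _ in range(n)]

# If Bob measures in the same basis Alice used, he recovers her bit;
# in the wrong basis his outcome is random and the round is discarded.
bob_bits = [bit if ab == bb else random.randint(0, 1)
            for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

keep = [i for i in range(n) if alice_bases[i] == bob_bases[i]]
alice_key = [alice_bits[i] for i in keep]
bob_key   = [bob_bits[i] for i in keep]
assert alice_key == bob_key   # shared secret key after sifting
print(alice_key)
```

    The security comes from physics rather than math: an eavesdropper who measures the photons in flight disturbs their states, and the resulting errors reveal the intrusion during basis comparison.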

    Of course, perfect quantum computers are not yet available, and may never be. Unlike ordinary computers, in which it is easy to tell if a bit is on or off, in quantum computers, the qubits are very sensitive to their environment, especially heat. Vibrations of the atoms of the computer can destroy the information stored in qubits, requiring that quantum computers be kept at very low temperatures.

    While many institutions are developing quantum computers, making a quantum internet requires a way to transfer the information between computers. This is accomplished by a phenomenon called quantum teleportation [INQNET], in which two atoms separated by large distances are made to act as if they are identical.
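
    The teleportation protocol itself can be checked with a small state-vector simulation. This is the textbook circuit (my sketch, not the consortium’s setup): it verifies that the receiving qubit ends up in the sender’s original state for every possible measurement outcome.

```python
import numpy as np

I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Qubit 0 holds the unknown state; qubits 1 and 2 share a Bell pair
# (qubit 2 plays the role of the distant receiver).
a, b = 0.6, 0.8                          # any normalized amplitudes
psi = np.array([a, b])
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
state = np.kron(psi, bell)               # 3-qubit state, 8 amplitudes

# The sender entangles her two qubits, then rotates the first one.
state = np.kron(CNOT, I2) @ state        # CNOT: control 0, target 1
state = np.kron(H, np.eye(4)) @ state    # Hadamard on qubit 0

# Whatever two bits (m0, m1) the sender measures, the receiver's
# qubit equals the original state after an X/Z correction.
for m0 in (0, 1):
    for m1 in (0, 1):
        idx = 4 * m0 + 2 * m1
        bob = state[idx:idx + 2] * 2     # each branch has probability 1/4
        correction = (np.linalg.matrix_power(Z, m0)
                      @ np.linalg.matrix_power(X, m1))
        assert np.allclose(correction @ bob, psi)
print("all four measurement branches reproduce the input state")
```

    Note that only two classical bits travel from sender to receiver; the quantum state is never copied, consistent with the no-cloning theorem.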

    A recent advance by the IN-Q-NET consortium [PRX Quantum], led by Caltech and with many institutional collaborators, has successfully demonstrated long distance quantum teleportation at two test beds, one located at Caltech, and the other at Fermilab, near Chicago. This achievement used commercially available equipment and is an important step in developing a quantum internet.

    It is still very early in the history of quantum computing and it is unclear exactly where it is going. Its proponents are very enthusiastic about its future, while others (including myself) view it cautiously. However, there is no question that its codemaking and -breaking capabilities make it an interesting prospect in the landscape of online protection and hacking.

    Where will quantum computing be in a decade? It is hard to say. But we have a long history of impressive scientific feats to make us optimistic. In 1783, when Benjamin Franklin viewed the first balloon flight, he was asked what good it was. He replied with the famous quip, “What good is a newborn baby?” And today we fly around the world and are conquering space.
    Quantum computing is still in its infancy, but one day it may change the world. We must wait and see.

    See the full article here.


     