Tagged: Quantum entanglement

  • richardmitnick 4:49 pm on August 5, 2021
    Tags: "NIST’s Quantum Crystal Could Be a New Dark Matter Sensor", , Experiments searching for this type of dark matter have been ongoing for more than a decade with superconducting circuits., Ion crystals could detect certain types of dark matter — examples are axions and hidden photons — that interact with normal matter through a weak electric field., , , , Quantum entanglement, Quantum sensors such as this have the potential to detect signals from dark matter., The ions self-arrange into a flat 2D crystal just 200 millionths of a meter in diameter., The motion of trapped ions provides sensitivity over a different range of frequencies., The quantum sensor consists of 150 beryllium ions (electrically charged atoms) confined in a magnetic field.   

    From National Institute of Standards and Technology (US): “NIST’s Quantum Crystal Could Be a New Dark Matter Sensor”

    From National Institute of Standards and Technology (US)

    August 05, 2021
    Media Contact
    Laura Ost
    laura.ost@nist.gov
    (303) 497-4880

    NIST physicists John Bollinger (left) and Matt Affolter adjust the laser and optics array used to trap and probe beryllium ions in the large magnetic chamber (white pillar at left). The ion crystal may help detect mysterious dark matter. Credit: R. Jacobson/NIST

    Illustration of NIST quantum sensor made of trapped beryllium ions (red dots) self-arranged into a 2D crystal. Credit: S. Burrows/JILA [Joint Institute for Laboratory Astrophysics]

    Physicists at the National Institute of Standards and Technology (NIST) have linked together, or “entangled,” the mechanical motion and electronic properties of a tiny blue crystal, giving it a quantum edge in measuring electric fields with record sensitivity that may enhance understanding of the universe.

    The quantum sensor consists of 150 beryllium ions (electrically charged atoms) confined in a magnetic field, so they self-arrange into a flat 2D crystal just 200 millionths of a meter in diameter. Quantum sensors such as this have the potential to detect signals from dark matter — a mysterious substance that might turn out to be, among other theories, subatomic particles that interact with normal matter through a weak electromagnetic field. The presence of dark matter could cause the crystal to wiggle in telltale ways, revealed by collective changes among the crystal’s ions in one of their electronic properties, known as spin.

    As described in the Aug. 6 issue of Science, researchers can measure the vibrational excitation of the crystal — the flat plane moving up and down like the head of a drum — by monitoring changes in the collective spin. Measuring the spin indicates the extent of the vibrational excitation, referred to as displacement.

    This sensor can measure external electric fields that have the same vibration frequency as the crystal with more than 10 times the sensitivity of any previously demonstrated atomic sensor. (Technically, the sensor can measure 240 nanovolts per meter in one second.) In the experiments, researchers apply a weak electric field to excite and test the crystal sensor. A dark matter search would look for such a signal.

    “Ion crystals could detect certain types of dark matter — examples are axions and hidden photons — that interact with normal matter through a weak electric field,” NIST senior author John Bollinger said. “The dark matter forms a background signal with an oscillation frequency that depends on the mass of the dark matter particle. Experiments searching for this type of dark matter have been ongoing for more than a decade with superconducting circuits. The motion of trapped ions provides sensitivity over a different range of frequencies.”

    Bollinger’s group has been working with the ion crystal for more than a decade. What’s new is the use of a specific type of laser light to entangle the collective motion and spins of a large number of ions, plus what the researchers call a “time reversal” strategy to detect the results.

    The experiment benefited from a collaboration with NIST theorist Ana Maria Rey, who works at JILA [Joint Institute for Laboratory Astrophysics] University of Colorado (US)/National Institute of Standards and Technology (US). The theory work was critical for understanding the limits of the laboratory setup, offered a new model for understanding the experiment that is valid for large numbers of trapped ions, and demonstrated that the quantum advantage comes from entangling the spin and motion, Bollinger said.

    Rey noted that entanglement is beneficial in canceling the ions’ intrinsic quantum noise. However, measuring the entangled quantum state without destroying the information shared between spin and motion is difficult.

    “To avoid this issue, John is able to reverse the dynamics and disentangle the spin and the motion after the displacement is applied,” Rey said. “This time reversal decouples the spin and the motion, and now the collective spin itself has the displacement information stored on it, and when we measure the spins we can determine the displacement very precisely. This is neat!”

    The researchers used microwaves to produce desired values of the spins. Ions can be spin up (often envisioned as an arrow pointing up), spin down or other angles, including both at the same time, a special quantum state. In this experiment the ions all had the same spin — first spin up and then horizontal — so when excited they rotated together in a pattern characteristic of spinning tops.

    Crossed laser beams, with a difference in frequency that was nearly the same as the motion, were used to entangle the collective spin with the motion. The crystal was then vibrationally excited. The same lasers and microwaves were used to undo the entanglement. To determine how much the crystal moved, researchers measured the ions’ spin state via their level of fluorescence (spin up scatters light, spin down is dark).
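
    As a rough way to picture the readout, here is a toy numerical model only, not the NIST analysis: assume the entangle-displace-disentangle sequence maps a small displacement onto a rotation angle of the collective spin, and that each ion then fluoresces with a probability set by that angle. The coupling constant, test displacement and shot count below are invented for illustration.

        import numpy as np

        # Toy model: a displacement d is imprinted as a collective spin rotation
        # theta = K_COUPLING * d; each of the N ions then scatters light ("bright",
        # spin up) with probability p = (1 + cos(theta)) / 2.
        rng = np.random.default_rng(1)
        N_IONS = 150          # ions in the 2D crystal
        K_COUPLING = 1.0e9    # hypothetical radians of spin rotation per meter of displacement

        def estimate_displacement(d_true, shots=1000):
            """Estimate a displacement from simulated spin-dependent fluorescence."""
            theta = K_COUPLING * d_true
            p_bright = 0.5 * (1.0 + np.cos(theta))
            bright_counts = rng.binomial(N_IONS, p_bright, size=shots)  # bright ions per shot
            p_est = bright_counts.mean() / N_IONS
            theta_est = np.arccos(np.clip(2.0 * p_est - 1.0, -1.0, 1.0))
            return theta_est / K_COUPLING

        d_true = 1.0e-9  # an arbitrary 1-nanometer test displacement
        print(f"true: {d_true:.2e} m   estimated: {estimate_displacement(d_true):.2e} m")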

    In the future, increasing the number of ions to 100,000 by making 3D crystals is expected to improve the sensing capability thirtyfold. In addition, the stability of the crystal’s excited motion might be improved, which would enhance the time reversal process and the precision of the results.
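
    For a rough sense of that number: if the field sensitivity improves roughly as the square root of the ion count (a standard quantum-metrology scaling assumption on our part, not a statement from the paper), the gain from 150 to 100,000 ions comes out in the same ballpark:

        import math

        # Assumed sqrt(N) scaling of sensitivity with ion number N (our assumption):
        print(math.sqrt(100_000 / 150))  # ≈ 25.8, the same order as the quoted thirtyfold improvement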

    “If we are able to improve this aspect, this experiment can become a fundamental resource for detecting dark matter,” Rey said. “We know 85% of the matter in the universe is made of dark matter, but to date we do not know what dark matter is made of. This experiment could allow us in the future to unveil this mystery.”

    Co-authors included researchers from the University of Oklahoma. This work is supported in part by the Department of Energy (US), Air Force Office of Scientific Research (US), Defense Advanced Research Projects Agency (DARPA)(US), Army Research Office (US) and National Science Foundation (US).

    _____________________________________________________________________________________

    Dark Matter Background
    Fritz Zwicky discovered Dark Matter in the 1930s when observing the movement of the Coma Cluster. Vera Rubin, a woman in STEM denied the Nobel, did most of the work on Dark Matter some 30 years later.

    Fritz Zwicky from http://palomarskies.blogspot.com.


    Coma cluster via NASA/ESA Hubble.


    In modern times, it was astronomer Fritz Zwicky, in the 1930s, who made the first observations of what we now call dark matter. His 1933 observations of the Coma Cluster of galaxies seemed to indicate it had a mass 500 times greater than that previously calculated by Edwin Hubble. Furthermore, this extra mass seemed to be completely invisible. Although Zwicky’s observations were initially met with much skepticism, they were later confirmed by other groups of astronomers.
    Thirty years later, astronomer Vera Rubin provided a huge piece of evidence for the existence of dark matter. She discovered that stars in the outer regions of galaxies orbit just as fast as stars near the center, whereas, if most of a galaxy’s mass were concentrated in its bright central region, the outer stars should orbit more slowly, much as the outer planets of the Solar System move more slowly around the Sun than the inner ones. That is what the visible matter alone predicts, but it is not what we observe. The only way to explain the flat rotation curves is if the visible galaxy sits inside a much larger, unseen structure, so that the enclosed mass keeps growing with radius and the rotation speed stays roughly constant from center to edge.
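
    In textbook terms (a standard derivation, not taken from the article): for a star on a circular orbit of radius r around an enclosed mass M(<r), gravity supplies the centripetal force,

        \[
        \frac{v(r)^2}{r} = \frac{G\,M(<r)}{r^2}
        \quad\Longrightarrow\quad
        v(r) = \sqrt{\frac{G\,M(<r)}{r}} .
        \]

    If essentially all of the mass sat in the bright central region, M(<r) would be nearly constant at large r and v(r) would fall off as 1/sqrt(r). The flat rotation curves Rubin measured, with v(r) roughly constant, instead require M(<r) to grow in proportion to r, that is, unseen mass extending far beyond the visible stars.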
    Vera Rubin, following Zwicky, postulated that the missing structure in galaxies is dark matter. Her ideas were met with much resistance from the astronomical community, but her observations have been confirmed and are seen today as pivotal proof of the existence of dark matter.

    Astronomer Vera Rubin, who worked on Dark Matter, at the Lowell Observatory in 1965 (The Carnegie Institution for Science).


    Vera Rubin, who worked on Dark Matter, measuring spectra (Emilio Segre Visual Archives/AIP/SPL).


    Vera Rubin, with Department of Terrestrial Magnetism (DTM) image tube spectrograph attached to the Kitt Peak 84-inch telescope, 1970

    Dark Matter Research

    Inside the Axion Dark Matter eXperiment at the University of Washington (US). Credit: Mark Stone/University of Washington.
    ______________________________________________________________________________________________________________

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    NIST Campus, Gaithersburg, MD, USA

    National Institute of Standards and Technology (US)‘s Mission, Vision, Core Competencies, and Core Values

    Mission

    To promote U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology in ways that enhance economic security and improve our quality of life.

    NIST’s vision

    NIST will be the world’s leader in creating critical measurement solutions and promoting equitable standards. Our efforts stimulate innovation, foster industrial competitiveness, and improve the quality of life.

    NIST’s core competencies

    Measurement science
    Rigorous traceability
    Development and use of standards

    NIST’s core values

    NIST is an organization with strong values, reflected both in our history and our current work. NIST leadership and staff will uphold these values to ensure a high performing environment that is safe and respectful of all.

    Perseverance: We take the long view, planning the future with scientific knowledge and imagination to ensure continued impact and relevance for our stakeholders.
    Integrity: We are ethical, honest, independent, and provide an objective perspective.
    Inclusivity: We work collaboratively to harness the diversity of people and ideas, both inside and outside of NIST, to attain the best solutions to multidisciplinary challenges.
    Excellence: We apply rigor and critical thinking to achieve world-class results and continuous improvement in everything we do.

    Background

    The Articles of Confederation, ratified by the colonies in 1781, contained the clause, “The United States in Congress assembled shall also have the sole and exclusive right and power of regulating the alloy and value of coin struck by their own authority, or by that of the respective states—fixing the standards of weights and measures throughout the United States”. Article 1, section 8, of the Constitution of the United States (1789), transferred this power to Congress; “The Congress shall have power…To coin money, regulate the value thereof, and of foreign coin, and fix the standard of weights and measures”.

    In January 1790, President George Washington, in his first annual message to Congress, stated that, “Uniformity in the currency, weights, and measures of the United States is an object of great importance, and will, I am persuaded, be duly attended to”, and ordered Secretary of State Thomas Jefferson to prepare a plan for Establishing Uniformity in the Coinage, Weights, and Measures of the United States, afterwards referred to as the Jefferson report. On October 25, 1791, Washington appealed a third time to Congress, “A uniformity of the weights and measures of the country is among the important objects submitted to you by the Constitution and if it can be derived from a standard at once invariable and universal, must be no less honorable to the public council than conducive to the public convenience”, but it was not until 1838 that a uniform set of standards was worked out. In 1821, John Quincy Adams had declared “Weights and measures may be ranked among the necessities of life to every individual of human society”.

    From 1830 until 1901, the role of overseeing weights and measures was carried out by the Office of Standard Weights and Measures, which was part of the U.S. Coast and Geodetic Survey in the Department of the Treasury.

    Bureau of Standards

    In 1901, in response to a bill proposed by Congressman James H. Southard (R-Ohio), the National Bureau of Standards was founded with the mandate to provide standard weights and measures and to serve as the national physical laboratory for the United States. (Southard had previously sponsored a bill for metric conversion of the United States.)

    President Theodore Roosevelt appointed Samuel W. Stratton as the first director. The budget for the first year of operation was $40,000. The Bureau took custody of the copies of the kilogram and meter bars that were the standards for US measures, and set up a program to provide metrology services for United States scientific and commercial users. A laboratory site was constructed in Washington DC (US) and instruments were acquired from the national physical laboratories of Europe. In addition to weights and measures the Bureau developed instruments for electrical units and for measurement of light. In 1905 a meeting was called that would be the first National Conference on Weights and Measures.

    Initially conceived as purely a metrology agency, the Bureau of Standards was directed by Herbert Hoover to set up divisions to develop commercial standards for materials and products. Some of these standards were for products intended for government use, but product standards also affected private-sector consumption. Quality standards were developed for products including some types of clothing; automobile brake systems and headlamps; antifreeze; and electrical safety. During World War I, the Bureau worked on multiple problems related to war production, even operating its own facility to produce optical glass when European supplies were cut off. Between the wars, Harry Diamond of the Bureau developed a blind-approach radio aircraft landing system. During World War II, military research and development was carried out, including development of radio propagation forecast methods; the proximity fuze and the standardized airframe used originally for Project Pigeon; and, shortly afterwards, the autonomously radar-guided Bat anti-ship guided bomb and the Kingfisher family of torpedo-carrying missiles.

    In 1948, financed by the United States Air Force, the Bureau began design and construction of SEAC: the Standards Eastern Automatic Computer. The computer went into operation in May 1950 using a combination of vacuum tubes and solid-state diode logic. About the same time, the Standards Western Automatic Computer was built at the Los Angeles office of the NBS by Harry Huskey and used for research there. A mobile version, DYSEAC, was built for the Signal Corps in 1954.

    Due to a changing mission, the “National Bureau of Standards” became the “National Institute of Standards and Technology (US)” in 1988.

    Following September 11, 2001, NIST conducted the official investigation into the collapse of the World Trade Center buildings.

    Organization

    NIST is headquartered in Gaithersburg, Maryland, and operates a facility in Boulder, Colorado, which was dedicated by President Eisenhower in 1954. NIST’s activities are organized into laboratory programs and extramural programs. Effective October 1, 2010, NIST was realigned by reducing the number of NIST laboratory units from ten to six. NIST Laboratories include:

    Communications Technology Laboratory (CTL)
    Engineering Laboratory (EL)
    Information Technology Laboratory (ITL)
    Center for Neutron Research (NCNR)
    Material Measurement Laboratory (MML)
    Physical Measurement Laboratory (PML)

    Extramural programs include:

    Hollings Manufacturing Extension Partnership (MEP), a nationwide network of centers to assist small and mid-sized manufacturers to create and retain jobs, improve efficiencies, and minimize waste through process improvements and to increase market penetration with innovation and growth strategies;
    Technology Innovation Program (TIP), a grant program where NIST and industry partners cost share the early-stage development of innovative but high-risk technologies;
    Baldrige Performance Excellence Program, which administers the Malcolm Baldrige National Quality Award, the nation’s highest award for performance and business excellence.

    NIST’s Boulder laboratories are best known for NIST-F1, a cesium fountain atomic clock that serves as the source of the nation’s official time. From its measurement of the natural resonance frequency of cesium, which defines the second, NIST broadcasts time signals via longwave radio station WWVB near Fort Collins in Colorado, and shortwave radio stations WWV and WWVH, located near Fort Collins and Kekaha in Hawai’i, respectively.

    NIST also operates a neutron science user facility: the NIST Center for Neutron Research (NCNR). The NCNR provides scientists access to a variety of neutron scattering instruments which they use in many research fields (materials science; fuel cells; biotechnology etc.).

    The SURF III Synchrotron Ultraviolet Radiation Facility is a source of synchrotron radiation in continuous operation since 1961. SURF III now serves as the US national standard for source-based radiometry throughout the generalized optical spectrum. All NASA-borne extreme-ultraviolet observation instruments have been calibrated at SURF since the 1970s, and SURF is used for measurement and characterization of systems for extreme ultraviolet lithography.

    The Center for Nanoscale Science and Technology (CNST) performs research in nanotechnology, both through internal research efforts and by running a user-accessible cleanroom nanomanufacturing facility. This “NanoFab” is equipped with tools for lithographic patterning and imaging (e.g., electron microscopes and atomic force microscopes).

    Committees

    NIST has seven standing committees:

    Technical Guidelines Development Committee (TGDC)
    Advisory Committee on Earthquake Hazards Reduction (ACEHR)
    National Construction Safety Team Advisory Committee (NCST Advisory Committee)
    Information Security and Privacy Advisory Board (ISPAB)
    Visiting Committee on Advanced Technology (VCAT)
    Board of Overseers for the Malcolm Baldrige National Quality Award (MBNQA Board of Overseers)
    Manufacturing Extension Partnership National Advisory Board (MEPNAB)

    Measurements and standards

    As part of its mission, NIST supplies industry, academia, government, and other users with over 1,300 Standard Reference Materials (SRMs). These artifacts are certified as having specific characteristics or component content, used as calibration standards for measuring equipment and procedures, quality control benchmarks for industrial processes, and experimental control samples.

    Handbook 44

    NIST publishes Handbook 44 each year after the annual meeting of the National Conference on Weights and Measures (NCWM). Each edition is developed through cooperation of the Committee on Specifications and Tolerances of the NCWM and the Weights and Measures Division (WMD) of NIST. The purpose of the book is a partial fulfillment of the statutory responsibility for “cooperation with the states in securing uniformity of weights and measures laws and methods of inspection”.

    NIST has been publishing various forms of what is now the Handbook 44 since 1918 and began publication under the current name in 1949. The 2010 edition conforms to the concept of the primary use of the SI (metric) measurements recommended by the Omnibus Foreign Trade and Competitiveness Act of 1988.

     
  • richardmitnick 7:57 pm on August 3, 2021
    Tags: A tedious hours-long process has been cut down to seconds and LFET is the first scalable transport and on-demand assembly technology of its kind., LFET: low frequency electrothermoplasmonic tweezer, Quantum entanglement, Quantum photonics applications, The scientists set out to make trapping and manipulating nanodiamonds simpler by using an interdisciplinary approach., The tweezer-a low frequency electrothermoplasmonic tweezer (LFET)-combines a fraction of a laser beam with a low-frequency alternating current electric field., This is an entirely new mechanism to trap and move nanodiamonds.

    From Vanderbilt University (US): “Research Snapshot: Vanderbilt engineer the first to introduce low-power dynamic manipulation of single nanoscale quantum objects”


    From Vanderbilt University (US)

    Jul. 30, 2021
    Marissa Shapiro

    Low frequency electrothermoplasmonic tweezer device rendering. (Ndukaife.)

    THE IDEA

    Led by Justus Ndukaife, assistant professor of electrical engineering, Vanderbilt researchers are the first to introduce an approach for trapping and moving a nanomaterial known as a single colloidal nanodiamond with a nitrogen-vacancy center using a low-power laser beam. The width of a single human hair is approximately 90,000 nanometers; nanodiamonds are less than 100 nanometers. These carbon-based materials are one of the few that can release the basic unit of all light—a single photon—a building block for future quantum photonics applications, Ndukaife explains.

    Currently it is possible to trap nanodiamonds using light fields focused near nano-sized metallic surfaces, but it is not possible to move them that way because laser beam spots are simply too big. Using an atomic force microscope, it takes scientists hours to push nanodiamonds into place one at a time near an emission enhancing environment to form a useful structure. Further, to create entangled sources and qubits—key elements that improve the processing speeds of quantum computers—several nanodiamond emitters are needed close together so that they can interact to make qubits, Ndukaife said.

    “We set out to make trapping and manipulating nanodiamonds simpler by using an interdisciplinary approach,” Ndukaife said. “Our tweezer, a low-frequency electrothermoplasmonic tweezer (LFET), combines a fraction of a laser beam with a low-frequency alternating current electric field. This is an entirely new mechanism to trap and move nanodiamonds.” A tedious, hours-long process has been cut down to seconds, and the LFET is the first scalable transport and on-demand assembly technology of its kind.

    WHY IT MATTERS

    Ndukaife’s work is a key ingredient for quantum computing, a technology that will soon enable a huge number of applications from high resolution imaging to the creation of unhackable systems and ever smaller devices and computer chips. In 2019, the Department of Energy invested $60.7 million in funding to advance the development of quantum computing and networking.

    “Controlling nanodiamonds to make efficient single photon sources that can be used for these kinds of technologies will shape the future,” Ndukaife said. “To enhance quantum properties, it is essential to couple quantum emitters such as nanodiamonds with nitrogen-vacancy centers to nanophotonic structures.”

    WHAT’S NEXT

    Ndukaife intends to further explore nanodiamonds, arranging them onto nanophotonic structures designed to enhance their emission performance. With them in place, his lab will explore the possibilities for ultrabright single photon sources and entanglement in an on-chip platform for information processing and imaging.

    “There are so many things we can use this research to build upon,” Ndukaife said. “This is the first technique that allows us to dynamically manipulate single nanoscale objects in two dimensions using a low power laser beam.”

    Science paper:
    Nano Letters

    Coauthored by graduate students in Ndukaife’s lab, Chuchuan Hong and Sen Yang, as well as their collaborator, Ivan Kravchenko at DOE’s Oak Ridge National Laboratory (US).

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Commodore Cornelius Vanderbilt was in his 79th year when he decided to make the gift that founded Vanderbilt University (US) in the spring of 1873.
    The $1 million that he gave to endow and build the university was the commodore’s only major philanthropy. Methodist Bishop Holland N. McTyeire of Nashville, husband of Amelia Townsend who was a cousin of the commodore’s young second wife Frank Crawford, went to New York for medical treatment early in 1873 and spent time recovering in the Vanderbilt mansion. He won the commodore’s admiration and support for the project of building a university in the South that would “contribute to strengthening the ties which should exist between all sections of our common country.”

    McTyeire chose the site for the campus, supervised the construction of buildings and personally planted many of the trees that today make Vanderbilt a national arboretum. At the outset, the university consisted of one Main Building (now Kirkland Hall), an astronomical observatory and houses for professors. Landon C. Garland was Vanderbilt’s first chancellor, serving from 1875 to 1893. He advised McTyeire in selecting the faculty, arranged the curriculum and set the policies of the university.

    For the first 40 years of its existence, Vanderbilt was under the auspices of the Methodist Episcopal Church, South. The Vanderbilt Board of Trust severed its ties with the church in June 1914 as a result of a dispute with the bishops over who would appoint university trustees.

    From the outset, Vanderbilt met two definitions of a university: It offered work in the liberal arts and sciences beyond the baccalaureate degree and it embraced several professional schools in addition to its college. James H. Kirkland, the longest serving chancellor in university history (1893-1937), followed Chancellor Garland. He guided Vanderbilt to rebuild after a fire in 1905 that consumed the main building, which was renamed in Kirkland’s honor, and all its contents. He also navigated the university through the separation from the Methodist Church. Notable advances in graduate studies were made under the third chancellor, Oliver Cromwell Carmichael (1937-46). He also created the Joint University Library, brought about by a coalition of Vanderbilt, Peabody College and Scarritt College.

    Remarkable continuity has characterized the government of Vanderbilt. The original charter, issued in 1872, was amended in 1873 to make the legal name of the corporation “The Vanderbilt University.” The charter has not been altered since.

    The university is self-governing under a Board of Trust that, since the beginning, has elected its own members and officers. The university’s general government is vested in the Board of Trust. The immediate government of the university is committed to the chancellor, who is elected by the Board of Trust.

    The original Vanderbilt campus consisted of 75 acres. By 1960, the campus had spread to about 260 acres of land. When George Peabody College for Teachers merged with Vanderbilt in 1979, about 53 acres were added.

    Vanderbilt’s student enrollment tended to double itself each 25 years during the first century of the university’s history: 307 in the fall of 1875; 754 in 1900; 1,377 in 1925; 3,529 in 1950; 7,034 in 1975. In the fall of 1999 the enrollment was 10,127.

    In the planning of Vanderbilt, the assumption seemed to be that it would be an all-male institution. Yet the board never enacted rules prohibiting women. At least one woman attended Vanderbilt classes every year from 1875 on. Most came to classes by courtesy of professors or as special or irregular (non-degree) students. From 1892 to 1901 women at Vanderbilt gained full legal equality except in one respect — access to dorms. In 1894 the faculty and board allowed women to compete for academic prizes. By 1897, four or five women entered with each freshman class. By 1913 the student body contained 78 women, or just more than 20 percent of the academic enrollment.

    National recognition of the university’s status came in 1949 with election of Vanderbilt to membership in the select Association of American Universities (US). In the 1950s Vanderbilt began to outgrow its provincial roots and to measure its achievements by national standards under the leadership of Chancellor Harvie Branscomb. By its 90th anniversary in 1963, Vanderbilt for the first time ranked in the top 20 private universities in the United States.

    Vanderbilt continued to excel in research, and the number of university buildings more than doubled under the leadership of Chancellors Alexander Heard (1963-1982) and Joe B. Wyatt (1982-2000), only the fifth and sixth chancellors in Vanderbilt’s long and distinguished history. Heard added three schools (Blair, the Owen Graduate School of Management and Peabody College) to the seven already existing and constructed three dozen buildings. During Wyatt’s tenure, Vanderbilt acquired or built one-third of the campus buildings and made great strides in diversity, volunteerism and technology.

    The university grew and changed significantly under its seventh chancellor, Gordon Gee, who served from 2000 to 2007. Vanderbilt led the country in the rate of growth for academic research funding, which increased to more than $450 million, and the university became one of the most selective undergraduate institutions in the country.

    On March 1, 2008, Nicholas S. Zeppos was named Vanderbilt’s eighth chancellor after serving as interim chancellor beginning Aug. 1, 2007. Prior to that, he spent 2002-2008 as Vanderbilt’s provost, overseeing undergraduate, graduate and professional education programs as well as development, alumni relations and research efforts in liberal arts and sciences, engineering, music, education, business, law and divinity. He first came to Vanderbilt in 1987 as an assistant professor in the law school. In his first five years, Zeppos led the university through the most challenging economic times since the Great Depression, while continuing to attract the best students and faculty from across the country and around the world. Vanderbilt got through the economic crisis notably less scathed than many of its peers and began and remained committed to its much-praised enhanced financial aid policy for all undergraduates during the same timespan. The Martha Rivers Ingram Commons for first-year students opened in 2008 and College Halls, the next phase in the residential education system at Vanderbilt, is on track to open in the fall of 2014. During Zeppos’ first five years, Vanderbilt has drawn robust support from federal funding agencies, and the Medical Center entered into agreements with regional hospitals and health care systems in middle and east Tennessee that will bring Vanderbilt care to patients across the state.

    Today, Vanderbilt University is a private research university of about 6,500 undergraduates and 5,300 graduate and professional students. The university comprises 10 schools, a public policy center and The Freedom Forum First Amendment Center. Vanderbilt offers undergraduate programs in the liberal arts and sciences, engineering, music, education and human development as well as a full range of graduate and professional degrees. The university is consistently ranked as one of the nation’s top 20 universities by publications such as U.S. News & World Report, with several programs and disciplines ranking in the top 10.

    Cutting-edge research and liberal arts, combined with strong ties to a distinguished medical center, create an invigorating atmosphere where students tailor their education to meet their goals and researchers collaborate to solve complex questions affecting our health, culture and society.

    Vanderbilt, an independent, privately supported university, and the separate, non-profit Vanderbilt University Medical Center share a respected name and enjoy close collaboration through education and research. Together, the number of people employed by these two organizations exceeds that of the largest private employer in the Middle Tennessee region.

     
  • richardmitnick 9:50 am on June 18, 2021
    Tags: "Brookhaven Lab Intern Returns to Continue Theoretical Physics Pursuit", Co-design Center for Quantum Advantage (C2QA), DOE Science Undergraduate Laboratory Internships, National Quantum Information Science Research Centers, , , Quantum entanglement, , , Wenjie Gong recently received a Barry Goldwater Scholarship., Women in STEM-Wenjie Gong   

    From DOE’s Brookhaven National Laboratory (US): Women in STEM-Wenjie Gong “Brookhaven Lab Intern Returns to Continue Theoretical Physics Pursuit”

    From DOE’s Brookhaven National Laboratory (US)

    June 14, 2021
    Kelly Zegers
    kzegewrs@bnl.gov

    Wenjie Gong virtually visits Brookhaven for an internship to perform theory research on quantum information science in nuclear physics.

    Wenjie Gong, who recently received a Barry Goldwater Scholarship. (Courtesy photo.)

    Internships often help students nail down the direction they’d like to take their scientific pursuits. For Wenjie Gong, who just completed her junior year at Harvard University (US), a first look into theoretical physics last summer as an intern with the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory made her want to dive further into the field.

    Gong returns to Brookhaven Lab this summer for her second experience as a virtual DOE Science Undergraduate Laboratory Internships (SULI) participant to continue collaborating with Raju Venugopalan, a senior physicist and Nuclear Theory Group leader. Together, they will explore the connections between nuclear physics theory—which explores the interactions of fundamental particles—and quantum computing.

    “I find theoretical physics fascinating as there are so many different avenues to explore and so many different angles from which to approach a problem,” Gong said. “Even though it can be difficult to parse through the technical underpinnings of different physical situations, any progress made is all the more exciting and rewarding.”

    Last year, Gong collaborated with Venugopalan on a project exploring possible ways to measure a quantum phenomenon known as “entanglement” in the matter produced at high-energy collisions.

    The physical properties of entangled particles are inextricably linked, even when the particles are separated by a great distance. Albert Einstein referred to entanglement as “spooky action at a distance.”

    Studying this phenomenon is an important part of setting up long-distance quantum computing networks—the topic of many of the experiments at the Co-design Center for Quantum Advantage (C2QA). The center, led by Brookhaven Lab, is one of five National Quantum Information Science Research Centers and applies quantum principles to materials, devices and software co-design efforts to lay the foundation for a new generation of quantum computers.

    “Usually, entanglement requires very precise measurements that are found in optics laboratories, but we wanted to look at how we could understand entanglement in high-energy particle collisions, which have much less of a controlled environment,” Gong said.

    Venugopalan said the motivation behind thinking of ways to detect entanglement in high-energy collisions is two-fold, first asking the question: “Can we think of experimental measures in collider experiments that have comparable ability to extract quantum action-at-a-distance just as the carefully designed tabletop experiments?”

    “That would be interesting in itself because one might be inclined to think it unlikely,” he said.

    Venugopalan said scientists have identified sub-atomic particle correlations of so-called Lambda hyperons, which have particular properties that may allow such an experiment. Those experiments would open up the question of whether entanglement persists if scientists change the conditions of the collisions, he said.

    “If we made the collisions more violent, say, by increasing the number of particles produced, would the quantum action-at-a-distance correlation go away, just as you, and I, as macroscopic quantum states, don’t exhibit any spooky action-at-a-distance nonsense,” Venugopalan said. “When does such a quantum-to-classical transition take place?”

    In addition, can such measurements teach us about the nature of the interactions of the building blocks of matter, quarks and gluons?

    There may be more questions than answers at this stage, “but these questions force us to refine our experimental and computational tools,” Venugopalan said.

    Gong will continue collaborating with Venugopalan to develop the project on entanglement this summer. She may also start a new project exploring quirky features of soft particles in the quantum theory of electromagnetism that also apply to the strong force of nuclear physics, Venugopalan said. While her internship is virtual again this year, she said she learned last summer that collaborating remotely can be productive and rewarding.

    “Wenjie is the real deal,” Venugopalan said. “Even as a rising junior, she was functioning at the level of a postdoc. It’s a great joy to exchange ‘crazy’ ideas with her and work out the consequences. She shows great promise for an outstanding career in theoretical physics.”

    Others have noticed Gong’s scientific talent. She was recently honored with a Barry M. Goldwater Scholarship. The prestigious award supports impressive undergraduates who plan to pursue a PhD in the natural sciences, mathematics, and engineering.

    “I feel really honored and also very grateful to Raju, the Department of Energy (US), and Brookhaven for providing me the opportunity to do this research—which I wrote about in my Goldwater essay,” Gong said.

    Gong said she’s looking forward to applying concepts from courses she took at Harvard over the past year, including quantum field theory, which she found challenging but also rewarding.

    Gong’s interest in physics started when she took Advanced Placement (AP) Physics in high school. The topic drew her in because it requires a different way of thinking than other sciences, exploring the laws governing the motion of matter and existence, she said.

    In addition to further exploring high energy theoretical physics research, Gong said she hopes to one day teach as a university professor. She’s currently a peer tutor at Harvard.

    “I love teaching physics,” she said. “It’s really cool to see the ‘Ah-ha!’ moment when students go from not really understanding something to grasping a concept.”

    The SULI program at Brookhaven is managed by the Lab’s Office of Educational Programs and sponsored by DOE’s Office of Workforce Development for Teachers and Scientists (WDTS) within the Department’s Office of Science.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    One of ten national laboratories overseen and primarily funded by the DOE (US) Office of Science, DOE’s Brookhaven National Laboratory (US) conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University (US), the largest academic user of Laboratory facilities, and Battelle (US), a nonprofit, applied science and technology organization.

    Research at BNL specializes in nuclear and high energy physics, energy science and technology, environmental and bioscience, nanoscience and national security. The 5,300-acre campus contains several large research facilities, including the Relativistic Heavy Ion Collider [below] and National Synchrotron Light Source II [below]. Seven Nobel Prizes have been awarded for work conducted at Brookhaven Lab.

    BNL is staffed by approximately 2,750 scientists, engineers, technicians, and support personnel, and hosts 4,000 guest investigators every year. The laboratory has its own police station, fire department, and ZIP code (11973). In total, the lab spans a 5,265-acre (21 km^2) area that is mostly coterminous with the hamlet of Upton, New York. BNL is served by a rail spur operated as-needed by the New York and Atlantic Railway. Co-located with the laboratory is the Upton, New York, forecast office of the National Weather Service.

    Major programs

    Although originally conceived as a nuclear research facility, Brookhaven Lab’s mission has greatly expanded. Its foci are now:

    Nuclear and high-energy physics
    Physics and chemistry of materials
    Environmental and climate research
    Nanomaterials
    Energy research
    Nonproliferation
    Structural biology
    Accelerator physics

    Operation

    Brookhaven National Lab was originally owned by the Atomic Energy Commission (US) and is now owned by that agency’s successor, the United States Department of Energy (DOE). DOE subcontracts the research and operation to universities and research organizations. It is currently operated by Brookhaven Science Associates LLC, which is an equal partnership of Stony Brook University (US) and Battelle Memorial Institute (US). From 1947 to 1998, it was operated by Associated Universities, Inc. (AUI) (US), but AUI lost its contract in the wake of two incidents: a 1994 fire at the facility’s High Flux Beam Reactor that exposed several workers to radiation, and reports in 1997 of a tritium leak into the groundwater of the Long Island Central Pine Barrens on which the facility sits.

    Foundations

    Following World War II, the US Atomic Energy Commission was created to support government-sponsored peacetime research on atomic energy. The effort to build a nuclear reactor in the American northeast was fostered largely by physicists Isidor Isaac Rabi and Norman Foster Ramsey Jr., who during the war witnessed many of their colleagues at Columbia University leave for new remote research sites following the departure of the Manhattan Project from its campus. Their effort to house this reactor near New York City was rivalled by a similar effort at the Massachusetts Institute of Technology (US) to have a facility near Boston, Massachusetts (US). Involvement was quickly solicited from representatives of northeastern universities to the south and west of New York City such that this city would be at their geographic center. In March 1946 a nonprofit corporation was established that consisted of representatives from nine major research universities — Columbia University (US), Cornell University (US), Harvard University (US), Johns Hopkins University (US), Massachusetts Institute of Technology (US), Princeton University (US), University of Pennsylvania (US), University of Rochester (US), and Yale University (US).

    Out of 17 considered sites in the Boston-Washington corridor, Camp Upton on Long Island was eventually chosen as the most suitable in consideration of space, transportation, and availability. The camp had been a training center for the US Army during both World War I and World War II. After the latter war, Camp Upton was deemed no longer necessary and became available for reuse. A plan was conceived to convert the military camp into a research facility.

    On March 21, 1947, the Camp Upton site was officially transferred from the U.S. War Department to the new U.S. Atomic Energy Commission (AEC), predecessor to the U.S. Department of Energy (DOE).

    Research and facilities

    Reactor history

    In 1947 construction began on the first nuclear reactor at Brookhaven, the Brookhaven Graphite Research Reactor. This reactor, which opened in 1950, was the first reactor to be constructed in the United States after World War II. The High Flux Beam Reactor operated from 1965 to 1999. In 1959 Brookhaven built the first US reactor specifically tailored to medical research, the Brookhaven Medical Research Reactor, which operated until 2000.

    Accelerator history

    In 1952 Brookhaven began using its first particle accelerator, the Cosmotron. At the time the Cosmotron was the world’s highest energy accelerator, being the first to impart more than 1 GeV of energy to a particle.


    The Cosmotron was retired in 1966, after it was superseded in 1960 by the new Alternating Gradient Synchrotron (AGS).

    The AGS was used in research that resulted in 3 Nobel prizes, including the discovery of the muon neutrino, the charm quark, and CP violation.

    In 1970 BNL started the ISABELLE project to develop and build two proton intersecting storage rings.

    The groundbreaking for the project was in October 1978. In 1981, with the tunnel for the accelerator already excavated, problems with the superconducting magnets needed for the ISABELLE accelerator brought the project to a halt, and the project was eventually cancelled in 1983.

    The National Synchrotron Light Source (US) operated from 1982 to 2014 and was involved with two Nobel Prize-winning discoveries. It has since been replaced by the National Synchrotron Light Source II (US) [below].

    After ISABELLE’s cancellation, physicists at BNL proposed that the excavated tunnel and parts of the magnet assembly be used in another accelerator. In 1984 the first proposal for the accelerator now known as the Relativistic Heavy Ion Collider (RHIC) [below] was put forward. Construction was funded in 1991 and RHIC has been operational since 2000. One of the world’s only two operating heavy-ion colliders, RHIC is as of 2010 the second-highest-energy collider after the Large Hadron Collider (CH). RHIC is housed in a tunnel 2.4 miles (3.9 km) long and is visible from space.

    On January 9, 2020, it was announced by Paul Dabbar, undersecretary of the US Department of Energy Office of Science, that the BNL eRHIC design had been selected over the conceptual design put forward by DOE’s Thomas Jefferson National Accelerator Facility [JLab] (US) as the future Electron-Ion Collider (EIC) in the United States.

    In addition to the site selection, it was announced that the BNL EIC had acquired CD-0 (mission need) status from the Department of Energy. BNL’s eRHIC design proposes upgrading the existing Relativistic Heavy Ion Collider, which collides beams of ions from light to heavy, including polarized protons, with a polarized electron facility, to be housed in the same tunnel.

    Other discoveries

    In 1958, Brookhaven scientists created one of the world’s first video games, Tennis for Two. In 1968 Brookhaven scientists patented Maglev, a transportation technology that utilizes magnetic levitation.

    Major facilities

    Relativistic Heavy Ion Collider (RHIC), which was designed to research quark–gluon plasma and the sources of proton spin. Until 2009 it was the world’s most powerful heavy ion collider. It is the only collider of spin-polarized protons.
    Center for Functional Nanomaterials (CFN), used for the study of nanoscale materials.
    BNL National Synchrotron Light Source II (US), Brookhaven’s newest user facility, opened in 2015 to replace the National Synchrotron Light Source (NSLS), which had operated for 30 years. NSLS was involved in the work that won the 2003 and 2009 Nobel Prizes in Chemistry.
    Alternating Gradient Synchrotron, a particle accelerator that was used in three of the lab’s Nobel prizes.
    Accelerator Test Facility, generates, accelerates and monitors particle beams.
    Tandem Van de Graaff, once the world’s largest electrostatic accelerator.
    Computational Science resources, including access to a massively parallel Blue Gene series supercomputer that is among the fastest in the world for scientific research, run jointly by Brookhaven National Laboratory and Stony Brook University.
    Interdisciplinary Science Building, with unique laboratories for studying high-temperature superconductors and other materials important for addressing energy challenges.
    NASA Space Radiation Laboratory, where scientists use beams of ions to simulate cosmic rays and assess the risks of space radiation to human space travelers and equipment.

    Off-site contributions

    It is a contributing partner to the ATLAS experiment, one of the four large detectors located at the Large Hadron Collider (LHC).


    It is currently operating at CERN near Geneva, Switzerland.

    Brookhaven was also responsible for the design of the SNS accumulator ring, in partnership with the Spallation Neutron Source at DOE’s Oak Ridge National Laboratory (US) in Tennessee.

    Brookhaven plays a role in a range of neutrino research projects around the world, including the Daya Bay Neutrino Experiment (CN), located at a nuclear power plant approximately 52 kilometers northeast of Hong Kong and 45 kilometers east of Shenzhen, China.


     
  • richardmitnick 9:33 pm on June 9, 2021
    Tags: "Australian researchers create quantum microscope that can see the impossible", , Quantum entanglement,   

    From University of Queensland (AU): Women in STEM-Catxere Casacio “Australian researchers create quantum microscope that can see the impossible”


    From University of Queensland (AU)

    10 June 2021

    Professor Warwick Bowen
    wbowen@physics.uq.edu.au
    +61 404 618 722

    Dominic Jarvis
    dominic.jarvis@uq.edu.au
    +61 413 334 924

    Artist’s impression of UQ’s new quantum microscope in action.

    In a major scientific leap, University of Queensland researchers have created a quantum microscope that can reveal biological structures that would otherwise be impossible to see.

    This paves the way for applications in biotechnology, and could extend far beyond this into areas ranging from navigation to medical imaging.

    The microscope is powered by the science of quantum entanglement, an effect Einstein described as “spooky interactions at a distance”.

    Professor Warwick Bowen, from UQ’s Quantum Optics Lab and the ARC Centre of Excellence for Engineered Quantum Systems (EQUS), said it was the first entanglement-based sensor with performance beyond the best possible existing technology.

    “This breakthrough will spark all sorts of new technologies – from better navigation systems to better MRI machines, you name it,” Professor Bowen said.

    “Entanglement is thought to lie at the heart of a quantum revolution.

    “We’ve finally demonstrated that sensors that use it can supersede existing, non-quantum technology.

    “This is exciting – it’s the first proof of the paradigm-changing potential of entanglement for sensing.”

    Australia’s Quantum Technologies Roadmap sees quantum sensors spurring a new wave of technological innovation in healthcare, engineering, transport and resources.

    A major success of the team’s quantum microscope was its ability to catapult over a ‘hard barrier’ in traditional light-based microscopy.

    “The best light microscopes use bright lasers that are billions of times brighter than the sun,” Professor Bowen said.

    “Fragile biological systems like a human cell can only survive a short time in them, and this is a major roadblock.

    UQ team researchers (counter-clockwise from bottom-left) Catxere Casacio, Warwick Bowen, Lars Madsen and Waleed Muhammad aligning the quantum microscope.

    “The quantum entanglement in our microscope provides 35 per cent improved clarity without destroying the cell, allowing us to see minute biological structures that would otherwise be invisible.

    “The benefits are obvious – from a better understanding of living systems, to improved diagnostic technologies.”

    Professor Bowen said there were potentially boundless opportunities for quantum entanglement in technology.

    “Entanglement is set to revolutionise computing, communication and sensing,” he said.

    “Absolutely secure communication was demonstrated some decades ago as the first demonstration of absolute quantum advantage over conventional technologies.

    “Computing faster than any possible conventional computer was demonstrated by Google two years ago, as the first demonstration of absolute advantage in computing.

    “The last piece in the puzzle was sensing, and we’ve now closed that gap.

    “This opens the door for some wide-ranging technological revolutions.”

    The research was supported by the Air Force Office of Scientific Research (US) and the Australian Research Council (ARC) Centre of Excellence program (AU). It is published in Nature.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition


    The University of Queensland (AU) is one of Australia’s leading research and teaching institutions. We strive for excellence through the creation, preservation, transfer and application of knowledge. For more than a century, we have educated and worked with outstanding people to deliver knowledge leadership for a better world.

    UQ ranks in the top 50 as measured by the QS World University Rankings and the Performance Ranking of Scientific Papers for World Universities. The University also ranks 52nd in the US News Best Global Universities Rankings, 60th in the Times Higher Education World University Rankings and 55th in the Academic Ranking of World Universities.

     
  • richardmitnick 1:01 pm on June 9, 2021
    Tags: "Early endeavours on the path to reliable quantum machine learning", Computer scientists led by ETH Zürich conduct an early exploration for reliable quantum machine learning., Quantum entanglement, , , , The fact that quantum states can superpose and entangle creates a basis that allows quantum computers the access to a fundamentally richer set of processing logic., The future quantum computers should be capable of super-​fast and reliable computation. Today this is still a major challenge., Translating classical wisdom into the quantum realm.   

    From Swiss Federal Institute of Technology in Zürich [ETH Zürich] [Eidgenössische Technische Hochschule Zürich] (CH): “Early endeavours on the path to reliable quantum machine learning” 

    From Swiss Federal Institute of Technology in Zürich [ETH Zürich] [Eidgenössische Technische Hochschule Zürich] (CH)

    08.06.2021
    Florian Meyer

    Future quantum computers should be capable of super-fast and reliable computation. Today, this is still a major challenge. Now, computer scientists led by ETH Zürich are conducting an early exploration of reliable quantum machine learning.

    Building on concepts such as quantum entanglement, quantum computers promise a wealth of machine learning applications. (Photo: Keystone/Science Photo Library)

    Anyone who collects mushrooms knows that it is better to keep the poisonous and the non-poisonous ones apart. Not to mention what would happen if someone ate the poisonous ones. In such “classification problems”, which require us to distinguish certain objects from one another and to assign the objects we are looking for to certain classes by means of characteristics, computers can already provide useful support to humans.
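
    A conventional, non-quantum version of such a classifier takes only a few lines. The sketch below uses scikit-learn with two invented “mushroom” features purely for illustration; it is not the quantum algorithm from the paper.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        # Toy data: two made-up features per mushroom (say, cap size and a colour index),
        # label 1 = poisonous, 0 = edible. Real mushroom datasets have far more features.
        rng = np.random.default_rng(0)
        X_poisonous = rng.normal(loc=[2.0, 1.5], scale=0.5, size=(50, 2))
        X_edible = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(50, 2))
        X = np.vstack([X_poisonous, X_edible])
        y = np.array([1] * 50 + [0] * 50)

        clf = LogisticRegression().fit(X, y)           # learn a decision boundary from examples
        print(clf.predict([[1.8, 1.4], [0.1, -0.2]]))  # -> [1 0]: poisonous, edible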

    Intelligent machine learning methods can recognise patterns or objects and automatically pick them out of data sets. For example, they could pick out those pictures from a photo database that show non-toxic mushrooms. Particularly with very large and complex data sets, machine learning can deliver valuable results that humans would not be able to find out, or only with much more time. However, for certain computational tasks, even the fastest computers available today reach their limits. This is where the great promise of quantum computers comes into play: that one day they will also perform super-fast calculations that classical computers cannot solve in a useful period of time.

    The reason for this “quantum supremacy” lies in physics: quantum computers calculate and process information by exploiting certain states and interactions that occur within atoms or molecules or between elementary particles.

The fact that quantum states can superpose and entangle creates a basis that gives quantum computers access to a fundamentally richer set of processing logic. For instance, unlike classical computers, quantum computers do not calculate with binary codes or bits, which process information only as 0 or 1, but with quantum bits or qubits, which correspond to the quantum states of particles. The crucial difference is that qubits can realise not only one state – 0 or 1 – per computational step, but also a state in which both superpose. This more general way of processing information in turn allows for a drastic computational speed-up in certain problems.
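As a rough illustration (our own sketch, not something from the ETH article), a qubit's superposition and a two-qubit entangled state can be written down as ordinary state vectors:

```python
# Minimal sketch of the qubit idea using plain NumPy: a classical bit is 0 or 1,
# while a qubit is a normalized complex vector whose squared amplitudes give the
# probabilities of measuring 0 or 1.
import numpy as np

ket0 = np.array([1, 0], dtype=complex)   # |0>
ket1 = np.array([0, 1], dtype=complex)   # |1>

# Equal superposition of |0> and |1>.
psi = (ket0 + ket1) / np.sqrt(2)
print(np.abs(psi) ** 2)                  # [0.5 0.5] -- "both 0 and 1" until measured

# Two qubits can also be entangled: this Bell state cannot be written as a
# product of two independent single-qubit states.
bell = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)
print(np.abs(bell) ** 2)                 # [0.5 0.  0.  0.5]
```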

    2
A reliable quantum classification algorithm correctly classifies a toxic mushroom as “poisonous”, while a noisy, perturbed one incorrectly classifies it as “edible”. (Image: npj Quantum Information / DS3Lab ETH Zürich.)

    Translating classical wisdom into the quantum realm.

These speed advantages of quantum computing are also an opportunity for machine learning applications – after all, quantum computers could process the huge amounts of data that machine learning methods need to improve the accuracy of their results much faster than classical computers can.

    However, to really exploit the potential of quantum computing, one has to adapt the classical machine learning methods to the peculiarities of quantum computers. For example, the algorithms, i.e. the mathematical calculation rules that describe how a classical computer solves a certain problem, must be formulated differently for quantum computers. Developing well-​functioning “quantum algorithms” for machine learning is not entirely trivial, because there are still a few hurdles to overcome along the way.

    On the one hand, this is due to the quantum hardware. At ETH Zürich, researchers currently have quantum computers that work with up to 17 qubits (see “ETH Zürich and PSI found Quantum Computing Hub” of 3 May 2021). However, if quantum computers are to realise their full potential one day, they might need thousands to hundreds of thousands of qubits.

    Quantum noise and the inevitability of errors

One challenge that quantum computers face concerns their vulnerability to errors. Today’s quantum computers operate with a very high level of “noise”, as errors or disturbances are known in technical jargon. According to the American Physical Society (US), this noise is “the major obstacle to scaling up quantum computers”. No comprehensive solution for correcting or mitigating errors exists yet: no way has been found to produce error-free quantum hardware, and quantum computers with 50 to 100 qubits are too small to implement error-correcting software or algorithms.

To a certain extent, one has to live with the fact that errors in quantum computing are unavoidable in principle, because the quantum states on which the concrete computational steps are based can only be distinguished and quantified probabilistically. What can be achieved, however, are procedures that limit noise and perturbations enough that the calculations still deliver reliable results. Computer scientists call a reliably functioning calculation method “robust” and in this context also speak of the necessary “error tolerance”.

This is exactly what the research group led by Ce Zhang, ETH computer science professor and member of the ETH AI Center, has recently explored, somewhat “accidentally”, during an endeavor to reason about the robustness of classical distributions for the purpose of building better machine learning systems and platforms. Together with Professor Nana Liu from Shanghai Jiao Tong University [上海交通大学] (CN) and Professor Bo Li from the University of Illinois Urbana-Champaign (US), they have developed a new approach. It allows them to prove the robustness conditions of certain quantum-based machine learning models, for which the quantum computation is guaranteed to be reliable and the result to be correct. The researchers have published their approach, which is one of the first of its kind, in the scientific journal npj Quantum Information.

    Protection against errors and hackers

    “When we realised that quantum algorithms, like classical algorithms, are prone to errors and perturbations, we asked ourselves how we can estimate these sources of errors and perturbations for certain machine learning tasks, and how we can guarantee the robustness and reliability of the chosen method,” says Zhikuan Zhao, a postdoc in Ce Zhang’s group. “If we know this, we can trust the computational results, even if they are noisy.”

The researchers investigated this question using quantum classification algorithms as an example – after all, errors in classification tasks are tricky because they can affect the real world, for example if poisonous mushrooms were classified as non-toxic. Most importantly, using the theory of quantum hypothesis testing – inspired by other researchers’ recent work on applying hypothesis testing in the classical setting – which allows quantum states to be distinguished, the ETH researchers determined a threshold above which the assignments of the quantum classification algorithm are guaranteed to be correct and its predictions robust.

    With their robustness method, the researchers can even verify whether the classification of an erroneous, noisy input yields the same result as a clean, noiseless input. From their findings, the researchers have also developed a protection scheme that can be used to specify the error tolerance of a computation, regardless of whether an error has a natural cause or is the result of manipulation from a hacking attack. Their robustness concept works for both hacking attacks and natural errors.
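To make the certification idea concrete, here is a deliberately simplified, classical stand-in for such a robustness check (our own sketch; the thresholds in the paper are derived from quantum hypothesis testing, not from this rule): a prediction is certified if its probability margin is large enough that no allowed perturbation can flip the top class.

```python
# Simplified robustness certificate (illustration only, not the paper's bound):
# if every class probability can shift by at most eps under noise, then the
# predicted label cannot change as long as the top class leads by more than 2*eps.
from typing import Sequence

def certify_prediction(class_probs: Sequence[float], eps: float) -> bool:
    """True if the argmax prediction is stable under any eps-bounded shift."""
    ranked = sorted(class_probs, reverse=True)
    top, runner_up = ranked[0], ranked[1]
    return (top - eps) > (runner_up + eps)

# Hypothetical measured probabilities for ("poisonous", "edible"):
print(certify_prediction([0.85, 0.15], eps=0.10))   # True  -> certified robust
print(certify_prediction([0.55, 0.45], eps=0.10))   # False -> not certified
```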

    “The method can also be applied to a broader class of quantum algorithms,” says Maurice Weber, a doctoral student with Ce Zhang and the first author of the publication. Since the impact of error in quantum computing increases as the system size rises, he and Zhao are now conducting research on this problem. “We are optimistic that our robustness conditions will prove useful, for example, in conjunction with quantum algorithms designed to better understand the electronic structure of molecules.”

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    ETH Zurich campus
Swiss Federal Institute of Technology in Zürich [ETH Zürich] [Eidgenössische Technische Hochschule Zürich] (CH) is a public research university in the city of Zürich, Switzerland. Founded by the Swiss Federal Government in 1854 with the stated mission to educate engineers and scientists, the school focuses exclusively on science, technology, engineering and mathematics. Like its sister institution Swiss Federal Institute of Technology in Lausanne [EPFL-École Polytechnique Fédérale de Lausanne] (CH), it is part of the Swiss Federal Institutes of Technology Domain (ETH Domain), part of the Swiss Federal Department of Economic Affairs, Education and Research.

    The university is an attractive destination for international students thanks to low tuition fees of 809 CHF per semester, PhD and graduate salaries that are amongst the world’s highest, and a world-class reputation in academia and industry. There are currently 22,200 students from over 120 countries, of which 4,180 are pursuing doctoral degrees. In the 2021 edition of the QS World University Rankings ETH Zürich is ranked 6th in the world and 8th by the Times Higher Education World Rankings 2020. In the 2020 QS World University Rankings by subject it is ranked 4th in the world for engineering and technology (2nd in Europe) and 1st for earth & marine science.

    As of November 2019, 21 Nobel laureates, 2 Fields Medalists, 2 Pritzker Prize winners, and 1 Turing Award winner have been affiliated with the Institute, including Albert Einstein. Other notable alumni include John von Neumann and Santiago Calatrava. It is a founding member of the IDEA League and the International Alliance of Research Universities (IARU) and a member of the CESAER network.

    ETH Zürich was founded on 7 February 1854 by the Swiss Confederation and began giving its first lectures on 16 October 1855 as a polytechnic institute (eidgenössische polytechnische Schule) at various sites throughout the city of Zurich. It was initially composed of six faculties: architecture, civil engineering, mechanical engineering, chemistry, forestry, and an integrated department for the fields of mathematics, natural sciences, literature, and social and political sciences.

    It is locally still known as Polytechnikum, or simply as Poly, derived from the original name eidgenössische polytechnische Schule, which translates to “federal polytechnic school”.

    ETH Zürich is a federal institute (i.e., under direct administration by the Swiss government), whereas the University of Zürich [Universität Zürich ] (CH) is a cantonal institution. The decision for a new federal university was heavily disputed at the time; the liberals pressed for a “federal university”, while the conservative forces wanted all universities to remain under cantonal control, worried that the liberals would gain more political power than they already had. In the beginning, both universities were co-located in the buildings of the University of Zürich.

    From 1905 to 1908, under the presidency of Jérôme Franel, the course program of ETH Zürich was restructured to that of a real university and ETH Zürich was granted the right to award doctorates. In 1909 the first doctorates were awarded. In 1911, it was given its current name, Eidgenössische Technische Hochschule. In 1924, another reorganization structured the university in 12 departments. However, it now has 16 departments.

    ETH Zürich, EPFL (Swiss Federal Institute of Technology in Lausanne) [École polytechnique fédérale de Lausanne](CH), and four associated research institutes form the Domain of the Swiss Federal Institutes of Technology (ETH Domain) [ETH-Bereich; Domaine des Écoles polytechniques fédérales] (CH) with the aim of collaborating on scientific projects.

    Reputation and ranking

ETH Zürich is ranked among the top universities in the world. Popular rankings typically place it as the best university in continental Europe; it is consistently ranked among the top 1-5 universities in Europe and among the top 3-10 universities in the world.

    Historically, ETH Zürich has achieved its reputation particularly in the fields of chemistry, mathematics and physics. There are 32 Nobel laureates who are associated with ETH Zürich, the most recent of whom is Richard F. Heck, awarded the Nobel Prize in chemistry in 2010. Albert Einstein is perhaps its most famous alumnus.

    In 2018, the QS World University Rankings placed ETH Zürich at 7th overall in the world. In 2015, ETH Zürich was ranked 5th in the world in Engineering, Science and Technology, just behind the Massachusetts Institute of Technology(US), Stanford University(US) and University of Cambridge(UK). In 2015, ETH Zürich also ranked 6th in the world in Natural Sciences, and in 2016 ranked 1st in the world for Earth & Marine Sciences for the second consecutive year.

In 2016, the Times Higher Education World University Rankings ranked ETH Zürich 9th overall in the world and 8th in the world in the field of Engineering & Technology, just behind the Massachusetts Institute of Technology(US), Stanford University(US), California Institute of Technology(US), Princeton University(US), University of Cambridge(UK), Imperial College London(UK) and University of Oxford(UK).

    In a comparison of Swiss universities by swissUP Ranking and in rankings published by CHE comparing the universities of German-speaking countries, ETH Zürich traditionally is ranked first in natural sciences, computer science and engineering sciences.

In the CHE Excellence Ranking survey on the quality of Western European graduate school programs in the fields of biology, chemistry, physics and mathematics, ETH Zürich was assessed as one of three institutions with excellent programs in all the fields considered, the other two being Imperial College London(UK) and the University of Cambridge(UK).

     
  • richardmitnick 11:24 am on June 2, 2021 Permalink | Reply
    Tags: "UArizona Engineers Demonstrate a Quantum Advantage", , How (and When) Quantum Works, Quantum computing and quantum sensing have the potential to be vastly more powerful than their classical counterparts., Quantum entanglement, , The technology isn't quite there yet, UArizona College of Engineering, UArizona College of Optical Sciences,   

    From University of Arizona (US) : “UArizona Engineers Demonstrate a Quantum Advantage” 

    From University of Arizona (US)

    6.1.21

    Emily Dieckman
    College of Engineering
    edieckman@email.arizona.edu
    520-621-1992
    760-981-8808

    In a new paper, researchers in the College of Engineering and James C. Wyant College of Optical Sciences experimentally demonstrate how quantum resources aren’t just dreams for the distant future – they can improve the technology of today.

    1

    Quantum computing and quantum sensing have the potential to be vastly more powerful than their classical counterparts. Not only could a fully realized quantum computer take just seconds to solve equations that would take a classical computer thousands of years, but it could have incalculable impacts on areas ranging from biomedical imaging to autonomous driving.

    However, the technology isn’t quite there yet.

    In fact, despite widespread theories about the far-reaching impact of quantum technologies, very few researchers have been able to demonstrate, using the technology available now, that quantum methods have an advantage over their classical counterparts.

In a paper published on June 1 in the journal Physical Review X, University of Arizona researchers experimentally show that quantum methods have an advantage over their classical counterparts.

    2
    Quntao Zhuang (left), PI of the Quantum Information Theory Group, and Zheshen Zhang, PI of the Quantum Information and Materials Group, are both assistant professors in the College of Engineering.

“Demonstrating a quantum advantage is a long-sought-after goal in the community, and very few experiments have been able to show it,” said paper co-author Zheshen Zhang, assistant professor of materials science and engineering and principal investigator of the UArizona Quantum Information and Materials Group. “We are seeking to demonstrate how we can leverage the quantum technology that already exists to benefit real-world applications.”

    How (and When) Quantum Works

    Quantum computing and other quantum processes rely on tiny, powerful units of information called qubits. The classical computers we use today work with units of information called bits, which exist as either 0s or 1s, but qubits are capable of existing in both states at the same time. This duality makes them both powerful and fragile. The delicate qubits are prone to collapse without warning, making a process called error correction – which addresses such problems as they happen – very important.

    The quantum field is now in an era that John Preskill, a renowned physicist from the California Institute of Technology (US), termed “noisy intermediate scale quantum,” or NISQ. In the NISQ era, quantum computers can perform tasks that only require about 50 to a few hundred qubits, though with a significant amount of noise, or interference. Any more than that and the noisiness overpowers the usefulness, causing everything to collapse. It is widely believed that 10,000 to several million qubits would be needed to carry out practically useful quantum applications.

    Imagine inventing a system that guarantees every meal you cook will turn out perfectly, and then giving that system to a group of children who don’t have the right ingredients. It will be great in a few years, once the kids become adults and can buy what they need. But until then, the usefulness of the system is limited. Similarly, until researchers advance the field of error correction, which can reduce noise levels, quantum computations are limited to a small scale.

    Entanglement Advantages

    The experiment described in the paper used a mix of both classical and quantum techniques. Specifically, it used three sensors to classify the average amplitude and angle of radio frequency signals.

    The sensors were equipped with another quantum resource called entanglement, which allows them to share information with one another and provides two major benefits: First, it improves the sensitivity of the sensors and reduces errors. Second, because they are entangled, the sensors evaluate global properties rather than gathering data about specific parts of a system. This is useful for applications that only need a binary answer; for example, in medical imaging, researchers don’t need to know about every single cell in a tissue sample that isn’t cancerous – just whether there’s one cell that is cancerous. The same concept applies to detecting hazardous chemicals in drinking water.

    The experiment demonstrated that equipping the sensors with quantum entanglement gave them an advantage over classical sensors, reducing the likelihood of errors by a small but critical margin.
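As a purely classical, back-of-the-envelope stand-in (our own sketch, not the team's analysis), the following Monte Carlo run shows how reducing the effective measurement noise, which is the benefit entanglement provides in this experiment, shrinks the error rate when three sensors vote on which of two RF amplitudes was sent:

```python
# Classical Monte Carlo stand-in for the sensing task (illustration only):
# decide which of two known amplitudes (0 or 1) was applied, using the average
# of three noisy sensor readings. Lower noise -> fewer classification errors.
import numpy as np

rng = np.random.default_rng(0)

def error_rate(noise_std: float, n_trials: int = 200_000) -> float:
    amp_true = rng.choice([0.0, 1.0], size=n_trials)                # true class
    readings = amp_true[:, None] + rng.normal(0.0, noise_std, (n_trials, 3))
    decision = readings.mean(axis=1) > 0.5                          # threshold test
    return float(np.mean(decision != amp_true.astype(bool)))

print(error_rate(noise_std=1.0))   # baseline noise level
print(error_rate(noise_std=0.8))   # modestly reduced noise -> smaller error rate
```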

“This idea of using entanglement to improve sensors is not limited to a specific type of sensor, so it could be used for a range of different applications, as long as you have the equipment to entangle the sensors,” said study co-author Quntao Zhuang, assistant professor of electrical and computer engineering and principal investigator of the Quantum Information Theory Group. “In theory, you could consider applications like lidar (light detection and ranging) for self-driving cars, for example.”

    Zhuang and Zhang developed the theory behind the experiment and described it in a 2019 Physical Review X paper. They co-authored the new paper with lead author Yi Xia, a doctoral student in the James C. Wyant College of Optical Sciences, and Wei Li, a postdoctoral researcher in materials science and engineering.

    Qubit Classifiers

    There are existing applications that use a mix of quantum and classical processing in the NISQ era, but they rely on preexisting classical datasets that must be converted and classified in the quantum realm. Imagine taking a series of photos of cats and dogs, then uploading the photos into a system that uses quantum methods to label the photos as either “cat” or “dog.”

    The team is tackling the labeling process from a different angle, by using quantum sensors to gather their own data in the first place. It’s more like using a specialized quantum camera that labels the photos as either “dog” or “cat” as the photos are taken.

    “A lot of algorithms consider data stored on a computer disk, and then convert that into a quantum system, which takes time and effort,” Zhuang said. “Our system works on a different problem by evaluating physical processes that are happening in real time.”

    The team is excited for future applications of their work at the intersection of quantum sensing and quantum computing. They even envision one day integrating their entire experimental setup onto a chip that could be dipped into a biomaterial or water sample to identify disease or harmful chemicals.

“We think it’s a new paradigm for quantum computing, quantum machine learning and quantum sensors, because it really creates a bridge to interconnect all these different domains,” Zhang said.

    See the full article here .


    Please help promote STEM in your local schools.


    Stem Education Coalition

    As of 2019, the University of Arizona (US) enrolled 45,918 students in 19 separate colleges/schools, including the UArizona College of Medicine in Tucson and Phoenix and the James E. Rogers College of Law, and is affiliated with two academic medical centers (Banner – University Medical Center Tucson and Banner – University Medical Center Phoenix). UArizona is one of three universities governed by the Arizona Board of Regents. The university is part of the Association of American Universities and is the only member from Arizona, and also part of the Universities Research Association(US). The university is classified among “R1: Doctoral Universities – Very High Research Activity”.

    Known as the Arizona Wildcats (often shortened to “Cats”), the UArizona’s intercollegiate athletic teams are members of the Pac-12 Conference of the NCAA. UArizona athletes have won national titles in several sports, most notably men’s basketball, baseball, and softball. The official colors of the university and its athletic teams are cardinal red and navy blue.

After the passage of the Morrill Land-Grant Act of 1862, the push for a university in Arizona grew. The Arizona Territory’s “Thieving Thirteenth” Legislature approved the UArizona in 1885 and selected the city of Tucson to receive the appropriation to build the university. Tucson had hoped to receive the appropriation for the territory’s mental hospital, which carried a $100,000 allocation instead of the $25,000 allotted to the territory’s only university (Arizona State University(US) was also chartered in 1885, but it was created as Arizona’s normal school, and not a university). Flooding on the Salt River delayed Tucson’s legislators, and by the time they reached Prescott, back-room deals allocating the most desirable territorial institutions had been made. Tucson was largely disappointed with receiving what was viewed as an inferior prize.

    With no parties willing to provide land for the new institution, the citizens of Tucson prepared to return the money to the Territorial Legislature until two gamblers and a saloon keeper decided to donate the land to build the school. Construction of Old Main, the first building on campus, began on October 27, 1887, and classes met for the first time in 1891 with 32 students in Old Main, which is still in use today. Because there were no high schools in Arizona Territory, the university maintained separate preparatory classes for the first 23 years of operation.

    Research

    UArizona is classified among “R1: Doctoral Universities – Very high research activity”. UArizona is the fourth most awarded public university by National Aeronautics and Space Administration(US) for research. UArizona was awarded over $325 million for its Lunar and Planetary Laboratory (LPL) to lead NASA’s 2007–08 mission to Mars to explore the Martian Arctic, and $800 million for its OSIRIS-REx mission, the first in U.S. history to sample an asteroid.

The LPL’s role in the Cassini mission orbiting Saturn is larger than that of any other university in the world. The UArizona laboratory designed and operated the atmospheric radiation investigations and imaging on the probe. UArizona operates the HiRISE camera, a part of the Mars Reconnaissance Orbiter. While using the HiRISE camera in 2011, UArizona alumnus Lujendra Ojha and his team discovered proof of liquid water on the surface of Mars—a discovery confirmed by NASA in 2015. UArizona receives more NASA grants annually than the next nine top NASA/JPL-Caltech(US)-funded universities combined. As of March 2016, the UArizona’s Lunar and Planetary Laboratory is actively involved in ten spacecraft missions: Cassini VIMS; Grail; the HiRISE camera orbiting Mars; the Juno mission orbiting Jupiter; Lunar Reconnaissance Orbiter (LRO); Maven, which will explore Mars’ upper atmosphere and interactions with the sun; Solar Probe Plus, a historic mission into the Sun’s atmosphere for the first time; Rosetta’s VIRTIS; WISE; and OSIRIS-REx, the first U.S. sample-return mission to a near-earth asteroid, which launched on September 8, 2016.

    UArizona students have been selected as Truman, Rhodes, Goldwater, and Fulbright Scholars. According to The Chronicle of Higher Education, UArizona is among the top 25 producers of Fulbright awards in the U.S.

    UArizona is a member of the Association of Universities for Research in Astronomy(US), a consortium of institutions pursuing research in astronomy. The association operates observatories and telescopes, notably Kitt Peak National Observatory(US) just outside Tucson. Led by Roger Angel, researchers in the Steward Observatory Mirror Lab at UArizona are working in concert to build the world’s most advanced telescope. Known as the Giant Magellan Telescope(CL), it will produce images 10 times sharper than those from the Earth-orbiting Hubble Telescope.

    Giant Magellan Telescope, 21 meters, to be at the NOIRLab(US) National Optical Astronomy Observatory(US) Carnegie Institution for Science’s(US) Las Campanas Observatory(CL), some 115 km (71 mi) north-northeast of La Serena, Chile, over 2,500 m (8,200 ft) high.


    The telescope is set to be completed in 2021. GMT will ultimately cost $1 billion. Researchers from at least nine institutions are working to secure the funding for the project. The telescope will include seven 18-ton mirrors capable of providing clear images of volcanoes and riverbeds on Mars and mountains on the moon at a rate 40 times faster than the world’s current large telescopes. The mirrors of the Giant Magellan Telescope will be built at UArizona and transported to a permanent mountaintop site in the Chilean Andes where the telescope will be constructed.

    Reaching Mars in March 2006, the Mars Reconnaissance Orbiter contained the HiRISE camera, with Principal Investigator Alfred McEwen as the lead on the project. This National Aeronautics and Space Administration(US) mission to Mars carrying the UArizona-designed camera is capturing the highest-resolution images of the planet ever seen. The journey of the orbiter was 300 million miles. In August 2007, the UArizona, under the charge of Scientist Peter Smith, led the Phoenix Mars Mission, the first mission completely controlled by a university. Reaching the planet’s surface in May 2008, the mission’s purpose was to improve knowledge of the Martian Arctic. The Arizona Radio Observatory(US), a part of UArizona Department of Astronomy Steward Observatory(US), operates the Submillimeter Telescope on Mount Graham.

    The National Science Foundation(US) funded the iPlant Collaborative in 2008 with a $50 million grant. In 2013, iPlant Collaborative received a $50 million renewal grant. Rebranded in late 2015 as “CyVerse”, the collaborative cloud-based data management platform is moving beyond life sciences to provide cloud-computing access across all scientific disciplines.
    In June 2011, the university announced it would assume full ownership of the Biosphere 2 scientific research facility in Oracle, Arizona, north of Tucson, effective July 1. Biosphere 2 was constructed by private developers (funded mainly by Texas businessman and philanthropist Ed Bass) with its first closed system experiment commencing in 1991. The university had been the official management partner of the facility for research purposes since 2007.

U Arizona mirror lab: Where else in the world can you find an astronomical observatory mirror lab under a football stadium?

    University of Arizona’s Biosphere 2, located in the Sonoran desert. An entire ecosystem under a glass dome? Visit our campus, just once, and you’ll quickly understand why the UA is a university unlike any other.

     
  • richardmitnick 9:58 am on May 25, 2021 Permalink | Reply
    Tags: "Was Einstein wrong? Why some astrophysicists are questioning the theory of space-time", As in history revolutions are the lifeblood of science., , , Erwin Schrödinger's cat, General Theory of Relativity., , Loop Quantum Gravity (LQG), Modular space-time theory, , Quantum entanglement, , , The Planck scale length-around a trillionth of a trillionth of a trillionth of a meter.   

    From Live Science : “Was Einstein wrong? Why some astrophysicists are questioning the theory of space-time” 

    From Live Science

    5.24.21
    Colin Stuart

    1
    Do we have to kill off the theory of space and time to make sense of the universe? (Image credit: Tobias Roetsch.)

As in history, revolutions are the lifeblood of science. Bubbling undercurrents of disquiet boil over until a new regime emerges to seize power. Then everyone’s attention turns to toppling their new ruler. The king is dead, long live the king.

    This has happened many times in the history of physics and astronomy. First, we thought Earth was at the center of the solar system — an idea that stood for over 1,000 years. Then Copernicus stuck his neck out to say that the whole system would be a lot simpler if we are just another planet orbiting the sun. Despite much initial opposition, the old geocentric picture eventually buckled under the weight of evidence from the newly invented telescope.

    Then Newton came along to explain that gravity is why the planets orbit the sun. He said all objects with mass have a gravitational attraction towards each other. According to his ideas we orbit the sun because it is pulling on us, the moon orbits Earth because we are pulling on it. Newton ruled for two-and-a-half centuries before Albert Einstein turned up in 1915 to usurp him with his General Theory of Relativity. This new picture neatly explained inconsistencies in Mercury’s orbit, and was famously confirmed by observations of a solar eclipse off the coast of Africa in 1919.

    Instead of a pull, Einstein saw gravity as the result of curved space. He said that all objects in the universe sit in a smooth, four-dimensional fabric called space-time. Massive objects such as the sun warp the space-time around them, and so Earth’s orbit is simply the result of our planet following this curvature. To us that looks like a Newtonian gravitational pull. This space-time picture has now been on the throne for over 100 years, and has so far vanquished all pretenders to its crown. The discovery of gravitational waves in 2015 was a decisive victory, but, like its predecessors, it too might be about to fall. That’s because it is fundamentally incompatible with the other big beast in the physics zoo: Quantum theory.

    The quantum world is notoriously weird. Single particles can be in two places at once, for example. Only by making an observation do we force it to ‘choose’. Before an observation we can only assign probabilities to the likely outcomes. In the 1930s, Erwin Schrödinger devised a famous way to expose how perverse this idea is. He imagined a cat in a sealed box accompanied by a vial of poison attached to a hammer. The hammer is hooked up to a device that measures the quantum state of a particle. Whether or not the hammer smashes the vial and kills the cat hinges on that measurement, but quantum physics says that until such a measurement is made, the particle is simultaneously in both states, which means the vial is both broken and unbroken and the cat is alive and dead.

    Such a picture cannot be reconciled with a smooth, continuous fabric of space-time. “A gravitational field cannot be in two places at once,” said Sabine Hossenfelder, a theoretical physicist at the Frankfurt Institute for Advanced Studies [Frankfurter Institut für fortgeschrittene Studien] (DE). According to Einstein, space-time is warped by matter and energy, but quantum physics says matter and energy exist in multiple states simultaneously — they can be both here and over there. “So where is the gravitational field?” asks Hossenfelder. “Nobody has an answer to that question. It’s kind of embarrassing,” she said.

    1
    Massive bodies warp the fabric of space and time around them, leading to nearby objects following a curved path. (Image credit: Take 27 Ltd.)

    Try and use general relativity and quantum theory together, and it doesn’t work. “Above a certain energy, you get probabilities that are larger than one,” said Hossenfelder. One is the highest probability possible — it means an outcome is certain. You can’t be more certain than certain. Equally, calculations sometimes give you the answer infinity, which has no real physical meaning. The two theories are therefore mathematically inconsistent. So, like many monarchs throughout history, physicists are seeking a marriage between rival factions to secure peace. They’re searching for a theory of quantum gravity— the ultimate diplomatic exercise in getting these two rivals to share the throne. This has seen theorists turn to some outlandish possibilities.

    Arguably the most famous is string theory. It’s the idea that sub-atomic particles such as electrons and quarks are made from tiny vibrating strings. Just as you can play strings on a musical instrument to create different notes, string theorists argue that different combinations of strings create different particles. The attraction of the theory is that it can reconcile general relativity and quantum physics, at least on paper. However, to pull that particular rabbit out of the hat, the strings have to vibrate across eleven dimensions — seven more than the four in Einstein’s space-time fabric. As yet there is no experimental evidence that these extra dimensions really exist. “It might be interesting mathematics, but whether it describes the space-time in which we live, we don’t really know until there is an experiment,” said Jorma Louko from the University of Nottingham (UK).

    2
    One way to reconcile general relativity and quantum theory says reality is made of vibrating strings. (Image credit: Science Photo Library.)

    Partly inspired by string theory’s perceived failings, other physicists have turned to an alternative called Loop Quantum Gravity (LQG). They can get the two theories to play nicely if they do away with one of the central tenets of general relativity: That space-time is a smooth, continuous fabric. Instead, they argue, space-time is made up of a series of interwoven loops — that it has structure at the smallest size scales. This is a bit like a length of cloth. At first glance it looks like one smooth fabric. Look closely, however, and you’ll see it is really made of a network of stitches. Alternatively, think of it like a photograph on a computer screen: Zoom in, and you’ll see it is really made of individual pixels.

The trouble is that when LQG physicists say small, they mean really small. These defects in space-time would only be apparent on the level of the Planck scale — around a trillionth of a trillionth of a trillionth of a meter. That’s so tiny that there would be more loops in a cubic centimeter of space than cubic centimeters in the entire observable universe. “If space-time only differs on the Planck scale, then this would be difficult to test in any particle accelerator,” says Louko. You’d need an atom smasher 1,000 trillion times more powerful than the Large Hadron Collider (LHC) at CERN. How, then, can you detect space-time defects that small? The answer is to look across a large area of space.
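A quick back-of-the-envelope check of that comparison (our own arithmetic, using round numbers for the Planck length and the radius of the observable universe):

```python
# Rough arithmetic check: Planck-scale volumes per cubic centimeter versus
# cubic centimeters in the observable universe (order-of-magnitude only).
from math import pi

planck_length_m = 1.6e-35                      # approximate Planck length
planck_volume_m3 = planck_length_m ** 3        # ~ 4e-105 m^3

cm3_in_m3 = 1e-6
planck_volumes_per_cm3 = cm3_in_m3 / planck_volume_m3                     # ~ 2e98

universe_radius_m = 4.4e26                     # ~ radius of observable universe
universe_volume_cm3 = (4 / 3) * pi * universe_radius_m ** 3 / cm3_in_m3   # ~ 4e86

print(f"Planck volumes per cm^3:      {planck_volumes_per_cm3:.1e}")
print(f"cm^3 in observable universe:  {universe_volume_cm3:.1e}")
# ~2e98 versus ~4e86 -- the claim holds by a factor of about a trillion.
```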

    Light arriving here from the furthest reaches of the universe has traveled through billions of light years of space-time along the way. While the effect of each space-time defect would be tiny, over those distances interactions with multiple defects might well add up to a potentially observable effect. For the last decade, astronomers have been using light from far-off Gamma Ray Bursts to look for evidence in support of LQG. These cosmic flashes are the result of massive stars collapsing at the ends of their lives, and there is something about these distant detonations we currently cannot explain. “Their spectrum has a systematic distortion to it,” said Hossenfelder, but no one knows if that is something that happens on the way here or if it’s something to do with the source of the bursts themselves. The jury is still out.

    3
    An alternate picture says space and time is not smooth, but instead made of a series of tiny loops. (Image credit: Science Photo Library.)

    To make progress, we might have to go a step further than saying space-time isn’t the smooth, continuous fabric Einstein suggested. According to Einstein, space-time is like a stage that remains in place whether actors are treading its boards or not —even if there were no stars or planets dancing around, space-time would still be there. However, physicists Laurent Freidel, Robert Leigh, and Djordje Minic think that this picture is holding us back. They believe space-time doesn’t exist independently of the objects in it. Space-time is defined by the way objects interact. That would make space-time an artifact of the quantum world itself, not something to be combined with it. “It may sound kooky,” said Minic, “but it is a very precise way of approaching the problem.”

    The attraction of this theory — called modular space-time — is that it might help solve another long-standing problem in theoretical physics regarding something called locality, and a notorious phenomenon in quantum physics called entanglement. Physicists can set up a situation whereby they bring two particles together and link their quantum properties. They then separate them by a large distance and find they are still linked. Change the properties of one and the other will change instantly, as if information has traveled from one to the other faster than the speed of light in direct violation of relativity. Einstein was so perturbed by this phenomenon that he called it ‘spooky action at a distance’.

    Modular space-time theory can accommodate such behavior by redefining what it means to be separated. If space-time emerges from the quantum world, then being closer in a quantum sense is more fundamental than being close in a physical sense. “Different observers would have different notions of locality,” said Minic, “it depends on the context.” It’s a bit like our relationships with other people. We can feel closer to a loved one far away than the stranger who lives down the street. “You can have these non-local connections as long as they are fairly small,” said Hossenfelder.

Freidel, Leigh, and Minic have been working on their idea for the last five years, and they believe they are slowly making progress. “We want to be conservative and take things step-by-step,” said Minic, “but it is tantalizing and exciting.” It’s certainly a novel approach, one that looks to “gravitationalize” the quantum world rather than quantizing gravity as in LQG. Yet as with any scientific theory, it needs to be tested. At the moment the trio are working on how to fit time into their model.

    This may all sound incredibly esoteric, something only academics should care about, but it could have a more profound effect on our everyday lives. “We sit in space, we travel through time, and if something changes in our understanding of space-time this will impact not only on our understanding of gravity, but of quantum theory in general,” said Hossenfelder. “All our present devices only work because of quantum theory. If we understand the quantum structure of space-time better that will have an impact on future technologies — maybe not in 50 or 100 years, but maybe in 200,” she said.

The current monarch is getting long in the tooth, and a new pretender is long overdue, but we can’t decide which of the many options is the most likely to succeed. When we do, the resulting revolution could bear fruit not just for theoretical physics, but for all.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 11:51 am on May 14, 2021 Permalink | Reply
    Tags: "Quantum machine learning hits a limit", A new theorem shows that information run through an information scrambler such as a black hole will reach a point where any algorithm will be unable to learn the information that has been scrambled., Barren plateaus are regions in the mathematical space of optimization algorithms where the ability to solve the problem becomes exponentially harder as the size of the system being studied increases., , Quantum entanglement,   

    From DOE’s Los Alamos National Laboratory (US) : “Quantum machine learning hits a limit” 


    From DOE’s Los Alamos National Laboratory (US)

    May 12, 2021

    Charles Poling
    cpoling@lanl.gov
    (505) 257-8006

    1
    A new theorem shows that information run through an information scrambler such as a black hole will reach a point where any algorithm will be unable to learn the information that has been scrambled.

    A new theorem from the field of quantum machine learning has poked a major hole in the accepted understanding about information scrambling.

    “Our theorem implies that we are not going to be able to use quantum machine learning to learn typical random or chaotic processes, such as black holes. In this sense, it places a fundamental limit on the learnability of unknown processes,” said Zoe Holmes, a post-doc at Los Alamos National Laboratory and coauthor of the paper describing the work published today in Physical Review Letters.

    “Thankfully, because most physically interesting processes are sufficiently simple or structured so that they do not resemble a random process, the results don’t condemn quantum machine learning, but rather highlight the importance of understanding its limits,” Holmes said.

    In the classic Hayden-Preskill thought experiment, a fictitious Alice tosses information such as a book into a black hole that scrambles the text. Her companion, Bob, can still retrieve it using entanglement, a unique feature of quantum physics. However, the new work proves that fundamental constraints on Bob’s ability to learn the particulars of a given black hole’s physics means that reconstructing the information in the book is going to be very difficult or even impossible.

“Any information run through an information scrambler such as a black hole will reach a point where the machine learning algorithm stalls out on a barren plateau and thus becomes untrainable. That means the algorithm can’t learn scrambling processes,” said Andrew Sornborger, a computer scientist at Los Alamos and coauthor of the paper. Sornborger is director of the Quantum Science Center at Los Alamos and leader of the Center’s algorithms and simulation thrust. The Center is a multi-institutional collaboration led by DOE’s Oak Ridge National Laboratory (US).

    Barren plateaus are regions in the mathematical space of optimization algorithms where the ability to solve the problem becomes exponentially harder as the size of the system being studied increases. This phenomenon, which severely limits the trainability of large scale quantum neural networks, was described in a recent paper [Nature Communications] by a related Los Alamos team.
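The scaling behind that statement can be illustrated with a toy numerical experiment (our own sketch, not the Los Alamos analysis): for random n-qubit states, the expectation value of a simple observable concentrates ever more tightly around zero as n grows, roughly like 2^-n, so cost functions built on such expectation values offer an exponentially vanishing training signal.

```python
# Toy illustration of the concentration effect behind barren plateaus:
# sample Haar-random n-qubit states and measure <Z> on the first qubit.
# The variance of <Z> shrinks roughly as 2^-n, starving optimizers of gradient signal.
import numpy as np

rng = np.random.default_rng(1)

def random_state(n_qubits: int) -> np.ndarray:
    dim = 2 ** n_qubits
    v = rng.normal(size=dim) + 1j * rng.normal(size=dim)
    return v / np.linalg.norm(v)                 # Haar-random pure state

def z_expectation(state: np.ndarray) -> float:
    probs = np.abs(state) ** 2
    half = state.size // 2
    return float(probs[:half].sum() - probs[half:].sum())   # <Z> on qubit 0

for n in range(2, 11, 2):
    samples = [z_expectation(random_state(n)) for _ in range(2000)]
    print(f"n = {n:2d} qubits: Var[<Z>] = {np.var(samples):.2e}  (2^-n = {2**-n:.2e})")
```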

    “Recent work has identified the potential for quantum machine learning to be a formidable tool in our attempts to understand complex systems,” said Andreas Albrecht, a co-author of the research. Albrecht is Director of the Center for Quantum Mathematics and Physics (QMAP) and Distinguished Professor, Department of Physics and Astronomy, at University of California-Davis (US). “Our work points out fundamental considerations that limit the capabilities of this tool.”

    In the Hayden-Preskill thought experiment, Alice attempts to destroy a secret, encoded in a quantum state, by throwing it into nature’s fastest scrambler, a black hole. Bob and Alice are the fictitious quantum dynamic duo typically used by physicists to represent agents in a thought experiment.

“You might think that this would make Alice’s secret pretty safe,” Holmes said, “but Hayden and Preskill argued that if Bob knows the unitary dynamics implemented by the black hole, and shares a maximally entangled state with the black hole, it is possible to decode Alice’s secret by collecting a few additional photons emitted from the black hole. But this prompts the question, how could Bob learn the dynamics implemented by the black hole? Well, not by using quantum machine learning, according to our findings.”

    A key piece of the new theorem developed by Holmes and her coauthors assumes no prior knowledge of the quantum scrambler, a situation unlikely to occur in real-world science.

“Our work draws attention to the tremendous leverage even small amounts of prior information may have in our ability to extract information from complex systems and potentially reduce the power of our theorem,” Albrecht said. “Our ability to do this can vary greatly among different situations (as we scan from theoretical consideration of black holes to concrete situations controlled by humans here on earth). Future research is likely to turn up interesting examples, both of situations where our theorem remains fully in force, and others where it can be evaded.”

    Funding: U.S. Department of Energy, Office of Science.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    DOE’s Los Alamos National Laboratory mission is to solve national security challenges through scientific excellence.

    LANL campus
DOE’s Los Alamos National Laboratory, a multidisciplinary research institution engaged in strategic science on behalf of national security, is operated by Los Alamos National Security, LLC, a team composed of Bechtel National, the University of California, The Babcock & Wilcox Company, and URS for the Department of Energy’s National Nuclear Security Administration.
    Los Alamos enhances national security by ensuring the safety and reliability of the U.S. nuclear stockpile, developing technologies to reduce threats from weapons of mass destruction, and solving problems related to energy, environment, infrastructure, health, and global security concerns.

    Operated by Los Alamos National Security, LLC for the DOE National Nuclear Security Administration(US)

     
  • richardmitnick 11:17 am on May 9, 2021 Permalink | Reply
    Tags: "NIST Team Directs and Measures Quantum Drum Duet", , , , Quantum entanglement,   

    From National Institute of Standards and Technology (US) : “NIST Team Directs and Measures Quantum Drum Duet” 

    From National Institute of Standards and Technology (US)

    May 06, 2021
    Laura Ost
    laura.ost@nist.gov
    (303) 497-4880

    Like conductors of a spooky symphony, researchers at the National Institute of Standards and Technology (NIST) have “entangled” two small mechanical drums and precisely measured their linked quantum properties. Entangled pairs like this might someday perform computations and transmit data in large-scale quantum networks.

    The NIST team used microwave pulses to entice the two tiny aluminum drums into a quantum version of the Lindy Hop, with one partner bopping in a cool and calm pattern while the other was jiggling a bit more. Researchers analyzed radar-like signals to verify that the two drums’ steps formed an entangled pattern — a duet that would be impossible in the everyday classical world.

    1
    Credit: Juha Juvonen.

    2
    NIST researchers entangled the beats of these two mechanical drums — tiny aluminum membranes each made of about 1 trillion atoms — and precisely measured their linked quantum properties. Entangled pairs like this (as shown in this colorized micrograph), which are massive by quantum standards, might someday perform computations and transmit data in large-scale quantum networks. Credit: J. Teufel/NIST.

    What’s new is not so much the dance itself but the researchers’ ability to measure the drumbeats, rising and falling by just one-quadrillionth of a meter, and verify their fragile entanglement by detecting subtle statistical relationships between their motions.

    The research is described in the May 7 issue of Science.

    “If you analyze the position and momentum data for the two drums independently, they each simply look hot,” NIST physicist John Teufel said. “But looking at them together, we can see that what looks like random motion of one drum is highly correlated with the other, in a way that is only possible through quantum entanglement.”

    Quantum mechanics was originally conceived as the rulebook for light and matter at atomic scales. However, in recent years researchers have shown that the same rules can apply to increasingly larger objects such as the drums. Their back-and-forth motion makes them a type of system known as a mechanical oscillator. Such systems were entangled for the first time at NIST about a decade ago, and in that case the mechanical elements were single atoms.

    Since then, Teufel’s research group has been demonstrating quantum control of drumlike aluminum membranes suspended above sapphire mats. By quantum standards, the NIST drums are massive, 20 micrometers wide by 14 micrometers long and 100 nanometers thick. They each weigh about 70 picograms, which corresponds to about 1 trillion atoms.

    Entangling massive objects is difficult because they interact strongly with the environment, which can destroy delicate quantum states. Teufel’s group developed new methods to control and measure the motion of two drums simultaneously. The researchers adapted a technique first demonstrated in 2011 for cooling a single drum by switching from steady to pulsed microwave signals to separately optimize the steps of cooling, entangling and measuring the states. To rigorously analyze the entanglement, experimentalists also worked more closely with theorists, an increasingly important alliance in the global effort to build quantum networks.

    The NIST drum set is connected to an electrical circuit and encased in a cryogenically chilled cavity. When a microwave pulse is applied, the electrical system interacts with and controls the activities of the drums, which can sustain quantum states like entanglement for approximately a millisecond, a long time in the quantum world.

    For the experiments, researchers applied two simultaneous microwave pulses to cool the drums, two more simultaneous pulses to entangle the drums, and two final pulses to amplify and record the signals representing the quantum states of the two drums. The states are encoded in a reflected microwave field, similar to radar. Researchers compared the reflections to the original microwave pulse to determine the position and momentum of each drum.

    To cool the drums, researchers applied pulses at a frequency below the cavity’s natural vibrations. As in the 2011 experiment, the drumbeats converted applied photons to the cavity’s higher frequency. These photons leaked out of the cavity as it filled up. Each departing photon took with it one mechanical unit of energy — one phonon, or one quantum — from drum motion. This got rid of most of the heat-related drum motion.

    To create entanglement, researchers applied microwave pulses in between the frequencies of the two drums, higher than drum 1 and lower than drum 2. These pulses entangled drum 1 phonons with the cavity’s photons, generating correlated photon-phonon pairs. The pulses also cooled drum 2 further, as photons leaving the cavity were replaced with phonons. What was left was mostly pairs of entangled phonons shared between the two drums.

    To entangle the phonon pairs, the duration of the pulses was crucial. Researchers discovered that these microwave pulses needed to last longer than 4 microseconds, ideally 16.8 microseconds, to strongly entangle the phonons. During this time period the entanglement became stronger and the motion of each drum increased because they were moving in unison, a kind of sympathetic reinforcement, Teufel said.

    Researchers looked for patterns in the returned signals, or radar data. In the classical world the results would be random. Plotting the results on a graph revealed unusual patterns suggesting the drums were entangled. To be certain, the researchers ran the experiment 10,000 times and applied a statistical test to calculate the correlations between various sets of results, such as the positions of the two drums.

“Roughly speaking, we measured how correlated two variables are — for example, if you measured the position of one drum, how well could you predict the position of the other drum,” Teufel said. “If they have no correlations and they are both perfectly cold, you could only guess the average position of the other drum within an uncertainty of half a quantum of motion. When they are entangled, we can do better, with less uncertainty. Entanglement is the only way this is possible.”

“To verify that entanglement is present, we do a statistical test called an ‘entanglement witness,’” NIST theorist Scott Glancy said. “We observe correlations between the drums’ positions and momentums, and if those correlations are stronger than can be produced by classical physics, we know the drums must have been entangled. The radar signals measure position and momentum simultaneously, but the Heisenberg uncertainty principle says that this can’t be done with perfect accuracy. Therefore, we pay a cost of extra randomness in our measurements. We manage that uncertainty by collecting a large data set and correcting for the uncertainty during our statistical analysis.”
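The flavor of that correlation analysis can be sketched with a purely classical Gaussian toy model (ours, not NIST's actual entanglement-witness test): when two noisy position records share a strong common component, knowing one record sharply reduces the uncertainty in predicting the other.

```python
# Classical Gaussian toy model of correlated "drum" position records
# (illustration only; the real test compares correlations against the
# limits allowed by classical physics).
import numpy as np

rng = np.random.default_rng(2)
n = 100_000
common = rng.normal(size=n)                   # shared, correlated motion
x1 = common + 0.3 * rng.normal(size=n)        # drum 1 position readings
x2 = common + 0.3 * rng.normal(size=n)        # drum 2 position readings

corr = np.corrcoef(x1, x2)[0, 1]
prediction = corr * (np.std(x2) / np.std(x1)) * x1    # best linear guess of x2 from x1
residual_std = np.std(x2 - prediction)

print(f"correlation:         {corr:.3f}")
print(f"std of x2 alone:     {np.std(x2):.3f}")
print(f"std of x2 given x1:  {residual_std:.3f}   # much smaller when correlated")
```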

    Highly entangled, massive quantum systems like this might serve as long-lived nodes of quantum networks. The high-efficiency radar measurements used in this work could be helpful in applications such as quantum teleportation — data transfer without a physical link — or swapping entanglement between nodes of a quantum network, because these applications require decisions to be made based on measurements of entanglement outcomes. Entangled systems could also be used in fundamental tests of quantum mechanics and force sensing beyond standard quantum limits.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

NIST Campus, Gaithersburg, MD, USA

    National Institute of Standards and Technology (US)‘s Mission, Vision, Core Competencies, and Core Values

    Mission

    To promote U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology in ways that enhance economic security and improve our quality of life.

    NIST’s vision

    NIST will be the world’s leader in creating critical measurement solutions and promoting equitable standards. Our efforts stimulate innovation, foster industrial competitiveness, and improve the quality of life.

    NIST’s core competencies

    Measurement science
    Rigorous traceability
    Development and use of standards

    NIST’s core values

    NIST is an organization with strong values, reflected both in our history and our current work. NIST leadership and staff will uphold these values to ensure a high performing environment that is safe and respectful of all.

    Perseverance: We take the long view, planning the future with scientific knowledge and imagination to ensure continued impact and relevance for our stakeholders.
    Integrity: We are ethical, honest, independent, and provide an objective perspective.
    Inclusivity: We work collaboratively to harness the diversity of people and ideas, both inside and outside of NIST, to attain the best solutions to multidisciplinary challenges.
    Excellence: We apply rigor and critical thinking to achieve world-class results and continuous improvement in everything we do.

    Background

    The Articles of Confederation, ratified by the colonies in 1781, contained the clause, “The United States in Congress assembled shall also have the sole and exclusive right and power of regulating the alloy and value of coin struck by their own authority, or by that of the respective states—fixing the standards of weights and measures throughout the United States”. Article 1, section 8, of the Constitution of the United States (1789), transferred this power to Congress; “The Congress shall have power…To coin money, regulate the value thereof, and of foreign coin, and fix the standard of weights and measures”.

    In January 1790, President George Washington, in his first annual message to Congress, stated that “Uniformity in the currency, weights, and measures of the United States is an object of great importance, and will, I am persuaded, be duly attended to”, and ordered Secretary of State Thomas Jefferson to prepare a plan for Establishing Uniformity in the Coinage, Weights, and Measures of the United States, afterwards referred to as the Jefferson report. On October 25, 1791, Washington appealed a third time to Congress, “A uniformity of the weights and measures of the country is among the important objects submitted to you by the Constitution and if it can be derived from a standard at once invariable and universal, must be no less honorable to the public council than conducive to the public convenience”, but it was not until 1838 that a uniform set of standards was worked out. In 1821, John Quincy Adams had declared, “Weights and measures may be ranked among the necessities of life to every individual of human society”.

    From 1830 until 1901, the role of overseeing weights and measures was carried out by the Office of Standard Weights and Measures, which was part of the U.S. Coast and Geodetic Survey in the Department of the Treasury.

    Bureau of Standards

    In 1901, in response to a bill proposed by Congressman James H. Southard (R-Ohio), the National Bureau of Standards was founded with the mandate to provide standard weights and measures and to serve as the national physical laboratory for the United States. (Southard had previously sponsored a bill for metric conversion of the United States.)

    President Theodore Roosevelt appointed Samuel W. Stratton as the first director. The budget for the first year of operation was $40,000. The Bureau took custody of the copies of the kilogram and meter bars that were the standards for US measures, and set up a program to provide metrology services for United States scientific and commercial users. A laboratory site was constructed in Washington DC (US) and instruments were acquired from the national physical laboratories of Europe. In addition to weights and measures, the Bureau developed instruments for electrical units and for the measurement of light. In 1905, a meeting was called that would be the first National Conference on Weights and Measures.

    Initially conceived as purely a metrology agency, the Bureau of Standards was directed by Herbert Hoover to set up divisions to develop commercial standards for materials and products. Some of these standards were for products intended for government use, but product standards also affected private-sector consumption. Quality standards were developed for products including some types of clothing; automobile brake systems and headlamps; antifreeze; and electrical safety. During World War I, the Bureau worked on multiple problems related to war production, even operating its own facility to produce optical glass when European supplies were cut off. Between the wars, Harry Diamond of the Bureau developed a blind-approach radio aircraft landing system. During World War II, military research and development was carried out, including development of radio propagation forecast methods; the proximity fuze and the standardized airframe used originally for Project Pigeon; and, shortly afterwards, the autonomously radar-guided Bat anti-ship guided bomb and the Kingfisher family of torpedo-carrying missiles.

    In 1948, financed by the United States Air Force, the Bureau began design and construction of SEAC, the Standards Eastern Automatic Computer. The computer went into operation in May 1950 using a combination of vacuum tubes and solid-state diode logic. About the same time, the Standards Western Automatic Computer was built at the Los Angeles office of the NBS by Harry Huskey and used for research there. A mobile version, DYSEAC, was built for the Signal Corps in 1954.

    Due to a changing mission, the “National Bureau of Standards” became the “National Institute of Standards and Technology (US)” in 1988.

    Following September 11, 2001, NIST conducted the official investigation into the collapse of the World Trade Center buildings.

    Organization

    NIST is headquartered in Gaithersburg, Maryland, and operates a facility in Boulder, Colorado, which was dedicated by President Eisenhower in 1954. NIST’s activities are organized into laboratory programs and extramural programs. Effective October 1, 2010, NIST was realigned by reducing the number of NIST laboratory units from ten to six. NIST Laboratories include:

    Communications Technology Laboratory (CTL)
    Engineering Laboratory (EL)
    Information Technology Laboratory (ITL)
    Center for Neutron Research (NCNR)
    Material Measurement Laboratory (MML)
    Physical Measurement Laboratory (PML)

    Extramural programs include:

    Hollings Manufacturing Extension Partnership (MEP), a nationwide network of centers to assist small and mid-sized manufacturers to create and retain jobs, improve efficiencies, and minimize waste through process improvements and to increase market penetration with innovation and growth strategies;
    Technology Innovation Program (TIP), a grant program where NIST and industry partners cost share the early-stage development of innovative but high-risk technologies;
    Baldrige Performance Excellence Program, which administers the Malcolm Baldrige National Quality Award, the nation’s highest award for performance and business excellence.

    NIST’s Boulder laboratories are best known for NIST‑F1, a cesium fountain atomic clock that serves as the source of the nation’s official time. From its measurement of the natural resonance frequency of cesium—which defines the second—NIST broadcasts time signals via longwave radio station WWVB near Fort Collins in Colorado, and shortwave radio stations WWV and WWVH, located near Fort Collins and Kekaha in Hawai’i, respectively.

    NIST also operates a neutron science user facility: the NIST Center for Neutron Research (NCNR). The NCNR provides scientists access to a variety of neutron scattering instruments which they use in many research fields (materials science; fuel cells; biotechnology etc.).

    The SURF III Synchrotron Ultraviolet Radiation Facility is a source of synchrotron radiation in continuous operation since 1961. SURF III now serves as the US national standard for source-based radiometry throughout the generalized optical spectrum. All NASA-borne extreme-ultraviolet observation instruments have been calibrated at SURF since the 1970s, and SURF is used for measurement and characterization of systems for extreme ultraviolet lithography.

    The Center for Nanoscale Science and Technology (CNST) performs research in nanotechnology, both through internal research efforts and by running a user-accessible cleanroom nanomanufacturing facility. This “NanoFab” is equipped with tools for lithographic patterning and imaging (e.g., electron microscopes and atomic force microscopes).

    Committees

    NIST has seven standing committees:

    Technical Guidelines Development Committee (TGDC)
    Advisory Committee on Earthquake Hazards Reduction (ACEHR)
    National Construction Safety Team Advisory Committee (NCST Advisory Committee)
    Information Security and Privacy Advisory Board (ISPAB)
    Visiting Committee on Advanced Technology (VCAT)
    Board of Overseers for the Malcolm Baldrige National Quality Award (MBNQA Board of Overseers)
    Manufacturing Extension Partnership National Advisory Board (MEPNAB)

    Measurements and standards

    As part of its mission, NIST supplies industry, academia, government, and other users with over 1,300 Standard Reference Materials (SRMs). These artifacts are certified as having specific characteristics or component content and are used as calibration standards for measuring equipment and procedures, as quality control benchmarks for industrial processes, and as experimental control samples.

    Handbook 44

    NIST publishes Handbook 44 each year after the annual meeting of the National Conference on Weights and Measures (NCWM). Each edition is developed through cooperation of the Committee on Specifications and Tolerances of the NCWM and the Weights and Measures Division (WMD) of NIST. The purpose of the book is a partial fulfillment of the statutory responsibility for “cooperation with the states in securing uniformity of weights and measures laws and methods of inspection”.

    NIST has been publishing various forms of what is now the Handbook 44 since 1918 and began publication under the current name in 1949. The 2010 edition conforms to the concept of the primary use of the SI (metric) measurements recommended by the Omnibus Foreign Trade and Competitiveness Act of 1988.

     
  • richardmitnick 11:41 pm on May 2, 2021 Permalink | Reply
    Tags: "Going Beyond Qubits: New Study Demonstrates Key Components for a Qutrit-Based Quantum Computer", , , , , Quantum entanglement, Quantum process tomography,   

    From DOE’s Lawrence Berkeley National Laboratory(US): “Going Beyond Qubits: New Study Demonstrates Key Components for a Qutrit-Based Quantum Computer” 

    April 26, 2021
    Media Relations
    media@lbl.gov
    (510) 486-5183

    By Glenn Roberts Jr.

    Team led by Berkeley Lab, UC Berkeley scientists builds a new type of quantum processor capable of information scrambling like that theorized within black holes.

    1
    The experimental quantum computing setup at the Advanced Quantum Testbed. (Credit: Berkeley Lab)

    A team led by physicists at Lawrence Berkeley National Laboratory (Berkeley Lab) and University of California at Berkeley (US) has successfully observed the scrambling of quantum information, which is thought to underlie the behavior of black holes, using qutrits: information-storing quantum units that can represent three separate states at the same time. Their efforts also pave the way for building a quantum information processor based upon qutrits.

    The black hole information paradox

    The new study, recently published in the journal Physical Review X, makes use of a quantum circuit that is inspired by the longstanding physics question: What happens to information when it enters a black hole?

    Beyond the connection to cosmology and fundamental physics, the team’s technical milestones that made the experiment possible represent important progress toward using more complex quantum processors for quantum computing, cryptography, and error detection, among other applications.

    While black holes are considered one of the most destructive forces in the universe – matter and light cannot escape their pull, and are quickly and thoroughly scrambled once they enter – there has been considerable debate about whether and how information is lost after passing into a black hole.

    The late physicist Stephen Hawking showed that black holes emit radiation – now known as Hawking radiation – as they slowly evaporate over time. In principle, this radiation could carry information about what’s inside the black hole – even allowing the reconstruction of information that passes into the black hole.

    And by using a quantum property known as entanglement, it is possible to perform this reconstruction significantly more rapidly, as was shown in earlier work.

    Quantum entanglement defies the rules of classical physics, allowing particles to remain correlated even when separated by large distances, so that the state of one particle will inform you about the state of its entangled partner. If you had two entangled coins, for example, knowing that one coin came up heads when you looked at it would automatically tell you that the other entangled coin was tails.

    Most efforts in quantum computing seek to tap into this phenomenon by encoding information as entangled quantum bits, known as qubits (pronounced CUE-bits). Like a traditional computer bit, which can hold the value of zero or one, a qubit can also be either a zero or one. But in addition, a qubit can exist in a superposition that is both one and zero at the same time. In the case of a coin, it’s like a coin flip that can represent either heads or tails, as well as the superposition of both heads and tails at the same time.
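
    As a small numerical illustration of those correlated outcomes (a generic Python sketch, not code from the study), the two-qubit state (|01> + |10>)/sqrt(2) behaves just like the entangled coins: each qubit’s result is random on its own, but the pair always comes out opposite.

        import numpy as np

        rng = np.random.default_rng(1)

        # "Entangled coins": the two-qubit state (|01> + |10>)/sqrt(2).
        # Basis states are indexed 00, 01, 10, 11.
        state = np.zeros(4)
        state[0b01] = state[0b10] = 1 / np.sqrt(2)

        probs = np.abs(state) ** 2               # Born rule: outcome probabilities
        shots = rng.choice(4, size=8, p=probs)   # simulate 8 joint measurements
        for s in shots:
            a, b = s >> 1, s & 1                 # results for the two qubits
            print(f"qubit A = {a}, qubit B = {b}")   # always opposite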

    The power of 3: Introducing qutrits

    Each qubit you add to a quantum computer doubles its computing power, and that exponential increase soars when you use quantum units capable of storing more values, like qutrits (pronounced CUE-trits). Because of this, it takes far fewer qubits – and fewer still qutrits or qudits, the general term for quantum units with three or more states – to run algorithms complex enough to demonstrate the ability to solve problems that conventional computers cannot.
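
    To put rough numbers on that scaling (an illustrative back-of-envelope comparison, not a figure from the paper): fully describing n qubits takes 2^n complex amplitudes, while n qutrits take 3^n, so the five qutrits used here span 3^5 = 243 basis states versus 2^5 = 32 for five qubits.

        # Illustrative scaling comparison: amplitudes needed to describe
        # n qubits (2^n) versus n qutrits (3^n).
        for n in (5, 10, 20):
            print(f"n = {n:2d}:  qubits 2^n = {2 ** n:>9,}   qutrits 3^n = {3 ** n:>13,}")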

    That said, there are a number of technical hurdles to building quantum computers with a large number of quantum bits that can operate reliably and efficiently in solving problems in a truly quantum way.

    In this latest study, researchers detail how they developed a quantum processor capable of encoding and transmitting information using a series of five qutrits, which can each simultaneously represent three states. And despite the typically noisy, imperfect, and error-prone environment of quantum circuitry, they found that their platform proved surprisingly resilient and robust.

    Qutrits can have a value of zero, one, or two, holding all of these states in superposition. In the coin analogy, it’s like a coin that has the possibility of coming up as heads, tails, or landing on its thin edge.

    “A black hole is an extremely good encoder of information,” said Norman Yao, a faculty scientist in Berkeley Lab’s Materials Sciences Division and an assistant professor of physics at UC Berkeley who helped to lead the planning and design of the experiment. “It smears it out very quickly, so that any local noise has an extremely hard time destroying this information.”

    But, he added, “The encoder is so darn good that it’s also very hard to decode this information.”

    Creating an experiment to mimic quantum scrambling

    The team set out to replicate the type of rapid quantum information smearing, or scrambling, in an experiment that used tiny devices called nonlinear harmonic oscillators as qutrits. These nonlinear harmonic oscillators are essentially sub-micron-sized weights on springs that can be driven at several distinct frequencies when subjected to microwave pulses.

    A common problem in making these oscillators work as qutrits, though, is that their quantum nature tends to break down very quickly via a mechanism called decoherence, so it is difficult to distinguish whether the information scrambling is truly quantum or is due to this decoherence or other interference, noted Irfan Siddiqi, the study’s lead author.

    Siddiqi is director of Berkeley Lab’s Advanced Quantum Testbed, a faculty scientist in the Lab’s Computational Research and Materials Sciences divisions, and a professor of physics at UC Berkeley.

    The testbed, which began accepting proposals from the quantum science community in 2020, is a collaborative research laboratory that provides open, free access to users who want to explore how superconducting quantum processors can be used to advance scientific research. The demonstration of scrambling is one of the first results from the testbed’s user program.

    “In principle, an isolated black hole exhibits scrambling,” Siddiqi said, “but any experimental system also exhibits loss from decoherence. In a laboratory, how do you distinguish between the two?”

    A key to the study was in preserving the coherence, or orderly patterning, of the signal carried by the oscillators for long enough to confirm that quantum scrambling was occurring via the teleportation of a qutrit. While teleportation may conjure up sci-fi imagery of “beaming up” people or objects from a planet’s surface onto a spaceship, in this case there is only the transmission of information – not matter – from one location to another via quantum entanglement.

    Another essential piece was the creation of customized logic gates that enable the realization of “universal quantum circuits,” which can be used to run arbitrary algorithms. These logic gates allow pairs of qutrits to interact with each other and were designed to handle three different levels of signals produced by the microwave pulses.
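
    The gates in the experiment are realized as hardware-specific microwave pulse sequences; purely as a generic illustration of what a two-qutrit entangling gate looks like, the sketch below (Python, a textbook construction rather than the paper’s calibrated gate) builds the qutrit analog of the CNOT, often called the SUM gate, which adds the control’s value to the target modulo 3.

        import numpy as np

        d = 3  # qutrit dimension

        # Two-qutrit SUM gate: |control, target> -> |control, (target + control) mod 3>.
        SUM = np.zeros((d * d, d * d))
        for c in range(d):
            for t in range(d):
                SUM[c * d + (t + c) % d, c * d + t] = 1

        # A permutation of the nine two-qutrit basis states, hence unitary.
        assert np.allclose(SUM @ SUM.T, np.eye(d * d))
        print(SUM.astype(int))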

    One of the five qutrits in the experiment served as the input, and the other four qutrits were in entangled pairs. Because of the nature of the qutrits’ entanglement, a joint measurement of one of the pairs of qutrits after the scrambling circuit ensured that the state of the input qutrit was teleported to another qutrit.
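
    The experiment teleports qutrits through the scrambling circuit itself; as a simplified illustration of the same “input plus entangled pair plus joint measurement plus classical correction” structure, here is a minimal NumPy simulation of textbook two-level (qubit) teleportation. Every detail (the random input state, the qubit ordering, the variable names) is a generic assumption for illustration, not the paper’s qutrit protocol.

        import numpy as np

        rng = np.random.default_rng(2)

        # Single-qubit gates.
        I = np.eye(2)
        X = np.array([[0, 1], [1, 0]])
        Z = np.array([[1, 0], [0, -1]])
        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

        # Random "message" state on qubit 0.
        psi = rng.normal(size=2) + 1j * rng.normal(size=2)
        psi /= np.linalg.norm(psi)

        # Qubits 1 and 2 share the entangled Bell pair (|00> + |11>)/sqrt(2).
        bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
        state = np.kron(psi, bell)                 # 3-qubit state, basis index = q0 q1 q2

        # CNOT with control qubit 0 and target qubit 1, built as a permutation matrix.
        cnot01 = np.zeros((8, 8))
        for b in range(8):
            q0, q1, q2 = (b >> 2) & 1, (b >> 1) & 1, b & 1
            cnot01[(q0 << 2) | ((q1 ^ q0) << 1) | q2, b] = 1
        state = cnot01 @ state
        state = np.kron(np.kron(H, I), I) @ state  # Hadamard on qubit 0

        # Joint measurement of qubits 0 and 1.
        outcome = rng.choice(8, p=np.abs(state) ** 2)
        m0, m1 = (outcome >> 2) & 1, (outcome >> 1) & 1

        # Keep the branch consistent with (m0, m1); what remains is qubit 2's state.
        keep = np.array([((b >> 2) & 1) == m0 and ((b >> 1) & 1) == m1 for b in range(8)])
        received = state[keep]
        received /= np.linalg.norm(received)

        # Classical corrections conditioned on the measurement results.
        if m1:
            received = X @ received
        if m0:
            received = Z @ received

        # Qubit 2 now holds the original message state.
        print("fidelity:", abs(np.vdot(psi, received)) ** 2)

    The fidelity prints as 1 because, after the conditional X and Z corrections, qubit 2 carries exactly the input state, even though no quantum amplitude was ever sent directly from qubit 0 to qubit 2.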

    Mirrored black holes and wormholes

    The researchers used a technique known as quantum process tomography to verify that the logic gates were working and that the information was properly scrambled, so that it was equally likely to appear in any given part of the quantum circuit.

    Siddiqi said that one way to think about how the entangled qutrits transmit information is to compare it to a black hole. It’s as if there is a black hole and a mirrored version of that black hole, so that information passing in one side of the mirrored black hole is transmitted to the other side via entanglement.

    Looking forward, Siddiqi and Yao are particularly interested in tapping into the power of qutrits for studies related to traversable wormholes, which are theoretical passages connecting separate locations in the universe, for example.

    A scientist from the Perimeter Institute for Theoretical Physics (CA) in Canada also participated in the study, which received support from the U.S. Department of Energy’s Office of Advanced Scientific Computing Research and Office of High Energy Physics, and from the National Science Foundation’s Graduate Research Fellowship.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    LBNL campus

    Bringing Science Solutions to the World

    In the world of science, Lawrence Berkeley National Laboratory (Berkeley Lab)(US) is synonymous with “excellence.” Thirteen Nobel prizes are associated with Berkeley Lab. Seventy Lab scientists are members of the National Academy of Sciences (NAS), one of the highest honors for a scientist in the United States. Thirteen of our scientists have won the National Medal of Science, our nation’s highest award for lifetime achievement in fields of scientific research. Eighteen of our engineers have been elected to the National Academy of Engineering, and three of our scientists have been elected into the Institute of Medicine. In addition, Berkeley Lab has trained thousands of university science and engineering students who are advancing technological innovations across the nation and around the world.

    Berkeley Lab is a member of the national laboratory system supported by the U.S. Department of Energy through its Office of Science. It is managed by the University of California(UC) and is charged with conducting unclassified research across a wide range of scientific disciplines. Located on a 202-acre site in the hills above the UC Berkeley campus that offers spectacular views of the San Francisco Bay, Berkeley Lab employs approximately 3,232 scientists, engineers and support staff. The Lab’s total costs for FY 2014 were $785 million. A recent study estimates the Laboratory’s overall economic impact through direct, indirect and induced spending on the nine counties that make up the San Francisco Bay Area to be nearly $700 million annually. The Lab was also responsible for creating 5,600 jobs locally and 12,000 nationally. The overall economic impact on the national economy is estimated at $1.6 billion a year. Technologies developed at Berkeley Lab have generated billions of dollars in revenues, and thousands of jobs. Savings as a result of Berkeley Lab developments in lighting and windows, and other energy-efficient technologies, have also been in the billions of dollars.

    Berkeley Lab was founded in 1931 by Ernest Orlando Lawrence, a University of California, Berkeley(US) physicist who won the 1939 Nobel Prize in physics for his invention of the cyclotron, a circular particle accelerator that opened the door to high-energy physics. It was Lawrence’s belief that scientific research is best done through teams of individuals with different fields of expertise, working together. His teamwork concept is a Berkeley Lab legacy that continues today.

    History

    1931–1941

    The laboratory was founded on August 26, 1931, by Ernest Lawrence, as the Radiation Laboratory of the University of California, Berkeley, associated with the Physics Department. It centered physics research around his new instrument, the cyclotron, a type of particle accelerator for which he was awarded the Nobel Prize in Physics in 1939.

    LBNL 88 inch cyclotron.


    Throughout the 1930s, Lawrence pushed to create larger and larger machines for physics research, courting private philanthropists for funding. He was the first to develop a large team to build big projects to make discoveries in basic research. Eventually these machines grew too large to be held on the university grounds, and in 1940 the lab moved to its current site atop the hill above campus. Part of the team put together during this period includes two other young scientists who went on to establish large laboratories; J. Robert Oppenheimer founded DOE’s Los Alamos Laboratory(US), and Robert Wilson founded Fermi National Accelerator Laboratory(US).

    1942–1950

    Leslie Groves visited Lawrence’s Radiation Laboratory in late 1942 as he was organizing the Manhattan Project, meeting J. Robert Oppenheimer for the first time. Oppenheimer was tasked with organizing the nuclear bomb development effort and founded today’s Los Alamos National Laboratory to help keep the work secret. At the RadLab, Lawrence and his colleagues developed the technique of electromagnetic enrichment of uranium using their experience with cyclotrons. The “calutrons” (named after the University) became the basic unit of the massive Y-12 facility in Oak Ridge, Tennessee. Lawrence’s lab helped contribute to what have been judged to be the three most valuable technology developments of the war (the atomic bomb, proximity fuse, and radar). The cyclotron, whose construction was stalled during the war, was finished in November 1946. The Manhattan Project shut down two months later.

    1951–2018

    After the war, the Radiation Laboratory became one of the first laboratories to be incorporated into the Atomic Energy Commission (AEC), now the Department of Energy (US). The most highly classified work remained at Los Alamos, but the RadLab remained involved. Edward Teller suggested setting up a second lab similar to Los Alamos to compete with their designs. This led to the creation of an offshoot of the RadLab (now the Lawrence Livermore National Laboratory(US)) in 1952. Some of the RadLab’s work was transferred to the new lab, but some classified research continued at Berkeley Lab until the 1970s, when it became a laboratory dedicated only to unclassified scientific research.

    Shortly after the death of Lawrence in August 1958, the UC Radiation Laboratory (both branches) was renamed the Lawrence Radiation Laboratory. The Berkeley location became the Lawrence Berkeley Laboratory in 1971, although many continued to call it the RadLab. Gradually, another shortened form came into common usage, LBNL. Its formal name was amended to Ernest Orlando Lawrence Berkeley National Laboratory in 1995, when “National” was added to the names of all DOE labs. “Ernest Orlando” was later dropped to shorten the name. Today, the lab is commonly referred to as “Berkeley Lab”.

    The Alvarez Physics Memos are a set of informal working papers of the large group of physicists, engineers, computer programmers, and technicians led by Luis W. Alvarez from the early 1950s until his death in 1988. Over 1700 memos are available on-line, hosted by the Laboratory.

    The lab remains owned by the Department of Energy(US), with management from the University of California(US). Companies such as Intel were funding the lab’s research into computing chips.

    Science mission

    From the 1950s through the present, Berkeley Lab has maintained its status as a major international center for physics research, and has also diversified its research program into almost every realm of scientific investigation. Its mission is to solve the most pressing and profound scientific problems facing humanity, conduct basic research for a secure energy future, understand living systems to improve the environment, health, and energy supply, understand matter and energy in the universe, build and safely operate leading scientific facilities for the nation, and train the next generation of scientists and engineers.

    The Laboratory’s 20 scientific divisions are organized within six areas of research: Computing Sciences; Physical Sciences; Earth and Environmental Sciences; Biosciences; Energy Sciences; and Energy Technologies. Berkeley Lab has six main science thrusts: advancing integrated fundamental energy science; integrative biological and environmental system science; advanced computing for science impact; discovering the fundamental properties of matter and energy; accelerators for the future; and developing energy technology innovations for a sustainable future.

    Berkeley Lab operates five major National User Facilities for the DOE Office of Science(US):

    The Advanced Light Source (ALS) is a synchrotron light source with 41 beam lines providing ultraviolet, soft x-ray, and hard x-ray light to scientific experiments.

    LBNL/ALS


    The ALS is one of the world’s brightest sources of soft x-rays, which are used to characterize the electronic structure of matter and to reveal microscopic structures with elemental and chemical specificity. About 2,500 scientist-users carry out research at ALS every year. Berkeley Lab is proposing an upgrade of ALS which would increase the coherent flux of soft x-rays by two to three orders of magnitude.

    The Joint Genome Institute (JGI) supports genomic research in support of the DOE missions in alternative energy, global carbon cycling, and environmental management. The JGI’s partner laboratories are Berkeley Lab, Lawrence Livermore National Lab (LLNL), DOE’s Oak Ridge National Laboratory(US)(ORNL), DOE’s Pacific Northwest National Laboratory(US) (PNNL), and the HudsonAlpha Institute for Biotechnology(US). The JGI’s central role is the development of a diversity of large-scale experimental and computational capabilities to link sequence to biological insights relevant to energy and environmental research. Approximately 1,200 scientist-users take advantage of JGI’s capabilities for their research every year.

    The LBNL Molecular Foundry(US) [above] is a multidisciplinary nanoscience research facility. Its seven research facilities focus on Imaging and Manipulation of Nanostructures; Nanofabrication; Theory of Nanostructured Materials; Inorganic Nanostructures; Biological Nanostructures; Organic and Macromolecular Synthesis; and Electron Microscopy. Approximately 700 scientist-users make use of these facilities in their research every year.

    The DOE’s National Energy Research Scientific Computing Center (NERSC)(US) is the scientific computing facility that provides large-scale computing for the DOE’s unclassified research programs. Its current systems provide over 3 billion computational hours annually. NERSC supports 6,000 scientific users from universities, national laboratories, and industry.

    National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory

    NERSC Cray Cori II supercomputer, named after Gerty Cori, the first American woman to win a Nobel Prize in science.

    NERSC Hopper Cray XE6 supercomputer, named after Grace Hopper, one of the first programmers of the Harvard Mark I computer.

    NERSC Cray XC30 Edison supercomputer.

    NERSC GPFS for Life Sciences.


    The Genepool system is a cluster dedicated to the DOE Joint Genome Institute’s computing needs. Denovo is a smaller test system for Genepool that is primarily used by NERSC staff to test new system configurations and software.

    NERSC PDSF computer cluster in 2003.

    PDSF is a networked distributed computing cluster designed primarily to meet the detector simulation and data analysis requirements of physics, astrophysics and nuclear science collaborations.

    Future:

    Cray Shasta Perlmutter SC18 AMD Epyc Nvidia pre-exascale supercomputer

    NERSC is a DOE Office of Science User Facility.

    The DOE’s Energy Science Network(US) is a high-speed network infrastructure optimized for very large scientific data flows. ESNet provides connectivity for all major DOE sites and facilities, and the network transports roughly 35 petabytes of traffic each month.

    Berkeley Lab is the lead partner in the DOE’s Joint Bioenergy Institute(US) (JBEI), located in Emeryville, California. Other partners are the DOE’s Sandia National Laboratory(US), the University of California (UC) campuses of Berkeley and Davis, the Carnegie Institution for Science(US), and DOE’s Lawrence Livermore National Laboratory(US) (LLNL). JBEI’s primary scientific mission is to advance the development of the next generation of biofuels – liquid fuels derived from the solar energy stored in plant biomass. JBEI is one of three new U.S. Department of Energy (DOE) Bioenergy Research Centers (BRCs).

    Berkeley Lab has a major role in two DOE Energy Innovation Hubs. The mission of the Joint Center for Artificial Photosynthesis (JCAP) is to find a cost-effective method to produce fuels using only sunlight, water, and carbon dioxide. The lead institution for JCAP is the California Institute of Technology(US) and Berkeley Lab is the second institutional center. The mission of the Joint Center for Energy Storage Research (JCESR) is to create next-generation battery technologies that will transform transportation and the electricity grid. DOE’s Argonne National Laboratory(US) leads JCESR and Berkeley Lab is a major partner.

    Operations and governance

    The University of California(US) operates Lawrence Berkeley National Laboratory under a contract with the US Department of Energy. The site consists of 76 buildings (owned by the U.S. Department of Energy) located on 200 acres (0.81 km^2) owned by the university in the Berkeley Hills. Altogether, the Lab has some 4,000 UC employees, of whom about 800 are students or postdocs, and each year it hosts more than 3,000 participating guest scientists. There are approximately two dozen DOE employees stationed at the laboratory to provide federal oversight of Berkeley Lab’s work for the DOE. Although Berkeley Lab is governed by UC independently of the Berkeley campus, the two entities are closely interconnected: more than 200 Berkeley Lab researchers hold joint appointments as UC Berkeley faculty.
    The Lab’s budget for the fiscal year 2019 was US$1.1 billion.

    Scientific achievements, inventions, and discoveries

    Notable scientific accomplishments at the Lab since World War II include the observation of the antiproton, the discovery of several transuranic elements, and the discovery of the accelerating universe.

    Since its inception, 13 researchers associated with Berkeley Lab (Ernest Lawrence, Glenn T. Seaborg, Edwin M. McMillan, Owen Chamberlain, Emilio G. Segrè, Donald A. Glaser, Melvin Calvin, Luis W. Alvarez, Yuan T. Lee, Steven Chu, George F. Smoot, Saul Perlmutter, and Jennifer Doudna) have been awarded either the Nobel Prize in Physics or the Nobel Prize in Chemistry.

    In addition, twenty-three Berkeley Lab employees, as contributors to the Intergovernmental Panel on Climate Change, shared the 2007 Nobel Peace Prize with former Vice President Al Gore.

    Seventy Berkeley Lab scientists are members of the U.S. National Academy of Sciences(US) (NAS), one of the highest honors for a scientist in the United States. Thirteen Berkeley Lab scientists have won the National Medal of Science, the nation’s highest award for lifetime achievement in fields of scientific research. Eighteen Berkeley Lab engineers have been elected to the National Academy of Engineering, and three Berkeley Lab scientists have been elected into the National Academy of Medicine. Nature Index rates the Lab sixth in the world among government research organizations; it is the only one of the top six that is a single laboratory, rather than a system of laboratories.

    Elements discovered by Berkeley Lab physicists include astatine; neptunium; plutonium; curium; americium; berkelium*; californium*; einsteinium; fermium; mendelevium; nobelium; lawrencium*; dubnium; and seaborgium*. The elements marked with asterisks (*) are named, in order, after Berkeley, California, Lawrence, and Seaborg. Seaborg was the principal scientist involved in their discovery. The element technetium was discovered after Ernest Lawrence gave Emilio Segrè a molybdenum strip from the Berkeley Lab cyclotron. The fabricated evidence used to claim the creation of oganesson and livermorium by Victor Ninov, a researcher employed at Berkeley Lab, led to the retraction of two articles.

    Inventions and discoveries to come out of Berkeley Lab include: “smart” windows with embedded electrodes that enable window glass to respond to changes in sunlight; synthetic genes for antimalaria and anti-AIDS superdrugs based on breakthroughs in synthetic biology; electronic ballasts for more efficient lighting; Home Energy Saver, the web’s first do-it-yourself home energy audit tool; a pocket-sized DNA sampler called the PhyloChip; and the Berkeley Darfur Stove, which uses one-quarter as much firewood as traditional cook stoves. One of Berkeley Lab’s most notable breakthroughs is the discovery of dark energy. During the 1980s and 1990s, Berkeley Lab physicists and astronomers formed the Supernova Cosmology Project (SCP), using Type Ia supernovae as “standard candles” to measure the expansion rate of the universe. Their successful methods inspired competition, with the result that early in 1998 both the SCP and the competing High-Z Supernova Search Team (US) announced the surprising discovery that the expansion is accelerating; the cause was soon named dark energy.

    Arthur Rosenfeld, a senior scientist at Berkeley Lab, was the nation’s leading advocate for energy efficiency from 1975 until his death in 2017. He led efforts at the Lab that produced several technologies that radically improved efficiency: compact fluorescent lamps; low-energy refrigerators; and windows that trap heat. He established the Center for Building Science at the Lab, which developed into the Building Technology and Urban Systems Division. He developed the first energy-efficiency standards for buildings and appliances in California, which helped the state to sustain constant electricity use per capita, a phenomenon called the Rosenfeld effect. The Energy Efficiency and Environmental Impacts Division continues to set the research foundation for the national energy efficiency standards and works with China, India, and other countries to help develop their standards.

    Carl Haber and Vitaliy Fadeyev of Berkeley Lab developed the IRENE system for optical scanning of audio discs and cylinders.
    In December 2018, researchers at Intel Corp. and the Lawrence Berkeley National Laboratory published a paper in Nature, which outlined a chip “made with quantum materials called magnetoelectric multiferroics instead of the conventional silicon,” to allow for increased processing and reduced energy consumption to support technology such as artificial intelligence.

    A U.S. Department of Energy National Laboratory Operated by the University of California.

    University of California Seal

     