Tagged: Quantum Computing

  • richardmitnick 7:36 am on July 22, 2021 Permalink | Reply
    Tags: "'Magic-angle' trilayer graphene may be a rare magnet-proof superconductor", , “spin-singlet", , In spin-singlet superconductors if you kill superconductivity it never comes back — it’s gone for good. Here it reappeared again. So this definitely says this material is not spin-singlet., , , Quantum Computing, Spin-singlet pairs happily speed through a superconductor-except under high magnetic fields-which can shift the energy of each electron in opposite directions pulling the pair apart., Spin-triplet superconductivity, Superconducting materials are defined by their super-efficient ability to conduct electricity without losing energy., When exposed to high magnetic fields the spin of both electrons in a Cooper pair shift in the same direction. They are not pulled apart- continuing superconducting regardless of the magnetic field.   

    From Massachusetts Institute of Technology (US) : “‘Magic-angle’ trilayer graphene may be a rare magnet-proof superconductor” 

    MIT News

    From Massachusetts Institute of Technology (US)

    July 21, 2021
    Jennifer Chu

    New findings might help inform the design of more powerful MRI machines or robust quantum computers.

    MIT physicists have observed signs of a rare type of superconductivity in a material called “magic-angle” twisted trilayer graphene. Credit: the researchers.

    MIT physicists have observed signs of a rare type of superconductivity in a material called magic-angle twisted trilayer graphene. In a study appearing today in Nature, the researchers report that the material exhibits superconductivity at surprisingly high magnetic fields of up to 10 Tesla, which is three times higher than what the material is predicted to endure if it were a conventional superconductor.

    The results strongly imply that magic-angle trilayer graphene, which was initially discovered by the same group, is a very rare type of superconductor, known as a “spin-triplet,” that is impervious to high magnetic fields. Such exotic superconductors could vastly improve technologies such as magnetic resonance imaging, which uses superconducting wires under a magnetic field to resonate with and image biological tissue. MRI machines are currently limited to magnetic fields of 1 to 3 Tesla. If they could be built with spin-triplet superconductors, MRI could operate under higher magnetic fields to produce sharper, deeper images of the human body.

    The new evidence of spin-triplet superconductivity in trilayer graphene could also help scientists design stronger superconductors for practical quantum computing.

    “The value of this experiment is what it teaches us about fundamental superconductivity, about how materials can behave, so that with those lessons learned, we can try to design principles for other materials which would be easier to manufacture, that could perhaps give you better superconductivity,” says Pablo Jarillo-Herrero, the Cecil and Ida Green Professor of Physics at MIT.

    His co-authors on the paper include postdoc Yuan Cao and graduate student Jeong Min Park at MIT, and Kenji Watanabe and Takashi Taniguchi of the NIMS-National Institute for Materials Science [物質・材料研究機構] (JP).

    Strange shift

    Superconducting materials are defined by their super-efficient ability to conduct electricity without losing energy. When exposed to an electric current, electrons in a superconductor couple up in “Cooper pairs” that then travel through the material without resistance, like passengers on an express train.

    In the vast majority of superconductors, these passenger pairs have opposite spins, with one electron spinning up and the other down, a configuration known as a “spin-singlet.” These pairs happily speed through a superconductor, except under high magnetic fields, which can shift the energy of each electron in opposite directions, pulling the pair apart. In this way, and through other mechanisms, high magnetic fields can derail superconductivity in conventional spin-singlet superconductors.

    “That’s the ultimate reason why in a large-enough magnetic field, superconductivity disappears,” Park says.

    But there exists a handful of exotic superconductors that are impervious to magnetic fields, up to very large strengths. These materials superconduct through pairs of electrons with the same spin, a property known as “spin-triplet.” When exposed to high magnetic fields, the energy of both electrons in a Cooper pair shifts in the same direction, so the pair is not pulled apart and continues superconducting unperturbed, regardless of the magnetic field strength.

    Jarillo-Herrero’s group was curious whether magic-angle trilayer graphene might harbor signs of this more unusual spin-triplet superconductivity. The team has produced pioneering work in the study of graphene moiré structures — layers of atom-thin carbon lattices that, when stacked at specific angles, can give rise to surprising electronic behaviors.

    The researchers initially reported such curious properties in two angled sheets of graphene, which they dubbed magic-angle bilayer graphene ([Nature] and [Nature]). They soon followed up with tests of trilayer graphene [Nature], a sandwich configuration of three graphene sheets that turned out to be even stronger than its bilayer counterpart, retaining superconductivity at higher temperatures. When the researchers applied a modest magnetic field, they noticed that trilayer graphene was able to superconduct at field strengths that would destroy superconductivity in bilayer graphene.

    “We thought, this is something very strange,” Jarillo-Herrero says.

    A super comeback

    In their new study, the physicists tested trilayer graphene’s superconductivity under increasingly higher magnetic fields. They fabricated the material by peeling away atom-thin layers of carbon from a block of graphite, stacking three layers together, and rotating the middle one by 1.56 degrees with respect to the outer layers. They attached an electrode to either end of the material to run a current through and measure any energy lost in the process. Then they turned on a large magnet in the lab, orienting its field parallel to the material.

    As they increased the magnetic field around trilayer graphene, they observed that superconductivity held strong up to a point before disappearing, but then curiously reappeared at higher field strengths — a comeback that is highly unusual and not known to occur in conventional spin-singlet superconductors.

    “In spin-singlet superconductors, if you kill superconductivity, it never comes back; it’s gone for good,” Cao says. “Here it reappeared again. So this definitely says this material is not spin-singlet.”

    They also observed that after “re-entry,” superconductivity persisted up to 10 Tesla, the maximum field strength that the lab’s magnet could produce. This is about three times higher than what the superconductor should withstand if it were a conventional spin-singlet, according to Pauli’s limit, a theory that predicts the maximum magnetic field at which a material can retain superconductivity.
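
    For reference, the Pauli limit invoked here is the standard Clogston–Chandrasekhar bound of weak-coupling BCS theory (a textbook estimate, not a number quoted from the paper):

    $$
    \mu_0 H_{\mathrm{P}} \;\approx\; \frac{\Delta_0}{\sqrt{2}\,\mu_{\mathrm{B}}} \;\approx\; 1.86\ \mathrm{T/K}\times T_c ,
    $$

    where $\Delta_0 \approx 1.76\,k_B T_c$ is the zero-temperature superconducting gap and $\mu_{\mathrm{B}}$ is the Bohr magneton. Superconductivity that survives to fields well above this bound is one of the signatures that spin-singlet pairing alone cannot explain the behavior.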

    Trilayer graphene’s reappearance of superconductivity, paired with its persistence at higher magnetic fields than predicted, rules out the possibility that the material is a run-of-the-mill superconductor. Instead, it is likely a very rare type, possibly a spin-triplet, hosting Cooper pairs that speed through the material, impervious to high magnetic fields. The team plans to drill down on the material to confirm its exact spin state, which could help to inform the design of more powerful MRI machines, and also more robust quantum computers.

    “Regular quantum computing is super fragile,” Jarillo-Herrero says. “You look at it and, poof, it disappears. About 20 years ago, theorists proposed a type of topological superconductivity that, if realized in any material, could [enable] a quantum computer where states responsible for computation are very robust. That would give infinite more power to do computing. The key ingredient to realize that would be spin-triplet superconductors, of a certain type. We have no idea if our type is of that type. But even if it’s not, this could make it easier to put trilayer graphene with other materials to engineer that kind of superconductivity. That could be a major breakthrough. But it’s still super early.”

    This research was supported by the U.S. Department of Energy, the National Science Foundation, the Gordon and Betty Moore Foundation, the Fundacion Ramon Areces, and the CIFAR Quantum Materials Program.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    MIT Seal

    USPS “Forever” postage stamps celebrating Innovation at MIT.

    MIT Campus

    Massachusetts Institute of Technology (US) is a private land-grant research university in Cambridge, Massachusetts. The institute has an urban campus that extends more than a mile (1.6 km) alongside the Charles River. The institute also encompasses a number of major off-campus facilities such as the MIT Lincoln Laboratory (US), the MIT Bates Research and Engineering Center (US), and the Haystack Observatory (US), as well as affiliated laboratories such as the Broad Institute of MIT and Harvard(US) and Whitehead Institute (US).

    Founded in 1861 in response to the increasing industrialization of the United States, Massachusetts Institute of Technology (US) adopted a European polytechnic university model and stressed laboratory instruction in applied science and engineering. It has since played a key role in the development of many aspects of modern science, engineering, mathematics, and technology, and is widely known for its innovation and academic strength. It is frequently regarded as one of the most prestigious universities in the world.

    As of December 2020, 97 Nobel laureates, 26 Turing Award winners, and 8 Fields Medalists have been affiliated with MIT as alumni, faculty members, or researchers. In addition, 58 National Medal of Science recipients, 29 National Medals of Technology and Innovation recipients, 50 MacArthur Fellows, 80 Marshall Scholars, 3 Mitchell Scholars, 22 Schwarzman Scholars, 41 astronauts, and 16 Chief Scientists of the U.S. Air Force have been affiliated with Massachusetts Institute of Technology (US) . The university also has a strong entrepreneurial culture and MIT alumni have founded or co-founded many notable companies. Massachusetts Institute of Technology (US) is a member of the Association of American Universities (AAU).

    Foundation and vision

    In 1859, a proposal was submitted to the Massachusetts General Court to use newly filled lands in Back Bay, Boston for a “Conservatory of Art and Science”, but the proposal failed. A charter for the incorporation of the Massachusetts Institute of Technology, proposed by William Barton Rogers, was signed by John Albion Andrew, the governor of Massachusetts, on April 10, 1861.

    Rogers, a professor from the University of Virginia (US), wanted to establish an institution to address rapid scientific and technological advances. He did not wish to found a professional school, but a combination with elements of both professional and liberal education, proposing that:

    “The true and only practicable object of a polytechnic school is, as I conceive, the teaching, not of the minute details and manipulations of the arts, which can be done only in the workshop, but the inculcation of those scientific principles which form the basis and explanation of them, and along with this, a full and methodical review of all their leading processes and operations in connection with physical laws.”

    The Rogers Plan reflected the German research university model, emphasizing an independent faculty engaged in research, as well as instruction oriented around seminars and laboratories.

    Early developments

    Two days after Massachusetts Institute of Technology (US) was chartered, the first battle of the Civil War broke out. After a long delay through the war years, MIT’s first classes were held in the Mercantile Building in Boston in 1865. The new institute was founded as part of the Morrill Land-Grant Colleges Act to fund institutions “to promote the liberal and practical education of the industrial classes” and was a land-grant school. In 1863 under the same act, the Commonwealth of Massachusetts founded the Massachusetts Agricultural College, which developed as the University of Massachusetts Amherst (US). In 1866, the proceeds from land sales went toward new buildings in the Back Bay.

    Massachusetts Institute of Technology (US) was informally called “Boston Tech”. The institute adopted the European polytechnic university model and emphasized laboratory instruction from an early date. Despite chronic financial problems, the institute saw growth in the last two decades of the 19th century under President Francis Amasa Walker. Programs in electrical, chemical, marine, and sanitary engineering were introduced, new buildings were built, and the size of the student body increased to more than one thousand.

    The curriculum drifted to a vocational emphasis, with less focus on theoretical science. The fledgling school still suffered from chronic financial shortages which diverted the attention of the MIT leadership. During these “Boston Tech” years, Massachusetts Institute of Technology (US) faculty and alumni rebuffed Harvard University (US) president (and former MIT faculty) Charles W. Eliot’s repeated attempts to merge MIT with Harvard College’s Lawrence Scientific School. There would be at least six attempts to absorb MIT into Harvard. In its cramped Back Bay location, MIT could not afford to expand its overcrowded facilities, driving a desperate search for a new campus and funding. Eventually, the MIT Corporation approved a formal agreement to merge with Harvard, over the vehement objections of MIT faculty, students, and alumni. However, a 1917 decision by the Massachusetts Supreme Judicial Court effectively put an end to the merger scheme.

    In 1916, the Massachusetts Institute of Technology (US) administration and the MIT charter crossed the Charles River on the ceremonial barge Bucentaur built for the occasion, to signify MIT’s move to a spacious new campus largely consisting of filled land on a one-mile-long (1.6 km) tract along the Cambridge side of the Charles River. The neoclassical “New Technology” campus was designed by William W. Bosworth and had been funded largely by anonymous donations from a mysterious “Mr. Smith”, starting in 1912. In January 1920, the donor was revealed to be the industrialist George Eastman of Rochester, New York, who had invented methods of film production and processing, and founded Eastman Kodak. Between 1912 and 1920, Eastman donated $20 million ($236.6 million in 2015 dollars) in cash and Kodak stock to MIT.

    Curricular reforms

    In the 1930s, President Karl Taylor Compton and Vice-President (effectively Provost) Vannevar Bush emphasized the importance of pure sciences like physics and chemistry and reduced the vocational practice required in shops and drafting studios. The Compton reforms “renewed confidence in the ability of the Institute to develop leadership in science as well as in engineering”. Unlike Ivy League schools, Massachusetts Institute of Technology (US) catered more to middle-class families, and depended more on tuition than on endowments or grants for its funding. The school was elected to the Association of American Universities (US) in 1934.

    Still, as late as 1949, the Lewis Committee lamented in its report on the state of education at Massachusetts Institute of Technology (US) that “the Institute is widely conceived as basically a vocational school”, a “partly unjustified” perception the committee sought to change. The report comprehensively reviewed the undergraduate curriculum, recommended offering a broader education, and warned against letting engineering and government-sponsored research detract from the sciences and humanities. The School of Humanities, Arts, and Social Sciences and the MIT Sloan School of Management were formed in 1950 to compete with the powerful Schools of Science and Engineering. Previously marginalized faculties in the areas of economics, management, political science, and linguistics emerged into cohesive and assertive departments by attracting respected professors and launching competitive graduate programs. The School of Humanities, Arts, and Social Sciences continued to develop under the successive terms of the more humanistically oriented presidents Howard W. Johnson and Jerome Wiesner between 1966 and 1980.

    Massachusetts Institute of Technology (US)‘s involvement in military science surged during World War II. In 1941, Vannevar Bush was appointed head of the federal Office of Scientific Research and Development and directed funding to only a select group of universities, including MIT. Engineers and scientists from across the country gathered at Massachusetts Institute of Technology (US)’s Radiation Laboratory, established in 1940 to assist the British military in developing microwave radar. The work done there significantly affected both the war and subsequent research in the area. Other defense projects included gyroscope-based and other complex control systems for gunsight, bombsight, and inertial navigation under Charles Stark Draper’s Instrumentation Laboratory; the development of a digital computer for flight simulations under Project Whirlwind; and high-speed and high-altitude photography under Harold Edgerton. By the end of the war, Massachusetts Institute of Technology (US) became the nation’s largest wartime R&D contractor (attracting some criticism of Bush), employing nearly 4000 in the Radiation Laboratory alone and receiving in excess of $100 million ($1.2 billion in 2015 dollars) before 1946. Work on defense projects continued even after then. Post-war government-sponsored research at MIT included SAGE and guidance systems for ballistic missiles and Project Apollo.

    These activities affected Massachusetts Institute of Technology (US) profoundly. A 1949 report noted the lack of “any great slackening in the pace of life at the Institute” to match the return to peacetime, remembering the “academic tranquility of the prewar years”, though acknowledging the significant contributions of military research to the increased emphasis on graduate education and rapid growth of personnel and facilities. The faculty doubled and the graduate student body quintupled during the terms of Karl Taylor Compton, president of Massachusetts Institute of Technology (US) between 1930 and 1948; James Rhyne Killian, president from 1948 to 1957; and Julius Adams Stratton, chancellor from 1952 to 1957, whose institution-building strategies shaped the expanding university. By the 1950s, Massachusetts Institute of Technology (US) no longer simply benefited the industries with which it had worked for three decades, and it had developed closer working relationships with new patrons, philanthropic foundations and the federal government.

    In the late 1960s and early 1970s, student and faculty activists protested against the Vietnam War and Massachusetts Institute of Technology (US)’s defense research. In this period Massachusetts Institute of Technology (US)’s various departments were researching helicopters, smart bombs and counterinsurgency techniques for the war in Vietnam as well as guidance systems for nuclear missiles. The Union of Concerned Scientists was founded on March 4, 1969 during a meeting of faculty members and students seeking to shift the emphasis on military research toward environmental and social problems. Massachusetts Institute of Technology (US) ultimately divested itself from the Instrumentation Laboratory and moved all classified research off-campus to the MIT (US) Lincoln Laboratory facility in 1973 in response to the protests. The student body, faculty, and administration remained comparatively unpolarized during what was a tumultuous time for many other universities. Johnson was seen to be highly successful in leading his institution to “greater strength and unity” after these times of turmoil. However, six Massachusetts Institute of Technology (US) students were sentenced to prison terms at this time and some former student leaders, such as Michael Albert and George Katsiaficas, are still indignant about MIT’s role in military research and its suppression of these protests. (Richard Leacock’s film, November Actions, records some of these tumultuous events.)

    In the 1980s, there was more controversy at Massachusetts Institute of Technology (US) over its involvement in SDI (space weaponry) and CBW (chemical and biological warfare) research. More recently, Massachusetts Institute of Technology (US)’s research for the military has included work on robots, drones and ‘battle suits’.

    Recent history

    Massachusetts Institute of Technology (US) has kept pace with and helped to advance the digital age. In addition to developing the predecessors to modern computing and networking technologies, students, staff, and faculty members at Project MAC, the Artificial Intelligence Laboratory, and the Tech Model Railroad Club wrote some of the earliest interactive computer video games like Spacewar! and created much of modern hacker slang and culture. Several major computer-related organizations have originated at MIT since the 1980s: Richard Stallman’s GNU Project and the subsequent Free Software Foundation were founded in the mid-1980s at the AI Lab; the MIT Media Lab was founded in 1985 by Nicholas Negroponte and Jerome Wiesner to promote research into novel uses of computer technology; the World Wide Web Consortium standards organization was founded at the Laboratory for Computer Science in 1994 by Tim Berners-Lee; the MIT OpenCourseWare project has made course materials for over 2,000 Massachusetts Institute of Technology (US) classes available online free of charge since 2002; and the One Laptop per Child initiative to expand computer education and connectivity to children worldwide was launched in 2005.

    Massachusetts Institute of Technology (US) was named a sea-grant college in 1976 to support its programs in oceanography and marine sciences and was named a space-grant college in 1989 to support its aeronautics and astronautics programs. Despite diminishing government financial support over the past quarter century, MIT launched several successful development campaigns to significantly expand the campus: new dormitories and athletics buildings on west campus; the Tang Center for Management Education; several buildings in the northeast corner of campus supporting research into biology, brain and cognitive sciences, genomics, biotechnology, and cancer research; and a number of new “backlot” buildings on Vassar Street including the Stata Center. Construction on campus in the 2000s included expansions of the Media Lab, the Sloan School’s eastern campus, and graduate residences in the northwest. In 2006, President Hockfield launched the MIT Energy Research Council to investigate the interdisciplinary challenges posed by increasing global energy consumption.

    In 2001, inspired by the open source and open access movements, Massachusetts Institute of Technology (US) launched OpenCourseWare to make the lecture notes, problem sets, syllabi, exams, and lectures from the great majority of its courses available online for no charge, though without any formal accreditation for coursework completed. While the cost of supporting and hosting the project is high, OCW expanded in 2005 to include other universities as a part of the OpenCourseWare Consortium, which currently includes more than 250 academic institutions with content available in at least six languages. In 2011, Massachusetts Institute of Technology (US) announced it would offer formal certification (but not credits or degrees) to online participants completing coursework in its “MITx” program, for a modest fee. The “edX” online platform supporting MITx was initially developed in partnership with Harvard and its analogous “Harvardx” initiative. The courseware platform is open source, and other universities have already joined and added their own course content. In March 2009 the Massachusetts Institute of Technology (US) faculty adopted an open-access policy to make its scholarship publicly accessible online.

    Massachusetts Institute of Technology (US) has its own police force. Three days after the Boston Marathon bombing of April 2013, MIT Police patrol officer Sean Collier was fatally shot by the suspects Dzhokhar and Tamerlan Tsarnaev, setting off a violent manhunt that shut down the campus and much of the Boston metropolitan area for a day. One week later, Collier’s memorial service was attended by more than 10,000 people, in a ceremony hosted by the Massachusetts Institute of Technology (US) community with thousands of police officers from the New England region and Canada. On November 25, 2013, Massachusetts Institute of Technology (US) announced the creation of the Collier Medal, to be awarded annually to “an individual or group that embodies the character and qualities that Officer Collier exhibited as a member of the Massachusetts Institute of Technology (US) community and in all aspects of his life”. The announcement further stated that “Future recipients of the award will include those whose contributions exceed the boundaries of their profession, those who have contributed to building bridges across the community, and those who consistently and selflessly perform acts of kindness”.

    In September 2017, the school announced the creation of an artificial intelligence research lab called the MIT-IBM Watson AI Lab. IBM will spend $240 million over the next decade, and the lab will be staffed by MIT and IBM scientists. In October 2018 MIT announced that it would open a new Schwarzman College of Computing dedicated to the study of artificial intelligence, named after lead donor and The Blackstone Group CEO Stephen Schwarzman. The focus of the new college is to study not just AI, but interdisciplinary AI education, and how AI can be used in fields as diverse as history and biology. The cost of buildings and new faculty for the new college is expected to be $1 billion upon completion.

    The Caltech/MIT Advanced aLIGO (US) was designed and constructed by a team of scientists from California Institute of Technology (US), Massachusetts Institute of Technology (US), and industrial contractors, and funded by the National Science Foundation (US) .

    MIT/Caltech Advanced aLIGO.

    It was designed to open the field of gravitational-wave astronomy through the detection of gravitational waves predicted by general relativity. Gravitational waves were detected for the first time by the LIGO detector in 2015. For contributions to the LIGO detector and the observation of gravitational waves, two Caltech physicists, Kip Thorne and Barry Barish, and Massachusetts Institute of Technology (US) physicist Rainer Weiss won the Nobel Prize in physics in 2017. Weiss, who is also a Massachusetts Institute of Technology (US) graduate, designed the laser interferometric technique, which served as the essential blueprint for LIGO.

    The mission of Massachusetts Institute of Technology (US) is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the Massachusetts Institute of Technology (US) community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

     
  • richardmitnick 11:27 am on July 4, 2021 Permalink | Reply
    Tags: "Black Holes; Quantum Entanglement; and the No-Go Theorem", Machine learning-AI, Quantum Computing, ,   

    From Scientific American (US) : “Black Holes; Quantum Entanglement; and the No-Go Theorem” 

    From Scientific American (US)

    July 4, 2021
    Zoë Holmes
    Andrew Sornborger

    Credit: Getty Images.

    Suppose someone—let’s call her Alice—has a book of secrets she wants to destroy so she tosses it into a handy black hole. Given that black holes are nature’s fastest scramblers, acting like giant garbage shredders, Alice’s secrets must be pretty safe, right?

    Now suppose her nemesis, Bob, has a quantum computer that’s entangled with the black hole. (In entangled quantum systems, actions performed on one particle similarly affect their entangled partners, regardless of distance or even if some disappear into a black hole.)

    A famous thought experiment by Patrick Hayden and John Preskill says Bob can observe a few particles of light that leak from the edges of a black hole. Then Bob can run those photons as qubits (the basic processing unit of quantum computing) through the gates of his quantum computer to reveal the particular physics that jumbled Alice’s text. From that, he can reconstruct the book.

    But not so fast.

    Our recent work on quantum machine learning suggests Alice’s book might be gone forever, after all.

    QUANTUM COMPUTERS TO STUDY QUANTUM MECHANICS

    Alice might never have the chance to hide her secrets in a black hole. Still, our new no-go theorem about information scrambling has real-world application to understanding random and chaotic systems in the rapidly expanding fields of quantum machine learning, quantum thermodynamics, and quantum information science.

    Richard Feynman, one of the great physicists of the 20th century, launched the field of quantum computing in a 1981 speech, when he proposed developing quantum computers as the natural platform to simulate quantum systems. They are notoriously difficult to study otherwise.

    Our team at DOE’s Los Alamos National Laboratory (US), along with other collaborators, has focused on studying algorithms for quantum computers and, in particular, machine-learning algorithms—what some like to call artificial intelligence. The research sheds light on what sorts of algorithms will do real work on existing noisy, intermediate-scale quantum computers and on unresolved questions in quantum mechanics at large.

    In particular, we have been studying the training of variational quantum algorithms. They set up a problem-solving landscape where the peaks represent the high-energy (undesirable) points of the system, or problem, and the valleys are the low-energy (desirable) values. To find the solution, the algorithm works its way through a mathematical landscape, examining its features one at a time. The answer lies in the deepest valley.
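
    As a rough illustration of that search, here is a toy sketch in Python (not the Los Alamos team's code; the two-parameter cost function is invented and merely stands in for the energy a quantum processor would estimate):

```python
# Toy variational loop: a classical optimizer tunes parameters "theta" to
# minimize a cost, walking downhill through the problem-solving landscape.
import numpy as np

rng = np.random.default_rng(0)

def cost(theta):
    # Stand-in for the measured energy <psi(theta)|H|psi(theta)>;
    # here just an invented two-parameter landscape.
    return np.sin(theta[0]) ** 2 + 0.5 * np.cos(theta[0] + theta[1]) + 0.5

def gradient(theta, eps=1e-4):
    # Finite-difference gradient; on real hardware the parameter-shift
    # rule typically plays this role.
    g = np.zeros_like(theta)
    for i in range(len(theta)):
        step = np.zeros_like(theta)
        step[i] = eps
        g[i] = (cost(theta + step) - cost(theta - step)) / (2 * eps)
    return g

theta = rng.uniform(0.0, 2.0 * np.pi, size=2)  # random starting point
for _ in range(200):                           # plain gradient descent
    theta -= 0.1 * gradient(theta)

print("final parameters:", theta, "final cost:", cost(theta))
```

    On hardware the cost would come from repeated circuit measurements, but the classical outer loop that hunts for the deepest valley looks much like this.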

    ENTANGLEMENT LEADS TO SCRAMBLING

    We wondered if we could apply quantum machine learning to understand scrambling. This quantum phenomenon happens when entanglement grows in a system made of many particles or atoms. Think of the initial conditions of this system as a kind of information—Alice’s book, for instance. As the entanglement among particles within the quantum system grows, the information spreads widely; this scrambling of information is key to understanding quantum chaos, quantum information science, random circuits and a range of other topics.

    A black hole is the ultimate scrambler. By exploring it with a variational quantum algorithm on a theoretical quantum computer entangled with the black hole, we could probe the scalability and applicability of quantum machine learning. We could also learn something new about quantum systems generally. Our idea was to use a variational quantum algorithm that would exploit the leaked photons to learn about the dynamics of the black hole. The approach would be an optimization procedure—again, searching through the mathematical landscape to find the lowest point.

    If we found it, we would reveal the dynamics inside the black hole. Bob could use that information to crack the scrambler’s code and reconstruct Alice’s book.

    Now here’s the rub. The Hayden-Preskill thought experiment assumes Bob can determine the black hole dynamics that are scrambling the information. Instead, we found that the very nature of scrambling prevents Bob from learning those dynamics.

    STALLED OUT ON A BARREN PLATEAU

    Here’s why: the algorithm stalled out on a barren plateau [Nature Communications], which, in machine learning, is as grim as it sounds. During machine-learning training, a barren plateau represents a problem-solving space that is entirely flat as far as the algorithm can see. In this featureless landscape, the algorithm can’t find the downward slope; there’s no clear path to the energy minimum. The algorithm just spins its wheels, unable to learn anything new. It fails to find the solution.
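
    A toy numerical illustration of the underlying concentration effect (a sketch under simplifying assumptions, not the paper's calculation): for states that look essentially random, the expectation value of a local observable bunches ever more tightly around zero as the number of qubits grows, so the landscape the optimizer sees flattens out exponentially.

```python
# Toy demonstration: the "cost" (a single-qubit Z expectation value) over
# random states has a variance that shrinks exponentially with qubit count,
# which is the flatness behind a barren plateau.
import numpy as np

rng = np.random.default_rng(1)

def random_state(dim):
    # Haar-like random state: normalized complex Gaussian vector
    v = rng.normal(size=dim) + 1j * rng.normal(size=dim)
    return v / np.linalg.norm(v)

for n_qubits in range(2, 11, 2):
    dim = 2 ** n_qubits
    # Observable: Pauli Z on the first qubit (+1 on the first half of the
    # basis states, -1 on the second half).
    z_first = np.kron(np.array([1.0, -1.0]), np.ones(dim // 2))
    costs = [np.sum(z_first * np.abs(random_state(dim)) ** 2) for _ in range(500)]
    print(f"{n_qubits:2d} qubits: cost variance ~ {np.var(costs):.1e}")
```

    The printed variance falls roughly exponentially with the number of qubits; with finite measurement shots, gradients estimated on such a landscape are indistinguishable from zero, which is exactly where the algorithm stalls.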

    Our resulting no-go theorem says that any quantum machine-learning strategy will encounter the dreaded barren plateau when applied to an unknown scrambling process.

    The good news is, most physical processes are not as complex as black holes, and we often will have prior knowledge of their dynamics, so the no-go theorem doesn’t condemn quantum machine learning. We just need to carefully pick the problems we apply it to. And we’re not likely to need quantum machine learning to peer inside a black hole to learn about Alice’s book—or anything else—anytime soon.

    So, Alice can rest assured that her secrets are safe, after all.

    See the full article here.


    Please help promote STEM in your local schools.


    Stem Education Coalition

    Scientific American (US), the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
  • richardmitnick 9:50 am on June 18, 2021 Permalink | Reply
    Tags: "Brookhaven Lab Intern Returns to Continue Theoretical Physics Pursuit", Co-design Center for Quantum Advantage (C2QA), DOE Science Undergraduate Laboratory Internships, National Quantum Information Science Research Centers, , Quantum Computing, , , , Wenjie Gong recently received a Barry Goldwater Scholarship., Women in STEM-Wenjie Gong   

    From DOE’s Brookhaven National Laboratory (US) : Women in STEM-Wenjie Gong “Brookhaven Lab Intern Returns to Continue Theoretical Physics Pursuit” 

    From DOE’s Brookhaven National Laboratory (US)

    June 14, 2021
    Kelly Zegers
    kzegewrs@bnl.gov

    Wenjie Gong virtually visits Brookhaven for an internship to perform theory research on quantum information science in nuclear physics.

    Wenjie Gong, who recently received a Barry Goldwater Scholarship. (Courtesy photo.)

    Internships often help students nail down the direction they’d like to take their scientific pursuits. For Wenjie Gong, who just completed her junior year at Harvard University (US), a first look into theoretical physics last summer as an intern with the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory made her want to dive further into the field.

    Gong returns to Brookhaven Lab this summer for her second experience as a virtual DOE Science Undergraduate Laboratory Internships (SULI) participant to continue collaborating with Raju Venugopalan, a senior physicist and Nuclear Theory Group leader. Together, they will explore the connections between nuclear physics theory—which explores the interactions of fundamental particles—and quantum computing.

    “I find theoretical physics fascinating as there are so many different avenues to explore and so many different angles from which to approach a problem,” Gong said. “Even though it can be difficult to parse through the technical underpinnings of different physical situations, any progress made is all the more exciting and rewarding.”

    Last year, Gong collaborated with Venugopalan on a project exploring possible ways to measure a quantum phenomenon known as “entanglement” in the matter produced at high-energy collisions.

    The physical properties of entangled particles are inextricably linked, even when the particles are separated by a great distance. Albert Einstein referred to entanglement as “spooky action at a distance.”

    Studying this phenomenon is an important part of setting up long-distance quantum computing networks, the topic of many of the experiments at the Co-design Center for Quantum Advantage (C2QA). The center, led by Brookhaven Lab, is one of five National Quantum Information Science Research Centers and applies quantum principles to materials, devices and software co-design efforts to lay the foundation for a new generation of quantum computers.

    “Usually, entanglement requires very precise measurements that are found in optics laboratories, but we wanted to look at how we could understand entanglement in high-energy particle collisions, which have much less of a controlled environment,” Gong said.

    Venugopalan said the motivation behind thinking of ways to detect entanglement in high-energy collisions is twofold, first asking the question: “Can we think of experimental measures in collider experiments that have comparable ability to extract quantum action-at-a-distance just as the carefully designed tabletop experiments?”

    “That would be interesting in itself because one might be inclined to think it unlikely,” he said.

    Venugopalan said scientists have identified sub-atomic particle correlations of so-called Lambda hyperons, which have particular properties that may allow such an experiment. Those experiments would open up the question of whether entanglement persists if scientists change the conditions of the collisions, he said.
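
    For orientation, the benchmark that tabletop experiments use for “action at a distance” is a Bell-type test; one standard form (given here only as a reference point, not as the specific observable the team will measure) is the CHSH combination of spin correlations along two pairs of measurement directions:

    $$
    S = E(a,b) - E(a,b') + E(a',b) + E(a',b'),
    $$

    where any local classical description obeys $|S| \le 2$, while entangled pairs can reach $2\sqrt{2}$. Lambda hyperons are attractive for this because their weak decays are self-analyzing: the direction of the emitted proton tracks the hyperon’s spin, so spin correlations can be read off from the angular distributions of the decay products.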

    “If we made the collisions more violent, say, by increasing the number of particles produced, would the quantum action-at-a-distance correlation go away, just as you, and I, as macroscopic quantum states, don’t exhibit any spooky action-at-a-distance nonsense,” Venugopalan said. “When does such a quantum-to-classical transition take place?”

    In addition, can such measurements teach us about the nature of the interactions of the building blocks of matter–quarks and gluons?

    There may be more questions than answers at this stage, “but these questions force us to refine our experimental and computational tools,” Venugopalan said.

    Gong will continue collaborating with Venugopalan to develop the project on entanglement this summer. She may also start a new project exploring quirky features of soft particles in the quantum theory of electromagnetism that also apply to the strong force of nuclear physics, Venugopalan said. While her internship is virtual again this year, she said she learned last summer that collaborating remotely can be productive and rewarding.

    “Wenjie is the real deal,” Venugopalan said. “Even as a rising junior, she was functioning at the level of a postdoc. It’s a great joy to exchange ‘crazy’ ideas with her and work out the consequences. She shows great promise for an outstanding career in theoretical physics.”

    Others have noticed Gong’s scientific talent. She was recently honored with a Barry M. Goldwater Scholarship. The prestigious award supports impressive undergraduates who plan to pursue a PhD in the natural sciences, mathematics, and engineering.

    “I feel really honored and also very grateful to Raju, the Department of Energy (US) , and Brookhaven for providing me the opportunity to do this research—which I wrote about in my Goldwater essay,” Gong said.

    Gong said she’s looking forward to applying concepts from courses she took at Harvard over the past year, including quantum field theory, which she found challenging but also rewarding.

    Gong’s interest in physics started when she took Advanced Placement (AP) Physics in high school. The topic drew her in because it requires a way of thinking that’s different compared to other sciences because it explores the laws governing the motion of matter and existence, she said.

    In addition to further exploring high energy theoretical physics research, Gong said she hopes to one day teach as a university professor. She’s currently a peer tutor at Harvard.

    “I love teaching physics,” she said. “It’s really cool to see the ‘Ah-ha!’ moment when students go from not really understanding something to grasping a concept.”

    The SULI program at Brookhaven is managed by the Lab’s Office of Educational Programs and sponsored by DOE’s Office of Workforce Development for Teachers and Scientists (WDTS) within the Department’s Office of Science.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    One of ten national laboratories overseen and primarily funded by the DOE(US) Office of Science, DOE’s Brookhaven National Laboratory (US) conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University(US), the largest academic user of Laboratory facilities, and Battelle(US), a nonprofit, applied science and technology organization.

    Research at BNL specializes in nuclear and high energy physics, energy science and technology, environmental and bioscience, nanoscience and national security. The 5,300 acre campus contains several large research facilities, including the Relativistic Heavy Ion Collider [below] and National Synchrotron Light Source II [below]. Seven Nobel prizes have been awarded for work conducted at Brookhaven lab.

    BNL is staffed by approximately 2,750 scientists, engineers, technicians, and support personnel, and hosts 4,000 guest investigators every year. The laboratory has its own police station, fire department, and ZIP code (11973). In total, the lab spans a 5,265-acre (21 km^2) area that is mostly coterminous with the hamlet of Upton, New York. BNL is served by a rail spur operated as-needed by the New York and Atlantic Railway. Co-located with the laboratory is the Upton, New York, forecast office of the National Weather Service.

    Major programs

    Although originally conceived as a nuclear research facility, Brookhaven Lab’s mission has greatly expanded. Its foci are now:

    Nuclear and high-energy physics
    Physics and chemistry of materials
    Environmental and climate research
    Nanomaterials
    Energy research
    Nonproliferation
    Structural biology
    Accelerator physics

    Operation

    Brookhaven National Lab was originally owned by the Atomic Energy Commission(US) and is now owned by that agency’s successor, the United States Department of Energy (DOE). DOE subcontracts the research and operation to universities and research organizations. It is currently operated by Brookhaven Science Associates LLC, which is an equal partnership of Stony Brook University(US) and Battelle Memorial Institute(US). From 1947 to 1998, it was operated by Associated Universities, Inc. (AUI) (US), but AUI lost its contract in the wake of two incidents: a 1994 fire at the facility’s High Flux Beam Reactor that exposed several workers to radiation and reports in 1997 of a tritium leak into the groundwater of the Long Island Central Pine Barrens on which the facility sits.

    Foundations

    Following World War II, the US Atomic Energy Commission was created to support government-sponsored peacetime research on atomic energy. The effort to build a nuclear reactor in the American northeast was fostered largely by physicists Isidor Isaac Rabi and Norman Foster Ramsey Jr., who during the war witnessed many of their colleagues at Columbia University leave for new remote research sites following the departure of the Manhattan Project from its campus. Their effort to house this reactor near New York City was rivalled by a similar effort at the Massachusetts Institute of Technology (US) to have a facility near Boston, Massachusetts (US). Involvement was quickly solicited from representatives of northeastern universities to the south and west of New York City such that this city would be at their geographic center. In March 1946 a nonprofit corporation was established that consisted of representatives from nine major research universities — Columbia University(US), Cornell University(US), Harvard University(US), Johns Hopkins University(US), Massachusetts Institute of Technology(US), Princeton University(US), University of Pennsylvania(US), University of Rochester(US), and Yale University(US).

    Out of 17 considered sites in the Boston-Washington corridor, Camp Upton on Long Island was eventually chosen as the most suitable in consideration of space, transportation, and availability. The camp had been a training center for the US Army during both World War I and World War II. After the latter war, Camp Upton was deemed no longer necessary and became available for reuse. A plan was conceived to convert the military camp into a research facility.

    On March 21, 1947, the Camp Upton site was officially transferred from the U.S. War Department to the new U.S. Atomic Energy Commission (AEC), predecessor to the U.S. Department of Energy (DOE).

    Research and facilities

    Reactor history

    In 1947 construction began on the first nuclear reactor at Brookhaven, the Brookhaven Graphite Research Reactor. This reactor, which opened in 1950, was the first reactor to be constructed in the United States after World War II. The High Flux Beam Reactor operated from 1965 to 1999. In 1959 Brookhaven built the first US reactor specifically tailored to medical research, the Brookhaven Medical Research Reactor, which operated until 2000.

    Accelerator history

    In 1952 Brookhaven began using its first particle accelerator, the Cosmotron. At the time the Cosmotron was the world’s highest energy accelerator, being the first to impart more than 1 GeV of energy to a particle.


    The Cosmotron was retired in 1966, after it was superseded in 1960 by the new Alternating Gradient Synchrotron (AGS).

    The AGS was used in research that resulted in 3 Nobel prizes, including the discovery of the muon neutrino, the charm quark, and CP violation.

    In 1970 BNL started the ISABELLE project to develop and build two proton intersecting storage rings.

    The groundbreaking for the project was in October 1978. In 1981, with the tunnel for the accelerator already excavated, problems with the superconducting magnets needed for the ISABELLE accelerator brought the project to a halt, and the project was eventually cancelled in 1983.

    The National Synchrotron Light Source (US) operated from 1982 to 2014 and was involved with two Nobel Prize-winning discoveries. It has since been replaced by the National Synchrotron Light Source II (US) [below].

    After ISABELLE’s cancellation, physicists at BNL proposed that the excavated tunnel and parts of the magnet assembly be used in another accelerator. In 1984 the first proposal for the accelerator now known as the Relativistic Heavy Ion Collider (RHIC) [below] was put forward. Construction was funded in 1991, and RHIC has been operational since 2000. One of the world’s only two operating heavy-ion colliders, RHIC is as of 2010 the second-highest-energy collider after the Large Hadron Collider(CH). RHIC is housed in a tunnel 2.4 miles (3.9 km) long and is visible from space.

    On January 9, 2020, it was announced by Paul Dabbar, undersecretary of the US Department of Energy Office of Science, that the BNL eRHIC design had been selected over the conceptual design put forward by DOE’s Thomas Jefferson National Accelerator Facility [JLab] (US) as the future Electron-Ion Collider (EIC) in the United States.

    In addition to the site selection, it was announced that the BNL EIC had acquired CD-0 (mission need) status from the Department of Energy. BNL’s eRHIC design proposes upgrading the existing Relativistic Heavy Ion Collider, which collides beams of light to heavy ions (including polarized protons), with a polarized electron facility to be housed in the same tunnel.

    Other discoveries

    In 1958, Brookhaven scientists created one of the world’s first video games, Tennis for Two. In 1968 Brookhaven scientists patented Maglev, a transportation technology that utilizes magnetic levitation.

    Major facilities

    Relativistic Heavy Ion Collider (RHIC), which was designed to research quark–gluon plasma and the sources of proton spin. Until 2009 it was the world’s most powerful heavy ion collider. It is the only collider of spin-polarized protons.
    Center for Functional Nanomaterials (CFN), used for the study of nanoscale materials.
    BNL National Synchrotron Light Source II(US), Brookhaven’s newest user facility, opened in 2015 to replace the National Synchrotron Light Source (NSLS), which had operated for 30 years. NSLS was involved in the work that won the 2003 and 2009 Nobel Prizes in Chemistry.
    Alternating Gradient Synchrotron, a particle accelerator that was used in three of the lab’s Nobel prizes.
    Accelerator Test Facility, generates, accelerates and monitors particle beams.
    Tandem Van de Graaff, once the world’s largest electrostatic accelerator.
    Computational Science resources, including access to a massively parallel Blue Gene series supercomputer that is among the fastest in the world for scientific research, run jointly by Brookhaven National Laboratory and Stony Brook University.
    Interdisciplinary Science Building, with unique laboratories for studying high-temperature superconductors and other materials important for addressing energy challenges.
    NASA Space Radiation Laboratory, where scientists use beams of ions to simulate cosmic rays and assess the risks of space radiation to human space travelers and equipment.

    Off-site contributions

    It is a contributing partner to the ATLAS experiment, one of the four detectors located at the Large Hadron Collider (LHC).


    The LHC is currently operating at CERN near Geneva, Switzerland.

    Brookhaven was also responsible for the design of the SNS accumulator ring in partnership with Spallation Neutron Source at DOE’s Oak Ridge National Laboratory (US), Tennessee.

    Brookhaven plays a role in a range of neutrino research projects around the world, including the Daya Bay Neutrino Experiment (CN), located at a nuclear power plant approximately 52 kilometers northeast of Hong Kong and 45 kilometers east of Shenzhen, China.


     
  • richardmitnick 5:42 pm on June 17, 2021 Permalink | Reply
    Tags: "‘Talking’ quantum dots could be used as qubits", Atomic-scale computer simulations show how quantum dots “talk” to each other., In their study the team successfully modelled the behaviour of the quantum dots at the femtosecond scale., More complex behaviours can occur when two or more quantum dots are close enough together to interact with each other., , Quantum Computing, Quantum dots are tiny pieces of semiconductor crystal contain thousands of atoms., With further improvements to the model the use of quantum dots could be expanded to include a diverse array of real-world applications.   

    From physicsworld.com (UK) : “‘Talking’ quantum dots could be used as qubits” 

    From physicsworld.com (UK)

    16 Jun 2021
    Sam Jarman

    It’s good to talk: illustration showing two quantum dots “communicating” with each other by exchanging light. (Courtesy: Helmholtz-Zentrum Berlin (HZB) (DE))

    New atomic-scale computer simulations of how quantum dots “talk” to each other could lead to a wide range of practical applications ranging from quantum computing to green energy.

    The research was done by Pascal Krause and Annika Bande at the Helmholtz Centre for Materials and Energy in Germany and Jean Christophe Tremblay at the National Centre for Scientific Research [Centre national de la recherche scientifique, CNRS] (FR) and the University of Lorraine [Université de Lorraine] (FR), who modelled the absorption, exchange, and storage of energy within pairs of quantum dots. With further improvements to the model, the use of quantum dots could be expanded to include a diverse array of real-world applications.

    Quantum dots are tiny pieces of semiconductor crystal containing thousands of atoms. The dots are quantum systems that behave much like atoms, having electron energy levels that can absorb and emit light at discrete wavelengths. For example, when illuminated with ultraviolet light, a quantum dot can be excited to a higher energy state. When it drops back down to its ground state, it can emit a visible photon, allowing quantum dots to glow with vivid colours.

    More complex behaviours can occur when two or more quantum dots are close enough together to interact with each other. For example, interactions can stabilize excitons, which are quasiparticles that comprise an electron and a hole and are created when electrons are excited. Long-lasting excitons can have applications ranging from photocatalysis to quantum computing.

    Sheer complexity

    So far, computer simulations of quantum dot interactions have been limited by their sheer complexity. Since the processes involve thousands of atoms, each hosting multiple electrons, the characteristics of exciton formation and recombination cannot be fully captured by even the most advanced supercomputers. Now, Krause, Bande and Tremblay have approximated the process through simulations of scaled-down quantum dots, each containing just hundreds of atoms.

    In their study the trio successfully modelled the behaviour of the quantum dots at the femtosecond scale. Their simulations revealed how the quantum dot pairs absorb, exchange, and store light energy. They also found how excitons can be stabilized by applying a sequence of ultraviolet and infrared pulses to quantum dots. While an initial ultraviolet pulse can generate an exciton in one quantum dot, a subsequent infrared pulse can shift the exciton to a nearby quantum dot – where the energy it contains can be stored.
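
    A drastically simplified cartoon of that transfer step, written as a two-site model in Python (with invented parameter values; the actual study simulates hundreds of atoms per dot, not two levels), is sketched below: an exciton prepared on one dot leaks to its neighbour through a small electronic coupling as time advances in femtosecond steps.

```python
# Toy two-site model of exciton transfer between two coupled quantum dots.
# Site energies eA, eB and coupling J are assumed, illustrative values only.
import numpy as np

hbar = 0.6582                         # reduced Planck constant in eV*fs
eA, eB, J = 1.50, 1.45, 0.02          # energies and coupling in eV (assumed)
H = np.array([[eA, J],
              [J, eB]])               # two-site Hamiltonian

# Exact propagator for one femtosecond time step of this fixed Hamiltonian
dt = 1.0
evals, evecs = np.linalg.eigh(H)
U = evecs @ np.diag(np.exp(-1j * evals * dt / hbar)) @ evecs.conj().T

psi = np.array([1.0, 0.0], dtype=complex)   # exciton starts on dot A
for _ in range(400):                        # evolve for 400 fs
    psi = U @ psi

print("population on dot B after 400 fs:", abs(psi[1]) ** 2)
```

    In the real simulations each “site” is a many-electron dot and the ultraviolet and infrared pulses enter explicitly, but the bookkeeping, tracking where the excitation sits as a function of time, is essentially the same.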

    The team simulated interactions between three pairs of germanium/silicon quantum dots, which had different shapes and sizes. They now plan to create more realistic simulations that will allow them to model how environmental factors such as temperature can affect interactions. Through further improvements, their results could lead to a wide range of applications for quantum dots including quantum bits (qubits) that can reliably store and read out quantum information and photocatalysts that absorb sunlight, facilitating reactions that produce hydrogen gas as a carbon-free fuel source.

    The research is described in the Journal of Physical Chemistry A.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    PhysicsWorld(UK) is a publication of the Institute of Physics(UK). The Institute of Physics is a leading scientific society. We are a charitable organisation with a worldwide membership of more than 50,000, working together to advance physics education, research and application.

    We engage with policymakers and the general public to develop awareness and understanding of the value of physics and, through IOP Publishing, we are world leaders in professional scientific communications.
    IOP Institute of Physics

     
  • richardmitnick 9:29 am on June 12, 2021 Permalink | Reply
    Tags: "A Tectonic Shift in Analytics and Computing Is Coming", "Destination Earth", "Speech Understanding Research", "tensor processing units", , , , Computing clusters, , GANs: generative adversarial networks, , , , Quantum Computing, Seafloor bathymetry, SML: supervised machine learning, UML: Unsupervised Machine Learning   

    From Eos: “A Tectonic Shift in Analytics and Computing Is Coming” 

    From AGU
    Eos news bloc

    From Eos

    4 June 2021
    Gabriele Morra
    morra@louisiana.edu
    Ebru Bozdag
    Matt Knepley
    Ludovic Räss
    Velimir Vesselinov

    Artificial intelligence combined with high-performance computing could trigger a fundamental change in how geoscientists extract knowledge from large volumes of data.

    1
    A Cartesian representation of a global adjoint tomography model, which uses high-performance computing capabilities to simulate seismic wave propagation, is shown here. Blue and red colorations represent regions of high and low seismic velocities, respectively. Credit: David Pugmire, DOE’s Oak Ridge National Laboratory (US).

    More than 50 years ago, a fundamental scientific revolution occurred, sparked by the concurrent emergence of a huge amount of new data on seafloor bathymetry and profound intellectual insights from researchers rethinking conventional wisdom. Data and insight combined to produce the paradigm of plate tectonics. Similarly, in the coming decade, a new revolution in data analytics may rapidly overhaul how we derive knowledge from data in the geosciences. Two interrelated elements will be central in this process: artificial intelligence (AI, including machine learning methods as a subset) and high-performance computing (HPC).

    Already today, geoscientists must understand modern tools of data analytics and the hardware on which they work. Now AI and HPC, along with cloud computing and interactive programming languages, are becoming essential tools for geoscientists. Here we discuss the current state of AI and HPC in Earth science and anticipate future trends that will shape applications of these developing technologies in the field. We also propose that it is time to rethink graduate and professional education to account for and capitalize on these quickly emerging tools.

    Work in Progress

    Great strides in AI capabilities, including speech and facial recognition, have been made over the past decade, but the origins of these capabilities date back much further. In 1971, the Defense Advanced Research Projects Agency (US) substantially funded a project called Speech Understanding Research [Journal of the Acoustical Society of America], and it was generally believed at the time that artificial speech recognition was just around the corner. We know now that this was not the case, as today’s speech and writing recognition capabilities emerged only as a result of both vastly increased computing power and conceptual breakthroughs such as the use of multilayered neural networks, which mimic the biological structure of the brain.

    Recently, AI has gained the ability to create images of artificial faces that humans cannot distinguish from real ones by using generative adversarial networks (GANs). These networks combine two neural networks, one that produces a model and a second one that tries to discriminate the generated model from the real one. Scientists have now started to use GANs to generate artificial geoscientific data sets.

    These and other advances are striking, yet AI and many other artificial computing tools are still in their infancy. We cannot predict what AI will be able to do 20–30 years from now, but a survey of existing AI applications recently showed that computing power is the key when targeting practical applications today. The fact that AI is still in its early stages has important implications for HPC in the geosciences. To date, geoscientific HPC studies have been dominated by large-scale time-dependent numerical simulations that use physical observations to generate models [Morra et al., 2021a*]. In the future, however, we may work in the other direction—Earth, ocean, and atmospheric simulations may feed large AI systems that in turn produce artificial data sets that allow geoscientific investigations, such as Destination Earth, for which collected data are insufficient.

    *all citations are included in References below.

    Data-Centric Geosciences

    Development of AI capabilities is well underway in certain geoscience disciplines. For a decade now [Ma et al., 2019], remote sensing operations have been using convolutional neural networks (CNNs), a kind of neural network that adaptively learns which features to look at in a data set. In seismology (Figure 1), pattern recognition is the most common application of machine learning (ML), and recently, CNNs have been trained to find patterns in seismic data [Kong et al., 2019], leading to discoveries such as previously unrecognized seismic events [Bergen et al., 2019].

    2
    Fig. 1. Example of a workflow used to produce an interactive “visulation” system, in which graphic visualization and computer simulation occur simultaneously, for analysis of seismic data. Credit: Ben Kadlec.
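
    As a rough illustration of the supervised pattern-recognition approach described above (and not code from any of the cited studies), the following sketch defines a small one-dimensional CNN that could be trained to label fixed-length seismic waveform windows as "event" or "noise"; the architecture, window length, and data here are all placeholders.

```python
# A minimal sketch (PyTorch, synthetic data) of the kind of CNN used to label
# seismic waveform windows; illustrative only.
import torch
import torch.nn as nn

class SeismicCNN(nn.Module):
    def __init__(self, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(4),
        )
        self.classifier = nn.Linear(32 * (1024 // 16), n_classes)

    def forward(self, x):           # x: (batch, 1, 1024) waveform windows
        h = self.features(x)
        return self.classifier(h.flatten(1))

model = SeismicCNN()
waveforms = torch.randn(8, 1, 1024)        # stand-in for labeled seismograms
labels = torch.randint(0, 2, (8,))
loss = nn.CrossEntropyLoss()(model(waveforms), labels)
loss.backward()                            # one supervised training step (no optimizer shown)
print(loss.item())
```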

    New AI applications and technologies are also emerging; these involve, for example, the self-ordering of seismic waveforms to detect structural anomalies in the deep mantle [Kim et al., 2020]. Recently, deep generative models, which are based on neural networks, have shown impressive capabilities in modeling complex natural signals, with the most promising applications in autoencoders and GANs (e.g., for generating images from data).

    CNNs are a form of supervised machine learning (SML), meaning that before they are applied for their intended use, they are first trained to find prespecified patterns in labeled data sets and to check their accuracy against an answer key. Training a neural network using SML requires large, well-labeled data sets as well as massive computing power. Massive computing power, in turn, requires massive amounts of electricity, such that the energy demand of modern AI models is doubling every 3.4 months and causing a large and growing carbon footprint.

    In the future, the trend in geoscientific applications of AI might shift from using bigger CNNs to using more scalable algorithms that can improve performance with less training data and fewer computing resources. Alternative strategies will likely involve less energy-intensive neural networks, such as spiking neural networks, which reduce data inputs by analyzing discrete events rather than continuous data streams.

    Unsupervised ML (UML), in which an algorithm identifies patterns on its own rather than searching for a user-specified pattern, is another alternative to data-hungry SML. One type of UML identifies unique features in a data set to allow users to discover anomalies of interest (e.g., evidence of hidden geothermal resources in seismic data) and to distinguish trends of interest (e.g., rapidly versus slowly declining production from oil and gas wells based on production rate transients) [Vesselinov et al., 2019].
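
    A toy version of this idea, under the assumption that the well records can be arranged as a non-negative matrix of production-rate time series, is sketched below using non-negative matrix factorization, a simpler matrix-based relative of the tensor factorization used by Vesselinov et al. [2019]; the two "decline signatures" and the synthetic wells are invented for illustration.

```python
# Unsupervised separation of synthetic well records into latent "signatures".
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 200)
fast = np.exp(-8 * t)              # rapidly declining production signature
slow = np.exp(-1.5 * t)            # slowly declining production signature

# 30 synthetic wells: random mixtures of the two signatures plus noise.
weights = rng.random((30, 2))
X = weights @ np.vstack([fast, slow]) + 0.01 * rng.random((30, 200))

model = NMF(n_components=2, init="nndsvda", max_iter=1000, random_state=0)
W = model.fit_transform(X)         # per-well loadings on each signature
H = model.components_              # recovered decline signatures
print("reconstruction error:", model.reconstruction_err_)
print("well 0 loadings:", W[0])
```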

    AI is also starting to improve the efficiency of geophysical sensors. Data storage limitations require instruments such as seismic stations, acoustic sensors, infrared cameras, and remote sensors to record and save data sets that are much smaller than the total amount of data they measure. Some sensors use AI to detect when “interesting” data are recorded, and these data are selectively stored. Sensor-based AI algorithms also help minimize the energy consumption of, and prolong the life of, sensors located in remote regions, which are difficult to service and often powered by a single solar panel. These techniques include quantized CNNs (using 8-bit variables) running on minimal hardware, such as Raspberry Pi [Wilkes et al., 2017].
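
    The storage-saving logic can be sketched with a classical short-term-average/long-term-average (STA/LTA) trigger standing in for the on-sensor neural network (real deployments such as Wilkes et al. [2017] use quantized CNNs instead); only windows that exceed the trigger threshold would be written to storage.

```python
# Classical stand-in for on-sensor event detection and selective storage.
import numpy as np

rng = np.random.default_rng(1)
fs = 100                                  # samples per second (assumed)
signal = rng.normal(0, 1, 60 * fs)        # one minute of background noise
signal[3000:3200] += 8 * np.sin(np.linspace(0, 40 * np.pi, 200))  # synthetic event

def sta_lta(x, n_sta=50, n_lta=1000):
    """Ratio of short-term to long-term average energy."""
    energy = x ** 2
    sta = np.convolve(energy, np.ones(n_sta) / n_sta, mode="same")
    lta = np.convolve(energy, np.ones(n_lta) / n_lta, mode="same")
    return sta / np.maximum(lta, 1e-12)

ratio = sta_lta(signal)
keep = ratio > 4.0                        # store only samples above threshold
print(f"stored {keep.mean() * 100:.1f}% of the samples")
```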

    Advances in Computing Architectures

    Powerful, efficient algorithms and software represent only one part of the data revolution; the hardware and networks that we use to process and store data have evolved significantly as well.

    Since about 2004, when the increase in frequencies at which processors operate stalled at about 3 gigahertz (the end of Dennard scaling, often described loosely as the end of Moore’s law), computing power has been augmented by increasing the number of cores per CPU and by the parallel work of cores in multiple CPUs, as in computing clusters.

    Accelerators such as graphics processing units (GPUs), once used mostly for video games, are now routinely used for AI applications and are at the heart of all major ML facilities (as well as the DOE’s Exascale Computing Project (US), a part of the National Strategic Computing Initiative – NSF (US)). For example, Summit and Sierra, the two fastest supercomputers in the United States, are based on a hierarchical CPU-GPU architecture.

    Meanwhile, emerging tensor processing units, which were developed specifically for matrix-based operations, excel at the most demanding tasks of most neural network algorithms. In the future, computers will likely become increasingly heterogeneous, with a single system combining several types of processors, including specialized ML coprocessors (e.g., Cerebras) and quantum computing processors.

    Computational systems that are physically distributed across remote locations and used on demand, usually called cloud computing, are also becoming more common, although these systems impose limitations on the code that can be run on them. For example, cloud infrastructures, in contrast to centralized HPC clusters and supercomputers, are not designed for performing large-scale parallel simulations. Cloud infrastructures face limitations on high-throughput interconnectivity, and the synchronization needed to help multiple computing nodes coordinate tasks is substantially more difficult to achieve for physically remote clusters. Although several cloud-based computing providers are now investing in high-throughput interconnectivity, the problem of synchronization will likely remain for the foreseeable future.

    Boosting 3D Simulations

    Artificial intelligence has proven invaluable in discovering and analyzing patterns in large, real-world data sets. It could also become a source of realistic artificial data sets, generated through models and simulations. Artificial data sets enable geophysicists to examine problems that are unwieldy or intractable using real-world data—because these data may be too costly or technically demanding to obtain—and to explore what-if scenarios or interconnected physical phenomena in isolation. For example, simulations could generate artificial data to help study seismic wave propagation; large-scale geodynamics; or flows of water, oil, and carbon dioxide through rock formations to assist in energy extraction and storage.

    HPC and cloud computing will help produce and run 3D models, not only assisting in improved visualization of natural processes but also allowing for investigation of processes that can’t be adequately studied with 2D modeling. In geodynamics, for example, using 2D modeling makes it difficult to calculate 3D phenomena like toroidal flow and vorticity because flow patterns are radically different in 3D. Meanwhile, phenomena like crustal porosity waves [Geophysical Research Letters] (waves of high porosity in rocks; Figure 2) and corridors of fast-moving ice in glaciers require extremely high spatial and temporal resolutions in 3D to capture [Räss et al., 2020].

    3
    Fig. 2. A 3D modeling run with 16 billion degrees of freedom simulates flow focusing in porous media and identifies a pulsed behavior phenomenon called porosity waves. Credit: Räss et al. [2018], CC BY 4.0.

    Adding an additional dimension to a model can require a significant increase in the amount of data processed. For example, in exploration seismology, going from a 2D to a 3D simulation involves a transition from requiring three-dimensional data (i.e., source, receiver, time) to five-dimensional data (source x, source y, receiver x, receiver y, and time [e.g., Witte et al., 2020]). AI can help with this transition. At the global scale, for example, the assimilation of 3D simulations in iterative full-waveform inversions for seismic imaging was performed recently with limited real-world data sets, employing AI techniques to maximize the amount of information extracted from seismic traces while maintaining the high quality of the data [Lei et al., 2020].
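
    A back-of-the-envelope sketch of that dimensional jump, with entirely made-up survey sizes, shows why the 3D case quickly outgrows a single machine's memory:

```python
# Back-of-the-envelope sizes for the 2D-to-3D jump described above, using
# made-up survey dimensions: a 2D line survey is indexed by (source, receiver,
# time); a 3D survey by (source_x, source_y, receiver_x, receiver_y, time).
import numpy as np

n_src, n_rec, n_t = 200, 400, 2000
samples_2d = n_src * n_rec * n_t

nsx, nsy, nrx, nry = 100, 100, 200, 200
samples_3d = nsx * nsy * nrx * nry * n_t

bytes_per_sample = np.dtype(np.float32).itemsize
print(f"2D survey: {samples_2d * bytes_per_sample / 1e9:.2f} GB")   # ~0.64 GB
print(f"3D survey: {samples_3d * bytes_per_sample / 1e12:.2f} TB")  # ~3.2 TB
```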

    Emerging Methods and Enhancing Education

    As far as we’ve come in developing AI for uses in geoscientific research, there is plenty of room for growth in the algorithms and computing infrastructure already mentioned, as well as in other developing technologies. For example, interactive programming, in which the programmer develops new code while a program is active, and language-agnostic programming environments that can run code in a variety of languages are young techniques that will facilitate introducing computing to geoscientists.

    Programming languages such as Python and Julia, which are now being taught to Earth science students, will accompany the transition to these new methods and will be used in interactive environments such as Jupyter notebooks. Julia, which compiles to efficient native code, has recently been shown to perform well for machine learning algorithms, particularly in implementations that use differentiable programming, which reduces computational resource and energy requirements.

    Quantum computing, which stores and processes information in quantum states rather than in classical streams of electrons, is another promising development that is still in its infancy but that may lead to the next major scientific revolution. It is forecast that by the end of this decade, quantum computers will be applied in solving many scientific problems, including those related to wave propagation, crustal stresses, atmospheric simulations, and other topics in the geosciences. With competition from China in developing quantum technologies and AI, quantum computing and quantum information applications may become darlings of major funding opportunities, offering the means for ambitious geophysicists to pursue fundamental research.

    Taking advantage of these new capabilities will, of course, require geoscientists who know how to use them. Today, many geoscientists face enormous pressure to requalify themselves for a rapidly changing job market and to keep pace with the growing complexity of computational technologies. Academia, meanwhile, faces the demanding task of designing innovative training to help students and others adapt to market conditions, although finding professionals who can teach these courses is challenging because they are in high demand in the private sector. However, such teaching opportunities could provide a point of entry for young scientists specializing in computer science or part-time positions for professionals retired from industry or national labs [Morra et al., 2021b].

    The coming decade will see a rapid revolution in data analytics that will significantly affect the processing and flow of information in the geosciences. Artificial intelligence and high-performance computing are the two central elements shaping this new landscape. Students and professionals in the geosciences will need new forms of education enabling them to rapidly learn the modern tools of data analytics and predictive modeling. If done well, the concurrence of these new tools and a workforce primed to capitalize on them could lead to new paradigm-shifting insights that, much as the plate tectonic revolution did, help us address major geoscientific questions in the future.

    Acknowledgments:

    The listed authors thank Peter Gerstoft, Scripps Institution of Oceanography (US), University of California, San Diego; Henry M. Tufo, University of Colorado-Boulder (US); and David A. Yuen, Columbia University (US) and Ocean University of China [中國海洋大學](CN), Qingdao, who contributed equally to the writing of this article.

    References:

    Bergen, K. J., et al. (2019), Machine learning for data-driven discovery in solid Earth geoscience, Science, 363(6433), eaau0323, https://doi.org/10.1126/science.aau0323.

    Kim, D., et al. (2020), Sequencing seismograms: A panoptic view of scattering in the core-mantle boundary region, Science, 368(6496), 1,223–1,228, https://doi.org/10.1126/science.aba8972.

    Kong, Q., et al. (2019), Machine learning in seismology: Turning data into insights, Seismol. Res. Lett., 90(1), 3–14, https://doi.org/10.1785/0220180259.

    Lei, W., et al. (2020), Global adjoint tomography—Model GLAD-M25, Geophys. J. Int., 223(1), 1–21, https://doi.org/10.1093/gji/ggaa253.

    Ma, L., et al. (2019), Deep learning in remote sensing applications: A meta-analysis and review, ISPRS J. Photogramm. Remote Sens., 152, 166–177, https://doi.org/10.1016/j.isprsjprs.2019.04.015.

    Morra, G., et al. (2021a), Fresh outlook on numerical methods for geodynamics. Part 1: Introduction and modeling, in Encyclopedia of Geology, 2nd ed., edited by D. Alderton and S. A. Elias, pp. 826–840, Academic, Cambridge, Mass., https://doi.org/10.1016/B978-0-08-102908-4.00110-7.

    Morra, G., et al. (2021b), Fresh outlook on numerical methods for geodynamics. Part 2: Big data, HPC, education, in Encyclopedia of Geology, 2nd ed., edited by D. Alderton and S. A. Elias, pp. 841–855, Academic, Cambridge, Mass., https://doi.org/10.1016/B978-0-08-102908-4.00111-9.

    Räss, L., N. S. C. Simon, and Y. Y. Podladchikov (2018), Spontaneous formation of fluid escape pipes from subsurface reservoirs, Sci. Rep., 8, 11116, https://doi.org/10.1038/s41598-018-29485-5.

    Räss, L., et al. (2020), Modelling thermomechanical ice deformation using an implicit pseudo-transient method (FastICE v1.0) based on graphical processing units (GPUs), Geosci. Model Dev., 13, 955–976, https://doi.org/10.5194/gmd-13-955-2020.

    Vesselinov, V. V., et al. (2019), Unsupervised machine learning based on non-negative tensor factorization for analyzing reactive-mixing, J. Comput. Phys., 395, 85–104, https://doi.org/10.1016/j.jcp.2019.05.039.

    Wilkes, T. C., et al. (2017), A low-cost smartphone sensor-based UV camera for volcanic SO2 emission measurements, Remote Sens., 9(1), 27, https://doi.org/10.3390/rs9010027.

    Witte, P. A., et al. (2020), An event-driven approach to serverless seismic imaging in the cloud, IEEE Trans. Parallel Distrib. Syst., 31, 2,032–2,049, https://doi.org/10.1109/TPDS.2020.2982626.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

     
  • richardmitnick 10:37 pm on June 11, 2021 Permalink | Reply
    Tags: "People of PI- Women in STEM Christine Muschik enjoys the best of both worlds", Classical Computing, Hybrid computing systems offer new approaches to simulation and analysis that can move fields from cosmology to particle physics forward., Muschik has developed new computational approaches that squeeze more power out of existing quantum technologies and flow the results into a feedback loop with classical processors., Perimeter Institute (CA), Quantum Computing, , Quantum-classical hybrid computers, Reconciling quantum mechanics and classical physics, We want to have quantum simulations that work not only on paper but that also lead to proof-of-concept demonstrations.   

    From Perimeter Institute for Theoretical Physics (CA): “People of PI- Women in STEM Christine Muschik enjoys the best of both worlds” 

    Perimeter Institute

    From Perimeter Institute for Theoretical Physics (CA)

    Jun 02, 2021
    Patchen Barss

    1
    Christine Muschik. Credit: Perimeter Institute.

    When Christine Muschik was completing her PhD in the 2010s, the privilege of running an experiment on a quantum computer required the kind of money, fame, or insider connections few early-career researchers could muster.

    “If you had an idea, it was very difficult to get it implemented on hardware. Hardware development was just crazy expensive,” says Muschik, who is now an associate faculty member at Perimeter Institute, cross-appointed with the University of Waterloo (CA).

    As a theorist with great interest in moving from the whiteboard to real-world experiments, Muschik continually strained against technological and logistical restrictions. At every opportunity, she pushed to advance both the science and the technology.

    “We want to have quantum simulations that work not only on paper but that also lead to proof-of-concept demonstrations. This is why we work with experimental teams: To make it real. To bring it to life in the lab,” she says.

    A decade later, quantum hardware has advanced more quickly than anyone could have imagined. Quantum tech is still in its early, fragile, somewhat experimental stages, but it has become much more accessible, freeing Muschik’s curiosity and intellect.

    “We’re all surprised by the rapid acceleration of hardware development,” she says. “It’s happening because industry – the Googles and the IBMs – are getting on board and pumping a lot of money into it. Everybody is hunting after the ‘quantum advantage.’ For our last publication, we just ran the program we needed on an IBM cloud-based quantum computer.”

    A 2019 report in Nature estimated that private investors in North America and Europe had poured at least $450 million (USD) into quantum technology start-up companies in the preceding two years. (Similar information was not available for China, which has become a powerhouse of quantum technology.)

    Many of these start-ups are racing with each other, and with established tech giants, to achieve “quantum supremacy,” an industry term for the milestone of a quantum computer solving a useful or interesting problem that is impossible for classical computers. The term is ambiguous (Google claimed they had achieved quantum supremacy in 2019, but critics disputed it) and also deceptive: Quantum supremacy does not mean that quantum computers take over from conventional computers. Each is suited for different types of computational challenges.

    Muschik has been working to combine the best of both.

    “She understands both the intricacies of complex theories and the subtleties of experimental implementation,” says Raymond Laflamme, the Mike and Ophelia Lazaridis John von Neumann Chair in Quantum Information at Waterloo’s Institute for Quantum Computing. “She is very hands on in both areas, which makes her stand out.”

    Muschik has developed new computational approaches that squeeze more power out of existing quantum technologies and flow the results into a feedback loop with classical processors, creating increasingly capable hybrid systems. While this work will inevitably make waves in the commercial tech sector, she is more interested in using these tools to create new knowledge.

    “The whole guiding theme of my group involves one question: What if quantum computers could help us to make new scientific discoveries?” she says. She’s interested in questions about matter and antimatter, the inner workings of neutron stars, and other mysteries that conventional computers haven’t been able to solve.

    Hybrid computing systems offer new approaches to simulation and analysis that can move fields from cosmology to particle physics forward. But even before she begins exploring questions from other fields, Muschik’s core work in developing such systems already helps advance a central challenge that has confounded theoretical physicists for decades: reconciling quantum mechanics and classical physics.

    Each of these powerful theoretical frameworks does a great job of describing the universe from a specific perspective: Quantum mechanics covers the subatomic world of protons and quarks. Classical physics describes the macroscopic world of people and planets. Each provides an accurate and precise description of the same physical reality, as seen from a different point of view.

    But each is inconsistent and incompatible with the other.

    Quantum-classical hybrid computers send information back and forth between these contradictory frameworks, using both to solve problems and run simulations with implications for the aerospace industry, drug discovery, financial services, and many areas of scientific research.

    Muschik makes the technology sound easy.

    “It’s all about how you formulate the problem,” she says. “You take a question like ‘Why is there more matter than antimatter?’ You reformulate your question in the form of an optimization problem. I teach my quantum core processor to analyze this problem and spit out numbers. And classical computers know how to deal with numbers.”
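
    That feedback loop can be caricatured in a few lines of Python: a classical optimizer repeatedly queries a quantum processor (here replaced by a two-line simulation of a single qubit) for the energy of a trial state and adjusts the parameters. This is only a schematic of the general variational approach, not Muschik's actual algorithms or hardware.

```python
# Bare-bones quantum-classical loop: classical optimizer + (simulated) quantum
# evaluation. Single qubit, H = Z, trial state Ry(theta)|0>, so <H> = cos(theta).
import numpy as np
from scipy.optimize import minimize

Z = np.array([[1.0, 0.0], [0.0, -1.0]])

def quantum_expectation(theta):
    """Stand-in for a hardware run: prepare Ry(theta)|0>, measure <Z>."""
    state = np.array([np.cos(theta[0] / 2), np.sin(theta[0] / 2)])
    return state @ Z @ state

result = minimize(quantum_expectation, x0=[0.1], method="COBYLA")
print("optimal theta:", result.x[0], " minimum energy:", result.fun)  # expect ~pi, -1
```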

    Muschik works on applications for existing “noisy intermediate-scale quantum” (NISQ) computers, but also plans projects that benefit – and benefit from – continuing technological developments.

    “We play a dual role, not only simulating the physics now, but also focusing on method development for future quantum computers,” she says. “This is how you pave the way to scale it up for future generations.”

    Muschik oversees the Quantum Simulations of Fundamental Interactions initiative, a joint venture of Perimeter Institute and the Institute for Quantum Computing at the University of Waterloo. Among other things, her lab is developing the technology to simulate forces and particles that extend beyond the Standard Model of particle physics. The rapid advance of quantum computers over the past decades has made it much more possible to simulate quantum fields and fundamental forces that the Standard Model can’t explain.

    “Where our understanding fails is the most interesting part. It is a hint about where we can find new physics. The models beyond the Standard Model are freaking difficult. Standard methods cannot tackle them. My personal computer cannot tackle them. The biggest supercomputer cannot. Even future supercomputer centres that are only planned – even those will not be able to tackle these questions,” she says.

    “And you can say, ‘Ok, we should give up.’ But this is a tremendous opportunity. Quantum computers right now are too small, but they have tremendous promise to answer these big, deep, open questions.”

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Perimeter Institute for Theoretical Physics (CA) is a leading centre for scientific research, training and educational outreach in foundational theoretical physics. Founded in 1999 in Waterloo, Ontario, Canada, its mission is to advance our understanding of the universe at the most fundamental level, stimulating the breakthroughs that could transform our future. Perimeter also trains the next generation of physicists through innovative programs, and shares the excitement and wonder of science with students, teachers and the general public.

    The Institute’s founding and major benefactor is Canadian entrepreneur and philanthropist Mike Lazaridis.

    The original building, designed by Saucier + Perrotte, opened in 2004 and was awarded a Governor General’s Medal for Architecture in 2006. The Stephen Hawking Centre, designed by Teeple Architects, was opened in 2011 and was LEED Silver certified in 2015.

    In addition to research, Perimeter also provides scientific training and educational outreach activities to the general public. This is done in part through Perimeter’s Educational Outreach team.

    Research

    Perimeter’s research encompasses nine fields:

    Cosmology
    Mathematical physics
    Particle Physics
    Quantum fields and strings
    Quantum foundations
    Quantum gravity
    Quantum information
    Quantum matter
    Strong gravity

     
  • richardmitnick 12:35 pm on May 29, 2021 Permalink | Reply
    Tags: "Energy on Demand-Learning from Nature’s Catalysts", "Janus intermediate", , “Big questions in biocatalysis”-specifically how to control matter and energy., , , , , Enzymes: nature’s catalysts, How natural catalysts churn out specific reactions-over and over-in the blink of an eye., Nitrogenase: an enzyme found in soil-dwelling microorganisms that has a unique ability to break apart nitrogen’s triple bond-one of the strongest bonds in nature., , , Quantum Computing   

    From DOE’s Pacific Northwest National Laboratory (US) : “Energy on Demand-Learning from Nature’s Catalysts” 

    From DOE’s Pacific Northwest National Laboratory (US)

    April 26, 2021 [Just now in social media.]
    Lynne Roeder, PNNL

    New Energy Sciences Center, quantum chemistry to accelerate enzyme research.

    1
    Nitrogenase. Credit: PNNL.

    About 15 years ago, Simone Raugei started simulating chemistry experiments at the molecular level.

    Today, as part of a top-notch research team aided by advanced computing, Raugei and his colleagues stand primed to crack an important hidden code: nature’s intricate method for releasing energy on demand.

    “We want to know how to funnel energy precisely at the right time, in the right spot, to perform the chemical reaction we want—just like enzymes do in nature,” said Raugei, a computational scientist who leads the physical biosciences research at Pacific Northwest National Laboratory (PNNL). “Advances in computing have helped us make tremendous progress in the past five or six years. We now have a critical mass of capabilities and knowledge.”

    The research is part of PNNL’s focus on reinventing chemical conversions, which supports the goals of the U.S. Department of Energy Office of Science, Basic Energy Sciences (BES) program. One of the program’s many goals is to understand, at an atomic level, how natural catalysts churn out specific reactions, over and over, in the blink of an eye.

    The ability to mimic these natural reactions could profoundly improve the design of new synthetic catalysts for producing cleaner and more efficient energy, industrial processes, and materials.

    Raugei described the BES Physical Biosciences program as the visionary effort that brought together individual research groups and experimentalists to collaborate on “big questions in biocatalysis”—specifically, how to control matter and energy.

    The questions don’t get much bigger than that.

    Enzymes: nature’s catalysts

    At PNNL, Raugei teams closely with fellow computational scientists Bojana Ginovska and Marcel Baer to examine the inner workings of enzymes. Found within every living cell, these minuscule multi-taskers direct all sorts of reactions for different functions.

    Through feedback loops between theory, computer simulations, and experimentation among PNNL and university collaborators, the scientists have made steady progress in uncovering the molecular machinations of several types of enzymes. They are particularly interested in nitrogenase, an enzyme found in soil-dwelling microorganisms that has a unique ability to break apart nitrogen’s triple bond—one of the strongest bonds in nature. That molecular fracture, which occurs in the buried active core of nitrogenase, produces ammonia.

    In the world of commercial chemistry, ammonia is used to make many valuable products, such as fertilizer. But producing ammonia at an industrial scale takes a lot of energy. Much of that energy is spent trying to break nitrogen’s sturdy triple bonds. Figuring out how nature does it so efficiently is key to designing new synthetic catalysts that improve the production process for ammonia and other commercial products.


    Quantum Chemistry. Credit: PNNL.

    Nitrogenase: cracking the code

    About two years ago, the team of PNNL and university scientists isolated the elusive molecular structure inside nitrogenase—called the Janus intermediate—that represents the ‘point of no return’ in the production of ammonia. The researchers found that two negatively charged hydrogens, called hydrides, form bridges with two iron ions. Those bridges allow four extra electrons to park inside the core cluster of atoms.

    The team’s latest research confirmed the shuffling of electrons within the protein environment, packing in enough energy to break apart the nitrogen bonds and form ammonia. Powerful spectroscopy techniques were used to probe the magnetic interactions between electrons in the enzyme’s metallic core. Those interactions were then correlated with quantum simulations of the enzyme’s transformation to yield the molecular structure of the Janus intermediate.

    “The energetics of the electron delivery are amazing,” said Raugei. “When you think of adding electrons into a tiny cluster of atoms, one electron is difficult, two is harder, three is really hard, and to add the fourth is generally considered impossible. But we found that’s how it happens.”

    Lance Seefeldt, a professor at Utah State University (US) who holds a joint appointment at PNNL, leads the experimental work for the team’s nitrogenase research. Another key collaborator, and the “mastermind behind the spectroscopy measurements” according to Raugei, is Brian Hoffman from Northwestern University (US). The team’s most recent findings about nitrogenase were published in the Journal of the American Chemical Society in December 2020.

    Quantum chemistry collaborations

    Ginovska helps direct the day-to-day activities of the group’s postdoctoral researchers working on the project. She credits Raugei with establishing and maintaining connections among the scientific community to spur progress on enzyme research.

    “As a theoretical hub, we collaborate with universities and other national laboratories for the experimental aspects of the research,” said Ginovska. “We started with nitrogenase and it grew from there. We are now working on several enzymatic systems. All of that work is feeding into the same knowledge base.”

    Karl Mueller, chief science and technology officer for PNNL’s Physical and Computational Sciences Directorate, said nitrogenase is a prime example of the challenging problems that can be tackled at a national laboratory through collaboration between experimental and computational scientists, including university researchers. As the scientists prepare to move into PNNL’s new Energy Sciences Center in the fall of 2021, Raugei is confident the enhanced capabilities and collaborative environment will help the team soon crack the remaining code of how nitrogenase forms ammonia.

    “We know that it has to do with adding hydrogen atoms, but how? There are a multitude of possible pathways and that’s what we’re looking into now,” said Raugei. “This is definitely an application where breakthroughs in quantum computing will accelerate our research and elevate our understanding of complex systems.”

    As the pace of scientific progress speeds forward, nitrogenase is just one example of how the promise of quantum chemistry, quantum computing, and PNNL’s Energy Sciences Center could help answer the next big question in catalysis.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    DOE’s Pacific Northwest National Laboratory (PNNL) (US) is one of the United States Department of Energy National Laboratories, managed by the Department of Energy’s Office of Science. The main campus of the laboratory is in Richland, Washington.

    PNNL scientists conduct basic and applied research and development to strengthen U.S. scientific foundations for fundamental research and innovation; prevent and counter acts of terrorism through applied research in information analysis, cyber security, and the nonproliferation of weapons of mass destruction; increase the U.S. energy capacity and reduce dependence on imported oil; and reduce the effects of human activity on the environment. PNNL has been operated by Battelle Memorial Institute since 1965.

     
  • richardmitnick 9:24 am on May 18, 2021 Permalink | Reply
    Tags: "Scientists Got Photons to Interact Taking Step Towards Long-Living Quantum Memory", , Photons as superconductive qubits, Quantum Computing, Superconducting qubits are a leading qubit modality today that is currently being pursued by industry and academia for quantum computing applications., Superconducting waveguide quantum electrodynamics   

    From National University of Science and Technology MISiS (NUST) [ Национальный исследовательский университет “МИСиС”](RU): “Scientists Got Photons to Interact Taking Step Towards Long-Living Quantum Memory” 

    From National University of Science and Technology MISiS (NUST) [ Национальный исследовательский университет “МИСиС”](RU)

    April 19, 2021

    1
    Quantum computer. Credit: Sergey Gnuskov/NUST MISIS.

    An international research team obtained experimental evidence for effective interaction between microwave photons via superconductive qubits for the first time. The study, published in npj Quantum Materials, may be a step towards the implementation of a long-living quantum memory and the development of commercial quantum devices.

    Scientists believe that individual light particles, or photons, are ideally suited for sending quantum information. Encoded with quantum data, they could literally transfer information at the speed of light. However, while photons would make for great carriers because of their speed, they don’t like to interact with each other, making it difficult to achieve quantum entanglement.

    A team of scientists from NUST MISIS, Russian Quantum Center, the Ioffe Physical-Technical Institute of the Russian Academy of Sciences [Физико-технический институт им. А. Ф. Иоффе] (RU) and KIT Karlsruhe Institute of Technology [Karlsruher Institut für Technologie] (DE), for the first time, made photons interact with each other effectively using an array of superconducting qubits and a waveguide. In their experiments, the researchers used photons with frequencies of a few gigahertz and wavelengths of a few centimeters.

    “We used superconducting qubits, which are basically artificial atoms, because they have been proven to strongly interact with light. Interaction between natural atoms and natural light is weak due to the small size of natural atoms. Superconducting qubits are man-made; their size can reach up to 0.1 mm, which makes it possible to significantly increase their dipole moment and polarity, engineering strong interaction between light and matter,” noted Prof. Alexey Ustinov, Head of the Laboratory for Superconducting Metamaterials at NUST MISIS and Group Head at Russian Quantum Center, who co-authored the study.

    Superconducting qubits are a leading qubit modality, currently pursued by industry and academia for quantum computing applications. However, they require millikelvin (mK) temperatures to operate. The most powerful existing superconducting quantum devices contain under 100 qubits. As qubits are added, the computational power of a quantum computer grows exponentially, but the number of qubits that can be integrated in a single machine is limited by the size of the refrigerators used to cool them to operational temperatures. With this in mind, the scientific community has recently focused on increasing the processing power of quantum computers by transmitting quantum signals from one refrigerator to another. To engineer this transmission, the scientists coupled an array of eight superconducting transmon qubits to a common waveguide — a structure that guides waves, e.g. light waves.

    “By employing dedicated flux-bias lines for each qubit, we establish control over their transition frequencies. It was derived and experimentally verified that multiple qubits obtain an infinite range photon mediated effective interaction, which can be tuned with the inter-qubit distance,” says Alexey Ustinov.
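
    The distance dependence mentioned in the quote can be illustrated with the textbook waveguide-QED expressions for photon-mediated interactions (a generic model, not the specific parameters or fitting used in the study): qubits separated by a distance d along the waveguide acquire a coherent exchange coupling proportional to sin(kd) and a collective decay proportional to cos(kd).

```python
# Generic waveguide-QED couplings for a chain of qubits along a waveguide:
# J_ij ~ sin(k|x_i - x_j|) (coherent exchange), Gamma_ij ~ cos(k|x_i - x_j|)
# (collective decay). All parameters below are arbitrary.
import numpy as np

gamma = 1.0                         # single-qubit emission rate into the waveguide
wavelength = 2.0                    # guided wavelength, arbitrary units
k = 2 * np.pi / wavelength
x = np.arange(8) * 0.4              # eight qubits, hypothetical 0.4-unit spacing

d = np.abs(x[:, None] - x[None, :])
J = 0.5 * gamma * np.sin(k * d)     # photon-mediated exchange coupling
Gamma = gamma * np.cos(k * d)       # correlated (collective) decay

np.set_printoptions(precision=2, suppress=True)
print("exchange couplings J_ij:\n", J)
print("collective decay rates Gamma_ij:\n", Gamma)
```

    Tuning the qubit transition frequencies (and hence the effective spacing in units of the guided wavelength) changes these matrix elements, which is the sense in which the interaction can be tuned with inter-qubit distance.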

    The circuit developed in this work extends experiments with one and two qubits toward a full-blown quantum metamaterial, paving the way for large-scale applications in superconducting waveguide quantum electrodynamics.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The National University of Science and Technology “MISiS” (formerly Moscow Institute of Steel and Alloys State Technological University [Национальный исследовательский университет “МИСиС”] ) is Russia’s primary technological university in the field of steelmaking and metallurgy. It was established in 1918 as a part of the Moscow Mining Academy. In 1930, it became independent. During Stalin’s regime, the Institute was renamed the Stalin Moscow Institute of Steel. It adopted the name Moscow Institute of Steel and Alloys in 1962 after uniting with the Institute of Nonferrous Metals and Gold. The status of Technological University was awarded in 1993 and the status of a National University in 2008, when the institution adopted its current name.

    MISIS is the leading university of the Higher Metallurgical Education Association, whose members include universities from Russia, Ukraine, and Kazakhstan. It has joint degree programmes with the Technical University of Bergakademie Freiberg [Technisc Universität Bergakademie Freiberg] (DE) in Freiberg, Germany, and the National Polytechnic Institute of Lorraine [Institut National Polytechnique de Lorraine] (FR) at University of Lorraine [Université de Lorraine](FR)in Nancy, France.

    Institutes

    Institute of Metallurgy, Ecology and Quality
    Institute of Physical Chemistry of Materials
    Institute of Materials Technology
    Institute of Mining
    Institute of Computer Science and Economics
    Institute of Humanities
    Institute of Information Business Systems
    Faculty of Part-Time Education
    Post Higher Education Centre
    Preparatory Faculty
    Mining

     
  • richardmitnick 7:33 am on May 12, 2021 Permalink | Reply
    Tags: "Light meets superconducting circuits", , Enabling the engineering of large-scale quantum systems without requiring enormous cryogenic cooling power., HEMTs-low noise high electron mobility transistors, , Microwave superconducting circuit platforms, Optical fibers are about 100 times better heat isolators than coaxial cables and are 100 times more compact., Quantum Computing, , Replacing HEMT amplifiers and coaxial cables with a lithium niobate electro-optical phase modulator and optical fibers respectively., , Using light to read out superconducting circuits thus overcoming the scaling challenges of quantum systems.   

    From Swiss Federal Institute of Technology in Lausanne [EPFL-École Polytechnique Fédérale de Lausanne](CH): “Light meets superconducting circuits” 

    From Swiss Federal Institute of Technology in Lausanne [EPFL-École Polytechnique Fédérale de Lausanne](CH)

    5.12.21
    Amir Youssefi
    Nik Papageorgiou

    EPFL researchers have developed a light-based approach to read out superconducting circuits, overcoming the scaling-up limitations of quantum computing systems.

    1
    No image caption or credit.

    In the last few years, several technology companies including Google, Microsoft, and IBM, have massively invested in quantum computing systems based on microwave superconducting circuit platforms in an effort to scale them up from small research-oriented systems to commercialized computing platforms. But fulfilling the potential of quantum computers requires a significant increase in the number of qubits, the building blocks of quantum computers, which can store and manipulate quantum information.

    But quantum signals can be contaminated by thermal noise generated by the movement of electrons. To prevent this, superconducting quantum systems must operate at ultra-low temperatures – less than 20 milli-Kelvin – which can be achieved with cryogenic helium-dilution refrigerators.

    The output microwave signals from such systems are amplified at low temperatures by low-noise high-electron-mobility transistors (HEMTs). The signals are then routed out of the refrigerator through microwave coaxial cables, the simplest solution for controlling and reading out superconducting devices, but one that insulates heat poorly and takes up a lot of space; this becomes a problem when scaling up to thousands of qubits.

    Researchers in the group of Professor Tobias J. Kippenberg at EPFL’s School of Basic Sciences have now developed a novel approach that uses light to read out superconducting circuits thus overcoming the scaling challenges of quantum systems. The work is published in Nature Electronics.

    The scientists replaced HEMT amplifiers and coaxial cables with a lithium niobate electro-optical phase modulator and optical fibers respectively. Microwave signals from superconducting circuits modulate a laser carrier and encode information on the output light at cryogenic temperatures. Optical fibers are about 100 times better heat isolators than coaxial cables and are 100 times more compact. This enables the engineering of large-scale quantum systems without requiring enormous cryogenic cooling power. In addition, the direct conversion of microwave signals to the optical domain facilitates long-range transfer and networking between quantum systems.
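
    The reason a phase modulator can move microwave information onto light can be seen from generic electro-optics (this is standard modulation theory, not the specifics of the EPFL device): a microwave tone phase-modulates the optical carrier and creates sidebands whose field amplitudes follow Bessel functions of the modulation depth.

```python
# Sideband amplitudes of a phase-modulated optical carrier:
# E(t) = E0 * exp(i*(w_c*t + beta*sin(w_m*t))) = E0 * sum_n J_n(beta) * exp(i*(w_c + n*w_m)*t)
import numpy as np
from scipy.special import jv

beta = 0.3                      # modulation depth set by the microwave signal strength (assumed)
orders = np.arange(-3, 4)
sidebands = jv(orders, beta)    # field amplitude in each optical sideband

for n, a in zip(orders, sidebands):
    print(f"sideband n = {n:+d}: relative amplitude {a:+.4f}, power {a**2:.5f}")
```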

    “We demonstrate a proof-of-principle experiment using a novel optical readout protocol to optically measure a superconducting device at cryogenic temperatures,” says Amir Youssefi, a PhD student working on the project. “It opens up a new avenue to scale future quantum systems.” To verify this approach, the team performed conventional coherent and incoherent spectroscopic measurements on a superconducting electromechanical circuit, which showed perfect agreement between optical and traditional HEMT measurements.

    Although this project used a commercial electro-optical phase modulator, the researchers are currently developing advanced electro-optical devices based on integrated lithium niobate technology to significantly enhance their method’s conversion efficiency and lower noise.

    Funding: Horizon 2020; Swiss National Science Foundation [Schweizerischer Nationalfonds zur Förderung der wissenschaftlichen Forschung] [Fonds national suisse de la recherche scientifique] (CH) (NCCR-QSIT and Sinergia)

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    EPFL bloc

    The Swiss Federal Institute of Technology in Lausanne [EPFL-École polytechnique fédérale de Lausanne](CH) is a research institute and university in Lausanne, Switzerland, that specializes in natural sciences and engineering. It is one of the two Swiss Federal Institutes of Technology, and it has three main missions: education, research and technology transfer.

    The QS World University Rankings ranks EPFL(CH) 14th in the world across all fields in their 2020/2021 ranking, whereas Times Higher Education World University Rankings ranks EPFL(CH) as the world’s 19th best school for Engineering and Technology in 2020.

    EPFL(CH) is located in the French-speaking part of Switzerland; the sister institution in the German-speaking part of Switzerland is the Swiss Federal Institute of Technology ETH Zürich [Eidgenössische Technische Hochschule Zürich)](CH). Associated with several specialized research institutes, the two universities form the Swiss Federal Institutes of Technology Domain (ETH(CH) Domain) which is directly dependent on the Federal Department of Economic Affairs, Education and Research. In connection with research and teaching activities, EPFL(CH) operates a nuclear reactor CROCUS; a Tokamak Fusion reactor; a Blue Gene/Q Supercomputer; and P3 bio-hazard facilities.

    The roots of modern-day EPFL(CH) can be traced back to the foundation of a private school under the name École spéciale de Lausanne in 1853 at the initiative of Lois Rivier, a graduate of the École Centrale Paris (FR), and John Gay, then professor and rector of the Académie de Lausanne. At its inception it had only 11 students, and its offices were located at Rue du Valentin in Lausanne. In 1869, it became the technical department of the public Académie de Lausanne. When the Académie was reorganised and acquired the status of a university in 1890, the technical faculty changed its name to École d’ingénieurs de l’Université de Lausanne. In 1946, it was renamed the École polytechnique de l’Université de Lausanne (EPUL). In 1969, the EPUL was separated from the rest of the University of Lausanne and became a federal institute under its current name. EPFL(CH), like ETH Zürich(CH), is thus directly controlled by the Swiss federal government. In contrast, all other universities in Switzerland are controlled by their respective cantonal governments. Following the nomination of Patrick Aebischer as president in 2000, EPFL(CH) began expanding into the life sciences. It absorbed the Swiss Institute for Experimental Cancer Research (ISREC) in 2008.

    In 1946, there were 360 students. In 1969, EPFL(CH) had 1,400 students and 55 professors. In the past two decades the university has grown rapidly and as of 2012 roughly 14,000 people study or work on campus, about 9,300 of these being Bachelor, Master or PhD students. The environment at modern day EPFL(CH) is highly international with the school attracting students and researchers from all over the world. More than 125 countries are represented on the campus and the university has two official languages, French and English.

    Organization

    EPFL is organised into eight schools, themselves formed of institutes that group research units (laboratories or chairs) around common themes:

    School of Basic Sciences (SB, Jan S. Hesthaven)

    Institute of Mathematics (MATH, Victor Panaretos)
    Institute of Chemical Sciences and Engineering (ISIC, Emsley Lyndon)
    Institute of Physics (IPHYS, Harald Brune)
    European Centre of Atomic and Molecular Computations (CECAM, Ignacio Pagonabarraga Mora)
    Bernoulli Center (CIB, Nicolas Monod)
    Biomedical Imaging Research Center (CIBM, Rolf Gruetter)
    Interdisciplinary Center for Electron Microscopy (CIME, Cécile Hébert)
    Max Planck-EPFL Centre for Molecular Nanosciences and Technology (CMNT, Thomas Rizzo)
    Swiss Plasma Center (SPC, Ambrogio Fasoli)
    Laboratory of Astrophysics (LASTRO, Jean-Paul Kneib)

    School of Engineering (STI, Ali Sayed)

    Institute of Electrical Engineering (IEL, Giovanni De Micheli)
    Institute of Mechanical Engineering (IGM, Thomas Gmür)
    Institute of Materials (IMX, Michaud Véronique)
    Institute of Microengineering (IMT, Olivier Martin)
    Institute of Bioengineering (IBI, Matthias Lütolf)

    School of Architecture, Civil and Environmental Engineering (ENAC, Claudia R. Binder)

    Institute of Architecture (IA, Luca Ortelli)
    Civil Engineering Institute (IIC, Eugen Brühwiler)
    Institute of Urban and Regional Sciences (INTER, Philippe Thalmann)
    Environmental Engineering Institute (IIE, David Andrew Barry)

    School of Computer and Communication Sciences (IC, James Larus)

    Algorithms & Theoretical Computer Science
    Artificial Intelligence & Machine Learning
    Computational Biology
    Computer Architecture & Integrated Systems
    Data Management & Information Retrieval
    Graphics & Vision
    Human-Computer Interaction
    Information & Communication Theory
    Networking
    Programming Languages & Formal Methods
    Security & Cryptography
    Signal & Image Processing
    Systems

    School of Life Sciences (SV, Gisou van der Goot)

    Bachelor-Master Teaching Section in Life Sciences and Technologies (SSV)
    Brain Mind Institute (BMI, Carmen Sandi)
    Institute of Bioengineering (IBI, Melody Swartz)
    Swiss Institute for Experimental Cancer Research (ISREC, Douglas Hanahan)
    Global Health Institute (GHI, Bruno Lemaitre)
    Ten Technology Platforms & Core Facilities (PTECH)
    Center for Phenogenomics (CPG)
    NCCR Synaptic Bases of Mental Diseases (NCCR-SYNAPSY)

    College of Management of Technology (CDM)

    Swiss Finance Institute at EPFL (CDM-SFI, Damir Filipovic)
    Section of Management of Technology and Entrepreneurship (CDM-PMTE, Daniel Kuhn)
    Institute of Technology and Public Policy (CDM-ITPP, Matthias Finger)
    Institute of Management of Technology and Entrepreneurship (CDM-MTEI, Ralf Seifert)
    Section of Financial Engineering (CDM-IF, Julien Hugonnier)

    College of Humanities (CDH, Thomas David)

    Human and social sciences teaching program (CDH-SHS, Thomas David)

    EPFL Middle East (EME, Dr. Franco Vigliotti)

    Section of Energy Management and Sustainability (MES, Prof. Maher Kayal)

    In addition to the eight schools there are seven closely related institutions

    Swiss Cancer Centre
    Center for Biomedical Imaging (CIBM)
    Centre for Advanced Modelling Science (CADMOS)
    École cantonale d’art de Lausanne (ECAL)
    Campus Biotech
    Wyss Center for Bio- and Neuro-engineering
    Swiss National Supercomputing Centre

     
  • richardmitnick 11:17 am on May 9, 2021 Permalink | Reply
    Tags: "NIST Team Directs and Measures Quantum Drum Duet", , , Quantum Computing, ,   

    From National Institute of Standards and Technology (US) : “NIST Team Directs and Measures Quantum Drum Duet” 

    From National Institute of Standards and Technology (US)

    May 06, 2021
    Laura Ost
    laura.ost@nist.gov
    (303) 497-4880

    Like conductors of a spooky symphony, researchers at the National Institute of Standards and Technology (NIST) have “entangled” two small mechanical drums and precisely measured their linked quantum properties. Entangled pairs like this might someday perform computations and transmit data in large-scale quantum networks.

    The NIST team used microwave pulses to entice the two tiny aluminum drums into a quantum version of the Lindy Hop, with one partner bopping in a cool and calm pattern while the other was jiggling a bit more. Researchers analyzed radar-like signals to verify that the two drums’ steps formed an entangled pattern — a duet that would be impossible in the everyday classical world.

    1
    Credit: Juha Juvonen.

    2
    NIST researchers entangled the beats of these two mechanical drums — tiny aluminum membranes each made of about 1 trillion atoms — and precisely measured their linked quantum properties. Entangled pairs like this (as shown in this colorized micrograph), which are massive by quantum standards, might someday perform computations and transmit data in large-scale quantum networks. Credit: J. Teufel/NIST.

    What’s new is not so much the dance itself but the researchers’ ability to measure the drumbeats, rising and falling by just one-quadrillionth of a meter, and verify their fragile entanglement by detecting subtle statistical relationships between their motions.

    The research is described in the May 7 issue of Science.

    “If you analyze the position and momentum data for the two drums independently, they each simply look hot,” NIST physicist John Teufel said. “But looking at them together, we can see that what looks like random motion of one drum is highly correlated with the other, in a way that is only possible through quantum entanglement.”

    Quantum mechanics was originally conceived as the rulebook for light and matter at atomic scales. However, in recent years researchers have shown that the same rules can apply to increasingly larger objects such as the drums. Their back-and-forth motion makes them a type of system known as a mechanical oscillator. Such systems were entangled for the first time at NIST about a decade ago, and in that case the mechanical elements were single atoms.

    Since then, Teufel’s research group has been demonstrating quantum control of drumlike aluminum membranes suspended above sapphire mats. By quantum standards, the NIST drums are massive, 20 micrometers wide by 14 micrometers long and 100 nanometers thick. They each weigh about 70 picograms, which corresponds to about 1 trillion atoms.
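
    Those figures are easy to sanity-check with bulk numbers (assuming the density and molar mass of ordinary aluminum); the estimate lands close to the stated roughly 70 picograms and about a trillion atoms.

```python
# Quick arithmetic check of the drum mass and atom count quoted above,
# assuming bulk aluminum density (~2700 kg/m^3) and molar mass (~26.98 g/mol).
volume = 20e-6 * 14e-6 * 100e-9              # m^3: 20 um x 14 um x 100 nm
mass_kg = 2700 * volume
mass_pg = mass_kg * 1e15                     # kilograms -> picograms
atoms = (mass_kg * 1e3 / 26.98) * 6.022e23   # grams / molar mass * Avogadro

print(f"mass  ~ {mass_pg:.0f} pg")           # ~76 pg
print(f"atoms ~ {atoms:.1e}")                # ~1.7e12
```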

    Entangling massive objects is difficult because they interact strongly with the environment, which can destroy delicate quantum states. Teufel’s group developed new methods to control and measure the motion of two drums simultaneously. The researchers adapted a technique first demonstrated in 2011 for cooling a single drum by switching from steady to pulsed microwave signals to separately optimize the steps of cooling, entangling and measuring the states. To rigorously analyze the entanglement, experimentalists also worked more closely with theorists, an increasingly important alliance in the global effort to build quantum networks.

    The NIST drum set is connected to an electrical circuit and encased in a cryogenically chilled cavity. When a microwave pulse is applied, the electrical system interacts with and controls the activities of the drums, which can sustain quantum states like entanglement for approximately a millisecond, a long time in the quantum world.

    For the experiments, researchers applied two simultaneous microwave pulses to cool the drums, two more simultaneous pulses to entangle the drums, and two final pulses to amplify and record the signals representing the quantum states of the two drums. The states are encoded in a reflected microwave field, similar to radar. Researchers compared the reflections to the original microwave pulse to determine the position and momentum of each drum.
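
    The position-and-momentum readout follows the general logic of I/Q demodulation: the reflected field is compared against a copy of the original tone, and the in-phase and quadrature components track the two quadratures of the drum’s motion. The Python sketch below is a generic illustration with invented parameters, not the team’s actual measurement chain.

        import numpy as np

        rng = np.random.default_rng(1)

        f_sig = 6e9                     # probe-tone frequency in Hz (invented)
        f_samp = 48e9                   # sampling rate in Hz (invented)
        t = np.arange(0, 2e-6, 1 / f_samp)

        # Reflected tone whose amplitude and phase encode the drum quadratures (x, p).
        x_true, p_true = 0.7, -0.3      # arbitrary "drum" quadratures for the demo
        reflected = (x_true * np.cos(2 * np.pi * f_sig * t)
                     + p_true * np.sin(2 * np.pi * f_sig * t)
                     + 0.05 * rng.standard_normal(t.size))   # measurement noise

        # Mix against the reference tone and average (a crude low-pass filter):
        # the I component recovers x, the Q component recovers p.
        I = 2 * np.mean(reflected * np.cos(2 * np.pi * f_sig * t))
        Q = 2 * np.mean(reflected * np.sin(2 * np.pi * f_sig * t))
        print(f"I ~ x: {I:.3f}   Q ~ p: {Q:.3f}")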

    To cool the drums, researchers applied pulses at a frequency below the cavity’s natural resonance. As in the 2011 experiment, the drumbeats up-converted the applied photons to the cavity’s higher resonance frequency, and these photons leaked out of the cavity as it filled up. Each departing photon took with it one mechanical unit of energy — one phonon, or one quantum — from the drum motion. This got rid of most of the heat-related drum motion.
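
    In the standard optomechanics picture (implicit here rather than spelled out in the article), driving the cavity below resonance by the drum frequency selects a beam-splitter-type interaction that swaps drum phonons for cavity photons, which then escape:

        H_{\mathrm{cool}} \approx \hbar g \,(\hat{a}^{\dagger}\hat{b} + \hat{a}\,\hat{b}^{\dagger}),

    where \hat{a} annihilates a cavity photon, \hat{b} annihilates a drum phonon, and g is the drive-enhanced coupling rate; each \hat{a}^{\dagger}\hat{b} event removes one phonon from the drum, and the resulting photon leaks out through the cavity’s output port.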

    To create entanglement, researchers applied microwave pulses in between the frequencies of the two drums, higher than drum 1 and lower than drum 2. These pulses entangled drum 1 phonons with the cavity’s photons, generating correlated photon-phonon pairs. The pulses also cooled drum 2 further, as photons leaving the cavity were replaced with phonons. What was left was mostly pairs of entangled phonons shared between the two drums.
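
    Schematically, and consistent with the description above (this is the textbook picture, not an equation quoted from the paper), a drive placed between the two drum frequencies is blue-detuned with respect to drum 1 and red-detuned with respect to drum 2, so it simultaneously creates photon-phonon pairs with drum 1 and swaps cavity photons into and out of drum 2:

        H_{\mathrm{ent}} \approx \hbar g_1 \,(\hat{a}^{\dagger}\hat{b}_1^{\dagger} + \hat{a}\,\hat{b}_1) + \hbar g_2 \,(\hat{a}^{\dagger}\hat{b}_2 + \hat{a}\,\hat{b}_2^{\dagger}),

    where the first (two-mode-squeezing) term generates correlated photon-phonon pairs with drum 1, and the second (beam-splitter) term both cools drum 2 and maps those correlated cavity photons onto drum 2’s motion, leaving entangled phonon pairs shared between the two drums.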

    To entangle the phonon pairs, the duration of the pulses was crucial. Researchers discovered that these microwave pulses needed to last longer than 4 microseconds, ideally 16.8 microseconds, to strongly entangle the phonons. During this time period the entanglement became stronger and the motion of each drum increased because they were moving in unison, a kind of sympathetic reinforcement, Teufel said.

    Researchers looked for patterns in the returned signals, or radar data. In the classical world the results would be random. Plotting the results on a graph revealed unusual patterns suggesting the drums were entangled. To be certain, the researchers ran the experiment 10,000 times and applied a statistical test to calculate the correlations between various sets of results, such as the positions of the two drums.

    “Roughly speaking, we measured how correlated two variables are — for example, if you measured the position of one drum, how well could you predict the position of the other drum,” Teufel said. “If they have no correlations and they are both perfectly cold, you could only guess the average position of the other drum within an uncertainty of half a quantum of motion. When they are entangled, we can do better, with less uncertainty. Entanglement is the only way this is possible.”
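
    For Gaussian states like these, the improvement Teufel describes can be written as a conditional variance; this is the standard Gaussian conditioning rule, not a formula quoted from the paper. Measuring the position of drum 1 reduces the uncertainty in the position of drum 2 from \mathrm{Var}(x_2) to

        \mathrm{Var}(x_2 \mid x_1) = \mathrm{Var}(x_2) - \frac{\mathrm{Cov}(x_1, x_2)^2}{\mathrm{Var}(x_1)},

    so strong cross-correlations shrink the conditional uncertainty, and entanglement is what lets it fall below the half-quantum level Teufel mentions for uncorrelated, perfectly cold drums.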

    “To verify that entanglement is present, we do a statistical test called an ‘entanglement witness,’” NIST theorist Scott Glancy said. “We observe correlations between the drums’ positions and momentums, and if those correlations are stronger than can be produced by classical physics, we know the drums must have been entangled. The radar signals measure position and momentum simultaneously, but the Heisenberg uncertainty principle says that this can’t be done with perfect accuracy. Therefore, we pay a cost of extra randomness in our measurements. We manage that uncertainty by collecting a large data set and correcting for the uncertainty during our statistical analysis.”
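
    The article does not say which witness the team applied; for Gaussian states a common choice is a Duan-type criterion, which declares entanglement when the summed variances of x_1 - x_2 and p_1 + p_2 fall below the bound any separable state must obey. A minimal sketch, with simulated records standing in for the real data:

        import numpy as np

        rng = np.random.default_rng(2)

        # Simulated quadrature records for the two drums, in units where the
        # vacuum variance of each quadrature is 0.5.  Real records would come
        # from the demodulated microwave reflections; these are invented.
        r, trials = 0.8, 10_000
        sq = 0.5 * np.exp(-2 * r)                 # squeezed variance of the joint quadratures
        u = rng.normal(0, np.sqrt(sq), trials)    # stands in for (x1 - x2)/sqrt(2)
        v = rng.normal(0, np.sqrt(sq), trials)    # stands in for (p1 + p2)/sqrt(2)

        # Duan-type witness: any separable (unentangled) state satisfies
        #   Var(x1 - x2) + Var(p1 + p2) >= 2   in these units.
        witness = 2 * u.var() + 2 * v.var()       # Var(x1 - x2) = 2 * Var(u), etc.
        verdict = "entangled" if witness < 2 else "inconclusive"
        print(f"witness = {witness:.2f}  ->  {verdict}")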

    Highly entangled, massive quantum systems like this might serve as long-lived nodes of quantum networks. The high-efficiency radar measurements used in this work could be helpful in applications such as quantum teleportation — data transfer without a physical link — or swapping entanglement between nodes of a quantum network, because these applications require decisions to be made based on measurements of entanglement outcomes. Entangled systems could also be used in fundamental tests of quantum mechanics and force sensing beyond standard quantum limits.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    NIST Campus, Gaithersburg, MD, USA

    National Institute of Standards and Technology (US)‘s Mission, Vision, Core Competencies, and Core Values

    Mission

    To promote U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology in ways that enhance economic security and improve our quality of life.

    NIST’s vision

    NIST will be the world’s leader in creating critical measurement solutions and promoting equitable standards. Our efforts stimulate innovation, foster industrial competitiveness, and improve the quality of life.

    NIST’s core competencies

    Measurement science
    Rigorous traceability
    Development and use of standards

    NIST’s core values

    NIST is an organization with strong values, reflected both in our history and our current work. NIST leadership and staff will uphold these values to ensure a high performing environment that is safe and respectful of all.

    Perseverance: We take the long view, planning the future with scientific knowledge and imagination to ensure continued impact and relevance for our stakeholders.
    Integrity: We are ethical, honest, independent, and provide an objective perspective.
    Inclusivity: We work collaboratively to harness the diversity of people and ideas, both inside and outside of NIST, to attain the best solutions to multidisciplinary challenges.
    Excellence: We apply rigor and critical thinking to achieve world-class results and continuous improvement in everything we do.

    Background

    The Articles of Confederation, ratified by the colonies in 1781, contained the clause, “The United States in Congress assembled shall also have the sole and exclusive right and power of regulating the alloy and value of coin struck by their own authority, or by that of the respective states—fixing the standards of weights and measures throughout the United States”. Article 1, section 8, of the Constitution of the United States (1789), transferred this power to Congress; “The Congress shall have power…To coin money, regulate the value thereof, and of foreign coin, and fix the standard of weights and measures”.

    In January 1790, President George Washington, in his first annual message to Congress, stated that, “Uniformity in the currency, weights, and measures of the United States is an object of great importance, and will, I am persuaded, be duly attended to”, and ordered Secretary of State Thomas Jefferson to prepare a plan for Establishing Uniformity in the Coinage, Weights, and Measures of the United States, afterwards referred to as the Jefferson report. On October 25, 1791, Washington appealed a third time to Congress, “A uniformity of the weights and measures of the country is among the important objects submitted to you by the Constitution and if it can be derived from a standard at once invariable and universal, must be no less honorable to the public council than conducive to the public convenience”, but it was not until 1838 that a uniform set of standards was worked out. In 1821, John Quincy Adams had declared “Weights and measures may be ranked among the necessities of life to every individual of human society”.

    From 1830 until 1901, the role of overseeing weights and measures was carried out by the Office of Standard Weights and Measures, which was part of the U.S. Coast and Geodetic Survey in the Department of the Treasury.

    Bureau of Standards

    In 1901, in response to a bill proposed by Congressman James H. Southard (R-Ohio), the National Bureau of Standards was founded with the mandate to provide standard weights and measures and to serve as the national physical laboratory for the United States. (Southard had previously sponsored a bill for metric conversion of the United States.)

    President Theodore Roosevelt appointed Samuel W. Stratton as the first director. The budget for the first year of operation was $40,000. The Bureau took custody of the copies of the kilogram and meter bars that were the standards for US measures, and set up a program to provide metrology services for United States scientific and commercial users. A laboratory site was constructed in Washington, D.C. (US), and instruments were acquired from the national physical laboratories of Europe. In addition to weights and measures, the Bureau developed instruments for electrical units and for the measurement of light. In 1905, a meeting was called that would be the first National Conference on Weights and Measures.

    Initially conceived as purely a metrology agency, the Bureau of Standards was directed by Herbert Hoover to set up divisions to develop commercial standards for materials and products. Some of these standards were for products intended for government use, but product standards also affected private-sector consumption. Quality standards were developed for products including some types of clothing; automobile brake systems and headlamps; antifreeze; and electrical safety. During World War I, the Bureau worked on multiple problems related to war production, even operating its own facility to produce optical glass when European supplies were cut off. Between the wars, Harry Diamond of the Bureau developed a blind-approach radio aircraft landing system. During World War II, military research and development was carried out, including development of radio propagation forecast methods; the proximity fuze and the standardized airframe used originally for Project Pigeon; and, shortly afterwards, the autonomously radar-guided Bat anti-ship guided bomb and the Kingfisher family of torpedo-carrying missiles.

    In 1948, financed by the United States Air Force, the Bureau began design and construction of SEAC, the Standards Eastern Automatic Computer. The computer went into operation in May 1950 using a combination of vacuum tubes and solid-state diode logic. Around the same time, the Standards Western Automatic Computer was built at the Los Angeles office of the NBS by Harry Huskey and used for research there. A mobile version, DYSEAC, was built for the Signal Corps in 1954.

    Due to a changing mission, the “National Bureau of Standards” became the “National Institute of Standards and Technology (US)” in 1988.

    Following September 11, 2001, NIST conducted the official investigation into the collapse of the World Trade Center buildings.

    Organization

    NIST is headquartered in Gaithersburg, Maryland, and operates a facility in Boulder, Colorado, which was dedicated by President Eisenhower in 1954. NIST’s activities are organized into laboratory programs and extramural programs. Effective October 1, 2010, NIST was realigned by reducing the number of NIST laboratory units from ten to six. NIST Laboratories include:

    Communications Technology Laboratory (CTL)
    Engineering Laboratory (EL)
    Information Technology Laboratory (ITL)
    Center for Neutron Research (NCNR)
    Material Measurement Laboratory (MML)
    Physical Measurement Laboratory (PML)

    Extramural programs include:

    Hollings Manufacturing Extension Partnership (MEP), a nationwide network of centers to assist small and mid-sized manufacturers to create and retain jobs, improve efficiencies, and minimize waste through process improvements and to increase market penetration with innovation and growth strategies;
    Technology Innovation Program (TIP), a grant program where NIST and industry partners cost share the early-stage development of innovative but high-risk technologies;
    Baldrige Performance Excellence Program, which administers the Malcolm Baldrige National Quality Award, the nation’s highest award for performance and business excellence.

    NIST’s Boulder laboratories are best known for NIST-F1, the atomic clock that serves as the source of the nation’s official time. From its measurement of the natural resonance frequency of cesium, which defines the second, NIST broadcasts time signals via longwave radio station WWVB near Fort Collins in Colorado, and shortwave radio stations WWV and WWVH, located near Fort Collins and Kekaha in Hawai’i, respectively.

    NIST also operates a neutron science user facility: the NIST Center for Neutron Research (NCNR). The NCNR provides scientists access to a variety of neutron scattering instruments which they use in many research fields (materials science; fuel cells; biotechnology etc.).

    The SURF III Synchrotron Ultraviolet Radiation Facility is a source of synchrotron radiation in continuous operation since 1961. SURF III now serves as the US national standard for source-based radiometry throughout the generalized optical spectrum. All NASA-borne extreme-ultraviolet observation instruments have been calibrated at SURF since the 1970s, and SURF is used for measurement and characterization of systems for extreme ultraviolet lithography.

    The Center for Nanoscale Science and Technology (CNST) performs research in nanotechnology, both through internal research efforts and by running a user-accessible cleanroom nanomanufacturing facility. This “NanoFab” is equipped with tools for lithographic patterning and imaging (e.g., electron microscopes and atomic force microscopes).

    Committees

    NIST has seven standing committees:

    Technical Guidelines Development Committee (TGDC)
    Advisory Committee on Earthquake Hazards Reduction (ACEHR)
    National Construction Safety Team Advisory Committee (NCST Advisory Committee)
    Information Security and Privacy Advisory Board (ISPAB)
    Visiting Committee on Advanced Technology (VCAT)
    Board of Overseers for the Malcolm Baldrige National Quality Award (MBNQA Board of Overseers)
    Manufacturing Extension Partnership National Advisory Board (MEPNAB)

    Measurements and standards

    As part of its mission, NIST supplies industry, academia, government, and other users with over 1,300 Standard Reference Materials (SRMs). These artifacts are certified as having specific characteristics or component content, and they are used as calibration standards for measuring equipment and procedures, as quality control benchmarks for industrial processes, and as experimental control samples.

    Handbook 44

    NIST publishes Handbook 44 each year after the annual meeting of the National Conference on Weights and Measures (NCWM). Each edition is developed through cooperation of the Committee on Specifications and Tolerances of the NCWM and the Weights and Measures Division (WMD) of NIST. The purpose of the book is the partial fulfillment of the statutory responsibility for “cooperation with the states in securing uniformity of weights and measures laws and methods of inspection”.

    NIST has been publishing various forms of what is now the Handbook 44 since 1918 and began publication under the current name in 1949. The 2010 edition conforms to the concept of the primary use of the SI (metric) measurements recommended by the Omnibus Foreign Trade and Competitiveness Act of 1988.

     