Tagged: Quantum Mechanics

  • richardmitnick 7:33 am on May 12, 2021 Permalink | Reply
    Tags: Quantum Mechanics, “Light meets superconducting circuits”, Microwave superconducting circuit platforms, Using light to read out superconducting circuits to overcome the scaling challenges of quantum systems, HEMTs (low-noise high-electron-mobility transistors), Replacing HEMT amplifiers and coaxial cables with a lithium niobate electro-optical phase modulator and optical fibers, Optical fibers are about 100 times better heat isolators than coaxial cables and 100 times more compact, Engineering large-scale quantum systems without enormous cryogenic cooling power

    From Swiss Federal Institute of Technology in Lausanne [EPFL-École Polytechnique Fédérale de Lausanne](CH): “Light meets superconducting circuits” 


    Amir Youssefi
    Nik Papageorgiou

    EPFL researchers have developed a light-based approach to read out superconducting circuits, overcoming the scaling-up limitations of quantum computing systems.


    In the last few years, several technology companies, including Google, Microsoft, and IBM, have invested massively in quantum computing systems based on microwave superconducting circuit platforms, in an effort to scale them up from small research-oriented systems to commercial computing platforms. But fulfilling the potential of quantum computers requires a significant increase in the number of qubits, the building blocks of quantum computers, which can store and manipulate quantum information.

    Quantum signals, however, can be contaminated by thermal noise generated by the movement of electrons. To prevent this, superconducting quantum systems must operate at ultra-low temperatures, less than 20 millikelvin, which can be achieved with cryogenic helium-dilution refrigerators.

    The output microwave signals from such systems are amplified by low-noise high-electron-mobility transistors (HEMTs) at low temperatures. The signals are then routed out of the refrigerator through microwave coaxial cables, which are the easiest way to control and read out superconducting devices but are poor heat isolators and take up a lot of space; this becomes a problem when scaling up to thousands of qubits.

    Researchers in the group of Professor Tobias J. Kippenberg at EPFL’s School of Basic Sciences have now developed a novel approach that uses light to read out superconducting circuits, thus overcoming the scaling challenges of quantum systems. The work is published in Nature Electronics.

    The scientists replaced the HEMT amplifiers and coaxial cables with a lithium niobate electro-optical phase modulator and optical fibers, respectively. Microwave signals from superconducting circuits modulate a laser carrier and encode information on the output light at cryogenic temperatures. Optical fibers are about 100 times better heat isolators than coaxial cables and are 100 times more compact. This enables the engineering of large-scale quantum systems without requiring enormous cryogenic cooling power. In addition, the direct conversion of microwave signals to the optical domain facilitates long-range transfer and networking between quantum systems.
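    The encoding step described above can be illustrated with a toy numerical model. The sketch below is illustrative only; the frequencies, modulation depth, and units are arbitrary assumptions, not the experiment's parameters. It phase-modulates a complex optical carrier with a microwave tone and shows that the microwave information appears as optical sidebands offset from the carrier by exactly the microwave frequency, with amplitudes set by Bessel functions.

```python
import numpy as np

# Toy model: optical carrier phase-modulated by a microwave tone.
# All frequencies are in arbitrary units chosen to align with FFT bins.
N = 4096                      # number of time samples over one unit interval
t = np.arange(N) / N          # normalized time axis
f_carrier = 200               # "optical" carrier frequency (arbitrary units)
f_microwave = 20              # "microwave" modulation frequency
beta = 0.5                    # modulation depth (radians)

# Electric field with the microwave signal encoded in the optical phase
field = np.exp(1j * (2 * np.pi * f_carrier * t
                     + beta * np.sin(2 * np.pi * f_microwave * t)))

# Optical spectrum: energy appears at f_carrier + n * f_microwave,
# with amplitude J_n(beta) (Jacobi-Anger expansion)
spectrum = np.abs(np.fft.fft(field)) / N
carrier_amp = spectrum[f_carrier]                   # ~ J0(0.5) ≈ 0.938
upper_sideband = spectrum[f_carrier + f_microwave]  # ~ J1(0.5) ≈ 0.242
lower_sideband = spectrum[f_carrier - f_microwave]  # ~ J1(0.5) ≈ 0.242
```

    Reading out either sideband recovers the microwave-domain signal in the optical domain, which is what allows an optical fiber to replace the coaxial cable.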

    “We demonstrate a proof-of-principle experiment using a novel optical readout protocol to optically measure a superconducting device at cryogenic temperatures,” says Amir Youssefi, a PhD student working on the project. “It opens up a new avenue to scale future quantum systems.” To verify this approach, the team performed conventional coherent and incoherent spectroscopic measurements on a superconducting electromechanical circuit, which showed perfect agreement between optical and traditional HEMT measurements.

    Although this project used a commercial electro-optical phase modulator, the researchers are currently developing advanced electro-optical devices based on integrated lithium niobate technology to significantly enhance their method’s conversion efficiency and lower noise.

    Funding: Horizon 2020; Swiss National Science Foundation [Schweizerischer Nationalfonds zur Förderung der wissenschaftlichen Forschung] [Fonds national suisse de la recherche scientifique] (CH) (NCCR-QSIT and Sinergia)

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition


    The Swiss Federal Institute of Technology in Lausanne [EPFL-École polytechnique fédérale de Lausanne](CH) is a research institute and university in Lausanne, Switzerland, that specializes in natural sciences and engineering. It is one of the two Swiss Federal Institutes of Technology, and it has three main missions: education, research and technology transfer.

    The QS World University Rankings ranks EPFL(CH) 14th in the world across all fields in their 2020/2021 ranking, whereas Times Higher Education World University Rankings ranks EPFL(CH) as the world’s 19th best school for Engineering and Technology in 2020.

    EPFL(CH) is located in the French-speaking part of Switzerland; the sister institution in the German-speaking part of Switzerland is the Swiss Federal Institute of Technology ETH Zürich [Eidgenössische Technische Hochschule Zürich](CH). Associated with several specialized research institutes, the two universities form the Swiss Federal Institutes of Technology Domain (ETH(CH) Domain), which is directly dependent on the Federal Department of Economic Affairs, Education and Research. In connection with research and teaching activities, EPFL(CH) operates a nuclear reactor (CROCUS); a tokamak fusion reactor; a Blue Gene/Q supercomputer; and P3 bio-hazard facilities.

    The roots of modern-day EPFL(CH) can be traced back to the foundation of a private school under the name École spéciale de Lausanne in 1853 at the initiative of Lois Rivier, a graduate of the École Centrale Paris (FR), and John Gay, the then professor and rector of the Académie de Lausanne. At its inception it had only 11 students, and its offices were located at Rue du Valentin in Lausanne. In 1869, it became the technical department of the public Académie de Lausanne. When the Académie was reorganised and acquired the status of a university in 1890, the technical faculty changed its name to École d’ingénieurs de l’Université de Lausanne. In 1946, it was renamed the École polytechnique de l’Université de Lausanne (EPUL). In 1969, the EPUL was separated from the rest of the University of Lausanne and became a federal institute under its current name. EPFL(CH), like ETH Zürich(CH), is thus directly controlled by the Swiss federal government. In contrast, all other universities in Switzerland are controlled by their respective cantonal governments. Following the nomination of Patrick Aebischer as president in 2000, EPFL(CH) began to expand into the life sciences. It absorbed the Swiss Institute for Experimental Cancer Research (ISREC) in 2008.

    In 1946, there were 360 students. In 1969, EPFL(CH) had 1,400 students and 55 professors. In the past two decades the university has grown rapidly and as of 2012 roughly 14,000 people study or work on campus, about 9,300 of these being Bachelor, Master or PhD students. The environment at modern day EPFL(CH) is highly international with the school attracting students and researchers from all over the world. More than 125 countries are represented on the campus and the university has two official languages, French and English.


    EPFL is organised into eight schools, themselves formed of institutes that group research units (laboratories or chairs) around common themes:

    School of Basic Sciences (SB, Jan S. Hesthaven)

    Institute of Mathematics (MATH, Victor Panaretos)
    Institute of Chemical Sciences and Engineering (ISIC, Lyndon Emsley)
    Institute of Physics (IPHYS, Harald Brune)
    European Centre of Atomic and Molecular Computations (CECAM, Ignacio Pagonabarraga Mora)
    Bernoulli Center (CIB, Nicolas Monod)
    Biomedical Imaging Research Center (CIBM, Rolf Gruetter)
    Interdisciplinary Center for Electron Microscopy (CIME, Cécile Hébert)
    Max Planck-EPFL Centre for Molecular Nanosciences and Technology (CMNT, Thomas Rizzo)
    Swiss Plasma Center (SPC, Ambrogio Fasoli)
    Laboratory of Astrophysics (LASTRO, Jean-Paul Kneib)

    School of Engineering (STI, Ali Sayed)

    Institute of Electrical Engineering (IEL, Giovanni De Micheli)
    Institute of Mechanical Engineering (IGM, Thomas Gmür)
    Institute of Materials (IMX, Véronique Michaud)
    Institute of Microengineering (IMT, Olivier Martin)
    Institute of Bioengineering (IBI, Matthias Lütolf)

    School of Architecture, Civil and Environmental Engineering (ENAC, Claudia R. Binder)

    Institute of Architecture (IA, Luca Ortelli)
    Civil Engineering Institute (IIC, Eugen Brühwiler)
    Institute of Urban and Regional Sciences (INTER, Philippe Thalmann)
    Environmental Engineering Institute (IIE, David Andrew Barry)

    School of Computer and Communication Sciences (IC, James Larus)

    Algorithms & Theoretical Computer Science
    Artificial Intelligence & Machine Learning
    Computational Biology
    Computer Architecture & Integrated Systems
    Data Management & Information Retrieval
    Graphics & Vision
    Human-Computer Interaction
    Information & Communication Theory
    Programming Languages & Formal Methods
    Security & Cryptography
    Signal & Image Processing

    School of Life Sciences (SV, Gisou van der Goot)

    Bachelor-Master Teaching Section in Life Sciences and Technologies (SSV)
    Brain Mind Institute (BMI, Carmen Sandi)
    Institute of Bioengineering (IBI, Melody Swartz)
    Swiss Institute for Experimental Cancer Research (ISREC, Douglas Hanahan)
    Global Health Institute (GHI, Bruno Lemaitre)
    Ten Technology Platforms & Core Facilities (PTECH)
    Center for Phenogenomics (CPG)
    NCCR Synaptic Bases of Mental Diseases (NCCR-SYNAPSY)

    College of Management of Technology (CDM)

    Swiss Finance Institute at EPFL (CDM-SFI, Damir Filipovic)
    Section of Management of Technology and Entrepreneurship (CDM-PMTE, Daniel Kuhn)
    Institute of Technology and Public Policy (CDM-ITPP, Matthias Finger)
    Institute of Management of Technology and Entrepreneurship (CDM-MTEI, Ralf Seifert)
    Section of Financial Engineering (CDM-IF, Julien Hugonnier)

    College of Humanities (CDH, Thomas David)

    Human and social sciences teaching program (CDH-SHS, Thomas David)

    EPFL Middle East (EME, Dr. Franco Vigliotti)

    Section of Energy Management and Sustainability (MES, Prof. Maher Kayal)

    In addition to the eight schools there are seven closely related institutions:

    Swiss Cancer Centre
    Center for Biomedical Imaging (CIBM)
    Centre for Advanced Modelling Science (CADMOS)
    École cantonale d’art de Lausanne (ECAL)
    Campus Biotech
    Wyss Center for Bio- and Neuro-engineering
    Swiss National Supercomputing Centre

  • richardmitnick 5:01 pm on May 10, 2021 Permalink | Reply
    Tags: Quantum Mechanics, “JQI Researchers Generate Tunable Twin Particles of Light”, A new technique sees two distinct particles of light enter a chip and two identical twin particles of light leave it, Identical twins might seem “indistinguishable” but in the quantum world the word takes on a new level of meaning, Quantum interference needed for quantum computers, Inside a resonator a photon from each of the beams spontaneously combines, The combination of being indistinguishable and entangled is essential for many potential uses of photons in quantum technologies

    From Joint Quantum Institute (US): “JQI Researchers Generate Tunable Twin Particles of Light” 



    University of Maryland (US)

    May 10, 2021

    Story by Bailey Bedford

    Mohammad Hafezi

    A new technique sees two distinct particles of light enter a chip and two identical twin particles of light leave it. The image artistically combines the journey of twin particles of light along the outer edge of a checkerboard of rings with the abstract shape of its topological underpinnings. Credit: Kaveh Haerian.

    Identical twins might seem “indistinguishable” but in the quantum world the word takes on a new level of meaning. While identical twins share many traits, the universe treats two indistinguishable quantum particles as intrinsically interchangeable. This opens the door for indistinguishable particles to interact in unique ways—such as in quantum interference—that are needed for quantum computers.

    While generating a crowd of photons—particles of light—is as easy as flipping a light switch, it’s trickier to make a pair of indistinguishable photons. And it takes yet more work to endow that pair with a quantum mechanical link known as entanglement. In a paper published May 10, 2021 in the journal Nature Photonics, JQI researchers and their colleagues describe a new way to make entangled twin particles of light and to tune their properties using a method conveniently housed on a chip, a potential boon for quantum technologies that require a reliable source of well-tailored photon pairs.

    The researchers, led by JQI fellow Mohammad Hafezi, designed the method to harness the advantages of topological physics. Topological physics explores previously untapped physical phenomena using the mathematical field of topology, which describes common traits shared by different shapes. (Where geometry concerns angles and sizes, topology is more about holes and punctures—all-encompassing characteristics that don’t depend on local details.) Researchers have made several major discoveries by applying this approach, which describes how quantum particles—like electrons or, in this case, photons—can move in a particular material or device by analyzing its broad characteristics through the lens of topological features that correspond to abstract shapes (such as the donut in the image above). The topological phenomena that have been revealed are directly tied to the general nature of the material; they must exist even in the presence of material impurities that would upset the smooth movement of photons or electrons in most other circumstances.

    Their new method builds on previous work, including the generation of a series of distinguishable photon pairs. In both the new and old experiments, the team created a checkerboard of rings on a silicon chip. Each ring is a resonator that acts like a tiny race track designed to keep certain photons traveling round and round for a long time. But since individual photons in a resonator live by quantum rules, the racecars (photons) can sometimes just pass unchanged through an intervening wall and proceed to speed along a neighboring track.

    The repeating grid of rings mimics the repeating grid of atoms that electrons travel through in a solid, allowing the researchers to design situations for light that mirror well known topological effects in electronics. By creating and exploring different topological environments, the team has developed new ways to manipulate photons.

    “It’s exactly the same mathematics that applies to electrons and photons,” says Sunil Mittal, a JQI postdoctoral researcher and the first author of the paper. “So you get more or less all the same topological features. All the mathematics that you do with electrons, you can simply carry to photonic systems.”

    In the current work, they recreated an electronic phenomenon called the anomalous quantum Hall effect that opens up paths for electrons on the edge of a material. These edge paths, which are called topological edge states, exist because of topological effects, and they can reliably transport electrons while leaving routes through the interior easily disrupted and impassable. Achieving this particular topological effect requires that localized magnetic fields push on electrons and that the total magnetic field when averaged over larger sections of the material cancels out to zero.

    But photons lack the electrical charge that makes electrons susceptible to magnetic shoves, so the team had to recreate the magnetic push in some other way. To achieve this, they laid out the tracks so that it is easier for the photons to quantum mechanically jump between rings in certain directions. This simulates the missing magnetic influence and creates an environment with a photonic version of the anomalous quantum Hall effect and its stable edge paths.
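    The simulated magnetic push can be made concrete with a small calculation (a generic tight-binding toy model, not the paper's actual device layout). Direction-dependent hopping phases around a closed loop of sites act like a magnetic flux through the loop: the flux is the phase accumulated going around the plaquette, and it is unchanged by any local redefinition of the site phases, just as a real magnetic flux would be.

```python
import numpy as np

# Four sites around one plaquette; the hop j -> j+1 carries a phase.
# These phase values are arbitrary illustrative choices.
phases = np.array([0.3, -0.1, 0.7, 0.2])   # phase of each directed hop
hops = np.exp(1j * phases)                 # complex hopping amplitudes

# Synthetic flux = total phase picked up going around the closed loop
flux = np.angle(np.prod(hops))

# A local "gauge transformation" multiplies each site j by exp(i*theta_j);
# the hop j -> j+1 becomes exp(i*theta_j) * t * exp(-i*theta_{j+1}).
theta = np.array([0.9, -0.4, 1.3, 0.05])
gauged_hops = np.exp(1j * theta) * hops * np.exp(-1j * np.roll(theta, -1))

# The loop flux is gauge invariant: the site phases cancel in pairs
gauged_flux = np.angle(np.prod(gauged_hops))
```

    Because only the loop flux matters, a lattice of ring resonators with engineered direction-dependent jump phases can reproduce the physics of electrons in a magnetic field even though photons carry no charge.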

    For this experiment, the team sent two laser beams of two different colors (frequencies) of light into this carefully designed environment. Inside a resonator, a photon from each of the beams spontaneously combined. The researchers then observed how the photons re-formed into two indistinguishable photons, traveled through the topological edge paths and were eventually ejected from the chip.

    Since the new photons formed inside a resonator ring, they took on the traits (polarization and spatial mode) of the photons that the resonators are designed to hold. The only trait left that the team needed to worry about was their frequencies.

    The researchers matched the frequencies of the photons by selecting the appropriate input frequencies for the two lasers based on how they would combine inside the silicon resonators. With the appropriate theoretical understanding of the experiment, they can produce photons that are quantum mechanically indistinguishable.

    The nature of the formation of the new photons provides the desired quantum characteristics. The photons are quantum mechanically entangled due to the way they were generated as pairs; their combined properties—like the total energy of the pair—are constrained by what the original photons brought into the mix, so observing the property of one instantly reveals the equivalent fact about the other. Until they are observed—that is, detected by the researchers—they don’t exist as two individual particles with distinct quantum states for their frequencies. Rather, they are identical mixtures of possible frequency states called a superposition. The two photons being indistinguishable means they can quantum mechanically interfere with each other.
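    The role of indistinguishability in two-photon interference can be sketched with a textbook beamsplitter calculation (a standard Hong-Ou-Mandel-style toy model, not the specific on-chip process used in this paper). When two identical photons meet on a balanced beamsplitter, the two amplitudes that would put one photon in each output cancel, so coincidences vanish; distinguishable photons show no such cancellation.

```python
# 50:50 beamsplitter amplitudes: transmission t, reflection r (with the
# conventional 90-degree phase shift on reflection)
t = 1 / 2 ** 0.5
r = 1j / 2 ** 0.5

# One photon enters each input port. "Coincidence" = one photon per output.
# Indistinguishable photons: the two paths (both transmit / both reflect)
# are added as amplitudes before squaring.
p_coinc_indistinguishable = abs(t * t + r * r) ** 2   # amplitudes cancel to 0

# Distinguishable photons: the two paths are added as probabilities.
p_coinc_distinguishable = abs(t * t) ** 2 + abs(r * r) ** 2   # 1/4 + 1/4
```

    The vanishing coincidence rate for identical photons is exactly the kind of quantum interference that indistinguishable, entangled pairs make possible.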

    The resulting combination of being indistinguishable and entangled is essential for many potential uses of photons in quantum technologies. An additional lucky side effect of the researchers’ topological approach is that it gives them a greater ability to adjust the frequencies of the twin photons based on the frequencies they pump into the chip and how well the frequencies match with the topological states on the edge of the device.

    “This is not the only way to generate entangled photon pairs—there are many other devices—but they are not tunable,” Mittal says. “So once you fabricate your device, it is what it is. If you want to change the bandwidth of the photons or do something else, it’s not possible. But in our case, we don’t have to design a new device. We showed that, just by tuning the pump frequencies, we could tune the interference properties. So, that was very exciting.”

    The combination of tunability and robustness against manufacturing imperfections makes the devices an appealing option for practical applications, the authors say. The team plans to continue exploring the potential of this technique and related topological devices, as well as possible ways to further improve the devices, such as using other materials to make them.

    In addition to Hafezi and Mittal, former JQI graduate student Venkata Vikram Orre and former JQI postdoctoral researcher and current assistant professor at the University of Illinois Urbana-Champaign (US) Elizabeth Goldschmidt were also co-authors of the paper.

    See the full article here.



    JQI supported by Gordon and Betty Moore Foundation

    We are on the verge of a new technological revolution as the strange and unique properties of quantum physics become relevant and exploitable in the context of information science and technology.

    The Joint Quantum Institute (JQI) (US) is pursuing that goal through the work of leading quantum scientists from the Department of Physics of the University of Maryland (UMD (US)), the National Institute of Standards and Technology (NIST) and the Laboratory for Physical Sciences (LPS). Each institution brings to JQI major experimental and theoretical research programs that are dedicated to the goals of controlling and exploiting quantum systems.

    U Maryland Campus

    Driven by the pursuit of excellence, the University of Maryland (US) has enjoyed a remarkable rise in accomplishment and reputation over the past two decades. By any measure, Maryland is now one of the nation’s preeminent public research universities and on a path to become one of the world’s best. To fulfill this promise, we must capitalize on our momentum, fully exploit our competitive advantages, and pursue ambitious goals with great discipline and entrepreneurial spirit. This promise is within reach. This strategic plan is our working agenda.

    The plan is comprehensive, bold, and action oriented. It sets forth a vision of the University as an institution unmatched in its capacity to attract talent, address the most important issues of our time, and produce the leaders of tomorrow. The plan will guide the investment of our human and material resources as we strengthen our undergraduate and graduate programs and expand research, outreach and partnerships, become a truly international center, and enhance our surrounding community.

    Our success will benefit Maryland in the near and long term, strengthen the State’s competitive capacity in a challenging and changing environment and enrich the economic, social and cultural life of the region. We will be a catalyst for progress, the State’s most valuable asset, and an indispensable contributor to the nation’s well-being. Achieving the goals of Transforming Maryland requires broad-based and sustained support from our extended community. We ask our stakeholders to join with us to make the University an institution of world-class quality with world-wide reach and unparalleled impact as it serves the people and the state of Maryland.

  • richardmitnick 11:23 am on May 10, 2021 Permalink | Reply
    Tags: Quantum Mechanics, KTH Royal Institute of Technology [Kungliga Tekniska högskolan] (SE), “New light emitters developed for quantum circuits”, Harnessing optical photons to integrate quantum computing seamlessly with fiber-optic networks

    From KTH Royal Institute of Technology [Kungliga Tekniska högskolan] (SE): “New light emitters developed for quantum circuits” 


    May 10, 2021
    David Callahan

    A close-up look at the integrated chip that emits photons. Courtesy of Ali Elshaari.

    The promise of a quantum internet depends on the complexities of harnessing light to transmit quantum information over fiber optic networks. A potential step forward was reported today by researchers at KTH who developed integrated chips that can generate light particles on demand and without the need for extreme refrigeration.

    Quantum computing today relies on states of matter, that is, on electrons that carry qubits of information to perform multiple calculations simultaneously in a fraction of the time it takes with classical computing.

    KTH Professor Val Zwiller says that in order to integrate quantum computing seamlessly with fiber-optic networks—which are used by the internet today—a more promising approach would be to harness optical photons.

    “The photonic approach offers a natural link between communication and computation,” he says. “That’s important, since the end goal is to transmit the processed quantum information using light.”

    Deterministic rather than random

    But in order for photons to deliver qubits on-demand in quantum systems, they need to be emitted in a deterministic, rather than probabilistic, fashion. This can be accomplished at extremely low temperatures in artificial atoms, but today the research group at KTH reported a way to make it work in optical integrated circuits—at room temperature [Advanced Quantum Technologies].

    The new method enables photon emitters to be precisely positioned in integrated optical circuits that resemble copper wires for electricity, except that they carry light instead, says Associate Professor Ali Elshaari.

    The researchers harnessed the single-photon-emitting properties of hexagonal boron nitride (hBN), a layered material. hBN is a compound commonly used in ceramics, alloys, resins, plastics and rubbers to give them self-lubricating properties. They integrated the material with silicon nitride waveguides to direct the emitted photons.

    Quantum circuits with light are operated either at cryogenic temperatures, about 4 kelvin above absolute zero, using atom-like single-photon sources, or at room temperature using random single-photon sources, Elshaari says. By contrast, the technique developed at KTH enables optical circuits with on-demand emission of light particles at room temperature.

    “In existing optical circuits operating at room temperature, you never know when the single photon is generated unless you do a heralding measurement,” Elshaari says. “We realized a deterministic process that precisely positions light-particle emitters operating at room temperature in an integrated photonic circuit.”

    The researchers reported coupling of an hBN single-photon emitter to silicon nitride waveguides, and they developed a method to image the quantum emitters. Then, in a hybrid approach, the team built the photonic circuits around the quantum sources’ locations using a series of steps involving electron-beam lithography and etching, while still preserving the high quality of the quantum light.

    The achievement opens a path to hybrid integration, that is, incorporating atom-like single-photon emitters into photonic platforms that cannot emit light efficiently on demand.

    See the full article here.



    KTH Royal Institute of Technology [Kungliga Tekniska högskolan](SE) is a public research university in Stockholm, Sweden. KTH conducts research and education within engineering and technology, and is Sweden’s largest technical university. Currently, KTH consists of five schools with four campuses in and around Stockholm.

    KTH was established in 1827 as Teknologiska Institutet (Institute of Technology), and had its roots in Mekaniska skolan (School of Mechanics), established in 1798 in Stockholm. But the origin of KTH dates back to the predecessor of Mekaniska skolan, the Laboratorium Mechanicum, which was established in 1697 by Swedish scientist and innovator Christopher Polhem. Laboratorium Mechanicum combined education and technology, a laboratory and an exhibition space for innovations. In 1877 KTH received its current name, Kungliga Tekniska högskolan (KTH Royal Institute of Technology). It is ranked in the top 100 in the world among all universities in the 2020 QS World University Rankings.

  • richardmitnick 11:17 am on May 9, 2021 Permalink | Reply
    Tags: Quantum Mechanics, “NIST Team Directs and Measures Quantum Drum Duet”

    From National Institute of Standards and Technology (US): “NIST Team Directs and Measures Quantum Drum Duet”


    May 06, 2021
    Laura Ost
    (303) 497-4880

    Like conductors of a spooky symphony, researchers at the National Institute of Standards and Technology (NIST) have “entangled” two small mechanical drums and precisely measured their linked quantum properties. Entangled pairs like this might someday perform computations and transmit data in large-scale quantum networks.

    The NIST team used microwave pulses to entice the two tiny aluminum drums into a quantum version of the Lindy Hop, with one partner bopping in a cool and calm pattern while the other was jiggling a bit more. Researchers analyzed radar-like signals to verify that the two drums’ steps formed an entangled pattern — a duet that would be impossible in the everyday classical world.

    Credit: Juha Juvonen.

    NIST researchers entangled the beats of these two mechanical drums — tiny aluminum membranes each made of about 1 trillion atoms — and precisely measured their linked quantum properties. Entangled pairs like this (as shown in this colorized micrograph), which are massive by quantum standards, might someday perform computations and transmit data in large-scale quantum networks. Credit: J. Teufel/NIST.

    What’s new is not so much the dance itself but the researchers’ ability to measure the drumbeats, rising and falling by just one-quadrillionth of a meter, and verify their fragile entanglement by detecting subtle statistical relationships between their motions.

    The research is described in the May 7 issue of Science.

    “If you analyze the position and momentum data for the two drums independently, they each simply look hot,” NIST physicist John Teufel said. “But looking at them together, we can see that what looks like random motion of one drum is highly correlated with the other, in a way that is only possible through quantum entanglement.”

    Quantum mechanics was originally conceived as the rulebook for light and matter at atomic scales. However, in recent years researchers have shown that the same rules can apply to increasingly larger objects such as the drums. Their back-and-forth motion makes them a type of system known as a mechanical oscillator. Such systems were entangled for the first time at NIST about a decade ago, and in that case the mechanical elements were single atoms.

    Since then, Teufel’s research group has been demonstrating quantum control of drumlike aluminum membranes suspended above sapphire mats. By quantum standards, the NIST drums are massive, 20 micrometers wide by 14 micrometers long and 100 nanometers thick. They each weigh about 70 picograms, which corresponds to about 1 trillion atoms.
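    The quoted figures are easy to sanity-check: dividing the stated drum mass by the mass of one aluminum atom (standard atomic mass about 26.98 u) gives on the order of a trillion atoms, as the article says. A quick check using only the numbers stated above plus standard constants:

```python
# Sanity check: how many aluminum atoms are in a 70-picogram drum?
drum_mass_g = 70e-12               # 70 picograms, as stated in the text
aluminum_mass_u = 26.98            # standard atomic mass of aluminum
atomic_mass_unit_g = 1.66054e-24   # 1 atomic mass unit in grams

atoms = drum_mass_g / (aluminum_mass_u * atomic_mass_unit_g)
# atoms ≈ 1.6e12, i.e. "about 1 trillion atoms" as described
```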

    Entangling massive objects is difficult because they interact strongly with the environment, which can destroy delicate quantum states. Teufel’s group developed new methods to control and measure the motion of two drums simultaneously. The researchers adapted a technique first demonstrated in 2011 for cooling a single drum by switching from steady to pulsed microwave signals to separately optimize the steps of cooling, entangling and measuring the states. To rigorously analyze the entanglement, experimentalists also worked more closely with theorists, an increasingly important alliance in the global effort to build quantum networks.

    The NIST drum set is connected to an electrical circuit and encased in a cryogenically chilled cavity. When a microwave pulse is applied, the electrical system interacts with and controls the activities of the drums, which can sustain quantum states like entanglement for approximately a millisecond, a long time in the quantum world.

    For the experiments, researchers applied two simultaneous microwave pulses to cool the drums, two more simultaneous pulses to entangle the drums, and two final pulses to amplify and record the signals representing the quantum states of the two drums. The states are encoded in a reflected microwave field, similar to radar. Researchers compared the reflections to the original microwave pulse to determine the position and momentum of each drum.

    To cool the drums, researchers applied pulses at a frequency below the cavity’s natural resonance. As in the 2011 experiment, the drum motion up-converted the applied photons to the cavity’s higher resonance frequency, and these photons leaked out of the cavity as it filled up. Each departing photon took with it one mechanical unit of energy, one phonon or one quantum, from the drum motion. This got rid of most of the heat-related drum motion.
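    In the language of cavity optomechanics, this cooling step is a red-detuned (lower-sideband) drive. A hedged sketch with generic symbols (not notation from the NIST paper):

```latex
% Drive tuned below the cavity resonance by the drum frequency:
\omega_{\mathrm{drive}} = \omega_{\mathrm{cavity}} - \omega_{\mathrm{drum}}
% Energy balance for one cooling event: a drive photon plus one drum
% phonon combine into a single cavity photon, which then leaks out:
\hbar\omega_{\mathrm{drive}} + \hbar\omega_{\mathrm{drum}} = \hbar\omega_{\mathrm{cavity}}
```

    Each such event removes one phonon, which is why the photons leaking from the cavity carry away the drums’ thermal motion.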

    To create entanglement, researchers applied microwave pulses in between the frequencies of the two drums, higher than that of drum 1 and lower than that of drum 2. These pulses entangled drum 1’s phonons with the cavity’s photons, generating correlated photon-phonon pairs. The pulses also cooled drum 2 further, swapping its phonons for cavity photons that then leaked out. What was left was mostly pairs of entangled phonons shared between the two drums.

    The duration of the pulses proved crucial for entangling the phonon pairs. Researchers discovered that these microwave pulses needed to last longer than 4 microseconds, ideally 16.8 microseconds, to strongly entangle the phonons. During this time period the entanglement became stronger and the motion of each drum increased because they were moving in unison, a kind of sympathetic reinforcement, Teufel said.

    Researchers looked for patterns in the returned signals, or radar data. In the classical world the results would be random. Plotting the results on a graph revealed unusual patterns suggesting the drums were entangled. To be certain, the researchers ran the experiment 10,000 times and applied a statistical test to calculate the correlations between various sets of results, such as the positions of the two drums.

    “Roughly speaking, we measured how correlated two variables are — for example, if you measured the position of one drum, how well could you predict the position of the other drum,” Teufel said. “If they have no correlations and they are both perfectly cold, you could only guess the average position of the other drum within an uncertainty of half a quantum of motion. When they are entangled, we can do better, with less uncertainty. Entanglement is the only way this is possible.”

    “To verify that entanglement is present, we do a statistical test called an ‘entanglement witness,’” NIST theorist Scott Glancy said. “We observe correlations between the drums’ positions and momentums, and if those correlations are stronger than can be produced by classical physics, we know the drums must have been entangled. The radar signals measure position and momentum simultaneously, but the Heisenberg uncertainty principle says that this can’t be done with perfect accuracy. Therefore, we pay a cost of extra randomness in our measurements. We manage that uncertainty by collecting a large data set and correcting for the uncertainty during our statistical analysis.”
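    The prediction logic Teufel and Glancy describe can be illustrated with ordinary classical statistics. The sketch below is illustrative only: simulated correlated noise stands in for the drum records, and this is not NIST’s data or their witness test. It shows the core idea that correlations reduce the uncertainty of predicting one record from the other; genuine entanglement requires correlations stronger than any such classical model can produce.

```python
import random

random.seed(0)
n = 100_000

# Two "drum position" records sharing a common component, mimicking
# correlated measurement outcomes (illustrative classical stand-in).
common = [random.gauss(0, 1) for _ in range(n)]
drum1 = [c + random.gauss(0, 0.3) for c in common]
drum2 = [c + random.gauss(0, 0.3) for c in common]

def mean(xs):
    return sum(xs) / len(xs)

def std(xs):
    m = mean(xs)
    return (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5

# Uncertainty when guessing drum 2 with no knowledge of drum 1:
marginal = std(drum2)

# Residual uncertainty after using drum 1 as a linear predictor:
m1, m2 = mean(drum1), mean(drum2)
cov = sum((x - m1) * (y - m2) for x, y in zip(drum1, drum2)) / n
slope = cov / std(drum1) ** 2
conditional = std([y - slope * x for x, y in zip(drum1, drum2)])

print(conditional < marginal)  # True: correlations sharpen the prediction
```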

    Highly entangled, massive quantum systems like this might serve as long-lived nodes of quantum networks. The high-efficiency radar measurements used in this work could be helpful in applications such as quantum teleportation — data transfer without a physical link — or swapping entanglement between nodes of a quantum network, because these applications require decisions to be made based on measurements of entanglement outcomes. Entangled systems could also be used in fundamental tests of quantum mechanics and force sensing beyond standard quantum limits.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    NIST Campus, Gaithersburg, MD, USA

    National Institute of Standards and Technology (US)‘s Mission, Vision, Core Competencies, and Core Values


    To promote U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology in ways that enhance economic security and improve our quality of life.

    NIST’s vision

    NIST will be the world’s leader in creating critical measurement solutions and promoting equitable standards. Our efforts stimulate innovation, foster industrial competitiveness, and improve the quality of life.

    NIST’s core competencies

    Measurement science
    Rigorous traceability
    Development and use of standards

    NIST’s core values

    NIST is an organization with strong values, reflected both in our history and our current work. NIST leadership and staff will uphold these values to ensure a high performing environment that is safe and respectful of all.

    Perseverance: We take the long view, planning the future with scientific knowledge and imagination to ensure continued impact and relevance for our stakeholders.
    Integrity: We are ethical, honest, independent, and provide an objective perspective.
    Inclusivity: We work collaboratively to harness the diversity of people and ideas, both inside and outside of NIST, to attain the best solutions to multidisciplinary challenges.
    Excellence: We apply rigor and critical thinking to achieve world-class results and continuous improvement in everything we do.


    The Articles of Confederation, ratified by the colonies in 1781, contained the clause, “The United States in Congress assembled shall also have the sole and exclusive right and power of regulating the alloy and value of coin struck by their own authority, or by that of the respective states—fixing the standards of weights and measures throughout the United States”. Article 1, section 8, of the Constitution of the United States (1789), transferred this power to Congress; “The Congress shall have power…To coin money, regulate the value thereof, and of foreign coin, and fix the standard of weights and measures”.

    In January 1790, President George Washington, in his first annual message to Congress, stated that, “Uniformity in the currency, weights, and measures of the United States is an object of great importance, and will, I am persuaded, be duly attended to”, and ordered Secretary of State Thomas Jefferson to prepare a plan for Establishing Uniformity in the Coinage, Weights, and Measures of the United States, afterwards referred to as the Jefferson report. On October 25, 1791, Washington appealed a third time to Congress, “A uniformity of the weights and measures of the country is among the important objects submitted to you by the Constitution and if it can be derived from a standard at once invariable and universal, must be no less honorable to the public council than conducive to the public convenience”, but it was not until 1838 that a uniform set of standards was worked out. In 1821, John Quincy Adams had declared “Weights and measures may be ranked among the necessities of life to every individual of human society”.

    From 1830 until 1901, the role of overseeing weights and measures was carried out by the Office of Standard Weights and Measures, which was part of the U.S. Coast and Geodetic Survey in the Department of the Treasury.

    Bureau of Standards

    In 1901, in response to a bill proposed by Congressman James H. Southard (R-Ohio), the National Bureau of Standards was founded with the mandate to provide standard weights and measures and to serve as the national physical laboratory for the United States. (Southard had previously sponsored a bill for metric conversion of the United States.)

    President Theodore Roosevelt appointed Samuel W. Stratton as the first director. The budget for the first year of operation was $40,000. The Bureau took custody of the copies of the kilogram and meter bars that were the standards for US measures, and set up a program to provide metrology services for United States scientific and commercial users. A laboratory site was constructed in Washington DC (US) and instruments were acquired from the national physical laboratories of Europe. In addition to weights and measures the Bureau developed instruments for electrical units and for measurement of light. In 1905 a meeting was called that would be the first National Conference on Weights and Measures.

    Initially conceived as purely a metrology agency, the Bureau of Standards was directed by Herbert Hoover to set up divisions to develop commercial standards for materials and products. Some of these standards were for products intended for government use, but product standards also affected private-sector consumption. Quality standards were developed for products including some types of clothing; automobile brake systems and headlamps; antifreeze; and electrical safety. During World War I, the Bureau worked on multiple problems related to war production, even operating its own facility to produce optical glass when European supplies were cut off. Between the wars, Harry Diamond of the Bureau developed a blind-approach radio aircraft landing system. During World War II, military research and development was carried out, including development of radio propagation forecast methods; the proximity fuze and the standardized airframe used originally for Project Pigeon; and shortly afterwards the autonomously radar-guided Bat anti-ship guided bomb and the Kingfisher family of torpedo-carrying missiles.

    In 1948, financed by the United States Air Force, the Bureau began design and construction of SEAC: the Standards Eastern Automatic Computer. The computer went into operation in May 1950 using a combination of vacuum tubes and solid-state diode logic. About the same time the Standards Western Automatic Computer was built at the Los Angeles office of the NBS by Harry Huskey and used for research there. A mobile version, DYSEAC, was built for the Signal Corps in 1954.

    Due to a changing mission, the “National Bureau of Standards” became the “National Institute of Standards and Technology (US)” in 1988.

    Following September 11, 2001, NIST conducted the official investigation into the collapse of the World Trade Center buildings.


    NIST is headquartered in Gaithersburg, Maryland, and operates a facility in Boulder, Colorado, which was dedicated by President Eisenhower in 1954. NIST’s activities are organized into laboratory programs and extramural programs. Effective October 1, 2010, NIST was realigned by reducing the number of NIST laboratory units from ten to six. NIST Laboratories include:

    Communications Technology Laboratory (CTL)
    Engineering Laboratory (EL)
    Information Technology Laboratory (ITL)
    Center for Neutron Research (NCNR)
    Material Measurement Laboratory (MML)
    Physical Measurement Laboratory (PML)

    Extramural programs include:

    Hollings Manufacturing Extension Partnership (MEP), a nationwide network of centers to assist small and mid-sized manufacturers to create and retain jobs, improve efficiencies, and minimize waste through process improvements and to increase market penetration with innovation and growth strategies;
    Technology Innovation Program (TIP), a grant program where NIST and industry partners cost share the early-stage development of innovative but high-risk technologies;
    Baldrige Performance Excellence Program, which administers the Malcolm Baldrige National Quality Award, the nation’s highest award for performance and business excellence.

    NIST’s Boulder laboratories are best known for NIST‑F1, the atomic clock that serves as the source of the nation’s official time. From its measurement of the natural resonance frequency of cesium—which defines the second—NIST broadcasts time signals via longwave radio station WWVB near Fort Collins in Colorado, and shortwave radio stations WWV and WWVH, located near Fort Collins and Kekaha in Hawai’i, respectively.

    NIST also operates a neutron science user facility: the NIST Center for Neutron Research (NCNR). The NCNR provides scientists access to a variety of neutron scattering instruments which they use in many research fields (materials science; fuel cells; biotechnology etc.).

    The SURF III Synchrotron Ultraviolet Radiation Facility is a source of synchrotron radiation in continuous operation since 1961. SURF III now serves as the US national standard for source-based radiometry throughout the generalized optical spectrum. All NASA-borne extreme-ultraviolet observation instruments have been calibrated at SURF since the 1970s, and SURF is used for measurement and characterization of systems for extreme ultraviolet lithography.

    The Center for Nanoscale Science and Technology (CNST) performs research in nanotechnology, both through internal research efforts and by running a user-accessible cleanroom nanomanufacturing facility. This “NanoFab” is equipped with tools for lithographic patterning and imaging (e.g., electron microscopes and atomic force microscopes).


    NIST has seven standing committees:

    Technical Guidelines Development Committee (TGDC)
    Advisory Committee on Earthquake Hazards Reduction (ACEHR)
    National Construction Safety Team Advisory Committee (NCST Advisory Committee)
    Information Security and Privacy Advisory Board (ISPAB)
    Visiting Committee on Advanced Technology (VCAT)
    Board of Overseers for the Malcolm Baldrige National Quality Award (MBNQA Board of Overseers)
    Manufacturing Extension Partnership National Advisory Board (MEPNAB)

    Measurements and standards

    As part of its mission, NIST supplies industry, academia, government, and other users with over 1,300 Standard Reference Materials (SRMs). These artifacts are certified as having specific characteristics or component content, and are used as calibration standards for measuring equipment and procedures, quality control benchmarks for industrial processes, and experimental control samples.

    Handbook 44

    NIST publishes the Handbook 44 each year after the annual meeting of the National Conference on Weights and Measures (NCWM). Each edition is developed through cooperation of the Committee on Specifications and Tolerances of the NCWM and the Weights and Measures Division (WMD) of the NIST. The purpose of the book is a partial fulfillment of the statutory responsibility for “cooperation with the states in securing uniformity of weights and measures laws and methods of inspection”.

    NIST has been publishing various forms of what is now the Handbook 44 since 1918 and began publication under the current name in 1949. The 2010 edition conforms to the concept of the primary use of the SI (metric) measurements recommended by the Omnibus Foreign Trade and Competitiveness Act of 1988.

  • richardmitnick 2:00 pm on April 18, 2021 Permalink | Reply
    Tags: , DOE's Fermi National Accelerator Laboratory(US) Muon g-2 studio., , Don Lincoln-DOE's Fermi National Accelerator Laboratory (US), , , , , Quantum Mechanics   

    From Forbes Magazine : “Recent Reports Of Overturned Scientific Theory Are Premature” 

    From Forbes Magazine

    Apr 17, 2021


    On April 7, 2021, the world’s scientific community watched with rapt attention as scientists based at Fermi National Accelerator Laboratory (US) presented a research result that the science media reported heavily. A new measurement disagreed in a very significant way with predictions. This disagreement could have been strong evidence that scientists would have to rethink their theory. That’s an exciting prospect, if it’s true. However, a theoretical paper [Leading hadronic contribution to the muon magnetic moment from lattice QCD] was released the same day as the experimental result that puts the entire situation in turmoil.

    The new experimental measurement involved the magnetic properties of subatomic particles called muons. Muons are essentially heavy cousins of the electron. Like the electron, the muon has both electric charge and spin, and any spinning electric charge creates a magnet. It is the strength of this magnet that researchers measured.

    It is possible for scientists to calculate the relationship between the strength of the magnet and the quantity describing the amount of spin. Ignoring some constants, the ratio of magnetic strength to amount of spin is called “g.” Using the quantum theory of the 1930s, it is easy to show that for electrons (and muons) g is exactly equal to two (g = 2).


    Measurements in 1947 [Physical Review Journals Archive] found that this prediction wasn’t quite right. The measured value of g was closer to 2.00238, or about 0.1% higher. This discrepancy could have been simply a measurement error, but it turned out that the difference was real. Shortly after the measurement, a physicist by the name of Julian Schwinger used a more advanced form of quantum mechanics and found that the earlier prediction was incomplete and the correct value for g was indeed 2.00238. Schwinger shared the 1965 Nobel Prize in Physics with Richard Feynman and Sin-Itiro Tomonaga for developing this more advanced form of quantum mechanics.
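    In modern notation, the quantity at stake is the anomalous moment a = (g − 2)/2, and Schwinger’s leading correction is a single term (using α ≈ 1/137.036, a value not quoted in the article):

```latex
a \;\equiv\; \frac{g-2}{2}, \qquad
a_{\text{Schwinger}} \;=\; \frac{\alpha}{2\pi} \;\approx\; 0.00116
\quad\Longrightarrow\quad g \;\approx\; 2\,(1 + 0.00116) \;=\; 2.00232
```

    The one-loop term alone gives g ≈ 2.00232, consistent with the 1947 measurement at its precision; the remaining digits of today’s value come from higher-order corrections.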

    This more advanced form of quantum mechanics considered the effect of a charged particle on the space surrounding it. As one gets close to a charged particle, the electric field gets stronger and stronger. This strengthened field is accompanied by energy. According to Einstein’s theory of relativity, energy and mass are equivalent, so the energy of the electric field can temporarily convert into a pair of particles, one matter and one antimatter. These two particles quickly convert back to energy, and the process repeats itself. In fact, there is so much energy in the electric field near, for example, an electron, that at any given moment many pairs of matter and antimatter particles exist simultaneously.

    Quantum Foam

    A principle called the Heisenberg Uncertainty Principle applies here. This quantum principle says that pairs of matter and antimatter particles can appear, but only for a short time. Furthermore, the more massive the particles are, the harder it is for them to appear, and they live for a shorter amount of time.

    Because electrons are the lightest of the charged subatomic particles, they appear most often (along with their antimatter counterparts, called positrons). Thus, surrounding every electron is a cloud of energy from the electric field, and a second cloud of electrons and positrons flickering in and out of existence.

    Those clouds are the reason that the g factor for electrons or muons isn’t exactly 2. The electron or muon interacts with the cloud and this enhances the particle’s magnetic properties.

    So that’s the big idea. In the following decades, scientists tried to measure the magnetic properties of both electrons and muons more accurately. Some researchers have focused on measuring the magnetic properties of muons. The first experiment attempting to do this was performed in 1959 at the European Organization for Nuclear Research (Organisation européenne pour la recherche nucléaire)(CH) [CERN] laboratory in Europe. Because researchers were more interested in the new quantum corrections than in the 1930s prediction, they subtracted off the “2” from the 1930s, and only looked at the excess. Hence this form of experiment is now called the “g – 2” experiment.

    The early experiment measuring the magnetic properties of the muon was not terribly precise, but the situation has improved over the years. In 2006, researchers at the DOE’s Brookhaven National Laboratory (US) measured an extremely precise value for the magnetic properties of the muon.

    They measured 2.0023318418, with an uncertainty of 0.0000000012. This is an impressive measurement by any standard. (The measurement numbers can be found at this URL (page 715).)

    The theoretical calculation for the magnetic properties of the muon is similarly impressive. A commonly accepted value for the calculation is 2.00233183620, with an uncertainty of 0.00000000086. The data and prediction agree digit for digit to nine places.

    Two measurements (red and blue) of the magnetic properties of the muon can be statistically combined into an experimental measurement (pink). This can be compared to a theoretical prediction (green), and prediction and measurement don’t agree. DOE’s Fermi National Accelerator Laboratory(US).


    Such good agreement should be applauded, but the interesting feature is in a slight remaining disagreement. Scientists strip off all of the numbers that agree and remake the comparison. In this case, the theoretical number is 362.0 ± 8.6 and the experimental number is 418 ± 12. The two disagree by 56 with an uncertainty of 14.8.

    When one compares two independently generated numbers, one expects some disagreement, but the disagreement should be about the same size as the uncertainty. Here, the disagreement is 3.8 times the uncertainty. That’s weird, and it could mean that a discovery has been made. Or it could mean that one of the two numbers is simply wrong. Which is it?

    To test the experimental result, another measurement was made. In April of 2021, researchers at Fermilab, America’s flagship particle physics laboratory, repeated the Brookhaven measurement and reported a number that agreed with it. When they combine their data and the Brookhaven data, they find a result of 2.00233184122 ± 0.00000000082. Stripped of the numbers that agree between data and theory, the current state of the art is:

    Theoretical prediction: 362.0 ± 8.6

    Experimental measurement: 412.2 ± 8.2
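    Combining independent measurements, as done here for the Brookhaven and Fermilab results, is conventionally an inverse-variance weighted average. The sketch below uses the Brookhaven number from the article together with an invented second measurement, since the Fermilab-only value is not quoted in this piece, so the output is illustrative only:

```python
def combine(measurements):
    """Inverse-variance weighted mean of (value, sigma) pairs."""
    weights = [1 / sigma ** 2 for _, sigma in measurements]
    mean = sum(v * w for (v, _), w in zip(measurements, weights)) / sum(weights)
    sigma = (1 / sum(weights)) ** 0.5
    return mean, sigma

# Brookhaven 418 +/- 12 (from the article) combined with a
# hypothetical second measurement of 408 +/- 11 (NOT the real
# Fermilab-only number, which the article does not quote).
mean, sigma = combine([(418, 12), (408, 11)])
print(round(mean, 1), round(sigma, 1))  # 412.6 8.1
```

    Note how the combined uncertainty is smaller than either input, which is the whole point of combining independent data sets.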

    This disagreement is substantial, and many have reported that this is good evidence that current theory will need to be revised to accommodate the measurement.
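    The quoted significances can be checked directly: the difference of two independent numbers, divided by their uncertainties added in quadrature. All inputs below are the values given in the article.

```python
import math

def significance(value_a, sigma_a, value_b, sigma_b):
    # difference in units of the combined (quadrature) uncertainty
    return abs(value_a - value_b) / math.hypot(sigma_a, sigma_b)

# Brookhaven vs. the older theory: 418 +/- 12 against 362.0 +/- 8.6
print(round(significance(418, 12, 362.0, 8.6), 1))     # 3.8

# Combined Brookhaven + Fermilab vs. theory: 412.2 +/- 8.2 against 362.0 +/- 8.6
print(round(significance(412.2, 8.2, 362.0, 8.6), 1))  # 4.2
```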

    However, this conclusion might be premature. On the same day that the experimental result was released, another theoretical estimate was published that disagrees with the earlier one. Furthermore, the new theoretical estimate is in agreement with the experimental measurement.

    Two theoretical calculations are compared to a measurement (pink). The old calculation disagrees with the measurement, but the new lattice QCD calculation agrees rather well. The difference between the two predictions means any claims for a discovery are premature. Adapted from Science Magazine.

    How the theory is done

    Theoretical particle physics calculations are difficult to do. In fact, scientists don’t have the mathematical tools required to solve many problems exactly. Instead, they replace the actual problem with an approximation and solve the approximation.

    The way this is done for the magnetic properties of the muon is to look at the cloud of particles surrounding the muon and ask which of them are responsible for the largest effect. Scientists calculate the contribution of those particles, then move to the next most important contributors and repeat the process. Some of the contributions are relatively easy to calculate, but some are not.

    While the particles surrounding the muon are often electrons and their antimatter counterparts (positrons), some of the particles in the cloud are quarks, which are particles normally found inside protons and neutrons. Quarks are heavier than electrons, and they also experience the strong nuclear force. This means that the quarks not only interact with the muon; they also interact with other quarks in the cloud. This makes it difficult to calculate their effect on the magnetic properties of the muon.

    So historically, scientists have used other measurements to estimate the quarks’ contribution to the muon’s magnetism. With this technique, they came up with the discrepancy between the prediction and measurement.

    However, a new technique has been employed to predict the contribution from quarks. This technique is called “lattice QCD,” where QCD (quantum chromodynamics) is the conventional theory of strong nuclear force interactions. Lattice QCD is an interesting technique in which scientists set up a four-dimensional grid of spacetime points and calculate the effect of the strong force on that grid. Lattice QCD is a brute-force method, and it has been successful in the past. But this is the first full attempt to employ the technique for the magnetic properties of muons.
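    Schematically, a lattice QCD calculation evaluates expectation values of observables as a Euclidean path integral over gauge-field configurations U on a discrete grid of spacing a, sampled by Monte Carlo. A generic sketch in standard notation (not the specifics of the muon calculation):

```latex
\langle \mathcal{O} \rangle \;=\; \frac{1}{Z} \int \mathcal{D}U \;\mathcal{O}[U]\, e^{-S[U]},
\qquad
Z \;=\; \int \mathcal{D}U \; e^{-S[U]}
% Evaluated on a finite grid of spacing a, then extrapolated to the
% continuum limit a -> 0 and to infinite volume.
```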

    This new lattice QCD calculation differs from the earlier theoretical prediction. Indeed, it is much closer to the experimental result.

    So where does this leave us? When the Fermilab results were released, it appeared that the measurement and prediction disagreed substantially, suggesting that perhaps we needed to modify our theory to make it agree with data. However, now we have the unsettling situation that perhaps the theory wasn’t right. Maybe the new lattice QCD calculation is correct. In that case, there is no discrepancy between data and prediction.

    I think that the bottom line is that the entire situation is uncertain and it is too soon to draw any conclusion. The lattice QCD calculation is certainly interesting, but it’s new and also not all lattice QCD calculations agree. And the Fermilab version of the experiment measuring the magnetic properties of the muon is just getting started. They have reported a mere 6% of the total data they expect to eventually record and analyze.

    Precision measurements of the magnetic properties of muons have the potential to rewrite physics. But that’s only true if the measurement and predictions are both accurate and precise, and we’re not really ready to conclude that either is settled. It appears that the experimental measurement is pretty solid, although researchers are constantly looking for overlooked flaws. And the theory side is still a bit murky, with a lot of work required to understand the details of the lattice QCD calculation.

    I think it’s safe to say that we are still many years from resolving this question. This is, without a doubt, an unsatisfying state of affairs, but that’s science on the frontier of knowledge for you. We waited nearly two decades to get an improved measurement of the magnetic properties of muons. We can wait a few more years while scientists work hard to figure it all out.

    See the full article here.



  • richardmitnick 9:25 pm on March 30, 2021 Permalink | Reply
    Tags: "Molecules in Flat Lands- an Entanglement Paradise", Another benefit of using molecules in quantum experiments is that molecules also have long-range dipolar interactions-They can interact at a distance., , , Exciting new physics emerges due to dipolar interactions in such pancake shaped arrays., Molecules are very appealing for quantum simulation; quantum information; and precision measurements., , , Quantum Mechanics, The reason is that molecules have a large number of internal degrees of freedom that can be a useful resource for quantum sensing and fundamental physics tests., The scientists have been compressing molecular gas into a stack of pancake shaped arrays.,   

    From University of Colorado Boulder: “Molecules in Flat Lands- an Entanglement Paradise” 

    U Colorado

    From University of Colorado Boulder

    03/18/2021 [Just now in social media.]
    Kenna Castleberry, Science Communicator, JILA – Exploring the Frontiers of Physics, a joint institute of U Colorado Boulder and NIST

    Credit: Steven Burrows/The Rey Lab

    Within the realm of quantum mechanics, the generation of quantum entanglement remains one of the most challenging goals. Entanglement, simply put, is when the quantum state of each particle or a group of particles is not independent of the quantum states of other particles or groups, even over long distances. Entangled particles have always fascinated physicists, as measuring one entangled particle can result in a change in another entangled particle, famously dismissed as “spooky action at a distance” by Einstein. By now, physicists understand this strange effect and how to make use of it, for example to increase the sensitivity of measurements. However, entangled states are very fragile, as they can be easily disrupted by decoherence. Researchers have already created entangled states in atoms, photons, electrons and ions, but only recently have studies begun to explore entanglement in gases of polar molecules.

    “Molecules are very appealing for quantum simulation, quantum information, and precision measurements,” explained Dr. Ana Maria Rey, a University of Colorado Boulder adjoint professor of physics and JILA Fellow. The reason is that molecules have a large number of internal degrees of freedom that can be a useful resource for quantum sensing and fundamental physics tests. Another benefit of using molecules in quantum experiments is that they have long-range dipolar interactions: in contrast to atoms, which have to bump into each other to interact, molecules can interact at a distance. “Molecules offer really great advantages compared to atoms, but at the same time, they are really hard to cool down. In fact, cooling molecules to quantum degeneracy (the condition reached when they are cold enough for quantum effects to dominate) has been one of the most sought-after outstanding goals for many years. The progress has been very slow, but it’s happening now.”

    In 2019, JILA Fellow and University of Colorado Boulder adjoint professor Jun Ye finally achieved this important milestone. Ye’s lab managed to cool molecules consisting of one rubidium and one potassium atom to quantum degeneracy and observe their quantum nature. More recently, he has been compressing this molecular gas into a stack of pancake-shaped arrays. The work by the Rey and Ye groups investigates the exciting new physics that emerges due to dipolar interactions in such pancake-shaped arrays.

    The Importance of Pancake Geometry

    Chemical reactions are one of the most detrimental enemies of cooling molecules. A few years ago, the Ye lab was able to avoid chemical reactions, while still allowing molecules to interact with each other via dipolar interactions, by loading the molecules into a 3D lattice. A 3D lattice can be imagined as a perfect crystal of light, in which molecules are pinned at individual lattice sites without moving. The molecules then interact via dipolar interactions in the same way that magnets interact: when they are placed side by side they repel, and when they are placed head to tail they attract. In a 3D lattice, molecules experience both attractive and repulsive interactions, and as a consequence the interactions between molecules cancel each other out on average. Moreover, in the 3D lattice experiment the molecular filling fraction was very low, which is to say that the molecules were mostly quite far apart and interacted only very weakly.
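    The side-by-side versus head-to-tail behavior follows from the standard dipole–dipole interaction (textbook form, not taken from the paper); with θ the angle between the dipoles’ alignment axis and the line joining them:

```latex
V_{\mathrm{dd}}(r,\theta) \;=\; \frac{C_{\mathrm{dd}}}{4\pi}\,
  \frac{1 - 3\cos^{2}\theta}{r^{3}}
% head to tail (theta = 0):    1 - 3 = -2  => attractive
% side by side (theta = 90°):  1 - 0 = +1  => repulsive
```

    In a 2D pancake with all dipoles pinned perpendicular to the plane, every pair sits at θ = 90°, which is why the interactions are uniformly repulsive and no longer average out.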

    In a recent experiment, however, the Ye group was able to increase the density by compressing a 3D quantum degenerate gas into a few pancakes, each one with a flat 2D shape. Within a pancake, the Ye group found it is possible to suppress undesirable chemical reactions and in addition make dipolar interactions stronger. This is because in a 2D configuration all molecules repel and the interactions do not average out. The exciting observation made by the investigators is that the strong dipolar interactions in the pancake can also make the gas robust to undesirable dephasing effects and chemical reactions. In studying this shape, Bilitewski stated: “Conceptually, and this is at the heart of this work, the interactions between the molecules depend on the quantum states they are in, and thus on this confinement. So, you first have to figure out the interactions in this new geometry. It turns out these actually have very beneficial properties for generating the collective dynamics we are after.” But the even better news is that the interactions not only protect the state by forcing the molecular dipoles to be all aligned, but also naturally create entanglement. In Bilitewski’s words: “The benefit to this collective synchronization is that the entanglement we generate becomes robust to certain effects that would usually destroy it.” Such entangled arrays of molecules could have applications for future measurements of various quantities, such as electric fields, with sensitivity enhanced by the entanglement.

    The work done by the Rey group illustrates the importance of geometrical effects in dipolar gases and the exciting many-body phenomena yet to be explored once molecules are brought to quantum degeneracy. In theorizing about the importance of this 2D shape, Rey said: “thanks to the amazing work done by Thomas Bilitewski, we have been able to model their quantum dynamics and show it should be possible to entangle them, he computed all the integrals needed to write an effective model, solved the equations of motion and showed everything can be made to work out to generate entanglement through flip-flop processes induced by dipolar interactions.”

    The production of ultracold molecular gases in controllable geometries hints at new discoveries and predictions within the field of quantum mechanics. “This observation was a demonstration that molecules can explore quantum magnetism,” Rey added. “In other words, the molecules can behave as quantum magnets and emulate the behavior of electrons in solids, for example. In our recent work, we have made a step forward toward this direction.” The proposal put forth by the Rey and Ye groups is only the beginning of all the great science yet to be studied with entangled arrays of molecules. According to Bilitewski: “this is all really exciting in the sense that we are exploring a novel regime that has only now become available in the lab.”

    Science paper:
    Dynamical Generation of Spin Squeezing in Ultracold Dipolar Molecules
    Physical Review Letters

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    U Colorado Campus

    The University of Colorado Boulder (US), the flagship university of the state of Colorado, was founded in 1876, five months before Colorado became a state. It is a dynamic community of scholars and learners situated on one of the most spectacular college campuses in the country, and is classified as an R1 university, meaning that it engages in a very high level of research activity. As one of 34 U.S. public institutions belonging to the prestigious Association of American Universities (AAU), a selective group of major research universities in North America, and the only member in the Rocky Mountain region, we have a proud tradition of academic excellence, with five Nobel laureates and more than 50 members of prestigious academic academies.

    CU-Boulder has blossomed in size and quality since we opened our doors in 1877 – attracting superb faculty, staff, and students and building strong programs in the sciences, engineering, business, law, arts, humanities, education, music, and many other disciplines.

    Today, with our sights set on becoming the standard for the great comprehensive public research universities of the new century, we strive to serve the people of Colorado and to engage with the world through excellence in our teaching, research, creative work, and service.

    In 2015, the university comprised nine colleges and schools, offered over 150 academic programs, and enrolled almost 17,000 students. Five Nobel Laureates, nine MacArthur Fellows, and 20 astronauts have been affiliated with CU Boulder as students, researchers, or faculty members in its history. In 2010, the university received nearly $454 million in sponsored research to fund programs like the Laboratory for Atmospheric and Space Physics and JILA. CU Boulder has been called a Public Ivy, a group of publicly funded universities considered as providing a quality of education comparable to that of the Ivy League.

    The Colorado Buffaloes compete in 17 varsity sports and are members of the NCAA Division I Pac-12 Conference. The Buffaloes have won 28 national championships: 20 in skiing, seven total in men’s and women’s cross country, and one in football. The university has produced a total of ten Olympic medalists. Approximately 900 students participate in 34 intercollegiate club sports annually as well.

    On March 14, 1876, the Colorado territorial legislature passed an amendment to the state constitution that provided money for the establishment of the University of Colorado in Boulder, the Colorado School of Mines(US) in Golden, and the Colorado State University (US) – College of Agricultural Sciences in Fort Collins.

    Two cities competed for the site of the University of Colorado: Boulder and Cañon City. The consolation prize for the losing city was to be home of the new Colorado State Prison. Cañon City was at a disadvantage as it was already the home of the Colorado Territorial Prison. (There are now six prisons in the Cañon City area.)

    The cornerstone of the building that became Old Main was laid on September 20, 1875. The doors of the university opened on September 5, 1877. At the time, there were few high schools in the state that could adequately prepare students for university work, so in addition to the University, a preparatory school was formed on campus. In the fall of 1877, the student body consisted of 15 students in the college proper and 50 students in the preparatory school. There were 38 men and 27 women, and their ages ranged from 12–23 years.

    During World War II, Colorado was one of 131 colleges and universities nationally that took part in the V-12 Navy College Training Program which offered students a path to a navy commission.

    CU hired its first female professor, Mary Rippon, in 1878. It hired its first African-American professor, Charles H. Nilon, in 1956, and its first African-American librarian, Mildred Nilon, in 1962. Its first African American female graduate, Lucile Berkeley Buchanan, received her degree in 1918.

    Research institutes

    CU Boulder’s research mission is supported by eleven research institutes within the university. Each research institute supports faculty from multiple academic departments, allowing institutes to conduct truly multidisciplinary research.

    The Institute for Behavioral Genetics (IBG) is a research institute within the Graduate School dedicated to conducting and facilitating research on the genetic and environmental bases of individual differences in behavior. After its founding in 1967, IBG led the resurgence of interest in genetic influences on behavior. It was the first post-World War II research institute dedicated to research in behavioral genetics, and it remains one of the top facilities in the field, spanning human behavioral genetics, psychiatric genetics, quantitative genetics, statistical genetics, and animal behavioral genetics.

    The Institute of Cognitive Science (ICS) at CU Boulder promotes interdisciplinary research and training in cognitive science. ICS is highly interdisciplinary; its research focuses on education, language processing, emotion, and higher-level cognition using experimental methods. It is home to a state-of-the-art fMRI system used to collect neuroimaging data.

    ATLAS Institute is a center for interdisciplinary research and academic study, where engineering, computer science and robotics are blended with design-oriented topics. Part of CU Boulder’s College of Engineering and Applied Science, the institute offers academic programs at the undergraduate, master’s and doctoral levels, and administers research labs, hacker and makerspaces, and a black box experimental performance studio. At the beginning of the 2018–2019 academic year, approximately 1,200 students were enrolled in ATLAS academic programs and the institute sponsored six research labs.

    In addition to IBG, ICS and ATLAS, the university’s other institutes include Biofrontiers Institute, Cooperative Institute for Research in Environmental Sciences, Institute of Arctic & Alpine Research (INSTAAR), Institute of Behavioral Science (IBS), JILA, Laboratory for Atmospheric & Space Physics (LASP), Renewable & Sustainable Energy Institute (RASEI), and the University of Colorado Museum of Natural History.

  • richardmitnick 1:15 pm on March 30, 2021 Permalink | Reply
    Tags: "The mystery of the muon’s magnetism", , , , , , , Quantum Mechanics,   

    From Symmetry: “The mystery of the muon’s magnetism” 

    Symmetry Mag
    From Symmetry

    Brianna Barbu

    A super-precise experiment at DOE’s Fermi National Accelerator Laboratory(US) is carefully analyzing every detail of the muon’s magnetic moment.


    Modern physics is full of the sort of twisty, puzzle-within-a-puzzle plots you’d find in a classic detective story: Both physicists and detectives must carefully separate important clues from unrelated information. Both physicists and detectives must sometimes push beyond the obvious explanation to fully reveal what’s going on.

    And for both physicists and detectives, momentous discoveries can hinge upon Sherlock Holmes-level deductions based on evidence that is easy to overlook. Case in point: the Muon g-2 experiment currently underway at the US Department of Energy’s Fermi National Accelerator Laboratory.

    The current Muon g-2 (pronounced g minus two) experiment is actually a sequel, an experiment designed to reexamine a slight discrepancy between theory and the results from an earlier experiment at DOE’s Brookhaven National Laboratory(US), which was also called Muon g-2.

    DOE’s Fermi National Accelerator Laboratory(US) G-2 magnet from DOE’s Brookhaven National Laboratory(US) finds a new home in the FNAL Muon G-2 experiment. The move by barge and truck.

    Fermi National Accelerator Laboratory(US) Muon g-2 studio. As muons race around a ring, their spin axes twirl, reflecting the influence of unseen particles.

    The discrepancy could be a sign that new physics is afoot. Scientists want to know whether the measurement holds up… or if it’s nothing but a red herring.

    The Fermilab Muon g-2 collaboration has announced it will present its first result on April 7. Until then, let’s unpack the facts of the case.

    The mysterious magnetic moment

    All spinning, charged objects—including muons and their better-known particle siblings, electrons—generate their own magnetic fields. The strength of a particle’s magnetic field is referred to as its “magnetic moment” or its “g-factor.” (That’s what the “g” part of “g-2” refers to.)

    To understand the “-2” part of “g-2,” we have to travel a bit back in time.

    Spectroscopy experiments in the 1920s (before the discovery of muons in 1936) revealed that the electron has an intrinsic spin and a magnetic moment. The value of that magnetic moment, g, was found experimentally to be 2. As for why that was the value—that mystery was soon solved using the new but fast-growing field of quantum mechanics.

    In 1928, physicist Paul Dirac—building upon the work of Llewelyn Thomas and others—produced a now-famous equation that combined quantum mechanics and special relativity to accurately describe the motion and electromagnetic interactions of electrons and all other particles with the same spin quantum number. The Dirac equation, which incorporated spin as a fundamental part of the theory, predicted that g should be equal to 2, exactly what scientists had measured at the time.

    The Dirac equation in the form originally proposed by Dirac is

    \[
    \left(\beta m c^{2} + c \sum_{n=1}^{3} \alpha_{n} p_{n}\right)\psi(x,t) = i\hbar \frac{\partial\psi(x,t)}{\partial t}
    \]

    But as experiments became more precise in the 1940s, new evidence came to light that reopened the case and led to surprising new insights about the quantum realm.

    Credit: Sandbox Studio, Chicago with Steve Shanabruch.

    A conspiracy of particles

    The electron, it turned out, had a little bit of extra magnetism that Dirac’s equation didn’t account for. That extra magnetism, mathematically expressed as “g-2” (or the amount that g differs from Dirac’s prediction), is known as the “anomalous magnetic moment.” For a while, scientists didn’t know what caused it.

    If this were a murder mystery, the anomalous magnetic moment would be sort of like an extra fingerprint of unknown provenance on a knife used to stab a victim—a small but suspicious detail that warrants further investigation and could unveil a whole new dimension of the story.

    Physicist Julian Schwinger explained the anomaly in 1947 by theorizing that the electron could emit and then reabsorb a “virtual photon.” The fleeting interaction would slightly boost the electron’s internal magnetism by a tenth of a percent, the amount needed to bring the predicted value into line with the experimental evidence. But the photon isn’t the only accomplice.
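
    The size of Schwinger's correction follows from his now-famous one-loop result. Writing the anomaly as a = (g-2)/2, he found

    \[
    a_{e} = \frac{g-2}{2} = \frac{\alpha}{2\pi} \approx 0.00116,
    \]

    where α ≈ 1/137 is the fine-structure constant; this is the roughly one-tenth-of-a-percent boost that brought theory back into line with experiment.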

    Over time, researchers discovered that there was an extensive network of “virtual particles” constantly popping in and out of existence from the quantum vacuum. That’s what had been messing with the electron’s little spinning magnet.

    The anomalous magnetic moment represents the simultaneous combined influence of every possible effect of those ephemeral quantum conspirators on the electron. Some interactions are more likely to occur, or are more strongly felt than others, and they therefore make a larger contribution. But every particle and force in the Standard Model takes part.

    The theoretical models that describe these virtual interactions have been quite successful in describing the magnetism of electrons. For the electron’s g-2, theoretical calculations are now in such close agreement with the experimental value that it’s like measuring the circumference of the Earth with an accuracy smaller than the width of a single human hair.

    All of the evidence points to quantum mischief perpetrated by known particles causing any magnetic anomalies. Case closed, right?

    Not quite. It’s now time to hear the muon’s side of the story.

    Not a hair out of place—or is there?

    Early measurements of the muon’s anomalous magnetic moment at Columbia University (US) in the 1950s and at the European physics laboratory CERN [European Organization for Nuclear Research (Organisation européenne pour la recherche nucléaire)(EU)] in the 1960s and 1970s agreed well with theoretical predictions. The measurement’s uncertainty shrank from 2% in 1961 to 0.0007% in 1979. It looked as if the same conspiracy of particles that affected the electron’s g-2 were responsible for the magnetic moment of the muon as well.

    But then, in 2001, the Brookhaven Muon g-2 experiment turned up something strange. The experiment was designed to increase the precision from the CERN measurements and look at the weak interaction’s contribution to the anomaly. It succeeded in shrinking the error bars to half a part per million. But it also showed a tiny discrepancy—less than 3 parts per million—between the new measurement and the theoretical value. This time, theorists couldn’t come up with a way to recalculate their models to explain it. Nothing in the Standard Model could account for the difference.

    It was the physics mystery equivalent of a single hair found at a crime scene with DNA that didn’t seem to match anyone connected to the case. The question was—and still is—whether the presence of the hair is just a coincidence, or whether it is actually an important clue.

    Physicists are now re-examining this “hair” at Fermilab, with support from the DOE Office of Science (US), the National Science Foundation (US) and several international agencies in Italy, the UK, the EU, China, Korea and Germany.

    In the new Muon g-2 experiment, a beam of muons—their spins all pointing the same direction—is shot into a type of accelerator called a storage ring. The ring’s strong magnetic field keeps the muons on a well-defined circular path. If g were exactly 2, then the muons’ spins would follow their momentum exactly. But, because of the anomalous magnetic moment, the muons have a slight additional wobble in the rotation of their spins.
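
    The rate of that extra wobble is what the experiment actually measures. In simplified form (the standard textbook expression, neglecting corrections from the focusing electric fields and beam dynamics), the anomalous precession frequency is proportional to the anomaly itself:

    \[
    \omega_{a} = \omega_{\text{spin}} - \omega_{\text{cyclotron}} = a_{\mu}\,\frac{e B}{m_{\mu}}, \qquad a_{\mu} = \frac{g-2}{2},
    \]

    so comparing the spin rotation against the orbital motion in a precisely known magnetic field B yields g-2 directly.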

    When a muon decays into an electron and two neutrinos, the electron tends to shoot off in the direction that the muon’s spin was pointing. Detectors on the inside of the ring pick up a portion of the electrons flung by muons experiencing the wobble. Recording the numbers and energies of electrons they detect over time will tell researchers how much the muon spin has rotated.

    Using the same magnet from the Brookhaven experiment with significantly better instrumentation, plus a more intense beam of muons produced by Fermilab’s accelerator complex, researchers are collecting 21 times more data to achieve four times greater precision.
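
    Those two numbers are consistent with simple counting statistics: the statistical uncertainty of a measurement like this shrinks as 1/√N with the number of recorded events, so 21 times more data buys roughly √21 ≈ 4.6 times better precision. A quick illustrative check (the scaling law is standard; the helper function is ours):

    ```python
    import math

    def precision_gain(data_factor: float) -> float:
        """Factor by which statistical precision improves when the
        data set grows by `data_factor` (1/sqrt(N) scaling)."""
        return math.sqrt(data_factor)

    # Fermilab collects 21x the Brookhaven data set:
    print(round(precision_gain(21), 1))  # 4.6
    ```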

    The experiment may confirm the existence of the discrepancy; it may find no discrepancy at all, pointing to a problem with the Brookhaven result; or it may find something in between, leaving the case unsolved.

    Seeking the quantum underworld

    There’s reason to believe something is going on that the Standard Model hasn’t told us about.

    The Standard Model is a remarkably consistent explanation for pretty much everything that goes on in the subatomic world.

    Standard Model of Particle Physics from “Particle Fever” via Symmetry Magazine

    But there are still a number of unsolved mysteries in physics that it doesn’t address.

    Dark matter, for instance, makes up about 27% of the universe. And yet, scientists still have no idea what it’s made of. None of the known particles seem to fit the bill. The Standard Model also can’t explain the mass of the Higgs boson, which is surprisingly small. If the Fermilab Muon g-2 experiment determines that something beyond the Standard Model—for example an unknown particle—is measurably messing with the muon’s magnetic moment, it may point researchers in the right direction to close another one of these open files.

    A confirmed discrepancy won’t actually provide DNA-level details about what particle or force is making its presence known, but it will help narrow down the ranges of mass and interaction strength in which future experiments are most likely to find something new. Even if the discrepancy fades, the data will still be useful for deciding where to look.

    It might be that a shadowy quantum figure lurking beyond the Standard Model is too well hidden for current technology to detect. But if it’s not, physicists will leave no stone unturned and no speck of evidence un-analyzed until they crack the case.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.

  • richardmitnick 9:41 pm on March 29, 2021 Permalink | Reply
    Tags: "Physicists flip particle accelerator setup to gain a clearer view of atomic nuclei", , “Inverse kinematics”, , , , , Quantum Mechanics   

    From MIT: “Physicists flip particle accelerator setup to gain a clearer view of atomic nuclei” 

    MIT News

    From MIT News

    March 29, 2021
    Jennifer Chu

    Shooting beams of ions at proton clouds may help researchers map the inner workings of neutron stars.

    Shooting beams of ions at proton clouds, like throwing nuclear darts at the speed of light, can provide a clearer view of nuclear structure. Credits: Jose-Luis Olivares/ MIT.

    Physicists at MIT and elsewhere are blasting beams of ions at clouds of protons —like throwing nuclear darts at the speed of light — to map the structure of an atom’s nucleus.

    The experiment is an inversion of the usual particle accelerators, which hurl electrons at atomic nuclei to probe their structures. The team used this “inverse kinematics” approach to sift out the messy, quantum mechanical influences within a nucleus, to provide a clear view of a nucleus’ protons and neutrons, as well as its short-range correlated (SRC) pairs. These are pairs of protons or neutrons that briefly bind to form super-dense droplets of nuclear matter and that are thought to dominate the ultradense environments in neutron stars.

    The results, published today in Nature Physics, demonstrate that inverse kinematics may be used to characterize the structure of more unstable nuclei — essential ingredients scientists can use to understand the dynamics of neutron stars and the processes by which they generate heavy elements.

    “We’ve opened the door for studying SRC pairs, not only in stable nuclei but also in neutron-rich nuclei that are very abundant in environments like neutron star mergers,” says study co-author Or Hen, assistant professor of physics at MIT. “That gets us closer to understanding such exotic astrophysical phenomena.”

    Hen’s co-authors include Julian Kahlbow and Efrain Segarra of MIT, Eli Piasetzky of Tel Aviv University [אוּנִיבֶרְסִיטַת תֵּל אָבִיב] (IL), and researchers from Technical University of Darmstadt [Technische Universität Darmstadt] (DE), the Joint Institute for Nuclear Research [Объединенный институт ядерных исследований России] (RU) [JINR], the Alternative Energies and Atomic Energy Commission [Commissariat à l’énergie atomique et aux énergies alternatives] (FR) (CEA), and the GSI Helmholtz Centre for Heavy Ion Research [GSI Helmholtzzentrum für Schwerionenforschung] (DE).

    An inverted accelerator

    Particle accelerators typically probe nuclear structures through electron scattering, in which high-energy electrons are beamed at a stationary cloud of target nuclei. When an electron hits a nucleus, it knocks out protons and neutrons, and the electron loses energy in the process. Researchers measure the energy of the electron beam before and after this interaction to calculate the original energies of the protons and neutrons that were kicked away.
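
    The bookkeeping behind that reconstruction is energy conservation. A minimal sketch with hypothetical numbers (not values from any actual experiment): the energy the electron loses is the energy transferred to the nucleus, and subtracting the detected proton's kinetic energy estimates the energy it took to remove that proton, with nuclear recoil neglected.

    ```python
    # Simplified (e, e'p) scattering bookkeeping -- illustrative only.
    # All energies in MeV; the names and values here are hypothetical.

    def energy_transfer(e_beam: float, e_scattered: float) -> float:
        """Energy the electron deposits in the nucleus."""
        return e_beam - e_scattered

    def missing_energy(omega: float, t_proton: float) -> float:
        """Energy transfer minus the knocked-out proton's kinetic energy:
        a rough estimate of the proton's removal energy (recoil neglected)."""
        return omega - t_proton

    omega = energy_transfer(e_beam=4000.0, e_scattered=3700.0)
    print(omega)                                   # 300.0
    print(missing_energy(omega, t_proton=280.0))   # 20.0
    ```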

    While electron scattering is a precise way to reconstruct a nucleus’ structure, it is also a game of chance. The probability that an electron will hit a nucleus is relatively low, given that a single electron is vanishingly small in comparison to an entire nucleus. To increase this probability, beams are loaded with ever-higher electron densities.

    Scientists also use beams of protons instead of electrons to probe nuclei, as protons are comparably larger and more likely to hit their target. But protons are also more complex, and made of quarks and gluons, the interactions of which can muddy the final interpretation of the nucleus itself.

    To get a clearer picture, physicists in recent years have inverted the traditional setup: By aiming a beam of nuclei, or ions, at a target of protons, scientists can not only directly measure the knocked out protons and neutrons, but also compare the original nucleus with the residual nucleus, or nuclear fragment, after it has interacted with a target proton.

    “With inverted kinematics, we know exactly what happens to a nucleus when we remove its protons and neutrons,” Hen says.

    Quantum sifting

    The team took this inverted kinematics approach to ultrahigh energies, using JINR’s particle accelerator facility to target a stationary cloud of protons with a beam of carbon-12 nuclei, which they shot out at 48 billion electron-volts — orders of magnitude higher than the energies found naturally in nuclei.

    At such high energies, any nucleon that interacts with a proton will stand out in the data, compared with noninteracting nucleons that pass through at much lower energies. In this way, the researchers can quickly isolate any interactions that did occur between a nucleus and a proton.

    From these interactions, the team picked through the residual nuclear fragments, looking for boron-11 — a configuration of carbon-12, minus a single proton. If a nucleus started out as carbon-12 and wound up as boron-11, it could only mean that it encountered a target proton in a way that knocked out a single proton. If the target proton knocked out more than one proton, it would have been the result of quantum mechanical effects within the nucleus that would be difficult to interpret. The team isolated boron-11 as a clear signature and discarded any lighter fragments shaped by those harder-to-interpret quantum effects.

    The team calculated the energy of the proton knocked out of the original carbon-12 nucleus, based on each interaction that produced boron-11. When they plotted these energies, the pattern fit exactly with carbon-12’s well-established distribution — a validation of the inverted, high-energy approach.

    They then turned the technique on short-range correlated pairs, looking to see if they could reconstruct the respective energies of each particle in a pair — fundamental information for ultimately understanding the dynamics in neutron stars and other neutron-dense objects.

    They repeated the experiment and this time looked for boron-10, a configuration of carbon-12, minus a proton and a neutron. Any detection of boron-10 would mean that a carbon-12 nucleus interacted with a target proton, which knocked out a proton, and its bound partner, a neutron. The scientists could measure the energies of both the target and the knocked out protons to calculate the neutron’s energy and the energy of the original SRC pair.

    In all, the researchers observed 20 SRC interactions and from them mapped carbon-12’s distribution of SRC energies, which fit well with previous experiments. The results suggest that inverse kinematics can be used to characterize SRC pairs in more unstable and even radioactive nuclei with many more neutrons.

    “When everything is inverted, this means a beam driving through could be made of unstable particles with very short lifetimes that live for a millisecond,” says Julian Kahlbow, a joint postdoc at MIT and Tel Aviv University and a co-lead author of the paper. “That millisecond is enough for us to create it, let it interact, and let it go. So now we can systematically add more neutrons to the system and see how these SRCs evolve, which will help us inform what happens in neutron stars, which have many more neutrons than anything else in the universe.”

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    MIT Seal
    Massachusetts Institute of Technology (MIT) is a private land-grant research university in Cambridge, Massachusetts. The institute has an urban campus that extends more than a mile (1.6 km) alongside the Charles River. The institute also encompasses a number of major off-campus facilities such as the MIT Lincoln Laboratory, the Bates Center, and the Haystack Observatory, as well as affiliated laboratories such as the Broad and Whitehead Institutes.

    MIT Haystack Observatory, Westford, Massachusetts, USA, Altitude 131 m (430 ft).

    Founded in 1861 in response to the increasing industrialization of the United States, MIT adopted a European polytechnic university model and stressed laboratory instruction in applied science and engineering. It has since played a key role in the development of many aspects of modern science, engineering, mathematics, and technology, and is widely known for its innovation and academic strength. It is frequently regarded as one of the most prestigious universities in the world.

    As of December 2020, 97 Nobel laureates, 26 Turing Award winners, and 8 Fields Medalists have been affiliated with MIT as alumni, faculty members, or researchers. In addition, 58 National Medal of Science recipients, 29 National Medals of Technology and Innovation recipients, 50 MacArthur Fellows, 80 Marshall Scholars, 3 Mitchell Scholars, 22 Schwarzman Scholars, 41 astronauts, and 16 Chief Scientists of the U.S. Air Force have been affiliated with MIT. The university also has a strong entrepreneurial culture and MIT alumni have founded or co-founded many notable companies. MIT is a member of the Association of American Universities (AAU).

    Foundation and vision

    In 1859, a proposal was submitted to the Massachusetts General Court to use newly filled lands in Back Bay, Boston for a “Conservatory of Art and Science”, but the proposal failed. A charter for the incorporation of the Massachusetts Institute of Technology, proposed by William Barton Rogers, was signed by John Albion Andrew, the governor of Massachusetts, on April 10, 1861.

    Rogers, a professor from the University of Virginia, wanted to establish an institution to address rapid scientific and technological advances. He did not wish to found a professional school, but a combination with elements of both professional and liberal education, proposing that:

    The true and only practicable object of a polytechnic school is, as I conceive, the teaching, not of the minute details and manipulations of the arts, which can be done only in the workshop, but the inculcation of those scientific principles which form the basis and explanation of them, and along with this, a full and methodical review of all their leading processes and operations in connection with physical laws.

    The Rogers Plan reflected the German research university model, emphasizing an independent faculty engaged in research, as well as instruction oriented around seminars and laboratories.

    Early developments

    Two days after MIT was chartered, the first battle of the Civil War broke out. After a long delay through the war years, MIT’s first classes were held in the Mercantile Building in Boston in 1865. The new institute was founded as part of the Morrill Land-Grant Colleges Act to fund institutions “to promote the liberal and practical education of the industrial classes” and was a land-grant school. In 1863 under the same act, the Commonwealth of Massachusetts founded the Massachusetts Agricultural College, which developed as the University of Massachusetts Amherst. In 1866, the proceeds from land sales went toward new buildings in the Back Bay.

    MIT was informally called “Boston Tech”. The institute adopted the European polytechnic university model and emphasized laboratory instruction from an early date. Despite chronic financial problems, the institute saw growth in the last two decades of the 19th century under President Francis Amasa Walker. Programs in electrical, chemical, marine, and sanitary engineering were introduced, new buildings were built, and the size of the student body increased to more than one thousand.

    The curriculum drifted to a vocational emphasis, with less focus on theoretical science. The fledgling school still suffered from chronic financial shortages which diverted the attention of the MIT leadership. During these “Boston Tech” years, MIT faculty and alumni rebuffed Harvard University president (and former MIT faculty) Charles W. Eliot’s repeated attempts to merge MIT with Harvard College’s Lawrence Scientific School. There would be at least six attempts to absorb MIT into Harvard. In its cramped Back Bay location, MIT could not afford to expand its overcrowded facilities, driving a desperate search for a new campus and funding. Eventually, the MIT Corporation approved a formal agreement to merge with Harvard, over the vehement objections of MIT faculty, students, and alumni. However, a 1917 decision by the Massachusetts Supreme Judicial Court effectively put an end to the merger scheme.

    In 1916, the MIT administration and the MIT charter crossed the Charles River on the ceremonial barge Bucentaur built for the occasion, to signify MIT’s move to a spacious new campus largely consisting of filled land on a one-mile-long (1.6 km) tract along the Cambridge side of the Charles River. The neoclassical “New Technology” campus was designed by William W. Bosworth and had been funded largely by anonymous donations from a mysterious “Mr. Smith”, starting in 1912. In January 1920, the donor was revealed to be the industrialist George Eastman of Rochester, New York, who had invented methods of film production and processing, and founded Eastman Kodak. Between 1912 and 1920, Eastman donated $20 million ($236.6 million in 2015 dollars) in cash and Kodak stock to MIT.

    Curricular reforms

    In the 1930s, President Karl Taylor Compton and Vice-President (effectively Provost) Vannevar Bush emphasized the importance of pure sciences like physics and chemistry and reduced the vocational practice required in shops and drafting studios. The Compton reforms “renewed confidence in the ability of the Institute to develop leadership in science as well as in engineering”. Unlike Ivy League schools, MIT catered more to middle-class families, and depended more on tuition than on endowments or grants for its funding. The school was elected to the Association of American Universities in 1934.

    Still, as late as 1949, the Lewis Committee lamented in its report on the state of education at MIT that “the Institute is widely conceived as basically a vocational school”, a “partly unjustified” perception the committee sought to change. The report comprehensively reviewed the undergraduate curriculum, recommended offering a broader education, and warned against letting engineering and government-sponsored research detract from the sciences and humanities. The School of Humanities, Arts, and Social Sciences and the MIT Sloan School of Management were formed in 1950 to compete with the powerful Schools of Science and Engineering. Previously marginalized faculties in the areas of economics, management, political science, and linguistics emerged into cohesive and assertive departments by attracting respected professors and launching competitive graduate programs. The School of Humanities, Arts, and Social Sciences continued to develop under the successive terms of the more humanistically oriented presidents Howard W. Johnson and Jerome Wiesner between 1966 and 1980.

    MIT’s involvement in military science surged during World War II. In 1941, Vannevar Bush was appointed head of the federal Office of Scientific Research and Development and directed funding to only a select group of universities, including MIT. Engineers and scientists from across the country gathered at MIT’s Radiation Laboratory, established in 1940 to assist the British military in developing microwave radar. The work done there significantly affected both the war and subsequent research in the area. Other defense projects included gyroscope-based and other complex control systems for gunsight, bombsight, and inertial navigation under Charles Stark Draper’s Instrumentation Laboratory; the development of a digital computer for flight simulations under Project Whirlwind; and high-speed and high-altitude photography under Harold Edgerton. By the end of the war, MIT became the nation’s largest wartime R&D contractor (attracting some criticism of Bush), employing nearly 4000 in the Radiation Laboratory alone and receiving in excess of $100 million ($1.2 billion in 2015 dollars) before 1946. Work on defense projects continued even after then. Post-war government-sponsored research at MIT included SAGE and guidance systems for ballistic missiles and Project Apollo.

    These activities affected MIT profoundly. A 1949 report noted the lack of “any great slackening in the pace of life at the Institute” to match the return to peacetime, remembering the “academic tranquility of the prewar years”, though acknowledging the significant contributions of military research to the increased emphasis on graduate education and rapid growth of personnel and facilities. The faculty doubled and the graduate student body quintupled during the terms of Karl Taylor Compton, president of MIT between 1930 and 1948; James Rhyne Killian, president from 1948 to 1957; and Julius Adams Stratton, chancellor from 1952 to 1957, whose institution-building strategies shaped the expanding university. By the 1950s, MIT no longer simply benefited the industries with which it had worked for three decades, and it had developed closer working relationships with new patrons, philanthropic foundations and the federal government.

    In the late 1960s and early 1970s, student and faculty activists protested against the Vietnam War and MIT’s defense research. In this period, MIT’s various departments were researching helicopters, smart bombs and counterinsurgency techniques for the war in Vietnam, as well as guidance systems for nuclear missiles. The Union of Concerned Scientists was founded on March 4, 1969 during a meeting of faculty members and students seeking to shift the emphasis from military research toward environmental and social problems. MIT ultimately divested itself from the Instrumentation Laboratory and moved all classified research off-campus to the MIT Lincoln Laboratory facility in 1973 in response to the protests. The student body, faculty, and administration remained comparatively unpolarized during what was a tumultuous time for many other universities. Johnson was seen to be highly successful in leading his institution to “greater strength and unity” after these times of turmoil. However, six MIT students were sentenced to prison terms at this time, and some former student leaders, such as Michael Albert and George Katsiaficas, remain indignant about MIT’s role in military research and its suppression of these protests. (Richard Leacock’s film, November Actions, records some of these tumultuous events.)

    In the 1980s, there was more controversy at MIT over its involvement in SDI (space weaponry) and CBW (chemical and biological warfare) research. More recently, MIT’s research for the military has included work on robots, drones and ‘battle suits’.

    Recent history

    MIT has kept pace with and helped to advance the digital age. In addition to developing the predecessors to modern computing and networking technologies, students, staff, and faculty members at Project MAC, the Artificial Intelligence Laboratory, and the Tech Model Railroad Club wrote some of the earliest interactive computer video games like Spacewar! and created much of modern hacker slang and culture. Several major computer-related organizations have originated at MIT since the 1980s: Richard Stallman’s GNU Project and the subsequent Free Software Foundation were founded in the mid-1980s at the AI Lab; the MIT Media Lab was founded in 1985 by Nicholas Negroponte and Jerome Wiesner to promote research into novel uses of computer technology; the World Wide Web Consortium standards organization was founded at the Laboratory for Computer Science in 1994 by Tim Berners-Lee; the OpenCourseWare project has made course materials for over 2,000 MIT classes available online free of charge since 2002; and the One Laptop per Child initiative to expand computer education and connectivity to children worldwide was launched in 2005.

    MIT was named a sea-grant college in 1976 to support its programs in oceanography and marine sciences and was named a space-grant college in 1989 to support its aeronautics and astronautics programs. Despite diminishing government financial support over the past quarter century, MIT launched several successful development campaigns to significantly expand the campus: new dormitories and athletics buildings on west campus; the Tang Center for Management Education; several buildings in the northeast corner of campus supporting research into biology, brain and cognitive sciences, genomics, biotechnology, and cancer research; and a number of new “backlot” buildings on Vassar Street including the Stata Center. Construction on campus in the 2000s included expansions of the Media Lab, the Sloan School’s eastern campus, and graduate residences in the northwest. In 2006, President Hockfield launched the MIT Energy Research Council to investigate the interdisciplinary challenges posed by increasing global energy consumption.

    In 2001, inspired by the open source and open access movements, MIT launched OpenCourseWare to make the lecture notes, problem sets, syllabi, exams, and lectures from the great majority of its courses available online for no charge, though without any formal accreditation for coursework completed. While the cost of supporting and hosting the project is high, OCW expanded in 2005 to include other universities as a part of the OpenCourseWare Consortium, which currently includes more than 250 academic institutions with content available in at least six languages. In 2011, MIT announced it would offer formal certification (but not credits or degrees) to online participants completing coursework in its “MITx” program, for a modest fee. The “edX” online platform supporting MITx was initially developed in partnership with Harvard and its analogous “HarvardX” initiative. The courseware platform is open source, and other universities have already joined and added their own course content. In March 2009 the MIT faculty adopted an open-access policy to make its scholarship publicly accessible online.

    MIT has its own police force. Three days after the Boston Marathon bombing of April 2013, MIT Police patrol officer Sean Collier was fatally shot by the suspects Dzhokhar and Tamerlan Tsarnaev, setting off a violent manhunt that shut down the campus and much of the Boston metropolitan area for a day. One week later, Collier’s memorial service was attended by more than 10,000 people, in a ceremony hosted by the MIT community with thousands of police officers from the New England region and Canada. On November 25, 2013, MIT announced the creation of the Collier Medal, to be awarded annually to “an individual or group that embodies the character and qualities that Officer Collier exhibited as a member of the MIT community and in all aspects of his life”. The announcement further stated that “Future recipients of the award will include those whose contributions exceed the boundaries of their profession, those who have contributed to building bridges across the community, and those who consistently and selflessly perform acts of kindness”.

    In September 2017, the school announced the creation of an artificial intelligence research lab called the MIT-IBM Watson AI Lab. IBM will spend $240 million over the next decade, and the lab will be staffed by MIT and IBM scientists. In October 2018 MIT announced that it would open a new Schwarzman College of Computing dedicated to the study of artificial intelligence, named after lead donor and The Blackstone Group CEO Stephen Schwarzman. The focus of the new college is to study not just AI, but interdisciplinary AI education, and how AI can be used in fields as diverse as history and biology. The cost of buildings and new faculty for the new college is expected to be $1 billion upon completion.

    The Laser Interferometer Gravitational-Wave Observatory (LIGO)(US) was designed and constructed by a team of scientists from California Institute of Technology, MIT, and industrial contractors, and funded by the National Science Foundation.

    MIT/Caltech Advanced LIGO.

    It was designed to open the field of gravitational-wave astronomy through the detection of gravitational waves predicted by general relativity. Gravitational waves were detected for the first time by the LIGO detector in 2015. For contributions to the LIGO detector and the observation of gravitational waves, two Caltech physicists, Kip Thorne and Barry Barish, and MIT physicist Rainer Weiss won the Nobel Prize in Physics in 2017. Weiss, who is also an MIT graduate, designed the laser interferometric technique, which served as the essential blueprint for LIGO.

    The mission of MIT is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the MIT community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

    USPS “Forever” postage stamps celebrating Innovation at MIT.

    MIT Campus

  • richardmitnick 11:20 pm on March 24, 2021 Permalink | Reply
    Tags: , , , Classical field theory, , , , Particle physicists use lattice quantum chromodynamics and supercomputers to search for physics beyond the Standard Model., , , Quantum chromodynamics-QCD-is the theory of the strong interaction between quarks; gluons-the particles that make up some of the larger composite particles such as the proton; neutron; and pion., Quantum field theory is the theoretical framework from which the Standard Model of particle physics is constructed., Quantum Mechanics, , , Texas Advanced Computing Center(US), University of Texas at Austin (US)   

    From University of Texas at Austin (US) and From Texas Advanced Computing Center(US): “Searching for Hints of New Physics in the Subatomic World” 

    From University of Texas at Austin (US)


    From Texas Advanced Computing Center(US)

    March 23, 2021
    Aaron Dubrow

    Particle physicists use lattice quantum chromodynamics and supercomputers to search for physics beyond the Standard Model.

    This plot shows how the decay properties of a meson made from a heavy quark and a light quark change when the lattice spacing and heavy quark mass are varied in the calculation. [Credit: A. Bazavov, Michigan State University (US); C. Bernard, Washington University in St. Louis (US); N. Brown, Washington University in St. Louis (US); C. DeTar, University of Utah (US); A.X. El-Khadra, University of Illinois (US) and Fermi National Accelerator Laboratory (US); et al.]

    Peer deeper into the heart of the atom than any microscope allows and scientists hypothesize that you will find a rich world of particles popping in and out of the vacuum, decaying into other particles, and adding to the weirdness of the visible world. These subatomic particles are governed by the quantum nature of the Universe and find tangible, physical form in experimental results.

    Some subatomic particles were first discovered over a century ago with relatively simple experiments. More recently, however, the endeavor to understand these particles has spawned the largest, most ambitious and complex experiments in the world, including those at particle physics laboratories such as the European Organization for Nuclear Research [Organisation européenne pour la recherche nucléaire](CH) (CERN) in Europe, Fermi National Accelerator Laboratory(US) in Illinois, and the KEK High Energy Accelerator Research Organization(JP).

    These experiments have a twofold mission: to expand our understanding of the Universe, characterized most comprehensively by the Standard Model of particle physics, and to look beyond the Standard Model for as-yet-unknown physics.

    Standard Model of Particle Physics from “Particle Fever” via Symmetry Magazine


    “The Standard Model explains so much of what we observe in elementary particle and nuclear physics, but it leaves many questions unanswered,” said Steven Gottlieb, distinguished professor of Physics at Indiana University(US). “We are trying to unravel the mystery of what lies beyond the Standard Model.”

    A plot of the Unitarity Triangle, a good test of the Standard Model, showing constraints in the (ρ̄, η̄) plane. The shaded areas are at 95% CL (confidence level), a statistical measure used to set limits on model parameters. [Credit: A. Ceccucci (European Organization for Nuclear Research [Organisation européenne pour la recherche nucléaire](CH)), Z. Ligeti (DOE’s Lawrence Berkeley National Laboratory(US)) and Y. Sakai (KEK High Energy Accelerator Research Organization(JP))]

    Ever since the beginning of the study of particle physics, experimental and theoretical approaches have complemented each other in the attempt to understand nature. In the past four to five decades, advanced computing has become an important part of both approaches. Great progress has been made in understanding the behavior of the zoo of subatomic particles, including bosons (especially the long sought and recently discovered Higgs boson), various flavors of quarks, gluons, muons, neutrinos and many states made from combinations of quarks or anti-quarks bound together.

    Quantum field theory is the theoretical framework from which the Standard Model of particle physics is constructed. It combines classical field theory, special relativity and quantum mechanics, developed with contributions from Einstein, Dirac, Fermi, Feynman, and others. Within the Standard Model, quantum chromodynamics (QCD) is the theory of the strong interaction between quarks and gluons, the fundamental particles that make up some of the larger composite particles such as the proton, neutron, and pion.

    Peering through the Lattice

    Carleton DeTar and Steven Gottlieb are two of the leading contemporary scholars of QCD research and practitioners of an approach known as lattice QCD. Lattice QCD represents continuous space as a discrete set of spacetime points (called the lattice). It uses supercomputers to study the interactions of quarks, and importantly, to determine more precisely several parameters of the Standard Model, thereby reducing the uncertainties in its predictions. It’s a slow and resource-intensive approach, but it has proven to have wide applicability, giving insight into parts of the theory inaccessible by other means, in particular the explicit forces acting between quarks and antiquarks.
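    The core idea, replacing continuous space with a grid of points whose spacing controls the discretization error, can be illustrated with a toy example far simpler than QCD. The sketch below (plain Python; the function and spacings are purely illustrative, and nothing here is actual lattice QCD) shows how a finite-difference derivative on a one-dimensional “lattice” becomes more accurate as the spacing shrinks, which is the same reason lattice results must be extrapolated to the continuum limit:

```python
import math

# Toy illustration (not QCD): discretize continuous space into a lattice and
# watch the discretization error shrink with the lattice spacing `a`.
def discrete_derivative_error(a, x=1.0):
    # Central finite difference of f(x) = sin(x) on a lattice of spacing a.
    approx = (math.sin(x + a) - math.sin(x - a)) / (2 * a)
    return abs(approx - math.cos(x))  # exact derivative is cos(x)

for a in (0.1, 0.05, 0.025):
    print(f"a = {a:5.3f}  error = {discrete_derivative_error(a):.2e}")
# Halving the spacing cuts the O(a^2) error by roughly a factor of four.
```

In real lattice QCD the fields live on a four-dimensional spacetime grid and the cost of shrinking the spacing is far steeper, but the logic of controlling and extrapolating away the discretization error is the same.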

    DeTar and Gottlieb are part of the MIMD Lattice Computation (MILC) Collaboration and work very closely with the Fermilab Lattice Collaboration on the vast majority of their work. They also work with the High Precision QCD (HPQCD) Collaboration for the study of the muon anomalous magnetic moment. As part of these efforts, they use the fastest supercomputers in the world.

    Since 2019, they have used Frontera [below] at the Texas Advanced Computing Center (TACC) — the fastest academic supercomputer in the world and the 9th fastest overall — to propel their work. They are among the largest users of that resource, which is funded by the National Science Foundation(US). The team also uses Summit at the DOE’s Oak Ridge National Laboratory(US) (the #2 fastest supercomputer in the world); Cori at the National Energy Research Scientific Computing Center(US) at DOE’s Lawrence Berkeley National Laboratory(US) (#20), and Stampede2 [below] (#25) at TACC, for the lattice calculations.

    IBM AC922 SUMMIT supercomputer, was No.1 on the TOP500. Credit: Carlos Jones, DOE’s Oak Ridge National Laboratory (US).

    Cray Cori II supercomputer at National Energy Research Scientific Computing Center(US) at DOE’s Lawrence Berkeley National Laboratory(US), named after Gerty Cori, the first American woman to win a Nobel Prize in science.

    The efforts of the lattice QCD community over decades have brought greater accuracy to particle predictions through a combination of faster computers and improved algorithms and methodologies.

    “We can do calculations and make predictions with high precision for how strong interactions work,” said DeTar, professor of Physics and Astronomy at the University of Utah(US). “When I started as a graduate student in the late 1960s, some of our best estimates were within 20 percent of experimental results. Now we can get answers with sub-percent accuracy.”

    In particle physics, physical experiment and theory travel in tandem, informing each other, but sometimes producing different results. These differences suggest areas of further exploration or improvement.

    “There are some tensions in these tests,” said Gottlieb, distinguished professor of Physics at Indiana University (US). “The tensions are not large enough to say that there is a problem here — the usual requirement is at least five standard deviations [σ]. But it means either you make the theory and experiment more precise and find that the agreement is better; or you do it and you find out, ‘Wait a minute, what was a three-sigma tension is now a five-standard-deviation tension, and maybe we really have evidence for new physics.’”
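    The sigma thresholds Gottlieb mentions translate directly into probabilities: an n-sigma deviation corresponds to the chance that a statistical fluctuation alone would push the result at least that far from the prediction. A minimal sketch using only the Python standard library (the function name is ours, not from the article):

```python
import math

def sigma_to_pvalue(sigma):
    # Two-sided probability that a normal fluctuation exceeds `sigma`
    # standard deviations.
    return math.erfc(sigma / math.sqrt(2))

# ~2.7e-3 for 3 sigma (a "tension"); ~5.7e-7 for 5 sigma (the discovery bar)
for s in (3.0, 5.0):
    print(f"{s} sigma -> p = {sigma_to_pvalue(s):.2e}")
```

This is why a three-sigma tension is only suggestive: a roughly 1-in-370 fluke is not rare enough to claim new physics, whereas five sigma corresponds to about 1 in 1.7 million.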

    DeTar calls these small discrepancies between theory and experiment ‘tantalizing.’ “They might be telling us something.”

    Over the last several years, DeTar, Gottlieb and their collaborators have followed the paths of quarks and antiquarks with ever-greater resolution as they move through a background cloud of gluons and virtual quark-antiquark pairs, as prescribed precisely by QCD. The results of the calculation are used to determine physically meaningful quantities such as particle masses and decays.

    Results for the B → πℓν semileptonic form factor (a function that encapsulates the properties of a certain particle interaction without including all of the underlying physics). The results from the FNAL/MILC 15 collaboration are the only ones that achieved the highest quality rating (green star) from the Flavour Lattice Averaging Group (FLAG) for control of continuum extrapolation and finite volume effects. [Credit: Y. Aoki, D. Bečirević, M. Della Morte, S. Gottlieb, D. Lin, E. Lunghi, C. Pena]

    One state-of-the-art approach applied by the researchers uses the so-called highly improved staggered quark (HISQ) formalism to simulate interactions of quarks with gluons. On Frontera, DeTar and Gottlieb are currently simulating at a lattice spacing of 0.06 femtometers (1 fm = 10⁻¹⁵ meters), but they are quickly approaching their ultimate goal of 0.03 femtometers, a distance where the lattice spacing is smaller than the wavelength of the heaviest quark, consequently removing a significant source of uncertainty from these calculations.

    Each doubling of resolution, however, requires about two orders of magnitude more computing power, putting a 0.03-femtometer lattice spacing firmly in the fast-approaching ‘exascale’ regime.
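    That scaling can be written as a one-line cost model. Assuming, as stated above, that each halving of the lattice spacing multiplies the compute cost by roughly two orders of magnitude (the factor of 100 is the article’s figure; the function itself is just an illustrative sketch):

```python
import math

def relative_cost(a_ref, a, cost_per_halving=100.0):
    # Estimated compute cost at lattice spacing `a` relative to spacing `a_ref`,
    # assuming each halving of the spacing costs `cost_per_halving` times more.
    halvings = math.log2(a_ref / a)
    return cost_per_halving ** halvings

# Moving from the current 0.06 fm to the 0.03 fm goal is one halving,
# i.e. roughly a hundredfold increase in compute, hence the exascale regime.
print(relative_cost(0.06, 0.03))
```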

    “The cost of calculations keeps rising as you make the lattice spacing smaller,” DeTar said. “For smaller lattice spacings, we’re thinking of future Department of Energy machines and the Leadership Class Computing Facility [TACC’s future system in planning]. But we can make do with extrapolations now.”

    The Anomalous Magnetic Moment of the Muon and Other Outstanding Mysteries

    Among the phenomena that DeTar and Gottlieb are tackling is the anomalous magnetic moment of the muon (essentially a heavy electron), which, in quantum field theory, arises from a weak cloud of elementary particles that surrounds the muon. The same sort of cloud affects particle decays. Theorists believe yet-undiscovered elementary particles could potentially be in that cloud.

    A large international collaboration called the Muon g-2 Theory Initiative recently reviewed the present status of the Standard Model calculation of the muon’s anomalous magnetic moment.

    Fermi National Accelerator Laboratory(US) Muon g-2 studio. As muons race around the experiment’s storage ring, their spin axes twirl, reflecting the influence of unseen particles.

    Their review appeared in Physics Reports in December 2020. DeTar, Gottlieb and several of their Fermilab Lattice, HPQCD and MILC collaborators are among the coauthors. They find a 3.7 σ difference between experiment and theory.

    While some parts of the theoretical contributions can be calculated with extreme accuracy, the hadronic contributions (those involving hadrons, the class of subatomic particles composed of two or three quarks that participate in strong interactions) are the most difficult to calculate and are responsible for almost all of the theoretical uncertainty. Lattice QCD is one of two ways to calculate these contributions.

    “The experimental uncertainty will soon be reduced by up to a factor of four by the new experiment currently running at Fermilab, and also by the future muon experiment at J-PARC(JP),” they wrote. “This and the prospects to further reduce the theoretical uncertainty in the near future… make this quantity one of the most promising places to look for evidence of new physics.”

    Gottlieb, DeTar and collaborators have calculated the hadronic contribution to the anomalous magnetic moment with a precision of 2.2 percent. “This gives us confidence that our short-term goal of achieving a precision of 1 percent on the hadronic contribution to the muon anomalous magnetic moment is now a realistic one,” Gottlieb said. They hope to achieve a precision of 0.5 percent a few years later.

    Other ‘tantalizing’ hints of new physics involve measurements of the decay of B mesons. There, various experimental methods arrive at different results. “The decay properties and mixings of the D and B mesons are critical to a more accurate determination of several of the least well-known parameters of the Standard Model,” Gottlieb said. “Our work is improving the determinations of the masses of the up, down, strange, charm and bottom quarks and how they mix under weak decays.” The mixing is described by the so-called CKM mixing matrix for which Kobayashi and Maskawa won the 2008 Nobel Prize in Physics.

    The answers DeTar and Gottlieb seek are the most fundamental in science: What is matter made of? And where did it come from?

    “The Universe is very connected in many ways,” said DeTar. “We want to understand how the Universe began. The current understanding is that it began with the Big Bang. And the processes that were important in the earliest instance of the Universe involve the same interactions that we’re working with here. So, the mysteries we’re trying to solve in the microcosm may very well provide answers to the mysteries on the cosmological scale as well.”

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The Texas Advanced Computing Center (TACC) designs and operates some of the world’s most powerful computing resources. The center’s mission is to enable discoveries that advance science and society through the application of advanced computing technologies.

    TACC Maverick HP NVIDIA supercomputer

    TACC Lonestar Cray XC40 supercomputer

    Dell PowerEdge Stampede supercomputer at U Texas Austin’s Texas Advanced Computing Center, 9.6 PF.

    TACC HPE Apollo 8000 Hikari supercomputer

    TACC DELL EMC Stampede2 supercomputer

    TACC Frontera Dell EMC supercomputer fastest at any university

    U Texas Austin(US) campus

    The University of Texas at Austin (US) is a public research university in Austin, Texas and the flagship institution of the University of Texas System. Founded in 1883, the University of Texas was inducted into the Association of American Universities in 1929, becoming only the third university in the American South to be elected. The institution has the nation’s seventh-largest single-campus enrollment, with over 50,000 undergraduate and graduate students and over 24,000 faculty and staff.

    A Public Ivy, it is a major center for academic research. The university houses seven museums and seventeen libraries, including the LBJ Presidential Library and the Blanton Museum of Art, and operates various auxiliary research facilities, such as the J. J. Pickle Research Campus and the McDonald Observatory. As of November 2020, 13 Nobel Prize winners, four Pulitzer Prize winners, two Turing Award winners, two Fields medalists, two Wolf Prize winners, and two Abel prize winners have been affiliated with the school as alumni, faculty members or researchers. The university has also been affiliated with three Primetime Emmy Award winners, and has produced a total of 143 Olympic medalists.

    Student-athletes compete as the Texas Longhorns and are members of the Big 12 Conference. Its Longhorn Network is the only sports network featuring the college sports of a single university. The Longhorns have won four NCAA Division I National Football Championships, six NCAA Division I National Baseball Championships, thirteen NCAA Division I National Men’s Swimming and Diving Championships, and has claimed more titles in men’s and women’s sports than any other school in the Big 12 since the league was founded in 1996.


    The first mention of a public university in Texas can be traced to the 1827 constitution for the Mexican state of Coahuila y Tejas. Although Title 6, Article 217 of the Constitution promised to establish public education in the arts and sciences, no action was taken by the Mexican government. After Texas obtained its independence from Mexico in 1836, the Texas Congress adopted the Constitution of the Republic, which, under Section 5 of its General Provisions, stated “It shall be the duty of Congress, as soon as circumstances will permit, to provide, by law, a general system of education.”

    On April 18, 1838, “An Act to Establish the University of Texas” was referred to a special committee of the Texas Congress, but was not reported back for further action. On January 26, 1839, the Texas Congress agreed to set aside fifty leagues of land—approximately 288,000 acres (117,000 ha)—towards the establishment of a publicly funded university. In addition, 40 acres (16 ha) in the new capital of Austin were reserved and designated “College Hill”. (The term “Forty Acres” is colloquially used to refer to the University as a whole. The original 40 acres is the area from Guadalupe to Speedway and 21st Street to 24th Street.)

    In 1845, Texas was annexed into the United States. The state’s Constitution of 1845 failed to mention higher education. On February 11, 1858, the Seventh Texas Legislature approved O.B. 102, an act to establish the University of Texas, which set aside $100,000 in United States bonds toward construction of the state’s first publicly funded university (the $100,000 was an allocation from the $10 million the state received pursuant to the Compromise of 1850 and Texas’s relinquishing claims to lands outside its present boundaries). The legislature also designated land reserved for the encouragement of railroad construction toward the university’s endowment. On January 31, 1860, the state legislature, wanting to avoid raising taxes, passed an act authorizing the money set aside for the University of Texas to be used for frontier defense in west Texas to protect settlers from Indian attacks.

    Texas’s secession from the Union and the American Civil War delayed repayment of the borrowed monies. At the end of the Civil War in 1865, The University of Texas’s endowment was just over $16,000 in warrants and nothing substantive had been done to organize the university’s operations. This effort to establish a University was again mandated by Article 7, Section 10 of the Texas Constitution of 1876 which directed the legislature to “establish, organize and provide for the maintenance, support and direction of a university of the first class, to be located by a vote of the people of this State, and styled ‘The University of Texas’.”

    Additionally, Article 7, Section 11 of the 1876 Constitution established the Permanent University Fund, a sovereign wealth fund managed by the Board of Regents of the University of Texas and dedicated to the maintenance of the university. Because some state legislators perceived an extravagance in the construction of academic buildings of other universities, Article 7, Section 14 of the Constitution expressly prohibited the legislature from using the state’s general revenue to fund construction of university buildings. Funds for constructing university buildings had to come from the university’s endowment or from private gifts to the university, but the university’s operating expenses could come from the state’s general revenues.

    The 1876 Constitution also revoked the endowment of the railroad lands of the Act of 1858, but dedicated 1,000,000 acres (400,000 ha) of land, along with other property appropriated for the university, to the Permanent University Fund. This was greatly to the detriment of the university as the lands the Constitution of 1876 granted the university represented less than 5% of the value of the lands granted to the university under the Act of 1858 (the lands close to the railroads were quite valuable, while the lands granted the university were in far west Texas, distant from sources of transportation and water). The more valuable lands reverted to the fund to support general education in the state (the Special School Fund).

    On April 10, 1883, the legislature supplemented the Permanent University Fund with another 1,000,000 acres (400,000 ha) of land in west Texas granted to the Texas and Pacific Railroad but returned to the state as seemingly too worthless to even survey. The legislature additionally appropriated $256,272.57 to repay the funds taken from the university in 1860 to pay for frontier defense and for transfers to the state’s General Fund in 1861 and 1862. The 1883 grant of land increased the land in the Permanent University Fund to almost 2.2 million acres. Under the Act of 1858, the university was entitled to just over 1,000 acres (400 ha) of land for every mile of railroad built in the state. Had the 1876 Constitution not revoked the original 1858 grant of land, by 1883, the university lands would have totaled 3.2 million acres, so the 1883 grant was to restore lands taken from the university by the 1876 Constitution, not an act of munificence.

    On March 30, 1881, the legislature set forth the university’s structure and organization and called for an election to establish its location. By popular election on September 6, 1881, Austin (with 30,913 votes) was chosen as the site. Galveston, having come in second in the election (with 20,741 votes), was designated the location of the medical department (Houston was third with 12,586 votes). On November 17, 1882, on the original “College Hill,” an official ceremony commemorated the laying of the cornerstone of the Old Main building. University President Ashbel Smith, presiding over the ceremony, prophetically proclaimed “Texas holds embedded in its earth rocks and minerals which now lie idle because unknown, resources of incalculable industrial utility, of wealth and power. Smite the earth, smite the rocks with the rod of knowledge and fountains of unstinted wealth will gush forth.” The University of Texas officially opened its doors on September 15, 1883.

    Expansion and growth

    In 1890, George Washington Brackenridge donated $18,000 for the construction of a three-story brick mess hall known as Brackenridge Hall (affectionately known as “B.Hall”), one of the university’s most storied buildings and one that held an important place in university life until its demolition in 1952.

    The old Victorian-Gothic Main Building served as the central point of the campus’s 40-acre (16 ha) site, and was used for nearly all purposes. But by the 1930s, discussions arose about the need for new library space, and the Main Building was razed in 1934 over the objections of many students and faculty. The modern-day tower and Main Building were constructed in its place.

    In 1910, George Washington Brackenridge again displayed his philanthropy, this time donating 500 acres (200 ha) on the Colorado River to the university. A vote by the regents to move the campus to the donated land was met with outrage, and the land has only been used for auxiliary purposes such as graduate student housing. Part of the tract was sold in the late 1990s for luxury housing, and there are controversial proposals to sell the remainder of the tract. The Brackenridge Field Laboratory was established on 82 acres (33 ha) of the land in 1967.

    In 1916, Gov. James E. Ferguson became involved in a serious quarrel with the University of Texas. The controversy grew out of the board of regents’ refusal to remove certain faculty members whom the governor found objectionable. When Ferguson found he could not have his way, he vetoed practically the entire appropriation for the university. Without sufficient funding, the university would have been forced to close its doors. In the middle of the controversy, Ferguson’s critics brought to light a number of irregularities on the part of the governor. Eventually, the Texas House of Representatives prepared 21 charges against Ferguson, and the Senate convicted him on 10 of them, including misapplication of public funds and receiving $156,000 from an unnamed source. The Texas Senate removed Ferguson as governor and declared him ineligible to hold office.

    In 1921, the legislature appropriated $1.35 million for the purchase of land next to the main campus. However, expansion was hampered by the restriction against using state revenues to fund construction of university buildings as set forth in Article 7, Section 14 of the Constitution. With the completion of Santa Rita No. 1 well and the discovery of oil on university-owned lands in 1923, the university added significantly to its Permanent University Fund. The additional income from Permanent University Fund investments allowed for bond issues in 1931 and 1947, which enabled the legislature to address funding for the university along with the Agricultural and Mechanical College (now known as Texas A&M University). With sufficient funds to finance construction on both campuses, on April 8, 1931, the Forty-second Legislature passed H.B. 368, which dedicated to the Agricultural and Mechanical College a one-third interest in the Available University Fund, the annual income from Permanent University Fund investments.

    The University of Texas was inducted into the Association of American Universities in 1929. During World War II, the University of Texas was one of 131 colleges and universities nationally that took part in the V-12 Navy College Training Program which offered students a path to a Navy commission.

    In 1950, following Sweatt v. Painter, the University of Texas was the first major university in the South to accept an African-American student. John S. Chase went on to become the first licensed African-American architect in Texas.

    In the fall of 1956, the first black students entered the university’s undergraduate class. Black students were permitted to live in campus dorms, but were barred from campus cafeterias. The University of Texas integrated its facilities and desegregated its dorms in 1965. UT, which had had an open admissions policy, adopted standardized testing for admissions in the mid-1950s at least in part as a conscious strategy to minimize the number of Black undergraduates, given that they were no longer able to simply bar their entry after the Brown decision.

    Following growth in enrollment after World War II, the university unveiled an ambitious master plan in 1960 designed for “10 years of growth” that was intended to “boost the University of Texas into the ranks of the top state universities in the nation.” In 1965, the Texas Legislature granted the university Board of Regents the authority to use eminent domain to purchase additional properties surrounding the original 40 acres (160,000 m^2). The university began buying parcels of land to the north, south, and east of the existing campus, particularly in the Blackland neighborhood to the east and the Brackenridge tract to the southeast, in hopes of using the land to relocate the university’s intramural fields, baseball field, tennis courts, and parking lots.

    On March 6, 1967, the Sixtieth Texas Legislature changed the university’s official name from “The University of Texas” to “The University of Texas at Austin” to reflect the growth of the University of Texas System.

    Recent history

    The first presidential library on a university campus was dedicated on May 22, 1971, with former President Johnson, Lady Bird Johnson and then-President Richard Nixon in attendance. Constructed on the eastern side of the main campus, the Lyndon Baines Johnson Library and Museum is one of 13 presidential libraries administered by the National Archives and Records Administration.

    A statue of Martin Luther King Jr. was unveiled on campus in 1999 and subsequently vandalized. By 2004, John Butler, a professor at the McCombs School of Business, had suggested moving it to Morehouse College, a historically black college, “a place where he is loved”.

    The University of Texas at Austin has recently experienced a wave of new construction, adding several significant buildings. On April 30, 2006, the school opened the Blanton Museum of Art. In August 2008, the AT&T Executive Education and Conference Center opened, with the hotel and conference center forming part of a new gateway to the university. Also in 2008, Darrell K Royal-Texas Memorial Stadium was expanded to a seating capacity of 100,119, making it the largest stadium (by capacity) in the state of Texas at the time.

    On January 19, 2011, the university announced the creation of a 24-hour television network in partnership with ESPN, dubbed the Longhorn Network. ESPN agreed to pay a $300 million guaranteed rights fee over 20 years to the university and to IMG College, the school’s multimedia rights partner. The network covers the university’s intercollegiate athletics, music, cultural arts, and academics programs. The channel first aired in September 2011.

  • richardmitnick 5:19 pm on March 23, 2021 Permalink | Reply
    Tags: "Has the black hole information paradox evaporated?", Quantum Mechanics, The "Black Hole War" Stephen Hawking defeated by Leonard Susskind: information remains smeared on the black hole's Event Horizon.

    From Symmetry: “Has the black hole information paradox evaporated?” 

    Symmetry Mag
    From Symmetry

    Nathan Collins

    Researchers make progress on a vexing problem about how black holes evolve.

    Now iconic composite image of Centaurus A, a galaxy whose appearance is dominated by the large-scale jets emitted by the supermassive black hole at its center. [European Southern Observatory(EU)/Wide Field Imager (Optical);MPG Institute for Radio Astronomy [MPG Institut für Radioastronomie](DE)/ESO/ Atacama Pathfinder Experiment/A.Weiss et al. (Submillimetre); National Aeronautics Space Agency(USA)/NASA Chandra X-ray Space Telescope(US) /R.Kraft et al. (X-ray)]

    If there’s one misconception people have about black holes, it’s that nothing ever escapes them. As physicist Stephen Hawking and colleagues showed back in the 1970s, black holes actually emit a faint glow of light.

    There’s a funny consequence to this glow: It carries energy away from the black hole. Eventually this drip, drip, drip of radiation drains a black hole completely and causes it to disappear. All that remains is the light.

    In the 1970s, scientists’ calculations suggested that this light contained almost no information. Black holes seemed to be destroyers not just of the objects that sank into them but also of any information about what those objects had been in the first place.

    Black Hole War: Leonard Susskind wins over Stephen Hawking, finds information is not lost but remains smeared on the Event Horizon.

    The problem is: According to quantum mechanics, that’s impossible.

    A core tenet of quantum mechanics, the study of particle behavior on the subatomic level, is this: If you know the current state of any system, then you know everything there is to know about its past and its future.

    Somehow, black holes seemed to be destroying information that, according to quantum physics, cannot be destroyed. This problem, today known as the black hole information paradox, has befuddled physicists for decades.

    But over the last several years, theoretical physicists have identified key pieces that Hawking’s original calculation overlooked. Calculations completed in 2019 gave scientists insight into how that information might stick around.

    Those developments could mean more than just solving the information paradox—they could also provide clues that could help finally solve the mystery of how gravity works at the subatomic level, says Massachusetts Institute of Technology(US) physicist Netta Engelhardt, whose work with Institute for Advanced Study(US) physicist Ahmed Almheiri, along with similar work by University of California, Berkeley physicist Geoff Penington and colleagues, pointed the way toward the latest results. The research was supported in part by the US Department of Energy’s Office of Science.

    “This,” she says, “is where we need to look to understand quantum gravity better.”

    Building the Page curve

    In the 1990s, the first hint arrived that black holes might not be the information-destroyers they’d been made out to be.

    Physicist Don Page, a former student of Hawking’s, imagined a black hole that absorbed quantum-mechanical waves and then radiated them back out in a scrambled form. Unlike Hawking, he followed quantum theory in assuming that the combined system—of the black hole, the incoming waves and the outgoing radiation—was closed, so that whatever information was in the system to begin with would be preserved.

    In Page’s calculation, radiation both contains information and is correlated with what remains behind in the black hole—and therefore is also correlated with the radiation the black hole emits later on.

    Page’s key result is what’s now called the “Page curve”, which describes the amount of information connected to a black hole and its radiation. This curve increases slowly over time, reaching a maximum about halfway through the process—when all of the information that has emerged is as correlated as can be with all of the information that remains—and eventually declining back down to zero—when the black hole vanishes, and the pairing is no more.
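    The shape of the Page curve can be illustrated with a toy counting model (a sketch for intuition only, not the calculation Page or the researchers described here actually performed): treat the black hole as starting with n qubits of information, and take the entanglement entropy of the radiation after k qubits’ worth of radiation has been emitted to be roughly min(k, n − k) bits. The curve rises, peaks at the halfway point, and falls back to zero when the black hole is gone:

    ```python
    # Toy model of the Page curve. In the simplest random-subsystem
    # approximation, the entanglement entropy of the radiation after k of
    # n qubits have been emitted is about min(k, n - k) bits: it grows
    # while the radiation is the smaller subsystem, then shrinks once the
    # remaining black hole becomes the smaller one.

    def page_curve(n_qubits):
        """Approximate radiation entropy (in bits) at each emission step,
        for a black hole that starts with n_qubits of information."""
        return [min(k, n_qubits - k) for k in range(n_qubits + 1)]

    curve = page_curve(10)
    print(curve)  # rises to a peak at the halfway point, then returns to zero
    ```

    The key qualitative feature is the turnover: Hawking’s original calculation predicts an entropy that only grows, while the Page curve must come back down to zero for information to be preserved.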

    The Page curve tantalized physicists almost as much as the original information paradox. It showed that while physicists should still expect Hawking’s calculation to hold for quite a long time, it “has to go wrong eventually, and before you would expect it to,” based on other calculations, says Penington, the UC Berkeley physicist.

    Still, many physicists wondered whether Page could really be right. “The Hawking calculation seemed very robust,” Penington says. To get something like the Page curve in a real black hole, “it seemed like you’d need something very radical.”

    The replica trick

    For the last several decades, the challenge has been to calculate the Page curve in the full glory—or perhaps full brutality—of Einstein’s general theory of relativity, finally taking gravity into account.

    And now physicists have done just that.

    The result relies on the replica trick, a mathematical method for calculating entropy. In computing the entropy of a black hole and its radiation, physicists need to add up contributions from many different configurations of that system. That turns out to be practically impossible to do directly—if limited to one black hole.

    If, on the other hand, they consider two copies of a black hole, each existing in a separate universe, and the radiation that both emit, the entropy is relatively easy to compute, says Almheiri, the IAS physicist. From this calculation physicists can infer what information would exist between a single black hole and its radiation. Within the context of the replica trick, information in the interior of one black hole can flow into the interior of the other, and this information flow becomes more important over time.

    Back in the regular universe, the finding makes precise an idea that had been suggested a few times in the last decade: The entropy in the connection between Hawking radiation outside the black hole and what’s left inside can actually affect the interior structure of the black hole. As Almheiri puts it, “there’s an operation you can perform on the radiation outside that would create a cat inside.”

    Of course, no one actually knows what they would have to do to the radiation outside an actual, real-world black hole to make a cat inside it. Nor do the latest calculations reveal exactly what radiation black holes produce or how that’s connected to what falls into them in the first place.

    But most everyone agrees it is significant progress. To Almheiri, “the paradox is not as severe as it once was.”

    What’s more, examining exactly how the quantum calculations work in a wider range of circumstances could reveal something new about how a full quantum theory of gravity could work, Engelhardt says.

    And even if it doesn’t, the possibility of resolving the information paradox is enticing. When Engelhardt and her collaborators first started studying it, she says, “I barely slept, it was so exciting. For the first time in a really long time, we’re making progress.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.
