Tagged: Quantum Mechanics

  • richardmitnick 5:13 pm on February 3, 2023 Permalink | Reply
    Tags: Quantum Mechanics, "Entangled atoms across the Innsbruck quantum network", A promising platform for realizing future distributed networks of quantum computers and quantum sensors and atomic clocks., Trapped ions are one of the leading systems to build quantum computers and other quantum technologies., The nodes of this network were housed in two labs at the Campus Technik to the west of Innsbruck in Austria, Teams from the University of Innsbruck have entangled two ions over a distance of 230 meters., The two quantum systems were set up in two separate laboratories 230 metres apart.

    From The University of Innsbruck [Leopold-Franzens-Universität Innsbruck] (AT): “Entangled atoms across the Innsbruck quantum network”

    From The University of Innsbruck [Leopold-Franzens-Universität Innsbruck] (AT)

    2.3.23

    Trapped ions have previously only been entangled in one and the same laboratory. Now, teams led by Tracy Northup and Ben Lanyon from the University of Innsbruck have entangled two ions over a distance of 230 meters. The experiment shows that trapped ions are a promising platform for future quantum networks that span cities and eventually continents.

    The nodes of this network were housed in two labs at the Campus Technik to the west of Innsbruck, Austria.

    Trapped ions are one of the leading systems to build quantum computers and other quantum technologies. To link multiple such quantum systems, interfaces are needed through which the quantum information can be transmitted. In recent years, researchers led by Tracy Northup and Ben Lanyon at the University of Innsbruck’s Department of Experimental Physics have developed a method for doing this by trapping atoms in optical cavities such that quantum information can be efficiently transferred to light particles. The light particles can then be sent through optical fibers to connect atoms at different locations. Now, their teams, together with theorists led by Nicolas Sangouard of the Université Paris-Saclay, have for the first time entangled two trapped ions more than a few meters apart.

    Platform for building quantum networks

    The two quantum systems were set up in two laboratories, one in the building that houses the Department of Experimental Physics and one in the building that houses the Institute of Quantum Optics and Quantum Information of the Austrian Academy of Sciences. “Until now, trapped ions were only entangled with each other over a few meters in the same laboratory. Those results were also achieved using shared control systems and photons (light particles) with wavelengths that aren’t suitable for travelling over much longer distances,” Ben Lanyon explains. After years of research and development, the Innsbruck physicists have now managed to entangle two ions across campus. “To do this, we sent individual photons entangled with the ions over a 500-meter fiber optic cable and superimposed them on each other, swapping the entanglement to the two remote ions,” says Tracy Northup, describing the experiment. “Our results show that trapped ions are a promising platform for realizing future distributed networks of quantum computers and quantum sensors and atomic clocks.”
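
    As a rough sketch of the swapping step (standard textbook notation, mine rather than the paper's): each ion is first entangled with its own photon, and a joint Bell-state measurement on the two photons projects the remote ions into an entangled state, even though the ions never interacted.

    ```latex
    % Toy model of entanglement swapping (illustrative; not taken from the paper).
    % Each node starts with an ion-photon Bell pair:
    \[
    |\Psi\rangle =
    \tfrac{1}{\sqrt{2}}\bigl(|0\rangle_{I_1}|0\rangle_{P_1} + |1\rangle_{I_1}|1\rangle_{P_1}\bigr)
    \otimes
    \tfrac{1}{\sqrt{2}}\bigl(|0\rangle_{I_2}|0\rangle_{P_2} + |1\rangle_{I_2}|1\rangle_{P_2}\bigr)
    \]
    % Rewriting the two-photon part in the Bell basis and measuring it
    % (say, with outcome |\Phi^+\rangle_{P_1 P_2}) leaves the ions in
    \[
    |\Phi^+\rangle_{I_1 I_2} =
    \tfrac{1}{\sqrt{2}}\bigl(|0\rangle_{I_1}|0\rangle_{I_2} + |1\rangle_{I_1}|1\rangle_{I_2}\bigr),
    \]
    % an entangled state of two ions that never met. In the experiment, the
    % photons are superimposed (interfered) to implement this joint measurement.
    ```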

    Physical Review Letters

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The University of Innsbruck [Leopold-Franzens-Universität Innsbruck] (AT) is currently the largest education facility in the Austrian Bundesland of Tirol, the third largest in Austria behind University of Vienna [Universität Wien] (AT) and the University of Graz [Karl-Franzens-Universität Graz] (AT), and, according to The Times Higher Education Supplement World Ranking 2010, Austria’s leading university. Significant contributions have been made in many branches, most of all in the physics department. Further, regarding the number of Web of Science-listed publications, it occupies the third rank worldwide in the area of mountain research. In the Handelsblatt Ranking 2015, the business administration faculty ranks among the 15 best business administration faculties in German-speaking countries.

    History

    In 1562, a Jesuit grammar school was established in Innsbruck by Peter Canisius, today called “Akademisches Gymnasium Innsbruck”. It was financed by the salt mines in Hall in Tirol, and was refounded as a university in 1669 by Leopold I with four faculties. In 1782 this was reduced to a mere lyceum (as were all other universities in the Austrian Empire, apart from Prague, Vienna and Lviv), but it was reestablished as the University of Innsbruck in 1826 by Emperor Franz I. The university is therefore named after both of its founding fathers with the official title “Leopold-Franzens-Universität Innsbruck” (Universitas Leopoldino-Franciscea).

    In 2005, copies of letters written by the emperors Frederick II and Conrad IV were found in the university’s library. They arrived in Innsbruck in the 18th century, having left the charterhouse Allerengelberg in Schnals due to its abolishment.

     
  • richardmitnick 11:11 am on January 29, 2023 Permalink | Reply
    Tags: "How Quantum Physicists ‘Flipped Time’ (and How They Didn’t)", "Time’s arrow", , Before being measured a particle acts more like a wave., Physicists have coaxed particles of light into undergoing opposite transformations simultaneously like a human turning into a werewolf as the werewolf turns into a human., , , Quantum Mechanics, , The essence of quantum strangeness, The perplexing phenomenon could lead to new kinds of quantum technology.   

    From “Quanta Magazine”: “How Quantum Physicists ‘Flipped Time’ (and How They Didn’t)”

    From “Quanta Magazine”

    1.27.23
    Charlie Wood


    The quantum time flip circuit is like a metronome swinging both ways at once. Kristina Armitage/Quanta Magazine.

    Physicists have coaxed particles of light into undergoing opposite transformations simultaneously, like a human turning into a werewolf as the werewolf turns into a human. In carefully engineered circuits, the photons act as if time were flowing in a quantum combination of forward and backward.

    “For the first time ever, we kind of have a time-traveling machine going in both directions,” said Sonja Franke-Arnold, a quantum physicist at the University of Glasgow in Scotland who was not involved in the research.

    Regrettably for science fiction fans, the devices have nothing in common with a 1982 DeLorean. Throughout the experiments, which were conducted by two independent teams in China and Austria, laboratory clocks continued to tick steadily forward. Only the photons flitting through the circuitry experienced temporal shenanigans. And even for the photons, researchers debate whether the flipping of “time’s arrow” is real or simulated.

    Either way, the perplexing phenomenon could lead to new kinds of quantum technology.

    “You could conceive of circuits in which your information could flow both ways,” said Giulia Rubino, a researcher at the University of Bristol.

    Anything Anytime All at Once

    Physicists first realized a decade ago that the strange rules of quantum mechanics topple commonsense notions of time.

    The essence of quantum strangeness is this: When you look for a particle, you’ll always detect it in a single, pointlike location. But before being measured, a particle acts more like a wave; it has a “wave function” that spreads out and ripples over multiple routes. In this undetermined state, a particle exists in a quantum blend of possible locations known as a superposition.
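
    In standard notation (mine, not the article's), a particle that could be found at either of two places is written as a weighted sum of the two possibilities:

    ```latex
    % A particle in a superposition of two locations x_1 and x_2
    % (textbook notation; the weights need not be equal):
    \[
    |\psi\rangle = \alpha\,|x_1\rangle + \beta\,|x_2\rangle,
    \qquad |\alpha|^2 + |\beta|^2 = 1.
    \]
    % A measurement finds the particle at x_1 with probability |\alpha|^2
    % and at x_2 with probability |\beta|^2 -- never at both at once.
    ```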

    In a paper published in 2013, Giulio Chiribella, a physicist now at the University of Hong Kong, and co-authors proposed a circuit that would put events into a superposition of temporal orders, going a step beyond the superposition of locations in space. Four years later, Rubino and her colleagues directly experimentally demonstrated the idea [Science Advances (below)]. They sent a photon down a superposition of two paths: one in which it experienced event A and then event B, and another where it experienced B then A. In some sense, each event seemed to cause the other, a phenomenon that came to be called “indefinite causality”.
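
    That circuit is now usually called the quantum switch. Schematically (my notation), a control qubit routes the photon through the two operations in one order or the other, and placing the control in superposition makes the order itself indefinite:

    ```latex
    % The quantum switch (schematic). The control qubit |c> fixes the order:
    % c = 0 applies A then B; c = 1 applies B then A.
    \[
    |0\rangle_c\,|\psi\rangle \mapsto |0\rangle_c \, BA|\psi\rangle,
    \qquad
    |1\rangle_c\,|\psi\rangle \mapsto |1\rangle_c \, AB|\psi\rangle.
    \]
    % A superposed control yields a superposition of causal orders:
    \[
    \tfrac{1}{\sqrt{2}}\bigl(|0\rangle_c + |1\rangle_c\bigr)|\psi\rangle
    \mapsto
    \tfrac{1}{\sqrt{2}}\bigl(|0\rangle_c\, BA|\psi\rangle + |1\rangle_c\, AB|\psi\rangle\bigr).
    \]
    ```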

    Not content to mess merely with the order of events while time marched onward, Chiribella and a colleague, Zixuan Liu, next took aim at the marching direction, or arrow, of time itself. They sought a quantum apparatus in which time entered a superposition of flowing from the past to the future and vice versa — an indefinite arrow of time.

    To do this, Chiribella and Liu realized they needed a system that could undergo opposite changes, like a metronome whose arm can swing left or right. They imagined putting such a system in a superposition, akin to a musician simultaneously flicking a quantum metronome rightward and leftward. They described a scheme for setting up such a system in 2020.

    Optics wizards immediately started constructing dueling arrows of time in the lab. Last fall, two teams declared success.

    A Two-Timing Game

    Chiribella and Liu had devised a game at which only a quantum two-timer could excel. Playing the game with light involves firing photons through two crystal gadgets, A and B. Passing forward through a gadget rotates a photon’s polarization by an amount that depends on the gadget’s settings. Passing backward through the gadget rotates the polarization in precisely the opposite way.

    Before each round of the game, a referee secretly sets the gadgets in one of two ways: The path forward through A, then backward through B, will either shift a photon’s wave function relative to the time-reversed path (backward through A, then forward through B), or it won’t. The player must figure out which choice the referee made. After the player arranges the gadgets and other optical elements however they want, they send a photon through the maze, perhaps splitting it into a superposition of two paths using a half-silvered mirror. The photon ends up at one of two detectors. If the player has set up their maze in a sufficiently clever way, the click of the detector that has the photon will reveal the referee’s choice.

    When the player sets up the circuit so that the photon moves in only one direction through each gadget, then even if A and B are in an indefinite causal order, the detector’s click will match the secret gadget settings at most about 90% of the time. Only when the photon experiences a superposition that takes it forward and backward through both gadgets — a tactic dubbed the “quantum time flip” — can the player theoretically win every round.
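
    A toy numerical version of the idea (my own construction, far simpler than the published game): model the gadgets as polarization matrices A and B, treat a backward pass as the transpose, and let an interferometer superpose the route "forward through A, backward through B" with its time reverse. The two output ports then see the amplitudes (BᵀA ± AᵀB)/2, so settings in which A and B commute or anticommute produce a deterministic detector click:

    ```python
    import numpy as np

    # Pauli matrices standing in for the two polarization "gadgets".
    # The specific settings below (A = Z with B = Z or B = X) are
    # illustrative choices, not the ones used in the experiments.
    Z = np.array([[1, 0], [0, -1]], dtype=complex)
    X = np.array([[0, 1], [1, 0]], dtype=complex)

    def play_round(A, B, psi):
        """Superpose the two time directions through the gadgets.

        Path 1: forward through A, then backward through B -> applies B.T @ A.
        Path 2: the time-reversed route                    -> applies A.T @ B.
        Recombining the paths on a balanced beam splitter sends the
        amplitudes (B.T A +/- A.T B)/2 @ psi to the two detector ports.
        """
        plus = 0.5 * (B.T @ A + A.T @ B) @ psi
        minus = 0.5 * (B.T @ A - A.T @ B) @ psi
        return np.linalg.norm(plus) ** 2, np.linalg.norm(minus) ** 2

    psi = np.array([1, 0], dtype=complex)   # input polarization state

    print(play_round(Z, Z, psi))  # commuting gadgets     -> (1.0, 0.0)
    print(play_round(Z, X, psi))  # anticommuting gadgets -> (0.0, 1.0)
    ```

    A player restricted to a definite time direction through each gadget has no such deterministic test, which is the gap both experiments exploited.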

    Merrill Sherman/Quanta Magazine

    Last year, a team in Hefei, China, advised by Chiribella and one in Vienna advised by the physicist Časlav Brukner set up quantum time-flip circuits. Over 1 million rounds, the Vienna team guessed correctly 99.45% of the time. Chiribella’s group won 99.6% of its rounds. Both teams shattered the theoretical 90% limit, proving that their photons experienced a superposition of two opposing transformations and hence an indefinite arrow of time.

    Interpreting the Time Flip

    While the researchers have executed and named the quantum time flip, they’re not in perfect agreement regarding which words best capture what they’ve done.

    In Chiribella’s eyes, the experiments have simulated a flipping of time’s arrow. Actually flipping it would require arranging the fabric of space-time itself into a superposition of two geometries where time points in different directions. “Obviously, the experiment is not implementing the inversion of the arrow of time,” he said.

    Brukner, meanwhile, feels that the circuits take a modest step beyond simulation. He points out that the measurable properties of the photons change exactly as they would if they passed through a true superposition of two space-time geometries. And in the quantum world, there is no reality beyond what can be measured. “From the state itself, there is no difference between the simulation and the real thing,” he said.

    Granted, he admits, the circuit can time-flip only photons undergoing polarization changes; if space-time were truly in a superposition, dueling time directions would affect everything.

    Two-Arrow Circuits

    Whatever their philosophical inclinations, physicists hope that the ability to design quantum circuits that flow two ways at once might enable new devices for quantum computing, communication and metrology.

    “This allows you to do more things than just implementing the operations in one order or another,” said Cyril Branciard, a quantum information theorist at the Néel Institute in France.


    Some researchers speculate that the time-travel flavor of the quantum time flip might enable a future quantum “undo” function. Others anticipate that circuits operating in two directions at once could allow quantum machines to run more efficiently. “You could use this for games where you want to reduce the so-called query complexity,” Rubino said, referring to the number of steps it takes to carry out some task.

    Such practical applications are far from assured. While the time-flip circuits broke a theoretical performance limit in Chiribella and Liu’s guessing game, that was a highly contrived task dreamt up only to highlight their advantage over one-way circuits.

    But bizarre, seemingly niche quantum phenomena have a knack for proving useful. The eminent physicist Anton Zeilinger used to believe that quantum entanglement — a link between separated particles — wasn’t good for anything. Today, entanglement threads together nodes in nascent quantum networks and qubits in prototype quantum computers, and Zeilinger’s work on the phenomenon won him a share of the 2022 Nobel Prize in Physics. For the flippable nature of quantum time, Franke-Arnold said, “it’s very early days.”

    a paper published in 2013
    Science Advances 2017
    described a scheme 2022

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    Formerly known as Simons Science News, Quanta Magazine is an editorially independent online publication launched by the Simons Foundation to enhance public understanding of science. Why Quanta? Albert Einstein called photons “quanta of light.” Our goal is to “illuminate science.” At Quanta Magazine, scientific accuracy is every bit as important as telling a good story. All of our articles are meticulously researched, reported, edited, copy-edited and fact-checked.

     
  • richardmitnick 6:04 pm on January 28, 2023 Permalink | Reply
    Tags: Quantum Mechanics, "Danish quantum physicists make nanoscopic advance of colossal significance", Quantum mechanical entanglement, Two light sources can affect each other instantly and potentially across large geographic distances., Using photons as micro transporters to move quantum information about., Entanglement is the very basis of quantum networks and central to the development of an efficient quantum computer.

    From The Niels Bohr Institute [Niels Bohr Institutet] (DK): “Danish quantum physicists make nanoscopic advance of colossal significance” 


    From The Niels Bohr Institute [Niels Bohr Institutet] (DK)

    at

    University of Copenhagen [Københavns Universitet] [UCPH] (DK)

    1.26.23
    Peter Lodahl
    Professor
    Niels Bohr Institute
    University of Copenhagen
    Mobile: + 45 20 56 53 03
    Email: lodahl@nbi.ku.dk

    Alexey Tiranov
    Postdoc
    Niels Bohr Institute
    University of Copenhagen
    Phone: + 45 35 33 51 39
    Email: alexey.tiranov@nbi.ku.dk

    Michael Skov Jensen
    Journalist and team coordinator
    The Faculty of Science
    University of Copenhagen
    Mobile: + 45 93 56 58 97
    msj@science.ku.dk

    In a new breakthrough, researchers at the University of Copenhagen, in collaboration with Ruhr University Bochum, have solved a problem that has caused quantum researchers headaches for years. The researchers can now control two quantum light sources rather than one. Trivial as it may seem to those uninitiated in quantum, this colossal breakthrough allows researchers to create a phenomenon known as quantum mechanical entanglement. This, in turn, opens new doors for companies and others to exploit the technology commercially.

    Part of the team behind the invention. From left: Peter Lodahl, Anders Sørensen, Vasiliki Angelopoulou, Ying Wang, Alexey Tiranov, Cornelis van Diepen. Photo: Ola J. Joensen.

    Going from one to two is a minor feat in most contexts. But in the world of quantum physics, doing so is crucial. For years, researchers around the world have strived to develop stable quantum light sources and achieve the phenomenon known as quantum mechanical entanglement – a phenomenon, with nearly sci-fi-like properties, where two light sources can affect each other instantly and potentially across large geographic distances. Entanglement is the very basis of quantum networks and central to the development of an efficient quantum computer.
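
    The textbook picture of that instant correlation (standard notation, not from the press release) is a shared Bell state:

    ```latex
    % Two entangled photons, one from each source (textbook Bell state):
    \[
    |\Phi^+\rangle = \tfrac{1}{\sqrt{2}}\bigl(|0\rangle_1|0\rangle_2 + |1\rangle_1|1\rangle_2\bigr).
    \]
    % Measuring photon 1 and finding |0> leaves photon 2 in |0> as well,
    % however far apart the two are -- though no usable signal passes between them.
    ```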

    Today, researchers from the Niels Bohr Institute published a new result in the highly esteemed journal Science [below], in which they succeeded in doing just that. According to Professor Peter Lodahl, one of the researchers behind the result, it is a crucial step in the effort to take the development of quantum technology to the next level and to “quantize” society’s computers, encryption and the internet.

    “We can now control two quantum light sources and connect them to each other. It might not sound like much, but it’s a major advancement and builds upon the past 20 years of work. By doing so, we’ve revealed the key to scaling up the technology, which is crucial for the most ground-breaking of quantum hardware applications,” says Professor Peter Lodahl, who has conducted research in the area since 2001.

    The magic all happens in a so-called nanochip – which is not much larger than the diameter of a human hair – that the researchers also developed in recent years.

    Illustration of a chip comprising two entangled quantum light sources. Credit: NBI.

    Quantum sources overtake the world’s most powerful computer

    Peter Lodahl’s group is working with a type of quantum technology that uses light particles, called photons, as micro transporters to move quantum information about.

    While Lodahl’s group is a leader in this discipline of quantum physics, they have only been able to control one light source at a time until now. This is because light sources are extraordinarily sensitive to outside “noise”, making them very difficult to copy. In their new result, the research group succeeded in creating two identical quantum light sources rather than just one.

    “Entanglement means that by controlling one light source, you immediately affect the other. This makes it possible to create a whole network of entangled quantum light sources, all of which interact with one another, and which you can get to perform quantum bit operations in the same way as bits in a regular computer, only much more powerfully,” explains postdoc Alexey Tiranov, the article’s lead author.

    This is because a quantum bit can be both a 1 and 0 at the same time, which results in processing power that is unattainable using today’s computer technology. According to Professor Lodahl, just 100 photons emitted from a single quantum light source will contain more information than the world’s largest supercomputer can process.
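
    The scaling behind that claim is easy to check: a general n-photon (or n-qubit) state needs 2^n complex amplitudes, so a quick back-of-the-envelope script (assuming 16 bytes per amplitude) shows where classical bookkeeping gives out:

    ```python
    # Memory needed to store the full state vector of n qubits/photons,
    # assuming one 128-bit complex number (16 bytes) per amplitude.
    def state_vector_bytes(n: int) -> int:
        return 16 * 2 ** n

    for n in (20, 50, 100):
        b = state_vector_bytes(n)
        print(f"{n:3d} qubits: 2^{n} amplitudes, about {b:.3e} bytes")

    # 20 qubits fit in ~17 MB; 50 qubits already need ~18 petabytes;
    # 100 qubits would take ~2e31 bytes, far beyond any supercomputer's memory.
    ```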

    By using 20-30 entangled quantum light sources, there is the potential to build a universal error-corrected quantum computer – the ultimate “holy grail” of quantum technology, which large IT companies are now pumping many billions into.

    Other actors will build upon the research

    According to Lodahl, the biggest challenge has been to go from controlling one to two quantum light sources. Among other things, this has made it necessary for researchers to develop extremely quiet nanochips and have precise control over each light source.

    With the new research breakthrough, the fundamental quantum physics research is now in place. Now it is time for other actors to take the researchers’ work and use it in their quests to deploy quantum physics in a range of technologies including computers, the internet and encryption.

    “It is too expensive for a university to build a setup where we control 15-20 quantum light sources. So, now that we have contributed to understanding the fundamental quantum physics and taken the first step along the way, scaling up further is very much a technological task,” says Professor Lodahl.

    The research was conducted at the Danish National Research Foundation’s “Center of Excellence for Hybrid Quantum Networks (Hy-Q)” and is a collaboration between Ruhr University Bochum in Germany and the University of Copenhagen’s Niels Bohr Institute.

    Science

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.



    Stem Education Coalition



    The Niels Bohr Institutet (DK) is a research institute of the Københavns Universitet [UCPH] (DK). The research of the institute spans astronomy, geophysics, nanotechnology, particle physics, quantum mechanics and biophysics.

    The Institute was founded in 1921, as the Institute for Theoretical Physics of the Københavns Universitet [UCPH] (DK), by the Danish theoretical physicist Niels Bohr, who had been on the staff of the University of Copenhagen since 1914, and who had been lobbying for its creation since his appointment as professor in 1916. On the 80th anniversary of Niels Bohr’s birth – October 7, 1965 – the Institute officially became The Niels Bohr Institutet (DK). Much of its original funding came from the charitable foundation of the Carlsberg brewery, and later from the Rockefeller Foundation.

    During the 1920s, and 1930s, the Institute was the centre of the developing disciplines of atomic physics and quantum physics. Physicists from across Europe (and sometimes further abroad) often visited the Institute to confer with Bohr on new theories and discoveries. The Copenhagen interpretation of quantum mechanics is named after work done at the Institute during this time.

    On January 1, 1993, the institute was merged with the Astronomical Observatory, the Ørsted Laboratory and the Geophysical Institute. The resulting institute retained the name Niels Bohr Institutet (DK).

    Københavns Universitet (UCPH) (DK) is the oldest university and research institution in Denmark. Founded in 1479 as a studium generale, it is the second oldest institution for higher education in Scandinavia after Uppsala University (1477). The university has 23,473 undergraduate students, 17,398 postgraduate students, 2,968 doctoral students and over 9,000 employees. The university has four campuses located in and around Copenhagen, with the headquarters located in central Copenhagen. Most courses are taught in Danish; however, many courses are also offered in English and a few in German. The university has several thousand foreign students, about half of whom come from Nordic countries.

    The university is a member of the International Alliance of Research Universities (IARU), along with University of Cambridge (UK), Yale University, The Australian National University (AU), and University of California-Berkeley, amongst others. The 2016 Academic Ranking of World Universities ranks the University of Copenhagen as the best university in Scandinavia and 30th in the world, the 2016-2017 Times Higher Education World University Rankings as 120th in the world, and the 2016-2017 QS World University Rankings as 68th in the world. The university has had 9 alumni become Nobel laureates and has produced one Turing Award recipient.

     
  • richardmitnick 5:10 pm on January 28, 2023 Permalink | Reply
    Tags: "Requiem for a string - Charting the rise and fall of a theory of everything", , At the end of the 1970s string theory could potentially explain all the particles and all the interactions among them and provide a quantum solution to gravity., , By the early 1990s string theorists had developed not one and not two but five different versions of string theory., Early on physicists realized that the strings had to vibrate in more than three dimensions of space if they were to explain the full variety of forces and particles in the Universe., In the 1990s physicist Edward Witten declared a winner: "all of them"., It has been almost half a century since physicists first realized that string theory could potentially provide a theory of everything. String theory has not lived up to that potential., Once we determine whether our approximation schemes are valid all the five versions of string theory should converge on it and our Universe should pop out of the math., One theory to rule them all | one theory to find them | one theory to bring them all | and in the stringiness bind them., , Quantum field theory-which had been used successfully to explain electromagnetism and the weak nuclear force-wasn’t cutting it so physicists were eager for something new., Quantum Mechanics, So all the extra dimensions have to be tiny and curled up on themselves., String theory began over 50 years ago. Despite decades of work it has failed to deliver on its promise., String theory could build a bridge from the bosons to the fermions allowing it to leap from just a theory of forces to a theory of every single particle in existence., String theory started in the 1960s as an attempt to understand the workings of the strong nuclear force which had only recently been discovered., , Supersymmetry isn’t a single theory; it's a family of theories., The earliest versions of string theory needed 26 spatial dimensions but after supersymmetry and some dimensional layoffs theorists were able to slim that number down to “only” 10., The extra dimensions give the strings enough vibrational options to explain all of physics., The variety of shapes those dimensions can take as they curl up on themselves are known as Calabi-Yau manifolds., , Unlike its quantum cousins when it comes to string theory we have no fundamental theory. We have only a set of approximation and perturbation methods., We don’t know what M-theory is or even what the “M” stands for (my vote is “Manchego”)—but it should be the actual string theory., We have no reliable machinery that goes from a given Calabi-Yau manifold to the physics that appears in that universe., Which one of the zillions of potential Calabi-Yau structures corresponds to our reality?, Witten tied the five string theories into a single knot. This idea has yet to be mathematically proven but it indicates that the five string theories are really manifestations of a single unified “M, Wittens's work was almost 30 years ago and we still don’t know what M-theory is.   

    From “ars technica”: “Requiem for a string – Charting the rise and fall of a theory of everything”

    From “ars technica”

    1.27.23
    Paul Sutter

    Aurich Lawson | Getty Images

    String theory began over 50 years ago as a way to understand the strong nuclear force. Since then, it’s grown to become a “theory of everything”, capable of explaining the nature of every particle, every force, every fundamental constant, and the existence of the Universe itself. But despite decades of work it has failed to deliver on its promise.

    What went wrong, and where do we go from here?

    Beginning threads

    Like most revolutions, string theory had humble origins. It started in the 1960s as an attempt to understand the workings of the strong nuclear force, which had only recently been discovered. Quantum field theory, which had been used successfully to explain electromagnetism and the weak nuclear force, wasn’t cutting it, so physicists were eager for something new.

    A group of physicists took a mathematical technique developed (and later abandoned) by quantum godfather Werner Heisenberg and expanded it. In that expansion, they found the first strings—mathematical structures that repeated themselves in spacetime. Unfortunately, this proto-string theory made incorrect predictions about the nature of the strong force and also had a variety of troublesome artifacts (like the existence of tachyons, particles that only traveled faster than light). Once another theory was developed to explain the strong force—the one we use today, based on quarks and gluons—string theory faded from the scene.

    But again, like most revolutions, whispers remained through the years, keeping hopes alive. In the 1970s, physicists uncovered several remarkable properties of string theory. One, the theory could support more forces than just the strong nuclear force. The strings in string theory had enormous tension, forcing them to curl up on themselves into the smallest possible volume, something around the Planck scale. Once in place, the strings could support various vibrations, just like a taut guitar string. The different vibrations led to different manifestations of forces: one note for strong nuclear, another for electromagnetism, and so on.

    One of the possible vibrations of the string acted like a massless spin-2 particle. This is a very special particle: it would be the quantum force carrier of the gravitational force, the holy grail of a quantized theory of gravity. The theorists at the time couldn’t believe their chalkboards: String theory naturally, elegantly included “quantum gravity”, and they weren’t even trying!

    The second big deal to come out in the 1970s was the introduction of “supersymmetry”, which claimed that all the particles that carry forces (called bosons, a category that includes photons and gluons) were linked to a supersymmetric partner from the collection of particles that build stuff (called fermions, like electrons and quarks), and vice versa.

    This symmetry doesn’t appear in everyday settings; it only manifests at extremely high energies. So if you were to go back in time to the earliest moments of the Big Bang or had enough funding to build a particle collider along the orbit of Jupiter, you wouldn’t just see the normal zoo of particles we’re familiar with; you’d see all their supersymmetric partners, too. These were given suitably stupid names, like selectrons, sneutrinos, squarks, photinos, and my personal (least) favorite, the wino boson.

    By making this connection, string theory could build a bridge from the bosons to the fermions, allowing it to leap from just a theory of forces to a theory of every single particle in existence. The introduction of supersymmetry also solved the nasty problem of tachyons by replacing those troublesome particles with supersymmetric partners, which was a nice flourish.

    At the end of the 1970s, string theory could potentially explain all the particles and all the interactions among them and provide a quantum solution to gravity.

    One theory to rule them all | one theory to find them | one theory to bring them all | and in the stringiness bind them.

    A string perturbed

    It has been almost half a century since physicists first realized that string theory could potentially provide a theory of everything. Despite decades of work involving hundreds of scientists over several (academic) generations and countless papers, conferences, and workshops, string theory hasn’t lived up to that potential.

    One of the biggest issues involves the way that strings interact with each other. A major pain in the asymptote when it comes to quantum theory is the infinite variety of ways that particles can interact. It’s easy enough to write down the fundamental governing equations that describe an interaction, but the math tends to blow up when we actually try to use it. In string theory, fundamental particles aren’t particles at all; they’re tiny loops of vibrating… well, strings. When we see two particles bouncing off each other, for example, it’s really two strings briefly merging and then separating. That sounds super cool, but there are still an infinite number of ways that process can unfold.

    Unlike with its quantum cousins, when it comes to string theory we have no fundamental theory. We have only a set of approximation and perturbation methods. We’re not exactly sure if our approximations are good ones or if we’re way off the mark. We have perturbation techniques, but we’re not sure what we’re perturbing from. In other words, there’s no such thing as string theory, just approximations of what we hope string theory could be.

    The second major difficulty involves the vibrations of the strings themselves. Early on, physicists realized that the strings had to vibrate in more than three dimensions of space if they were to explain the full variety of forces and particles in the Universe. 3D was just too limiting; it constricted the number of potential vibrations so severely that it was no longer a theory of everything, just a theory of some things, which isn’t nearly as exciting.

    The earliest versions of string theory needed 26 dimensions, but after supersymmetry and some dimensional layoffs, theorists were able to slim that number down to “only” 10.

    Now, the Universe doesn’t have 10 spatial dimensions, at least on large scales, because we would have noticed them by now. So all the extra dimensions have to be tiny and curled up on themselves. When you wave your arm in front of you, you’re traversing these tiny dimensions countless times, but they’re so small (typically at the Planck scale) that you don’t notice them.

    2
    A 2D slice of a 6D Calabi-Yau quintic manifold. Credit:Andrew J. Hanson (CC BY-SA 3.0)

    The extra dimensions give the strings enough vibrational options to explain all of physics. And the shapes those dimensions can take as they curl up on themselves are known as Calabi-Yau manifolds. If you curl a piece of paper up on itself, you have a few choices: you can connect just one pair of edges (a cylinder) or both pairs (a delicious doughnut), you can introduce one flip (a Mobius strip) or two (a Klein bottle), and so on. That’s only two dimensions. With six, you have somewhere between 10^500 and 10^10,000 possible options.

    We care about all these possible shapes because the way the extra spatial dimensions curl up determines the possible set of vibrations of the strings—each shape produces a different set of string vibrations, like different musical instruments. A tuba sounds different from a saxophone because of the way it’s structured and the kind of vibrations it can support. But our Universe is only a single instrument (an oboe, perhaps) with a single set of “notes” that correspond to our suite of forces and particles.

    So which one of the zillions of potential Calabi-Yau structures corresponds to our reality? We don’t know. Because we don’t have a full accounting of string theory, only approximations, we don’t know how the shape of the curled-up dimensions affects the string vibrations. We have no reliable machinery that goes from a given Calabi-Yau manifold to the physics that appears in that universe, so we can’t run the reverse operation and use our unique experience of physics to discover the shape of the curled-up dimensions.

    Supersymmetry super-headaches

    It gets worse. By the early 1990s string theorists had developed not one and not two but five different versions of string theory. The variations were based on how a fundamental string was treated. In some versions, all strings had to form closed loops; in others, they could be open. In some, the vibrations could only travel in one direction; in others, they could travel both, and so on. For the curious (and those eager for edgy names for your kids), the five string theories are Type I, Type IIA, Type IIB, SO(32) heterotic, and E8xE8 heterotic.

    So now we have a slight embarrassment of riches. Five potential theories, all claiming to be the best approximation of the true string theory. That’s pretty awkward, but in the 1990s physicist Edward Witten declared a winner: all of them.

    He discovered dualities, which are mathematical relationships between theories that allow you to transform one to the other. In this case, Witten tied the five string theories into a single knot. This idea has yet to be mathematically proven, but it indicates that the five string theories are really manifestations of a single, unified-for-real-this-time string theory, which Witten called M-theory. We don’t know what M-theory is—or even what the “M” stands for (my vote is “Manchego”)—but it should be the actual string theory.

    That’s potentially very useful: once we determine whether our approximation schemes are valid, all five versions of string theory should converge on it, and our Universe should pop out of the math.

    Witten’s work was almost 30 years ago, and we still don’t know what M-theory is. We still haven’t figured out a solution for string theory.

    To be clear, our inability to understand string theory isn’t limited by experiment. Even if we could build a super-duper-collider experiment that achieved the energies necessary to unlock quantum gravity, we still wouldn’t be able to test string theory because we have no string theory. We have no mathematical model that can make reliable predictions, only approximations that we hope accurately represent the true physics. We can test those approximations, I guess, but it won’t help us determine the inner workings of the true model.

    Even so, the experiments we do have aren’t exactly helping. When supersymmetry was developed by the string theory community in the 1970s, it proved to be such a popular idea that many particle physicists took it as their own, using those techniques to develop models of high-energy physics beyond the Standard Model.

    Supersymmetry isn’t a single theory; it’s a family of theories. They all share the same core principle: that bosons and fermions are partners of each other at high enough energies. But the details of the interactions are left as a homework exercise for each individual theorist. Some supersymmetric theories are relatively (and that’s putting a lot of work on the word) straightforward, while others are more complex. Either way, in the 1990s, physicists became so convinced that supersymmetry was super-terrific that they devised a super-powerful collider to test it out: the Large Hadron Collider.

    The beams of the LHC began their first test operations in 2008 with two main science goals in mind: finding the elusive Higgs boson and finding evidence of supersymmetry.

    Four years later, the Higgs was found.

    Supersymmetry was not. It’s now 15 years later, and there are still no signs of supersymmetry.

    In fact, all the “easy” versions of supersymmetry have been ruled out, and many of the more complicated ones, too. The dearth of evidence has slaughtered so many members of the supersymmetric family that the whole idea is on very shaky ground, with physicists beginning to have conferences with titles like “Beyond Supersymmetry” and “Oh My God, I Think I Wasted My Career.”

    Where does that leave string theory? Well, since (and I’ll never stop reminding you of this) there is no string theory, only approximations, it’s not quite pining-for-the-fjords dead yet. It’s possible to build a version of string theory without using supersymmetry… maybe. The math gets even thornier and the approximations even sketchier, though. Without supersymmetry, string theory isn’t gone, but it’s certainly on life support.

    Duality of the fates

    After 50 years of work on a theory of everything, we’re left with approximate theories that seem so tantalizingly close to explaining all of physics… and yet always out of reach. Work continues on finding the underlying dualities that link the different versions of string theory, trying to suss out the mysterious M-theory that might underlie them all. Improvements to perturbation theory and approximation schemes provide some hope for making a breakthrough to link the dimensional structure of the extra dimensions to predictable physics. Routes around the damage caused by the LHC’s lack of evidence for supersymmetry continue to be laid.

    In response to our inability to choose which Calabi-Yau manifold corresponds to our Universe—and more importantly, why our Universe has that manifold rather than any of the other ones—some string theorists appeal to what you might call the landscape. They argue that all possible configurations of compact dimensions are realized, each one with its own unique universe and set of physical laws, and we happen to live in this one because life would be impossible in most or all of the others. That’s not the strongest argument to come out of physics, but I’ll save a dissection of the idea for another day.

    We don’t have a string theory, so we can’t test it. But it might be possible to perform experiments on string theory-adjacent ideas, and there’s been some progress on that front. Perhaps the event of inflation, which occurred immediately after the Big Bang, can teach us about string theory (or the formation of Universe-spanning cosmic strings). And perhaps there’s more to the dualities than we initially thought.

    More recently, theorists proposed another duality, the AdS/CFT correspondence.

    In theoretical physics, the anti-de Sitter/conformal field theory correspondence, sometimes called Maldacena duality or gauge/gravity duality, is a conjectured relationship between two kinds of physical theories. On one side are anti-de Sitter spaces (AdS) which are used in theories of quantum gravity, formulated in terms of string theory or M-theory. On the other side of the correspondence are conformal field theories (CFT) which are quantum field theories, including theories similar to the Yang–Mills theories that describe elementary particles.

    The duality represents a major advance in the understanding of string theory and quantum gravity. This is because it provides a non-perturbative formulation of string theory with certain boundary conditions and because it is the most successful realization of the holographic principle, an idea in quantum gravity originally proposed by Gerard ‘t Hooft and promoted by Leonard Susskind.

    It’s not exactly string theory, but the idea is certainly sponsored by it. This correspondence proposes that you can write down a string theory in a special three-dimensional setting and connect it to a special kind of quantum theory on its two-dimensional boundary. In principle, the correspondence should allow you to transform your impossible-to-solve string theory problem into a merely really-difficult-to-solve quantum problem (or vice versa, allowing you to use some of the mathematical tools developed in string theory to solve your thorny quantum problem).

    The AdS/CFT correspondence has found some limited applications, but its full utility remains unclear. And while the AdS/CFT correspondence has yet to be proven, theorists claim it should be possible soon (although they said the same thing about string theory itself during the Reagan administration).

    Most string theorists of the modern era don’t work on string theory directly but instead mostly on the AdS/CFT correspondence and its implications, hoping that continuing to probe that mathematical relationship will unlock some hidden insight into the workings of a theory of everything.

    I wish them luck.

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Ars Technica was founded in 1998 when Founder & Editor-in-Chief Ken Fisher announced his plans for starting a publication devoted to technology that would cater to what he called “alpha geeks”: technologists and IT professionals. Ken’s vision was to build a publication with a simple editorial mission: be “technically savvy, up-to-date, and more fun” than what was currently popular in the space. In the ensuing years, with formidable contributions by a unique editorial staff, Ars Technica became a trusted source for technology news, tech policy analysis, breakdowns of the latest scientific advancements, gadget reviews, software, hardware, and nearly everything else found in between layers of silicon.

    Ars Technica innovates by listening to its core readership. Readers have come to demand devotedness to accuracy and integrity, flanked by a willingness to leave each day’s meaningless, click-bait fodder by the wayside. The result is something unique: the unparalleled marriage of breadth and depth in technology journalism. By 2001, Ars Technica was regularly producing news reports, op-eds, and the like, but the company stood out from the competition by regularly providing long thought-pieces and in-depth explainers.

    And thanks to its readership, Ars Technica also accomplished a number of industry-leading moves. In 2001, Ars launched a digital subscription service when such things were non-existent for digital media. Ars was also the first IT publication to begin covering the resurgence of Apple, and the first to draw analytical and cultural ties between the world of high technology and gaming. Ars was also first to begin selling its long form content in digitally distributable forms, such as PDFs and eventually eBooks (again, starting in 2001).

     
  • richardmitnick 2:42 pm on January 24, 2023 Permalink | Reply
    Tags: "Quantum simulators": quantum devices tailored to solve specific problems, "Randomness in Quantum Machines Helps Verify Their Accuracy", A new method to verify the accuracy of quantum devices, A single microscopic error leads to a completely different macroscopic outcome-quite similar to the butterfly effect., , Demonstrating a novel way to measure a quantum device's accuracy, Looking for deviations in the patterns that indicate errors have been made, New error-detection method takes advantage of the way quantum information is scrambled., One challenge in using these quantum machines is that they are very prone to errors., , , Quantum Mechanics, Scientists don't want just a result from our quantum machines; they want a verified result., , The key to the new strategy is randomness.   

    From The California Institute of Technology: “Randomness in Quantum Machines Helps Verify Their Accuracy” 


    From The California Institute of Technology

    1.24.23
    Whitney Clavin
    (626) 395‑1944
    wclavin@caltech.edu

    New error-detection method takes advantage of the way quantum information is scrambled.

    Researchers have discovered that complex random behaviors naturally emerge from even the simplest, chaotic dynamics in a quantum simulator. This illustration zooms into one such complex set of states within an apparently smooth quantum system. Credit: Adam Shaw.

    In quantum computers and other experimental quantum systems, information spreads around the devices and quickly becomes scrambled like dice in a game of Boggle. This scrambling process happens as the basic units of the system, called qubits (like computer bits, only quantum), become entangled with one another; entanglement is a phenomenon in quantum physics where particles link up with each other and remain connected even though they are not in direct contact.

    These quantum devices mimic what happens in nature and allow scientists to develop new, exotic materials that are potentially useful in medicine, computer electronics, and other fields. While full-scale quantum computers are still years away, researchers are already performing experiments on so-called “quantum simulators”: quantum devices tailored to solve specific problems, such as efficiently simulating high-temperature superconductors and other quantum materials. The machines could also solve complex optimization problems, such as planning routes for autonomous vehicles to ensure they don’t collide.

    One challenge in using these quantum machines is that they are very prone to errors, much more so than classical computers. It is also much harder to identify errors in these newer systems. “For the most part, quantum computers make a lot of mistakes,” says Adam Shaw, a Caltech graduate student in physics and one of two lead authors of a study in the journal Nature about a new method to verify the accuracy of quantum devices. “You cannot open the machine and look inside, and there is a huge amount of information being stored—too much for a classical computer to account for and verify.”

    In the Nature study [below], Shaw and co-lead author Joonhee Choi, a former postdoctoral scholar at Caltech who is now a professor at Stanford University, demonstrate a novel way to measure a quantum device’s accuracy, also known as fidelity. Both researchers work in the laboratory of Manuel Endres, a professor of physics at Caltech and a Rosenberg scholar. The key to their new strategy is randomness. The scientists have discovered and characterized a newfound type of randomness pertaining to the way information is scrambled in the quantum systems. But even though the quantum behavior is random, universal statistical patterns can be identified in the noise.

    “We are interested in better understanding what happens when the information is scrambled,” Choi says. “And by analyzing this behavior with statistics, we can look for deviations in the patterns that indicate errors have been made.”

    “We don’t want just a result from our quantum machines; we want a verified result,” Endres says. “Because of quantum chaos, a single microscopic error leads to a completely different macroscopic outcome, quite similar to the butterfly effect. This enables us to detect the error efficiently.”

    The researchers demonstrated their protocol on a quantum simulator with as many as 25 qubits. To find whether errors have occurred, they measured the behavior of the system down to the single qubit level thousands of times. By looking at how qubits evolved over time, the researchers could identify patterns in the seemingly random behavior and then look for deviations from what they expected. Ultimately, by finding errors, researchers will know how and when to fix them.

    “We can trace how information moves across a system with single qubit resolution,” Choi says. “The reason we can do this is that we also discovered that this randomness, which just happens naturally, is represented at the level of just one qubit. You can see the universal random pattern in the subparts of the system.”

    Shaw compares their work to measuring the choppiness of waves on a lake. “If a wind comes, you’ll get peaks and troughs on the lake, and while it may look random, one could identify a pattern to the randomness and track how the wind affects the water. We would be able to tell if the wind changes by analyzing how the pattern changes. Our new method similarly allows us to look for changes in the quantum system that would indicate errors.”
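
    A minimal sketch of the statistical idea (this is generic linear cross-entropy benchmarking under a toy noise model, not the team's exact estimator): sample bitstrings from a noisy device, look up each outcome's probability under the ideal chaotic state, and the average reveals the fidelity, since errors push the samples toward featureless uniform noise:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 10                       # number of qubits in the toy simulator
    dim = 2 ** n

    # Ideal chaotic state: a Haar-random state vector. Its outcome
    # probabilities follow the exponential (Porter-Thomas) distribution.
    amps = rng.normal(size=dim) + 1j * rng.normal(size=dim)
    p_ideal = np.abs(amps) ** 2
    p_ideal /= p_ideal.sum()

    def sample_device(fidelity, shots):
        """Toy noisy device: with probability `fidelity` emit a bitstring from
        the ideal distribution, otherwise a uniformly random (depolarized) one."""
        ideal = rng.random(shots) < fidelity
        return np.where(ideal,
                        rng.choice(dim, size=shots, p=p_ideal),
                        rng.integers(dim, size=shots))

    def estimate_fidelity(samples):
        # Linear cross-entropy estimator: dim * <p_ideal(z)> - 1 is ~1 for
        # ideal sampling of a chaotic state and ~0 for uniform noise.
        return dim * p_ideal[samples].mean() - 1

    for true_f in (1.0, 0.8, 0.3):
        est = estimate_fidelity(sample_device(true_f, shots=200_000))
        print(f"true fidelity {true_f:.2f} -> estimated {est:.2f}")
    ```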

    The Nature study is funded by the National Science Foundation via the Institute for Quantum Information and Matter, or IQIM; the Defense Advanced Research Projects Agency (DARPA); the Army Research Office, the Eddleman Quantum Institute graduate fellowship; the Troesh postdoctoral fellowship; the Gordon and Betty Moore Foundation; the J. Yang & Family Foundation; the Harvard Quantum Initiative (HQI) graduate fellowship; the Junior Fellowship from the Harvard Society of Fellows; the Department of Energy; and the Miller Institute for Basic Research in Science at UC-Berkeley. Other authors include Ran Finkelstein, Hsin-Yuan Huang, and Fernando Brandão of Caltech; Ivaylo Madjarov, Xin Xie, and Jacob Covey, who performed the research while previously at Caltech; Jordan Cotler and Anant Kale of Harvard University; Daniel Mark and Soonwon Choi of MIT; and Hannes Pichler of University of Innsbruck in Austria.

    Nature
    See the science paper for instructive material with images.

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.




    Please help promote STEM in your local schools.

    Stem Education Coalition

    The California Institute of Technology is a private research university in Pasadena, California. The university is known for its strength in science and engineering, and is one among a small group of institutes of technology in the United States which is primarily devoted to the instruction of pure and applied sciences.

    The California Institute of Technology was founded as a preparatory and vocational school by Amos G. Throop in 1891 and began attracting influential scientists such as George Ellery Hale, Arthur Amos Noyes, and Robert Andrews Millikan in the early 20th century. The vocational and preparatory schools were disbanded and spun off in 1910 and the college assumed its present name in 1920. In 1934, The California Institute of Technology was elected to the Association of American Universities, and the antecedents of National Aeronautics and Space Administration ‘s Jet Propulsion Laboratory, which The California Institute of Technology continues to manage and operate, were established between 1936 and 1943 under Theodore von Kármán.

    The California Institute of Technology has six academic divisions with strong emphasis on science and engineering. Its 124-acre (50 ha) primary campus is located approximately 11 mi (18 km) northeast of downtown Los Angeles. First-year students are required to live on campus, and 95% of undergraduates remain in the on-campus House System at The California Institute of Technology. Although The California Institute of Technology has a strong tradition of practical jokes and pranks, student life is governed by an honor code which allows faculty to assign take-home examinations. The California Institute of Technology Beavers compete in 13 intercollegiate sports in the NCAA Division III’s Southern California Intercollegiate Athletic Conference (SCIAC).

    As of October 2020, there are 76 Nobel laureates who have been affiliated with The California Institute of Technology, including 40 alumni and faculty members (41 prizes, with chemist Linus Pauling being the only individual in history to win two unshared prizes). In addition, 4 Fields Medalists and 6 Turing Award winners have been affiliated with The California Institute of Technology. There are 8 Crafoord Laureates and 56 non-emeritus faculty members (as well as many emeritus faculty members) who have been elected to one of the United States National Academies. Four Chief Scientists of the U.S. Air Force have been affiliated with the Institute, and 71 affiliates have won the United States National Medal of Science or Technology. Numerous faculty members are associated with the Howard Hughes Medical Institute as well as National Aeronautics and Space Administration. According to a 2015 Pomona College study, The California Institute of Technology ranked number one in the U.S. for the percentage of its graduates who go on to earn a PhD.

    Research

    The California Institute of Technology is classified among “R1: Doctoral Universities – Very High Research Activity”. Caltech was elected to The Association of American Universities in 1934 and remains a research university with “very high” research activity, primarily in STEM fields. The largest federal agencies contributing to research are National Aeronautics and Space Administration; National Science Foundation; Department of Health and Human Services; Department of Defense, and Department of Energy.

    In 2005, The California Institute of Technology had 739,000 square feet (68,700 m^2) dedicated to research: 330,000 square feet (30,700 m^2) to physical sciences, 163,000 square feet (15,100 m^2) to engineering, and 160,000 square feet (14,900 m^2) to biological sciences.

In addition to managing NASA-JPL/Caltech, The California Institute of Technology also operates the Caltech Palomar Observatory; the Owens Valley Radio Observatory; the Caltech Submillimeter Observatory; the W. M. Keck Observatory at the Mauna Kea Observatory; the Laser Interferometer Gravitational-Wave Observatory at Livingston, Louisiana and Hanford, Washington; and the Kerckhoff Marine Laboratory in Corona del Mar, California. The Institute launched the Kavli Nanoscience Institute at The California Institute of Technology in 2006; the Keck Institute for Space Studies in 2008; and is also the current home for the Einstein Papers Project. The Spitzer Science Center, part of the Infrared Processing and Analysis Center located on The California Institute of Technology campus, is the data analysis and community support center for NASA’s Spitzer Infrared Space Telescope [no longer in service].


    The California Institute of Technology partnered with University of California at Los Angeles to establish a Joint Center for Translational Medicine (UCLA-Caltech JCTM), which conducts experimental research into clinical applications, including the diagnosis and treatment of diseases such as cancer.

    The California Institute of Technology operates several Total Carbon Column Observing Network stations as part of an international collaborative effort of measuring greenhouse gases globally. One station is on campus.

     
  • richardmitnick 3:04 pm on January 22, 2023 Permalink | Reply
    Tags: "Why This Universe? Maybe It’s Not Special—Just Probable", "Wick rotation", Boyle and Turok believe the equation conducts a census of all conceivable cosmic histories., coauthored a new calculation about the relative likelihoods of different universes., Cosmologists have spent decades striving to understand why our universe is so stunningly vanilla., , If the Wick rotation would work for more than just black holes it’s irresistible to do the same with the cosmological properties of the whole universe.”, Latham Boyle-a physicist and cosmologist at the Perimeter Institute for Theoretical Physics, Neil Turok of the University of Edinburgh and Latham Boyle of the Perimeter Institute, , Quantum Mechanics, The properties of our universe — smooth and flat-just a pinch of dark energy-are what we should expect to see according to a new calculation., The provocative conclusion rests on a mathematical trick involving switching to a clock that ticks with "imaginary numbers"., , Two physicists find that our universe has a higher entropy—and is therefore more likely—than alternative possible universes.,   

    From “Quanta Magazine” : “Why This Universe? Maybe It’s Not Special—Just Probable” 

    From “Quanta Magazine”

    1.22.23
    Charlie Wood

    Two physicists find that our universe has a higher entropy—and is therefore more likely—than alternative possible universes.

    The properties of our universe — smooth and flat-just a pinch of dark energy-are what we should expect to see according to a new calculation. Illustration: Kouzou Sakai/Quanta Magazine.

    Cosmologists have spent decades striving to understand why our universe is so stunningly vanilla. Not only is it smooth and flat as far as we can see, but it’s also expanding at an ever-so-slowly increasing pace, when naive calculations suggest that—coming out of the Big Bang—space should have become crumpled up by gravity and blasted apart by repulsive dark energy.

    To explain the cosmos’s flatness, physicists have added a dramatic opening chapter to cosmic history: They propose that space rapidly inflated like a balloon at the start of the Big Bang, ironing out any curvature.

    ___________________________________________________________________
    Inflation

In physical cosmology, cosmic inflation (also called cosmological inflation, or simply inflation) is a theory of exponential expansion of space in the early universe. The inflationary epoch lasted from 10^−36 seconds after the conjectured Big Bang singularity to some time between 10^−33 and 10^−32 seconds after the singularity. Following the inflationary period, the universe continued to expand, but at a slower rate. The acceleration of this expansion due to dark energy began after the universe was already over 7.7 billion years old (5.4 billion years ago).

Inflation theory was developed in the late 1970s and early 1980s, with notable contributions by several theoretical physicists, including Alexei Starobinsky at the Landau Institute for Theoretical Physics, Alan Guth at Cornell University, and Andrei Linde at the Lebedev Physical Institute. Starobinsky, Guth, and Linde won the 2014 Kavli Prize “for pioneering the theory of cosmic inflation.” The theory explains the origin of the large-scale structure of the cosmos: quantum fluctuations in the microscopic inflationary region, magnified to cosmic size, become the seeds for the growth of structure in the Universe. Many physicists also believe that inflation explains why the universe appears to be the same in all directions (isotropic), why the cosmic microwave background radiation is distributed evenly, why the universe is flat, and why no magnetic monopoles have been observed.

    The detailed particle physics mechanism responsible for inflation is unknown. The basic inflationary paradigm is accepted by most physicists, as a number of inflation model predictions have been confirmed by observation; however, a substantial minority of scientists dissent from this position. The hypothetical field thought to be responsible for inflation is called the inflaton.

    In 2002 three of the original architects of the theory were recognized for their major contributions; physicists Alan Guth of M.I.T., Andrei Linde of Stanford, and Paul Steinhardt of Princeton shared the prestigious Dirac Prize “for development of the concept of inflation in cosmology”. In 2012 Guth and Linde were awarded the Breakthrough Prize in Fundamental Physics for their invention and development of inflationary cosmology.

    Alan Guth, from M.I.T., who first proposed Cosmic Inflation.

    Alan Guth’s notes:
    Alan Guth’s original notes on inflation.
    ___________________________________________________________________
    And to explain the gentle growth of space following that initial spell of inflation, some have argued that our universe is just one among many less hospitable universes in a giant multiverse.

    Multiverse. Image credit: public domain, retrieved from https://pixabay.com/

    But now two physicists have turned the conventional thinking about our vanilla universe on its head. Following a line of research started by Stephen Hawking and Gary Gibbons in 1977, the duo has published a new calculation suggesting that the plainness of the cosmos is expected, rather than rare. Our universe is the way it is, according to Neil Turok of the University of Edinburgh and Latham Boyle of the Perimeter Institute for Theoretical Physics in Waterloo, Canada, for the same reason that air spreads evenly throughout a room: Weirder options are conceivable but exceedingly improbable.

    The universe “may seem extremely fine-tuned, extremely unlikely, but [they’re] saying, ‘Wait a minute, it’s the favored one,’” said Thomas Hertog, a cosmologist at the Catholic University of Leuven in Belgium.

    “It’s a novel contribution that uses different methods compared to what most people have been doing,” said Steffen Gielen, a cosmologist at the University of Sheffield in the United Kingdom.

    The provocative conclusion rests on a mathematical trick involving switching to a clock that ticks with “imaginary numbers”. Using the imaginary clock, as Hawking did in the ’70s, Turok and Boyle could calculate a quantity, known as entropy, that appears to correspond to our universe. But the imaginary time trick is a roundabout way of calculating entropy, and without a more rigorous method, the meaning of the quantity remains hotly debated. While physicists puzzle over the correct interpretation of the entropy calculation, many view it as a new guidepost on the road to the fundamental, quantum nature of space and time.

    “Somehow,” Gielen said, “it’s giving us a window into perhaps seeing the microstructure of space-time.”

    Imaginary Paths

    Turok and Boyle, frequent collaborators, are renowned for devising creative and unorthodox ideas about cosmology. Last year, to study how likely our universe may be, they turned to a technique developed in the ’40s by the physicist Richard Feynman.

    Aiming to capture the probabilistic behavior of particles, Feynman imagined that a particle explores all possible routes linking start to finish: a straight line, a curve, a loop, ad infinitum. He devised a way to give each path a number related to its likelihood and add all the numbers up. This “path integral” technique became a powerful framework for predicting how any quantum system would most likely behave.
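    To make the recipe concrete, here is a minimal numerical sketch (our illustration, not code from the article) of such a sum over paths: a free particle’s discretized kinetic action is evaluated on randomly sampled paths pinned at fixed endpoints, and the complex weights exp(iS) are averaged. Units are chosen so that hbar and the particle mass equal 1, and the path sampling is deliberately crude.

    import numpy as np

    rng = np.random.default_rng(0)
    n_steps, n_paths, dt = 20, 100_000, 0.1
    x_start, x_end = 0.0, 1.0

    total_amplitude = 0.0 + 0.0j
    for _ in range(n_paths):
        # A random path pinned at both endpoints (a crude sample of "all routes").
        mid = rng.normal(0.0, 0.5, n_steps - 1)
        path = np.concatenate(([x_start], mid, [x_end]))
        # Discretized free-particle action: S = sum of (1/2) * (dx/dt)^2 * dt
        velocities = np.diff(path) / dt
        action = 0.5 * np.sum(velocities**2) * dt
        # Feynman's rule: each path contributes a complex phase exp(i*S).
        total_amplitude += np.exp(1j * action)

    print(total_amplitude / n_paths)

    Paths far from the classical straight line carry rapidly varying phases that largely cancel, while paths near it add up coherently, foreshadowing the cancellation of complex weights described below.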

    As soon as Feynman started publicizing the path integral, physicists spotted a curious connection with thermodynamics, the venerable science of temperature and energy. It was this bridge between quantum theory and thermodynamics that enabled Turok and Boyle’s calculation.

    The South African physicist and cosmologist Neil Turok is a professor at the University of Edinburgh.Photograph: Gabriela Secara/Perimeter Institute.

    Thermodynamics leverages the power of statistics so that you can use just a few numbers to describe a system of many parts, such as the gajillion air molecules rattling around in a room. Temperature, for instance—essentially the average speed of air molecules—gives a rough sense of the room’s energy. Overall properties like temperature and pressure describe a “macrostate” of the room.

    But a macrostate is a crude account; air molecules can be arranged in a tremendous number of ways that all correspond to the same macrostate. Nudge one oxygen atom a bit to the left, and the temperature won’t budge. Each unique microscopic configuration is known as a microstate, and the number of microstates corresponding to a given macrostate determines its entropy.

    Entropy gives physicists a sharp way of comparing the odds of different outcomes: The higher the entropy of a macrostate, the more likely it is. There are vastly more ways for air molecules to arrange themselves throughout the whole room than if they’re bunched up in a corner, for instance. As a result, one expects air molecules to spread out (and stay spread out). The self-evident truth that probable outcomes are probable, couched in the language of physics, becomes the famous second law of thermodynamics: that the total entropy of a system tends to grow.
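    The counting behind that statement is easy to reproduce. The toy script below (our illustration, with an unrealistically small number of molecules) treats each of N molecules as sitting in either the left or right half of a room; the macrostate “k molecules on the left” has C(N, k) microstates, and its entropy is the logarithm of that count.

    import math

    N = 100  # number of molecules; a real room holds ~10^27

    def entropy(k):
        # Boltzmann entropy (in units of k_B) of the macrostate with k molecules
        # in the left half: S = ln W, where W = C(N, k) counts its microstates.
        return math.log(math.comb(N, k))

    print("S(evenly spread, k = 50):", entropy(50))     # about 66.8
    print("S(all on one side, k = 100):", entropy(100)) # ln(1) = 0

    The evenly spread macrostate has e^66.8, roughly 10^29, times more microstates than the bunched-up one, so it is overwhelmingly more probable: the second law in miniature.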

    The resemblance to the path integral was unmistakable: In thermodynamics, you add up all possible configurations of a system. And with the path integral, you add up all possible paths a system can take. There’s just one rather glaring distinction: Thermodynamics deals in probabilities, which are positive numbers that straightforwardly add together. But in the path integral, the number assigned to each path is complex, meaning that it involves the imaginary number i, the square root of −1. Complex numbers can grow or shrink when added together—allowing them to capture the wavelike nature of quantum particles, which can combine or cancel out.

    Yet physicists found that a simple transformation can take you from one realm to the other. Make time imaginary (a move known as a “Wick rotation” after the Italian physicist Gian Carlo Wick), and a second i enters the path integral that snuffs out the first one, turning imaginary numbers into real probabilities. Replace the time variable with the inverse of temperature, and you get a well-known thermodynamic equation.
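    In symbols, the trick looks like this (a standard textbook schematic, not notation from Turok and Boyle’s paper): the quantum phase factor for a state of energy E evolving for time t is e^{-iEt/\hbar}; substituting t = -i\tau gives e^{-E\tau/\hbar}, and setting \tau = \hbar/(k_B T) turns it into the Boltzmann weight e^{-E/(k_B T)} of thermodynamics. The same substitution converts the path integral \int \mathcal{D}x\, e^{iS/\hbar} into a thermal partition function \int \mathcal{D}x\, e^{-S_E/\hbar}, where S_E is the Euclidean (imaginary-time) action.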

    This Wick trick led to a blockbuster finding by Hawking and Gibbons in 1977, at the end of a whirlwind series of theoretical discoveries about space and time.

    The Entropy of Space-Time

    Decades earlier, Albert Einstein’s General Theory of Relativity had revealed that space and time together form a unified fabric of reality—space-time—and that the force of gravity is really the tendency for objects to follow the folds in space-time. In extreme circumstances, space-time can curve steeply enough to create an inescapable Alcatraz known as a black hole.

    In 1973, Jacob Bekenstein advanced the heresy [Physical Review D (below)] that black holes are imperfect cosmic prisons. He reasoned that the abysses should absorb the entropy of their meals, rather than deleting that entropy from the universe and violating the second law of thermodynamics. But if black holes have entropy, they must also have temperatures and must radiate heat.

    A skeptical Stephen Hawking tried to prove Bekenstein wrong, embarking on an intricate calculation of how quantum particles behave in the curved space-time of a black hole. To his surprise, in 1974 he found that black holes do indeed radiate. Another calculation confirmed Bekenstein’s guess: A black hole has entropy equal to one-quarter the area of its event horizon—the point of no return for an infalling object.
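    Plugging numbers into that area law is straightforward. The short script below (our illustration using the standard published formulas, not a calculation from the article) evaluates the Bekenstein-Hawking entropy, one quarter of the horizon area measured in Planck areas, together with the Hawking temperature, for a black hole of one solar mass.

    import math

    G    = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
    c    = 2.998e8     # speed of light, m/s
    hbar = 1.055e-34   # reduced Planck constant, J*s
    k_B  = 1.381e-23   # Boltzmann constant, J/K
    M    = 1.989e30    # one solar mass, kg

    r_s = 2 * G * M / c**2          # Schwarzschild radius, about 3 km
    area = 4 * math.pi * r_s**2     # event-horizon area
    planck_area = hbar * G / c**3   # Planck length squared, ~2.6e-70 m^2

    S_over_kB = area / (4 * planck_area)                   # entropy in units of k_B
    T_hawking = hbar * c**3 / (8 * math.pi * G * M * k_B)  # Hawking temperature

    print(f"Horizon entropy S/k_B ~ {S_over_kB:.2e}")   # ~1e77
    print(f"Hawking temperature  ~ {T_hawking:.2e} K")  # ~6e-8 K, far colder than the CMB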

    In the years that followed, the British physicists Malcolm Perry and Gibbons, and later Gibbons and Hawking, arrived at the same result from another direction. They set up a path integral, in principle adding up all the different ways space-time might bend to make a black hole. Next, they Wick-rotated the black hole, marking the flow of time with imaginary numbers, and scrutinized its shape. They discovered that in the imaginary time direction the black hole periodically returned to its initial state. This Groundhog Day-like repetition in imaginary time gave the black hole a sort of stasis that allowed them to calculate its temperature and entropy.

    They might not have trusted the results if the answers had not precisely matched those calculated earlier by Bekenstein and Hawking. By the end of the decade, their collective work had yielded a startling notion: The entropy of black holes implied that space-time itself is made of tiny, rearrangeable pieces, much as air is made of molecules. And miraculously, even without knowing what these “gravitational atoms” were, physicists could count their arrangements by looking at a black hole in imaginary time.

    “It’s that result which left a deep, deep impression on Hawking,” said Hertog, Hawking’s former graduate student and longtime collaborator. Hawking immediately wondered if the Wick rotation would work for more than just black holes. “If that geometry captures a quantum property of a black hole,” Hertog said, “then it’s irresistible to do the same with the cosmological properties of the whole universe.”

    Counting All Possible Universes

    Right away, Hawking and Gibbons Wick-rotated one of the simplest imaginable universes—one containing nothing but the dark energy built into space itself. This empty, expanding universe, called a “de Sitter” space-time, has a horizon, beyond which space expands so quickly that no signal from there will ever reach an observer in the center of the space. In 1977, Gibbons and Hawking calculated that, like a black hole, a de Sitter universe also has an entropy equal to one-fourth its horizon’s area. Again, space-time seemed to have a countable number of microstates.
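    The same quarter-area rule gives a famous estimate when applied to the cosmic horizon. As a rough illustration (ours, and approximate: it treats today’s universe as exactly de Sitter, which it is not), using the present Hubble rate yields an entropy of order 10^122 in units of Boltzmann’s constant.

    import math

    c, hbar, G = 2.998e8, 1.055e-34, 6.674e-11
    H0 = 2.27e-18                  # Hubble rate, ~70 km/s/Mpc expressed in 1/s

    r_horizon = c / H0             # de Sitter horizon radius, ~1.3e26 m
    area = 4 * math.pi * r_horizon**2
    planck_area = hbar * G / c**3

    print(f"S/k_B ~ {area / (4 * planck_area):.1e}")   # ~1e122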

    But the entropy of the actual universe remained an open question. Our universe is not empty; it brims with radiating light and streams of galaxies and dark matter. Light drove a brisk expansion of space during the universe’s youth, then the gravitational attraction of matter slowed things to a crawl during cosmic adolescence. Now dark energy appears to have taken over, driving a runaway expansion. “That expansion history is a bumpy ride,” Hertog said. “To get an explicit solution is not so easy.”

    Over the past year or so, Boyle and Turok have built just such an explicit solution. First, in January, while playing with toy cosmologies, they noticed that adding radiation to de Sitter space-time didn’t spoil the simplicity required to Wick-rotate the universe.

    Then over the summer they discovered that the technique would withstand even the messy inclusion of matter. The mathematical curve describing the more complicated expansion history still fell into a particular group of easy-to-handle functions, and the world of thermodynamics remained accessible. “This Wick rotation is murky business when you move away from very symmetric space-time,” said Guilherme Leite Pimentel, a cosmologist at the Scuola Normale Superiore in Pisa, Italy. “But they managed to find it.”

    By Wick-rotating the roller-coaster expansion history of a more realistic class of universes, they got a more versatile equation for cosmic entropy. For a wide range of cosmic macrostates defined by radiation, matter, curvature, and a dark energy density (much as a range of temperatures and pressures define different possible environments of a room), the formula spits out the number of corresponding microstates. Turok and Boyle posted their results online in early October.

    Latham Boyle, a physicist and cosmologist at the Perimeter Institute for Theoretical Physics, coauthored a new calculation about the relative likelihoods of different universes. Photograph: Gabriela Secara/Perimeter Institute.

    Experts have praised the explicit, quantitative result. But from their entropy equation, Boyle and Turok have drawn an unconventional conclusion about the nature of our universe. “That’s where it becomes a little more interesting, and a little more controversial,” Hertog said.

    Boyle and Turok believe the equation conducts a census of all conceivable cosmic histories. Just as a room’s entropy counts all the ways of arranging the air molecules for a given temperature, they suspect their entropy counts all the ways one might jumble up the atoms of space-time and still end up with a universe with a given overall history, curvature, and dark energy density.

    Boyle likens the process to surveying a gigantic sack of marbles, each a different universe. Those with negative curvature might be green. Those with tons of dark energy might be cat’s-eyes, and so on. Their census reveals that the overwhelming majority of the marbles have just one color—blue, say—corresponding to one type of universe: one broadly like our own, with no appreciable curvature and just a touch of dark energy. Weirder types of cosmos are vanishingly rare. In other words, the strangely vanilla features of our universe that have motivated decades of theorizing about cosmic inflation and the multiverse may not be strange at all.

    Counting Confusion

    Boyle and Turok have derived an equation that counts universes. And they’ve made the striking observation that universes like ours seem to account for the lion’s share of the conceivable cosmic options. But that’s where the certainty ends.

    The duo make no attempt to explain what quantum theory of gravity and cosmology might make certain universes common or rare. Nor do they explain how our universe, with its particular configuration of microscopic parts, came into being. Ultimately, they view their calculation as more of a clue to which sorts of universes are preferred than anything close to a full theory of cosmology. “What we’ve used is a cheap trick to get the answer without knowing what the theory is,” Turok said.

    Their work also revitalizes a question that has gone unanswered since Gibbons and Hawking first kicked off the whole business of space-time entropy: What exactly are the microstates that the cheap trick is counting?

    “The key thing here is to say that we don’t know what that entropy means,” said Henry Maxfield, a physicist at Stanford University who studies quantum theories of gravity.

    At its heart, entropy encapsulates ignorance. For a gas made of molecules, for instance, physicists know the temperature—the average speed of particles—but not what every particle is doing; the gas’s entropy reflects the number of options.

    After decades of theoretical work, physicists are converging on a similar picture for black holes. Many theorists now believe that the area of the horizon describes their ignorance of the stuff that’s fallen in—all the ways of internally arranging the building blocks of the black hole to match its outward appearance. (Researchers still don’t know what the microstates actually are; ideas include configurations of the particles called gravitons or the strings of string theory.)

    A recent calculation by Ted Jacobson, top, and Batoul Banihashemi of the University of Maryland offers a possible interpretation of the entropy of de Sitter space. Courtesy of Ted Jacobson; Courtesy of Batoul Banihashemi.

    But when it comes to the entropy of the universe, physicists feel less certain about where their ignorance even lies.

    In April, two theorists attempted to put cosmological entropy on a firmer mathematical footing. Ted Jacobson, a physicist at the University of Maryland renowned for deriving Einstein’s theory of gravity from black hole thermodynamics, and his graduate student Batoul Banihashemi explicitly defined the entropy of a (vacant, expanding) de Sitter universe. They adopted the perspective of an observer at the center. Their technique, which involved adding a fictitious surface between the central observer and the horizon, then shrinking the surface until it reached the central observer and disappeared, recovered the Gibbons and Hawking answer that entropy equals one-quarter of the horizon area. They concluded that the de Sitter entropy counts all possible microstates inside the horizon.

    Turok and Boyle calculate the same entropy as Jacobson and Banihashemi for an empty universe. But in their new calculation pertaining to a realistic universe filled with matter and radiation, they get a much larger number of microstates—proportional to volume and not area. Faced with this apparent clash, they speculate that the different entropies answer different questions: The smaller de Sitter entropy counts microstates of pure space-time bounded by a horizon, while they suspect their larger entropy counts all the microstates of a space-time filled with matter and energy, both inside and outside the horizon. “It’s the whole shebang,” Turok said.

    Ultimately, settling the question of what Boyle and Turok are counting will require a more explicit mathematical definition of the ensemble of microstates, analogous to what Jacobson and Banihashemi have done for de Sitter space. Banihashemi said she views Boyle and Turok’s entropy calculation “as an answer to a question that is yet to be fully understood.”

    As for more established answers to the question “Why this universe?” cosmologists say inflation and the multiverse are far from dead. Modern inflation theory, in particular, has come to solve more than just the universe’s smoothness and flatness. Observations of the sky match many of its other predictions. Turok and Boyle’s entropic argument has passed a notable first test, Pimentel said, but it will have to nail other, more detailed data to more seriously rival inflation.

    As befits a quantity that measures ignorance, mysteries rooted in entropy have served as harbingers of unknown physics before. In the late 1800s, a precise understanding of entropy in terms of microscopic arrangements helped confirm the existence of atoms. Today, the hope is that if the researchers calculating cosmological entropy in different ways can work out exactly what questions they’re answering, those numbers will guide them toward a similar understanding of how Lego bricks of time and space pile up to create the universe that surrounds us.

    “What our calculation does is provide huge extra motivation for people who are trying to build microscopic theories of quantum gravity,” Turok said. “Because the prospect is that that theory will ultimately explain the large-scale geometry of the universe.”

    “It’s a very intriguing result,” Hertog said. But “it raises more questions than it answers.”

    Physical Review D (1973)
    Results posted online (October 2022)

    See the full article here .

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Formerly known as Simons Science News, Quanta Magazine is an editorially independent online publication launched by the Simons Foundation to enhance public understanding of science. Why Quanta? Albert Einstein called photons “quanta of light.” Our goal is to “illuminate science.” At Quanta Magazine, scientific accuracy is every bit as important as telling a good story. All of our articles are meticulously researched, reported, edited, copy-edited and fact-checked.

     
  • richardmitnick 4:17 pm on January 21, 2023 Permalink | Reply
    Tags: , , , Quantum Mechanics, , "New spectrometer for extreme UV and soft X-ray light enables novel research", Probe new and exciting processes on the atomic scale., The spectrometer will give scientists the ability to look at XUV light emitted from atoms and molecules.   

    From The European XFEL(DE): “New spectrometer for extreme UV and soft X-ray light enables novel research” 

    XFEL bloc

    From The European XFEL(DE)

    1.18.23

    Science contact:

    Michael Meyer
    Tel: +49-40-8998-5614
    E-mail: michael.meyer@xfel.eu

    Press contact:

    Bernd Ebeling
    Tel: +49-40-8998-6921
    E-mail: pr@xfel.eu 

    Researchers from European XFEL and Uppsala University standing behind the new 1D-imaging XUV spectrometer at SQS. Left to right: T. Baumann (EuXFEL), J.-E. Rubensson (U. Uppsala), M. Meyer (EuXFEL), and M. Agåkar (U. Uppsala).

    A new spectrometer at the European XFEL’s small quantum systems (SQS) instrument will measure soft x-ray radiation and extreme ultraviolet (XUV) light generated by gaseous samples after interaction with intense XFEL pulses. This enables fresh avenues of research for the instrument. The spectrometer was built by a collaboration involving scientists from European XFEL and Uppsala University in Sweden, and will allow scientists at SQS to probe new and exciting processes on the atomic scale.

    “The spectrometer will give us the ability to look at XUV light emitted from atoms and molecules. Its unique capability to image along the interaction zone enables us to study the effect of European XFEL’s intense X-ray radiation as it travels through dense gases,” says Michael Meyer, leading scientist of the SQS instrument. “It will offer new possibilities to study fundamental processes in the interaction of x-ray radiation with matter.”

    Radiation with wavelengths in the extreme ultraviolet (XUV) range is emitted upon excitation or ionization of a sample by the European XFEL pulses. Spectroscopy of these emitted XUV photons is an ideal tool for studying the quantum mechanical properties of the sample in its interaction with the intense X-ray pulses. This is particularly useful in comparison with other techniques based on electron or ion spectroscopy as the photons are not severely impacted by the charged particles created during the interaction.

    “At SQS we study fundamental properties of atomic and molecular systems, predominantly looking at electrons and ions. The new spectrometer complements these techniques and helps us to better understand physics on the very small scale,” says Thomas Baumann, scientist at SQS.

    The new XUV spectrometer at the SQS instrument station.

    Find out more about the SQS instrument at: https://www.xfel.eu/facility/instruments/sqs/index_eng.html 

    See the full article here .

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    XFEL Campus

    The European XFEL(DE) is an X-ray research laser facility commissioned during 2017. The first laser pulses were produced in May 2017 and the facility started user operation in September 2017. The international project has twelve participating countries: nine shareholders at the time of commissioning (Denmark, France, Germany, Hungary, Poland, Russia, Slovakia, Sweden and Switzerland), later joined by three further partners (Italy, Spain and the United Kingdom). The facility is located in the German federal states of Hamburg and Schleswig-Holstein. A free-electron laser generates high-intensity electromagnetic radiation by accelerating electrons to relativistic speeds and directing them through special magnetic structures. The European XFEL is constructed such that the electrons produce X-ray light in synchronization, resulting in high-intensity X-ray pulses with the properties of laser light and at intensities much brighter than those produced by conventional synchrotron light sources.


    The Hamburg area now boasts a research facility of superlatives: The European XFEL (DE) generates ultrashort X-ray flashes, 27,000 times per second and with a brilliance that is a billion times higher than that of the best conventional X-ray radiation sources.

    The outstanding characteristics of the facility are unique worldwide. Since starting operation in 2017, it has opened up completely new research opportunities for scientists and industrial users.

     
  • richardmitnick 4:21 pm on January 20, 2023 Permalink | Reply
    Tags: "Can you trust your quantum simulator?", A new technique helps verify the accuracy of experiments that probe the strange behavior of atomic-scale systems., Laboratory experiments involve super-cooling tens to hundreds of atoms and probing them with finely tuned lasers and magnets., , Quantum Mechanics, , Researchers could determine the accuracy of a quantum simulator by analyzing its random fluctuations., Researchers have to be sure that their quantum device has “high fidelity” and accurately reflects quantum behavior., Researchers have used quantum randomness as a tool to characterize the fidelity of a quantum analog simulator., Scientists hope that any new understanding gained from quantum simulators will provide blueprints for new exotic materials and smarter and more efficient electronics and practical quantum computers., , The team developed a new benchmarking protocol that can be applied to existing quantum analog simulators to gauge their fidelity based on their pattern of quantum fluctuations., Until now there has been no reliable way to characterize the fidelity of quantum analog simulators.   

    From The Massachusetts Institute of Technology: “Can you trust your quantum simulator?” 

    From The Massachusetts Institute of Technology

    1.18.23
    Jennifer Chu

    A new technique helps verify the accuracy of experiments that probe the strange behavior of atomic-scale systems.

    MIT physicists have developed a protocol to verify the accuracy of quantum experiments. Image: Jose-Luis Olivares, MIT, with images from iStock.

    At the scale of individual atoms, physics gets weird. Researchers are working to reveal, harness, and control these strange quantum effects using quantum analog simulators — laboratory experiments that involve super-cooling tens to hundreds of atoms and probing them with finely tuned lasers and magnets.

    Scientists hope that any new understanding gained from quantum simulators will provide blueprints for designing new exotic materials, smarter and more efficient electronics, and practical quantum computers. But in order to reap the insights from quantum simulators, scientists first have to trust them.

    That is, they have to be sure that their quantum device has “high fidelity” and accurately reflects quantum behavior. For instance, if a system of atoms is easily influenced by external noise, researchers could assume a quantum effect where there is none. But there has been no reliable way to characterize the fidelity of quantum analog simulators, until now.

    In a study appearing today in Nature [below], physicists from MIT and Caltech report a new quantum phenomenon: They found that there is a certain randomness in the quantum fluctuations of atoms and that this random behavior exhibits a universal, predictable pattern. Behavior that is both random and predictable may sound like a contradiction. But the team confirmed that certain random fluctuations can indeed follow a predictable, statistical pattern.

    What’s more, the researchers have used this quantum randomness as a tool to characterize the fidelity of a quantum analog simulator. They showed through theory and experiments that they could determine the accuracy of a quantum simulator by analyzing its random fluctuations.

    The team developed a new benchmarking protocol that can be applied to existing quantum analog simulators to gauge their fidelity based on their pattern of quantum fluctuations. The protocol could help to speed the development of new exotic materials and quantum computing systems.
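    The article does not spell out the protocol’s mathematics, so the sketch below is only our schematic of a cross-entropy-style fidelity estimator in the spirit of randomness-based benchmarking; the Porter-Thomas-like state, the sample sizes, and the estimator’s exact form are illustrative assumptions, not the authors’ procedure. The idea: compare how often the device produces each outcome against the theoretical probabilities of a chaotic quantum state; a faithful device scores near 1, a noise-dominated one near 0.

    import numpy as np

    def fidelity_estimate(p_th, samples):
        # p_th: theoretical probability of each of the 2^N measurement outcomes.
        # samples: outcome indices actually measured on the device.
        mean_p_measured = p_th[samples].mean()   # average p_th(z) over device output
        mean_p_ideal = np.sum(p_th**2)           # expected value for a perfect device
        uniform = 1.0 / len(p_th)                # expected value for pure noise
        return (mean_p_measured - uniform) / (mean_p_ideal - uniform)

    rng = np.random.default_rng(1)
    p_th = rng.exponential(size=2**10)   # Porter-Thomas-like chaotic-state weights
    p_th /= p_th.sum()

    good = rng.choice(len(p_th), size=50_000, p=p_th)   # an ideal device's samples
    noisy = rng.integers(0, len(p_th), size=50_000)     # a fully depolarized device

    print(fidelity_estimate(p_th, good))    # close to 1
    print(fidelity_estimate(p_th, noisy))   # close to 0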

    “This work would allow characterizing many existing quantum devices with very high precision,” says study co-author Soonwon Choi, assistant professor of physics at MIT. “It also suggests there are deeper theoretical structures behind the randomness in chaotic quantum systems than we have previously thought about.”

    The study’s authors include MIT graduate student Daniel Mark and collaborators at Caltech, the University of Illinois Urbana-Champaign, Harvard University, and the University of California-Berkeley.

    Nature

    See the full article here .

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    MIT Seal

    MIT Campus

    The Massachusetts Institute of Technology is a private land-grant research university in Cambridge, Massachusetts. The institute has an urban campus that extends more than a mile (1.6 km) alongside the Charles River. The institute also encompasses a number of major off-campus facilities such as the MIT Lincoln Laboratory, the MIT Bates Research and Engineering Center, and the Haystack Observatory, as well as affiliated laboratories such as the Broad Institute of MIT and Harvard and the Whitehead Institute.

    Massachusetts Institute of Technology-Haystack Observatory, Westford, Massachusetts, USA, Altitude 131 m (430 ft).


    The Computer Science and Artificial Intelligence Laboratory (CSAIL)

    From The Kavli Institute For Astrophysics and Space Research

    MIT’s Institute for Medical Engineering and Science is a research institute at the Massachusetts Institute of Technology

    The MIT Laboratory for Nuclear Science

    The MIT Media Lab

    The MIT School of Engineering

    The MIT Sloan School of Management

    Spectrum

    Founded in 1861 in response to the increasing industrialization of the United States, Massachusetts Institute of Technology adopted a European polytechnic university model and stressed laboratory instruction in applied science and engineering. It has since played a key role in the development of many aspects of modern science, engineering, mathematics, and technology, and is widely known for its innovation and academic strength. It is frequently regarded as one of the most prestigious universities in the world.

    As of December 2020, 97 Nobel laureates, 26 Turing Award winners, and 8 Fields Medalists have been affiliated with MIT as alumni, faculty members, or researchers. In addition, 58 National Medal of Science recipients, 29 National Medals of Technology and Innovation recipients, 50 MacArthur Fellows, 80 Marshall Scholars, 3 Mitchell Scholars, 22 Schwarzman Scholars, 41 astronauts, and 16 Chief Scientists of the U.S. Air Force have been affiliated with The Massachusetts Institute of Technology. The university also has a strong entrepreneurial culture and MIT alumni have founded or co-founded many notable companies. Massachusetts Institute of Technology is a member of the Association of American Universities.

    Foundation and vision

    In 1859, a proposal was submitted to the Massachusetts General Court to use newly filled lands in Back Bay, Boston for a “Conservatory of Art and Science”, but the proposal failed. A charter for the incorporation of the Massachusetts Institute of Technology, proposed by William Barton Rogers, was signed by John Albion Andrew, the governor of Massachusetts, on April 10, 1861.

    Rogers, a professor from the University of Virginia , wanted to establish an institution to address rapid scientific and technological advances. He did not wish to found a professional school, but a combination with elements of both professional and liberal education, proposing that:

    “The true and only practicable object of a polytechnic school is, as I conceive, the teaching, not of the minute details and manipulations of the arts, which can be done only in the workshop, but the inculcation of those scientific principles which form the basis and explanation of them, and along with this, a full and methodical review of all their leading processes and operations in connection with physical laws.”

    The Rogers Plan reflected the German research university model, emphasizing an independent faculty engaged in research, as well as instruction oriented around seminars and laboratories.

    Early developments

    Two days after The Massachusetts Institute of Technology was chartered, the first battle of the Civil War broke out. After a long delay through the war years, MIT’s first classes were held in the Mercantile Building in Boston in 1865. The new institute was founded as part of the Morrill Land-Grant Colleges Act to fund institutions “to promote the liberal and practical education of the industrial classes” and was a land-grant school. In 1863 under the same act, the Commonwealth of Massachusetts founded the Massachusetts Agricultural College, which developed as the University of Massachusetts Amherst. In 1866, the proceeds from land sales went toward new buildings in the Back Bay.

    The Massachusetts Institute of Technology was informally called “Boston Tech”. The institute adopted the European polytechnic university model and emphasized laboratory instruction from an early date. Despite chronic financial problems, the institute saw growth in the last two decades of the 19th century under President Francis Amasa Walker. Programs in electrical, chemical, marine, and sanitary engineering were introduced, new buildings were built, and the size of the student body increased to more than one thousand.

    The curriculum drifted to a vocational emphasis, with less focus on theoretical science. The fledgling school still suffered from chronic financial shortages which diverted the attention of the MIT leadership. During these “Boston Tech” years, Massachusetts Institute of Technology faculty and alumni rebuffed Harvard University president (and former MIT faculty) Charles W. Eliot’s repeated attempts to merge MIT with Harvard College’s Lawrence Scientific School. There would be at least six attempts to absorb MIT into Harvard. In its cramped Back Bay location, MIT could not afford to expand its overcrowded facilities, driving a desperate search for a new campus and funding. Eventually, the MIT Corporation approved a formal agreement to merge with Harvard, over the vehement objections of MIT faculty, students, and alumni. However, a 1917 decision by the Massachusetts Supreme Judicial Court effectively put an end to the merger scheme.

    In 1916, The Massachusetts Institute of Technology administration and the MIT charter crossed the Charles River on the ceremonial barge Bucentaur built for the occasion, to signify MIT’s move to a spacious new campus largely consisting of filled land on a one-mile-long (1.6 km) tract along the Cambridge side of the Charles River. The neoclassical “New Technology” campus was designed by William W. Bosworth and had been funded largely by anonymous donations from a mysterious “Mr. Smith”, starting in 1912. In January 1920, the donor was revealed to be the industrialist George Eastman of Rochester, New York, who had invented methods of film production and processing, and founded Eastman Kodak. Between 1912 and 1920, Eastman donated $20 million ($236.6 million in 2015 dollars) in cash and Kodak stock to MIT.

    Curricular reforms

    In the 1930s, President Karl Taylor Compton and Vice-President (effectively Provost) Vannevar Bush emphasized the importance of pure sciences like physics and chemistry and reduced the vocational practice required in shops and drafting studios. The Compton reforms “renewed confidence in the ability of the Institute to develop leadership in science as well as in engineering”. Unlike Ivy League schools, Massachusetts Institute of Technology catered more to middle-class families, and depended more on tuition than on endowments or grants for its funding. The school was elected to the Association of American Universities in 1934.

    Still, as late as 1949, the Lewis Committee lamented in its report on the state of education at The Massachusetts Institute of Technology that “the Institute is widely conceived as basically a vocational school”, a “partly unjustified” perception the committee sought to change. The report comprehensively reviewed the undergraduate curriculum, recommended offering a broader education, and warned against letting engineering and government-sponsored research detract from the sciences and humanities. The School of Humanities, Arts, and Social Sciences and the MIT Sloan School of Management were formed in 1950 to compete with the powerful Schools of Science and Engineering. Previously marginalized faculties in the areas of economics, management, political science, and linguistics emerged into cohesive and assertive departments by attracting respected professors and launching competitive graduate programs. The School of Humanities, Arts, and Social Sciences continued to develop under the successive terms of the more humanistically oriented presidents Howard W. Johnson and Jerome Wiesner between 1966 and 1980.

    The Massachusetts Institute of Technology’s involvement in military science surged during World War II. In 1941, Vannevar Bush was appointed head of the federal Office of Scientific Research and Development and directed funding to only a select group of universities, including MIT. Engineers and scientists from across the country gathered at the Massachusetts Institute of Technology’s Radiation Laboratory, established in 1940 to assist the British military in developing microwave radar. The work done there significantly affected both the war and subsequent research in the area. Other defense projects included gyroscope-based and other complex control systems for gunsight, bombsight, and inertial navigation under Charles Stark Draper’s Instrumentation Laboratory; the development of a digital computer for flight simulations under Project Whirlwind; and high-speed and high-altitude photography under Harold Edgerton. By the end of the war, The Massachusetts Institute of Technology became the nation’s largest wartime R&D contractor (attracting some criticism of Bush), employing nearly 4000 in the Radiation Laboratory alone and receiving in excess of $100 million ($1.2 billion in 2015 dollars) before 1946. Work on defense projects continued even after then. Post-war government-sponsored research at MIT included SAGE and guidance systems for ballistic missiles and Project Apollo.

    These activities affected The Massachusetts Institute of Technology profoundly. A 1949 report noted the lack of “any great slackening in the pace of life at the Institute” to match the return to peacetime, remembering the “academic tranquility of the prewar years”, though acknowledging the significant contributions of military research to the increased emphasis on graduate education and rapid growth of personnel and facilities. The faculty doubled and the graduate student body quintupled during the terms of Karl Taylor Compton, president of The Massachusetts Institute of Technology between 1930 and 1948; James Rhyne Killian, president from 1948 to 1957; and Julius Adams Stratton, chancellor from 1952 to 1957, whose institution-building strategies shaped the expanding university. By the 1950s, The Massachusetts Institute of Technology no longer simply benefited the industries with which it had worked for three decades, and it had developed closer working relationships with new patrons, philanthropic foundations and the federal government.

    In the late 1960s and early 1970s, student and faculty activists protested against the Vietnam War and The Massachusetts Institute of Technology’s defense research. In this period, Massachusetts Institute of Technology’s various departments were researching helicopters, smart bombs and counterinsurgency techniques for the war in Vietnam as well as guidance systems for nuclear missiles. The Union of Concerned Scientists was founded on March 4, 1969 during a meeting of faculty members and students seeking to shift the emphasis on military research toward environmental and social problems. The Massachusetts Institute of Technology ultimately divested itself from the Instrumentation Laboratory and moved all classified research off-campus to the MIT Lincoln Laboratory facility in 1973 in response to the protests. The student body, faculty, and administration remained comparatively unpolarized during what was a tumultuous time for many other universities. Johnson was seen to be highly successful in leading his institution to “greater strength and unity” after these times of turmoil. However, six Massachusetts Institute of Technology students were sentenced to prison terms at this time and some former student leaders, such as Michael Albert and George Katsiaficas, are still indignant about MIT’s role in military research and its suppression of these protests. (Richard Leacock’s film, November Actions, records some of these tumultuous events.)

    In the 1980s, there was more controversy at The Massachusetts Institute of Technology over its involvement in SDI (space weaponry) and CBW (chemical and biological warfare) research. More recently, The Massachusetts Institute of Technology’s research for the military has included work on robots, drones and ‘battle suits’.

    Recent history

    The Massachusetts Institute of Technology has kept pace with and helped to advance the digital age. In addition to developing the predecessors to modern computing and networking technologies, students, staff, and faculty members at Project MAC, the Artificial Intelligence Laboratory, and the Tech Model Railroad Club wrote some of the earliest interactive computer video games like Spacewar! and created much of modern hacker slang and culture. Several major computer-related organizations have originated at MIT since the 1980s: Richard Stallman’s GNU Project and the subsequent Free Software Foundation were founded in the mid-1980s at the AI Lab; the MIT Media Lab was founded in 1985 by Nicholas Negroponte and Jerome Wiesner to promote research into novel uses of computer technology; the World Wide Web Consortium standards organization was founded at the Laboratory for Computer Science in 1994 by Tim Berners-Lee; the MIT OpenCourseWare project has made course materials for over 2,000 Massachusetts Institute of Technology classes available online free of charge since 2002; and the One Laptop per Child initiative to expand computer education and connectivity to children worldwide was launched in 2005.

    The Massachusetts Institute of Technology was named a sea-grant college in 1976 to support its programs in oceanography and marine sciences and was named a space-grant college in 1989 to support its aeronautics and astronautics programs. Despite diminishing government financial support over the past quarter century, MIT launched several successful development campaigns to significantly expand the campus: new dormitories and athletics buildings on west campus; the Tang Center for Management Education; several buildings in the northeast corner of campus supporting research into biology, brain and cognitive sciences, genomics, biotechnology, and cancer research; and a number of new “backlot” buildings on Vassar Street including the Stata Center. Construction on campus in the 2000s included expansions of the Media Lab, the Sloan School’s eastern campus, and graduate residences in the northwest. In 2006, President Hockfield launched the MIT Energy Research Council to investigate the interdisciplinary challenges posed by increasing global energy consumption.

    In 2001, inspired by the open source and open access movements, The Massachusetts Institute of Technology launched “OpenCourseWare” to make the lecture notes, problem sets, syllabi, exams, and lectures from the great majority of its courses available online for no charge, though without any formal accreditation for coursework completed. While the cost of supporting and hosting the project is high, OCW expanded in 2005 to include other universities as a part of the OpenCourseWare Consortium, which currently includes more than 250 academic institutions with content available in at least six languages. In 2011, The Massachusetts Institute of Technology announced it would offer formal certification (but not credits or degrees) to online participants completing coursework in its “MITx” program, for a modest fee. The “edX” online platform supporting MITx was initially developed in partnership with Harvard and its analogous “Harvardx” initiative. The courseware platform is open source, and other universities have already joined and added their own course content. In March 2009 the Massachusetts Institute of Technology faculty adopted an open-access policy to make its scholarship publicly accessible online.

    The Massachusetts Institute of Technology has its own police force. Three days after the Boston Marathon bombing of April 2013, MIT Police patrol officer Sean Collier was fatally shot by the suspects Dzhokhar and Tamerlan Tsarnaev, setting off a violent manhunt that shut down the campus and much of the Boston metropolitan area for a day. One week later, Collier’s memorial service was attended by more than 10,000 people, in a ceremony hosted by the Massachusetts Institute of Technology community with thousands of police officers from the New England region and Canada. On November 25, 2013, The Massachusetts Institute of Technology announced the creation of the Collier Medal, to be awarded annually to “an individual or group that embodies the character and qualities that Officer Collier exhibited as a member of The Massachusetts Institute of Technology community and in all aspects of his life”. The announcement further stated that “Future recipients of the award will include those whose contributions exceed the boundaries of their profession, those who have contributed to building bridges across the community, and those who consistently and selflessly perform acts of kindness”.

    In September 2017, the school announced the creation of an artificial intelligence research lab called the MIT-IBM Watson AI Lab. IBM will spend $240 million over the next decade, and the lab will be staffed by MIT and IBM scientists. In October 2018 MIT announced that it would open a new Schwarzman College of Computing dedicated to the study of artificial intelligence, named after lead donor and The Blackstone Group CEO Stephen Schwarzman. The focus of the new college is to study not just AI, but interdisciplinary AI education, and how AI can be used in fields as diverse as history and biology. The cost of buildings and new faculty for the new college is expected to be $1 billion upon completion.

    The Caltech/MIT Advanced aLIGO was designed and constructed by a team of scientists from the California Institute of Technology, the Massachusetts Institute of Technology, and industrial contractors, and funded by the National Science Foundation.

    Caltech/MIT Advanced aLIGO

    It was designed to open the field of gravitational-wave astronomy through the detection of gravitational waves predicted by general relativity. Gravitational waves were detected for the first time by the LIGO detector in 2015. For contributions to the LIGO detector and the observation of gravitational waves, two Caltech physicists, Kip Thorne and Barry Barish, and Massachusetts Institute of Technology physicist Rainer Weiss won the Nobel Prize in physics in 2017. Weiss, who is also a Massachusetts Institute of Technology graduate, designed the laser interferometric technique, which served as the essential blueprint for the LIGO.

    The mission of The Massachusetts Institute of Technology is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of The Massachusetts Institute of Technology community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

     
  • richardmitnick 9:58 pm on January 19, 2023 Permalink | Reply
    Tags: "Quantum simulator enables first microscopic observation of charge carriers pairing", An antiferromagnetic structure was created which is characteristic of many high-temperature superconductors – and stabilized by magnetic interactions., , , Crucial to any kind of superconductivity is the formation of tightly linked pairs of charge carriers - electrons or holes as electrons vacancies are called., Each electron or hole carries a half-integer spin – a quantum physical quantity that can be imagined as a measure of a particle's internal rotation., Holes (positive charge carriers) in a solid-state model combined to form pairs., Holes emerged into the lattice-like structure., New microscopic insight into processes that may underlie these so-called unconventional superconductors., , Quantum Mechanics, Resistance-free transport of electric current in high-temperature superconductors., The exact physical mechanisms in these complex materials are still largely unknown., , The spins of the atoms arranged themselves with alternating directions., The team arranged ultracold atoms in a vacuum with laser light in such a way that they simulate the electrons in a simplified solid-state model., The team at MPQ now could show that the magnetic forces indeed lead to pairs., Theories assume that the cause for the pair formation and thus for the phenomenon of superconductivity lies in magnetic forces.   

    From The MPG Institute for Quantum Optics [MPG Institut für Quantenoptik] (DE) : “Quantum simulator enables first microscopic observation of charge carriers pairing” 

    Max Planck Institut für Quantenoptik (DE)

    From The MPG Institute for Quantum Optics [MPG Institut für Quantenoptik] (DE)

    1.19.23

    Sarah Hirthe
    Doctoral candidate
    +49 89 32905-713
    sarah.hirthe@mpq.mpg.de
    Max Planck Institute of Quantum Optics, Garching

    Prof. Dr Immanuel Bloch
    Director
    +49 89 32905-238
    immanuel.bloch@mpq.mpg.de
    Max Planck Institute of Quantum Optics, Garching

    Katharina Jarrah
    PR and Communications
    +49 89 32905-213
    katharina.jarrah@mpq.mpg.de
    Max Planck Institute of Quantum Optics, Garching

    1.19.23

    A team of researchers at the MPQ has for the first time monitored in an experiment how holes (positive charge carriers) in a solid-state model combined to form pairs. This process could play an important role in understanding high-temperature superconductivity.

    Binding mechanism in a magnetically ordered system. The red and blue spheres are spins of opposite orientations, the shaded bands connecting the spheres show the magnetic order. The white spheres are holes. When a hole moves as shown in (i) and (ii), it perturbs the magnetic order. However, if a second hole is connected to the first, as in (iii), the magnetic order is maintained despite movement. The holes thus pair to maintain the magnetic order in the system. © MPQ.

    Using a quantum simulator, researchers at the Max Planck Institute of Quantum Optics (MPQ) have observed pairs of charge carriers that may be responsible for the resistance-free transport of electric current in high-temperature superconductors. The exact physical mechanisms in these complex materials are still largely unknown; theories assume that the cause of the pair formation, and thus of the phenomenon of superconductivity, lies in magnetic forces. The team in Garching has now for the first time been able to demonstrate pairs formed this way. Their experiment was based on a lattice-like arrangement of cold atoms, as well as on a clever suppression of the movement of free charge carriers. The researchers report their results in the journal Nature.

    Since the discovery of high-temperature superconductors almost 40 years ago, scientists have been trying to track down their fundamental quantum-physical mechanisms, but the complex materials still pose mysteries. The findings of a team in the Quantum Many-Body Systems Department at MPQ in Garching now provide microscopic insight into processes that may underlie these so-called unconventional superconductors.

    Crucial to any kind of superconductivity is the formation of tightly linked pairs of charge carriers – electrons or holes, as electron vacancies are called. “The reason for this lies in quantum mechanics,” explains MPQ physicist Sarah Hirthe: each electron or hole carries a half-integer spin – a quantum-physical quantity that can be imagined as a measure of a particle’s internal rotation. Atoms also have a spin. For quantum-statistical reasons, however, only particles with an integer spin can move through a crystal lattice without resistance under certain conditions. “Therefore, electrons or holes have to pair up to do this,” says Hirthe. In conventional superconductors, lattice vibrations called phonons help with the pairing. In unconventional superconductors, on the other hand, a different mechanism is at work – but which one has remained an open question until now. “In a widely accepted theory, indirect magnetic forces play a crucial role,” Sarah Hirthe reports. “But this could not be confirmed in experiments so far.”
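    As a textbook aside (my addition, not a claim from the article): the quantum-mechanical rule for adding two spin-1/2 angular momenta makes this pairing argument explicit. In LaTeX:

\[
  \tfrac{1}{2} \otimes \tfrac{1}{2} \;=\; \underbrace{0}_{\text{singlet}} \,\oplus\, \underbrace{1}_{\text{triplet}}
\]

    A bound pair therefore carries total spin S = 0 or S = 1, an integer, and so obeys Bose statistics, which is what permits collective, resistance-free flow under the right conditions.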

    Solid state model spiked with holes

    To better understand the processes in such materials, the researchers used a quantum simulator: a kind of quantum computer that recreates physical systems. They arranged ultracold atoms in a vacuum with laser light in such a way that the atoms simulate the electrons in a simplified solid-state model. In the process, the spins of the atoms arranged themselves with alternating directions: an antiferromagnetic structure emerged, which is characteristic of many high-temperature superconductors and is stabilized by magnetic interactions. The team then “doped” this model by reducing the number of atoms in the system. In that way, holes appeared in the lattice-like structure.
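    To picture the doped model, here is a minimal Python sketch (my illustration only; the ladder geometry, doping level, and helper names are assumptions, and this is not the experiment's control code) that builds a Néel-ordered two-leg ladder and removes atoms to create holes:

import random

ROWS, COLS = 2, 10  # a small two-leg ladder, in the spirit of the simplified model above

def neel_ladder(rows, cols):
    """Antiferromagnet: spins alternate (+1 = up, -1 = down) on a checkerboard."""
    return [[1 if (r + c) % 2 == 0 else -1 for c in range(cols)]
            for r in range(rows)]

def dope(lattice, n_holes, seed=0):
    """'Reduce the number of atoms': replace n_holes randomly chosen spins with holes (0)."""
    rng = random.Random(seed)
    sites = [(r, c) for r in range(len(lattice)) for c in range(len(lattice[0]))]
    for r, c in rng.sample(sites, n_holes):
        lattice[r][c] = 0
    return lattice

for row in dope(neel_ladder(ROWS, COLS), n_holes=2):
    print(" ".join({1: "↑", -1: "↓", 0: "o"}[s] for s in row))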

    The team at MPQ could now show that the magnetic forces indeed lead to pairs. To achieve this, they used an experimental trick. “Moving charge carriers in a material like a high-temperature superconductor are subject to a competition of different forces,” explains Hirthe. On the one hand, they have the urge to spread out, i.e. to be everywhere at the same time, which gives them an energetic advantage. On the other hand, magnetic interactions ensure a regular arrangement of the spin states of atoms, electrons and holes – and presumably also the formation of charge-carrier pairs. However: “The competition of forces has so far prevented us from observing such pairs microscopically,” says Timon Hilker, leader of the research group. “That’s why we had the idea of suppressing the disruptive movement of the charge carriers in one spatial direction.”
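    The figure caption above illustrates the same mechanism. As a toy energy count (a sketch under strong simplifications I am assuming: a classical Ising-like antiferromagnet, static holes, energies in units of the coupling J; this is not the paper's model), one can check that two adjacent holes sever fewer magnetic bonds than two separated ones:

ROWS, COLS = 2, 10  # two-leg ladder; in the experiment, hole motion was restricted along one direction

def neel_with_holes(holes):
    """Néel-ordered ladder (+1/-1 spins) with the given sites emptied (0 = hole)."""
    lat = [[1 if (r + c) % 2 == 0 else -1 for c in range(COLS)]
           for r in range(ROWS)]
    for r, c in holes:
        lat[r][c] = 0
    return lat

def magnetic_energy(lat, J=1.0):
    """Ising-like energy J * s_i * s_j summed over nearest-neighbour bonds:
    every intact antiferromagnetic bond contributes -J, severed bonds contribute 0."""
    e = 0.0
    for r in range(ROWS):
        for c in range(COLS):
            if c + 1 < COLS:
                e += J * lat[r][c] * lat[r][c + 1]
            if r + 1 < ROWS:
                e += J * lat[r][c] * lat[r + 1][c]
    return e

far = magnetic_energy(neel_with_holes([(0, 1), (0, 8)]))   # two well-separated holes
near = magnetic_energy(neel_with_holes([(0, 4), (0, 5)]))  # two adjacent holes (a pair)
print(f"separated: {far:+.1f}   paired: {near:+.1f}   binding ~ {far - near:.1f} J")
# prints: separated: -22.0   paired: -23.0   binding ~ 1.0 J

    In the full quantum problem the holes also gain kinetic energy by delocalizing, which competes with this magnetic gain; suppressing their motion along one direction, as described above, tips the balance toward pairing.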

    A close look through the quantum gas microscope

    With the motion restricted in this way, the magnetic forces were left largely undisturbed. The result: holes that came close to each other formed the expected pairs. To observe the pairing, the team used a quantum gas microscope – a device with which quantum-mechanical processes can be followed in detail. Not only were the hole pairs revealed, but the relative arrangement of the pairs was also observed, suggesting repulsive forces between them. “The results underline the idea that the loss of electrical resistance in unconventional superconductors is caused by magnetic forces,” emphasizes Prof. Immanuel Bloch, Director at MPQ and Head of the Quantum Many-Body Systems Division. “This leads to a better understanding of these extraordinary materials and shows a new way in which stable hole pairs can form even at very high temperatures, potentially significantly increasing the critical temperature of superconductors.”
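    For a feel of how such pairing shows up in the data, here is a minimal analysis sketch (my construction with fabricated example snapshots; not the MPQ pipeline, and all names are assumptions): histogram the hole-hole distance over many single-shot, site-resolved images and look for excess weight at short distances:

from collections import Counter

def hole_positions(snapshot):
    """Return (row, col) of every empty site in one site-resolved image."""
    return [(r, c) for r, row in enumerate(snapshot)
            for c, occupied in enumerate(row) if not occupied]

def pair_distance_histogram(snapshots):
    """Count Manhattan distances between holes, post-selecting two-hole shots."""
    counts = Counter()
    for shot in snapshots:
        holes = hole_positions(shot)
        if len(holes) == 2:
            (r1, c1), (r2, c2) = holes
            counts[abs(r1 - r2) + abs(c1 - c2)] += 1
    return dict(sorted(counts.items()))

# Two fabricated snapshots of a 2x6 ladder (1 = atom present, 0 = hole):
shots = [
    [[1, 1, 0, 0, 1, 1],
     [1, 1, 1, 1, 1, 1]],   # holes adjacent  -> distance 1
    [[1, 0, 1, 1, 1, 1],
     [1, 1, 1, 1, 0, 1]],   # holes far apart -> distance 4
]
print(pair_distance_histogram(shots))  # {1: 1, 4: 1}

    A pile-up of counts at small distances, beyond what uncorrelated holes would give, is the experimental signature of pairing.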

    The researchers at the Max Planck Institute of Quantum Optics now plan new experiments on more complex models in which large two-dimensional arrays of atoms are connected. Such larger systems will hopefully create more hole pairs and allow for the observation of their motion through the lattice: the transport of electric current without resistance due to superconductivity.

    Nature
    See the science paper for instructive material with images.

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.

    Please help promote STEM in your local schools.

    Stem Education Coalition


    Research at The MPG Institute for Quantum Optics [MPG Institut für Quantenoptik] (DE)

    Light can behave as an electromagnetic wave or a shower of particles that have no mass, called photons, depending on the conditions under which it is studied or used. Matter, on the other hand, is composed of particles, but it can actually exhibit wave-like properties, giving rise to many astonishing phenomena in the microcosm.

    At our institute we explore the interaction of light and quantum systems, exploiting the two extreme regimes of the wave-particle duality of light and matter. On the one hand we handle light at the single photon level where wave-interference phenomena differ from those of intense light beams. On the other hand, when cooling ensembles of massive particles down to extremely low temperatures we suddenly observe phenomena that go back to their wave-like nature. Furthermore, when dealing with ultrashort and highly intense light pulses comprising trillions of photons we can completely neglect the particle properties of light. We take advantage of the large force that the rapidly oscillating electromagnetic field exerts on electrons to steer their motion within molecules or accelerate them to relativistic energies.

    The MPG Society for the Advancement of Science [MPG Gesellschaft zur Förderung der Wissenschaften e. V.] is a formally independent non-governmental and non-profit association of German research institutes founded in 1911 as the Kaiser Wilhelm Society and renamed the Max Planck Society in 1948 in honor of its former president, theoretical physicist Max Planck. The society is funded by the federal and state governments of Germany as well as other sources.

    According to its primary goal, the MPG Society supports fundamental research in the natural, life and social sciences, the arts and humanities in its 83 (as of January 2014) MPG Institutes. The society has a total staff of approximately 17,000 permanent employees, including 5,470 scientists, plus around 4,600 non-tenured scientists and guests. The society's budget for 2015 was about €1.7 billion.

    The MPG Institutes focus on excellence in research. The MPG Society has a world-leading reputation as a science and technology research organization, with 33 Nobel Prizes awarded to their scientists, and is generally regarded as the foremost basic research organization in Europe and the world. In 2013, the Nature Publishing Index placed the MPG institutes fifth worldwide in terms of research published in Nature journals (after Harvard University, The Massachusetts Institute of Technology, Stanford University and The National Institutes of Health). In terms of total research volume (unweighted by citations or impact), the Max Planck Society is only outranked by The Chinese Academy of Sciences [中国科学院](CN), The Russian Academy of Sciences [Росси́йская акаде́мия нау́к](RU) and Harvard University. The Thomson Reuters-Science Watch website placed the MPG Society as the second leading research organization worldwide, following Harvard University, in terms of the impact of the produced research across science fields.

    The MPG Society and its predecessor Kaiser Wilhelm Society hosted several renowned scientists in their fields, including Otto Hahn, Werner Heisenberg, and Albert Einstein.

    History

    The organization was established in 1911 as the Kaiser Wilhelm Society, or Kaiser-Wilhelm-Gesellschaft (KWG), a non-governmental research organization named for the then German emperor. The KWG was one of the world’s leading research organizations; its board of directors included scientists like Walther Bothe, Peter Debye, Albert Einstein, and Fritz Haber. In 1946, Otto Hahn assumed the position of President of KWG, and in 1948, the society was renamed the Max Planck Society (MPG) after its former President (1930–37) Max Planck, who died in 1947.

    The MPG Society has a world-leading reputation as a science and technology research organization. In 2006, the Times Higher Education Supplement rankings of non-university research institutions (based on international peer review by academics) placed the MPG Society as No. 1 in the world for science research, and No. 3 in technology research (behind AT&T Corporation and The DOE’s Argonne National Laboratory).

    The domain mpg.de attracted at least 1.7 million visitors annually by 2008 according to a Compete.com study.

    MPG Institutes and research groups

    The MPG Society consists of over 80 research institutes. In addition, the society funds a number of Max Planck Research Groups (MPRG) and International Max Planck Research Schools (IMPRS). The purpose of establishing independent research groups at various universities is to strengthen the required networking between universities and institutes of the Max Planck Society.

    The research units are primarily located across Europe, with a few in South Korea and the U.S. In 2007, the Society established its first non-European centre, with an institute on the Jupiter campus of Florida Atlantic University (US) focusing on neuroscience.

    The MPG Institutes operate independently from, though in close cooperation with, the universities, and focus on innovative research which does not fit into the university structure due to its interdisciplinary or transdisciplinary nature or which requires resources that cannot be met by the state universities.

    Internally, MPG Institutes are organized into research departments headed by directors such that each MPI has several directors, a position roughly comparable to anything from full professor to department head at a university. Other core members include Junior and Senior Research Fellows.

    In addition, there are several associated institutes:

    International Max Planck Research Schools

    Together with the Association of Universities and other Education Institutions in Germany, the Max Planck Society established numerous International Max Planck Research Schools (IMPRS) to promote junior scientists:

    • Cologne Graduate School of Ageing Research, Cologne
    • International Max Planck Research School for Intelligent Systems, at the Max Planck Institute for Intelligent Systems located in Tübingen and Stuttgart
    • International Max Planck Research School on Adapting Behavior in a Fundamentally Uncertain World (Uncertainty School), at the Max Planck Institutes for Economics, for Human Development, and/or Research on Collective Goods
    • International Max Planck Research School for Analysis, Design and Optimization in Chemical and Biochemical Process Engineering, Magdeburg
    • International Max Planck Research School for Astronomy and Cosmic Physics, Heidelberg at the MPI for Astronomy
    • International Max Planck Research School for Astrophysics, Garching at the MPI for Astrophysics
    • International Max Planck Research School for Complex Surfaces in Material Sciences, Berlin
    • International Max Planck Research School for Computer Science, Saarbrücken
    • International Max Planck Research School for Earth System Modeling, Hamburg
    • International Max Planck Research School for Elementary Particle Physics, Munich, at the MPI for Physics
    • International Max Planck Research School for Environmental, Cellular and Molecular Microbiology, Marburg at the Max Planck Institute for Terrestrial Microbiology
    • International Max Planck Research School for Evolutionary Biology, Plön at the Max Planck Institute for Evolutionary Biology
    • International Max Planck Research School “From Molecules to Organisms”, Tübingen at the Max Planck Institute for Developmental Biology
    • International Max Planck Research School for Global Biogeochemical Cycles, Jena at the Max Planck Institute for Biogeochemistry
    • International Max Planck Research School on Gravitational Wave Astronomy, Hannover and Potsdam MPI for Gravitational Physics
    • International Max Planck Research School for Heart and Lung Research, Bad Nauheim at the Max Planck Institute for Heart and Lung Research
    • International Max Planck Research School for Infectious Diseases and Immunity, Berlin at the Max Planck Institute for Infection Biology
    • International Max Planck Research School for Language Sciences, Nijmegen
    • International Max Planck Research School for Neurosciences, Göttingen
    • International Max Planck Research School for Cognitive and Systems Neuroscience, Tübingen
    • International Max Planck Research School for Marine Microbiology (MarMic), joint program of the Max Planck Institute for Marine Microbiology in Bremen, the University of Bremen, the Alfred Wegener Institute for Polar and Marine Research in Bremerhaven, and the Jacobs University Bremen
    • International Max Planck Research School for Maritime Affairs, Hamburg
    • International Max Planck Research School for Molecular and Cellular Biology, Freiburg
    • International Max Planck Research School for Molecular and Cellular Life Sciences, Munich
    • International Max Planck Research School for Molecular Biology, Göttingen
    • International Max Planck Research School for Molecular Cell Biology and Bioengineering, Dresden
    • International Max Planck Research School Molecular Biomedicine, program combined with the ‘Graduate Programm Cell Dynamics And Disease’ at the University of Münster and the Max Planck Institute for Molecular Biomedicine
    • International Max Planck Research School on Multiscale Bio-Systems, Potsdam
    • International Max Planck Research School for Organismal Biology, at the University of Konstanz and the Max Planck Institute for Ornithology
    • International Max Planck Research School on Reactive Structure Analysis for Chemical Reactions (IMPRS RECHARGE), Mülheim an der Ruhr, at the Max Planck Institute for Chemical Energy Conversion
    • International Max Planck Research School for Science and Technology of Nano-Systems, Halle at Max Planck Institute of Microstructure Physics
    • International Max Planck Research School for Solar System Science at the University of Göttingen hosted by MPI for Solar System Research
    • International Max Planck Research School for Astronomy and Astrophysics, Bonn, at the MPI for Radio Astronomy (formerly the International Max Planck Research School for Radio and Infrared Astronomy)
    • International Max Planck Research School for the Social and Political Constitution of the Economy, Cologne
    • International Max Planck Research School for Surface and Interface Engineering in Advanced Materials, Düsseldorf at Max Planck Institute for Iron Research GmbH
    • International Max Planck Research School for Ultrafast Imaging and Structural Dynamics, Hamburg

    Max Planck Schools

    • Max Planck School of Cognition
    • Max Planck School Matter to Life
    • Max Planck School of Photonics

    Max Planck Center

    • The Max Planck Centre for Attosecond Science (MPC-AS), POSTECH Pohang
    • The Max Planck POSTECH Center for Complex Phase Materials, POSTECH Pohang

    Max Planck Institutes

    Among others:
    • Max Planck Institute for Neurobiology of Behavior – caesar, Bonn
    • Max Planck Institute for Aeronomics in Katlenburg-Lindau was renamed to Max Planck Institute for Solar System Research in 2004;
    • Max Planck Institute for Biology in Tübingen was closed in 2005;
    • Max Planck Institute for Cell Biology in Ladenburg b. Heidelberg was closed in 2003;
    • Max Planck Institute for Economics in Jena was renamed to the Max Planck Institute for the Science of Human History in 2014;
    • Max Planck Institute for Ionospheric Research in Katlenburg-Lindau was renamed to Max Planck Institute for Aeronomics in 1958;
    • Max Planck Institute for Metals Research, Stuttgart
    • Max Planck Institute of Oceanic Biology in Wilhelmshaven was renamed to Max Planck Institute of Cell Biology in 1968 and moved to Ladenburg 1977;
    • Max Planck Institute for Psychological Research in Munich merged into the Max Planck Institute for Human Cognitive and Brain Sciences in 2004;
    • Max Planck Institute for Protein and Leather Research in Regensburg moved to Munich 1957 and was united with the Max Planck Institute for Biochemistry in 1977;
    • Max Planck Institute for Virus Research in Tübingen was renamed as Max Planck Institute for Developmental Biology in 1985;
    • Max Planck Institute for the Study of the Scientific-Technical World in Starnberg (from 1970 until 1981 (closed)) directed by Carl Friedrich von Weizsäcker and Jürgen Habermas.
    • Max Planck Institute for Behavioral Physiology
    • Max Planck Institute of Experimental Endocrinology
    • Max Planck Institute for Foreign and International Social Law
    • Max Planck Institute for Physics and Astrophysics
    • Max Planck Research Unit for Enzymology of Protein Folding
    • Max Planck Institute for Biology of Ageing

     