Tagged: Quantum Mechanics

  • richardmitnick 5:59 pm on January 24, 2022 Permalink | Reply
    Tags: "Complex" numbers, "Complex" numbers are widely exploited in classical and relativistic physics., "Physics(US)", "Quantum Mechanics Must Be Complex", A basic starting point for quantum theory is to represent a particle state by a vector in a "complex"-valued space called a Hilbert space., , Early on the pioneers of quantum mechanics abandoned the attempt to develop a quantum theory based on real numbers because they thought it impractical., Polarization-entangled photons generated by parametric down-conversion and detected in superconducting nanowire single-photon detectors., , Quantum Mechanics, , Recent theoretical results suggested that a real-valued quantum theory could describe an unexpectedly broad range of quantum systems., Superconducting quantum processors in which the qubits have individual control and readout., The lack of a general proof left open some paths for refuting the equivalence between “complex” and “real” quantum theories., The possibility of using real numbers was never formally ruled out., This real-number approach has now been squashed by two independent experiments., Two teams show that within a standard formulation of quantum mechanics "complex" numbers are indispensable for describing experiments carried out on simple quantum networks.   

    From Physics(US): “Quantum Mechanics Must Be Complex” 

    About Physics

    From Physics(US)

    January 24, 2022

    Alessio Avella, The National Institute of Metrological Research [Istituto Nazionale di Ricerca Metrologica](IT)

    Two independent studies demonstrate that a formulation of quantum mechanics involving “complex” rather than real numbers is necessary to reproduce experimental results.

    Credit: Carin Cain/American Physical Society(US)
    Figure 1: Conceptual sketch of the three-party game used by [Chen and colleagues] and [Li and colleagues] to demonstrate that a real quantum theory cannot describe certain measurements on small quantum networks. The game involves two sources distributing entangled qubits to three observers, who calculate a “score” from measurements performed on the qubits. In both experiments, the obtained score isn’t compatible with a real-valued, traditional formulation of quantum mechanics.

    “Complex” numbers are widely exploited in classical and relativistic physics. In electromagnetism, for instance, they tremendously simplify the description of wave-like phenomena. However, in these physical theories, “complex” numbers aren’t strictly needed, as all meaningful observables can be expressed in terms of real numbers. Thus, “complex” analysis is just a powerful computational tool. But are “complex” numbers essential in quantum physics—where the mathematics (the Schrödinger equation, the Hilbert space, etc.) is intrinsically “complex”-valued? This simple question has accompanied the development of quantum mechanics since its origins, when Schrödinger, Lorentz, and Planck debated it in their correspondence [1]. But early on, the pioneers of quantum mechanics abandoned the attempt to develop a quantum theory based on real numbers because they thought it impractical. However, the possibility of using real numbers was never formally ruled out, and recent theoretical results suggested that a real-valued quantum theory could describe an unexpectedly broad range of quantum systems [2]. But this real-number approach has now been squashed by two independent experiments, performed by Ming-Cheng Chen and colleagues at The University of Science and Technology of China [中国科学技术大学](CN) of The Chinese Academy of Sciences [中国科学院](CN) [3] and by Zheng-Da Li and colleagues at The Southern University of Science and Technology [南方科技大學](CN) [4]. The two teams show that within a standard formulation of quantum mechanics “complex” numbers are indispensable for describing experiments carried out on simple quantum networks.

    A basic starting point for quantum theory is to represent a particle state by a vector in a “complex”-valued space called a Hilbert space. However, for a single, isolated quantum system, finding a description based purely on real numbers is straightforward: It can simply be obtained by doubling the dimension of the Hilbert space, as the space of complex numbers is equivalent, or “isomorphic,” to a two-dimensional, real plane, with the two dimensions representing the real and imaginary part of “complex” numbers, respectively. The problem becomes less trivial when we consider the unique quantum correlations, such as entanglement, that arise in quantum mechanics. These correlations can violate the principle of local realism, as proven by so-called Bell inequality tests [5]. Violations of Bell tests may appear to require “complex” values for their description [6]. But in 2009, a theoretical work demonstrated that, using real numbers, it is possible to reproduce the statistics of any standard Bell experiment, even those involving multiple quantum systems [2]. The result reinforced the conjecture that “complex” numbers aren’t necessary, but the lack of a general proof left open some paths for refuting the equivalence between “complex” and “real” quantum theories.
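
    To make the doubling explicit, here is a small worked mapping (a sketch added for illustration; the symbols below are not taken from the article). A single-qubit state with complex amplitudes can be re-encoded as a real four-dimensional vector, with the imaginary unit replaced by a real rotation matrix:

    \[
    |\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad \alpha = a_1 + i\,a_2, \quad \beta = b_1 + i\,b_2,
    \]
    \[
    |\psi\rangle \;\longmapsto\; (a_1, a_2, b_1, b_2) \in \mathbb{R}^4, \qquad i \;\longmapsto\; J = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}, \quad J^2 = -\mathbb{1},
    \]

    so that multiplication by a complex scalar x + iy becomes multiplication by the real matrix x·1 + y·J, and every complex unitary acting on the qubit has a real orthogonal counterpart acting on the doubled space. Roughly speaking, the loophole exploited later in the article is that this embedding does not automatically respect the way independent sources combine via tensor products, which is why network experiments can distinguish the two formulations.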

    One such path was identified in 2021 through the brilliant theoretical work of Marc-Olivier Renou of The Institute of Photonic Sciences [Instituto de Ciencias Fotónicas](ES) and co-workers [7]. The researchers considered two theories that are both based on the postulates of quantum mechanics, but one uses a “complex” Hilbert space, as in the traditional formulation, while the other uses a real space. They then devised Bell-like experiments that could prove the inadequacy of the real theory. In their theorized experiments, two independent sources distribute entangled qubits in a quantum network configuration, while causally independent measurements on the nodes can reveal quantum correlations that do not admit any real quantum representation.

    Chen and colleagues and Li and colleagues now provide the experimental demonstration of Renou and co-workers’ proposal in two different physical platforms. The experiments are conceptually based on a “game” in which three parties (Alice, Bob, and Charlie) perform a Bell-like experiment (Fig. 1). In this game, two sources distribute entangled qubits between Alice and Bob and between Bob and Charlie, respectively. Each party independently chooses, from a set of possibilities, the measurements to perform on their qubit(s). Since the sources are independent, the qubits sent to Alice and Charlie are originally uncorrelated. Bob receives a qubit from both sources and, by performing a Bell-state measurement, he generates entanglement between Alice’s and Charlie’s qubits even though these qubits never interacted (a procedure called “entanglement swapping” [8]). Finally, a “score” is calculated from the statistical distribution of measurement outcomes. As demonstrated by Renou and co-workers, a “complex” quantum theory can produce a larger score than the one produced by a real quantum theory.
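
    The entanglement-swapping step at the heart of the game can be illustrated with a short numerical sketch (a toy calculation added here for illustration, not the protocol used in either experiment; the choice of the Bell state |Φ+⟩ and the qubit labels A, B1, B2, C are assumptions made for brevity). Bob projects his two qubits onto a Bell state, which leaves Alice’s and Charlie’s qubits, which never interacted, in an entangled state:

    import numpy as np

    # Computational basis states and the Bell state |Phi+> = (|00> + |11>)/sqrt(2)
    ket0 = np.array([1.0, 0.0])
    ket1 = np.array([0.0, 1.0])
    phi_plus = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)

    # Two independent sources: one distributes qubits (A, B1), the other (B2, C).
    # The joint state is ordered A, B1, B2, C and lives in a 16-dimensional space.
    state = np.kron(phi_plus, phi_plus)

    # Bob's Bell-state measurement: project his pair (B1, B2) onto |Phi+>
    P_bell = np.outer(phi_plus, phi_plus)
    I2 = np.eye(2)
    projector = np.kron(np.kron(I2, P_bell), I2)   # acts on the middle two qubits

    post = projector @ state
    post /= np.linalg.norm(post)                   # renormalize after the projection

    # Trace out Bob's qubits (B1, B2) to get the joint state of Alice and Charlie
    psi = post.reshape(2, 2, 2, 2)                 # axes: A, B1, B2, C
    rho_ac = np.einsum('abcd,ebcf->adef', psi, psi.conj()).reshape(4, 4)

    # Alice and Charlie now share (for this Bell outcome) the state |Phi+>,
    # even though their qubits came from different sources and never met.
    print(np.round(rho_ac, 3))

    In the actual experiments this swapping is only one ingredient; the “score” comes from the statistics of many such rounds with different measurement settings, and it is those statistics that no real-valued standard quantum theory can reproduce.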

    The two groups follow different approaches to implement the quantum game. Chen and colleagues use a superconducting quantum processor in which the qubits have individual control and readout. The main challenge of this approach is making the qubits, which sit on the same circuit, truly independent and decoupled—a stringent requirement for the Bell-like tests. Li and colleagues instead choose a photonic implementation that more easily achieves this independence. Specifically, they use polarization-entangled photons generated by parametric down-conversion and detected with superconducting nanowire single-photon detectors. The optical implementation comes, however, with a different challenge: The protocol proposed by Renou and co-workers requires a complete Bell-state measurement, which can be directly implemented using superconducting qubits but is not achievable with linear optics alone. Therefore, Li and colleagues had to rely on a so-called “partial” Bell-state measurement.

    Despite the difficulties inherent in each implementation, both experiments deliver compelling results. Impressively, they beat the score of real theory by many standard deviations (by 43 σ and 4.5 σ for Chen’s and Li’s experiments, respectively), providing convincing proof that complex numbers are needed to describe the experiments.

    Interestingly, both experiments are based on a minimal quantum network scheme (two sources and three nodes), which is a promising building block for a future quantum internet. The results thus offer one more demonstration that the availability of new quantum technologies is closely linked to the possibility of testing foundational aspects of quantum mechanics. Conversely, these new fundamental insights into quantum mechanics could have unexpected implications for the development of new quantum information technologies.

    We must be careful, however, in assessing the implications of these results. One might be tempted to conclude that “complex” numbers are indispensable to describe the physical reality of the Universe. However, this conclusion is true only if we accept the standard framework of quantum mechanics, which is based on several postulates. As Renou and his co-workers point out, these results would not be applicable to alternative formulations of quantum mechanics, such as Bohmian mechanics, which are based on different postulates. Therefore, these results could stimulate attempts to go beyond the standard formalism of quantum mechanics, which, despite great successes in predicting experimental results, is often considered inadequate from an interpretative point of view [9].

    References

    1. C. N. Yang, “Square root of minus one, complex phases and Erwin Schrödinger,” Selected Papers II with Commentary (World Scientific, Hackensack, 2013).
    2. M. McKague et al., “Simulating quantum systems using real Hilbert spaces,” Phys. Rev. Lett. 102, 020505 (2009).
    3. M.-C. Chen et al., “Ruling out real-valued standard formalism of quantum theory,” Phys. Rev. Lett. 128, 040403 (2022).
    4. Z.-D. Li et al., “Testing real quantum theory in an optical quantum network,” Phys. Rev. Lett. 128, 040402 (2022).
    5. A. Aspect, “Closing the door on Einstein and Bohr’s quantum debate,” Physics 8, 123 (2015).
    6. N. Gisin, “Bell Inequalities: Many Questions, a Few Answers,” in Quantum Reality, Relativistic Causality, and Closing the Epistemic Circle, edited by W. C. Myrvold et al., The Western Ontario Series in Philosophy of Science, Vol. 73 (Springer, Dordrecht, 2009).
    7. M.-O. Renou et al., “Quantum theory based on real numbers can be experimentally falsified,” Nature 600, 625 (2021).
    8. J.-W. Pan et al., “Experimental entanglement swapping: Entangling photons that never interacted,” Phys. Rev. Lett. 80, 3891 (1998).
    9. T. Norsen, Foundations of Quantum Mechanics – An Exploration of the Physical Meaning of Quantum Theory, Undergraduate Lecture Notes in Physics (Springer, Cham, 2017).

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Physicists are drowning in a flood of research papers in their own fields and coping with an even larger deluge in other areas of physics. How can an active researcher stay informed about the most important developments in physics? Physics (US) highlights a selection of papers from the Physical Review journals. In consultation with expert scientists, the editors choose these papers for their importance and/or intrinsic interest. To highlight these papers, Physics features three kinds of articles: Viewpoints are commentaries written by active researchers, who are asked to explain the results to physicists in other subfields. Focus stories are written by professional science writers in a journalistic style and are intended to be accessible to students and non-experts. Synopses are brief editor-written summaries. Physics provides a much-needed guide to the best in physics, and we welcome your comments.

     
  • richardmitnick 10:13 pm on January 22, 2022 Permalink | Reply
    Tags: "This New Record in Laser Beam Stability Could Help Answer Physics' Biggest Questions", , , , , Quantum Mechanics, ,   

    From The University of Western Australia (AU) via Science Alert (AU): “This New Record in Laser Beam Stability Could Help Answer Physics’ Biggest Questions”


    From The University of Western Australia (AU)

    via

    Science Alert (AU)

    The laser setup at the University of Western Australia. Credit: D. Gozzard/UWA.

    22 JANUARY 2022
    DAVID NIELD

    Scientists are on a mission to create a global network of atomic clocks that will enable us to, among other things, better understand the fundamental laws of physics, investigate dark matter, and navigate across Earth and space more precisely.

    However, to be at their most effective, these clocks will need to be reliably and speedily linked together through layers of the atmosphere, which is far from easy. New research outlines a successful experiment with a laser beam that has been kept stable across a distance of 2.4 kilometers (1.5 miles).

    For comparison, the new link is around 100 times more stable than anything that’s been put together before. It also demonstrates stability that’s around 1,000 times better than the atomic clocks these lasers could be used to monitor.

    “The result shows that the phase and amplitude stabilization technologies presented in this paper can provide the basis for ultra-precise timescale comparison of optical atomic clocks through the turbulent atmosphere,” write the researchers in their published paper [Physical Review Letters].

    The system builds on research carried out last year in which scientists developed a laser link capable of holding its own through the atmosphere with unprecedented stability.

    In the new study, researchers shot a laser beam from a fifth-floor window to a reflector 1.2 kilometers (0.74 miles) away. The beam was then bounced back to its source, covering the full 2.4-kilometer round trip, and was held stable for a period of five minutes.

    Using noise reduction techniques, temperature controls, and tiny adjustments to the reflector, the team was able to keep the laser stable through the pockets of fluctuating air. Because the air is calmer and less dense higher in the atmosphere, the turbulence encountered along this ground-level path is likely equivalent to that of a ground-to-satellite link several hundred kilometers long.

    While laser accuracy has remained fairly constant for a decade or so, we’ve seen some significant improvements recently, including a laser setup operated by the Boulder Atomic Clock Optical Network (BACON) Collaboration and tested last March [Nature].

    That setup involved a pulse laser rather than the continuous wave laser tested in this new study. Both have their advantages in different scenarios, but continuous wave lasers offer better stability and can transfer more data in a set period of time.

    “Both systems beat the current best atomic clock, so we’re splitting hairs here, but our ultimate precision is better,” says astrophysicist David Gozzard from the University of Western Australia.

    Once an atomic clock network is put together, scientists will be able to test Albert Einstein’s theory of general relativity, and to probe how its incompatibility with what we know about quantum physics could be resolved.

    By very precisely comparing the time-keeping of two atomic clocks – one on Earth and one in space – scientists are eventually hoping to be able to work out where General Relativity does and doesn’t hold up. If Einstein’s ideas are correct, the clock further away from Earth’s gravity should tick ever-so-slightly faster.
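
    As a rough, back-of-the-envelope illustration (the numbers here are generic and are not taken from the article), the fractional rate difference between two clocks separated by a height h near Earth's surface is approximately

    \[
    \frac{\Delta f}{f} \;\approx\; \frac{g\,h}{c^{2}} \;\approx\; \frac{(9.8\ \mathrm{m\,s^{-2}})\,(1000\ \mathrm{m})}{(3\times 10^{8}\ \mathrm{m\,s^{-1}})^{2}} \;\approx\; 1\times 10^{-13}
    \]

    for a one-kilometre height difference. The best optical clocks reach fractional instabilities around 10^-18, corresponding to height differences of roughly a centimetre, which is why the links that compare such clocks must themselves be even more stable than the clocks they connect.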

    But its usefulness doesn’t stop there. Lasers like this could eventually be used for managing the launching of objects into orbit, for communications between Earth and space, or for connecting two points in space.

    “Of course, you can’t run fiber optic cable to a satellite,” says Gozzard.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The University of Western Australia is a public research university in the Australian state of Western Australia. The university’s main campus is in Perth, the state capital, with a secondary campus in Albany and various other facilities elsewhere.

    UWA was established in 1911 by an act of the Parliament of Western Australia and began teaching students two years later. It is the sixth-oldest university in Australia and was Western Australia’s only university until the establishment of Murdoch University (AU) in 1973. Because of its age and reputation, UWA is classed as one of the “sandstone universities”, an informal designation given to the oldest university in each state. The university also belongs to several more formal groupings, including The Group of Eight (AU) and The Matariki Network of Universities. In recent years, UWA has generally been ranked either in the bottom half of the world’s top 100 universities or just outside it, depending on the ranking system used.

    Alumni of UWA include one Prime Minister of Australia (Bob Hawke), five Justices of the High Court of Australia (including one Chief Justice, Robert French, now Chancellor), one Governor of the Reserve Bank (H. C. Coombs), various federal cabinet ministers, and seven of Western Australia’s eight most recent premiers. In 2018, alumnus and mathematician Akshay Venkatesh was a recipient of the Fields Medal. As of 2021, the university had produced 106 Rhodes Scholars. Two members of the UWA faculty, Barry Marshall and Robin Warren, won Nobel Prizes as a result of research at the university.

    History

    The university was established in 1911 following the tabling of proposals by a royal commission in September 1910. The original campus, which received its first students in March 1913, was located on Irwin Street in the centre of Perth, and consisted of several buildings situated between Hay Street and St Georges Terrace. Irwin Street was also known as “Tin Pan Alley” as many buildings featured corrugated iron roofs. These buildings served as the university campus until 1932, when the campus relocated to its present-day site in Crawley.

    The founding chancellor, Sir John Winthrop Hackett, died in 1916, and bequeathed property which, after being carefully managed for ten years, yielded £425,000 to the university, a far larger sum than expected. This allowed the construction of the main buildings. Many buildings and landmarks within the university bear his name, including Winthrop Hall and Hackett Hall. In addition, his bequest funded many scholarships, because he did not wish eager students to be deterred from studying simply because they could not afford it.

    During UWA’s first decade there was controversy about whether the policy of free education was compatible with high expenditure on professorial chairs and faculties. An “old student” publicised his concern in 1921 that there were 13 faculties serving only 280 students.

    A remnant of the original buildings survives to this day in the form of the “Irwin Street Building”, so called after its former location. In the 1930s it was transported to the new campus and served a number of uses till its 1987 restoration, after which it was moved across campus to James Oval. Recently, the building has served as the Senate meeting room and is currently in use as a cricket pavilion and office of the university archives. The building has been heritage-listed by both the National Trust and the Australian Heritage Council.

    The university introduced the Doctor of Philosophy degree in 1946 and made its first award in October 1950 to Warwick Bottomley for his research on the chemistry of native plants in Western Australia.

     
  • richardmitnick 11:37 am on January 22, 2022 Permalink | Reply
    Tags: "Towards compact quantum computers thanks to topology", At SLS the PSI researchers used an investigation method called soft X-ray angle-resolved photoelectron spectroscopy – SX-ARPES for short., By now the future of computing is inconceivable without quantum computers., Indium antimonide has a particularly low electron density below its oxide layer. This would be advantageous for the formation of topological Majorana fermions in the planned nanowires., It is known that thin-film systems of certain semiconductors and superconductors could lead to exotic electron states that would act as such topological qubits., Majorana fermions are topological states. They could therefore act as information carriers-ergo as quantum bits in a quantum computer., Most types of qubits unfortunately lose their information quickly., , Quantum bits-or qubits for short-form the basis of quantum computers., Quantum Mechanics, Quasiparticles in semiconductor nanowires, Researchers at The Paul Scherrer Institute [Paul Scherrer Institut](CH)] have compared the electron distribution below the oxide layer of two semiconductors., Scientists at Paul Scherrer Institute want to help create a new kind of qubit that is immune to leakage of information., So-called topological quantum bits are a novel type that might prove to be superior., , The researchers hope to obtain such immunity with so-called topological quantum bits., The researchers investigated two different semiconductors and their natural oxide layer: on the one hand indium arsenide and on the other indium antimonide.   

    From The Paul Scherrer Institute [Paul Scherrer Institut](CH): “Towards compact quantum computers thanks to topology” 

    From The Paul Scherrer Institute [Paul Scherrer Institut](CH)

    20 January 2022
    Laura Hennemann

    Researchers at The Paul Scherrer Institute [Paul Scherrer Institut](CH) have compared the electron distribution below the oxide layer of two semiconductors. The investigation is part of an effort to develop particularly stable quantum bits and thus, in turn, particularly efficient quantum computers. They have now published their latest research, which is supported in part by Microsoft, in the scientific journal Advanced Quantum Technologies.

    By now the future of computing is inconceivable without quantum computers. For the most part, these are still in the research phase. They hold the promise of speeding up certain calculations and simulations by orders of magnitude compared to classical computers.

    Quantum bits, or qubits for short, form the basis of quantum computers. So-called topological quantum bits are a novel type that might prove to be superior. To find out how these could be created, an international team of researchers has carried out measurements at the Swiss Light Source SLS at The Paul Scherrer Institute [Paul Scherrer Institut](CH).

    Niels Schröter (left) and Vladimir Strocov at one of the experiment stations of the Swiss Light Source SLS at PSI. Here the researchers used soft X-ray angle-resolved photoelectron spectroscopy to measure the electron distribution below the oxide layer of indium arsenide as well as indium antimonide.
    Photo: Mahir Dzambegovic/Paul Scherrer Institute.

    More stable quantum bits

    Gabriel Aeppli, head of the Photon Science Division at PSI, specialises in the study of quantum materials.
    Photo: Thomas Baumann.

    “Computer bits that follow the laws of quantum mechanics can be achieved in different ways,” explains Niels Schröter, one of the study’s authors. He was a researcher at PSI until April 2021, when he moved to The MPG Institute of Microstructure Physics [MPG für Mikrostrukturphysik] (DE). “Most types of qubits unfortunately lose their information quickly; you could say they are forgetful qubits.” There is a technical solution to this: Each qubit is backed up with a system of additional qubits that correct any errors that occur. But this means that the total number of qubits needed for an operational quantum computer quickly rises into the millions.

    “Microsoft’s approach, which we are now collaborating on, is quite different,” Schröter continues. “We want to help create a new kind of qubit that is immune to leakage of information. This would allow us to use just a few qubits to achieve a slim, functioning quantum computer.”

    The researchers hope to obtain such immunity with so-called topological quantum bits. These would be something completely new that no research group has yet been able to create.

    Topological materials became more widely known through the Nobel Prize in Physics in 2016. Topology is originally a field of mathematics that explores, among other things, how geometric objects behave when they are deformed. However, the mathematical language developed for this can also be applied to other physical properties of materials. Quantum bits in topological materials would then be topological qubits.

    Quasiparticles in semiconductor nanowires

    It is known that thin-film systems of certain semiconductors and superconductors could lead to exotic electron states that would act as such topological qubits. Specifically, ultra-thin, short wires made of a semiconductor material could be considered for this purpose. These have a diameter of only 100 nanometres and are 1,000 nanometres (i.e., 0.0001 centimetres) long. On their outer surface, in the longitudinal direction, the top half of the wires is coated with a thin layer of a superconductor. The rest of the wire is not coated so that a natural oxide layer forms there. Computer simulations for optimising these components predict that the crucial, quantum mechanical electron states are only located at the interface between the semiconductor and the superconductor and not between the semiconductor and its oxide layer.

    “The collective, asymmetric distribution of electrons generated in these nanowires can be physically described as so-called quasiparticles,” says Gabriel Aeppli, head of the Photon Science Division at PSI, who was also involved in the current study. “Now, if suitable semiconductor and superconductor materials are chosen, these electrons should give rise to special quasiparticles called Majorana fermions at the ends of the nanowires.”

    Majorana fermions are topological states. They could therefore act as information carriers, ergo as quantum bits in a quantum computer. “Over the course of the last decade, recipes to create Majorana fermions have already been studied and refined by research groups around the world,” Aeppli continues. “But to continue with this analogy: we still didn’t know which cooking pot would give us the best results for this recipe.”

    Indium antimonide has the advantage

    A central concern of the current research project was therefore the comparison of two “cooking pots”. The researchers investigated two different semiconductors and their natural oxide layer: on the one hand indium arsenide and on the other indium antimonide.

    At SLS the PSI researchers used an investigation method called soft X-ray angle-resolved photoelectron spectroscopy – SX-ARPES for short. A novel computer model developed by Noa Marom’s group at Carnegie Mellon University, USA, together with Vladimir Strocov from PSI, was used to interpret the complex experimental data. “The computer models used up to now led to an unmanageably large number of spurious results. With our new method, we can now look at all the results, automatically filter out the physically relevant ones, and properly interpret the experimental outcome,” explains Strocov.

    Through their combination of SX-ARPES experiments and computer models, the researchers have now been able to show that indium antimonide has a particularly low electron density below its oxide layer. This would be advantageous for the formation of topological Majorana fermions in the planned nanowires.

    “From the point of view of electron distribution under the oxide layer, indium antimonide is therefore better suited than indium arsenide to serve as a carrier material for topological quantum bits,” concludes Niels Schröter. However, he points out that in the search for the best materials for a topological quantum computer, other advantages and disadvantages must certainly be weighed against each other. “Our advanced spectroscopic methods will certainly be instrumental in the quest for the quantum computing materials,” says Strocov. “PSI is currently taking big steps to expand quantum research and engineering in Switzerland, and SLS is an essential part of that.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The Paul Scherrer Institute [Paul Scherrer Institut] (CH) is the largest research institute for natural and engineering sciences within Switzerland. We perform world-class research in three main subject areas: Matter and Material; Energy and the Environment; and Human Health. By conducting fundamental and applied research, we work on long-term solutions for major challenges facing society, industry and science.

    The Paul Scherrer Institute (PSI) is a multi-disciplinary research institute for natural and engineering sciences in Switzerland. It is located in the Canton of Aargau in the municipalities Villigen and Würenlingen on either side of the River Aare, and covers an area over 35 hectares in size. Like ETH Zürich [Swiss Federal Institute of Technology ETH Zürich; Eidgenössische Technische Hochschule Zürich](CH) and EPFL [Swiss Federal Institute of Technology in Lausanne; École polytechnique fédérale de Lausanne](CH), PSI belongs to The Domain of the Swiss Federal Institutes of Technology (ETH Domain) [ETH-Bereich; Domaine des Écoles polytechniques fédérales] (CH). The PSI employs around 2100 people. It conducts basic and applied research in the fields of matter and materials, human health, and energy and the environment. About 37% of PSI’s research activities focus on material sciences, 24% on life sciences, 19% on general energy, 11% on nuclear energy and safety, and 9% on particle physics.

    PSI develops, builds and operates large and complex research facilities and makes them available to the national and international scientific communities. In 2017, for example, more than 2500 researchers from 60 different countries came to PSI to take advantage of the concentration of large-scale research facilities in the same location, which is unique worldwide. About 1900 experiments are conducted each year at the approximately 40 measuring stations in these facilities.

    In recent years, the institute has been one of the largest recipients of money from the Swiss lottery fund.

    Research and specialist areas

    Paul Scherrer Institute develops, builds and operates several accelerator facilities, e.g. a 590 MeV high-current cyclotron, which in normal operation supplies a beam current of about 2.2 mA. PSI also operates four large-scale research facilities: a synchrotron light source (SLS), which is particularly brilliant and stable, a spallation neutron source (SINQ), a muon source (SμS) and an X-ray free-electron laser (SwissFEL).


    This makes PSI currently (2020) the only institute in the world to provide the four most important probes for researching the structure and dynamics of condensed matter (neutrons, muons and synchrotron radiation) on a campus for the international user community. In addition, HIPA’s target facilities also produce pions that feed the muon source, and the Ultracold Neutron source UCN produces very slow, ultracold neutrons. All these particle types are used for research in particle physics.

     
  • richardmitnick 6:03 pm on January 21, 2022 Permalink | Reply
    Tags: "BQL" and "BQuL", "Computer Scientists Eliminate Pesky Quantum Computations", 28 years ago computer scientists established that for quantum algorithms you can wait until the end of a computation to make intermediate measurements without changing the final result., , , If at any point in a calculation you need to access the information contained in a qubit and you measure it the qubit collapses., Instead of encoding information in the 0s and 1s of typical bits quantum computers encode information in higher-dimensional combinations of bits called qubits., Proof that any quantum algorithm can be rearranged to move measurements performed in the middle of the calculation to the end of the process., , , Quantum Mechanics, The basic difference between quantum computers and the computers we have at home is the way each stores information., This collapse possibly affects all the other qubits in the system., Virtually all algorithms require knowing the value of a computation as it’s in progress.   

    From Quanta Magazine (US): “Computer Scientists Eliminate Pesky Quantum Computations” 

    From Quanta Magazine (US)

    January 19, 2022
    Nick Thieme

    Credit: Samuel Velasco/Quanta Magazine.

    As quantum computers have become more functional, our understanding of them has remained muddled. Work by a pair of computer scientists [Symposium on Theory of Computing] has clarified part of the picture, providing insight into what can be computed with these futuristic machines.

    “It’s a really nice result that has implications for quantum computation,” said John Watrous of The University of Waterloo (CA).

    The research, posted in June 2020 by Bill Fefferman and Zachary Remscrim of The University of Chicago (US), proves that any quantum algorithm can be rearranged to move measurements performed in the middle of the calculation to the end of the process, without changing the final result or drastically increasing the amount of memory required to carry out the task. Previously, computer scientists thought that the timing of those measurements affected memory requirements, creating a bifurcated view of the complexity of quantum algorithms.

    “This has been quite annoying,” said Fefferman. “We’ve had to talk about two complexity classes — one with intermediate measurements and one without.”

    This issue applies exclusively to quantum computers due to the unique way they work. The basic difference between quantum computers and the computers we have at home is the way each stores information. Instead of encoding information in the 0s and 1s of typical bits, quantum computers encode information in higher-dimensional combinations of bits called qubits.

    This approach enables denser information storage and sometimes faster calculations. But it also presents a problem. If at any point in a calculation you need to access the information contained in a qubit and you measure it, the qubit collapses from a delicate combination of simultaneously possible bits into a single definite one, possibly affecting all the other qubits in the system.

    This can be a problem because virtually all algorithms require knowing the value of a computation as it’s in progress. For instance, an algorithm may contain a statement like “If the variable x is a number, multiply it by 10; if not, leave it alone.” Performing these steps would seem to require knowing what x is at that moment in the computation — a potential challenge for quantum computers, where measuring the state of a particle (to determine what x is) inherently changes it.

    But 28 years ago, computer scientists proved it’s possible to avoid this kind of no-win situation. They established that for quantum algorithms, you can wait until the end of a computation to make intermediate measurements without changing the final result.

    An essential part of that result showed that you can push intermediate measurements to the end of a computation without drastically increasing the total running time. These features of quantum algorithms — that measurements can be delayed without affecting the answer or the runtime — came to be called the principle of deferred measurement.

    This principle fortifies quantum algorithms, but at a cost. Deferring measurements uses a great deal of extra memory space, essentially one extra qubit per deferred measurement. While one bit per measurement might take only a tiny toll on a classical computer with 4 trillion bits, it’s prohibitive given the limited number of qubits currently in the largest quantum computers.
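
    The deferral trick itself can be seen in a small numerical sketch (an illustrative toy added here; the two-qubit circuit and the variable names are chosen for brevity and are not from the paper). Instead of measuring a qubit mid-circuit and classically branching on the outcome, the branch is replaced by a controlled gate and everything is measured at the end; the output statistics are identical. In this minimal example the control qubit itself carries the deferred outcome, but in general each deferred measurement is recorded on a fresh ancilla qubit, which is where the memory overhead described above comes from.

    import numpy as np

    rng = np.random.default_rng(0)
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    X = np.array([[0, 1], [1, 0]])

    def sample_with_midcircuit_measurement(shots=10000):
        """Measure q0 mid-circuit; apply X to q1 only if the outcome was 1."""
        counts = {"00": 0, "01": 0, "10": 0, "11": 0}
        for _ in range(shots):
            q0 = H @ np.array([1.0, 0.0])                # q0 prepared in |+>
            outcome = int(rng.random() < abs(q0[1])**2)  # intermediate measurement
            q1 = X @ np.array([1.0, 0.0]) if outcome else np.array([1.0, 0.0])
            counts[f"{outcome}{int(abs(q1[1])**2 > 0.5)}"] += 1
        return counts

    def sample_with_deferred_measurement(shots=10000):
        """Replace the classically controlled X by a CNOT; measure only at the end."""
        CNOT = np.array([[1, 0, 0, 0],
                         [0, 1, 0, 0],
                         [0, 0, 0, 1],
                         [0, 0, 1, 0]])
        psi = CNOT @ np.kron(H @ np.array([1.0, 0.0]), np.array([1.0, 0.0]))
        probs = np.abs(psi)**2
        probs /= probs.sum()
        draws = rng.choice(4, size=shots, p=probs)       # single final measurement
        return {lab: int((draws == k).sum()) for k, lab in enumerate(["00", "01", "10", "11"])}

    print(sample_with_midcircuit_measurement())  # ~50/50 split between "00" and "11"
    print(sample_with_deferred_measurement())    # same statistics, no mid-circuit collapse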

    Google 53-qubit “Sycamore” superconducting processor quantum computer.

    IBM Unveils Breakthrough 127-Qubit Quantum Processor. Credit: IBM Corp.

    Fefferman and Remscrim’s work resolves this issue in a surprising way. With an abstract proof, they show that subject to a few caveats, anything calculable with intermediate measurements can be calculated without them. Their proof offers a memory-efficient way to defer intermediate measurements — circumventing the memory problems that such measurements created.


    “In the most standard scenario, you don’t need intermediate measurements,” Fefferman said.

    Fefferman and Remscrim achieved their result by showing that a representative problem called “well-conditioned matrix powering” is, in a way, equivalent to a different kind of problem with important properties.

    The “well-conditioned matrix powering” problem effectively asks you to find the values for particular entries in a type of matrix (an array of numbers), given some conditions. Fefferman and Remscrim proved that matrix powering is just as hard as any other quantum computing problem that allows for intermediate measurements. This set of problems is called “BQL”, and the team’s work meant that matrix powering could serve as a representative for all other problems in that class — so anything they proved about matrix powering would be true for all other problems involving intermediate measurements.
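
    For readers unfamiliar with the problem, here is a purely classical toy version of what matrix powering asks (the matrix, exponent and indices below are arbitrary choices for illustration; the quantum-complexity subtleties about precision, conditioning and limited memory are not captured here): given a matrix M, an exponent k and a pair of indices, estimate the corresponding entry of M^k.

    import numpy as np

    def matrix_power_entry(M, k, i, j):
        """Toy classical stand-in for 'matrix powering': the (i, j) entry of M^k."""
        return np.linalg.matrix_power(M, k)[i, j]

    # A small, well-behaved matrix chosen purely as an example
    M = np.array([[0.9, 0.1],
                  [0.2, 0.8]])
    print(matrix_power_entry(M, 10, 0, 1))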

    At this point, the researchers took advantage of some of their earlier work. In 2016, Fefferman and Cedric Lin proved that a related problem called “well-conditioned matrix inversion” was equivalent to the hardest problem in a very similar class of problems called “BQuL”. This class is like BQL’s little sibling. It’s identical to BQL, except that it comes with the requirement that every problem in the class must also be reversible.

    In quantum computing, the distinction between reversible and irreversible measurements is essential. If a calculation measures a qubit, it collapses the state of the qubit, making the initial information impossible to recover. As a result, all measurements in quantum algorithms are innately irreversible.

    That means that BQuL is not just the reversible version of BQL; it’s also BQL without any intermediate measurements (because intermediate measurements, like all quantum measurements, would be irreversible, violating the signal condition of the class). The 2016 work proved that matrix inversion is a prototypical quantum calculation without intermediate measurements — that is, a fully representative problem for BQuL.

    The new paper builds on that by connecting the two, proving that well-conditioned matrix powering, which represents all problems with intermediate measurements, can be reduced to well-conditioned matrix inversion, which represents all problems that cannot feature intermediate measurements. In other words, any quantum computing problem with intermediate measurements can be reduced to a quantum computing problem without intermediate measurements.

    This means that for quantum computers with limited memory, researchers no longer need to worry about intermediate measurements when classifying the memory needs of different types of quantum algorithms.

    In 2020, a group of researchers at Princeton University (US) — Ran Raz, Uma Girish and Wei Zhan — independently proved a slightly weaker but nearly identical result that they posted three days after Fefferman and Remscrim’s work. Raz and Girish later extended the result, proving that intermediate measurements can be deferred in both a time-efficient and space-efficient way for a more limited class of computers.

    Altogether, the recent work provides a much better understanding of how limited-memory quantum computation works. With this theoretical guarantee, researchers have a road map for translating their theory into applied algorithms. Quantum algorithms are now free, in a sense, to proceed without the prohibitive costs of deferred measurements.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    Formerly known as Simons Science News, Quanta Magazine (US) is an editorially independent online publication launched by the Simons Foundation to enhance public understanding of science. Why Quanta? Albert Einstein called photons “quanta of light.” Our goal is to “illuminate science.” At Quanta Magazine, scientific accuracy is every bit as important as telling a good story. All of our articles are meticulously researched, reported, edited, copy-edited and fact-checked.

     
  • richardmitnick 4:07 pm on January 20, 2022 Permalink | Reply
    Tags: "Going beyond the exascale", , , Classical computers have been central to physics research for decades., , , , Fermilab has used classical computing to simulate lattice quantum chromodynamics., , , , Planning for a future that is still decades out., Quantum computers could enable physicists to tackle questions even the most powerful computers cannot handle., , Quantum computing is here—sort of., Quantum Mechanics, Solving equations on a quantum computer requires completely new ways of thinking about programming and algorithms., , , The biggest place where quantum simulators will have an impact is in discovery science.   

    From Symmetry: “Going beyond the exascale” 


    From Symmetry

    01/20/22
    Emily Ayshford

    Illustration by Sandbox Studio, Chicago with Ana Kova.

    Quantum computers could enable physicists to tackle questions even the most powerful computers cannot handle.

    After years of speculation, quantum computing is here—sort of.

    Physicists are beginning to consider how quantum computing could provide answers to the deepest questions in the field. But most aren’t getting caught up in the hype. Instead, they are taking what for them is a familiar tack—planning for a future that is still decades out, while making room for pivots, turns and potential breakthroughs along the way.

    “When we’re working on building a new particle collider, that sort of project can take 40 years,” says Hank Lamm, an associate scientist at The DOE’s Fermi National Accelerator Laboratory (US). “This is on the same timeline. I hope to start seeing quantum computing provide big answers for particle physics before I die. But that doesn’t mean there isn’t interesting physics to do along the way.”

    Equations that overpower even supercomputers.

    Classical computers have been central to physics research for decades, and simulations that run on classical computers have guided many breakthroughs. Fermilab, for example, has used classical computing to simulate lattice quantum chromodynamics. Lattice QCD is a set of equations that describe the interactions of quarks and gluons via the strong force.

    Theorists developed lattice QCD in the 1970s. But applying its equations proved extremely difficult. “Even back in the 1980s, many people said that even if they had an exascale computer [a computer that can perform a billion billion calculations per second], they still couldn’t calculate lattice QCD,” Lamm says.

    Depiction of ANL ALCF Cray Intel SC18 Shasta Aurora exascale supercomputer, to be built at DOE’s Argonne National Laboratory (US).

    Depiction of ORNL Cray Frontier Shasta based Exascale supercomputer with Slingshot interconnect featuring high-performance AMD EPYC CPU and AMD Radeon Instinct GPU technology , being built at DOE’s Oak Ridge National Laboratory (US).

    But that turned out not to be true.

    Within the past 10 to 15 years, researchers have discovered the algorithms needed to make their calculations more manageable, while learning to understand theoretical errors and how to ameliorate them. These advances have allowed them to use a lattice simulation, a simulation that uses a volume of a specified grid of points in space and time as a substitute for the continuous vastness of reality.
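
    To make the idea of a lattice simulation concrete, here is a minimal toy example (a two-dimensional Ising model updated with the Metropolis algorithm; this is a deliberately simple stand-in, not lattice QCD, and the lattice size, coupling and sweep count are arbitrary choices). Continuous space-time is replaced by a finite grid, and a physical quantity is estimated as an average over sampled configurations:

    import numpy as np

    rng = np.random.default_rng(1)
    L, beta, sweeps = 16, 0.44, 200            # lattice size, coupling, Monte Carlo sweeps
    spins = rng.choice([-1, 1], size=(L, L))   # field values living on the grid points

    def metropolis_sweep(s):
        """One Metropolis sweep over the lattice: propose single-site flips."""
        for _ in range(s.size):
            i, j = rng.integers(L, size=2)
            # Sum of the four nearest neighbours, with periodic boundary conditions
            nn = s[(i + 1) % L, j] + s[(i - 1) % L, j] + s[i, (j + 1) % L] + s[i, (j - 1) % L]
            dE = 2 * s[i, j] * nn              # energy cost of flipping site (i, j)
            if dE <= 0 or rng.random() < np.exp(-beta * dE):
                s[i, j] *= -1

    mags = []
    for sweep in range(sweeps):
        metropolis_sweep(spins)
        if sweep > 50:                         # crude thermalization cut
            mags.append(abs(spins.mean()))

    # An "observable" estimated as an average over sampled lattice configurations
    print("average |magnetization| per site:", np.mean(mags))

    Lattice QCD replaces these spins with quark and gluon fields and a far more complicated action, but the basic logic of discretizing, sampling and averaging is the same.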

    Lattice simulations have allowed physicists to calculate the mass of the proton—a particle made up of quarks and gluons all interacting via the strong force—and find that the theoretical prediction lines up well with the experimental result. The simulations have also allowed them to accurately predict the temperature at which quarks should detach from one another in a quark-gluon plasma.

    Quark-Gluon Plasma from BNL Relative Heavy Ion Collider (US).

    DOE’s Brookhaven National Laboratory(US) RHIC Campus

    The limit of these calculations? Along with being approximate, or based on a confined, hypothetical area of space, only certain properties can be computed efficiently. Try to look at more than that, and even the biggest high-performance computer cannot handle all of the possibilities.

    Enter quantum computers.

    Quantum computers are all about possibilities. Classical computers don’t have the memory to compute the many possible outcomes of lattice QCD problems, but quantum computers take advantage of quantum mechanics to calculate differently.

    Quantum computing isn’t an easy answer, though. Solving equations on a quantum computer requires completely new ways of thinking about programming and algorithms.

    Using a classical computer, when you program code, you can look at its state at all times. You can check a classical computer’s work before it’s done and trouble-shoot if things go wrong. But under the laws of quantum mechanics, you cannot observe any intermediate step of a quantum computation without corrupting the computation; you can observe only the final state.

    That means you can’t store any information in an intermediate state and bring it back later, and you cannot clone information from one set of qubits into another, making error correction difficult.

    “It can be a nightmare designing an algorithm for quantum computation,” says Lamm, who spends his days trying to figure out how to do quantum simulations for high-energy physics. “Everything has to be redesigned from the ground up. We are right at the beginning of understanding how to do this.”

    Just getting started

    Quantum computers have already proved useful in basic research. Condensed matter physicists—whose research relates to phases of matter—have spent much more time than particle physicists thinking about how quantum computers and simulators can help them. They have used quantum simulators to explore quantum spin liquid states [Science] and to observe a previously unobserved phase of matter called a prethermal time crystal [Science].

    “The biggest place where quantum simulators will have an impact is in discovery science, in discovering new phenomena like this that exist in nature,” says Norman Yao, an assistant professor at The University of California-Berkeley (US) and co-author on the time crystal paper.

    Quantum computers are showing promise in particle physics and astrophysics. Many physics and astrophysics researchers are using quantum computers to simulate “toy problems”—small, simple versions of much more complicated problems. They have, for example, used quantum computing to test parts of theories of quantum gravity [npj Quantum Information] or create proof-of-principle models, like models of the parton showers that emit from particle colliders [Physical Review Letters] such as the Large Hadron Collider.

    The European Organization for Nuclear Research [Organización Europea para la Investigación Nuclear][Organisation européenne pour la recherche nucléaire] [Europäische Organisation für Kernforschung](CH) [CERN].

    The European Organization for Nuclear Research [Organización Europea para la Investigación Nuclear][Organisation européenne pour la recherche nucléaire] [Europäische Organisation für Kernforschung](CH)[CERN] map.

    CERN LHC tube in the tunnel. Credit: Maximilien Brice and Julien Marius Ordan.

    SixTRack CERN LHC particles.

    “Physicists are taking on the small problems, ones that they can solve with other ways, to try to understand how quantum computing can have an advantage,” says Roni Harnik, a scientist at Fermilab. “Learning from this, they can build a ladder of simulations, through trial and error, to more difficult problems.”

    But just which approaches will succeed, and which will lead to dead ends, remains to be seen. Estimates of how many qubits will be needed to simulate big enough problems in physics to get breakthroughs range from thousands to (more likely) millions. Many in the field expect this to be possible in the 2030s or 2040s.

    “In high-energy physics, problems like these are clearly a regime in which quantum computers will have an advantage,” says Ning Bao, associate computational scientist at DOE’s Brookhaven National Laboratory (US). “The problem is that quantum computers are still too limited in what they can do.”

    Starting with physics

    Some physicists are coming at things from a different perspective: They’re looking to physics to better understand quantum computing.

    John Preskill is a physics professor at The California Institute of Technology (US) and an early leader in the field of quantum computing. A few years ago, he and Patrick Hayden, professor of physics at Stanford University (US), showed that if you entangled two photons and threw one into a black hole, decoding the information that eventually came back out via Hawking radiation would be significantly easier than if you had used non-entangled particles. Physicists Beni Yoshida and Alexei Kitaev then came up with an explicit protocol for such decoding, and Yao went a step further, showing that protocol could also be a powerful tool in characterizing quantum computers.

    “We took something that was thought about in terms of high-energy physics and quantum information science, then thought of it as a tool that could be used in quantum computing,” Yao says.

    That sort of cross-disciplinary thinking will be key to moving the field forward, physicists say.

    “Everyone is coming into this field with different expertise,” Bao says. “From computing, or physics, or quantum information theory—everyone gets together to bring different perspectives and figure out problems. There are probably many ways of using quantum computing to study physics that we can’t predict right now, and it will just be a matter of getting the right two people in a room together.”

    See the full article here.



    Please help promote STEM in your local schools.


    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 11:48 am on January 20, 2022 Permalink | Reply
    Tags: Quantum Mechanics, "Global leaders unveil responsible quantum computing guidelines", Quantum computing is set to step out of the shadows., The "next big thing" in tech., A hot topic among investors and research and development (R&D) communities., Confused? Don’t worry- quantum computing is inherently confusing., Quantum computing makes it possible to process vast amounts of information very quickly., Such things are not feasible with classical computers., A (quantum) leap into the unknown, Unlocking innovation for decades to come.

    From CSIROscope (AU): “Global leaders unveil responsible quantum computing guidelines” 


    From CSIROscope (AU)

    at

    CSIRO (AU)-Commonwealth Scientific and Industrial Research Organisation

    20 Jan, 2022
    Sophie Schmidt

    We’ve joined forces with The World Economic Forum to contribute to best-practice governance principles for quantum technologies.

    Quantum computing is promising to transform the way we think about and understand the world around us. Credit: Shutterstock.

    As 2022 arrives full of uncertainty, one thing remains guaranteed: quantum computing is set to step out of the shadows.

    Many are announcing quantum technology as the “next big thing” in tech. And it has become a hot topic among investors and research and development (R&D) communities. This is largely because quantum computing has the potential to solve problems that wouldn’t be possible using conventional ‘classical’ computers.

    Harder, better, faster, stronger – quantum computing is promising to transform the way we think about and understand the world around us.

    Right now, the technology is still at an early stage (as in, no one has built the first practical quantum computer). Even so, excitement is at an all-time high. Health care (pharmaceuticals), climate modelling, machine learning and cybersecurity are just a few examples of where quantum might deliver significant value.

    Confused? Don’t worry, quantum computing is inherently confusing.

    Hype aside, many of us are still struggling to understand how quantum computing works – and what makes it so ‘new’. And rightly so, according to Professor Jim Rabeau, Director of our Quantum Technologies Future Science Platform (FSP).

    “Understanding the power of quantum computing requires us to think differently about how information is processed,” Jim explains.

    Quantum computers use ‘qubits’ which can be electrons, photons or other small particles. Only the very non-intuitive science of quantum mechanics can explain the qubit behaviour.

    But what’s more useful to focus on is how using quantum mechanics enables us to conduct multiple operations all at once.

    Jim compares classical versus quantum computing using the analogy of old-school style phone directories.

    “Rather than searching line by line, page by page, imagine being able to instantly find the name and number you are looking for by looking at all pages and all lines at once,” he says.

    Quantum computing makes it possible to process vast amounts of information very quickly. This is simply because it is looking at all ‘possibilities’ (or in this case, names and numbers) simultaneously.

    “In the case of pharmaceuticals, it means we would be able to very quickly look at all possible structural combinations of atoms and molecules to form the perfect drug to address a particular disease,” Jim says.

    “Such things are not feasible with classical computers.”

    The bottom line is that the quest to harness the potential of quantum computing is on. We are rapidly making progress to close in on gaps between research and real-world applications.

    A (quantum) leap into the unknown

    Quantum technology has been on our minds a lot lately. For starters, our researchers are exploring how we could use quantum computers to outperform today’s computers. For example, quantum computers could crack the cryptography protocols that keep our data private, making current security measures virtually useless.

    It’s not just post-quantum cryptography we’re exploring. The ethical challenges associated with it need to also be assessed. For example, providing fair and secure data storage and communication systems.

    As with any new technology, the assumption is that humans will automatically know to do the right thing – or even that we will agree on what that might be. This is where developing and applying ethical standards and responsible governance guidelines can help.

    It’s not entirely about stopping CEOs in tech companies from using the technology for nefarious purposes. That is, as Responsible Innovation FSP Director Dr Justine Lacey says, “a little too simplistic.”

    “It suggests that all bad outcomes are merely the result of bad actors,” Justine says.

    There will always be a risk of an individual using technology in an unethical way.

    “But what’s easier to lose sight of is whether or not a technology is used to generate broad societal benefit. And this also means ensuring it is not used to inadvertently create harmful outcomes, by for example, overlooking certain socioeconomic groups, or undermining cybersecurity measures,” she says.

    To help ensure quantum technology benefits everyone, we recently joined forces with the World Economic Forum and contributed to their latest Insight Report released at the World Economic Forum Annual Meeting 2022.

    This report outlines a set of governance principles. They are the result of an extensive international multi-sector collaboration with stakeholders from across the globe. Their aim is to help guide responsible design and adoption of quantum technology by applying best-practice governance principles.

    New guidelines will help guide responsible quantum

    It’s a familiar discussion that has ramped up over the last 10 years around ethics and artificial intelligence (AI). Except this time around, according to Justine, we are getting on the front foot.

    “The development of global governance principles for quantum technology presents a rare opportunity to embed responsible innovation practices from a very early stage and well before we have seen wide application, uptake and commercialisation of the technology,” Justine explains.

    “It also comes at a time when those in the quantum technology community are starting to consider how the application of this technology may broadly impact our lives and society, and how we can steer its application toward producing more desirable societal outcomes.

    “If we look to similar discussions on responsible AI, it is clear a major stumbling block was not the development of high-level ethical principles to guide the development of responsible AI systems. In fact, hundreds of such frameworks and guidelines exist.

    “The real and persistent challenge has been in how to effectively operationalise those principles to transform the practice and deployment of those AI systems,” she says.

    Recognising this, the Forum has designed quantum technology governance guidelines specifically for adoption by quantum technology stakeholders. They stand apart from other ethical guidelines for new technologies by providing directed guidance and practical ‘off-the-shelf’ applicability.

    The World Economic Forum’s latest Insight report outlines a set of governance principles for quantum computing.

    Unlocking innovation for decades to come

    The guidelines drew on a diverse array of thinking around quantum technology from all over the world.

    Justine and Jim are excited to see the guidelines embedded not only in the research and development stage of quantum technology, but through early-stage translation, commercialisation and application.

    “It’s an ideal time to be embracing this,” Jim says.

    “I am really glad to have people like Justine to work alongside as we ramp up the effort to translate quantum technology research into viable industry applications, with active consideration and implementation of Responsible Innovation from the get-go.”

    See the full article here .



    Please help promote STEM in your local schools.

    Stem Education Coalition

    CSIRO campus

    CSIRO (AU)-Commonwealth Scientific and Industrial Research Organisation is Australia’s national science agency and one of the largest and most diverse research agencies in the world.

    CSIRO works with leading organisations around the world. From its headquarters in Canberra, CSIRO maintains more than 50 sites across Australia and in France, Chile and the United States, employing about 5,500 people.

    Federally funded scientific research began in Australia 104 years ago. The Advisory Council of Science and Industry was established in 1916 but was hampered by insufficient available finance. In 1926 the research effort was reinvigorated by establishment of the Council for Scientific and Industrial Research (CSIR), which strengthened national science leadership and increased research funding. CSIR grew rapidly and achieved significant early successes. In 1949 further legislated changes included renaming the organisation as CSIRO.

    Notable developments by CSIRO have included the invention of atomic absorption spectroscopy; essential components of Wi-Fi technology; development of the first commercially successful polymer banknote; the invention of the insect repellent in Aerogard and the introduction of a series of biological controls into Australia, such as the introduction of myxomatosis and rabbit calicivirus for the control of rabbit populations.

    Research and focus areas

    Research Business Units

    As at 2019, CSIRO’s research areas are identified as “Impact science” and organised into the following Business Units:

    Agriculture and Food
    Health and Biosecurity
    Data 61
    Energy
    Land and Water
    Manufacturing
    Mineral Resources
    Oceans and Atmosphere

    National Facilities

    CSIRO manages national research facilities and scientific infrastructure on behalf of the nation to assist with the delivery of research. The national facilities and specialized laboratories are available to both international and Australian users from industry and research. As at 2019, the following National Facilities are listed:

    Australian Animal Health Laboratory (AAHL)
    Australia Telescope National Facility – the Facility’s radio telescopes include the Australia Telescope Compact Array, the Parkes Observatory, Mopra Observatory and the Australian Square Kilometre Array Pathfinder.

    CSIRO Australia Telescope Compact Array (AU), an array of six 22-m radio antennas at the Paul Wild Observatory, located about twenty-five kilometres (16 mi) west of the town of Narrabri in Australia.

    CSIRO-Commonwealth Scientific and Industrial Research Organisation (AU) Parkes Observatory [Murriyang, the traditional Indigenous name], located 20 kilometres north of the town of Parkes, New South Wales, Australia, 414.80 m above sea level.

    CSIRO-Commonwealth Scientific and Industrial Research Organisation (AU) Mopra radio telescope

    Australian Square Kilometre Array Pathfinder

    NASA Canberra Deep Space Communication Complex, AU, Deep Space Network. Credit: The National Aeronautics and Space Agency (US)

    CSIRO Canberra campus

    ESA DSA 1, hosts a 35-metre deep-space antenna with transmission and reception in both S- and X-band and is located 140 kilometres north of Perth, Western Australia, near the town of New Norcia

    CSIRO-Commonwealth Scientific and Industrial Research Organisation (AU) R/V Investigator.

    UK Space NovaSAR-1 (UK) synthetic aperture radar satellite.

    CSIRO Pawsey Supercomputing Centre (AU)

    Magnus Cray XC40 supercomputer at Pawsey Supercomputer Centre Perth Australia

    Galaxy Cray XC30 Series Supercomputer at Pawsey Supercomputer Centre Perth Australia

    Pawsey Supercomputer CSIRO Zeus SGI Linux cluster

    Others not shown

    SKA

    SKA- Square Kilometer Array

    SKA Square Kilometre Array low frequency at Murchison Widefield Array, Boolardy station in outback Western Australia on the traditional lands of the Wajarri peoples.

    EDGES telescope in a radio quiet zone at the Murchison Radio-astronomy Observatory in Western Australia, on the traditional lands of the Wajarri peoples.

     
  • richardmitnick 12:07 pm on January 15, 2022 Permalink | Reply
    Tags: Quantum Mechanics, "From bits to qubits"

    From Symmetry: “From bits to qubits” 

    Symmetry Mag

    From Symmetry

    01/13/22
    Sarah Charley

    Illustration by Sandbox Studio, Chicago with Ana Kova.

    Quantum computers go beyond the binary.

    The first desktop computer was invented in the 1960s. But computing technology has been around for centuries, says Irfan Siddiqi, director of the Quantum Nanoelectronics Laboratory at The University of California- Berkeley (US).

    “An abacus is an ancient computer,” he says. “The materials science revolution made bits smaller, but the fundamental architecture hasn’t changed.”

    Both modern computers and abaci use basic units of information that have two possible states. In a classical computer, a binary digit (called a bit) is a 1 or a 0, represented by on-off switches in the hardware. On an abacus, a sliding bead can also be thought of as being “on” or “off,” based on its position (left or right on an abacus with horizontal rods, or up or down on an abacus with vertical ones). Bits and beads can form patterns that represent other numbers and, in the case of computers, letters and symbols.

    But what if there were even more possibilities? What if the beads of an abacus could sit in between two positions? What if the switches in a computer could consult each other before outputting a calculation?

    This is the fundamental idea behind quantum computers, which embrace the oddities of quantum mechanics to encode and process information.

    “Information in quantum mechanics is stored in very different ways than in classical mechanics, and that’s where the power comes from,” says Heather Gray, an assistant professor and particle physicist at UC Berkeley.

    Classical computer; classical mechanics

    Computing devices break down numbers into discrete components. A simple abacus could be made up of three rows: one with beads representing 100s, one with beads representing 10s, and one with beads representing 1s. In this case, the number 514 could be indicated by sliding to the right 5 beads in the 100s row, 1 bead in the 10s row, and 4 beads in the 1s row.

    The computer you may be using to read this article does something similar, counting by powers of two instead of 10s. In binary, the number 514 becomes 1000000010.
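    A quick sanity check of both the abacus-style decomposition and the binary form (an illustrative snippet, not part of the article):

    n = 514

    # Abacus-style base-10 decomposition: 5 hundreds, 1 ten, 4 ones.
    hundreds, rest = divmod(n, 100)
    tens, ones = divmod(rest, 10)
    print(hundreds, tens, ones)     # 5 1 4

    # The same number counted in powers of two.
    print(bin(n))                   # 0b1000000010
    print(int("1000000010", 2))     # 514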

    The more complex the task, the more bits or time a computer needs to perform the calculation. To speed things up, scientists have over the years found ways to fit more and more bits into a computer. “You can now have one trillion transistors on a small silicon chip, which is a far cry from the ancient Chinese abacus,” Siddiqi says.

    But as engineers make transistors smaller and smaller, they’ve started to notice some funny effects.

    The quantum twist on computing

    Bits that behave classically are determinate: A 1 is a 1. But at very small scales, an entirely new set of physical rules comes into play.

    “We are hitting the quantum limits,” says Alberto Di Meglio, the head of CERN’s Quantum Technology Initiative. “As the scale of classic computing technology becomes smaller and smaller, quantum mechanics’ effects are not negligible anymore, and we do not want this in classic computers.”

    But quantum computers use quantum mechanics to their benefit. Rather than offering decisive answers, quantum bits, called qubits, behave like a distribution of probable values.

    Di Meglio likens qubits to undecided voters in an election. “You might know how a particular person is likely to vote, but until you actually ask them to vote, you won’t have a definite answer,” Di Meglio says.

    Qubits can be made from subatomic particles, such as electrons. Like other, similar particles, electrons have a property called spin that can exist in one of two possible states (spin-up or spin-down).

    If we think of these electrons as undecided voters, the question they are voting on is their direction of spin. Quantum computers process information while the qubits are still undecided—somewhere in between spin-up and spin-down.

    The situation becomes even more complicated when the “voters” can influence one another. This happens when two qubits are entangled. “For example, if one person votes yes, then an entangled ‘undecided’ voter will automatically vote no,” Di Meglio says. “The relationships become important, and the more voters you put together, the more chaotic it becomes.”

    When the qubits start talking to each other, each qubit can find itself in many different configurations, Siddiqi says. “An entangled array of qubits—with ‘n’ number of qubits—can exist in 2^n configurations. A quantum computer with 300 good qubits would have 2^300 possible configurations, which is more than the number of particles in the known universe.”
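    One way to make “2^n configurations” concrete: describing an n-qubit state on a classical machine takes 2^n complex amplitudes, and the probability of each measurement outcome is the squared magnitude of its amplitude (the Born rule). A minimal sketch, assuming NumPy is available and using an illustrative function name:

    import numpy as np

    def uniform_superposition(n_qubits: int) -> np.ndarray:
        """State vector with equal amplitude on every one of the 2**n basis configurations."""
        dim = 2 ** n_qubits
        return np.full(dim, 1 / np.sqrt(dim), dtype=complex)

    state = uniform_superposition(10)          # 2**10 = 1,024 complex amplitudes
    probs = np.abs(state) ** 2                 # Born rule: outcome probabilities
    print(len(state), round(probs.sum(), 10))  # 1024 1.0

    # Memory for the amplitudes alone (complex128 = 16 bytes each) doubles with every qubit:
    for n in (10, 30, 50):
        print(f"{n} qubits -> {2**n * 16 / 1e9:.3g} GB of amplitudes")

    At 50 qubits the raw amplitudes alone would occupy about 18 million gigabytes, which is why classical memory cannot keep up for long.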

    With great power comes great… noise

    Entanglement allows a quantum computer to perform a complex task in a fraction of the time it would take a classical computer. But entanglement is also the quantum computer’s greatest weakness.

    “A qubit can get entangled with something else that you don’t have access to,” Siddiqi says. “Information can leave the system.”

    An electron from the computer’s power supply or a stray photon can entangle with a qubit and make it go rogue.

    “Quantum computing is not just about the number of qubits,” Di Meglio says. “You might have a quantum computer with thousands of qubits, but only a fraction are reliable.”

    Because of the problem of rogue qubits, today’s quantum computers are classified as noisy intermediate-scale quantum, or NISQ, devices. “Most quantum computers look like a physics experiment,” Gray says. “We’re very far from having one you could use at home.”

    But scientists are trying. In the future, scientists hope that they can use quantum computers to quickly search through large databases and calculate complex mathematical matrices.

    Today, physicists are already experimenting with quantum computers to simulate quantum processes, such as how particles interact with each other inside the detectors at the Large Hadron Collider. “You can do all sorts of cool things with entangled qubits,” Gray says.

    See the full article here .



    Please help promote STEM in your local schools.


    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 11:41 am on January 15, 2022 Permalink | Reply
    Tags: Quantum Mechanics, , , "A century of quantum mechanics questions the fundamental nature of reality", The quantum revolution upended our understanding of nature and a lot of uncertainty remains., Scientists have dug deep enough to discover that reality’s foundations do not mirror the world of everyday appearances., At its roots reality is described by the mysterious set of mathematical rules known as quantum mechanics., Quantum mechanics is the math that explains matter., The physics of the microworld, Reality isn’t what it seems., Quantum theory represents the ultimate outcome of superior logical reasoning., Science morphs from dictator to oddsmaker: quantum math tells only probabilities for different possible outcomes. Some uncertainty always remains.   

    From California Institute of Technology (US) via Science News (US) : “A century of quantum mechanics questions the fundamental nature of reality” 

    Caltech Logo

    From California Institute of Technology (US)

    via

    Science News (US)

    January 12, 2022
    Tom Siegfried

    Quantum theory describes a reality ruled by probabilities. How to reconcile that reality with everyday experiences is still unclear. Credit: Max Löffler.

    The quantum revolution upended our understanding of nature and a lot of uncertainty remains.

    Scientists are like prospectors, excavating the natural world seeking gems of knowledge about physical reality. And in the century just past, scientists have dug deep enough to discover that reality’s foundations do not mirror the world of everyday appearances. At its roots reality is described by the mysterious set of mathematical rules known as quantum mechanics.

    Conceived at the turn of the 20th century and then emerging in its full form in the mid-1920s, quantum mechanics is the math that explains matter. It’s the theory for describing the physics of the microworld, where atoms and molecules interact to generate the world of human experience. And it’s at the heart of everything that made the century just past so dramatically unlike the century preceding it. From cell phones to supercomputers, DVDs to pdfs, quantum physics fueled the present-day electronics-based economy, transforming commerce, communication and entertainment.

    But quantum theory taught scientists much more than how to make computer chips. It taught that reality isn’t what it seems.

    “The fundamental nature of reality could be radically different from our familiar world of objects moving around in space and interacting with each other,” physicist Sean Carroll suggested in a recent tweet. “We shouldn’t fool ourselves into mistaking the world as we experience it for the world as it really is.”

    In a technical paper [Reality as a Vector in Hilbert Space] backing up his tweet, Carroll notes that quantum theory consists of equations that describe mathematical entities roaming through an abstract realm of possible natural events. It’s plausible, Carroll argues, that this quantum realm of mathematical possibilities represents the true, fundamental nature of reality. If so, all the physical phenomena we perceive are just a “higher-level emergent description” of what’s really going on.

    “Emergent” events in ordinary space are real in their own way, just not fundamental, Carroll allows. Belief that the “spatial arena” is fundamental “is more a matter of convenience and convention than one of principle,” he says.

    Carroll’s perspective is not the only way of viewing the meaning of quantum math, he acknowledges, and it is not fully shared by most physicists. But everybody does agree that quantum physics has drastically remodeled humankind’s understanding of nature. In fact, a fair reading of history suggests that quantum theory is the most dramatic shift in science’s conception of reality since the ancient Greeks deposed mythological explanations of natural phenomena in favor of logic and reason. After all, quantum physics itself seems to defy logic and reason.

    It doesn’t, of course. Quantum theory represents the ultimate outcome of superior logical reasoning, arriving at truths that could never be discovered merely by observing the visible world.

    It turns out that in the microworld — beyond the reach of the senses — phenomena play a game with fantastical rules. Matter’s basic particles are not tiny rocks, but more like ghostly waves that maintain multiple possible futures until forced to assume the subatomic equivalent of substance. As a result, quantum math does not describe a relentless cause-and-effect sequence of events as Newtonian science had insisted. Instead science morphs from dictator to oddsmaker: quantum math tells only probabilities for different possible outcomes. Some uncertainty always remains.

    Quantum mechanics says that whether an electron behaves as particle or wave depends on how it is observed. Credit: Max Löffler.

    The quantum revolution

    The discovery of quantum uncertainty was what first impressed the world with the depth of the quantum revolution. German physicist Werner Heisenberg, in 1927, astounded the scientific community with the revelation that deterministic cause-and-effect physics failed when applied to atoms. It was impossible, Heisenberg deduced, to measure both the location and velocity of a subatomic particle at the same time. If you measured one precisely, some uncertainty remained for the other.

    “A particle may have an exact place or an exact speed, but it can not have both,” as Science News Letter, the predecessor of Science News, reported in 1929. “Crudely stated, the new theory holds that chance rules the physical world.” Heisenberg’s uncertainty principle “is destined to revolutionize the ideas of the universe held by scientists and laymen to an even greater extent than Einstein’s relativity.”
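    In modern notation (not how the 1929 report phrased it), the relation bounds the product of the position and momentum uncertainties:

    \Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}

    where \hbar is the reduced Planck constant; making either uncertainty smaller necessarily makes the other larger.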

    Heisenberg’s breakthrough was the culmination of a series of quantum surprises. First came German physicist Max Planck’s discovery, in 1900, that light and other forms of radiation could be absorbed or emitted only in discrete packets, which Planck called quanta. A few years later Albert Einstein argued that light also traveled through space as packets, or particles, later called photons. Many physicists dismissed such early quantum clues as inconsequential. But in 1913, the Danish physicist Niels Bohr used quantum theory to explain the structure of the atom. Soon the world realized that reality needed reexamining.

    By 1921, awareness of the quantum revolution had begun to expand beyond the confines of physics conferences. In that year, Science News Bulletin, the first iteration of Science News, distributed what was “believed to be the first popular explanation” of the quantum theory of radiation, provided by American physical chemist William D. Harkins. He proclaimed that the quantum theory “is of much more practical importance” than Albert Einstein’s Theory of General Relativity.

    “Since it concerns itself with the relations between matter and radiation,” Harkins wrote, quantum theory “is of fundamental significance in connection with almost all processes which we know.” Electricity, chemical reactions and how matter responds to heat all require quantum-theoretic explanations.

    As for atoms, traditional physics asserts that atoms and their parts can move about “in a large number of different ways,” Harkins stated. But quantum theory maintains that “of all the states of motion (or ways of moving) prescribed by the older theory, only a certain number actually do occur.” Therefore, events previously believed “to occur as continuous processes, actually do occur in steps.”

    But in 1921 quantum physics remained embryonic. Some of its implications had been discerned, but its full form remained undeveloped in detail. It was Heisenberg, in 1925, who first transformed the puzzling jumble of clues into a coherent mathematical picture. His decisive advance was developing a way to represent the energies of electrons in atoms using matrix algebra. With aid from German physicists Max Born and Pascual Jordan, Heisenberg’s math became known as matrix mechanics. Shortly thereafter, Austrian physicist Erwin Schrödinger developed a competing equation for electron energies, viewing the supposed particles as waves described by a mathematical wave function. Schrödinger’s “wave mechanics” turned out to be mathematically equivalent to Heisenberg’s particle-based approach, and “quantum mechanics” became the general term for the math describing all subatomic systems.

    Still, some confusion remained. It wasn’t clear how an approach picturing electrons as particles could be equivalent to one supposing electrons to be waves. Bohr, by then regarded as the foremost of the world’s atomic physicists, pondered the question deeply and by 1927 arrived at a novel viewpoint he called complementarity.

    Bohr argued that the particle and wave views were complementary; both were necessary for a full description of subatomic phenomena. Whether a “particle” — say, an electron — exhibited its wave or particle nature depended on the experimental setup observing it. An apparatus designed to find a particle would find a particle; an apparatus geared to detect wave behavior would find a wave.

    At about the same time, Heisenberg derived his uncertainty principle. Just as wave and particle could not be observed in the same experiment, position and velocity could not both be precisely measured at the same time. As physicist Wolfgang Pauli commented, “Now it becomes day in quantum theory.”

    But the quantum adventure was really just beginning.

    In the many worlds interpretation of quantum mechanics, all possible realities exist, but humans perceive just one. Credit: Max Löffler.

    A great debate

    Many physicists, Einstein among them, deplored the implications of Heisenberg’s uncertainty principle. Its introduction in 1927 eliminated the possibility of precisely predicting the outcomes of atomic observations. As Born had shown, you could merely predict the probabilities for the various possible outcomes, using calculations informed by the wave function that Schrödinger had introduced. Einstein famously retorted that he could not believe that God would play dice with the universe. Even worse, in Einstein’s view, the wave-particle duality described by Bohr implied that a physicist could affect reality by deciding what kind of measurement to make. Surely, Einstein believed, reality existed independently of human observations.

    On that point, Bohr engaged Einstein in a series of discussions that came to be known as the Bohr-Einstein debate, a continuing dialog that came to a head in 1935. In that year, Einstein, with collaborators Nathan Rosen and Boris Podolsky, described a thought experiment supposedly showing that quantum mechanics could not be a complete theory of reality.

    In a brief summary in Science News Letter in May 1935, Podolsky explained that a complete theory must include a mathematical “counterpart for every element of the physical world.” In other words, there should be a quantum wave function for the properties of every physical system. Yet if two physical systems, each described by a wave function, interact and then fly apart, “quantum mechanics … does not enable us to calculate the wave function of each physical system after the separation.” (In technical terms, the two systems become “entangled,” a term coined by Schrödinger.) So quantum math cannot describe all elements of reality and is therefore incomplete.
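    For reference, in today’s textbook notation (an illustrative example, not taken from Podolsky’s summary), a maximally entangled pair of qubits can be written as

    |\Phi^{+}\rangle \;=\; \frac{1}{\sqrt{2}} \left( |0\rangle_A |0\rangle_B + |1\rangle_A |1\rangle_B \right)

    a single wave function for the pair that cannot be factored into separate wave functions for systems A and B, which is exactly the feature Einstein, Podolsky and Rosen found objectionable.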

    Bohr soon responded, as reported in Science News Letter in August 1935. He declared that Einstein and colleagues’ criterion for physical reality was ambiguous in quantum systems. Einstein, Podolsky and Rosen assumed that a system (say an electron) possessed definite values for certain properties (such as its momentum) before those values were measured. Quantum mechanics, Bohr explained, preserved different possible values for a particle’s properties until one of them was measured. You could not assume the existence of an “element of reality” without specifying an experiment to measure it.

    Einstein did not relent. He acknowledged that the uncertainty principle was correct with respect to what was observable in nature, but insisted that some invisible aspect of reality nevertheless determined the course of physical events. In the early 1950s physicist David Bohm developed such a theory of “hidden variables” that restored determinism to quantum physics, but made no predictions that differed from the standard quantum mechanics math. Einstein was not impressed with Bohm’s effort. “That way seems too cheap to me,” Einstein wrote to Born, a lifelong friend.

    Einstein died in 1955, Bohr in 1962, neither conceding to the other. In any case it seemed like an irresolvable dispute, since experiments would give the same results either way. But in 1964, physicist John Stewart Bell deduced a clever theorem about entangled particles, enabling experiments to probe the possibility of hidden variables. Beginning in the 1970s, and continuing to today, experiment after experiment confirmed the standard quantum mechanical predictions. Einstein’s objection was overruled by the court of nature.
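    The version of Bell’s result most often tested in the laboratory is the CHSH inequality (stated here for reference; the article does not spell it out). For measurement settings a, a′ on one particle and b, b′ on the other, any local hidden-variable theory requires

    |S| \;=\; \bigl| E(a,b) - E(a,b') + E(a',b) + E(a',b') \bigr| \;\le\; 2

    while quantum mechanics allows values up to 2\sqrt{2} \approx 2.83, and it is this larger value that the experiments have repeatedly measured.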

    Still, many physicists expressed discomfort with Bohr’s view (commonly referred to as the Copenhagen interpretation of quantum mechanics). One particularly dramatic challenge came from the physicist Hugh Everett III in 1957. He insisted that an experiment did not create one reality from the many quantum possibilities, but rather identified only one branch of reality. All the other experimental possibilities existed on other branches, all equally real. Humans perceive only their own particular branch, unaware of the others just as they are unaware of the rotation of the Earth. This “many worlds interpretation” was widely ignored at first but became popular decades later, with many adherents today.

    Since Everett’s work, numerous other interpretations of quantum theory have been offered. Some emphasize the “reality” of the wave function, the mathematical expression used for predicting the odds of different possibilities. Others emphasize the role of the math as describing the knowledge about reality accessible to experimenters.

    Some interpretations attempt to reconcile the many worlds view with the fact that humans perceive only one reality. In the 1980s, physicists including H. Dieter Zeh and Wojciech Zurek identified the importance of a quantum system’s interaction with its external environment, a process called quantum decoherence. Some of a particle’s many possible realities rapidly evaporate as it encounters matter and radiation in its vicinity. Soon only one of the possible realities remains consistent with all the environmental interactions, explaining why on the human scale of time and size only one such reality is perceived.

    This insight spawned the “consistent histories” interpretation, pioneered by Robert Griffiths and developed in more elaborate form by Murray Gell-Mann and James Hartle. It is widely known among physicists but has received little wider popularity and has not deterred the pursuit of other interpretations. Scientists continue to grapple with what quantum math means for the very nature of reality.

    Using principles of quantum information theory, a particle’s quantum state can be replicated at a distant location, a feat known as quantum teleportation. Credit: Max Löffler.

    It from quantum bit

    In the 1990s, the quest for quantum clarity took a new turn with the rise of quantum information theory. Physicist John Archibald Wheeler, a disciple of Bohr, had long emphasized that specific realities emerged from the fog of quantum possibilities by irreversible amplifications — such as an electron definitely establishing its location by leaving a mark after hitting a detector. Wheeler suggested that reality as a whole could be built up from such processes, which he compared to yes or no questions — is the electron here? Answers corresponded to bits of information, the 1s and 0s used by computers. Wheeler coined the slogan “it from bit” to describe the link between existence and information.

    Taking the analogy further, one of Wheeler’s former students, Benjamin Schumacher, devised the notion of a quantum version of the classical bit of information. He introduced the quantum bit, or qubit, at a conference in Dallas in 1992.

    Schumacher’s qubit provided a basis for building computers that could process quantum information. Such “quantum computers” had previously been envisioned, in different ways, by physicists Paul Benioff, Richard Feynman and David Deutsch. In 1994, mathematician Peter Shor showed how a quantum computer manipulating qubits could crack the toughest secret codes, launching a quest to design and build quantum computers capable of that and other clever computing feats. By the early 21st century, rudimentary quantum computers had been built; the latest versions can perform some computing tasks but are not powerful enough yet to make current cryptography methods obsolete. For certain types of problems, though, quantum computing may soon achieve superiority over standard computers.


    Quantum computing’s realization has not resolved the debate over quantum interpretations. Deutsch believed that quantum computers would support the many worlds view. Hardly anyone else agrees, though. And decades of quantum experiments have not provided any support for novel interpretations — all the results comply with the traditional quantum mechanics expectations. Quantum systems preserve different values for certain properties until one is measured, just as Bohr insisted. But nobody is completely satisfied, perhaps because the 20th century’s other pillar of fundamental physics, Einstein’s theory of gravity (general relativity), does not fit in quantum theory’s framework.

    For decades now, the quest for a quantum theory of gravity has fallen short of success, despite many promising ideas. Most recently a new approach suggests that the geometry of spacetime, the source of gravity in Einstein’s theory, may in some way be built from the entanglement of quantum entities. If so, the mysterious behavior of the quantum world defies understanding in terms of ordinary events in space and time because quantum reality creates spacetime, rather than occupying it. If so, human observers witness an artificial, emergent reality that gives the impression of events happening in space and time while the true, inaccessible reality doesn’t have to play by the spacetime rules.

    In a crude way this view echoes that of Parmenides, the ancient Greek philosopher who taught that all change is an illusion. Our senses show us the “way of seeming,” Parmenides declared; only logic and reason can reveal “the way of truth.” Parmenides didn’t reach that insight by doing the math, of course (he said it was explained to him by a goddess). But he was a crucial figure in the history of science, initiating the use of rigorous deductive reasoning and relying on it even when it led to conclusions that defied sensory experience.

    Yet as some of the other ancient Greeks realized, the world of the senses does offer clues about the reality we can’t see. “Phenomena are a sight of the unseen,” Anaxagoras said. As Carroll puts it, in modern terms, “the world as we experience it” is certainly related to “the world as it really is.”

    “But the relationship is complicated,” he says, “and it’s real work to figure it out.”

    In fact, it took two millennia of hard work for the Greek revolution in explaining nature to mature into Newtonian science’s mechanistic understanding of reality. Three centuries later quantum physics revolutionized science’s grasp of reality to a comparable extent. Yet the lack of agreement on what it all means suggests that perhaps science needs to dig a little deeper still.

    See the full article here .




    Please help promote STEM in your local schools.

    Stem Education Coalition

    Caltech campus

    The California Institute of Technology (US) is a private research university in Pasadena, California. The university is known for its strength in science and engineering, and is one among a small group of institutes of technology in the United States which is primarily devoted to the instruction of pure and applied sciences.

    The California Institute of Technology was founded as a preparatory and vocational school by Amos G. Throop in 1891 and began attracting influential scientists such as George Ellery Hale, Arthur Amos Noyes, and Robert Andrews Millikan in the early 20th century. The vocational and preparatory schools were disbanded and spun off in 1910 and the college assumed its present name in 1920. In 1934, The California Institute of Technology was elected to the Association of American Universities, and the antecedents of National Aeronautics and Space Administration (US)’s Jet Propulsion Laboratory, which The California Institute of Technology continues to manage and operate, were established between 1936 and 1943 under Theodore von Kármán.

    The California Institute of Technology has six academic divisions with strong emphasis on science and engineering. Its 124-acre (50 ha) primary campus is located approximately 11 mi (18 km) northeast of downtown Los Angeles. First-year students are required to live on campus, and 95% of undergraduates remain in the on-campus House System at The California Institute of Technology. Although The California Institute of Technology has a strong tradition of practical jokes and pranks, student life is governed by an honor code which allows faculty to assign take-home examinations. The California Institute of Technology Beavers compete in 13 intercollegiate sports in the NCAA Division III’s Southern California Intercollegiate Athletic Conference (SCIAC).

    As of October 2020, there are 76 Nobel laureates who have been affiliated with The California Institute of Technology, including 40 alumni and faculty members (41 prizes, with chemist Linus Pauling being the only individual in history to win two unshared prizes). In addition, 4 Fields Medalists and 6 Turing Award winners have been affiliated with The California Institute of Technology. There are 8 Crafoord Laureates and 56 non-emeritus faculty members (as well as many emeritus faculty members) who have been elected to one of the United States National Academies. Four Chief Scientists of the U.S. Air Force have been affiliated with the Institute, and 71 affiliates have won the United States National Medal of Science or Technology. Numerous faculty members are associated with the Howard Hughes Medical Institute(US) as well as National Aeronautics and Space Administration(US). According to a 2015 Pomona College(US) study, The California Institute of Technology ranked number one in the U.S. for the percentage of its graduates who go on to earn a PhD.

    Research

    The California Institute of Technology is classified among “R1: Doctoral Universities – Very High Research Activity”. Caltech was elected to The Association of American Universities in 1934 and remains a research university with “very high” research activity, primarily in STEM fields. The largest federal agencies contributing to research are National Aeronautics and Space Administration(US); National Science Foundation(US); Department of Health and Human Services(US); Department of Defense(US), and Department of Energy(US).

    In 2005, The California Institute of Technology had 739,000 square feet (68,700 m^2) dedicated to research: 330,000 square feet (30,700 m^2) to physical sciences, 163,000 square feet (15,100 m^2) to engineering, and 160,000 square feet (14,900 m^2) to biological sciences.

    In addition to managing NASA-JPL/Caltech (US), The California Institute of Technology also operates the Caltech Palomar Observatory(US); the Owens Valley Radio Observatory(US);the Caltech Submillimeter Observatory(US); the W. M. Keck Observatory at the Mauna Kea Observatory(US); the Laser Interferometer Gravitational-Wave Observatory at Livingston, Louisiana and Richland, Washington; and Kerckhoff Marine Laboratory(US) in Corona del Mar, California. The Institute launched the Kavli Nanoscience Institute at The California Institute of Technology in 2006; the Keck Institute for Space Studies in 2008; and is also the current home for the Einstein Papers Project. The Spitzer Science Center(US), part of the Infrared Processing and Analysis Center(US) located on The California Institute of Technology campus, is the data analysis and community support center for NASA’s Spitzer Infrared Space Telescope [no longer in service].

    The California Institute of Technology partnered with University of California at Los Angeles(US) to establish a Joint Center for Translational Medicine (UCLA-Caltech JCTM), which conducts experimental research into clinical applications, including the diagnosis and treatment of diseases such as cancer.

    The California Institute of Technology operates several Total Carbon Column Observing Network(US) stations as part of an international collaborative effort of measuring greenhouse gases globally. One station is on campus.

     
  • richardmitnick 10:38 am on January 12, 2022 Permalink | Reply
    Tags: , At the dawn of the 20th century a new theory of matter and energy was emerging., , Could a quantum worldview prove useful outside the lab?, Information Theory: a blend of math and computer science, , One of the main questions quantum mechanics addressed was the nature of light-particle or wave, , Peter Shor: a fast-factoring algorithm for a quantum computer-a computer whose bits exist in superposition and can be entangled., Physicists developed a new system of mechanics to describe what seemed to be a quantized and uncertain probabilistic world-Heisenberg's Uncertainty Principle, , , , Quantum Mechanics, , , Shor’s algorithm is of particular interest in encryption because of the difficulty of identifying the prime factors of large numbers., Shor’s algorithm was designed to quickly divide large numbers into their prime factors., The second quantum revolution also relies on and encompasses new ways of using technology to manipulate matter at the quantum level., Today’s quantum computers are not yet advanced enough to implement Shor’s algorithm., , Vacuum tubes, What changed was Shor’s introduction of error-correcting codes.   

    From Symmetry: “The second quantum revolution” 

    Symmetry Mag

    From Symmetry

    01/12/22
    Daniel Garisto

    Illustration by Ana Kova / Sandbox Studio, Chicago.

    Inventions like the transistor and laser changed the world. What changes will the second quantum revolution bring?

    For physicists trying to harness the power of electricity, no tool was more important than the vacuum tube. This lightbulb-like device controlled the flow of electricity and could amplify signals. In the early 20th century, vacuum tubes were used in radios, televisions and long-distance telephone networks.

    But vacuum tubes had significant drawbacks: They generated heat; they were bulky; and they had a propensity to burn out. Physicists at Bell Labs, a spin-off of AT&T, were interested in finding a replacement.

    Applying their knowledge of quantum mechanics—specifically how electrons flowed between materials with electrical conductivity—they found a way to mimic the function of vacuum tubes without those shortcomings.

    They had invented the transistor. At the time, the invention did not grace the front page of any major news publication. Even the scientists themselves couldn’t have appreciated just how important their device would be.

    First came the transistor radio, popularized in large part by the new Japanese company Sony. Spreading portable access to radio broadcasts changed music and connected disparate corners of the world.

    Transistors then paved the way for NASA’s Apollo Project, which first took humans to the moon. And perhaps most importantly, transistors were made smaller and smaller, shrinking room-sized computers and magnifying their power to eventually create laptops and smartphones.

    These quantum-inspired devices are central to every single modern electronic application that uses some computing power, such as cars, cellphones and digital cameras. You would not be reading this sentence without transistors, which are an important part of what is now called the First Quantum Revolution.

    Quantum physicists Jonathan Dowling and Gerard Milburn coined the term “quantum revolution” in a 2002 paper [The Royal Society]. In it, they argue that we have now entered a new era, a Second Quantum Revolution. “It just dawned on me that actually there was a whole new technological frontier opening up,” says Milburn, professor emeritus at The University of Queensland (AU).

    This second quantum revolution is defined by developments in technologies like quantum computing and quantum sensing, brought on by a deeper understanding of the quantum world and precision control down to the level of individual particles.

    A quantum understanding

    At the dawn of the 20th century a new theory of matter and energy was emerging. Unsatisfied with classical explanations about the strange behavior of particles, physicists developed a new system of mechanics to describe what seemed to be a quantized, uncertain, probabilistic world.

    One of the main questions quantum mechanics addressed was the nature of light. Eighteenth-century physicists believed light was a particle. Nineteenth-century physicists proved it had to be a wave. Twentieth-century physicists resolved the problem by redefining particles using the principles of quantum mechanics. They proposed that particles of light, now called photons, had some probability of existing in a given location—a probability that could be represented as a wave and even experience interference like one.

    This newfound picture of the world helped make sense of results such as those of the double-slit experiment, which showed that particles like electrons and photons could behave as if they were waves.
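    A minimal numerical sketch of that wave picture (the amplitudes are hypothetical and chosen purely for illustration): each slit contributes a complex amplitude, the detection probability is the squared magnitude of their sum, and that probability rises and falls with the phase difference between the two paths, whereas a “particles only” picture would simply add the two probabilities.

    import numpy as np

    # Phase difference between the two paths as the detector moves across the screen.
    phase = np.linspace(0, 4 * np.pi, 9)

    a1 = np.ones_like(phase, dtype=complex)      # amplitude via slit 1
    a2 = np.exp(1j * phase)                      # amplitude via slit 2, phase-shifted
    p_interference = np.abs(a1 + a2) ** 2 / 4    # normalised so the maximum is 1
    p_classical = (np.abs(a1) ** 2 + np.abs(a2) ** 2) / 4   # flat, no fringes

    for ph, pi_, pc in zip(phase, p_interference, p_classical):
        print(f"phase {ph:5.2f}: wave picture {pi_:4.2f}, particle-only picture {pc:4.2f}")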

    But could a quantum worldview prove useful outside the lab?

    At first, “quantum was usually seen as just a source of mystery and confusion and all sorts of strange paradoxes,” Milburn says.

    But after World War II, people began figuring out how to use those paradoxes to get things done. Building on new quantum ideas about the behavior of electrons in metals and other materials, Bell Labs researchers William Shockley, John Bardeen and Walter Brattain created the first transistors. They realized that sandwiching semiconductors together could create a device that would allow electrical current to flow in one direction, but not another. Other technologies, such as atomic clocks and the nuclear magnetic resonance used for MRI scans, were also products of the first quantum revolution.

    Another important and, well, visible quantum invention was the laser.

    In the 1950s, optical physicists knew that hitting certain kinds of atoms with a few photons at the right energy could lead them to emit more photons with the same energy and direction as the initial photons. This effect would cause a cascade of photons, creating a stable, straight beam of light unlike anything seen in nature. Today, lasers are ubiquitous, used in applications from laser pointers to barcode scanners to life-saving medical techniques.

    All of these devices were made possible by studies of the quantum world. Both the laser and transistor rely on an understanding of quantized atomic energy levels. Milburn and Dowling suggest that the technologies of the first quantum revolution are unified by “the idea that matter particles sometimes behaved like waves, and that light waves sometimes acted like particles.”

    For the first time, scientists were using their understanding of quantum mechanics to create new tools that could be used in the classical world.

    The second quantum revolution

    Many of these developments were described to the public without resorting to the word “quantum,” as this Bell Labs video about the laser attests.

    One reason for the disconnect was that the first quantum revolution didn’t make full use of quantum mechanics. “The systems were too noisy. In a sense, the full richness of quantum mechanics wasn’t really accessible,” says Ivan Deutsch, a quantum physicist at The University of New Mexico (US). “You can get by with a fairly classical picture.”

    The stage for the second quantum revolution was set in the 1960s, when the Northern Irish physicist John Stewart Bell [B.Sc. The Queen’s University of Belfast (NIR); Ph.D. The University of Birmingham (UK); The European Organization for Nuclear Research [Organisation européenne pour la recherche nucléaire] [Europäische Organisation für Kernforschung](CH) [CERN]; Stanford University (US)] shook the foundations of quantum mechanics. Bell proposed that entangled particles were correlated in strange quantum ways and could not be explained with so-called “hidden variables.” Tests performed in the ’70s and ’80s confirmed that measuring one entangled particle really did seem to determine the state of the other, faster than any signal could travel between the two.

    The other critical ingredient for the second quantum revolution was information theory, a blend of math and computer science developed by pioneers like Claude Shannon and Alan Turing. In 1994, combining new insight into the foundations of quantum mechanics with information theory led the mathematician Peter Shor to introduce a fast-factoring algorithm for a quantum computer, a computer whose bits exist in superposition and can be entangled.

    Shor’s algorithm was designed to quickly divide large numbers into their prime factors. Using the algorithm, a quantum computer could solve the problem much more efficiently than a classical one. It was the clearest early demonstration of the worth of quantum computing.

    “It really made the whole idea of quantum information, a new concept that those of us who had been working in related areas, instantly appreciated,” Deutsch says. “Shor’s algorithm suggested the possibilities new quantum tech could have over existing classical tech, galvanizing research across the board.”

    Shor’s algorithm is of particular interest in encryption because the difficulty of identifying the prime factors of large numbers is precisely what keeps data private online. To unlock encrypted information, a computer must know the prime factors of a large number associated with it. Use a large enough number, and the puzzle of guessing its prime factors can take a classical computer thousands of years. With Shor’s algorithm, the guessing game can take just moments.
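    For the curious: the quantum part of Shor’s algorithm finds the period r of f(x) = a^x mod N, and turning that period into factors is ordinary arithmetic. A toy sketch of that classical step for N = 15 (the period is found by brute force here, which is exactly the part a real quantum computer would accelerate; function names are illustrative):

    from math import gcd

    def find_period(a: int, n: int) -> int:
        """Smallest r > 0 with a**r % n == 1 (brute force; the quantum speed-up lives here)."""
        r, value = 1, a % n
        while value != 1:
            value = (value * a) % n
            r += 1
        return r

    def shor_classical_step(n: int, a: int) -> tuple:
        """Given a suitable base a, recover two factors of n from the period of a**x mod n."""
        r = find_period(a, n)
        if r % 2:
            raise ValueError("odd period; try a different base a")
        x = pow(a, r // 2, n)
        return gcd(x - 1, n), gcd(x + 1, n)

    print(shor_classical_step(15, 7))   # (3, 5)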

    Today’s quantum computers are not yet advanced enough to implement Shor’s algorithm. But as Deutsch points out, skeptics once doubted a quantum computer was even possible.

    “Because there was a kind of trade-off,” he says. “The kind of exponential increase in computational power that might come from quantum superpositions would be counteracted exactly, by exponential sensitivity to noise.”

    While inventions like the transistor required knowledge of quantum mechanics, the device itself wasn’t in a delicate quantum state, so it could be described semi-classically. Quantum computers, on the other hand, require delicate quantum connections.

    What changed was Shor’s introduction of error-correcting codes. By combining concepts from classical information theory with quantum mechanics, Shor showed that, in theory, even the delicate state of a quantum computer could be preserved.
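    Quantum error-correcting codes are considerably more subtle than their classical cousins, because they must protect fragile superpositions without directly measuring them. But the underlying redundancy idea can be illustrated with the classical three-bit repetition code (a hypothetical sketch, not Shor’s actual construction):

    def encode(bit: int) -> list:
        """Store one logical bit as three physical copies."""
        return [bit, bit, bit]

    def decode(bits: list) -> int:
        """Majority vote recovers the logical bit if at most one copy was flipped."""
        return 1 if sum(bits) >= 2 else 0

    codeword = encode(1)
    codeword[0] ^= 1            # a single "noise" flip
    print(decode(codeword))     # 1 -- the error is corrected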

    Beyond quantum computing, the second quantum revolution also relies on and encompasses new ways of using technology to manipulate matter at the quantum level.

    Using lasers, researchers have learned to sap the energy of atoms and cool them. Like a soccer player dribbling a ball up field with a series of taps, lasers can cool atoms to billionths of a degree above absolute zero—far colder than conventional cooling techniques. In 1995, scientists used laser cooling to observe a long-predicted state of matter: the Bose-Einstein condensate.

    Other quantum optical techniques have been developed to make ultra-precise measurements.

    Classical interferometers, like the type used in the famous Michelson-Morley experiment that measured the speed of light in different directions to search for signs of a hypothetical aether, looked at the interference pattern of light. New matter-wave interferometers exploit the principle that everything—not just light—has a wavefunction. Measuring changes in the phase of atoms, which have far shorter wavelengths than light, could give unprecedented control to experiments that attempt to measure the smallest effects, like those of gravity.
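    The “far shorter wavelengths” claim can be made concrete with the de Broglie relation λ = h/(mv). A rough back-of-the-envelope calculation with assumed, order-of-magnitude numbers for a laser-cooled rubidium atom (not figures from the article):

    h = 6.626e-34            # Planck constant, J*s
    m_rb87 = 87 * 1.66e-27   # approximate mass of a rubidium-87 atom, kg
    v = 1.0                  # typical speed of a laser-cooled atom, m/s (order of magnitude)

    wavelength = h / (m_rb87 * v)
    print(f"atom matter wave ~ {wavelength:.1e} m")   # ~5e-9 m, a few nanometres
    print("visible light     ~ 5e-7 m")               # hundreds of nanometres

    A matter wave roughly a hundred times shorter than visible light means correspondingly finer phase resolution in an interferometer.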

    With laboratories and companies around the world focused on advancements in quantum science and applications, the second quantum revolution has only begun. As Bardeen put it in his Nobel lecture, we may be at another “particularly opportune time … to add another small step in the control of nature for the benefit of [hu]mankind.”

    See the full article here .



    Please help promote STEM in your local schools.


    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 9:24 pm on January 4, 2022 Permalink | Reply
    Tags: "How could the Big Bang arise from nothing?", Albert Einstein's Theory of General Relativity proposes that a gravitational singularity may have existed., , , , , , In a gravitational singularity even the laws of quantum physics break down and the four fundamental forces (strong nuclear; weak nuclear; electromagnetic & gravity) could be unified as one!, Many-worlds quantum theory gives a new twist on conformal cyclic cosmology: Our Big Bang might be the rebirth of one single quantum multiverse containing infinitely different universes., , No matter how small the chance of something occurring if it has a non-zero chance then it occurs in some quantum parallel world., Other measurement results all play out in other universes in a multiverse effectively cut off from our own., , Quantum Mechanics, Some people believe parallel universes may also be observable in cosmological data as imprints caused by another universe colliding with ours., The "Grand Unification Epoch", The "Plank Epoch", , , The measurement result we see is just one possibility-the one that plays out in our own universe., Three options to the deeper question of how the cycles began: no physical explanation at all; endlessly repeating cycles each a universe in its own right; one single cycle .   

    From The Conversation : “How could the Big Bang arise from nothing?” 

    From The Conversation

    January 3, 2022
    Alastair Wilson
    Professor of Philosophy, The University of Birmingham (UK)

    The evolution of the cosmos after the Big Bang. Into what is the universe expanding? Credit: Dana Berry/NASA Goddard.

    READER QUESTION: My understanding is that nothing comes from nothing. For something to exist, there must be material or a component available, and for them to be available, there must be something else available. Now my question: Where did the material come from that created the Big Bang, and what happened in the first instance to create that material? Peter, 80, Australia.

    “The last star will slowly cool and fade away. With its passing, the universe will become once more a void, without light or life or meaning.” So warned the physicist Brian Cox in the recent BBC series Universe. The fading of that last star will only be the beginning of an infinitely long, dark epoch. All matter will eventually be consumed by monstrous black holes, which in their turn will evaporate away into the dimmest glimmers of light. Space will expand ever outwards until even that dim light becomes too spread out to interact. Activity will cease.

    Or will it? Strangely enough, some cosmologists believe a previous, cold dark empty universe like the one which lies in our far future could have been the source of our very own Big Bang.

    The first matter

    But before we get to that, let’s take a look at how “material” – physical matter – first came about. If we are aiming to explain the origins of stable matter made of atoms or molecules, there was certainly none of that around at the Big Bang – nor for hundreds of thousands of years afterwards. We do in fact have a pretty detailed understanding of how the first atoms formed out of simpler particles once conditions cooled down enough for complex matter to be stable, and how these atoms were later fused into heavier elements inside stars. But that understanding doesn’t address the question of whether something came from nothing.

    So let’s think further back. The first long-lived matter particles of any kind were protons and neutrons, which together make up the atomic nucleus.

    The quark structure of the proton. 16 March 2006 Arpad Horvath.

    The quark structure of the neutron. 15 January 2018 Jacek Rybak.

    These came into existence around one ten-thousandth of a second after the Big Bang. Before that point, there was really no material in any familiar sense of the word. But physics lets us keep on tracing the timeline backwards – to physical processes which predate any stable matter.

    This takes us to the so-called “grand unified epoch”.

    The Beginning of the Modern Universe

    The “Grand Unification Epoch” took place from 10^-43 seconds to 10^-36 seconds after our universe was born. Quantum theory allows us to form a clearer picture of this epoch than of the mysterious “Planck Epoch”.

    During the “Grand Unification Epoch”, the universe was still extremely hot and incomprehensibly small. However, it had cooled down enough to allow the force of gravity to separate from the other three fundamental forces. The unification of the strong nuclear, weak nuclear, and electromagnetic force that existed during this period of time is referred to as the electronuclear force. However, the splitting off of gravity from the electronuclear force wasn’t the only milestone of this epoch- this is also when the first elementary particles began to form.

    What Are Elementary Particles?
    Elementary particles are particles which have no substructure, i.e. they are the simplest form of matter possible and the building blocks of everything else: protons and neutrons are built from them, while the electron is itself elementary. Currently, 17 elementary particles have been confirmed; the as-yet-unconfirmed “graviton” remains theoretical. There are 12 “matter” elementary particles and 5 “force carrier” particles.

    Standard Model of Particle Physics, Quantum Diaries.

    Fermions
    The “matter” elementary particles are referred to as fermions; they make up the physical substance of subatomic particles. The two categories of elementary fermions are quarks and leptons. Quarks combine to form composite particles known as hadrons, which include the familiar protons and neutrons; the leptons include the electron and the neutrinos.

    Bosons
    The 5 force carrier particles are referred to as bosons; they mediate the strong nuclear, weak nuclear, and electromagnetic interactions, while the Higgs boson is associated with the mechanism that gives other particles their mass. Bosons are the fundamental reason for the attractions and repulsions we experience as forces.
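    As a quick sanity check on the counts above, here is a minimal Python sketch; the particle names follow the standard chart referenced below, and everything else is purely illustrative:

# Minimal tally of the confirmed Standard Model particles.
# The graviton is omitted because it remains unconfirmed.
quarks = ["up", "down", "charm", "strange", "top", "bottom"]
leptons = ["electron", "electron neutrino",
           "muon", "muon neutrino",
           "tau", "tau neutrino"]
bosons = ["gluon", "photon", "W boson", "Z boson", "Higgs boson"]

fermions = quarks + leptons                   # the 12 "matter" particles
print(len(fermions), "fermions")              # -> 12 fermions
print(len(bosons), "bosons")                  # -> 5 bosons
print(len(fermions) + len(bosons), "total")   # -> 17 total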

    The “Planck Epoch”
    The “Planck Epoch” encompasses the time period from 0 to 10^-43 seconds after the Big Bang.
    This extremely small unit of time, 10^-43 seconds, is aptly referred to as the “Planck time”.
    Not much is truly known about this period, but some very interesting hypotheses have been made.

    Albert Einstein’s theory of general relativity suggests that a gravitational singularity may have existed at the very beginning. In a gravitational singularity, the known laws of physics break down, and the four fundamental forces (strong nuclear, weak nuclear, electromagnetic, and gravity) may have been unified as one. This is an extremely odd concept to consider. It also ties into the so-called “Theory of Everything”, which posits that at high enough energies even gravity combines with the other three forces into a single unified force.

    During the “Planck Epoch”, our universe was only about 10^-35 meters across (VERY small) and about 10^32 kelvin (VERY hot)!
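    To see roughly where these numbers come from, here is a short illustrative Python sketch (not from the article) that computes the Planck time, Planck length, and Planck temperature from the fundamental constants; only the orders of magnitude matter here:

import math

# Fundamental constants in SI units (approximate values)
hbar = 1.054571817e-34   # reduced Planck constant, J*s
G = 6.67430e-11          # Newton's gravitational constant, m^3 kg^-1 s^-2
c = 2.99792458e8         # speed of light, m/s
k_B = 1.380649e-23       # Boltzmann constant, J/K

planck_time = math.sqrt(hbar * G / c**5)                # ~5.4e-44 s
planck_length = math.sqrt(hbar * G / c**3)              # ~1.6e-35 m
planck_temperature = math.sqrt(hbar * c**5 / G) / k_B   # ~1.4e32 K

print(f"Planck time:        {planck_time:.1e} s")
print(f"Planck length:      {planck_length:.1e} m")
print(f"Planck temperature: {planck_temperature:.1e} K")

    The Planck time of roughly 10^-43 seconds is exactly the boundary quoted above between the “Planck Epoch” and the “Grand Unification Epoch”.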


    By now, we are well into the realm of speculative physics, as we can’t produce enough energy in our experiments to probe the sort of processes that were going on at the time. But a plausible hypothesis is that the physical world was made up of a soup of short-lived elementary particles – including quarks, the building blocks of protons and neutrons. There was both matter and “antimatter” in roughly equal quantities: each type of matter particle, such as the quark, has an antimatter “mirror image” companion, which is nearly identical to it but carries the opposite charge. However, matter and antimatter annihilate in a flash of energy when they meet, meaning these particles were constantly created and destroyed.

    But how did these particles come to exist in the first place? Quantum field theory tells us that even a vacuum, supposedly corresponding to empty spacetime, is full of physical activity in the form of energy fluctuations. These fluctuations can give rise to particles popping out, only to disappear shortly afterwards. This may sound like a mathematical quirk rather than real physics, but the effects of such particles have been detected in countless experiments.
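    A rough, heuristic way to see how long such a fluctuation can persist is the energy–time uncertainty relation; the electron–positron estimate below is an illustration, not a rigorous derivation:

    \[
      \Delta E \,\Delta t \;\gtrsim\; \frac{\hbar}{2}
      \quad\Longrightarrow\quad
      \Delta t \;\sim\; \frac{\hbar}{2\,\Delta E}
    \]
    \[
      \Delta E \approx 2 m_e c^2 \approx 1.02\ \text{MeV}
      \quad\Longrightarrow\quad
      \Delta t \sim \frac{6.6\times 10^{-16}\ \text{eV s}}{2 \times 1.02 \times 10^{6}\ \text{eV}}
      \approx 3\times 10^{-22}\ \text{s}
    \]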

    The spacetime vacuum state is seething with particles constantly being created and destroyed, apparently “out of nothing”. But perhaps all this really tells us is that the quantum vacuum is (despite its name) a something rather than a nothing. The philosopher David Albert has memorably criticized accounts of the Big Bang which promise to get something from nothing in this way.

    4
    Simulation of quantum vacuum fluctuations in quantum chromodynamics. Credit: Ahmed Neutron/Wikimedia.

    Suppose we ask: where did spacetime itself arise from? Then we can go on turning the clock yet further back, into the truly ancient “Planck epoch” – a period so early in the universe’s history that our best theories of physics break down [above]. This era occurred only one ten-millionth of a trillionth of a trillionth of a trillionth of a second after the Big Bang. At this point, space and time themselves became subject to quantum fluctuations. Physicists ordinarily work separately with quantum mechanics, which rules the microworld of particles, and with general relativity, which applies on large, cosmic scales. But to truly understand the Planck epoch, we need a complete theory of quantum gravity, merging the two.

    We still don’t have a perfect theory of quantum gravity, but there are attempts – like string theory and loop quantum gravity. In these attempts, ordinary space and time are typically seen as emergent, like the waves on the surface of a deep ocean. What we experience as space and time are the product of quantum processes operating at a deeper, microscopic level – processes that don’t make much sense to us as creatures rooted in the macroscopic world.

    In the “Planck epoch”, our ordinary understanding of space and time breaks down, so we can no longer rely on our ordinary understanding of cause and effect either. Despite this, all candidate theories of quantum gravity describe something physical that was going on in the Planck epoch – some quantum precursor of ordinary space and time. But where did that come from?

    Even if causality no longer applies in any ordinary fashion, it might still be possible to explain one component of the “Planck epoch” universe in terms of another. Unfortunately, by now even our best physics fails completely to provide answers. Until we make further progress towards a “theory of everything”, we won’t be able to give any definitive answer. The most we can say with confidence at this stage is that physics has so far found no confirmed instances of something arising from nothing.

    Cycles from almost nothing

    To truly answer the question of how something could arise from nothing, we would need to explain the quantum state of the entire universe at the beginning of the Planck epoch. All attempts to do this remain highly speculative. Some of them appeal to supernatural forces like a “designer”. But other candidate explanations remain within the realm of physics – such as a multiverse, which contains an infinite number of parallel universes, or cyclical models of the universe, being born and reborn again.

    The 2020 Nobel Prize-winning physicist Roger Penrose has proposed one intriguing but controversial model for a cyclical universe dubbed “conformal cyclic cosmology”. Penrose was inspired by an interesting mathematical connection between a very hot, dense, small state of the universe – as it was at the Big Bang – and an extremely cold, empty, expanded state of the universe – as it will be in the far future. His radical theory to explain this correspondence is that those states become mathematically identical when taken to their limits. Paradoxical though it might seem, a total absence of matter might have managed to give rise to all the matter we see around us in our universe.


    Nobel Lecture: Roger Penrose, Nobel Prize in Physics 2020
    34 minutes

    In this view, the Big Bang arises from an almost nothing. That’s what’s left over when all the matter in a universe has been consumed into black holes, which have in turn boiled away into photons – lost in a void. The whole universe thus arises from something that – viewed from another physical perspective – is as close as one can get to nothing at all. But that nothing is still a kind of something. It is still a physical universe, however empty.

    How can the very same state be a cold, empty universe from one perspective and a hot dense universe from another? The answer lies in a complex mathematical procedure called “conformal rescaling”, a geometrical transformation which in effect alters the size of an object but leaves its shape unchanged.

    Penrose showed how the cold empty state and the hot dense state could be related by such rescaling so that they match with respect to the shapes of their spacetimes – although not to their sizes. It is, admittedly, difficult to grasp how two objects can be identical in this way when they have different sizes – but Penrose argues size as a concept ceases to make sense in such extreme physical environments.
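    In the language of general relativity, a conformal rescaling has a compact standard form (this is textbook notation, not anything specific to Penrose’s own papers):

    \[
      \tilde{g}_{ab}(x) \;=\; \Omega^{2}(x)\, g_{ab}(x), \qquad \Omega(x) > 0,
    \]
    where \(g_{ab}\) is the original spacetime metric, \(\tilde{g}_{ab}\) the rescaled one, and \(\Omega\) a smooth, position-dependent scale factor. Because all lengths at a given point are multiplied by the same factor, angles (and hence light cones and the “shape” of the spacetime) are preserved, while sizes are not.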

    In conformal cyclic cosmology, the direction of explanation goes from old and cold to young and hot: the hot dense state exists because of the cold empty state. But this “because” is not the familiar one – of a cause followed in time by its effect. It is not only size that ceases to be relevant in these extreme states: time does too. The cold empty state and the hot dense state are in effect located on different timelines. The cold empty state would continue on forever from the perspective of an observer in its own temporal geometry, but the hot dense state it gives rise to effectively inhabits a new timeline all its own.

    It may help to understand the hot dense state as produced from the cold empty state in some non-causal way. Perhaps we should say that the hot dense state emerges from, or is grounded in, or realised by the cold, empty state. These are distinctively metaphysical ideas which have been explored by philosophers of science extensively, especially in the context of quantum gravity where ordinary cause and effect seem to break down. At the limits of our knowledge, physics and philosophy become hard to disentangle.

    Experimental evidence?

    Conformal cyclic cosmology offers some detailed, albeit speculative, answers to the question of where our Big Bang came from. But even if Penrose’s vision is vindicated by the future progress of cosmology, we might think that we still wouldn’t have answered a deeper philosophical question – a question about where physical reality itself came from. How did the whole system of cycles come about? Then we finally end up with the pure question of why there is something rather than nothing – one of the biggest questions of metaphysics.

    But our focus here is on explanations which remain within the realm of physics. There are three broad options for answering the deeper question of how the cycles began. It could have no physical explanation at all. Or there could be endlessly repeating cycles, each a universe in its own right, with the initial quantum state of each universe explained by some feature of the universe before. Or there could be one single cycle and one single repeating universe, with the beginning of that cycle explained by some feature of its own end. The latter two approaches avoid the need for any uncaused events – and this gives them a distinctive appeal. Nothing would be left unexplained by physics.

    5
    Ongoing cycles of distinct universes in conformal cyclic cosmology. Roger Penrose.

    Penrose envisages a sequence of endless new cycles for reasons partly linked to his own preferred interpretation of quantum theory. In quantum mechanics, a physical system exists in a superposition of many different states at the same time, and it only “picks one” randomly when we measure it. For Penrose, each cycle involves random quantum events turning out a different way – meaning each cycle will differ from those before and after it. This is actually good news for experimental physicists, because it might allow us to glimpse the old universe that gave rise to ours through faint traces, or anomalies, in the leftover radiation from the Big Bang seen by the Planck satellite.

    Penrose and his collaborators believe they may have spotted these traces already [MNRAS], attributing patterns in the Planck data [CMB] to radiation from supermassive black holes in the previous universe. However, their claimed observations have been challenged by other physicists [Journal of Cosmology and Astroparticle Physics] and the jury remains out.

    CMB per European Space Agency(EU) Planck.

    Endless new cycles are key to Penrose’s own vision. But there is a natural way to convert conformal cyclic cosmology from a multi-cycle to a one-cycle form. Then physical reality consists in a single cycle, running from the Big Bang to a maximally empty state in the far future, and then around again to the very same Big Bang, giving rise to the very same universe all over again.

    This latter possibility is consistent with another interpretation of quantum mechanics, dubbed the many-worlds interpretation. The many-worlds interpretation tells us that each time we measure a system that is in superposition, this measurement doesn’t randomly select a state. Instead, the measurement result we see is just one possibility – the one that plays out in our own universe. The other measurement results all play out in other universes in a multiverse effectively cut off from our own. So no matter how small the chance of something occurring, if it has a non-zero chance then it occurs in some quantum parallel world. There are people just like you out there in other worlds who have won the lottery, or have been swept up into the clouds by a freak typhoon, or have spontaneously ignited, or have done all three simultaneously.
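    As a toy illustration of the difference between these two readings, here is a minimal Python sketch of a Born-rule measurement on a two-state superposition; the amplitudes are arbitrary example values, not drawn from any experiment discussed here:

import random

# A qubit in the superposition a|0> + b|1>, with |a|^2 + |b|^2 = 1.
a = 0.6 + 0.0j   # example amplitude for outcome 0
b = 0.8 + 0.0j   # example amplitude for outcome 1

p0 = abs(a) ** 2   # Born-rule probability of outcome 0 -> 0.36
p1 = abs(b) ** 2   # Born-rule probability of outcome 1 -> 0.64

# Single-world reading: one outcome is randomly selected on measurement.
outcome = random.choices([0, 1], weights=[p0, p1])[0]
print("measured outcome:", outcome)

# Many-worlds reading: both branches occur; the weights describe the
# relative "measure" of the branches rather than a random selection.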

    Some people believe such parallel universes may also be observable [MNRAS] in cosmological data as imprints caused by another universe colliding with ours.

    Many-worlds quantum theory gives a new twist on conformal cyclic cosmology, though not one that Penrose agrees with. Our Big Bang might be the rebirth of one single quantum multiverse containing infinitely many different universes all occurring together. Everything possible happens – then it happens again and again and again.

    An ancient myth

    For a philosopher of science, Penrose’s vision is fascinating. It opens up new possibilities for explaining the Big Bang, taking our explanations beyond ordinary cause and effect. It is therefore a great test case for exploring the different ways physics can explain our world. It deserves more attention from philosophers.

    For a lover of myth, Penrose’s vision is beautiful. In Penrose’s preferred multi-cycle form, it promises endless new worlds born from the ashes of their ancestors. In its one-cycle form, it is a striking modern re-invocation of the ancient idea of the ouroboros, or world-serpent. In Norse mythology, the serpent Jörmungandr is a child of Loki, a clever trickster, and the giant Angrboda. Jörmungandr consumes its own tail, and the circle created sustains the balance of the world. But the ouroboros myth has been documented all over the world – including as far back as ancient Egypt.

    6
    Ouroboros on the tomb of Tutankhamun. Credit: Djehouty/Wikimedia.

    The ouroboros of the one cyclic universe is majestic indeed. It contains within its belly our own universe, as well as every one of the weird and wonderful alternative possible universes allowed by quantum physics – and at the point where its head meets its tail, it is completely empty yet also coursing with energy at temperatures of a hundred thousand million billion trillion degrees Celsius. Even Loki, the shapeshifter, would be impressed.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The Conversation launched as a pilot project in October 2014. It is an independent source of news and views from the academic and research community, delivered direct to the public.
    Our team of professional editors works with university and research institute experts to unlock their knowledge for use by the wider public.
    Access to independent, high-quality, authenticated, explanatory journalism underpins a functioning democracy. Our aim is to promote better understanding of current affairs and complex issues, and hopefully to allow for a better quality of public discourse and conversation.

     