Tagged: Quantum Mechanics

  • richardmitnick 4:46 pm on July 15, 2017 Permalink | Reply
    Tags: Laser communication to the orbit, Quantum cryptography, Quantum Mechanics

    From Max Planck Gesellschaft: “Quantum communication with a satellite” 

    Max Planck Gesellschaft

    July 10, 2017
    Prof. Dr. Gerd Leuchs
    Max Planck Institute for the Science of Light, Erlangen
    Phone:+49 9131 7133-100
    Fax:+49 9131 7133-109
    gerd.leuchs@mpl.mpg.de

    What started out as exotic research in physics laboratories could soon change the global communication of sensitive data: quantum cryptography. Interest in this technique has grown rapidly over the last two years or so. The most recent work in this field, which a team headed by Christoph Marquardt and Gerd Leuchs at the Max Planck Institute for the Science of Light in Erlangen is now presenting, is set to heighten the interest of telecommunications companies, banks and governmental institutions even further. This is due to the fact that the physicists collaborating with the company Tesat-Spacecom and the German Aerospace Center have now created one precondition for using quantum cryptography to communicate over large distances as well without any risk of interception. They measured the quantum states of light signals which were transmitted from a geostationary communication satellite 38,000 kilometres away from Earth. The physicists are therefore confident that a global interception-proof communications network based on established satellite technology could be set up within only a few years.

    More versatile than originally thought: A part of the Alphasat I-XL was actually developed to demonstrate data transmission between the Earth observation satellites of the European Copernicus project and Earth, but has now helped a group including researchers from the Max Planck Institute for the Science of Light to test the measurement of quantum states after they have been transmitted over a distance of 38,000 kilometres.© ESA.

    Sensitive data from banks, state institutions or the health sector, for example, must not fall into unauthorized hands. Although modern encryption techniques are far advanced, they can be cracked in many cases if sufficient effort is expended. And conventional encryption methods would hardly represent a challenge for the quantum computers of the future. While scientists used to think that the realization of such a computer was still a very long way off, considerable progress in the recent past has raised physicists’ hopes. “A quantum computer could then also crack the data being stored today,” as Christoph Marquardt, leader of a research group at the Max Planck Institute for the Science of Light, states. “And this is why we are harnessing quantum cryptography now in order to develop a secure data transfer method.”

    Quantum mechanics protects a key against spies

    In quantum cryptography, two parties exchange a secret key, which can be used to encrypt messages. Unlike established public key encryption methods, this method cannot be cracked as long as the key does not fall into the wrong hands. In order to prevent this from happening, the two parties send each other keys in the form of quantum states in laser pulses. The laws of quantum mechanics protect a key from spies here, because any eavesdropping attempt will inevitably leave traces in the signals, which sender and recipient will detect immediately. This is because reading quantum information equates to a measurement on the light pulse, which inevitably changes the quantum state of the light.
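    This tamper-evidence can be illustrated with a toy simulation. The sketch below is illustrative only: it models the discrete-variable BB84 protocol, whereas the Erlangen experiments use continuous-variable signals, but the principle is the same. An eavesdropper who must measure each pulse raises the error rate in the positions where sender and receiver happened to use the same basis from zero to roughly 25 percent, which the two parties detect by comparing a sample of their bits.

```python
import random

def bb84_error_rate(n_bits, eavesdrop, seed=1):
    """Toy BB84: Alice sends bits in random bases, Bob measures in random
    bases; an eavesdropper must measure (and so disturb) each pulse."""
    rng = random.Random(seed)
    errors = kept = 0
    for _ in range(n_bits):
        bit = rng.randint(0, 1)
        basis_a = rng.randint(0, 1)
        if eavesdrop:
            basis_e = rng.randint(0, 1)
            # Measuring in the wrong basis randomizes the bit Eve re-sends.
            bit_sent = bit if basis_e == basis_a else rng.randint(0, 1)
            basis_sent = basis_e          # Eve re-emits in her own basis
        else:
            bit_sent, basis_sent = bit, basis_a
        basis_b = rng.randint(0, 1)
        if basis_b == basis_a:            # Alice and Bob later keep only these
            measured = bit_sent if basis_b == basis_sent else rng.randint(0, 1)
            kept += 1
            errors += (measured != bit)
    return errors / kept

print(bb84_error_rate(100_000, eavesdrop=False))  # 0.0: undisturbed channel
print(bb84_error_rate(100_000, eavesdrop=True))   # ~0.25: eavesdropping leaves traces
```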

    In the laboratory and over short distances, quantum key distribution already works rather well via the optical fibres used in optical telecommunications technology. Over large distances, however, the weak and sensitive quantum signals need to be refreshed, which is difficult for much the same reason that the laser pulses cannot be intercepted unnoticed: amplifying a quantum signal amounts to measuring it. Christoph Marquardt and his colleagues are therefore relying on the transmission of quantum states via the atmosphere, between Earth and satellites to be precise, to set up a global communications network that is protected by quantum cryptography.

    Laser communication to the orbit: The infrared image shows the ground station for the communication with the Alphasat I-XL satellite 38,000 kilometres away. The receiver sends an infrared laser beam in the direction of the orbit so that the satellite can find it. Since the beam is scattered by a higher atmospheric layer, it appears as a larger spot. © Imran Khan, MPI for the Science of Light.

    In their current publication [Optica], the researchers showed that this can largely be based on existing technology. Using a measuring device on the Canary Island of Tenerife, they detected the quantum properties of laser pulses which the Alphasat I-XL communications satellite had transmitted to Earth. The satellite circles Earth in a geostationary orbit and therefore appears to stand still in the sky. The satellite, which was launched in 2013, carries laser communication equipment belonging to the European Space Agency ESA. The company Tesat-Spacecom, headquartered in Backnang near Stuttgart, developed the technology in collaboration with the German Aerospace Center as part of the European Copernicus project for Earth observation, which is funded by the German Federal Ministry for Economic Affairs and Energy.


    ESA Sentinels (Copernicus)

    While Alphasat I-XL was never intended for quantum communication, “we found out at some stage, however, that the data transmission of the satellite worked according to the same principle as that of our laboratory experiments,” explains Marquardt, “which is by modulating the amplitude and phase of the light waves.” The amplitude is a measure of the intensity of the light waves, while the phase describes the mutual shift of two waves.

    The laser beam is 800 metres wide after travelling 38,000 kilometres

    For conventional data transmission, the modulation of the amplitude, for example, is made particularly large. This makes it easier to read out in the receiver and guarantees a clear signal. Marquardt and his colleagues were striving to achieve the exact opposite, however: in order to get down to the quantum level with the laser pulses, they have to greatly reduce the amplitude.

    The signal, which is therefore already extremely weak, is attenuated a great deal more as it is being transmitted to Earth. The largest loss occurs due to the widening of the laser beam. After 38,000 kilometres, it has a diameter of 800 metres at the ground, while the diameter of the mirror in the receiving station is a mere 27 centimetres. Further receiving mirrors, which uninvited listeners could use to eavesdrop on the communication, could easily be accommodated in a beam which is widened to such an extent. The quantum cryptography procedure, however, takes this into account. In a simple picture, it exploits the fact that a photon – which is what the signals of quantum communication employ – can only be measured once completely: either with the measuring apparatus of the lawful recipient or the eavesdropping device of the spy. The exact location at which a photon is registered within the beam diameter, however, is left to chance.
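    The scale of this geometric loss can be estimated from the figures quoted above. Treating the 800-metre spot as uniformly illuminated (a simplification; a real beam has a roughly Gaussian profile), the 27-centimetre mirror collects only about one part in ten million of the light:

```python
import math

beam_diameter_m = 800.0    # spot size on the ground after 38,000 km (from the text)
mirror_diameter_m = 0.27   # aperture of the receiving station's mirror

# Idealized estimate: with the power spread uniformly over the spot,
# the collected fraction is simply the ratio of the two areas.
fraction = (mirror_diameter_m / beam_diameter_m) ** 2
loss_db = -10 * math.log10(fraction)

print(f"collected fraction: {fraction:.2e}")  # ~1.1e-07 of the transmitted light
print(f"geometric loss: {loss_db:.1f} dB")    # roughly 69 dB
```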

    The experiment carried out at the beginning of 2016 was successful despite the greatly attenuated signal, because the scientists found that the properties of the signals received on the ground came very close to the limit of quantum noise. The noise of laser light is the term physicists use to describe variations in the detected photons. Some of this irregularity is caused by the inadequacies of the transmitting and receiving equipment or turbulence in the atmosphere, and can therefore be avoided in principle. Other variations result from the laws of quantum physics – more precisely, the uncertainty principle – according to which the amplitude and phase of light cannot be specified simultaneously to any arbitrary level of accuracy.
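    The quantum-noise limit mentioned here has a simple statistical signature: for ideal laser light, photon counts are Poisson-distributed, so the variance of the counts equals their mean (the shot-noise limit). A small sketch, using Knuth's multiplication method for sampling and an arbitrary illustrative mean count of 50:

```python
import math
import random
import statistics

def poisson_sample(mean, rng):
    """Draw one Poisson-distributed photon count (Knuth's multiplication method)."""
    threshold = math.exp(-mean)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

rng = random.Random(0)
counts = [poisson_sample(50.0, rng) for _ in range(20_000)]
mean = statistics.fmean(counts)
var = statistics.pvariance(counts)
# For shot-noise-limited light the Fano factor var/mean is 1.
print(round(mean, 1), round(var, 1), round(var / mean, 2))
```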

    Quantum cryptography can use established satellite technology

    Since the transmission with the aid of the Tesat system already renders the quantum properties of the light pulses measurable, this technique can be used as the basis on which to develop satellite-based quantum cryptography. “We were particularly impressed by this because the satellite had not been designed for quantum communication,” as Marquardt explains.

    Together with their colleagues from Tesat and other partners, the Erlangen physicists now want to develop a new satellite specifically customized for the needs of quantum cryptography. Since they can largely build on tried and tested technology, this should take much less time than a development from scratch. Their main tasks are to develop an on-board computer designed for quantum communication and to make the quantum mechanical random number generator, which supplies the cryptographic key, fit for space.

    Consequently, quantum cryptography, which started out as an exotic playground for physicists, has come quite close to practical application. The race for the first operational secure system is in full swing. Countries such as Japan, Canada, the USA and, in particular, China are funneling a lot of money into this research. “The conditions for our research have changed completely,” explains Marquardt. “At the outset, we attempted to whet industry’s appetite for such a method; today they are coming to us without prompting and asking for practicable solutions.” These could become reality in the next five to ten years.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    The Max Planck Society is Germany’s most successful research organization. Since its establishment in 1948, no fewer than 18 Nobel laureates have emerged from the ranks of its scientists, putting it on a par with the best and most prestigious research institutions worldwide. The more than 15,000 publications each year in internationally renowned scientific journals are proof of the outstanding research work conducted at Max Planck Institutes – and many of those articles are among the most-cited publications in the relevant field.

    What is the basis of this success? The scientific attractiveness of the Max Planck Society is based on its understanding of research: Max Planck Institutes are built up solely around the world’s leading researchers. They themselves define their research subjects and are given the best working conditions, as well as free rein in selecting their staff. This is the core of the Harnack principle, which dates back to Adolf von Harnack, the first president of the Kaiser Wilhelm Society, which was established in 1911. This principle has been successfully applied for nearly one hundred years. The Max Planck Society continues the tradition of its predecessor institution with this structural principle of the person-centered research organization.

    The currently 83 Max Planck Institutes and facilities conduct basic research in the service of the general public in the natural sciences, life sciences, social sciences, and the humanities. Max Planck Institutes focus on research fields that are particularly innovative, or that are especially demanding in terms of funding or time requirements. And their research spectrum is continually evolving: new institutes are established to find answers to seminal, forward-looking scientific questions, while others are closed when, for example, their research field has been widely established at universities. This continuous renewal preserves the scope the Max Planck Society needs to react quickly to pioneering scientific developments.

     
  • richardmitnick 10:40 am on July 3, 2017 Permalink | Reply
    Tags: Larry Zamick, Lattice dynamics, Max Born, Quantum Mechanics, Vigyan Prasar Science Portal, Werner Heisenberg

    Provided by Larry Zamick, Physics, Rutgers University: “Max Born Founder of Lattice Dynamics” 

    From Vigyan Prasar Science Portal

    Max Born

    “I am now convinced that theoretical physics is actual philosophy.”

    “The problem of physics is how the actual phenomena, as observed with the help of our sense organs aided by instruments, can be reduced to simple notions which are suited for precise measurement and used for the formulation of quantitative laws.”

    Max Born in his Experiments and Theory in Physics

    Max Born was a pioneer in developing quantum mechanics; indeed, the term “quantum mechanics” was introduced by Born. Born’s initial research interests were lattice dynamics and how atoms in solids are held together and vibrate. The Born-Haber cycle, a cycle of theoretical reactions and changes, allows calculation of the lattice energy of ionic crystals. In 1926, immediately after his student Werner Heisenberg had formulated the first laws of a new quantum theory of atoms, Born collaborated with him to develop the mathematical formulation that would adequately describe it. It was Born who first showed that the solution of Schrodinger’s quantum mechanical wave equation has a statistical meaning of physical significance. Born’s interpretation of the wave equation has proved to be of great importance in the new quantum theory. Born also reformulated the first law of thermodynamics: he produced a very precise definition of quantity of heat and thus provided the most satisfactory mathematical interpretation of the first law.
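    As a concrete illustration of the Born-Haber cycle, the lattice energy of sodium chloride can be recovered from standard formation enthalpies; the cycle closes because enthalpy is a state function. The figures below are approximate textbook values in kJ/mol:

```python
# Born-Haber cycle for NaCl: the lattice energy is the step that closes
# the thermodynamic cycle.  All values are approximate textbook figures.
dH_formation   = -411   # Na(s) + 1/2 Cl2(g) -> NaCl(s)
dH_sublimation = +107   # Na(s) -> Na(g)
ionization     = +496   # Na(g) -> Na+(g) + e-
half_dissoc    = +122   # 1/2 Cl2(g) -> Cl(g)
electron_aff   = -349   # Cl(g) + e- -> Cl-(g)

lattice_energy = dH_formation - (dH_sublimation + ionization
                                 + half_dissoc + electron_aff)
print(lattice_energy)   # -787 kJ/mol, close to the accepted value for NaCl
```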

    Commenting on Born’s scientific contributions, the winner of the 1977 Nobel Prize for Physics, Sir Nevill Francis Mott (1905-1996), wrote: “As the founder of lattice dynamics, that is, the theory of how atoms in solids stick together and vibrate, Max Born is one of the pre-eminent physicists of this century. His celebrated work on cohesion in ionic crystals formed the bridge between the physicist’s and chemist’s ways of studying crystals. For the physicists, lattice energies of the crystals were of central interest and for the chemists, heats of reaction. Born showed that the ionization potentials of the atoms could be used to compare the chemical and physical concepts. This was a landmark.”

    Max Born was born on December 11, 1882 in Breslau, Germany (now Wroclaw, Poland). His father, Gustav Born, was a professor of embryology at the University of Breslau, and his mother, Margarete Born (nee Kaufmann), came from a Silesian family of industrialists. It was from his mother that Born inherited his love for music. Born’s mother died when he was four years old. In his childhood, Born suffered badly from colds and asthma, which continued to afflict him throughout his life. Because of his bad health, he was taught by a private tutor at home for a year; then, after spending two years in a preparatory school, he joined the Wilhelm’s Gymnasium in Breslau. At the Gymnasium, Born studied a wide range of subjects including mathematics, physics, history, modern languages, Latin, Greek, and German. At school, Born did not display any sign of being a gifted child. He was more interested in the humanities than in science subjects.

    In 1901, Born joined the University of Breslau. Following his father’s advice, Born did not specialize in any particular subject. He took a wide range of subjects including mathematics, astronomy, physics, chemistry, logic, philosophy, and zoology. At the Breslau University, Born became interested in mathematics, and the credit for this goes to his teachers Rosanes and London. Rosanes, a student of Leopold Kronecker (1823-91), who developed algebraic number theory and invented the Kronecker delta, gave brilliant lectures on analytical geometry. It was Rosanes who introduced Born to the ideas of group theory and matrix calculus, which were later used successfully by Born to solve physical problems. London’s lectures on definite integrals and on analytical mechanics were clear and lucid. The resultant effect of the teachings of Rosanes and London was that Born was drawn towards mathematics. Some of his classmates helped him develop an interest in science. One classmate, named Lachmann, awakened his interest in astronomy. Another classmate, Otto Toeplitz, introduced Born to the lives and works of some of the greatest mathematicians, such as Euler, Lagrange, Cauchy and Riemann. Toeplitz had learnt these from his father, who was a schoolmaster and mathematician. In his later life Born acknowledged his debt to Otto Toeplitz ‘for the first introduction to these pathfinders in mathematical science’.

    In those days it was a common practice for a German student to move from university to university. And Born was no exception. In 1902 Born went to the University of Heidelberg and then in 1903 he went to the University of Zurich. It was at Zurich that Born attended his first course on advanced mathematics, given by Adolf Hurwitz (1859-1919). After coming back to Breslau University, he was told by his classmates Toeplitz and Hellinger of the great teachers of mathematics at the University of Gottingen: Christian Felix Klein (1849-1925), the founder of modern geometry unifying Euclidean and non-Euclidean geometry; David Hilbert (1862-1943), who originated the concept of Hilbert space; and Hermann Minkowski (1864-1909), who developed the mathematics that played a crucial role in Einstein’s formulation of the theory of relativity. So Born went to the University of Gottingen to attend lectures by these great scientists. At the Gottingen University, Born served as an assistant to David Hilbert. He attended lectures by Klein and Carl Runge (1856-1927) on elasticity and a seminar on electrodynamics by Hilbert and Minkowski. Klein was annoyed with Born because of Born’s irregular attendance at his lectures. Born then attended Schwarzschild’s astronomy lectures. During his student days at the Gottingen University, he had the opportunity to go for walks in the woods with Hilbert and Minkowski. During these walks, all manner of fascinating subjects were discussed in addition to mathematics, including problems pertaining to philosophy, politics and social issues. Born also interacted with non-mathematicians like Courant, Schmidt and Caratheodory.

    Born earned his PhD in physics from the University of Gottingen in 1907. He then undertook compulsory military service. However, he did not have to complete the standard one-year period because he suffered from asthma. Even his brief stint with the military made him loathe all things military. After serving in the military, Born visited Caius College, Cambridge, for six months (1908-1909) to study under Larmor and J. J. Thomson. He came back to Breslau and worked there with the physicists Lummer and Pringsheim. Around this time he was fascinated by Einstein’s work on relativity. Born’s work on combining the ideas of Einstein and Minkowski led Minkowski to invite him to Gottingen in 1909 as his assistant. However, Minkowski died within weeks of Born’s arrival in Gottingen. In 1912, Born joined the faculty of the Gottingen University and began working with Theodore von Karman (1881-1963), who discovered Karman vortices.

    In 1915 Born was appointed as Professor (extraordinarius) at the Berlin University to assist Max Planck. At the time Albert Einstein was also at the Berlin University. However, Born soon had to join the Army. He was attached to a scientific office of the Army, where he worked on the theory of sound ranging. He also managed to find time to work on the theory of crystals, which led to the publication of his first book, entitled “Dynamics of Crystal Lattices”, summarizing a series of investigations that Born had initiated at Gottingen.

    In 1919, after the conclusion of the First World War, Born was appointed Professor at the University of Frankfurt-on-Main, where a laboratory was put at his disposal. Here Born’s assistant was Otto Stern, and the first of Stern’s well-known experiments, which were later recognized with a Nobel Prize, originated there.

    In 1921, Born came back to the University of Gottingen as Professor of Physics, where he stayed for 12 years, interrupted only by a visit to the USA in 1925. Among his collaborators at Gottingen were Pauli, Heisenberg, Jordan, Fermi, Dirac, Hund, Weisskopf, Oppenheimer, Joseph Mayer and Maria Goeppert-Mayer. It was during this stay that Born’s most important contributions to physics were made. He published a modernized version of his book on crystals. Assisted by his students, he undertook numerous investigations on crystal lattices, followed by a series of studies on quantum theory. He collaborated with Heisenberg and Jordan to develop further the principles of quantum mechanics discovered by Heisenberg. He also undertook his own studies on the statistical interpretation of quantum mechanics. Born proposed that what Schrodinger had described with his wave equation was not the electron itself, but the probability of finding the electron in any given location. Suppose you bombard a barrier with electrons: some will go through and some will bounce off. Born figured out that a single electron has, say, a 55 percent chance of going through the barrier and a 45 percent chance of bouncing back. Because electrons cannot divide, Schrodinger’s quantum mechanical wave equation could not have been describing the electron itself; what it was describing was its probable location. Born’s interpretation was hailed by Leon Lederman as “the single most dramatic and major change in our world view since Newton”. At the beginning, however, Born’s interpretation was not liked by Schrodinger, the propounder of the wave equation, or by many other physicists, including Einstein. Born corresponded with Einstein on the subject, and the Born-Einstein letters were published in 1971. Born’s proposition of probability meant that the determinism of Newton’s classical physics was no longer valid. There is no predetermined way in which an absolute prediction can be made, as in classical physics; everything depends on probability. A similar idea is embodied in the uncertainty principle of Werner Heisenberg. But Bohr, Sommerfeld, Heisenberg and many others took Born’s ideas seriously, and they continued the exciting work of trying to get all the pieces to fit.
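    Born's point, that the wave equation predicts long-run frequencies rather than fractions of electrons, can be mimicked with a simple sampling sketch. The 55/45 split is the illustrative figure used above, not a computed transmission coefficient:

```python
import random

def run_barrier_experiment(n_electrons, p_transmit=0.55, seed=42):
    """Each electron is detected whole on one side of the barrier or the
    other; only the long-run frequencies follow the Born-rule probability."""
    rng = random.Random(seed)
    transmitted = sum(rng.random() < p_transmit for _ in range(n_electrons))
    return transmitted, n_electrons - transmitted

t, r = run_barrier_experiment(100_000)
print(t / (t + r))   # close to 0.55: frequencies, never fragments of electrons
```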

    Born introduced a useful technique, known as the Born approximation, for solving problems concerning the scattering of atomic particles. Born and J. Robert Oppenheimer introduced a widely used simplification of the calculations dealing with electronic structures of molecules. This work, known as the Born-Oppenheimer theory of molecules, deals with interatomic forces.

    In 1933, like many other scientists of Jewish origin, Born was forced to leave Germany. He went to England and became Stokes Lecturer at the University of Cambridge. He worked there for three years. During these years he mostly worked in the field of nonlinear electrodynamics, which he developed with Infeld.

    During the winter of 1935-1936, Born spent six months in Bangalore at the invitation of C. V. Raman. Commenting on his coming to Bangalore and subsequent events, Born said: “As I had no other job, I was willing to accept Raman’s offer, namely a permanent position at his institute, if he could obtain the consent of the Council. Then he insisted on my attending the next faculty meeting which had to decide on bringing my appointment before the Council.

    The English professor Aston (who had joined around the same time) went up and spoke in a most unpleasant way against Raman’s motion, declaring that a second-rank foreigner driven out of his own country was not good enough for them. I was so shaken that, when I returned home, I simply cried.”

    Born was elected to the Tait Chair of natural philosophy at the University of Edinburgh in 1936, and in the same year he became a British subject. One of Born’s research students described Born’s days at Edinburgh: “When Born arrived in the morning he first used to make the round of his research students, asking them whether they had any progress to report, and giving them advice, sometimes presenting them with sheets of elaborate calculations concerning their problems which he had himself done the day before…The rest of the morning was spent by Born in delivering his lectures to undergraduate honours students, attending to departmental business, and doing research work of his own. Most of the latter, however, he used to carry out at home in the afternoons and evenings.”

    After his retirement in 1953 Born went back to his native country and settled in Gottingen. In 1954 he was awarded the Nobel Prize in Physics “for his fundamental research in quantum mechanics, especially for his statistical interpretation of the wavefunction.” He shared the Prize with Walther Wilhelm Georg Franz Bothe (1891-1957).

    Born was awarded Fellowships of many scientific academies—Gottingen, Moscow, Berlin, Bangalore, Bucharest, Edinburgh, London, Lima, Dublin, Copenhagen, Stockholm, Washington, and Boston. He was awarded honorary doctorates from a number of universities including Bristol, Bordeaux, Oxford, Freidburg/Breisgau, Edinburgh, Oslo, and Brussels. He received the Stokes Medal of Cambridge, the Max Planck Medal of the German Physical Society, the Hughes Medal of the Royal Society of London. He was also awarded the MacDougall-Brisbane Prize, the Gunning-Victoria Jubilee Prize of the Royal Society, Edinburgh and the Grand Cross of Merit with Star of the order of Merit of the German Federal Republic.
    During his post-retirement life in Bad Pyrmont, a town near Gottingen, Born wrote many articles and books on the philosophy of science and the impact of science on human affairs, particularly the responsibility of scientists for the use of nuclear energy in war and peace. He was totally against the use of contemporary scientific knowledge of nuclear energy for warfare. He took the initiative in 1955 to get a statement on this subject signed by a gathering of Nobel Laureates. Born died on January 05, 1970 and is buried in Gottingen. His tombstone displays his fundamental equation of matrix mechanics: pq - qp = h/(2πi).
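    The relation on the tombstone can be checked numerically. No finite matrices satisfy it exactly (the trace of a commutator of finite matrices is zero), but in a truncated harmonic-oscillator basis, with units chosen so that h/2π = 1, it holds on every diagonal entry except the last:

```python
import numpy as np

N = 8                                  # truncate the infinite matrices to 8 x 8
a = np.diag(np.sqrt(np.arange(1, N)), k=1)   # annihilation operator
q = (a + a.T) / np.sqrt(2)             # position  (hbar = m = omega = 1)
p = 1j * (a.T - a) / np.sqrt(2)        # momentum

c = p @ q - q @ p
# Born's relation pq - qp = h/(2*pi*i) = -i*hbar holds on every diagonal
# entry except the last one, an unavoidable artifact of truncation.
print(np.round(np.diag(c), 10))
```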

    References [Sorry, no links provided.]

    Born, Max. My Life: Reflections of a Nobel Laureate. London: Taylor & Francis, 1978.

    A Dictionary of Scientists. Oxford: Oxford University Press, 1999.

    The Cambridge Dictionary of Scientists (Second Edition). Cambridge: Cambridge University Press, 2002.

    Parthasarathy, R. Paths of Innovators in Science, Engineering & Technology. Chennai: East West Books (Madras) Pvt. Ltd., 2000.

    Spangenburg, Ray and Diane K. Moser. The History of Science: From 1895 to 1945. Hyderabad: Universities Press (India) Ltd., 1999.

    Dardo, Mauro. Nobel Laureates and Twentieth-Century Physics. Cambridge: Cambridge University Press, 2004.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

     
  • richardmitnick 4:01 pm on July 1, 2017 Permalink | Reply
    Tags: Quantum Mechanics, Van der Waals interactions repulsive in confinement

    From phys.org: “Researchers refute textbook knowledge in molecular interactions” 

    phys.org

    June 29, 2017

    Repulsive ground-state interaction E_rep (solid lines) and the sum of repulsion and London attraction energy E_att (broken lines) for argon and methane dimers on a perfectly reflecting surface. Credit: arXiv:1610.09275 [cond-mat.mes-hall]

    Van der Waals interactions between molecules are among the most important forces in biology, physics, and chemistry, as they determine the properties and physical behavior of many materials. For a long time, it was considered that these interactions between molecules are always attractive. Now, for the first time, Mainak Sadhukhan and Alexandre Tkatchenko from the Physics and Materials Science Research Unit at the University of Luxembourg found that in many rather common situations in nature the van der Waals force between two molecules becomes repulsive. This might lead to a paradigm shift in molecular interactions.

    “The textbooks so far assumed that the forces are solely attractive. For us, the interesting question is whether you can also make them repulsive,” Prof Tkatchenko explains. “Until recently, there was no evidence in scientific literature that van der Waals forces could also be repelling.” Now, the researchers have shown in their paper, published in the renowned scientific journal Physical Review Letters, that the forces are, in fact, repulsive when they take place under confinement.

    The ubiquitous van der Waals force was first explained by the German-American physicist Fritz London in 1930. Using quantum mechanics, he proved the purely attractive nature of the van der Waals force for any two molecules interacting in free space. “However, in nature molecules in most cases interact in confined spaces, such as cells, membranes, nanotubes, etc. It is in this particular situation that van der Waals forces become repulsive at large distances between molecules,” says Prof Tkatchenko.
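    London's free-space result is the pair energy E(r) = -C6/r^6, which is negative, and hence attractive, at every separation; this is the textbook picture the confinement study qualifies. A quick numerical sketch (the C6 coefficient for an argon pair, roughly 64 atomic units, is an approximate literature value used here only for illustration):

```python
# Free-space London dispersion energy E(r) = -C6 / r**6 is attractive
# at every separation.  C6 for an argon pair, ~64 a.u., is illustrative.
C6_ARGON = 64.0

def london_energy(r, c6=C6_ARGON):
    """London dispersion energy for two atoms a distance r apart (atomic units)."""
    return -c6 / r**6

separations = [5.0, 7.0, 10.0, 15.0]      # bohr
energies = [london_energy(r) for r in separations]
print(energies)                           # all negative: attraction everywhere
```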

    Mainak Sadhukhan, the co-author of the study, developed a novel quantum-mechanical method that enabled them to model van der Waals forces in confinement. “We could rationalize many previous experimental results that remained unexplained until now. Our new theory allows, for the first time, for an interpretation of many interesting phenomena observed for molecules under confinement,” Mainak Sadhukhan says.

    The discovery could have many potential implications for the delivery of pharmaceutical molecules in cells, water desalination and transport, and self-assembly of molecular layers in photovoltaic devices.

    Prof Tkatchenko’s research group is working on methods that model the properties of a wide range of intermolecular interactions. As recently as 2016, they found that the true nature of these van der Waals forces differs from conventional wisdom in chemistry and biology, as they have to be treated as coupling between waves rather than as mutual attraction (or repulsion) between particles.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    About Phys.org in 100 Words

    Phys.org™ (formerly Physorg.com) is a leading web-based science, research and technology news service which covers a full range of topics. These include physics, earth science, medicine, nanotechnology, electronics, space, biology, chemistry, computer sciences, engineering, mathematics and other sciences and technologies. Launched in 2004, Phys.org’s readership has grown steadily to include 1.75 million scientists, researchers, and engineers every month. Phys.org publishes approximately 100 quality articles every day, offering some of the most comprehensive coverage of sci-tech developments world-wide. Quantcast 2009 includes Phys.org in its list of the Global Top 2,000 Websites. Phys.org community members enjoy access to many personalized features such as social networking, a personal home page set-up, RSS/XML feeds, article comments and ranking, the ability to save favorite articles, a daily newsletter, and other options.

     
  • richardmitnick 11:23 am on July 1, 2017 Permalink | Reply
    Tags: Entanglement and quantum interference, IBM Q experience, Now an interface based on the popular programming language Python, Quantum Mechanics, Supercomputers still rule

    From SA: “Quantum Computing Becomes More Accessible” 

    Scientific American


    June 26, 2017
    Dario Gil

    Credit: World Economic Forum

    Quantum computing has captured imaginations for almost 50 years. The reason is simple: it offers a path to solving problems that could never be answered with classical machines. Examples include simulating chemistry exactly to develop new molecules and materials and solving complex optimization problems, which seek the best solution from among many possible alternatives. Every industry has a need for optimization, which is one reason this technology has so much disruptive potential.

    Until recently, access to nascent quantum computers was restricted to specialists in a few labs around the world. But progress over the past several years has enabled the construction of the world’s first prototype systems that can finally test out ideas, algorithms and other techniques that until now were strictly theoretical.

    Quantum computers tackle problems by harnessing the power of quantum mechanics. Rather than considering each possible solution one at a time, as a classical machine would, they behave in ways that cannot be explained with classical analogies. They start out in a quantum superposition of all possible solutions, and then they use entanglement and quantum interference to home in on the correct answer—processes that we do not observe in our everyday lives. The promise they offer, however, comes at the cost of them being difficult to build. A popular design requires superconducting materials (kept 100 times colder than outer space), exquisite control over delicate quantum states and shielding for the processor to keep out even a single stray ray of light.

    Existing machines are still too small to fully solve problems more complex than supercomputers can handle today. Nevertheless, tremendous progress has been made. Algorithms have been developed that will run faster on a quantum machine. Techniques now exist that prolong coherence (the lifetime of quantum information) in superconducting quantum bits by a factor of more than 100 compared with 10 years ago. We can now measure the most important kinds of quantum errors. And in 2016 IBM provided public access to the first quantum computer in the cloud—the IBM Q experience—with a graphical interface for programming it and now an interface based on the popular programming language Python. Opening this system to the world has fueled innovations that are vital for this technology to progress, and to date more than 20 academic papers have been published using this tool. The field is expanding dramatically. Academic research groups and more than 50 start-ups and large corporations worldwide are focused on making quantum computing a reality.
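    The superposition-and-entanglement behavior described above can be illustrated with a minimal hand-rolled statevector sketch (this is not IBM’s actual Python interface, just a few lines of NumPy): a Hadamard gate puts one qubit into superposition, and a CNOT entangles it with a second, leaving only the correlated outcomes 00 and 11.

    ```python
    import numpy as np

    # Single-qubit gates and the two-qubit CNOT, written as matrices.
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard: creates superposition
    I2 = np.eye(2)
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])

    # Start in |00>, put qubit 0 into superposition, then entangle via CNOT.
    state = np.array([1, 0, 0, 0], dtype=complex)
    state = np.kron(H, I2) @ state   # (|00> + |10>) / sqrt(2)
    state = CNOT @ state             # (|00> + |11>) / sqrt(2): a Bell state

    probs = np.abs(state) ** 2
    print(probs)  # probability 0.5 each for |00> and |11>, zero for |01> and |10>
    ```

    Measuring either qubit alone gives a random bit, but the two results always agree — the correlation that entanglement-based algorithms exploit.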

    With these technological advancements and a machine at anyone’s fingertips, now is the time for getting “quantum ready.” People can begin to figure out what they would do if machines existed today that could solve new problems. And many quantum computing guides are available online to help them get started.

    There are still many obstacles. Coherence times must improve, quantum error rates must decrease, and eventually, we must mitigate or correct the errors that do occur. Researchers will continue to drive innovations in both the hardware and software. Investigators disagree, however, over which criteria should determine when quantum computing has achieved technological maturity. Some have proposed a standard defined by the ability to perform a scientific measurement so obscure that it is not easily explained to a general audience. I and others disagree, arguing that quantum computing will not have emerged as a technology until it can solve problems that have commercial, intellectual and societal importance. The good news is, that day is finally within our sights.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
  • richardmitnick 7:36 pm on June 26, 2017 Permalink | Reply
    Tags: Electromagnetic radiation [light], Hyperbolic metamaterials (HMMs), Molecular beam epitaxy, Nanoresonators, Quantum Mechanics

    From Notre Dame: “Notre Dame Researchers Open Path to New Generation of Optical Devices” 

    Notre Dame bloc

    Notre Dame University

    COLLEGE of ENGINEERING

    OFFICE of the PROVOST
    College of Engineering

    June 22, 2017
    Nina Welding

    “Sub-diffraction Confinement in All-semiconductor Hyperbolic Metamaterial Resonators” was co-authored by graduate students Kaijun Feng and Galen Harden and Deborah L. Sivco, engineer-in-residence at MIRTHE+ Photonics Sensing Center, Princeton Univ.

    Cameras, telescopes and microscopes are everyday examples of optical devices that measure and manipulate electromagnetic radiation [light]. Being able to control the light in such devices provides the user with more information through a much better “picture” of what is occurring through the lens. The more information one can glean, the better the next generation of devices can become. Similarly, controlling light on small scales could lead to improved optical sources for applications that span health, homeland security and industry. This is what a team of researchers, led by Anthony Hoffman, assistant professor of electrical engineering and researcher in the University’s Center for Nano Science and Technology (NDnano), has been pursuing. Their findings were recently published in the June 19 issue of ACS Photonics.

    In fact, the team has fabricated and characterized sub-diffraction mid-infrared resonators using all-semiconductor hyperbolic metamaterials (HMMs) that confine light to extremely small volumes — thousands of times smaller than common materials.

    The scanning electron microscope image here shows an array of 0.47 μm wide resonators with a 2.5 μm pitch. No image credit.

    HMMs combine the properties of metals, which are excellent conductors, and dielectrics, which are insulators, to realize artificial optical materials with properties that are very difficult, even impossible, to find naturally. These unusual properties may elucidate the quantum mechanical interactions between light and matter at the nanoscale while giving researchers a powerful tool to control and engineer these light-matter interactions for new optical devices and materials.

    Hoffman’s team engineered these desired properties in the HMMs by growing them via molecular beam epitaxy using III-V semiconductor materials routinely used for high-performance optoelectronic devices, such as lasers and detectors. Layers of Si-doped InGaAs and intrinsic AlInAs were placed on top of one another, with a single layer being 50 nm thick. The total thickness of the HMM was 1 μm, about 100 times smaller than the width of a human hair.
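    Why alternating metallic and dielectric layers yield “hyperbolic” behavior can be sketched with standard effective-medium theory, valid when the layers are much thinner than the wavelength. The permittivity values below are illustrative assumptions, not the paper’s measured numbers: heavily Si-doped InGaAs acts metallically (negative permittivity) while intrinsic AlInAs is an ordinary dielectric.

    ```python
    # Effective-medium picture of a metal/dielectric multilayer.
    # f is the volume fraction of the metallic layers.
    def effective_permittivity(eps_metal, eps_dielectric, f=0.5):
        eps_parallel = f * eps_metal + (1 - f) * eps_dielectric       # in-plane
        eps_perp = 1.0 / (f / eps_metal + (1 - f) / eps_dielectric)   # out-of-plane
        return eps_parallel, eps_perp

    # Illustrative mid-IR values (assumed): doped InGaAs ~ -5, AlInAs ~ +10.
    eps_par, eps_perp = effective_permittivity(eps_metal=-5.0, eps_dielectric=10.0)
    print(eps_par, eps_perp)       # ~2.5 and ~-20: opposite signs
    print(eps_par * eps_perp < 0)  # True -> hyperbolic dispersion
    ```

    When the two principal permittivities have opposite signs, the material supports modes with very large wavevectors, which is what lets these resonators confine light far below the diffraction limit.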

    The nanoresonators were produced by Kaijun Feng, graduate student in the Department of Electrical Engineering, using state-of-the-art fabrication equipment in Notre Dame’s Nanofabrication Facility. The devices were then characterized in Hoffman’s laboratory using a variety of spectroscopic techniques.

    “What is particularly exciting about this work,” says Hoffman, “is that we have found a way to squeeze light into small volumes using a mature semiconductor technology. In addition to being able to employ these nanoresonators to generate mid-infrared light, we believe that these new sources could have significant application in the mid-infrared portion of the spectrum, which is used for optical sensing across areas such as medicine, environmental monitoring, industrial process control and defense. We are also excited about the possibility of utilizing these nanoresonators to study interactions between light and matter that previously have not been possible.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Notre Dame Campus

    The University of Notre Dame du Lac (or simply Notre Dame /ˌnoʊtərˈdeɪm/ NOH-tər-DAYM) is a Catholic research university located near South Bend, Indiana, in the United States. In French, Notre Dame du Lac means “Our Lady of the Lake” and refers to the university’s patron saint, the Virgin Mary.

    The school was founded by Father Edward Sorin, CSC, who was also its first president. Today, many Holy Cross priests continue to work for the university, including as its president. It was established as an all-male institution on November 26, 1842, on land donated by the Bishop of Vincennes. The university first enrolled women undergraduates in 1972. As of 2013 about 48 percent of the student body was female.[6] Notre Dame’s Catholic character is reflected in its explicit commitment to the Catholic faith, numerous ministries funded by the school, and the architecture around campus. The university is consistently ranked one of the top universities in the United States and as a major global university.

    The university today is organized into five colleges and one professional school, and its graduate program has 15 master’s and 26 doctoral degree programs.[7][8] Over 80% of the university’s 8,000 undergraduates live on campus in one of 29 single-sex residence halls, each of which fields teams for more than a dozen intramural sports, and the university counts approximately 120,000 alumni.[9]

    The university is globally recognized for its Notre Dame School of Architecture, a faculty that teaches (pre-modernist) traditional and classical architecture and urban planning (e.g. following the principles of New Urbanism and New Classical Architecture).[10] It also awards the renowned annual Driehaus Architecture Prize.

     
  • richardmitnick 1:27 pm on June 18, 2017 Permalink | Reply
    Tags: China has taken the leadership in quantum communication, China Shatters 'Spooky Action at a Distance' Record, For now the system remains mostly a proof of concept, Global quantum communication is possible and will be achieved in the near future, Preps for Quantum Internet, Quantum Mechanics

    From SA: “China Shatters ‘Spooky Action at a Distance’ Record, Preps for Quantum Internet” 

    Scientific American


    June 15, 2017
    Lee Billings

    Credit: Alfred Pasieka/Getty Images

    In a landmark study, a team of Chinese scientists using an experimental satellite has tested quantum entanglement over unprecedented distances, beaming entangled pairs of photons to three ground stations across China—each separated by more than 1,200 kilometers. The test verifies a mysterious and long-held tenet of quantum theory, and firmly establishes China as the front-runner in a burgeoning “quantum space race” to create a secure, quantum-based global communications network—that is, a potentially unhackable “quantum internet” that would be of immense geopolitical importance. The findings were published Thursday in Science.

    “China has taken the leadership in quantum communication,” says Nicolas Gisin, a physicist at the University of Geneva who was not involved in the study. “This demonstrates that global quantum communication is possible and will be achieved in the near future.”

    The concept of quantum communications is considered the gold standard for security, in part because any compromising surveillance leaves its imprint on the transmission. Conventional encrypted messages require secret keys to decrypt, but those keys are vulnerable to eavesdropping as they are sent out into the ether. In quantum communications, however, these keys can be encoded in various quantum states of entangled photons—such as their polarization—and these states will be unavoidably altered if a message is intercepted by eavesdroppers. Ground-based quantum communications typically send entangled photon pairs via fiber-optic cables or open air. But collisions with ordinary atoms along the way disrupt the photons’ delicate quantum states, limiting transmission distances to a few hundred kilometers. Sophisticated devices called “quantum repeaters”—equipped with “quantum memory” modules—could in principle be daisy-chained together to receive, store and retransmit the quantum keys across longer distances, but this task is so complex and difficult that such systems remain largely theoretical.

    “A quantum repeater has to receive photons from two different places, then store them in quantum memory, then interfere them directly with each other” before sending further signals along a network, says Paul Kwiat, a physicist at the University of Illinois in Urbana–Champaign who is unaffiliated with the Chinese team. “But in order to do all that, you have to know you’ve stored them without actually measuring them.” The situation, Kwiat says, is a bit like knowing what you have received in the mail without looking in your mailbox or opening the package inside. “You can shake the package—but that’s difficult to do if what you’re receiving is just photons. You want to make sure you’ve received them but you don’t want to absorb them. In principle it’s possible—no question—but it’s very hard to do.”

    To form a globe-girdling secure quantum communications network, then, the only available solution is to beam quantum keys through the vacuum of space, then distribute them across tens to hundreds of kilometers using ground-based nodes. Launched into low Earth orbit in 2016 and named after an ancient Chinese philosopher, the 600-kilogram “Micius” satellite is China’s premier effort to do just that, and is only the first of a fleet the nation plans as part of its $100-million Quantum Experiments at Space Scale (QUESS) program.

    Micius carries in its heart an assemblage of crystals and lasers that generates entangled photon pairs then splits and transmits them on separate beams to ground stations in its line-of-sight on Earth. For the latest test, the three receiving stations were located in the cities of Delingha and Ürümqi—both on the Tibetan Plateau—as well as in the city of Lijiang in China’s far southwest. At 1,203 kilometers, the geographical distance between Delingha and Lijiang is the record-setting stretch over which the entangled photon pairs were transmitted.

    For now the system remains mostly a proof of concept, because the current reported data transmission rate between Micius and its receiving stations is too low to sustain practical quantum communications. Of the roughly six million entangled pairs that Micius’s crystalline core produced during each second of transmission, only about one pair per second reached the ground-based detectors after the beams weakened as they passed through Earth’s atmosphere and each receiving station’s light-gathering telescopes. Team leader Jian-Wei Pan—a physicist at the University of Science and Technology of China in Hefei who has pushed and planned for the experiment since 2003—compares the feat with detecting a single photon from a lone match struck by someone standing on the moon. Even so, he says, Micius’s transmission of entangled photon pairs is “a trillion times more efficient than using the best telecommunication fibers. … We have done something that was absolutely impossible without the satellite.” Within the next five years, Pan says, QUESS will launch more practical quantum communications satellites.
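    The article’s numbers imply a staggering end-to-end channel loss, which is worth making explicit: going from roughly six million entangled pairs generated per second to about one detected pair per second corresponds to nearly 68 dB of loss.

    ```python
    import math

    # Rough link budget implied by the reported figures:
    # ~6 million entangled pairs generated per second on the satellite,
    # ~1 pair per second detected at the ground stations.
    pairs_generated_per_s = 6e6
    pairs_detected_per_s = 1.0

    loss_factor = pairs_generated_per_s / pairs_detected_per_s
    loss_db = 10 * math.log10(loss_factor)
    print(f"end-to-end loss ≈ {loss_db:.0f} dB")  # ≈ 68 dB
    ```

    For comparison, losses of that magnitude over comparable distances in optical fiber would leave essentially nothing detectable, which is the sense in which Pan calls the satellite link “a trillion times more efficient” than the best telecommunication fibers.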

    Although Pan and his team plan for Micius and its nascent network of sister satellites to eventually distribute quantum keys, their initial demonstration instead aimed to achieve a simpler task: proving Einstein wrong.

    Einstein famously derided as “spooky action at a distance” one of the most bizarre elements of quantum theory—the way that measuring one member of an entangled pair of particles seems to instantaneously change the state of its counterpart, even if that counterpart particle is on the other side of the galaxy. This was abhorrent to Einstein, because it suggests information might be transmitted between the particles faster than light, breaking the universal speed limit set by his theory of special relativity. Instead, he and others posited, perhaps the entangled particles somehow shared “hidden variables” that are inaccessible to experiment but would determine the particles’ subsequent behavior when measured. In 1964 the physicist John Bell devised a way to test Einstein’s idea, calculating a limit that physicists could statistically measure for how much hidden variables could possibly correlate with the behavior of entangled particles. If experiments showed this limit to be exceeded, then Einstein’s idea of hidden variables would be incorrect.

    Ever since the 1970s “Bell tests” by physicists across ever-larger swaths of spacetime have shown that Einstein was indeed mistaken, and that entangled particles do in fact surpass Bell’s strict limits. The most definitive test arguably occurred in the Netherlands in 2015, when a team at Delft University of Technology closed several potential “loopholes” that had plagued past experiments and offered slim-but-significant opportunities for the influence of hidden variables to slip through. That test, though, involved separating entangled particles by scarcely more than a kilometer. With Micius’s transmission of entangled photons between widely separated ground stations, Pan’s team has now performed a Bell test at distances a thousand times greater. Just as before, their results confirm that Einstein was wrong. The quantum realm remains a spooky place—although no one yet understands why.

    “Of course, no one who accepts quantum mechanics could possibly doubt that entanglement can be created over that distance—or over any distance—but it’s still nice to see it made concrete,” says Scott Aaronson, a physicist at The University of Texas at Austin. “Nothing we knew suggested this goal was unachievable. The significance of this news is not that it was unexpected or that it overturns anything previously believed, but simply that it’s a satisfying culmination of years of hard work.”

    That work largely began in the 1990s when Pan, leader of the Chinese team, was a graduate student in the lab of the physicist Anton Zeilinger at the University of Innsbruck in Austria. Zeilinger was Pan’s PhD adviser, and they collaborated closely to test and further develop ideas for quantum communication. Pan returned to China to start his own lab in 2001, and Zeilinger started one as well at the Austrian Academy of Sciences in Vienna. For the next seven years they would compete fiercely to break records for transmitting entangled photon pairs across ever-wider gaps, and in ever-more extreme conditions, in ground-based experiments. All the while each man lobbied his respective nation’s space agency to green-light a satellite that could be used to test the technique from space. But Zeilinger’s proposals perished in a bureaucratic swamp at the European Space Agency whereas Pan’s were quickly embraced by the China National Space Administration. Ultimately, Zeilinger chose to collaborate again with his old pupil rather than compete against him; today the Austrian Academy of Sciences is a partner in QUESS, and the project has plans to use Micius to perform an intercontinental quantum key distribution experiment between ground stations in Vienna and Beijing.

    “I am happy that the Micius works so well,” Zeilinger says. “But one has to realize that it is a missed opportunity for Europe and others, too.”

    For years now, other researchers and institutions have been scrambling to catch up, pushing governments for more funding for further experiments on the ground and in space—and many of them see Micius’s success as the catalytic event they have been waiting for. “This is a major milestone, because if we are ever to have a quantum internet in the future, we will need to send entanglement over these sorts of long distances,” says Thomas Jennewein, a physicist at the University of Waterloo in Canada who was not involved with the study. “This research is groundbreaking for all of us in the community—everyone can point to it and say, ‘see, it does work!’”

    Jennewein and his collaborators are pursuing a space-based approach from the ground up, partnering with the Canadian Space Agency to plan a smaller, simpler satellite that could launch as soon as five years from now to act as a “universal receiver” and redistribute entangled photons beamed up from ground stations. At the National University of Singapore, an international collaboration led by the physicist Alexander Ling has already launched cheap shoe box–size CubeSats to create, study and perhaps even transmit photon pairs that are “correlated”—a situation just shy of full entanglement. And in the U.S., Kwiat at the University of Illinois is using NASA funding to develop a device that could someday test quantum communications using “hyperentanglement” (the simultaneous entanglement of photon pairs in multiple ways) onboard the International Space Station.

    Perhaps most significantly, a team led by Gerd Leuchs and Christoph Marquardt at the Max Planck Institute for the Science of Light in Germany is developing quantum communications protocols for commercially available laser systems already in space onboard the European Copernicus and SpaceDataHighway satellites. Using one of these systems, the team successfully encoded and sent simple quantum states to ground stations using photons beamed from a satellite in geostationary orbit, some 38,000 kilometers above Earth. This approach, Marquardt explains, does not rely on entanglement and is very different from that of QUESS—but it could, with minimal upgrades, nonetheless be used to distribute quantum keys for secure communications in as little as five years. Their results appear in Optica.

    “Our purpose is really to find a shortcut into making things like quantum key distribution with satellites economically viable and employable, pretty fast and soon,” Marquardt says. “[Engineers] invested 20 years of hard work making these systems, so it’s easier to upgrade them than to design everything from scratch. … It is a very good advantage if you can rely on something that is already qualified in space, because space qualification is very complicated. It usually takes five to 10 years just to develop that.”

    Marquardt and others suspect, however, that this field could be much further advanced than has been publicly acknowledged, with developments possibly hidden behind veils of official secrecy in the U.S. and elsewhere. It may be that the era of quantum communication is already upon us. “Some colleague of mine made the joke, ‘the silence of the U.S. is very loud,’” Marquardt says. “They had some very good groups concerning free-space satellites and quantum key distribution at Los Alamos [National Laboratory] and other places, and suddenly they stopped publishing. So we always say there are two reasons that they stopped publishing: either it didn’t work, or it worked really well!”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
  • richardmitnick 11:07 am on June 11, 2017 Permalink | Reply
    Tags: Bell test, Cosmic Bell test, Experiment Reaffirms Quantum Weirdness, John Bell, Quantum Mechanics, Superdeterminism

    From Quanta: “Experiment Reaffirms Quantum Weirdness” 

    Quanta Magazine

    February 7, 2017 [I wonder where this was hiding. It just appeared today in social media.]
    Natalie Wolchover

    Physicists are closing the door on an intriguing loophole around the quantum phenomenon Einstein called “spooky action at a distance.”

    Credit: Olena Shmahalo/Quanta Magazine

    There might be no getting around what Albert Einstein called “spooky action at a distance.” With an experiment described today in Physical Review Letters — a feat that involved harnessing starlight to control measurements of particles shot between buildings in Vienna — some of the world’s leading cosmologists and quantum physicists are closing the door on an intriguing alternative to “quantum entanglement.”

    “Technically, this experiment is truly impressive,” said Nicolas Gisin, a quantum physicist at the University of Geneva who has studied this loophole around entanglement.


    According to standard quantum theory, particles have no definite states, only relative probabilities of being one thing or another — at least, until they are measured, when they seem to suddenly roll the dice and jump into formation. Stranger still, when two particles interact, they can become “entangled,” shedding their individual probabilities and becoming components of a more complicated probability function that describes both particles together. This function might specify that two entangled photons are polarized in perpendicular directions, with some probability that photon A is vertically polarized and photon B is horizontally polarized, and some chance of the opposite. The two photons can travel light-years apart, but they remain linked: Measure photon A to be vertically polarized, and photon B instantaneously becomes horizontally polarized, even though B’s state was unspecified a moment earlier and no signal has had time to travel between them. This is the “spooky action” that Einstein was famously skeptical about in his arguments against the completeness of quantum mechanics in the 1930s and ’40s.

    In 1964, the Northern Irish physicist John Bell found a way to put this paradoxical notion to the test. He showed that if particles have definite states even when no one is looking (a concept known as “realism”) and if indeed no signal travels faster than light (“locality”), then there is an upper limit to the amount of correlation that can be observed between the measured states of two particles. But experiments have shown time and again that entangled particles are more correlated than Bell’s upper limit, favoring the radical quantum worldview over local realism.
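    Bell’s limit is easiest to see in its CHSH form, which the experiments described here actually test. For two particles in the singlet state, quantum mechanics predicts a correlation E(a, b) = −cos(a − b) between analyzers at angles a and b; local realism bounds the combination |S| below by 2, while the quantum prediction reaches 2√2 ≈ 2.83. A few lines suffice to check the standard choice of angles:

    ```python
    import numpy as np

    # Quantum prediction for the singlet-state correlation between
    # analyzers at angles a and b.
    def E(a, b):
        return -np.cos(a - b)

    a, a2 = 0.0, np.pi / 2             # Alice's two analyzer settings
    b, b2 = np.pi / 4, 3 * np.pi / 4   # Bob's two settings

    # CHSH combination: local realism requires |S| <= 2.
    S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
    print(abs(S))  # 2*sqrt(2) ≈ 2.828 -- exceeds Bell's limit of 2
    ```

    Experiments repeatedly measure values close to 2√2, which is what “entangled particles are more correlated than Bell’s upper limit” means quantitatively.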

    Only there’s a hitch: In addition to locality and realism, Bell made another, subtle assumption to derive his formula — one that went largely ignored for decades. “The three assumptions that go into Bell’s theorem that are relevant are locality, realism and freedom,” said Andrew Friedman of the Massachusetts Institute of Technology, a co-author of the new paper. “Recently it’s been discovered that you can keep locality and realism by giving up just a little bit of freedom.” This is known as the “freedom-of-choice” loophole.

    In a Bell test, entangled photons A and B are separated and sent to far-apart optical modulators — devices that either block photons or let them through to detectors, depending on whether the modulators are aligned with or against the photons’ polarization directions. Bell’s inequality puts an upper limit on how often, in a local-realistic universe, photons A and B will both pass through their modulators and be detected. (Researchers find that entangled photons are correlated more often than this, violating the limit.) Crucially, Bell’s formula assumes that the two modulators’ settings are independent of the states of the particles being tested. In experiments, researchers typically use random-number generators to set the devices’ angles of orientation. However, if the modulators are not actually independent — if nature somehow restricts the possible settings that can be chosen, correlating these settings with the states of the particles in the moments before an experiment occurs — this reduced freedom could explain the outcomes that are normally attributed to quantum entanglement.

    The universe might be like a restaurant with 10 menu items, Friedman said. “You think you can order any of the 10, but then they tell you, ‘We’re out of chicken,’ and it turns out only five of the things are really on the menu. You still have the freedom to choose from the remaining five, but you were overcounting your degrees of freedom.” Similarly, he said, “there might be unknowns, constraints, boundary conditions, conservation laws that could end up limiting your choices in a very subtle way” when setting up an experiment, leading to seeming violations of local realism.

    This possible loophole gained traction in 2010, when Michael Hall, now of Griffith University in Australia, developed a quantitative way of reducing freedom of choice [Phys.Rev.Lett.]. In Bell tests, measuring devices have two possible settings (corresponding to one bit of information: either 1 or 0), and so it takes two bits of information to specify their settings when they are truly independent. But Hall showed that if the settings are not quite independent — if only one bit specifies them once in every 22 runs — this halves the number of possible measurement settings available in those 22 runs. This reduced freedom of choice correlates measurement outcomes enough to exceed Bell’s limit, creating the illusion of quantum entanglement.

    The idea that nature might restrict freedom while maintaining local realism has become more attractive in light of emerging connections between information and the geometry of space-time. Research on black holes, for instance, suggests that the stronger the gravity in a volume of space-time, the fewer bits can be stored in that region. Could gravity be reducing the number of possible measurement settings in Bell tests, secretly striking items from the universe’s menu?

    Members of the cosmic Bell test team calibrating the telescope used to choose the settings of one of their two detectors located in far-apart buildings in Vienna. Credit: Jason Gallicchio.

    Friedman, Alan Guth and colleagues at MIT were entertaining such speculations a few years ago when Anton Zeilinger, a famous Bell test experimenter at the University of Vienna, came for a visit.

    Alan Guth, Highland Park High School and M.I.T., who first proposed cosmic inflation.

    Lambda-Cold Dark Matter, Accelerated Expansion of the Universe, Big Bang-Inflation (timeline of the universe). Date: 2010. Credit: Alex Mittelmann, Coldcreation.

    Alan Guth’s notes. http://www.bestchinanews.com/Explore/4730.html

    Zeilinger also had his sights on the freedom-of-choice loophole. Together, they and their collaborators developed an idea for how to distinguish between a universe that lacks local realism and one that curbs freedom.

    In the first of a planned series of “cosmic Bell test” experiments, the team sent pairs of photons from the roof of Zeilinger’s lab in Vienna through the open windows of two other buildings and into optical modulators, tallying coincident detections as usual. But this time, they attempted to lower the chance that the modulator settings might somehow become correlated with the states of the photons in the moments before each measurement. They pointed a telescope out of each window, trained each telescope on a bright and conveniently located (but otherwise random) star, and, before each measurement, used the color of an incoming photon from each star to set the angle of the associated modulator. The colors of these photons were decided hundreds of years ago, when they left their stars, increasing the chance that they (and therefore the measurement settings) were independent of the states of the photons being measured.

    And yet, the scientists found that the measurement outcomes still violated Bell’s upper limit, boosting their confidence that the polarized photons in the experiment exhibit spooky action at a distance after all.

    Nature could still exploit the freedom-of-choice loophole, but the universe would have had to delete items from the menu of possible measurement settings at least 600 years before the measurements occurred (when the closer of the two stars sent its light toward Earth). “Now one needs the correlations to have been established even before Shakespeare wrote, ‘Until I know this sure uncertainty, I’ll entertain the offered fallacy,’” Hall said.

    Next, the team plans to use light from increasingly distant quasars to control their measurement settings, probing further back in time and giving the universe an even smaller window to cook up correlations between future device settings and restrict freedoms. It’s also possible (though extremely unlikely) that the team will find a transition point where measurement settings become uncorrelated and violations of Bell’s limit disappear — which would prove that Einstein was right to doubt spooky action.

    “For us it seems like kind of a win-win,” Friedman said. “Either we close the loophole more and more, and we’re more confident in quantum theory, or we see something that could point toward new physics.”

    There’s a final possibility that many physicists abhor. It could be that the universe restricted freedom of choice from the very beginning — that every measurement was predetermined by correlations established at the Big Bang. “Superdeterminism,” as this is called, is “unknowable,” said Jan-Åke Larsson, a physicist at Linköping University in Sweden; the cosmic Bell test crew will never be able to rule out correlations that existed before there were stars, quasars or any other light in the sky. That means the freedom-of-choice loophole can never be completely shut.

    But given the choice between quantum entanglement and superdeterminism, most scientists favor entanglement — and with it, freedom. “If the correlations are indeed set [at the Big Bang], everything is preordained,” Larsson said. “I find it a boring worldview. I cannot believe this would be true.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Formerly known as Simons Science News, Quanta Magazine is an editorially independent online publication launched by the Simons Foundation to enhance public understanding of science. Why Quanta? Albert Einstein called photons “quanta of light.” Our goal is to “illuminate science.” At Quanta Magazine, scientific accuracy is every bit as important as telling a good story. All of our articles are meticulously researched, reported, edited, copy-edited and fact-checked.

     
  • richardmitnick 11:52 am on June 8, 2017 Permalink | Reply
    Tags: A computer program called ParFit, D.O.E Ames Lab, Quantum Mechanics, The Critical Materials Institute designs rare-earth extractants with the help of new software

    From D.O.E. Ames Lab: “The Critical Materials Institute designs rare-earth extractants with the help of new software” 

    Ames Laboratory

    June 8, 2017
    Contacts:
    Federico Zahariev
    fzahari@iastate.edu
    Critical Materials Institute
    (515) 708-6827

    Nuwan De Silva
    ndesilva@iastate.edu
    Critical Materials Institute
    (515) 294-7568

    Marilu Dick-Perez
    marilu@iastate.edu
    Critical Materials Institute
    (515) 294-6134

    Mark Gordon
    mark@si.msg.chem.iastate.edu
    Critical Materials Institute
    (515) 294-0452

    Theresa Windus
    twindus@iastate.edu
    Critical Materials Institute
    (515) 294-6134


    The U.S. Department of Energy’s Critical Materials Institute has developed a computer program, called ParFit, that can vastly reduce the amount of time spent identifying promising chemical compounds used in rare-earth processing methods.

    Testing and developing more efficient and environmentally friendly ways of extracting rare-earth metals as speedily as possible is a primary goal of CMI. Rare-earth metals are vital to many modern energy technologies, but high commercial demand and mining challenges have made optimizing our country’s production and use of them of vital importance.

    “Traditional, quantum mechanical methods of predicting the molecular design and behavior of these extractants are too computationally expensive, and take too long for the timescale needed,” said software designer and CMI scientist Federico Zahariev. “So we developed a program that could create a simpler classical mechanical model which would still reflect the accuracy of the quantum mechanical model.”

    ParFit uses traditional and advanced methods to train the classical mechanical model to fit quantum mechanical information from a training set. These classical models can then be used to predict the shape of new extractants and how they bind to metals.

    “Roughly speaking, think of the molecule’s shape and structure as a system of springs, where there might need to be a lot of small tightening or loosening of different connections to make it work correctly,” said CMI Scientist Theresa Windus. “It’s the same way in which we apply the quantum mechanical calculations to create these classical mechanical models—it’s a tedious, error-prone, and lengthy process. ParFit makes this as quick as possible, automates the fitting of those parameters, and accurately reproduces the quantum mechanical energies.”
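
Windus’s spring analogy can be made concrete. The sketch below is a hypothetical example, not ParFit’s actual code: it fits the force constant k and equilibrium length r0 of a single harmonic “spring” to synthetic ab-initio-style bond-stretch energies (the bond lengths, energies, and parameter values are invented for illustration). ParFit automates this kind of fit across many parameters at once:

```python
import numpy as np

# Hypothetical "ab initio" bond-stretch energies (kcal/mol) for a
# metal-extractant bond, sampled at several bond lengths (angstroms).
# In ParFit's real workflow these numbers come from quantum chemistry.
r = np.array([1.8, 1.9, 2.0, 2.1, 2.2, 2.3])
k_true, r0_true = 350.0, 2.05                            # parameters to recover
e = 0.5 * k_true * (r - r0_true) ** 2
e += np.random.default_rng(1).normal(0.0, 0.02, r.size)  # numerical noise

# A harmonic spring E(r) = 0.5*k*(r - r0)^2 is quadratic in r, so an
# ordinary least-squares polynomial fit recovers both parameters.
a, b, c = np.polyfit(r, e, 2)
k_fit = 2.0 * a
r0_fit = -b / (2.0 * a)

print(f"k  = {k_fit:.1f} kcal/mol/A^2  (true {k_true})")
print(f"r0 = {r0_fit:.3f} A  (true {r0_true})")
```

Real force fields have many coupled bond, angle, and torsion terms, which is why automating and cross-checking the fit matters.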

    “The program’s capabilities enable the researchers to model an almost unlimited number of new extractants,” said software developer and CMI Scientist Marilu Dick-Perez. For example, the classical models feed into the software code HostDesigner, developed by Benjamin Hay of the Supramolecular Design Institute, which creates and quickly assesses possible extractants for viability and targets those best suited for further research. “We’ve reduced the computational work from 2-3 years down to three months,” she said. “We’ve incorporated as much expert knowledge into this program as possible, so that even a novice user can navigate the program.”

    The software’s capabilities are discussed further in the paper ParFit: A Python-Based Object-Oriented Program for Fitting Molecular Mechanics Parameters to ab Initio Data, authored by Federico Zahariev, Nuwan De Silva, Mark S. Gordon, Theresa L. Windus, and Marilu Dick-Perez, and published in the Journal of Chemical Information and Modeling.

    The Critical Materials Institute is a Department of Energy Innovation Hub led by the U.S. Department of Energy’s Ames Laboratory and supported by DOE’s Office of Energy Efficiency and Renewable Energy’s Advanced Manufacturing Office. CMI seeks ways to eliminate and reduce reliance on rare-earth metals and other materials critical to the success of clean energy technologies.

    See the full article here.


    Ames Laboratory is a government-owned, contractor-operated research facility of the U.S. Department of Energy that is run by Iowa State University.

    For more than 60 years, the Ames Laboratory has sought solutions to energy-related problems through the exploration of chemical, engineering, materials, mathematical and physical sciences. Established in the 1940s with the successful development of the most efficient process to produce high-quality uranium metal for atomic energy, the Lab now pursues a broad range of scientific priorities.

    Ames Laboratory is a U.S. Department of Energy Office of Science national laboratory operated by Iowa State University. Ames Laboratory creates innovative materials, technologies and energy solutions. We use our expertise, unique capabilities and interdisciplinary collaborations to solve global problems.

    Ames Laboratory is supported by the Office of Science of the U.S. Department of Energy. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.

     
  • richardmitnick 9:56 am on June 8, 2017 Permalink | Reply
    Tags: Nautilus, Quantum Mechanics, Sean Carroll at Caltech, Will Quantum Mechanics Swallow Relativity

    From Nautilus: “Will Quantum Mechanics Swallow Relativity?” 

    Nautilus

    June 8, 2017
    By Corey S. Powell
    Illustration by Nicholas Garber

    The contest between gravity and quantum physics takes a new turn.

    It is the biggest of problems, it is the smallest of problems.

    At present physicists have two separate rulebooks explaining how nature works. There is general relativity, which beautifully accounts for gravity and all of the things it dominates: orbiting planets, colliding galaxies, the dynamics of the expanding universe as a whole. That’s big. Then there is quantum mechanics, which handles the other three forces—electromagnetism and the two nuclear forces. Quantum theory is extremely adept at describing what happens when a uranium atom decays, or when individual particles of light hit a solar cell. That’s small.

    Now for the problem: Relativity and quantum mechanics are fundamentally different theories that have different formulations. It is not just a matter of scientific terminology; it is a clash of genuinely incompatible descriptions of reality.

    The conflict between the two halves of physics has been brewing for more than a century—sparked by a pair of 1905 papers by Einstein, one outlining relativity and the other introducing the quantum—but recently it has entered an intriguing, unpredictable new phase. Two notable physicists have staked out extreme positions in their camps, conducting experiments that could finally settle which approach is paramount.

    Basically you can think of the division between the relativity and quantum systems as “smooth” versus “chunky.” In general relativity, events are continuous and deterministic, meaning that every cause matches up to a specific, local effect. In quantum mechanics, events produced by the interaction of subatomic particles happen in jumps (yes, quantum leaps), with probabilistic rather than definite outcomes. Quantum rules allow connections forbidden by classical physics, as a much-discussed recent experiment demonstrated: Dutch researchers showed that two particles—in this case, electrons—could influence each other instantly, even though they were a mile apart, defying any local explanation. When you try to interpret smooth relativistic laws in a chunky quantum style, or vice versa, things go dreadfully wrong.

    Relativity gives nonsensical answers when you try to scale it down to quantum size, eventually descending to infinite values in its description of gravity. Likewise, quantum mechanics runs into serious trouble when you blow it up to cosmic dimensions. Quantum fields carry a certain amount of energy, even in seemingly empty space, and the amount of energy gets bigger as the fields get bigger. According to Einstein, energy and mass are equivalent (that’s the message of E = mc²), so piling up energy is exactly like piling up mass. Go big enough, and the amount of energy in the quantum fields becomes so great that it creates a black hole that causes the universe to fold in on itself. Oops.

    Craig Hogan, a theoretical astrophysicist at the University of Chicago and the director of the Center for Particle Astrophysics at Fermilab, is reinterpreting the quantum side with a novel theory in which the quantum units of space itself might be large enough to be studied directly. Meanwhile, Lee Smolin, a founding member of the Perimeter Institute for Theoretical Physics in Waterloo, Canada, is seeking to push physics forward by returning back to Einstein’s philosophical roots and extending them in an exciting direction.

    To understand what is at stake, look back at the precedents. When Einstein unveiled general relativity, he not only superseded Isaac Newton’s theory of gravity; he also unleashed a new way of looking at physics that led to the modern conception of the Big Bang and black holes, not to mention atomic bombs and the time adjustments essential to your phone’s GPS. Likewise, quantum mechanics did much more than reformulate James Clerk Maxwell’s textbook equations of electricity, magnetism, and light. It provided the conceptual tools for the Large Hadron Collider, solar cells, and all of modern microelectronics.

    What emerges from the dustup could be nothing less than a third revolution in modern physics, with staggering implications. It could tell us where the laws of nature came from, and whether the cosmos is built on uncertainty or whether it is fundamentally deterministic, with every event linked definitively to a cause.

    THE MAN WITH THE HOLOMETER: Craig Hogan, a theoretical astrophysicist at Fermilab, has built a device to measure what he sees as the exceedingly fine graininess of space. “I’m hoping for an experimental result that forces people to focus the theoretical thinking in a different direction,” Hogan says. Credit: The Department of Astronomy and Astrophysics, the University of Chicago

    A Chunky Cosmos

    Hogan, champion of the quantum view, is what you might call a lamp-post physicist: Rather than groping about in the dark, he prefers to focus his efforts where the light is bright, because that’s where you are most likely to be able to see something interesting. That’s the guiding principle behind his current research. The clash between relativity and quantum mechanics happens when you try to analyze what gravity is doing over extremely short distances, he notes, so he has decided to get a really good look at what is happening right there. “I’m betting there’s an experiment we can do that might be able to see something about what’s going on, about that interface that we still don’t understand,” he says.

    A basic assumption in Einstein’s physics—an assumption going all the way back to Aristotle, really—is that space is continuous and infinitely divisible, so that any distance could be chopped up into even smaller distances. But Hogan questions whether that is really true. Just as a pixel is the smallest unit of an image on your screen and a photon is the smallest unit of light, he argues, so there might be an unbreakable smallest unit of distance: a quantum of space.

    In Hogan’s scenario, it would be meaningless to ask how gravity behaves at distances smaller than a single chunk of space. There would be no way for gravity to function at the smallest scales because no such scale would exist. Or put another way, general relativity would be forced to make peace with quantum physics, because the space in which physicists measure the effects of relativity would itself be divided into unbreakable quantum units. The theater of reality in which gravity acts would take place on a quantum stage.

    Hogan acknowledges that his concept sounds a bit odd, even to a lot of his colleagues on the quantum side of things. Since the late 1960s, a group of physicists and mathematicians have been developing a framework called string theory to help reconcile general relativity with quantum mechanics; over the years, it has evolved into the default mainstream theory, even as it has failed to deliver on much of its early promise. Like the chunky-space solution, string theory assumes a fundamental structure to space, but from there the two diverge. String theory posits that every object in the universe consists of vibrating strings of energy. Like chunky space, string theory averts gravitational catastrophe by introducing a finite, smallest scale to the universe, although the unit strings are drastically smaller even than the spatial structures Hogan is trying to find.

    Chunky space does not neatly align with the ideas in string theory—or in any other proposed physics model, for that matter. “It’s a new idea. It’s not in the textbooks; it’s not a prediction of any standard theory,” Hogan says, sounding not the least bit concerned. “But there isn’t any standard theory, right?”

    If he is right about the chunkiness of space, that would knock out a lot of the current formulations of string theory and inspire a fresh approach to reformulating general relativity in quantum terms. It would suggest new ways to understand the inherent nature of space and time. And weirdest of all, perhaps, it would bolster an au courant notion that our seemingly three-dimensional reality is composed of more basic, two-dimensional units. Hogan takes the “pixel” metaphor seriously: Just as a TV picture can create the impression of depth from a bunch of flat pixels, he suggests, so space itself might emerge from a collection of elements that act as if they inhabit only two dimensions.

    Like many ideas from the far edge of today’s theoretical physics, Hogan’s speculations can sound suspiciously like late-night philosophizing in the freshman dorm. What makes them drastically different is that he plans to put them to a hard experimental test. As in, right now.

    Starting in 2007, Hogan began thinking about how to build a device that could measure the exceedingly fine graininess of space. As it turns out, his colleagues had plenty of ideas about how to do that, drawing on technology developed to search for gravitational waves. Within two years Hogan had put together a proposal and was working with collaborators at Fermilab, the University of Chicago, and other institutions to build a chunk-detecting machine, which he more elegantly calls a “holometer.” (The name is an esoteric pun, referencing both a 17th-century surveying instrument and the theory that 2-D space could appear three-dimensional, analogous to a hologram.)

    Beneath its layers of conceptual complexity, the holometer is technologically little more than a laser beam, a half-reflective mirror to split the laser into two perpendicular beams, and two other mirrors to bounce those beams back along a pair of 40-meter-long tunnels. The beams are calibrated to register the precise locations of the mirrors. If space is chunky, the locations of the mirrors would constantly wander about (strictly speaking, space itself is doing the wandering), creating a constant, random variation in their separation. When the two beams are recombined, they’d be slightly out of sync, and the amount of the discrepancy would reveal the scale of the chunks of space.

    For the scale of chunkiness that Hogan hopes to find, he needs to measure distances to an accuracy of 10⁻¹⁸ meters, about 100 million times smaller than a hydrogen atom, and collect data at a rate of about 100 million readings per second. Amazingly, such an experiment is not only possible, but practical. “We were able to do it pretty cheaply because of advances in photonics, a lot of off-the-shelf parts, fast electronics, and things like that,” Hogan says. “It’s a pretty speculative experiment, so you wouldn’t have done it unless it was cheap.” The holometer is currently humming away, collecting data at the target accuracy; he expects to have preliminary readings by the end of the year.
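
Those numbers are worth a quick sanity check. One heuristic often quoted for Hogan-style holographic noise (an assumption here, not a figure given in the article) puts the expected positional jitter at the geometric mean of the Planck length and the size of the instrument:

```python
import math

L_PLANCK = 1.616e-35   # Planck length, meters
ARM = 40.0             # length of each holometer arm, meters

# Heuristic jitter scale for holographic noise: the geometric mean of
# the Planck length and the distance light travels in the instrument.
jitter = math.sqrt(L_PLANCK * ARM)
print(f"expected jitter ~ {jitter:.1e} m")   # ~2.5e-17 m

# The article's comparison: a target accuracy of 1e-18 m is about
# 100 million times smaller than a hydrogen atom (~1e-10 m across).
atom, target = 1e-10, 1e-18
ratio = atom / target
print(f"{ratio:.0e}")
```

The heuristic lands within an order of magnitude of the quoted 10⁻¹⁸-meter target, which is why a 40-meter tabletop-scale instrument can plausibly probe Planck-scale structure at all.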

    Hogan has his share of fierce skeptics, including many within the theoretical physics community. The reason for the disagreement is easy to appreciate: A success for the holometer would mean failure for a lot of the work being done in string theory. Despite this superficial sparring, though, Hogan and most of his theorist colleagues share a deep core conviction: They broadly agree that general relativity will ultimately prove subordinate to quantum mechanics. The other three laws of physics follow quantum rules, so it makes sense that gravity must as well.

    For most of today’s theorists, though, belief in the primacy of quantum mechanics runs deeper still. At a philosophical—epistemological—level, they regard the large-scale reality of classical physics as a kind of illusion, an approximation that emerges from the more “true” aspects of the quantum world operating at an extremely small scale. Chunky space certainly aligns with that worldview.

    Hogan likens his project to the landmark Michelson-Morley experiment of the 19th century, which searched for the aether—the hypothetical substance of space that, according to the leading theory of the time, transmitted light waves through a vacuum. The experiment found nothing; that perplexing null result helped inspire Einstein’s special theory of relativity, which in turn spawned the general theory of relativity and eventually turned the entire world of physics upside down. Adding to the historical connection, the Michelson-Morley experiment also measured the structure of space using mirrors and a split beam of light, following a setup remarkably similar to Hogan’s.

    “We’re doing the holometer in that kind of spirit. If we don’t see something or we do see something, either way it’s interesting. The reason to do the experiment is just to see whether we can find something to guide the theory,” Hogan says. “You find out what your theorist colleagues are made of by how they react to this idea. There’s a world of very mathematical thinking out there. I’m hoping for an experimental result that forces people to focus the theoretical thinking in a different direction.”

    Whether or not he finds his quantum structure of space, Hogan is confident the holometer will help physics address its big-small problem. It will show the right way (or rule out the wrong way) to understand the underlying quantum structure of space and how that affects the relativistic laws of gravity flowing through it.

    _______________________________________________________________________

    The Black Hole Resolution

    Here on Earth, the clash between the top-down and bottom-up views of physics is playing out in academic journals and in a handful of complicated experimental apparatuses. Theorists on both sides concede that neither pure thought nor technologically feasible tests may be enough to break the deadlock, however. Fortunately, there are other places to look for a more definitive resolution. One of the most improbable of these is also one of the most promising—an idea embraced by physicists almost regardless of where they stand ideologically.

    “Black hole physics gives us a clean experimental target to look for,” says Craig Hogan, a theoretical astrophysicist at the University of Chicago and the director of the Center for Particle Astrophysics at Fermilab. “The issues around quantum black holes are important,” agrees Lee Smolin, a founding member of the Perimeter Institute for Theoretical Physics in Waterloo, Canada.

    Black holes? Really? Granted, these objects are more commonly associated with questions than with answers. They are not things you can create in the laboratory, or poke and prod with instruments, or even study up close with a space probe. Nevertheless, they are the only places in the universe where Hogan’s ideas unavoidably smash into Smolin’s and, more importantly, where the whole of quantum physics collides with general relativity in a way that is impossible to ignore.

    At the outer boundary of the black hole—the event horizon—gravity is so extreme that even light cannot escape, making the horizon an acid test of how general relativity behaves. At the event horizon, atomic-scale events become enormously stretched out and slowed down; the horizon also divides the physical world into two distinct zones, inside and outside. And black holes offer a very interesting meeting place in terms of size: a stellar-mass black hole is about the size of Los Angeles, while a black hole with the mass of the Earth would be roughly the size of a marble. Black holes literally bring the big-small problem in physics home to the human scale.
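
Those sizes follow directly from the Schwarzschild radius, r_s = 2GM/c², the event-horizon radius of a non-rotating black hole (the 10-solar-mass example below is a typical stellar-mass value chosen for illustration):

```python
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8          # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg
M_EARTH = 5.972e24   # Earth mass, kg

def schwarzschild_radius(mass_kg):
    """Event-horizon radius r_s = 2GM/c^2 of a non-rotating black hole."""
    return 2.0 * G * mass_kg / C**2

# A 10-solar-mass stellar black hole: diameter ~59 km, roughly the
# span of greater Los Angeles.
print(f"{2 * schwarzschild_radius(10 * M_SUN) / 1e3:.0f} km")

# An Earth-mass black hole: diameter under 2 cm -- marble-sized.
print(f"{2 * schwarzschild_radius(M_EARTH) * 100:.1f} cm")
```

Because r_s scales linearly with mass, squeezing any object below its Schwarzschild radius turns it into a black hole, which is exactly the catastrophe the vacuum-energy argument above runs into at cosmic scales.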

    The importance of black holes for resolving that problem is the reason why Stephen Hawking and his cohorts debate about them so often and so vigorously. It turns out that we don’t actually need to cozy up close to black holes in order to run experiments with them. Quantum theory implies that a single particle could potentially exist both inside and outside the event horizon, which makes no sense. There is also the question of what happens to information about things that fall into a black hole; the information seems to vanish, even though theory says that information cannot be destroyed. Addressing these contradictions is forcing theoretical physicists to grapple more vigorously than ever before with the interplay of quantum mechanics and general relativity.

    Best of all, the answers will not be confined to the world of theory. Astrophysicists have increasingly sophisticated ways to study the region just outside the event horizon by monitoring the hot, brilliant clouds of particles that swirl around some black holes. An even greater breakthrough is just around the corner: the Event Horizon Telescope. This project is in the process of linking together about a dozen radio dishes from around the world, creating an enormous networked telescope so powerful that it will be able to get a clear look at Sagittarius A*, the massive black hole that resides in the center of our galaxy. Soon, possibly by 2020, the Event Horizon Telescope should deliver its first good portraits. What they show will help constrain the theories of black holes, and so offer telling clues about how to solve the big-small problem.

    Human researchers using football stadium-size radio telescopes, linked together into a planet-size instrument, to study a star-size black hole, to reconcile the subatomic-and-cosmic-level enigma at the heart of physics … if it works, the scale of the achievement will be truly unprecedented.

    Event Horizon Telescope Array

    Event Horizon Telescope map

    The locations of the radio dishes that will be part of the Event Horizon Telescope array. Image credit: Event Horizon Telescope sites, via University of Arizona at https://www.as.arizona.edu/event-horizon-telescope.

    Arizona Radio Observatory/Submillimeter-wave Astronomy (ARO/SMT)

    Atacama Pathfinder EXperiment (APEX)

    Combined Array for Research in Millimeter-wave Astronomy (CARMA, no longer in service)

    Atacama Submillimeter Telescope Experiment (ASTE)

    Caltech Submillimeter Observatory (CSO)

    Institut de Radioastronomie Millimetrique (IRAM) 30m

    James Clerk Maxwell Telescope, Mauna Kea, Hawaii, USA

    Large Millimeter Telescope Alfonso Serrano

    Submillimeter Array, Hawaii, SAO

    ESO/NRAO/NAOJ ALMA Array, Chile

    Future Array/Telescopes

    Plateau de Bure interferometer

    South Pole Telescope (SPTPOL)

    _______________________________________________________________________

    THE SYNTHESIZER: Black holes are the only place where the whole of quantum physics collides with general relativity in a way that is impossible to ignore. An artist’s impression shows the surroundings of the supermassive black hole at the heart of the active galaxy in the southern constellation of Centaurus. Observations at the European Southern Observatory in Chile have revealed not only the torus of hot dust around the black hole but also a wind of cool material in the polar regions. Credit: ESO/M. Kornmesser

    A Really, Really Big Show

    If you are looking for a totally different direction, Smolin of the Perimeter Institute is your man. Where Hogan goes gently against the grain, Smolin is a full-on dissenter: “There’s a thing that Richard Feynman told me when I was a graduate student. He said, approximately, ‘If all your colleagues have tried to demonstrate that something’s true and failed, it might be because that thing is not true.’ Well, string theory has been going for 40 or 50 years without definitive progress.”

    And that is just the start of a broader critique. Smolin thinks the small-scale approach to physics is inherently incomplete. Current versions of quantum field theory do a fine job explaining how individual particles or small systems of particles behave, but they fail to take into account what is needed to have a sensible theory of the cosmos as a whole. They don’t explain why reality is like this, and not like something else. In Smolin’s terms, quantum mechanics is merely “a theory of subsystems of the universe.”

    A more fruitful path forward, he suggests, is to consider the universe as a single enormous system, and to build a new kind of theory that can apply to the whole thing. And we already have a theory that provides a framework for that approach: general relativity. Unlike the quantum framework, general relativity allows no place for an outside observer or external clock, because there is no “outside.” Instead, all of reality is described in terms of relationships between objects and between different regions of space. Even something as basic as inertia (the resistance of your car to move until forced to by the engine, and its tendency to keep moving after you take your foot off the accelerator) can be thought of as connected to the gravitational field of every other particle in the universe.

    That last statement is strange enough that it’s worth pausing for a moment to consider it more closely. Take a thought problem closely related to the one that originally led Einstein to this idea in 1907: What if the universe were entirely empty except for two astronauts, one spinning, the other stationary? The spinning one feels dizzy, doing cartwheels in space. But which of the two is spinning? From either astronaut’s perspective, the other is the one spinning. Without any external reference, Einstein argued, there is no way to say which one is correct, and no reason why one should feel an effect different from what the other experiences.

    The distinction between the two astronauts makes sense only when you reintroduce the rest of the universe. In the classic interpretation of general relativity, then, inertia exists only because you can measure it against the entire cosmic gravitational field. What holds true in that thought problem holds true for every object in the real world: The behavior of each part is inextricably related to that of every other part. If you’ve ever felt like you wanted to be a part of something big, well, this is the right kind of physics for you. It is also, Smolin thinks, a promising way to obtain bigger answers about how nature really works, across all scales.

    “General relativity is not a description of subsystems. It is a description of the whole universe as a closed system,” he says. When physicists are trying to resolve the clash between relativity and quantum mechanics, therefore, it seems like a smart strategy for them to follow Einstein’s lead and go as big as they possibly can.

    Smolin is keenly aware that he is pushing against the prevailing devotion to small-scale, quantum-style thinking. “I don’t mean to stir things up, it just kind of happens that way. My role is to think clearly about these difficult issues, put my conclusions out there, and let the dust settle,” he says genially. “I hope people will engage with the arguments, but I really hope that the arguments lead to testable predictions.”

    At first blush, Smolin’s ideas sound like a formidable starting point for concrete experimentation. Much as all of the parts of the universe are linked across space, they may also be linked across time, he suggests. His arguments led him to hypothesize that the laws of physics evolve over the history of the universe. Over the years, he has developed two detailed proposals for how this might happen. His theory of cosmological natural selection, which he hammered out in the 1990s, envisions black holes as cosmic eggs that hatch new universes. More recently, he has developed a provocative hypothesis about the emergence of the laws of quantum mechanics, called the principle of precedence—and this one seems much more readily put to the test.

    Smolin’s principle of precedence arises as an answer to the question of why physical phenomena are reproducible. If you perform an experiment that has been performed before, you expect the outcome will be the same as in the past. (Strike a match and it bursts into flame; strike another match the same way and … you get the idea.) Reproducibility is such a familiar part of life that we typically don’t even think about it. We simply attribute consistent outcomes to the action of a natural “law” that acts the same way at all times. Smolin hypothesizes that those laws actually may emerge over time, as quantum systems copy the behavior of similar systems in the past.

    One possible way to catch emergence in the act is by running an experiment that has never been done before, so there is no past version (that is, no precedent) for it to copy. Such an experiment might involve the creation of a highly complex quantum system, containing many components that exist in a novel entangled state. If the principle of precedence is correct, the initial response of the system will be essentially random. As the experiment is repeated, however, precedence builds up and the response should become predictable … in theory. “A system by which the universe is building up precedent would be hard to distinguish from the noise of experimental practice,” Smolin concedes, “but it’s not impossible.”
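Smolin’s principle of precedence has no published reference implementation, but its flavor can be caricatured with a Pólya-urn-style toy model (an assumption of this sketch, not Smolin’s actual formalism): the first trials of a “new” measurement are uniformly random, and each recorded outcome adds weight to itself, so repetition gradually locks in a predictable response.

```python
import random

def precedence_trial(history, outcomes=("A", "B"), novelty=1.0):
    """One measurement under a toy 'precedence' rule: with no history the
    outcome is uniformly random; each past outcome adds weight to itself."""
    weights = [novelty + history.count(o) for o in outcomes]
    return random.choices(outcomes, weights=weights)[0]

def run_experiment(n_trials, seed=0):
    """Repeat the trial, feeding each outcome back in as precedent."""
    random.seed(seed)
    history = []
    for _ in range(n_trials):
        history.append(precedence_trial(history))
    return history

# Early trials are near-random; later trials increasingly repeat whichever
# outcome happened to build up precedent first (urn-style "lock-in").
h = run_experiment(1000)
print(h[:10], h.count("A") / len(h))
```

As the quote above suggests, distinguishing such slow lock-in from ordinary experimental noise would be the hard part.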

    Although precedence can play out at the atomic scale, its influence would be system-wide, cosmic. It ties back to Smolin’s idea that small-scale, reductionist thinking seems like the wrong way to solve the big puzzles. Getting the two classes of physics theories to work together, though important, is not enough, either. What he wants to know—what we all want to know—is why the universe is the way it is. Why does time move forward and not backward? How did we end up here, with these laws and this universe, not some others?

    The present lack of any meaningful answer to those questions reveals that “there’s something deeply wrong with our understanding of quantum field theory,” Smolin says. Like Hogan, he is less concerned about the outcome of any one experiment than he is with the larger program of seeking fundamental truths. For Smolin, that means being able to tell a complete, coherent story about the universe; it means being able to predict experiments, but also to explain the unique properties that made atoms, planets, rainbows, and people. Here again he draws inspiration from Einstein.

    “The lesson of general relativity, again and again, is the triumph of relationalism,” Smolin says. The most likely way to get the big answers is to engage with the universe as a whole.

    And the Winner Is …

    If you wanted to pick a referee in the big-small debate, you could hardly do better than Sean Carroll, an expert in cosmology, field theory, and gravitational physics at Caltech. He knows his way around relativity, he knows his way around quantum mechanics, and he has a healthy sense of the absurd: He calls his personal blog Preposterous Universe.

    Right off the bat, Carroll awards most of the points to the quantum side. “Most of us in this game believe that quantum mechanics is much more fundamental than general relativity is,” he says. That has been the prevailing view ever since the 1920s, when Einstein tried and repeatedly failed to find flaws in the counterintuitive predictions of quantum theory. The recent Dutch experiment demonstrating an instantaneous quantum connection between two widely separated particles—the kind of event that Einstein derided as “spooky action at a distance”—only underscores the strength of the evidence.

    Taking a larger view, the real issue is not general relativity versus quantum field theory, Carroll explains, but classical dynamics versus quantum dynamics. Relativity, despite its perceived strangeness, is classical in how it regards cause and effect; quantum mechanics most definitely is not. Einstein was optimistic that some deeper discoveries would uncover a classical, deterministic reality hiding beneath quantum mechanics, but no such order has yet been found. The demonstrated reality of spooky action at a distance argues that such order does not exist.

    “If anything, people under-appreciate the extent to which quantum mechanics just completely throws away our notions of space and locality [the notion that a physical event can affect only its immediate surroundings]. Those things simply are not there in quantum mechanics,” Carroll says. They may be large-scale impressions that emerge from very different small-scale phenomena, like Hogan’s argument about 3-D reality emerging from 2-D quantum units of space.

    Despite that seeming endorsement, Carroll regards Hogan’s holometer as a long shot, though he admits it is removed from his area of research. At the other end, he doesn’t think much of Smolin’s efforts to start with space as a fundamental thing; he finds the notion as absurd as arguing that air is more fundamental than atoms. As for what kind of quantum system might take physics to the next level, Carroll remains broadly optimistic about string theory, which he says “seems to be a very natural extension of quantum field theory.” In all these ways, he is true to the mainstream, quantum-based thinking in modern physics.

    Yet Carroll’s ruling, while almost entirely pro-quantum, is not purely an endorsement of small-scale thinking. There are still huge gaps in what quantum theory can explain. “Our inability to figure out the correct version of quantum mechanics is embarrassing,” he says. “And our current way of thinking about quantum mechanics is simply a complete failure when you try to think about cosmology or the whole universe. We don’t even know what time is.” Both Hogan and Smolin endorse this sentiment, although they disagree about what to do in response. Carroll favors a bottom-up explanation in which time emerges from small-scale quantum interactions, but declares himself “entirely agnostic” about Smolin’s competing suggestion that time is more universal and fundamental. In the case of time, then, the jury is still out.

    No matter how the theories shake out, the large scale is inescapably important, because it is the world we inhabit and observe. In essence, the universe as a whole is the answer, and the challenge to physicists is to find ways to make it pop out of their equations. Even if Hogan is right, his space-chunks have to average out to the smooth reality we experience every day. Even if Smolin is wrong, there is an entire cosmos out there with unique properties that need to be explained—something that, for now at least, quantum physics alone cannot do.

    By pushing at the bounds of understanding, Hogan and Smolin are helping the field of physics make that connection. They are nudging it not just toward reconciliation between quantum mechanics and general relativity, but between idea and perception. The next great theory of physics will undoubtedly lead to beautiful new mathematics and unimaginable new technologies. But the best thing it can do is create deeper meaning that connects back to us, the observers, who get to define ourselves as the fundamental scale of the universe.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Welcome to Nautilus. We are delighted you joined us. We are here to tell you about science and its endless connections to our lives. Each month we choose a single topic. And each Thursday we publish a new chapter on that topic online. Each issue combines the sciences, culture and philosophy into a single story told by the world’s leading thinkers and writers. We follow the story wherever it leads us. Read our essays, investigative reports, and blogs. Fiction, too. Take in our games, videos, and graphic stories. Stop in for a minute, or an hour. Nautilus lets science spill over its usual borders. We are science, connected.

     
  • richardmitnick 3:37 pm on June 2, 2017 Permalink | Reply
    Tags: Breaking Newton's Law, Quantum Mechanics, The famous falling apple

    From TUM: “Breaking Newton’s Law” 

    Technische Universität München

    June 1, 2017

    Oscillation instead of free fall – Quantum interference leads to surprising results.

    In the quantum world, our intuition for moving objects is strongly challenged and may sometimes even completely fail. An international team of physicists of the Universities of Innsbruck, Paris-Sud and Harvard as well as the Technical University of Munich (TUM) has found a quantum particle which shows an intriguing oscillatory back-and-forth motion in a one-dimensional atomic gas instead of moving uniformly.

    A quantum particle performing an intriguing oscillatory back-and-forth motion in a one-dimensional atomic gas. (Image: Florian Meinert / Univ. Innsbruck)

    A ripe apple falling from a tree inspired Sir Isaac Newton to formulate a theory that describes the motion of objects subject to a force. Newton’s equations of motion tell us that a moving body keeps moving in a straight line unless a force changes its path.

    The impact of Newton’s laws is ubiquitous in our everyday experience, from a skydiver falling in the Earth’s gravitational field and the inertia one feels in an accelerating airplane to the Earth orbiting the Sun.

    In the quantum world, however, our intuition for the motion of objects is strongly challenged and may sometimes even completely fail. In the current issue of Science an international team of physicists from Innsbruck, Munich, Paris and Cambridge (USA) describes a quantum particle that shows a completely unexpected behavior.

    In a quantum gas the particle does not move like the famous falling apple; it oscillates. At the heart of this surprising behavior is what physicists call ‘quantum interference’: quantum mechanics allows particles to behave like waves, which can add up or cancel each other.
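The add-or-cancel behavior of quantum waves can be sketched numerically. This is a minimal illustration of interference in general, not a model of the Innsbruck experiment: two unit-amplitude waves with relative phase phi superpose to an intensity of 2 + 2·cos(phi), giving 4 for constructive and 0 for destructive interference.

```python
import cmath

def intensity(phi):
    """Intensity of two superposed unit-amplitude waves with relative
    phase phi: |1 + e^{i*phi}|^2 = 2 + 2*cos(phi)."""
    return abs(1 + cmath.exp(1j * phi)) ** 2

print(intensity(0.0))       # in phase: constructive interference, 4.0
print(intensity(cmath.pi))  # out of phase: destructive interference, ~0.0
```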

    Approaching absolute zero temperature

    To observe the quantum particle oscillating back and forth, the team had to cool a gas of cesium atoms to just above absolute zero and confine it to an arrangement of very thin “tubes” created by high-power laser beams. With a special trick, the atoms were made to interact strongly with each other.

    Under such extreme conditions the atoms form a quantum fluid whose motion is restricted to the direction of the tubes. The physicists then accelerated an impurity atom, which is an atom in a different spin state, through the gas. In our everyday world this corresponds to the apple falling from the tree.

    The scientists, however, observed that the quantum wave of the atom was scattered by the other atoms and reflected back again. The result is a striking oscillatory movement. The experiment demonstrates that Newton’s laws cannot be used in the quantum realm.

    Quantum fluids sometimes act like crystals

    The fact that a quantum wave may be reflected into certain directions has been known since the early days of quantum mechanics. For example, electrons reflect off the regular lattice of solid crystals, such as a piece of metal. This effect is termed ‘Bragg scattering’.
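Bragg scattering off a regular lattice is governed by the Bragg condition n·λ = 2d·sin θ. As a hedged sketch (the wavelength and lattice spacing below are illustrative numbers, not values from the article), a small helper can compute the reflection angle for a given order:

```python
import math

def bragg_angle(wavelength_nm, spacing_nm, order=1):
    """Return the Bragg angle theta in degrees satisfying
    n * lambda = 2 * d * sin(theta), or None if no reflection
    of that order exists (sin(theta) would exceed 1)."""
    s = order * wavelength_nm / (2 * spacing_nm)
    if s > 1:
        return None
    return math.degrees(math.asin(s))

# Hypothetical example: 0.154 nm X-rays on a lattice with 0.282 nm spacing.
print(bragg_angle(0.154, 0.282))
```

For spacings much smaller than the wavelength the function returns None, reflecting that no Bragg reflection is possible.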

    However, the surprise in the Innsbruck experiment was that no such crystal was present for the impurity to reflect off. Instead, the gas of atoms itself provided a kind of hidden order in its arrangement, a property that physicists dub ‘correlations’.

    The publication demonstrates how these correlations, in combination with the wave nature of matter, determine the motion of particles in the quantum world and lead to novel and exciting phenomena that run counter to our everyday experience.

    Understanding fundamental processes in electronics components

    “Understanding the oddity of quantum mechanics is also relevant in a broader scope,” says Michael Knap, Professor for Collective Quantum Dynamics at the Technical University of Munich. “It might help to understand and optimize fundamental processes in electronics components, or even transport processes in complex biological systems.”

    The research was funded by the European Research Council (ERC), the Austrian Science Fund (FWF), the National Science Foundation (NSF), the US Air Force Office of Scientific Research (AFOSR), the Alexander von Humboldt Foundation, the Max Planck Institute for Quantum Optics and the TUM Institute for Advanced Study. In addition to the Walter Schottky Institute of the TU Munich, the Center for Quantum Physics of the University of Innsbruck, the Laboratoire de Physique Théorique et Modèles Statistiques of the University of Paris-Sud and the Physics Department of Harvard University (Cambridge, Massachusetts, USA) were involved in the research work.

    See the full article here.


    Technische Universität München Campus

    Technische Universität München (TUM) is one of Europe’s top universities. It is committed to excellence in research and teaching, interdisciplinary education and the active promotion of promising young scientists. The university also forges strong links with companies and scientific institutions across the world. TUM was one of the first universities in Germany to be named a University of Excellence. Moreover, TUM regularly ranks among the best European universities in international rankings.

     