Tagged: QCD: Quantum Chromodynamics

  • richardmitnick 11:50 am on October 12, 2021
    Tags: "The Electron-Ion Collider- new accelerator could solve the mystery of how matter holds together", QCD: Quantum Chromodynamics

    From The Conversation : “The Electron-Ion Collider- new accelerator could solve the mystery of how matter holds together” 


    October 11, 2021
    Daria Sokhan

    DOE’s Brookhaven National Laboratory (US) campus. Credit: Brookhaven National Laboratory.

    When the Nobel Prize-winning US physicist Robert Hofstadter and his team fired highly energetic electrons at a small vial of hydrogen at the Stanford Linear Accelerator Center in 1956, they opened the door to a new era of physics. Until then, it was thought that protons and neutrons, which make up an atom’s nucleus, were the most fundamental particles in nature. They were considered to be “dots” in space, lacking physical dimensions. Now it suddenly became clear that these particles were not fundamental at all: they had a size and a complex internal structure of their own.

    What Hofstadter and his team saw was a small deviation in how electrons “scattered”, or bounced, when hitting the hydrogen. This suggested there was more to a nucleus than the dot-like protons and neutrons they had imagined. The experiments that followed around the world at accelerators – machines that propel particles to very high energies – heralded a paradigm shift in our understanding of matter.

    Yet there is a lot we still don’t know about the atomic nucleus – as well as the “strong force”, one of four fundamental forces of nature, that holds it together. Now a brand-new accelerator, the Electron-Ion Collider, to be built within the decade at the DOE’s Brookhaven National Laboratory (US), with the help of 1,300 scientists from around the world, could help take our understanding of the nucleus to a new level.

    Strong but strange force

    After the revelations of the 1950s, it soon became clear that particles called quarks and gluons are the fundamental building blocks of matter. They are the constituents of hadrons, which is the collective name for protons and other particles. Sometimes people imagine that these kinds of particles fit together like Lego, with quarks in a certain configuration making up protons, and then protons and neutrons coupling up to create a nucleus, and the nucleus attracting electrons to build an atom. But quarks and gluons are anything but static building blocks.

    A theory called quantum chromodynamics describes how the strong force works between quarks, mediated by gluons, which are force carriers. Yet it cannot help us to calculate the proton’s properties analytically. This isn’t some fault of our theorists or computers: at the energy scales relevant to the proton, the equations simply cannot be solved analytically.

    This is why the experimental study of the proton and other hadrons is so crucial: to understand the proton and the force that binds it, one must study it from every angle. For this, the accelerator is our most powerful tool.

    Yet when you look at the proton with a collider (a type of accelerator which uses two colliding beams), what you see depends on how deep, and with what, you look: sometimes it appears as three constituent quarks, at other times as an ocean of gluons, or as a teeming sea of quark-antiquark pairs (antiparticles are nearly identical to particles, but have the opposite charge or other quantum properties).

    How an electron colliding with a charged atom can reveal its nuclear structure. Brookhaven National Lab/Flickr, CC BY-NC.

    So while our understanding of matter at this tiniest of scales has made great progress in the past 60 years, many mysteries remain which the tools of today cannot fully address. What is the nature of the confinement of quarks within a hadron? How does the mass of the proton arise from its quarks, which are almost massless and roughly 1,000 times lighter than the proton itself?

    To answer such questions, we need a microscope that can image the structure of the proton and nucleus across the widest range of magnifications in exquisite detail, and build 3D images of their structure and dynamics. That’s exactly what the new collider will do.

    Experimental set up

    The Electron-Ion Collider (EIC) will use a very intense beam of electrons as its probe, with which it will be possible to slice the proton or nucleus open and look at the structure inside it. It will do that by colliding a beam of electrons with a beam of protons or ions (charged atoms) and look at how the electrons scatter. The ion beam is the first of its kind in the world.

    Effects which are barely perceptible, such as scattering processes so rare that they are observed only once in a billion collisions, will become visible. By studying these processes, other scientists and I will be able to reveal the structure of protons and neutrons, how it is modified when they are bound by the strong force, and how new hadrons are created. We could also uncover what sort of matter is made up of pure gluons: something which has never been seen.

    The collider will be tuneable across a wide range of energies. This is like turning the magnification dial on a microscope: the higher the energy, the deeper inside the proton or nucleus one can look, and the finer the features one can resolve.
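As a rough numerical illustration of that analogy (my own back-of-envelope sketch, not a figure from the article): the smallest length scale a probe can resolve is roughly λ ≈ ħc/Q, where Q is the momentum transferred in the collision and ħc ≈ 0.1973 GeV·fm.

```python
# Back-of-envelope: resolving power of an electron probe.
# lambda ~ hbar*c / Q, with hbar*c ~ 0.1973 GeV*fm.
# Illustrative Q values only; the EIC's actual kinematic reach is not
# taken from the article.

HBAR_C_GEV_FM = 0.1973  # GeV * fm

def resolvable_scale_fm(q_gev: float) -> float:
    """Approximate smallest resolvable length (in fm) at momentum transfer q (in GeV)."""
    return HBAR_C_GEV_FM / q_gev

for q in (1.0, 10.0, 100.0):
    print(f"Q = {q:6.1f} GeV  ->  ~{resolvable_scale_fm(q):.4f} fm")
```

Since the proton’s radius is a bit under 1 fm, momentum transfers of a few GeV already resolve structure well inside it, and higher energies resolve correspondingly finer detail.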

    Newly formed collaborations of scientists across the world, which are part of the EIC team, are also designing detectors to be placed at two different collision points in the collider. Aspects of this effort are led by UK teams, which have just been awarded a grant to lead the design of three key components of the detectors and to develop the technologies needed to realise them: sensors for precision tracking of charged particles; sensors for the detection of electrons scattered extremely close to the beam line; and detectors to measure the polarisation (direction of spin) of the particles scattered in the collisions.

    While it may take another ten years before the collider is fully designed and built, it is likely to be well worth the effort. Understanding the structure of the proton and, through it, the fundamental force that gives rise to over 99% of the visible mass in the universe, is one of the greatest challenges in physics today.

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

    The Conversation launched as a pilot project in October 2014. It is an independent source of news and views from the academic and research community, delivered direct to the public.
    Our team of professional editors work with university and research institute experts to unlock their knowledge for use by the wider public.
    Access to independent, high quality, authenticated, explanatory journalism underpins a functioning democracy. Our aim is to promote better understanding of current affairs and complex issues. And hopefully allow for a better quality of public discourse and conversation.

  • richardmitnick 4:12 pm on May 27, 2021
    Tags: "Quark-gluon plasma [QGP] flows like water according to new study", Fluid viscosity is governed by fundamental physical constants such as the Planck constant and the nucleon mass., Kinematic viscosity, Navier-Stokes equation which contains density and viscosity, QCD: Quantum Chromodynamics

    From Queen Mary University of London (UK) : “Quark-gluon plasma [QGP] flows like water according to new study” 

    27 May 2021

    Sophie McLachlan
    Faculty Communications Manager (Science and Engineering)

    What does quark-gluon plasma [QGP] – the hot soup of elementary particles formed a few microseconds after the Big Bang – have in common with tap water? Scientists say it’s the way it flows.

    Quark-Gluon Plasma from BNL RHIC.

    Quark gluon plasma from Duke University (US)

    A new study, published today in the journal SciPost Physics, has highlighted the surprising similarities between quark-gluon plasma, the first matter thought to have filled the early Universe, and water that comes from our tap.

    The ratio between a fluid’s viscosity, the measure of how runny it is, and its density determines how it flows. Whilst both the viscosity and density of quark-gluon plasma are about 16 orders of magnitude larger than in water, the researchers found that the ratio between the viscosity and density of the two types of fluids is the same. This suggests that one of the most exotic states of matter known to exist in our universe would flow out of your tap in much the same way as water.

    What is quark-gluon plasma [QGP]?

    The matter that makes up our Universe is made of atoms, which consist of nuclei with orbiting electrons. Nuclei consist of protons and neutrons, known collectively as nucleons, and these in turn consist of quarks interacting via gluons. At very high temperatures, about one million times hotter than the centre of the Sun, quarks and gluons break free from their parent nucleons and instead form a dense, hot soup known as quark-gluon plasma.

    It is thought that shortly after the Big Bang the early Universe was filled with incredibly hot quark-gluon plasma. This then cooled microseconds later to form the building blocks of all the matter found within our universe. Since the early 2000s scientists have been able to recreate quark-gluon plasma experimentally using large particle colliders, which has provided new insights into this exotic state of matter.

    The ordinary matter we encounter on a daily basis is thought to have very different properties to the quark-gluon plasma found in the early beginnings of the Universe. For example, fluids like water are governed by the behaviour of atoms and molecules that are much larger than the particles found in quark-gluon plasma, and are held together by weaker forces.

    However, the recent study shows that despite these differences the ratio of viscosity to density, known as the kinematic viscosity, is close in both quark-gluon plasma and ordinary liquids. This ratio matters because fluid flow does not depend on viscosity alone but is governed by the Navier-Stokes equation, which contains both density and viscosity. Therefore, if this ratio is the same for two different fluids, the two fluids will flow in the same way even if they have very different viscosities and densities.
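The arithmetic behind this argument can be checked in a few lines (a sketch with placeholder magnitudes, not measured QGP values): scaling both the viscosity and the density by the same factor leaves the kinematic viscosity ν = η/ρ, and hence the flow, unchanged.

```python
# Kinematic viscosity nu = eta / rho (dynamic viscosity / density).
# Two fluids whose eta and rho each differ by ~16 orders of magnitude
# can still share the same nu, and so flow in the same way.

def kinematic_viscosity(eta: float, rho: float) -> float:
    return eta / rho

# Water at ~20 C (approximate real values):
eta_water = 1.0e-3   # Pa*s
rho_water = 1.0e3    # kg/m^3

# Hypothetical "QGP-like" fluid: both quantities scaled up by 1e16
# (the real QGP numbers are not quoted in the article).
scale = 1e16
nu_water = kinematic_viscosity(eta_water, rho_water)
nu_qgp = kinematic_viscosity(eta_water * scale, rho_water * scale)

print(nu_water, nu_qgp)  # equal (to floating-point precision), ~1e-6 m^2/s
```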

    The power of physics

    Importantly, it’s not just any liquid viscosity that coincides with the viscosity of quark-gluon plasma. Indeed, liquid viscosity can vary by many orders of magnitude depending on temperature. However, there is one very particular point where liquid viscosity has a nearly-universal lower limit.

    Previous research [Science Advances] found that in that limit, fluid viscosity is governed by fundamental physical constants such as the Planck constant and the nucleon mass. It is these constants of nature that ultimately decide whether a proton is a stable particle, and they govern processes like nucleosynthesis in stars and the creation of the essential biochemical elements needed for life. The recent study found that it is this universal lower limit of viscosity of ordinary fluids like water which turns out to be close to the viscosity of quark-gluon plasma.
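That bound can be evaluated directly. To my best reading of the cited Science Advances work, the minimal kinematic viscosity is roughly ν_min ≈ ħ / (4π √(m_e m)), with m_e the electron mass and m the nucleon (or molecule) mass; treat the exact 1/(4π) prefactor as an assumption of this sketch rather than a value quoted in the article.

```python
import math

# Minimal kinematic viscosity from fundamental constants
# (order-of-magnitude sketch; prefactor assumed, see lead-in).
HBAR = 1.054571817e-34   # J*s, reduced Planck constant
M_E = 9.1093837015e-31   # kg, electron mass
M_P = 1.67262192369e-27  # kg, proton (nucleon) mass

nu_min = HBAR / (4 * math.pi * math.sqrt(M_E * M_P))
print(f"nu_min ~ {nu_min:.2e} m^2/s")

# Compare: water at room temperature has nu ~ 1e-6 m^2/s, i.e. within
# about an order of magnitude of this lower bound.
```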

    Professor Kostya Trachenko, Professor of Physics at Queen Mary University of London and author of the recent paper, said: “We do not fully understand the origin of this striking similarity yet, but we think it could be related to the fundamental physical constants which set the universal lower limit of viscosity for both ordinary liquids and quark-gluon plasma.”

    “This study provides a fairly rare and delightful example of where we can draw quantitative comparisons between hugely disparate systems,” continues Professor Matteo Baggioli from the Autonomous University of Madrid [Universidad Autónoma de Madrid] (ES). “Liquids are described by hydrodynamics, which leaves us with many open problems that are currently at the forefront of physics research. Our result shows the power of physics to translate general principles into specific predictions about complex properties such as liquid flow in exotic types of matter like quark-gluon plasma.”

    Improving our understanding

    Understanding quark-gluon plasma and its flow is currently at the forefront of high-energy physics. Strong forces between quarks and gluons are described by quantum chromodynamics, one of the most comprehensive physical theories that exist. However, whilst quantum chromodynamics provides a theory of the strong nuclear force, it is very hard to solve, and quark-gluon plasma properties cannot be understood using this theory alone.

    “It is conceivable that the current result can provide us with a better understanding of the quark-gluon plasma,” added Professor Vadim Brazhkin from the Russian Academy of Sciences [Росси́йская акаде́мия нау́к; (РАН) Rossíiskaya akadémiya naúk](RU). “The reason is that viscosity in liquids at their minimum corresponds to a very particular regime of liquid dynamics which we understood only recently. The similarity with the quark-gluon plasma suggests that particles in this exotic system move in the same way as in tap water.”

    See the full article here.



    At Queen Mary University of London (UK), we believe that a diversity of ideas helps us achieve the previously unthinkable.

    Throughout our history, we’ve fostered social justice and improved lives through academic excellence. And we continue to live and breathe this spirit today, not because it’s simply ‘the right thing to do’ but for what it helps us achieve and the intellectual brilliance it delivers.

    Our reformer heritage informs our conviction that great ideas can and should come from anywhere. It’s an approach that has brought results across the globe, from the communities of east London to the favelas of Rio de Janeiro.

    We continue to embrace diversity of thought and opinion in everything we do, in the belief that when views collide, disciplines interact, and perspectives intersect, truly original thought takes form.

  • richardmitnick 3:56 pm on March 5, 2021
    Tags: "Tantalizing Signs of Phase-change ‘Turbulence’ in RHIC Collisions", Despite the tantalizing hints the STAR scientists acknowledge that the range of uncertainty in their measurements is still large., Net baryon density, QCD: Quantum Chromodynamics, STAR physicists took advantage of the incredible versatility of RHIC to collide gold ions (the nuclei of gold atoms) across a wide range of energies., Strictly speaking if the scientists don’t identify either the phase boundary or the critical point they really can’t put this [QGP phase] into the textbooks and say that there is a new state of ma, Tantalizing signs of a critical point—a change in the way that quarks and gluons-the building blocks of protons and neutrons-transform from one phase to another., The work is also a true collaboration of the experimentalists with nuclear theorists around the world and the accelerator physicists at RHIC., When there is a change from high energy to low energy there is an increase in the net baryon density and the structure of matter may change going through the phase transition area.

    From DOE’s Brookhaven National Laboratory (US): “Tantalizing Signs of Phase-change ‘Turbulence’ in RHIC Collisions” 


    March 5, 2021
    Karen McNulty Walsh
    Peter Genzer

    Fluctuations in net proton production hint at a possible ‘critical point’ marking a change in the way nuclear matter transforms from one phase to another.

    The STAR detector at the U.S. Department of Energy’s Brookhaven National Laboratory.

    Physicists studying collisions of gold ions at the Relativistic Heavy Ion Collider (RHIC), a U.S. Department of Energy Office of Science user facility for nuclear physics research at DOE’s Brookhaven National Laboratory, are embarking on a journey through the phases of nuclear matter—the stuff that makes up the nuclei of all the visible matter in our universe. A new analysis of collisions conducted at different energies shows tantalizing signs of a critical point—a change in the way that quarks and gluons, the building blocks of protons and neutrons, transform from one phase to another. The findings, just published by RHIC’s STAR Collaboration in the journal Physical Review Letters, will help physicists map out details of these nuclear phase changes to better understand the evolution of the universe and the conditions in the cores of neutron stars.

    “If we are able to discover this critical point, then our map of nuclear phases—the nuclear phase diagram—may find a place in the textbooks, alongside that of water,” said Bedanga Mohanty of India’s National Institute of Science Education and Research, one of hundreds of physicists collaborating on research at RHIC using the sophisticated STAR detector.

    As Mohanty noted, studying nuclear phases is somewhat like learning about the solid, liquid, and gaseous forms of water, and mapping out how the transitions take place depending on conditions like temperature and pressure. But with nuclear matter, you can’t just set a pot on the stove and watch it boil. You need powerful particle accelerators like RHIC to turn up the heat.

    As physicists turned the collision energy down at RHIC, they expected to see large event-by-event fluctuations in certain measurements such as net proton production—an effect that’s similar to the turbulence an airplane experiences when entering a bank of clouds—as evidence of a “critical point” in the nuclear phase transition. Higher-order statistical analyses of the data, including the skewness and kurtosis, revealed tantalizing hints of such fluctuations.

    RHIC’s highest collision energies “melt” ordinary nuclear matter (atomic nuclei made of protons and neutrons) to create an exotic phase called a quark-gluon plasma (QGP). Scientists believe the entire universe existed as QGP a fraction of a second after the Big Bang—before it cooled and the quarks bound together (glued by gluons) to form protons, neutrons, and eventually, atomic nuclei. But the tiny drops of QGP created at RHIC measure a mere 10^-13 centimeters across (that’s 0.0000000000001 cm) and they last for only 10^-23 seconds! That makes it incredibly challenging to map out the melting and freezing of the matter that makes up our world.
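Those two numbers are mutually consistent, as a quick light-crossing-time check shows (my own arithmetic, not from the article): the droplet lives only about as long as light takes to cross it.

```python
# Light-crossing time of a QGP droplet ~1e-13 cm across.
C = 2.998e8            # m/s, speed of light
size_m = 1e-13 * 1e-2  # 1e-13 cm in metres = 1e-15 m (one femtometre)

crossing_time_s = size_m / C
print(f"{crossing_time_s:.1e} s")  # a few 1e-24 s, comparable to the ~1e-23 s lifetime
```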

    “Strictly speaking if we don’t identify either the phase boundary or the critical point we really can’t put this [QGP phase] into the textbooks and say that we have a new state of matter,” said Nu Xu, a STAR physicist at DOE’s Lawrence Berkeley National Laboratory.

    Tracking phase transitions

    To track the transitions, STAR physicists took advantage of the incredible versatility of RHIC to collide gold ions (the nuclei of gold atoms) across a wide range of energies.

    Mapping nuclear phase changes is like studying how water changes under different conditions of temperature and pressure (net baryon density for nuclear matter). RHIC’s collisions “melt” protons and neutrons to create quark-gluon plasma (QGP). STAR physicists are exploring collisions at different energies, turning the “knobs” of temperature and baryon density, to look for signs of a “critical point.”

    “RHIC is the only facility that can do this, providing beams from 200 billion electron volts (GeV) all the way down to 3 GeV. Nobody can dream of such an excellent machine,” Xu said.

    The changes in energy turn the collision temperature up and down and also vary a quantity known as net baryon density that is somewhat analogous to pressure. Looking at data collected during the first phase of RHIC’s “beam energy scan” from 2010 to 2017, STAR physicists tracked particles streaming out at each collision energy. They performed a detailed statistical analysis of the net number of protons produced. A number of theorists had predicted that this quantity would show large event-by-event fluctuations as the critical point is approached.

    The reason for the expected fluctuations comes from a theoretical understanding of the force that governs quarks and gluons. That theory, known as quantum chromodynamics, suggests that the transition from normal nuclear matter (“hadronic” protons and neutrons) to QGP can take place in two different ways. At high temperatures, where protons and anti-protons are produced in pairs and the net baryon density is close to zero, physicists have evidence of a smooth crossover between the phases. It’s as if protons gradually melt to form QGP, like butter gradually melting on a counter on a warm day. But at lower energies, they expect what’s called a first-order phase transition—an abrupt change like water boiling at a set temperature as individual molecules escape the pot to become steam. Nuclear theorists predict that in the QGP-to-hadronic-matter phase transition, net proton production should vary dramatically as collisions approach this switchover point.

    “At high energy, there is only one phase. The system is more or less invariant, normal,” Xu said. “But when we change from high energy to low energy you also increase the net baryon density and the structure of matter may change as you are going through the phase transition area.

    “It’s just like when you ride an airplane and you get into turbulence,” he added. “You see the fluctuation—boom, boom, boom. Then, when you pass the turbulence—the phase of structural changes—you are back to normal into the one-phase structure.”

    In the RHIC collision data, the signs of this turbulence are not as apparent as food and drinks bouncing off tray tables in an airplane. STAR physicists had to perform what’s known as “higher order correlation function” statistical analysis of the distributions of particles—looking beyond the mean and width of the curve representing the data to things like how asymmetrical and skewed that distribution is.
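To make “higher order” concrete, here is a minimal sketch (with illustrative data, nothing from STAR) of the quantities involved: beyond the mean and the width (variance), the third and fourth standardized central moments give a distribution’s skewness and excess kurtosis.

```python
# Mean, variance, skewness and excess kurtosis from central moments.
# Purely illustrative; STAR's actual analysis uses cumulants of
# event-by-event net-proton distributions.

def central_moments(data, max_order=4):
    n = len(data)
    mean = sum(data) / n
    return mean, [sum((x - mean) ** k for x in data) / n
                  for k in range(2, max_order + 1)]

def describe(data):
    mean, (m2, m3, m4) = central_moments(data)
    skewness = m3 / m2 ** 1.5        # 3rd standardized moment
    excess_kurtosis = m4 / m2 ** 2 - 3.0  # 4th, relative to a Gaussian
    return mean, m2, skewness, excess_kurtosis

# A symmetric toy distribution: its skewness is exactly zero.
print(describe([1, 2, 3, 4, 5]))
```

Near a critical point, the expectation is that these higher moments of the net-proton distribution fluctuate strongly from one collision energy to the next, which is what the STAR analysis looks for.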

    The oscillations they see in these higher-order quantities, particularly the skewness and kurtosis, are reminiscent of another famous phase change, observed when transparent liquid carbon dioxide suddenly becomes cloudy when heated near its critical point, the scientists say. This “critical opalescence” comes from dramatic fluctuations in the density of the CO2—variations in how tightly packed the molecules are.

    “In our data, the oscillations signify that something interesting is happening, like the opalescence,” Mohanty said.

    Yet despite the tantalizing hints, the STAR scientists acknowledge that the range of uncertainty in their measurements is still large. The team hopes to narrow that uncertainty and nail down a critical-point discovery by analyzing a second set of measurements made from many more collisions during phase II of RHIC’s beam energy scan, from 2019 through 2021.

    The entire STAR collaboration was involved in the analysis, Xu notes, with a particular group of physicists—including Xiaofeng Luo (and his student, Yu Zhang), Ashish Pandav, and Toshihiro Nonaka, from China, India, and Japan, respectively—meeting weekly with the U.S. scientists (over many time zones and virtual networks) to discuss and refine the results. The work is also a true collaboration of the experimentalists with nuclear theorists around the world and the accelerator physicists at RHIC. The latter group, in Brookhaven Lab’s Collider-Accelerator Department, devised ways to run RHIC far below its design energy while also maximizing collision rates to enable the collection of the necessary data at low collision energies.

    “We are exploring uncharted territory,” Xu said. “This has never been done before. We made lots of efforts to control the environment and make corrections, and we are eagerly awaiting the next round of higher statistical data,” he said.

    This study was supported by the DOE Office of Science, the U.S. National Science Foundation, and a wide range of international funding agencies listed in the paper. RHIC operations are funded by the DOE Office of Science. Data analysis was performed using computing resources at the RHIC and ATLAS Computing Facility (RACF) at Brookhaven Lab, the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory, and via the Open Science Grid consortium.

    See the full article here.



    One of ten national laboratories overseen and primarily funded by the DOE (US) Office of Science, DOE’s Brookhaven National Laboratory (US) conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University (US), the largest academic user of Laboratory facilities, and Battelle (US), a nonprofit, applied science and technology organization.

    Research at BNL specializes in nuclear and high energy physics, energy science and technology, environmental and bioscience, nanoscience and national security. The 5,300 acre campus contains several large research facilities, including the Relativistic Heavy Ion Collider [below] and National Synchrotron Light Source II [below]. Seven Nobel prizes have been awarded for work conducted at Brookhaven lab.

    BNL is staffed by approximately 2,750 scientists, engineers, technicians, and support personnel, and hosts 4,000 guest investigators every year. The laboratory has its own police station, fire department, and ZIP code (11973). In total, the lab spans a 5,265-acre (21 km^2) area that is mostly coterminous with the hamlet of Upton, New York. BNL is served by a rail spur operated as-needed by the New York and Atlantic Railway. Co-located with the laboratory is the Upton, New York, forecast office of the National Weather Service.

    Major programs

    Although originally conceived as a nuclear research facility, Brookhaven Lab’s mission has greatly expanded. Its foci are now:

    Nuclear and high-energy physics
    Physics and chemistry of materials
    Environmental and climate research
    Energy research
    Structural biology
    Accelerator physics


    Brookhaven National Lab was originally owned by the Atomic Energy Commission (US) and is now owned by that agency’s successor, the United States Department of Energy (DOE). DOE subcontracts the research and operation to universities and research organizations. It is currently operated by Brookhaven Science Associates LLC, an equal partnership of Stony Brook University (US) and Battelle Memorial Institute (US). From 1947 to 1998, it was operated by Associated Universities, Inc. (AUI), but AUI lost its contract in the wake of two incidents: a 1994 fire at the facility’s High Flux Beam Reactor that exposed several workers to radiation, and reports in 1997 of a tritium leak into the groundwater of the Long Island Central Pine Barrens, on which the facility sits.


    Following World War II, the US Atomic Energy Commission was created to support government-sponsored peacetime research on atomic energy. The effort to build a nuclear reactor in the American northeast was fostered largely by physicists Isidor Isaac Rabi and Norman Foster Ramsey Jr., who during the war witnessed many of their colleagues at Columbia University leave for new remote research sites following the departure of the Manhattan Project from its campus. Their effort to house this reactor near New York City was rivalled by a similar effort at the Massachusetts Institute of Technology (US) to have a facility near Boston, Massachusetts. Involvement was quickly solicited from representatives of northeastern universities to the south and west of New York City, such that this city would be at their geographic center. In March 1946 a nonprofit corporation was established that consisted of representatives from nine major research universities — Columbia (US), Cornell (US), Harvard (US), Johns Hopkins (US), MIT, Princeton University (US), University of Pennsylvania (US), University of Rochester (US), and Yale University (US).

    Of the 17 sites considered in the Boston-Washington corridor, Camp Upton on Long Island was eventually chosen as the most suitable in consideration of space, transportation, and availability. The camp had been a training center for the US Army during both World War I and World War II. After the latter war, Camp Upton was deemed no longer necessary and became available for reuse. A plan was conceived to convert the military camp into a research facility.

    On March 21, 1947, the Camp Upton site was officially transferred from the U.S. War Department to the new U.S. Atomic Energy Commission (AEC), predecessor to the U.S. Department of Energy (DOE).

    Research and facilities

    Reactor history

    In 1947 construction began on the first nuclear reactor at Brookhaven, the Brookhaven Graphite Research Reactor. This reactor, which opened in 1950, was the first reactor to be constructed in the United States after World War II. The High Flux Beam Reactor operated from 1965 to 1999. In 1959 Brookhaven built the first US reactor specifically tailored to medical research, the Brookhaven Medical Research Reactor, which operated until 2000.

    Accelerator history

    In 1952 Brookhaven began using its first particle accelerator, the Cosmotron. At the time the Cosmotron was the world’s highest energy accelerator, being the first to impart more than 1 GeV of energy to a particle.

    BNL Cosmotron 1952-1966

    The Cosmotron was retired in 1966, after it was superseded in 1960 by the new Alternating Gradient Synchrotron (AGS).

    BNL Alternating Gradient Synchrotron (AGS)

    The AGS was used in research that resulted in 3 Nobel prizes, including the discovery of the muon neutrino, the charm quark, and CP violation.

    In 1970 BNL started the ISABELLE project to develop and build two proton intersecting storage rings.

    The groundbreaking for the project was in October 1978. In 1981, with the tunnel for the accelerator already excavated, problems with the superconducting magnets needed for the ISABELLE accelerator brought the project to a halt, and the project was eventually cancelled in 1983.

    The National Synchrotron Light Source operated from 1982 to 2014 and was involved with two Nobel Prize-winning discoveries. It has since been replaced by the National Synchrotron Light Source II [below].


    After ISABELLE’s cancellation, physicists at BNL proposed that the excavated tunnel and parts of the magnet assembly be used in another accelerator. In 1984 the first proposal for the accelerator now known as the Relativistic Heavy Ion Collider (RHIC) [below] was put forward. Construction was funded in 1991 and RHIC has been operational since 2000. One of the world’s only two operating heavy-ion colliders, RHIC is as of 2010 the second-highest-energy collider after the Large Hadron Collider (CH). RHIC is housed in a tunnel 2.4 miles (3.9 km) long and is visible from space.

    On January 9, 2020, Paul Dabbar, undersecretary of the US Department of Energy Office of Science, announced that the BNL eRHIC design had been selected over the conceptual design put forward by DOE’s Thomas Jefferson National Accelerator Facility [JLab] as the future Electron-Ion Collider (EIC) in the United States.

    Electron-Ion Collider (EIC) at BNL, to be built inside the tunnel that currently houses the RHIC.

    In addition to the site selection, it was announced that the BNL EIC had acquired CD-0 (mission need) status from the Department of Energy. BNL's eRHIC design proposes upgrading the existing Relativistic Heavy Ion Collider, which collides beams of ions from light to heavy, including polarized protons, with a polarized electron facility to be housed in the same tunnel.

    Other discoveries

    In 1958, Brookhaven scientists created one of the world’s first video games, Tennis for Two. In 1968 Brookhaven scientists patented Maglev, a transportation technology that utilizes magnetic levitation.

    Major facilities

    Relativistic Heavy Ion Collider (RHIC), which was designed to research quark–gluon plasma and the sources of proton spin. Until 2009 it was the world’s most powerful heavy-ion collider. It is the only collider of spin-polarized protons.
    Center for Functional Nanomaterials (CFN), used for the study of nanoscale materials.
    National Synchrotron Light Source II (NSLS-II), Brookhaven’s newest user facility, opened in 2015 to replace the National Synchrotron Light Source (NSLS), which had operated for 30 years. NSLS was involved in the work that won the 2003 and 2009 Nobel Prizes in Chemistry.
    Alternating Gradient Synchrotron, a particle accelerator that was used in three of the lab’s Nobel prizes.
    Accelerator Test Facility, generates, accelerates and monitors particle beams.
    Tandem Van de Graaff, once the world’s largest electrostatic accelerator.
    Computational Science resources, including access to a massively parallel Blue Gene series supercomputer that is among the fastest in the world for scientific research, run jointly by Brookhaven National Laboratory and Stony Brook University.
    Interdisciplinary Science Building, with unique laboratories for studying high-temperature superconductors and other materials important for addressing energy challenges.
    NASA Space Radiation Laboratory, where scientists use beams of ions to simulate cosmic rays and assess the risks of space radiation to human space travelers and equipment.

    Off-site contributions

    It is a contributing partner to the ATLAS experiment, one of the four large detectors located at the Large Hadron Collider (LHC).

    CERN map

    Iconic view of the CERN (CH) ATLAS detector.

    The ATLAS experiment is currently operating at CERN near Geneva, Switzerland.

    Brookhaven was also responsible for the design of the accumulator ring for the Spallation Neutron Source at DOE’s Oak Ridge National Laboratory, Tennessee.

    ORNL Spallation Neutron Source annotated.

    Brookhaven plays a role in a range of neutrino research projects around the world, including the Daya Bay Reactor Neutrino Experiment in China and the Deep Underground Neutrino Experiment at DOE’s Fermi National Accelerator Laboratory(US).

    Daya Bay, nuclear power plant, approximately 52 kilometers northeast of Hong Kong and 45 kilometers east of Shenzhen, China

    FNAL LBNF/DUNE from FNAL to SURF, Lead, South Dakota, USA.

    Brookhaven Campus.

    BNL Center for Functional Nanomaterials.



    BNL RHIC Campus.

    BNL/RHIC Star Detector.

    BNL/RHIC Phenix.

  • richardmitnick 11:56 am on March 5, 2021 Permalink | Reply
    Tags: "Physicists Just Found 4 New Subatomic Particles That May Test The Laws of Nature", , , , , CERN(CH), , Hadrons, , Mesons, , , , Protons and neutrons, QCD: Quantum Chromodynamics, Quarks and antiquarks, , , , Tetraquarks and pentaquarks, The four new particles we've discovered recently are all tetraquarks with a charm quark pair and two other quarks., The standard model is certainly not the last word in the understanding of particles., These models are crucial to achieve the ultimate goal of the LHC: find physics beyond the standard model.   

    From CERN(CH) via Science Alert(AU): “Physicists Just Found 4 New Subatomic Particles That May Test The Laws of Nature” 


    From CERN(CH)



    Science Alert(AU)

    5 MARCH 2021
    Research Fellow in Particle Physics
    Dutch National Institute for Subatomic Physics, Dutch Research Council (NWO – Nederlandse Organisatie voor Wetenschappelijk Onderzoek)(NL)

    Harry Cliff
    Particle physicist
    University of Cambridge(UK).

    The Large Hadron Collider. Credit: CERN.

    This month is a time to celebrate. CERN has just announced the discovery of four brand new particles [3 March 2021: Observation of two ccus tetraquarks and two ccss tetraquarks.] at the Large Hadron Collider (LHC) in Geneva.

    This means that the LHC has now found a total of 59 new particles, in addition to the Nobel prize-winning Higgs boson, since it started colliding protons – particles that make up the atomic nucleus along with neutrons – in 2009.

    Excitingly, while some of these new particles were expected based on our established theories, some were altogether more surprising.

    The LHC’s goal is to explore the structure of matter at the shortest distances and highest energies ever probed in the lab – testing our current best theory of nature: the Standard Model of Particle Physics.

    Standard Model of Particle Physics (LATHAM BOYLE AND MARDUS OF WIKIMEDIA COMMONS).

    And the LHC has delivered the goods – it enabled scientists to discover the Higgs boson [below], the last missing piece of the model. That said, the theory is still far from being fully understood.

    One of its most troublesome features is its description of the strong interaction which holds the atomic nucleus together. The nucleus is made up of protons and neutrons, which are in turn each composed of three tiny particles called quarks (there are six different kinds of quarks: up, down, charm, strange, top and bottom).

    If we switched the strong force off for a second, all matter would immediately disintegrate into a soup of loose quarks – a state that existed for a fleeting instant at the beginning of the universe.

    Don’t get us wrong: the theory of the strong interaction, pretentiously called Quantum Chromodynamics, is on very solid footing. It describes how quarks interact through the strong interaction by exchanging particles called gluons. You can think of gluons as analogues of the more familiar photon, the particle of light and carrier of the electromagnetic interaction.

    However, the way gluons interact with quarks makes the strong interaction behave very differently from electromagnetism. While the electromagnetic interaction gets weaker as you pull two charged particles apart, the strong interaction actually gets stronger as you pull two quarks apart.

    As a result, quarks are forever locked up inside particles called hadrons – particles made of two or more quarks – which includes protons and neutrons. Unless, of course, you smash them open at incredible speeds, as we are doing at Cern.
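    This contrast can be made concrete with the Cornell potential, a standard phenomenological model of the quark–antiquark interaction, V(r) = −(4/3)αs·ħc/r + κ·r. The short sketch below compares the resulting force with the electromagnetic (Coulomb) force; the coupling and string-tension values are typical textbook figures used here purely for illustration, not fitted numbers:

```python
# Toy comparison of how the electromagnetic force and a simplified
# strong-interaction model behave with distance. Numerical values
# (alpha_s ~ 0.3, kappa ~ 0.9 GeV/fm) are illustrative only.

ALPHA_EM = 1 / 137.0   # fine-structure constant (electromagnetic coupling)
ALPHA_S = 0.3          # strong coupling at short distances (illustrative)
HBAR_C = 0.1973        # GeV * fm
KAPPA = 0.9            # string tension in GeV/fm (illustrative)

def coulomb_force(r_fm: float) -> float:
    """|F| between two unit charges, in GeV/fm; falls off as 1/r^2."""
    return ALPHA_EM * HBAR_C / r_fm**2

def cornell_force(r_fm: float) -> float:
    """|F| = -dV/dr for the Cornell potential: a 1/r^2 term plus a
    constant confining term that never dies off with distance."""
    return (4 / 3) * ALPHA_S * HBAR_C / r_fm**2 + KAPPA

for r in (0.1, 0.5, 1.0, 5.0):
    print(f"r = {r:4.1f} fm: Coulomb {coulomb_force(r):10.5f}  "
          f"strong {cornell_force(r):10.5f} GeV/fm")
```

    Note how the strong-force model approaches the constant κ at large separations instead of vanishing: pulling two quarks far apart costs ever more energy, which is the confinement behaviour described above.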

    To complicate matters further, all the particles in the standard model have antiparticles which are nearly identical to themselves but with the opposite charge (or other quantum property). If you pull a quark out of a proton, the force will eventually be strong enough to create a quark-antiquark pair, with the newly created quark going into the proton.

    You end up with a proton and a brand new “meson”, a particle made of a quark and an antiquark. This may sound weird but according to quantum mechanics, which rules the universe on the smallest of scales, particles can pop out of empty space.

    This has been shown repeatedly by experiments – we have never seen a lone quark. An unpleasant feature of the theory of the strong interaction is that calculations of what would be a simple process in electromagnetism can end up being impossibly complicated. We therefore cannot (yet) prove theoretically that quarks can’t exist on their own.

    Worse still, we can’t even calculate which combinations of quarks would be viable in nature and which would not.

    Illustration of a tetraquark. Credit: CERN.

    When quarks were first discovered, scientists realized that several combinations should be possible in theory. This included pairs of quarks and antiquarks (mesons); three quarks (baryons); three antiquarks (antibaryons); two quarks and two antiquarks (tetraquarks); and four quarks and one antiquark (pentaquarks) – as long as the number of quarks minus antiquarks in each combination was a multiple of three.
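    The counting rule at the end of this list is easy to check mechanically. A minimal sketch (the helper function is ours, not from any physics library):

```python
# Sketch of the counting rule in the text: a combination of quarks and
# antiquarks can form a hadron only if (quarks - antiquarks) is a
# multiple of three. Names below are illustrative.

def allowed(n_quarks: int, n_antiquarks: int) -> bool:
    """True if the combination satisfies the multiple-of-three rule."""
    return (n_quarks - n_antiquarks) % 3 == 0 and (n_quarks + n_antiquarks) > 0

combinations = {
    "meson (quark + antiquark)":        (1, 1),
    "baryon (3 quarks)":                (3, 0),
    "antibaryon (3 antiquarks)":        (0, 3),
    "tetraquark (2 + 2)":               (2, 2),
    "pentaquark (4 quarks + 1 anti)":   (4, 1),
    "lone quark (never observed)":      (1, 0),
}
for name, (q, qbar) in combinations.items():
    print(f"{name:32s} allowed: {allowed(q, qbar)}")
```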

    For a long time, only baryons and mesons were seen in experiments. But in 2003, the Belle experiment in Japan discovered a particle that didn’t fit in anywhere.

    KEK Belle detector, at the High Energy Accelerator Research Organisation (KEK) in Tsukuba, Ibaraki Prefecture, Japan.

    Belle II KEK High Energy Accelerator Research Organization Tsukuba, Japan.

    It turned out to be the first of a long series of tetraquarks.

    In 2015, the LHCb experiment [below] at the LHC discovered two pentaquarks.

    Is a pentaquark tightly (above) or weakly bound (see image below)? Credit: CERN.

    The four new particles we’ve discovered recently are all tetraquarks with a charm quark pair and two other quarks. All these objects are particles in the same way as the proton and the neutron are particles. But they are not fundamental particles: quarks and electrons are the true building blocks of matter.

    Charming new particles

    The LHC has now discovered 59 new hadrons. These include the tetraquarks most recently discovered, but also new mesons and baryons. All these new particles contain heavy quarks such as “charm” and “bottom”.

    These hadrons are interesting to study. They tell us what nature considers acceptable as a bound combination of quarks, even if only for very short times.

    They also tell us what nature does not like. For example, why do all tetra- and pentaquarks contain a charm-quark pair (with just one exception)? And why are there no corresponding particles with strange-quark pairs? There is currently no explanation.

    Is a pentaquark a molecule? A meson (left) interacting with a proton (right). Credit: CERN.

    Another mystery is how these particles are bound together by the strong interaction. One school of theorists considers them to be compact objects, like the proton or the neutron.

    Others claim they are akin to “molecules” formed by two loosely bound hadrons. Each newly found hadron allows experiments to measure its mass and other properties, which tell us something about how the strong interaction behaves. This helps bridge the gap between experiment and theory. The more hadrons we can find, the better we can tune the models to the experimental facts.

    These models are crucial to achieve the ultimate goal of the LHC: find physics beyond the standard model. Despite its successes, the standard model is certainly not the last word in the understanding of particles. It is for instance inconsistent with cosmological models describing the formation of the universe.

    The LHC is searching for new fundamental particles that could explain these discrepancies. These particles could be visible at the LHC, but hidden in the background of particle interactions. Or they could show up as small quantum mechanical effects in known processes.

    In either case, a better understanding of the strong interaction is needed to find them. With each new hadron, we improve our knowledge of nature’s laws, leading us to a better description of the most fundamental properties of matter.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Meet CERN(CH) in a variety of places:

    Quantum Diaries

    Cern Courier(CH)



    CERN ATLAS Image Claudia Marcelloni CERN/ATLAS


    CERN/ALICE Detector

    CERN CMS New

    CERN LHCb New II


    CERN map

    CERN LHC Maximilien Brice and Julien Marius Ordan.

    SixTRack CERN LHC particles

    The European Organization for Nuclear Research (Organisation européenne pour la recherche nucléaire)(EU), known as CERN, is a European research organization that operates the largest particle physics laboratory in the world. Established in 1954, the organization is based in a northwest suburb of Geneva on the Franco–Swiss border and has 23 member states. Israel is the only non-European country granted full membership. CERN is an official United Nations Observer.

    The acronym CERN is also used to refer to the laboratory, which in 2019 had 2,660 scientific, technical, and administrative staff members, and hosted about 12,400 users from institutions in more than 70 countries. In 2016 CERN generated 49 petabytes of data.

    CERN’s main function is to provide the particle accelerators and other infrastructure needed for high-energy physics research – as a result, numerous experiments have been constructed at CERN through international collaborations. The main site at Meyrin hosts a large computing facility, which is primarily used to store and analyse data from experiments, as well as simulate events. Researchers need remote access to these facilities, so the lab has historically been a major wide area network hub. CERN is also the birthplace of the World Wide Web.

    The convention establishing CERN was ratified on 29 September 1954 by 12 countries in Western Europe. The acronym CERN originally represented the French words for Conseil Européen pour la Recherche Nucléaire (European Council for Nuclear Research), which was a provisional council for building the laboratory, established by 12 European governments in 1952. The acronym was retained for the new laboratory after the provisional council was dissolved, even though the name changed to the current Organisation Européenne pour la Recherche Nucléaire (European Organization for Nuclear Research)(EU) in 1954. According to Lew Kowarski, a former director of CERN, when the name was changed, the abbreviation could have become the awkward OERN, and Werner Heisenberg said that this could “still be CERN even if the name is [not]”.

    CERN’s first president was Sir Benjamin Lockspeiser. Edoardo Amaldi was the general secretary of CERN at its early stages when operations were still provisional, while the first Director-General (1954) was Felix Bloch.

    The laboratory was originally devoted to the study of atomic nuclei, but was soon applied to higher-energy physics, concerned mainly with the study of interactions between subatomic particles. Therefore, the laboratory operated by CERN is commonly referred to as the European laboratory for particle physics (Laboratoire européen pour la physique des particules), which better describes the research being performed there.

    Founding members

    At the sixth session of the CERN Council, which took place in Paris from 29 June – 1 July 1953, the convention establishing the organization was signed, subject to ratification, by 12 states. The convention was gradually ratified by the 12 founding Member States: Belgium, Denmark, France, the Federal Republic of Germany, Greece, Italy, the Netherlands, Norway, Sweden, Switzerland, the United Kingdom, and Yugoslavia.

    Scientific achievements

    Several important achievements in particle physics have been made through experiments at CERN. They include:

    1973: The discovery of neutral currents in the Gargamelle bubble chamber.
    1983: The discovery of W and Z bosons in the UA1 and UA2 experiments.
    1989: The determination of the number of light neutrino families at the Large Electron–Positron Collider (LEP) operating on the Z boson peak.
    1995: The first creation of antihydrogen atoms in the PS210 experiment.
    1999: The discovery of direct CP violation in the NA48 experiment.
    2010: The isolation of 38 atoms of antihydrogen.
    2011: Maintaining antihydrogen for over 15 minutes.
    2012: A boson with mass around 125 GeV/c² consistent with the long-sought Higgs boson.

    In September 2011, CERN attracted media attention when the OPERA Collaboration reported the detection of possibly faster-than-light neutrinos. Further tests showed that the results were flawed due to an incorrectly connected GPS synchronization cable.

    The 1984 Nobel Prize for Physics was awarded to Carlo Rubbia and Simon van der Meer for the developments that resulted in the discoveries of the W and Z bosons. The 1992 Nobel Prize for Physics was awarded to CERN staff researcher Georges Charpak “for his invention and development of particle detectors, in particular the multiwire proportional chamber”. The 2013 Nobel Prize for Physics was awarded to François Englert and Peter Higgs for the theoretical description of the Higgs mechanism in the year after the Higgs boson was found by CERN experiments.

    Computer science

    The World Wide Web began as a CERN project named ENQUIRE, initiated by Tim Berners-Lee in 1989 and Robert Cailliau in 1990. Berners-Lee and Cailliau were jointly honoured by the Association for Computing Machinery in 1995 for their contributions to the development of the World Wide Web.

    Current complex

    CERN operates a network of six accelerators and a decelerator. Each machine in the chain increases the energy of particle beams before delivering them to experiments or to the next more powerful accelerator. Currently (as of 2019) active machines are:

    The LINAC 3 linear accelerator generating low energy particles. It provides heavy ions at 4.2 MeV/u for injection into the Low Energy Ion Ring (LEIR).
    The Proton Synchrotron Booster increases the energy of particles generated by the proton linear accelerator before they are transferred to the other accelerators.
    The Low Energy Ion Ring (LEIR) accelerates the ions from the ion linear accelerator LINAC 3, before transferring them to the Proton Synchrotron (PS). This accelerator was commissioned in 2005, after having been reconfigured from the previous Low Energy Antiproton Ring (LEAR).
    The 28 GeV Proton Synchrotron (PS), built during 1954–1959 and still operating as a feeder to the more powerful SPS.
    The Super Proton Synchrotron (SPS), a circular accelerator with a diameter of 2 kilometres built in a tunnel, which started operation in 1976. It was designed to deliver an energy of 300 GeV and was gradually upgraded to 450 GeV. As well as having its own beamlines for fixed-target experiments (currently COMPASS and NA62), it has been operated as a proton–antiproton collider (the SppS collider), and for accelerating high energy electrons and positrons which were injected into the Large Electron–Positron Collider (LEP). Since 2008, it has been used to inject protons and heavy ions into the Large Hadron Collider (LHC).
    The On-Line Isotope Mass Separator (ISOLDE), which is used to study unstable nuclei. The radioactive ions are produced by the impact of protons at an energy of 1.0–1.4 GeV from the Proton Synchrotron Booster. It was first commissioned in 1967 and was rebuilt with major upgrades in 1974 and 1992.
    The Antiproton Decelerator (AD), which reduces the velocity of antiprotons to about 10% of the speed of light for research on antimatter. The AD machine was reconfigured from the previous Antiproton Collector (AC) machine.
    The AWAKE experiment, which is a proof-of-principle plasma wakefield accelerator.
    The CERN Linear Electron Accelerator for Research (CLEAR) accelerator research and development facility.

    Large Hadron Collider

    Many activities at CERN currently involve operating the Large Hadron Collider (LHC) and the experiments for it. The LHC represents a large-scale, worldwide scientific cooperation project.

    The LHC tunnel is located 100 metres underground, in the region between the Geneva International Airport and the nearby Jura mountains. The majority of its length is on the French side of the border. It uses the 27 km circumference circular tunnel previously occupied by the Large Electron–Positron Collider (LEP), which was shut down in November 2000. CERN’s existing PS/SPS accelerator complexes are used to pre-accelerate protons and lead ions which are then injected into the LHC.

    Eight experiments (CMS, ATLAS, LHCb, MoEDAL, TOTEM, LHCf, FASER and ALICE) are located along the collider; each of them studies particle collisions from a different aspect, and with different technologies. Construction for these experiments required an extraordinary engineering effort. For example, a special crane was rented from Belgium to lower pieces of the CMS detector into its cavern, since each piece weighed nearly 2,000 tons. The first of the approximately 5,000 magnets necessary for construction was lowered down a special shaft at 13:00 GMT on 7 March 2005.

    The LHC has begun to generate vast quantities of data, which CERN streams to laboratories around the world for distributed processing (making use of a specialized grid infrastructure, the LHC Computing Grid). During April 2005, a trial successfully streamed 600 MB/s to seven different sites across the world.

    The initial particle beams were injected into the LHC in August 2008. The first beam was circulated through the entire LHC on 10 September 2008, but the system failed 10 days later because of a faulty magnet connection, and it was stopped for repairs on 19 September 2008.

    The LHC resumed operation on 20 November 2009 by successfully circulating two beams, each with an energy of 3.5 teraelectronvolts (TeV). The challenge for the engineers was then to try to line up the two beams so that they smashed into each other. This is like “firing two needles across the Atlantic and getting them to hit each other” according to Steve Myers, director for accelerators and technology.

    On 30 March 2010, the LHC successfully collided two proton beams with 3.5 TeV of energy per proton, resulting in a 7 TeV collision energy. However, this was just the start of what was needed for the expected discovery of the Higgs boson. When the 7 TeV experimental period ended, the LHC revved to 8 TeV (4 TeV per proton) starting March 2012, and soon began particle collisions at that energy. In July 2012, CERN scientists announced the discovery of a new sub-atomic particle that was later confirmed to be the Higgs boson.

    CERN CMS Higgs Event May 27, 2012.

    CERN ATLAS Higgs Event
    June 12, 2012.

    Peter Higgs

    In March 2013, CERN announced that the measurements performed on the newly found particle allowed it to conclude that this is a Higgs boson. In early 2013, the LHC was deactivated for a two-year maintenance period, to strengthen the electrical connections between magnets inside the accelerator and for other upgrades.

    On 5 April 2015, after two years of maintenance and consolidation, the LHC restarted for a second run. The first ramp to the record-breaking energy of 6.5 TeV was performed on 10 April 2015. In 2016, the design collision rate was exceeded for the first time. A second two-year shutdown began at the end of 2018.

    Accelerators under construction

    As of October 2019, construction is ongoing to upgrade the LHC’s luminosity in a project called the High-Luminosity LHC (HL-LHC).

    This project should see the LHC accelerator upgraded by 2026 to an order of magnitude higher luminosity.

    As part of the HL-LHC upgrade project, other CERN accelerators and their subsystems are also receiving upgrades. Among other work, the LINAC 2 linear accelerator injector was decommissioned, to be replaced by a new injector accelerator, LINAC 4, in 2020.

    Possible future accelerators

    CERN, in collaboration with groups worldwide, is investigating two main concepts for future accelerators: a linear electron–positron collider with a new acceleration concept to increase the energy (CLIC), and a larger version of the LHC, a project currently named the Future Circular Collider.

    CLIC collider

    CERN FCC Future Circular Collider: details of the proposed 100 km-circumference successor to the LHC.

    Not discussed or described above, but worthy of consideration, is the International Linear Collider (ILC), in the planning stages for construction in Japan.

    ILC schematic, being planned for the Kitakami highland, in the Iwate prefecture of northern Japan.


    Since its foundation by 12 members in 1954, CERN has regularly accepted new members. All new members have remained in the organization continuously since their accession, except Spain and Yugoslavia. Spain first joined CERN in 1961, withdrew in 1969, and rejoined in 1983. Yugoslavia was a founding member of CERN but quit in 1961. Of the 23 members, Israel joined CERN as a full member on 6 January 2014, becoming the first (and currently only) non-European full member.


    Associate Members, Candidates:

    Turkey signed an association agreement on 12 May 2014 and became an associate member on 6 May 2015.
    Pakistan signed an association agreement on 19 December 2014 and became an associate member on 31 July 2015.
    Cyprus signed an association agreement on 5 October 2012 and became an associate Member in the pre-stage to membership on 1 April 2016.
    Ukraine signed an association agreement on 3 October 2013. The agreement was ratified on 5 October 2016.
    India signed an association agreement on 21 November 2016. The agreement was ratified on 16 January 2017.
    Slovenia was approved for admission as an Associate Member state in the pre-stage to membership on 16 December 2016. The agreement was ratified on 4 July 2017.
    Lithuania was approved for admission as an Associate Member state on 16 June 2017. The association agreement was signed on 27 June 2017 and ratified on 8 January 2018.
    Croatia was approved for admission as an Associate Member state on 28 February 2019. The agreement was ratified on 10 October 2019.
    Estonia was approved for admission as an Associate Member in the pre-stage to membership state on 19 June 2020. The agreement was ratified on 1 February 2021.

  • richardmitnick 4:04 pm on January 27, 2021 Permalink | Reply
    Tags: "Size of helium nucleus measured more precisely than ever before", , Helium is the second most abundant element in the universe., Helium=two protons and two neutrons., , , , Paul Scherrer Institute [Paul Scherrer Institut](CH), , Proton radius mystery is fading away, QCD: Quantum Chromodynamics, , Resonance frequency, Rydberg constant, Slow muons; complicated laser system,   

    From Paul Scherrer Institute [Paul Scherrer Institut](CH): “Size of helium nucleus measured more precisely than ever before” 

    From Paul Scherrer Institute [Paul Scherrer Institut](CH)

    27 January 2021

    Text: Barbara Vonarburg

    Dr. Aldo Antognini
    Labor für Teilchenphysik
    Paul Scherrer Institute, Forschungsstrasse 111, 5232 Villigen PSI (CH)
    Institute for Particle Physics and Astrophysics
    ETH Zürich, Otto-Stern-Weg 5, 8093 Zürich (CH)
    +41 56 310 46 14

    Dr. Franz Kottmann
    Labor für Teilchenphysik
    Paul Scherrer Institute, Forschungsstrasse 111, 5232 Villigen PSI (CH)
    Institute for Particle Physics and Astrophysics
    ETH Zürich, Otto-Stern-Weg 5, 8093 Zürich (CH)
    +41 79273 16 39

    Dr. Julian J. Krauth
    LaserLaB, Faculty of Sciences
    Quantum Metrology and Laser Applications
    Vrije Universiteit Amsterdam
    De Boelelaan 1081, 1081HV Amsterdam (NL)
    +31 20 5987438

    Prof. Dr. Randolf Pohl
    Institut für Physik
    Johannes Gutenberg Universität, 55128 Mainz (DE)
    +49 171 41 70 752

    In experiments at the Paul Scherrer Institute PSI, an international research collaboration has measured the radius of the atomic nucleus of helium five times more precisely than ever before. With the aid of the new value, fundamental physical theories can be tested and natural constants can be determined even more precisely. For their measurements, the researchers needed muons – these particles are similar to electrons but are around 200 times heavier. PSI is the only research site in the world where enough so-called low-energy muons are produced for such experiments. The researchers are publishing their results today in the journal Nature.

    Both Franz Kottmann (left) and Karsten Schuhmann did essential preparatory work for the crucial experiment. Credit: Paul Scherrer Institute/Markus Fischer.

    After hydrogen, helium is the second most abundant element in the universe. Around one-fourth of the atomic nuclei that formed in the first few minutes after the Big Bang were helium nuclei. These consist of four building blocks: two protons and two neutrons. For fundamental physics, it is crucial to know the properties of the helium nucleus, among other things to understand the processes in other atomic nuclei that are heavier than helium. “The helium nucleus is a very fundamental nucleus, which could be described as magical,” says Aldo Antognini, a physicist at PSI and ETH Zürich (CH). His colleague and co-author Randolf Pohl from Johannes Gutenberg University Mainz (DE) adds: “Our previous knowledge about the helium nucleus comes from experiments with electrons. At PSI, however, we have for the first time developed a new type of measurement method that allows much better accuracy.”

    With this, the international research collaboration succeeded in determining the size of the helium nucleus around five times more precisely than was possible in previous measurements. The group is publishing its results today in the renowned scientific journal Nature [above]. According to their findings, the so-called mean charge radius of the helium nucleus is 1.67824 femtometers (there are 1 quadrillion femtometers in 1 meter).

    “The idea behind our experiments is simple,” explains Antognini. Normally two negatively charged electrons orbit the positively charged helium nucleus. “We don’t work with normal atoms, but with exotic atoms in which both electrons have been replaced by a single muon,” says the physicist. The muon is considered to be the electron’s heavier brother; it resembles it, but it’s around 200 times heavier. A muon is much more strongly bound to the atomic nucleus than an electron and encircles it in much narrower orbits. Compared to electrons, a muon is much more likely to stay in the nucleus itself. “So with muonic helium, we can draw conclusions about the structure of the atomic nucleus and measure its properties,” Antognini explains.

    Slow muons, complicated laser system

    The muons are produced at PSI using a particle accelerator. The specialty of the facility: generating muons with low energy. These particles are slow and can be stopped in the apparatus for experiments. This is the only way researchers can form the exotic atoms in which a muon throws an electron out of its orbit and replaces it. Fast muons, in contrast, would fly right through the apparatus. The PSI system delivers more low-energy muons than all other comparable systems worldwide. “That is why the experiment with muonic helium can only be carried out here,” says Franz Kottmann, who for 40 years has been pressing ahead with the necessary preliminary studies and technical developments for this experiment.

    The muons hit a small chamber filled with helium gas. If the conditions are right, muonic helium is created, where the muon is in an energy state in which it often stays in the atomic nucleus. “Now the second important component for the experiment comes into play: the laser system,” Pohl explains. The complicated system shoots a laser pulse at the helium gas. If the laser light has the right frequency, it excites the muon and advances it to a higher energy state, in which its path is practically always outside the nucleus. When it falls from this to the ground state, it emits X-rays. Detectors register these X-ray signals.

    In the experiment, the laser frequency is varied until a large number of X-ray signals arrive. Physicists then speak of the so-called resonance frequency. With its help, then, the difference between the two energetic states of the muon in the atom can be determined. According to theory, the measured energy difference depends on how large the atomic nucleus is. Hence, using the theoretical equation, the radius can be determined from the measured resonance. This data analysis was carried out in Randolf Pohl’s group in Mainz (DE).
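    The scan logic can be sketched with a toy model: step the laser frequency, record X-ray counts (here generated from a synthetic Lorentzian line shape rather than real data), and take the frequency with the most counts as the resonance. All numbers are invented for illustration; this is not the PSI analysis:

```python
# Toy sketch of the resonance-scan logic: vary the laser frequency,
# "measure" counts from a synthetic Lorentzian resonance, and pick
# the frequency with the most counts. Values are purely illustrative.

def lorentzian(f: float, f0: float, width: float, amplitude: float) -> float:
    """Counts expected at laser frequency f for a resonance at f0."""
    return amplitude * (width / 2) ** 2 / ((f - f0) ** 2 + (width / 2) ** 2)

F0, WIDTH, AMP = 812.0, 0.3, 1000.0   # THz, THz, counts (made up)

# Scan 810-814 THz in 0.05 THz steps and record the counts at each point.
scan = [(f, lorentzian(f, F0, WIDTH, AMP))
        for f in [810.0 + 0.05 * i for i in range(81)]]
resonance = max(scan, key=lambda point: point[1])[0]
print(f"resonance found near {resonance:.2f} THz")
```

    In the real experiment the line shape is fitted rather than read off from the highest bin, and the measured resonance frequency is then converted to an energy difference and, via theory, to a nuclear radius.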

    Proton radius mystery is fading away

    The researchers at PSI had already measured the radius of the proton in the same way in 2010. At that time, their value did not match that obtained by other measurement methods. There was talk of a proton radius puzzle, and some speculated that new physics might lie behind it, in the form of a previously unknown interaction between the muon and the proton. This time there is no contradiction between the new, more precise value and the measurements made with other methods. “This makes the explanation of the results with physics beyond the standard model more improbable,” says Kottmann. In addition, in recent years the value of the proton radius determined by means of other methods has been approaching the precise number from PSI. “The proton radius puzzle still exists, but it is slowly fading away,” says Kottmann.

    “Our measurement can be used in different ways,” says Julian Krauth, first author of the study: “The radius of the helium nucleus is an important touchstone for nuclear physics.” Atomic nuclei are held together by the so-called strong interaction, one of the four fundamental forces in physics. With the theory of the strong interaction, known as quantum chromodynamics, physicists would like to be able to predict the radius of the helium nucleus and other light atomic nuclei with a few protons and neutrons. The extremely precisely measured value for the radius of the helium nucleus puts these predictions to the test. This also makes it possible to test new theoretical models of the nuclear structure and to understand atomic nuclei even better.

    The measurements on muonic helium can also be compared with experiments using normal helium atoms and ions. In experiments on these, too, energy transitions can be triggered and measured with laser systems – here, though, with electrons instead of muons. Measurements on electronic helium are under way right now. By comparing the results of the two measurements, one can draw conclusions about fundamental natural constants such as the Rydberg constant, which plays an important role in quantum mechanics.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The Paul Scherrer Institute [Paul Scherrer Institut](CH) is the largest research institute for natural and engineering sciences within Switzerland. We perform world-class research in three main subject areas: Matter and Material; Energy and the Environment; and Human Health. By conducting fundamental and applied research, we work on long-term solutions for major challenges facing society, industry and science.

    The Paul Scherrer Institute (PSI) is a multi-disciplinary research institute for natural and engineering sciences in Switzerland. It is located in the Canton of Aargau in the municipalities Villigen and Würenlingen on either side of the River Aare, and covers an area over 35 hectares in size. Like ETH Zurich and EPFL, PSI belongs to the Swiss Federal Institutes of Technology Domain of the Swiss Confederation. The PSI employs around 2100 people. It conducts basic and applied research in the fields of matter and materials, human health, and energy and the environment. About 37% of PSI’s research activities focus on material sciences, 24% on life sciences, 19% on general energy, 11% on nuclear energy and safety, and 9% on particle physics.

    PSI develops, builds and operates large and complex research facilities and makes them available to the national and international scientific communities. In 2017, for example, more than 2500 researchers from 60 different countries came to PSI to take advantage of the concentration of large-scale research facilities in the same location, which is unique worldwide. About 1900 experiments are conducted each year at the approximately 40 measuring stations in these facilities.

    In recent years, the institute has been one of the largest recipients of money from the Swiss lottery fund.

  • richardmitnick 11:00 am on May 8, 2020 Permalink | Reply
    Tags: "What Goes On in a Proton? Quark Math Still Conflicts With Experiments", A million-dollar math prize awaits anyone who can solve the type of equation used in QCD to show how massive entities like protons form., “We know absolutely that quarks and gluons interact with each other but we can’t calculate” the result., , , QCD: Quantum Chromodynamics, , , The discovery of quarks in the 1960s broke everything., The holographic principle   

    From Quanta Magazine: “What Goes On in a Proton? Quark Math Still Conflicts With Experiments” 

    From Quanta Magazine

    May 6, 2020
    Charlie Wood

    The quark structure of the proton. Credit: Arpad Horvath, 16 March 2006.

    Objects are made of atoms, and atoms are likewise the sum of their parts — electrons, protons and neutrons. Dive into one of those protons or neutrons, however, and things get weird. Three particles called quarks ricochet back and forth at nearly the speed of light, snapped back by interconnected strings of particles called gluons. Bizarrely, the proton’s mass must somehow arise from the energy of the stretchy gluon strings, since quarks weigh very little and gluons nothing at all.

    Physicists uncovered this odd quark-gluon picture in the 1960s and matched it to an equation in the ’70s, creating the theory of quantum chromodynamics (QCD). The problem is that while the theory seems accurate, it is extraordinarily complicated mathematically. Faced with a task like calculating how three wispy quarks produce the hulking proton, QCD simply fails to produce a meaningful answer.

    “It’s tantalizing and frustrating,” said Mark Lancaster, a particle physicist based at the University of Manchester in the United Kingdom. “We know absolutely that quarks and gluons interact with each other, but we can’t calculate” the result.

    A million-dollar math prize awaits anyone who can solve the type of equation used in QCD to show how massive entities like protons form. Lacking such a solution, particle physicists have developed arduous workarounds that deliver approximate answers. Some infer quark activity experimentally at particle colliders, while others harness the world’s most powerful supercomputers. But these approximation techniques have recently come into conflict, leaving physicists unsure exactly what their theory predicts and thus less able to interpret signs of new, unpredicted particles or effects.

    To understand what makes quarks and gluons such mathematical scofflaws, consider how much mathematical machinery goes into describing even well-behaved particles.

    A humble electron, for instance, can briefly emit and then absorb a photon. During that photon’s short life, it can split into a pair of matter-antimatter particles, each of which can engage in further acrobatics, ad infinitum. As long as each individual event ends quickly, quantum mechanics allows the combined flurry of “virtual” activity to continue indefinitely.

    In the 1940s, after considerable struggle, physicists developed mathematical rules that could accommodate this bizarre feature of nature. Studying an electron involved breaking down its virtual entourage into a series of possible events, each corresponding to a squiggly drawing known as a Feynman diagram and a matching equation. A perfect analysis of the electron would require an infinite string of diagrams — and a calculation with infinitely many steps — but fortunately for the physicists, the more byzantine sketches of rarer events ended up being relatively inconsequential. Truncating the series gives good-enough answers.

    The discovery of quarks in the 1960s broke everything. By pelting protons with electrons, researchers uncovered the proton’s internal parts, bound by a novel force. Physicists raced to find a description that could handle these new building blocks, and they managed to wrap all the details of quarks and the “strong interaction” that binds them into a compact equation in 1973. But their theory of the strong interaction, quantum chromodynamics, didn’t behave in the usual way, and neither did the particles.

    Feynman diagrams treat particles as if they interact by approaching each other from a distance, like billiard balls. But quarks don’t act like this. The Feynman diagram representing three quarks coming together from a distance and binding to one another to form a proton is a mere “cartoon,” according to Flip Tanedo, a particle physicist at the University of California, Riverside, because quarks are bound so strongly that they have no separate existence. The strength of their connection also means that the infinite series of terms corresponding to the Feynman diagrams grows in an unruly fashion, rather than fading away quickly enough to permit an easy approximation. Feynman diagrams are simply the wrong tool.
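The unruly growth of the series can be caricatured with a toy asymptotic expansion whose n-th term grows like n!·gⁿ — a common stand-in for the behavior of perturbative series, not the actual QCD expansion. For a small coupling the early terms shrink fast enough that truncation works; for a coupling near 1 they grow from the very first terms.

```python
from math import factorial

def terms(coupling, n_max=10):
    """Magnitudes of successive terms in a toy asymptotic series ~ n! * g**n."""
    return [factorial(n) * coupling**n for n in range(n_max)]

qed_like = terms(1 / 137)  # weak coupling: early terms shrink rapidly
qcd_like = terms(1.0)      # strong coupling: terms grow from the start

print(qed_like[1] > qed_like[2] > qed_like[3])  # True: safe to truncate
print(qcd_like[1] < qcd_like[2] < qcd_like[3])  # True: truncation fails
```

This is why chopping off the diagram series after a few terms is a fine approximation for electromagnetism but useless for quarks bound inside a proton.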

    The strong interaction is weird for two main reasons. First, whereas the electromagnetic interaction involves just one variety of charge (electric charge), the strong interaction involves three: “color” charges nicknamed red, green and blue. Weirder still, the carrier of the strong interaction, dubbed the gluon, itself bears color charge. So while the (electrically neutral) photons that make up electromagnetic fields don’t interact with each other, collections of colorful gluons draw together into strings. “That really drives the differences we see,” Lancaster said. The ability of gluons to trip over themselves, together with the three charges, makes the strong interaction strong — so strong that quarks can’t escape each other’s company.
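The effect of gluon self-interaction on the coupling's strength can be illustrated with the standard one-loop running formula for the strong coupling. The QCD scale Λ and the number of quark flavors used below are assumptions of this sketch, chosen as typical textbook values.

```python
from math import log, pi

def alpha_s(q_gev, n_flavors=3, lam=0.2):
    """One-loop running of the strong coupling:
    alpha_s(Q) = 12*pi / ((33 - 2*nf) * ln(Q**2 / Lambda**2)),
    with an assumed QCD scale Lambda ~ 0.2 GeV."""
    return 12 * pi / ((33 - 2 * n_flavors) * log(q_gev**2 / lam**2))

low = alpha_s(1.0)    # near the proton's own energy scale
high = alpha_s(91.2)  # at the Z-boson mass, where quarks act almost free

print(low > high)   # True: the coupling GROWS at low energies
print(round(low, 2))
```

The growth of the coupling toward low energies — the flip side of asymptotic freedom — is exactly what makes the proton-scale physics in this article incalculable by diagram-counting.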

    Evidence piled up over the decades that gluons exist and act as predicted in certain circumstances. But for most calculations, the QCD equation has proved intractable. Physicists need to know what QCD predicts, however — not just to understand quarks and gluons, but to pin down properties of other particles as well, since they’re all affected by the dance of quantum activity that includes virtual quarks.

    One approach has been to infer incalculable values by watching how quarks behave in experiments. “You take electrons and positrons and slam them together,” said Chris Polly, a particle physicist at the Fermi National Accelerator Laboratory, “and ask how often you make quark [products] in the final state.” From those measurements, he said, you can extrapolate how often quark bundles should pop up in the hubbub of virtual activity that surrounds all particles.

    Other researchers have continued to try to wring information from the canonical QCD equation by calculating approximate solutions using supercomputers. “You just keep throwing more computing cycles at it and your answer will keep getting better,” said Aaron Meyer, a particle physicist at Brookhaven National Laboratory.

    This computational approach, known as lattice QCD, turns computers into laboratories that model the behavior of digital quarks and gluons. The technique gets its name from the way it slices space-time into a grid of points. Quarks sit on the lattice points, and the QCD equation lets them interact. The denser the grid, the more accurate the simulation. The Fermilab physicist Andreas Kronfeld remembers how, three decades ago, these simulations had just a handful of lattice points on a side. But computing power has increased, and lattice QCD can now successfully predict the proton’s mass to within a few percent of the experimentally determined value.
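The "denser grid, more accurate simulation" statement can be shown in one-dimensional miniature — a toy finite-difference derivative on a grid of points, not lattice QCD itself.

```python
from math import sin, cos

def lattice_derivative(f, x, a):
    """Central finite difference on a grid of spacing a.
    The discretization error shrinks like a**2 as a -> 0."""
    return (f(x + a) - f(x - a)) / (2 * a)

exact = cos(1.0)  # d/dx sin(x) at x = 1
coarse = abs(lattice_derivative(sin, 1.0, 0.5) - exact)
fine = abs(lattice_derivative(sin, 1.0, 0.25) - exact)

print(fine < coarse)         # True: denser "lattice", better answer
print(round(coarse / fine))  # halving the spacing cuts the error ~4x
```

Real lattice QCD faces the same trade-off in four dimensions, which is why shrinking the lattice spacing (and taking the continuum limit) consumes so much supercomputer time.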

    Kronfeld is a spokesperson for USQCD, a federation of lattice QCD groups in the United States that have banded together to negotiate for bulk supercomputer time. He serves as the principal investigator for the federation’s efforts on the Summit supercomputer, currently the world’s fastest, located at Oak Ridge National Laboratory. USQCD runs one of Summit’s largest programs, occupying nearly 4% of the machine’s annual computing capacity.

    ORNL IBM AC922 SUMMIT supercomputer, No.1 on the TOP500. Credit: Carlos Jones, Oak Ridge National Laboratory/U.S. Dept. of Energy

    Theorists thought these digital laboratories were still a year or two away from becoming competitive with the collider experiments in approximating the effects quarks have on other particles. But in February a European collaboration shocked the community with a preprint claiming to nail a magnetic property of a particle called the muon to within 1% of its true value, using novel noise reduction techniques. “You might think of it as throwing down the gauntlet,” said Aida El-Khadra, a high-energy theorist at the University of Illinois, Urbana-Champaign.

    The team’s prediction for virtual quark activity around the muon clashed with the inferences from electron-positron collisions, however. Meyer, who recently co-authored a survey of the conflicting results, says that many technical details in lattice QCD remain poorly understood, such as how to hop from the gritty lattice back to smooth space. Efforts to determine what QCD predicts for the muon, which many researchers consider a bellwether for undiscovered particles, are ongoing.

    Meanwhile, mathematically minded researchers haven’t entirely despaired of finding a pen-and-paper strategy for tackling the strong interaction — and reaping the million-dollar reward offered by the Clay Mathematics Institute for a rigorous prediction of the mass of the lightest possible collection of quarks or gluons.

    One such Hail Mary pass in the theoretical world is a tool called the holographic principle. The general strategy is to translate the problem into an abstract mathematical space where some hologram of quarks can be separated from each other, allowing an analysis in terms of Feynman diagrams.

    Simple attempts look promising, according to Tanedo, but none come close to the hard-won accuracy of lattice QCD. For now, theorists will continue to refine their imperfect tools and dream of new mathematical machinery capable of taming the fundamental but inseparable quarks.

    “That would be the holy grail,” Tanedo says. QCD is “just begging for us to figure out how that actually works.”

    See the full article here.



    Formerly known as Simons Science News, Quanta Magazine is an editorially independent online publication launched by the Simons Foundation to enhance public understanding of science. Why Quanta? Albert Einstein called photons “quanta of light.” Our goal is to “illuminate science.” At Quanta Magazine, scientific accuracy is every bit as important as telling a good story. All of our articles are meticulously researched, reported, edited, copy-edited and fact-checked.

  • richardmitnick 8:54 am on May 5, 2020 Permalink | Reply
    Tags: "Three Birds with One Particle: The Possibilities of Axions", , , Matter-antimatter asymmetry, , , QCD: Quantum Chromodynamics   

    From particlebites: “Three Birds with One Particle: The Possibilities of Axions” 


    From particlebites

    May 1, 2020
    Amara McCune

    Title: “Axiogenesis”

    Author: Raymond T. Co and Keisuke Harigaya

    Reference: https://arxiv.org/pdf/1910.02080.pdf

    On the laundry list of problems in particle physics, a rare three-for-one solution could come in the form of a theorized light scalar particle fittingly named after a detergent: the axion. Frank Wilczek coined this term in reference to its potential to “clean up” the Standard Model once he realized its applicability to multiple unsolved mysteries. Although Axion the dish soap has been somewhat phased out of our everyday consumer life (being now primarily sold in Latin America), axion particles remain a key component of a physicist’s toolbox. While axions get a lot of hype as a promising Dark Matter candidate, and are now being considered as a solution to matter-antimatter asymmetry, they were originally proposed as a solution for a different Standard Model puzzle: the strong CP problem.

    The strong CP problem refers to a peculiarity of quantum chromodynamics (QCD), our theory of quarks, gluons, and the strong force that mediates them: while the theory permits charge-parity (CP) symmetry violation, the ardent experimental search for CP-violating processes in QCD has so far come up empty-handed. What does this mean from a physical standpoint? Consider the neutron electric dipole moment (eDM), which roughly describes the distribution of charge among the three quarks comprising a neutron. Naively, we might expect this arrangement to be triangular. However, measurements of the neutron eDM, carried out by tracking changes in neutron spin precession, return a value orders of magnitude smaller than classically expected. In fact, the incredibly small value of this parameter corresponds to a neutron in which the three quarks are found nearly in a line.

    The classical picture of the neutron (left) looks markedly different from the picture necessitated by CP symmetry (right). The strong CP problem is essentially a question of why our mental image should look like the right picture instead of the left. Source: https://arxiv.org/pdf/1812.02669.pdf
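To get a feel for the scale of such precession measurements, one can sketch the textbook relation h·Δν = 4·d·E between an electric dipole moment d and the shift Δν in spin-precession frequency when the electric field is flipped relative to the magnetic field. The EDM value used below is a rough order-of-magnitude stand-in for the current experimental bound, not a measured number.

```python
# Planck constant in eV*s
H_EV_S = 4.135667696e-15

def edm_frequency_shift(d_e_cm, e_field_v_cm):
    """Shift (Hz) in the spin-precession frequency when the electric field
    is flipped from parallel to antiparallel to the magnetic field:
    h * dnu = 4 * d * E, with d in e*cm and E in V/cm."""
    return 4 * d_e_cm * e_field_v_cm / H_EV_S

# A neutron EDM near the experimental bound (~1e-26 e*cm, order of magnitude)
# in a 10 kV/cm field shifts the precession frequency by only ~1e-7 Hz:
shift = edm_frequency_shift(1e-26, 1e4)
print(shift < 1e-6)  # True
```

A frequency shift this small is why neutron eDM experiments are among the most delicate precision measurements in physics.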

    This would not initially appear to be a problem. In fact, in the context of CP, this makes sense: a simultaneous charge conjugation (exchanging positive charges for negative ones and vice versa) and parity inversion (flipping the sign of spatial directions) when the quark arrangement is linear results in a symmetry. Yet there are a few subtleties that point to the existence of further physics. First, this tiny value requires an adjustment of parameters within the mathematics of QCD, carefully fitting some coefficients to cancel out others in order to arrive at the desired conclusion. Second, we do observe violation of CP symmetry in particle physics processes mediated by the weak interaction, such as kaon decay, which also involves quarks.

    These arguments rest upon the idea of naturalness, a principle that has been invoked successfully several times throughout the development of particle theory as a hint toward the existence of a deeper, more underlying theory. Naturalness (in one of its forms) states that such minuscule values are only allowed if they increase the overall symmetry of the theory, something that cannot be true if weak processes exhibit CP-violation where strong processes do not. This puts the strong CP problem squarely within the realm of “fine-tuning” problems in physics; although there is no known reason for CP symmetry conservation to occur, the theory must be modified to fit this observation. We then seek one of two things: either an observation of CP-violation in QCD or a solution that sets the neutron eDM, and by extension any CP-violating phase within our theory, to zero.

    The CP-violating term in the QCD Lagrangian has the form L_θ = θ (g_s²/32π²) G^a_{μν} G̃^{a,μν}, where G is the gluon field strength. Current measurements place the value of θ at no greater than 10^-10. In Peccei-Quinn symmetry, θ is promoted to a field.

    When such an expected symmetry violation is nowhere to be found, where is a theoretician to look for such a solution? The most straightforward answer is to turn to a new symmetry. This is exactly what Roberto Peccei and Helen Quinn did in 1977, birthing the Peccei-Quinn symmetry, an extension of QCD which incorporates a CP-violating phase known as the Θ term. The main idea behind this theory is to promote Θ to a dynamical field, rather than keeping it a constant. Since quantum fields have associated particles, this also yields the particle we dub the axion. Looking back briefly to the neutron eDM picture of the strong CP problem, this means that the angular separation should also be dynamical, and hence be relegated to the minimum energy configuration: the quarks again all in a straight line. In the language of symmetries, the U(1) Peccei-Quinn symmetry is approximately spontaneously broken, giving us a non-zero vacuum expectation value and a nearly-massless Goldstone boson: our axion.
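The dynamical relaxation of θ to its CP-conserving minimum can be caricatured with the standard axion-style potential V(θ) = Λ⁴(1 − cos θ). In this toy, overdamped evolution (all units set to 1, step size arbitrary) drives any initial misalignment angle toward θ = 0.

```python
from math import sin

# Toy relaxation of the theta field in the potential
# V(theta) = Lambda**4 * (1 - cos(theta)), with Lambda = 1.
# Overdamped evolution: d(theta)/dt = -dV/d(theta) = -sin(theta).
theta = 1.2  # arbitrary initial misalignment angle (radians)
dt = 0.1     # arbitrary step size
for _ in range(500):
    theta -= dt * sin(theta)

print(abs(theta) < 1e-3)  # True: theta settles at the CP-conserving minimum
```

The real mechanism is a field theory rather than a single rolling variable, but the punchline is the same: once θ is dynamical, the minimum-energy configuration is the one with no CP violation.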

    This is all great, but what does it have to do with dark matter? As it turns out, axions make for an especially intriguing dark matter candidate due to their low mass and potential to be produced in large quantities. For decades, this prowess was overshadowed by the leading WIMP candidate (weakly-interacting massive particles), whose parameter space has been slowly whittled down to the point where physicists are more seriously turning to alternatives. As there are several production-mechanisms in early universe cosmology for axions, and 100% of dark matter abundance could be explained through this generation, the axion is now stepping into the spotlight.

    This increased focus is causing some theorists to turn to further avenues of physics as possible applications for the axion. In a recent paper, Co and Harigaya examined the connection between this versatile particle and matter-antimatter asymmetry (also called baryon asymmetry). This latter term refers to the simple observation that there appears to be more matter than antimatter in our universe, since we are predominantly composed of matter, yet matter and antimatter also seem to be produced in colliders in equal proportions. In order to explain this asymmetry, without which matter and antimatter would have annihilated and we would not exist, physicists look for any mechanism to trigger an imbalance in these two quantities in the early universe. This theorized process is known as baryogenesis.

    Here’s where the axion might play a part. The θ term, which settles to zero in its possible solution to the strong CP problem, could also have taken on any value from 0 to 360 degrees very early on in the universe. Analyzing the axion field through the conjectures of quantum gravity, if there are no global symmetries then the initial axion potential cannot be symmetric [4]. By falling from some initial value through an uneven potential, which the authors describe as a wine bottle potential with a wiggly top, θ would cycle several times through the allowed values before settling at its minimum energy value of zero. This causes the axion field to rotate, an asymmetry which could generate a disproportionality between the amounts of produced matter and antimatter. If the field were to rotate in one direction, we would see more matter than antimatter, while a rotation in the opposite direction would result instead in excess antimatter.

    The team’s findings can be summarized in the plot above. Regions in purple, red, and above the orange lines (dependent upon a particular constant X which is proportional to weak scale quantities) signify excluded portions of the parameter space. The remaining white space shows values of the axion decay constant and mass where the currently measured amount of baryon asymmetry could be generated. Source: https://arxiv.org/pdf/1910.02080.pdf

    Introducing a third fundamental mystery into the realm of axions raises the question of whether all three problems (strong CP, dark matter, and matter-antimatter asymmetry) can be solved simultaneously with axions. And, of course, there are nuances that could make alternative solutions to the strong CP problem more favorable or other dark matter candidates more likely. Like most theorized particles, the axion comes in several formulations. It is then necessary to turn our attention to experiment to narrow down the possibilities for how axions could interact with other particles, determine what their mass could be, and answer the all-important question of whether they exist at all. Consequently, there are a plethora of axion-focused experiments up and running, with more on the horizon, that use a variety of methods spanning several subfields of physics. While these results begin to roll in, we can continue to investigate just how many problems we might be able to solve with one adaptable, soapy particle.

    Learn More:

    A comprehensive introduction to the strong CP problem, the axion solution, and other potential solutions: https://arxiv.org/pdf/1812.02669.pdf
    Axions as a dark matter candidate: https://www.symmetrymagazine.org/article/the-other-dark-matter-candidate
    More information on matter-antimatter asymmetry and baryogenesis: https://www.quantumdiaries.org/2015/02/04/where-do-i-come-from/
    The quantum gravity conjectures that axiogenesis builds upon: https://arxiv.org/abs/1810.05338
    An overview of current axion-focused experiments: https://www.annualreviews.org/doi/full/10.1146/annurev-nucl-102014-022120

    See the full article here.



    What is ParticleBites?
    ParticleBites is an online particle physics journal club written by graduate students and postdocs. Each post presents an interesting paper in a brief format that is accessible to undergraduate students in the physical sciences who are interested in active research.

    The papers are accessible on the arXiv preprint server. Most of our posts are based on papers from hep-ph (high energy phenomenology) and hep-ex (high energy experiment).

    Why read ParticleBites?

    Reading a technical paper from an unfamiliar subfield is intimidating. It may not be obvious how the techniques used by the researchers really work or what role the new research plays in answering the bigger questions motivating that field, not to mention the obscure jargon! For most people, it takes years for scientific papers to become meaningful.

    Our goal is to solve this problem, one paper at a time. With each brief ParticleBite, you should not only learn about one interesting piece of current work, but also get a peek at the broader picture of research in particle physics.

    Who writes ParticleBites?

    ParticleBites is written and edited by graduate students and postdocs working in high energy physics. Feel free to contact us if you’re interested in applying to write for ParticleBites.

    ParticleBites was founded in 2013 by Flip Tanedo following the Communicating Science (ComSciCon) 2013 workshop.

    Flip Tanedo, UCI Chancellor’s ADVANCE postdoctoral scholar in theoretical physics and, as of July 2016, assistant professor of physics at the University of California, Riverside.

    It is now organized and directed by Flip and Julia Gonski, with ongoing guidance from Nathan Sanders.

  • richardmitnick 10:54 am on July 27, 2019 Permalink | Reply
    Tags: "Ask Ethan: Can We Really Get A Universe From Nothing?", , , , , Because dark energy is a property of space itself when the Universe expands the dark energy density must remain constant., , , , , Galaxies that are gravitationally bound will merge together into groups and clusters while the unbound groups and clusters will accelerate away from one another., , , Negative gravity?, QCD: Quantum Chromodynamics,   

    From Ethan Siegel: “Ask Ethan: Can We Really Get A Universe From Nothing?” 

    From Ethan Siegel
    July 27, 2019

    Our entire cosmic history is theoretically well-understood in terms of the frameworks and rules that govern it. It’s only by observationally confirming and revealing various stages in our Universe’s past that must have occurred, like when the first stars and galaxies formed, and how the Universe expanded over time, that we can truly come to understand what makes up our Universe and how it expands and gravitates in a quantitative fashion. The relic signatures imprinted on our Universe from an inflationary state before the hot Big Bang give us a unique way to test our cosmic history, subject to the same fundamental limitations that all frameworks possess. (NICOLE RAGER FULLER / NATIONAL SCIENCE FOUNDATION)

    And does it require the idea of ‘negative gravity’ in order to work?

    The biggest question that we’re even capable of asking, with our present knowledge and understanding of the Universe, is where did everything we can observe come from? If it came from some sort of pre-existing state, we’ll want to know exactly what that state was like and how our Universe came from it. If it emerged out of nothingness, we’d want to know how we went from nothing to the entire Universe, and what if anything caused it. At least, that’s what our Patreon supporter Charles Buchanan wants to know, asking:

    “One concept bothers me. Perhaps you can help. I see it used in many places, but never really explained: “A universe from Nothing” and the concept of negative gravity. As I learned my Newtonian physics, you could put the zero point of the gravitational potential anywhere; only differences mattered. However, Newtonian physics never deals with situations where matter is created… Can you help solidify this for me, preferably on [a] conceptual level, maybe with a little calculation detail?”

    Gravitation might seem like a straightforward force, but an incredible number of aspects are anything but intuitive. Let’s take a deeper look.

    Countless scientific tests of Einstein’s general theory of relativity have been performed, subjecting the idea to some of the most stringent constraints ever obtained by humanity. Einstein’s first solution was for the weak-field limit around a single mass, like the Sun; he applied these results to our Solar System with dramatic success. We can view this orbit as Earth (or any planet) being in free-fall around the Sun, traveling in a straight-line path in its own frame of reference. All masses and all sources of energy contribute to the curvature of spacetime. (LIGO SCIENTIFIC COLLABORATION / T. PYLE / CALTECH / MIT)

    Image gallery: the Caltech/MIT Advanced LIGO detectors at Hanford, WA and Livingston, LA; the Virgo gravitational-wave interferometer near Pisa, Italy; the Cornell SXS (Simulating eXtreme Spacetimes) project; the future ESA/eLISA gravitational-wave mission; and sky localizations of the gravitational-wave signals detected by LIGO and the LIGO-Virgo network. (Credit: Giuseppe Greco, Virgo Urbino group.)

    If you have two point masses located some distance apart in your Universe, they’ll experience an attractive force that compels them to gravitate towards one another. But this attractive force that you perceive, in the context of relativity, comes with two caveats.

    The first caveat is simple and straightforward: these two masses will experience an acceleration towards one another, but whether they wind up moving closer to one another or not is entirely dependent on how the space between them evolves. Unlike in Newtonian gravity, where space is a fixed quantity and only the masses within that space can evolve, everything is changeable in General Relativity. Not only do matter and energy move and accelerate due to gravitation, but the very fabric of space itself can expand, contract, or otherwise flow. All masses still move through space, but space itself is no longer stationary.

    The ‘raisin bread’ model of the expanding Universe, where relative distances increase as the space (dough) expands. The farther away any two raisins are from one another, the greater the observed redshift will be by the time the light is received. The redshift-distance relation predicted by the expanding Universe is borne out in observations, and has been consistent with what’s been known going all the way back to the 1920s. (NASA / WMAP SCIENCE TEAM)

    NASA/WMAP 2001 to 2010

    The second caveat is that the two masses you’re considering, even if you’re extremely careful about accounting for what’s in your Universe, are most likely not the only forms of energy around. There are bound to be other masses in the form of normal matter, dark matter, and neutrinos. There’s the presence of radiation, from both electromagnetic and gravitational waves. There’s even dark energy: a type of energy inherent to the fabric of space itself.

    Now, here’s a scenario that might exemplify where your intuition leads you astray: what happens if these masses, for the volume they occupy, have less total energy than the average energy density of the surrounding space?

    The gravitational attraction (blue) of overdense regions and the relative repulsion (red) of the underdense regions, as they act on the Milky Way. Even though gravity is always attractive, there is an average amount of attraction throughout the Universe, and regions with lower energy densities than that will experience (and cause) an effective repulsion with respect to the average. (YEHUDA HOFFMAN, DANIEL POMARÈDE, R. BRENT TULLY, AND HÉLÈNE COURTOIS, NATURE ASTRONOMY 1, 0036 (2017))

    You can imagine three different scenarios:

    1. The first mass has a below-average energy density while the second has an above-average value.
    2. The first mass has an above-average energy density while the second has a below-average value.
    3. Both the first and second masses have a below-average energy density compared to the rest of space.

    In the first two scenarios, the above-average mass will begin growing as it pulls on the matter/energy all around it, while the below-average mass will start shrinking, as it’s less able to hold onto its own mass in the face of its surroundings. These two masses will effectively repel one another; even though gravitation is always attractive, the intervening matter is preferentially attracted to the heavier-than-average mass. This causes the lower-mass object to act like it’s both repelling and being repelled by the heavier-mass object, the same way a balloon held underwater will still be attracted to Earth’s center, but will be forced away from it owing to the (buoyant) effects of the water.

    The Earth’s crust is thinnest over the ocean and thickest over mountains and plateaus, as the principle of buoyancy dictates and as gravitational experiments confirm. Just as a balloon submerged in water will accelerate away from the center of the Earth, a region with below-average energy density will accelerate away from an overdense region, as average-density regions will be more preferentially attracted to the overdense region than the underdense region will. (USGS)

    So what’s going to happen if you have two regions of space with below-average densities, surrounded by regions of just average density? They’ll both shrink, giving up their remaining matter to the denser regions around them. But as far as motions go, they’ll accelerate towards one another, with exactly the same magnitude they’d accelerate at if they were both overdense regions that exceeded the average density by equivalent amounts.
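    The balloon analogy can be made quantitative with a minimal sketch (my own illustration, not from the article): in linearized gravity, the peculiar acceleration a test mass feels from a region depends only on that region's mass excess over the cosmic average. The function name and unit choices below are hypothetical.

    ```python
    # Minimal sketch: the peculiar (relative-to-average) gravitational pull of a
    # region is set by its mass *excess* over the cosmic mean density.

    def peculiar_acceleration(rho, rho_bar, volume, r, G=1.0):
        """Newtonian peculiar acceleration at distance r from a region of density
        rho embedded in a background of average density rho_bar (arbitrary units).
        Positive = net attraction toward the region; negative = effective repulsion."""
        delta_m = (rho - rho_bar) * volume  # mass excess relative to the average
        return G * delta_m / r**2

    rho_bar = 1.0
    over = peculiar_acceleration(1.5, rho_bar, volume=1.0, r=10.0)   # overdense region
    under = peculiar_acceleration(0.5, rho_bar, volume=1.0, r=10.0)  # underdense region

    assert over > 0               # overdensities attract
    assert under < 0              # underdensities effectively repel
    assert abs(over) == abs(under)  # equal contrasts give equal magnitudes
    ```

    An overdense region (positive excess) attracts; an underdense one (negative excess) effectively repels, with the same magnitude for equal and opposite contrasts, just as the buoyancy picture suggests.
    
    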

    You might be wondering why it’s important to think about these concerns when talking about a Universe from nothing. After all, if your Universe is full of matter and energy, it’s pretty hard to understand how that’s relevant to making sense of the concept of something coming from nothing. But just as our intuition can lead us astray when thinking about matter and energy on the spacetime playing field of General Relativity, it’s a comparable situation when we think about nothingness.

    A representation of flat, empty space with no matter, energy or curvature of any type. With the exception of small quantum fluctuations, space in an inflationary Universe becomes incredibly flat like this, except in a 3D grid rather than a 2D sheet. Space is stretched flat, and particles are rapidly driven away. (AMBER STUVER / LIVING LIGO)

    You very likely think about nothingness as a philosopher would: the complete absence of everything. Zero matter, zero energy, a value of exactly zero for every quantum field in the Universe, and so on. You think of space that’s completely flat, with nothing around to cause its curvature anywhere.

    If you think this way, you’re not alone: there are many different ways to conceive of “nothing.” You might even be tempted to take away space, time, and the laws of physics themselves, too. The problem, if you start doing that, is that you lose your ability to predict anything at all. The type of nothingness you’re thinking about, in this context, is what we call unphysical.

    If we want to think about nothingness in a physical sense, however, we have to keep certain things. We need spacetime and the laws of physics, for example; a Universe cannot exist without them.

    A visualization of QCD illustrates how particle/antiparticle pairs pop out of the quantum vacuum for very small amounts of time as a consequence of Heisenberg uncertainty.

    The quantum vacuum is interesting because it demands that empty space itself isn’t so empty, but is filled with all the particles, antiparticles and fields in various states that are demanded by the quantum field theory that describes our Universe. Put this all together, and you find that empty space has a zero-point energy that’s actually greater than zero. (DEREK B. LEINWEBER)

    But here’s the kicker: if you have spacetime and the laws of physics, then by definition you have quantum fields permeating the Universe everywhere you go. You have a fundamental “jitter” to the energy inherent to space, due to the quantum nature of the Universe and the unavoidable Heisenberg uncertainty principle.

    Put these ingredients together — because you can’t have a physically sensible “nothing” without them — and you’ll find that space itself doesn’t have zero energy inherent to it, but energy with a finite, non-zero value. Just as there’s a finite zero-point energy (that’s greater than zero) for an electron bound to an atom, the same is true for space itself. Empty space, even with zero curvature, even devoid of particles and external fields, still has a finite energy density to it.
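    The claim that the lowest allowed energy is greater than zero is most familiar from the quantum harmonic oscillator; a quantum field behaves like one such oscillator for every mode of vibration. Schematically (a standard textbook result, not spelled out in the article):

    ```latex
    E_n = \hbar\omega\left(n + \tfrac{1}{2}\right), \qquad
    E_0 = \tfrac{1}{2}\hbar\omega > 0,
    \qquad\Longrightarrow\qquad
    \rho_{\rm vac} \sim \sum_{\mathbf{k}} \tfrac{1}{2}\hbar\omega_{\mathbf{k}} .
    ```

    The last expression is only schematic: the unregularized sum over modes diverges, which is the famous cosmological-constant problem in a nutshell.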

    The four possible fates of the Universe with only matter, radiation, curvature and a cosmological constant allowed. The top three possibilities are for a Universe whose fate is determined by the balance of matter/radiation with spatial curvature alone; the bottom one includes dark energy. Only the bottom “fate” aligns with the evidence. (E. SIEGEL / BEYOND THE GALAXY)

    From the perspective of quantum field theory, this is conceptualized as the zero-point energy of the quantum vacuum: the lowest-energy state of empty space. In the framework of General Relativity, however, it appears in a different sense: as the value of a cosmological constant, which itself is the energy of empty space, independent of curvature or any other form of energy density.

    Although we do not know how to calculate the value of this energy density from first principles, we can calculate the effects it has on the expanding Universe. As your Universe expands, every form of energy that exists within it contributes to not only how your Universe expands, but how that expansion rate changes over time. From multiple independent lines of evidence — including the Universe’s large-scale structure, the cosmic microwave background, and distant supernovae — we have been able to determine how much energy is inherent to space itself.
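    The statement that every form of energy feeds into the expansion rate is captured by the first Friedmann equation of General Relativity, in which each energy density on the right-hand side contributes to the expansion rate H on the left:

    ```latex
    H^2 \equiv \left(\frac{\dot a}{a}\right)^2
    = \frac{8\pi G}{3}\left(\rho_{\rm matter} + \rho_{\rm radiation}\right)
    - \frac{kc^2}{a^2} + \frac{\Lambda c^2}{3},
    ```

    where a is the scale factor, k the spatial curvature, and the cosmological-constant term Λc²/3 is the dark-energy contribution that stays fixed as the Universe expands.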

    Constraints on dark energy from three independent sources: supernovae, the CMB (cosmic microwave background) and BAO (which is a wiggly feature seen in the correlations of large-scale structure). Note that even without supernovae, we’d need dark energy for certain, and also that there are uncertainties and degeneracies between the amount of dark matter and dark energy that we’d need to accurately describe our Universe. (SUPERNOVA COSMOLOGY PROJECT, AMANULLAH, ET AL., AP.J. (2010))

    This form of energy is what we presently call dark energy, and it’s responsible for the observed accelerated expansion of the Universe. Although it’s been a part of our conceptions of reality for more than two decades now, we don’t fully understand its true nature. All we can say is that when we measure the expansion rate of the Universe, our observations are consistent with dark energy being a cosmological constant with a specific magnitude, and not with any of the alternatives that evolve significantly over cosmic time.

    Because dark energy causes distant galaxies to appear to recede from one another more and more quickly as time goes on — since the space between those galaxies is expanding — it’s often called negative gravity. This is not only highly informal, but incorrect. Gravity is only positive, never negative. But even positive gravity, as we saw earlier, can have effects that look very much like negative repulsion.

    Dark Energy Survey

    Dark Energy Camera [DECam], built at FNAL

    NOAO/CTIO Victor M. Blanco 4m Telescope, which houses DECam, at Cerro Tololo, Chile, at an altitude of 7,200 feet.

    Timeline of the Inflationary Universe WMAP

    The Dark Energy Survey (DES) is an international, collaborative effort to map hundreds of millions of galaxies, detect thousands of supernovae, and find patterns of cosmic structure that will reveal the nature of the mysterious dark energy that is accelerating the expansion of our Universe. DES began searching the Southern skies on August 31, 2013.

    According to Einstein’s theory of General Relativity, gravity should lead to a slowing of the cosmic expansion. Yet, in 1998, two teams of astronomers studying distant supernovae made the remarkable discovery that the expansion of the universe is speeding up. To explain cosmic acceleration, cosmologists are faced with two possibilities: either 70% of the universe exists in an exotic form, now called dark energy, that exhibits a gravitational force opposite to the attractive gravity of ordinary matter, or General Relativity must be replaced by a new theory of gravity on cosmic scales.

    DES is designed to probe the origin of the accelerating universe and help uncover the nature of dark energy by measuring the 14-billion-year history of cosmic expansion with high precision. More than 400 scientists from over 25 institutions in the United States, Spain, the United Kingdom, Brazil, Germany, Switzerland, and Australia are working on the project. The collaboration built and is using an extremely sensitive 570-Megapixel digital camera, DECam, mounted on the Blanco 4-meter telescope at Cerro Tololo Inter-American Observatory, high in the Chilean Andes, to carry out the project.

    Over six years (2013-2019), the DES collaboration used 758 nights of observation to carry out a deep, wide-area survey recording information from 300 million galaxies that are billions of light-years from Earth. The survey imaged 5,000 square degrees of the southern sky in five optical filters to obtain detailed information about each galaxy. A fraction of the survey time was used to observe smaller patches of sky roughly once a week to discover and study thousands of supernovae and other astrophysical transients.

    How energy density changes over time in a Universe dominated by matter (top), radiation (middle), and a cosmological constant (bottom). Note that dark energy doesn’t change in density as the Universe expands, which is why it comes to dominate the Universe at late times. (E. SIEGEL)

    If there were greater amounts of dark energy present within our spatially flat Universe, the expansion rate would be greater. But this is true for all forms of energy in a spatially flat Universe: dark energy is no exception. The only difference between dark energy and the more commonly encountered forms of energy, like matter and radiation, is that as the Universe expands, the densities of matter and radiation decrease.

    But because dark energy is a property of space itself, when the Universe expands, the dark energy density must remain constant. As time goes on, galaxies that are gravitationally bound will merge together into groups and clusters, while the unbound groups and clusters will accelerate away from one another. That’s the ultimate fate of the Universe if dark energy is real.
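    The scaling behavior described above (and pictured in the caption two paragraphs up) is standard cosmology, and easy to sketch numerically. The normalizations below are my own rough illustration of our Universe (roughly 70% dark energy, 30% matter, with radiation tiny today), not figures from the article:

    ```python
    # How the energy densities of matter, radiation, and a cosmological constant
    # scale with the dimensionless scale factor a (a = 1 today).

    def densities(a, rho_m0=1.0, rho_r0=1e-4, rho_de0=2.3):
        """Energy densities at scale factor a, in units of today's matter density.
        The relative normalizations are illustrative, loosely mimicking our Universe."""
        return {
            "matter": rho_m0 / a**3,     # diluted by volume
            "radiation": rho_r0 / a**4,  # volume dilution plus redshift
            "dark_energy": rho_de0,      # a property of space itself: constant
        }

    early = densities(1e-5)  # deep in the past: radiation dominates
    late = densities(1e3)    # far in the future: dark energy dominates

    assert early["radiation"] > early["matter"] > early["dark_energy"]
    assert late["dark_energy"] > late["matter"] > late["radiation"]
    ```

    Because matter and radiation dilute away while the dark energy density holds constant, dark energy inevitably comes to dominate at late times, which is exactly why it sets the Universe's ultimate fate.
    
    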

    The Laniakea supercluster of galaxies. From Nature: R. Brent Tully, Hélène Courtois, Yehuda Hoffman & Daniel Pomarède, http://www.nature.com/nature/journal/v513/n7516/full/nature13674.html. The Milky Way is the red dot.

    So why do we say we have a Universe that came from nothing? Because the value of dark energy may have been much higher in the distant past: before the hot Big Bang. A Universe with a very large amount of dark energy in it will behave identically to a Universe undergoing cosmic inflation. In order for inflation to end, that energy has to get converted into matter and radiation. The evidence strongly points to that happening some 13.8 billion years ago.

    When it did, though, a small amount of dark energy remained behind. Why? Because the zero-point energy of the quantum fields in our Universe isn’t zero, but a finite, greater-than-zero value. Our intuition may not be reliable when we consider the physical concepts of nothing and negative/positive gravity, but that’s why we have science. When we do it right, we wind up with physical theories that accurately describe the Universe we measure and observe.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    “Starts With A Bang! is a blog/video blog about cosmology, physics, astronomy, and anything else I find interesting enough to write about. I am a firm believer that the highest good in life is learning, and the greatest evil is willful ignorance. The goal of everything on this site is to help inform you about our world, how we came to be here, and to understand how it all works. As I write these pages for you, I hope to not only explain to you what we know, think, and believe, but how we know it, and why we draw the conclusions we do. It is my hope that you find this interesting, informative, and accessible,” says Ethan

  • richardmitnick 4:47 pm on September 24, 2018 Permalink | Reply
    Tags: A New Single-Photon Sensor for Quantum Imaging, , Berkeley Quantum, Figuring out how to extend the search for dark matter particles, From Quantum Gravity to Quantum Technology, , , News Center A Quantum Leap Toward Expanding the Search for Dark Matter, , QCD: Quantum Chromodynamics, U.S. Department of Energy’s Office of High Energy Physics, University of Massachusetts Amherst,   

    From Lawrence Berkeley National Lab: “A Quantum Leap Toward Expanding the Search for Dark Matter” 

    Berkeley Logo

    From Lawrence Berkeley National Lab

    September 24, 2018
    Glenn Roberts Jr.
    (510) 486-5582

    A visualization of a massive galaxy cluster that shows dark matter density (purple filaments) overlaid with the gas velocity field. (Credit: Illustris Collaboration)

    Figuring out how to extend the search for dark matter particles – dark matter describes the stuff that makes up an estimated 85 percent of the total mass of the universe yet so far has only been measured by its gravitational effects – is a bit like building a better mousetrap…that is, a mousetrap for a mouse you’ve never seen, will never see directly, may be joined by an odd assortment of other mice, or may not be a mouse after all.

    Now, through a new research program supported by the U.S. Department of Energy’s Office of High Energy Physics (HEP), a consortium of researchers from the DOE’s Lawrence Berkeley National Laboratory (Berkeley Lab), UC Berkeley, and the University of Massachusetts Amherst will develop sensors that enlist the seemingly weird properties of quantum physics to probe for dark matter particles in new ways, with increased sensitivity, and in uncharted regions. Maurice Garcia-Sciveres, a Berkeley Lab physicist, is leading this Quantum Sensors HEP-Quantum Information Science (QIS) Consortium.

    Quantum technologies are emerging as promising alternatives to the more conventional “mousetraps” that researchers have previously used to track down elusive particles. And the DOE, through the same HEP office, is also supporting a collection of other research efforts led by Berkeley Lab scientists that tap into quantum theory, properties, and technologies in the QIS field.

    These efforts include:

    Unraveling the Quantum Structure of Quantum Chromodynamics in Parton Shower Monte Carlo Generators – This effort will develop computer programs that test the interactions between fundamental particles in extreme detail. Current computer simulations are limited by classical algorithms, though quantum algorithms could more accurately model these interactions and could provide a better way to compare with and understand particle events measured at CERN’s Large Hadron Collider, the world’s most powerful particle collider. Berkeley Lab’s Christian Bauer, a senior research scientist, will lead this effort.
    Quantum Pattern Recognition (QPR) for High-Energy Physics – Increasingly powerful particle accelerators require vastly faster computer algorithms to monitor and sort through billions of particle events per second, and this effort will develop and study the potential of quantum-based algorithms for pattern recognition to reconstruct charged particles. Such algorithms have the potential for significant speed improvements and increased precision. Led by Berkeley Lab physicist and Divisional Fellow Heather Gray, this effort will involve high-energy physics and high-performance computing expertise in Berkeley Lab’s Physics Division and at the Lab’s National Energy Research Scientific Computing Center, a DOE Office of Science User Facility, and also at UC Berkeley.
    Skipper-CCD, a New Single-Photon Sensor for Quantum Imaging – For the past six years, Berkeley Lab and Fermi National Accelerator Laboratory (Fermilab) have been collaborating in the development of a detector for astrophysics experiments that can detect the smallest individual unit of light, known as a photon. This Skipper-CCD detector was successfully demonstrated in the summer of 2017 with an incredibly low noise that allowed the detection of even individual electrons. As a next step, this Fermilab-led effort will seek to image pairs of photons that exist in a state of quantum entanglement, meaning their properties are inherently related – even over long distances – such that the measurement of one of the particles necessarily defines the properties of the other. Steve Holland, a senior scientist and engineer at Berkeley Lab who is a pioneer in the development of high-performance silicon detectors for a range of uses, is leading Berkeley Lab’s participation in this project.
    Geometry and Flow of Quantum Information: From Quantum Gravity to Quantum Technology – This effort will develop quantum algorithms and simulations for properties, including error correction and information scrambling, that are relevant to black hole theories and to quantum computing involving highly connected arrays of superconducting qubits – the basic units of a quantum computer. Researchers will also compare these with more classical methods. UC Berkeley is heading up this research program, and Irfan Siddiqi, a scientist in Berkeley Lab’s Materials Sciences Division and founding director of the Center for Quantum Coherent Science at UC Berkeley, is leading Berkeley Lab’s involvement.
    Siddiqi is also leading a separate research program, Field Programmable Gate Array-based Quantum Control for High-Energy Physics Simulations with Qutrits, that will develop specialized tools and logic families for high-energy-physics-focused quantum computing. This effort involves Berkeley Lab’s Accelerator Technology and Applied Physics Division.
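    The "incredibly low noise" of the Skipper-CCD mentioned above comes from sampling each pixel's charge non-destructively many times: averaging N independent reads shrinks Gaussian read noise by a factor of the square root of N. The toy model below is my own illustration with hypothetical numbers, not Skipper specifications:

    ```python
    import random

    def skipper_read(true_electrons, n_samples, single_read_noise=3.0, seed=42):
        """Toy model of Skipper-CCD readout: average n_samples non-destructive
        reads of the same pixel charge, each with Gaussian read noise.
        Averaging N independent reads cuts the noise by a factor of sqrt(N)."""
        rng = random.Random(seed)
        samples = [true_electrons + rng.gauss(0.0, single_read_noise)
                   for _ in range(n_samples)]
        return sum(samples) / n_samples

    # One read: a 3-electron noise floor swamps a 1-electron signal.
    # 4000 reads: the noise falls to roughly 3/sqrt(4000), about 0.05 electrons,
    # so individual electrons become countable.
    estimate = skipper_read(true_electrons=1.0, n_samples=4000)
    assert abs(estimate - 1.0) < 0.5  # resolves a single electron
    ```

    This square-root-of-N averaging is the general statistical principle behind repeated non-destructive measurement; the actual Skipper-CCD electronics involve considerably more engineering than this sketch.
    
    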

    These projects are also part of Berkeley Quantum, a partnership that harnesses the expertise and facilities of Berkeley Lab and UC Berkeley to advance U.S. quantum capabilities by conducting basic research, fabricating and testing quantum-based devices and technologies, and educating the next generation of researchers.

    Also, across several of its offices, the DOE has announced support for a wave of other R&D efforts (see a related news release) that will foster collaborative innovation in quantum information science at Berkeley Lab, at other national labs, and at partner institutions.

    At Berkeley Lab, the largest HEP-funded QIS-related undertaking will bring together a multidisciplinary team to develop and demonstrate quantum sensors that search for very-low-mass dark matter particles – so-called “light dark matter” – by instrumenting two different detectors.

    One of these detectors will use liquid helium at a very low temperature where otherwise familiar phenomena such as heat and thermal conductivity display quantum behavior. The other detector will use specially fabricated crystals of gallium arsenide (see a related article), also chilled to cryogenic temperatures. The ideas for how these experiments can search for very light dark matter sprang from theory work at Berkeley Lab.

    “There’s a lot of unexplored territory in low-mass dark matter,” said Natalie Roe, director of the Physics Division at Berkeley Lab and the principal investigator for the Lab’s HEP-related quantum efforts. “We have all the pieces to pull this together: in theory, experiments, and detectors.”

    This image of the Andromeda Galaxy, taken from a 1970 study by astronomers Vera Rubin and W. Kent Ford Jr., shows points (dots) that were tracked at different distances from the galaxy center. The selected points unexpectedly were found to rotate at a similar rate, which provides evidence for the existence of dark matter. (Credit: Vera Rubin, W. Kent Ford Jr.)

    Garcia-Sciveres, who is leading the effort in applying quantum sensors to the low-mass dark matter search, noted that other major efforts – such as the Berkeley Lab-led LUX-ZEPLIN (LZ) experiment that is taking shape in South Dakota – will help root out whether dark matter particles known as WIMPs (weakly interacting massive particles) exist with masses comparable to that of atoms. But LZ and similar experiments are not designed to detect dark matter particles of much lower masses.

    LBNL Lux Zeplin project at SURF

    “The traditional WIMP dark matter experiments haven’t found anything yet,” he said. “And there is a lot of theoretical work on models that favor particles of a lower mass than experiments like LZ can measure,” he added. “This has motivated people to really look hard at how you can detect very-low-mass particles. It’s not so easy. It’s a very small signal that has to be detected without any background noise.”

    Researchers hope to develop quantum sensors that are better at filtering out the noise of unwanted signals. While a traditional WIMP experiment is designed to sense the recoil of an entire atomic nucleus after it is “kicked” by a dark matter particle, very-low-mass dark matter particles will bounce right off nuclei without affecting them, like a flea bouncing off an elephant.

    The goal of the new effort is to sense the low-mass particles via their energy transfer in the form of very feeble quantum vibrations, which go by names like “phonons” or “rotons,” for example, Garcia-Sciveres said.

    “You would never be able to tell that an invisible flea hits an elephant by watching the elephant. But what if every time an invisible flea hits an elephant at one end of the herd, a visible flea is flung away from an elephant at the other end of the herd?” he said.

    “You could use these sensors to watch for such slight signals in a very cold crystal or superfluid helium, where an incoming dark matter particle is like the invisible flea, and the outgoing visible flea is a quantum vibration that must be detected.”

    The particle physics community has held some workshops to brainstorm the possibilities for low-mass dark matter detection. “This is a new regime. This is an area where there aren’t even any measurements yet. There is a promise that QIS techniques can help give us more sensitivity to the small signals we’re looking for,” Garcia-Sciveres added. “Let’s see if that’s true.”

    The demonstration detectors will each have about 1 cubic centimeter of detector material. Dan McKinsey, a Berkeley Lab faculty senior scientist and UC Berkeley physics professor who is responsible for the development of the liquid helium detector, said that the detectors will be constructed on the UC Berkeley campus. Both are designed to be sensitive to particles with a mass lighter than protons – the positively charged particles that reside in atomic nuclei.

    A schematic for low-mass dark matter particle detection in a planned superfluid helium (He) experiment. (Credit: Berkeley Lab)

    The superfluid helium detector will make use of a process called “quantum evaporation,” in which rotons and phonons cause individual helium atoms to be evaporated from the surface of superfluid helium.

    Kathryn Zurek, a Berkeley Lab physicist and pioneering theorist in the search for very-low-mass dark matter particles who is working on the quantum sensor project, said the technology to detect such “whispers” of dark matter didn’t exist just a decade ago but “has made major gains in the last few years.” She also noted, “There had been a fair amount of skepticism about how realistic it would be to look for this light-mass dark matter, but the community has moved more broadly in that direction.”

    There are many synergies in the expertise and capabilities that have developed both at Berkeley Lab and on the UC Berkeley campus that make it a good time – and the right place – to develop and apply quantum technologies to the hunt for dark matter, Zurek said.

    Theories developed at Berkeley Lab suggest that certain exotic materials exhibit quantum states or “modes” that low-mass dark matter particles can couple with, which would make the particles detectable – like the “visible flea” referenced above.

    “These ideas are the motivation for building these experiments to search for light dark matter,” Zurek said. “This is a broad and multipronged approach, and the idea is that it will be a stepping stone to a larger effort.”

    The new project will draw from a deep experience in building other types of particle detectors, and R&D in ultrasensitive sensors that operate at the threshold where an electrically conducting material becomes a superconductor – the “tipping point” that is sensitive to the slightest fluctuations. Versions of these sensors are already used to search for slight temperature variations in the relic microwave light that spans the universe.

    At the end of the three-year demonstration, researchers could perhaps turn their sights to more exotic types of detector materials in larger volumes.

    “I’m excited to see this program move forward, and I think it will become a significant research direction in the Physics Division at Berkeley Lab,” she said, adding that the program could also demonstrate ultrasensitive detectors that have applications in other fields of science.

    More info:

    Read a news release that summarizes all of the Berkeley Lab quantum information science awards announced Sept. 24
    Berkeley Lab to Build an Advanced Quantum Computing Testbed
    About Berkeley Quantum

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    A U.S. Department of Energy National Laboratory Operated by the University of California

    University of California Seal

    DOE Seal

  • richardmitnick 4:33 pm on August 20, 2018 Permalink | Reply
    Tags: , Anomalies, , Branes, , , , , , Parity violation, QCD: Quantum Chromodynamics, , , , , , The second superstring revolution, Theorist John Schwarz   

    From Caltech: “Long and Winding Road: A Conversation with String Theory Pioneer John Schwarz” 

    Caltech Logo

    From Caltech


    Whitney Clavin
    (626) 395-1856

    John Schwarz discusses the history and evolution of superstring theory.

    John Schwarz. Credit: Seth Hansen for Caltech

    The decades-long quest for a theory that would unify all the known forces—from the microscopic quantum realm to the macroscopic world where gravity dominates—has had many twists and turns. The current leading theory, known as superstring theory and more informally as string theory, grew out of an approach to theoretical particle physics, called S-matrix theory, which was popular in the 1960s. Caltech’s John H. Schwarz, the Harold Brown Professor of Theoretical Physics, Emeritus, began working on the problem in 1971, while a junior faculty member at Princeton University. He moved to Caltech in 1972, where he continued his research with various collaborators from other universities. Their studies in the 1970s and 1980s would dramatically shift the evolution of the theory and, in 1984, usher in what’s known as the first superstring revolution.

    Essentially, string theory postulates that our universe is made up, at its most fundamental level, of infinitesimally tiny vibrating strings, and contains 10 dimensions—three for space, one for time, and six other spatial dimensions curled up in such a way that we don’t perceive them in everyday life or even with the most sensitive experimental searches to date. One of the many states of a string is thought to correspond to the particle that carries the gravitational force, the graviton, thereby linking the two pillars of fundamental physics—quantum mechanics and the general theory of relativity, which includes gravity.

    We sat down with Schwarz to discuss the history and evolution of string theory and how the theory itself might have moved past strings.

    What are the earliest origins of string theory?

    The first study often regarded as the beginning of string theory came from an Italian physicist named Gabriele Veneziano in 1968. He discovered a mathematical formula that had many of the properties that people were trying to incorporate in a fundamental theory of the strong nuclear force [a fundamental force that holds nuclei together]. This formula was kind of pulled out of the blue, and ultimately Veneziano and others realized, within a couple years, that it was actually describing a quantum theory of a string—a one-dimensional extended object.

    How did the field grow after this paper?

    In the early ’70s, there were several hundred people worldwide working on string theory. But then everything changed when quantum chromodynamics, or QCD—which was developed by Caltech’s Murray Gell-Mann [Nobel Laureate, 1969] and others—became the favored theory of the strong nuclear force. Almost everyone was convinced QCD was the right way to go and stopped working on string theory. The field shrank down to just a handful of people in the course of a year or two. I was one of the ones who remained.

    How did Gell-Mann become interested in your work?

    Gell-Mann is the one who brought me to Caltech and was very supportive of my work. He took an interest in studies I had done with a French physicist, André Neveu, when we were at Princeton. Neveu and I introduced a second string theory. The initial Veneziano version had many problems. There are two kinds of fundamental particles called bosons and fermions, and the Veneziano theory only described bosons. The one I developed with Neveu included fermions. And not only did it include fermions but it led to the discovery of a new kind of symmetry that relates bosons and fermions, which is called supersymmetry. Because of that discovery, this version of string theory is called superstring theory.

    When did the field take off again?

    A pivotal change happened after work I did with another French physicist, Joël Scherk, whom Gell-Mann and I had brought to Caltech as a visitor in 1974. During that period, we realized that many of the problems we were having with string theory could be turned into advantages if we changed the purpose. Instead of insisting on constructing a theory of the strong nuclear force, we took this beautiful theory and asked what it was good for. And it turned out it was good for gravity. Neither of us had worked on gravity. It wasn’t something we were especially interested in but we realized that this theory, which was having trouble describing the strong nuclear force, gives rise to gravity. Once we realized this, I knew what I would be doing for the rest of my career. And I believe Joël felt the same way. Unfortunately, he died six years later. He made several important discoveries during those six years, including a supergravity theory in 11 dimensions.

    Surprisingly, the community didn’t respond very much to our papers and lectures. We were generally respected and never had a problem getting our papers published, but there wasn’t much interest in the idea. We were proposing a quantum theory of gravity, but in that era physicists who worked on quantum theory weren’t interested in gravity, and physicists who worked on gravity weren’t interested in quantum theory.

    That changed after I met Michael Green [a theoretical physicist then at the University of London and now at the University of Cambridge] at the CERN cafeteria in Switzerland in the summer of 1979. Our collaboration was very successful, and Michael made several extended visits to Caltech over the next few years. We published a number of papers during that period, which are much cited, but our most famous work was something we did in 1984, which had to do with a problem known as anomalies.

    What are anomalies in string theory?

    One of the facts of nature is that there is what’s called parity violation, which means that the fundamental laws are not invariant under mirror reflection. For example, a neutrino always spins clockwise and not counterclockwise, so it would look wrong viewed in a mirror. When you try to write down a fundamental theory with parity violation, mathematical inconsistencies often arise when you take account of quantum effects. This is referred to as the anomaly problem. It appeared that one couldn’t make a theory based on strings without encountering these anomalies, which, if that were the case, would mean strings couldn’t give a realistic theory. Green and I discovered that these anomalies cancel one another in very special situations.

    When we released our results in 1984, the field exploded. That’s when Edward Witten [a theoretical physicist at the Institute for Advanced Study in Princeton], probably the most influential theoretical physicist in the world, got interested. Witten and three collaborators wrote a paper early in 1985 making a particular proposal for what to do with the six extra dimensions, the ones other than the four for space and time. That proposal looked, at the time, as if it could give a theory that is quite realistic. These developments, together with the discovery of another version of superstring theory, constituted the first superstring revolution.

    Richard Feynman was here at Caltech during that time, before he passed away in 1988. What did he think about string theory?

    After the 1984 to 1985 breakthroughs in our understanding of superstring theory, the subject no longer could be ignored. At that time it acquired some prominent critics, including Richard Feynman and Stephen Hawking. Feynman’s skepticism of superstring theory was based mostly on the concern that it could not be tested experimentally. This was a valid concern, which my collaborators and I shared. However, Feynman did want to learn more, so I spent several hours explaining the essential ideas to him. Thirty years later, it is still true that there is no smoking-gun experimental confirmation of superstring theory, though it has proved its value in other ways. The most likely possibility for experimental support in the foreseeable future would be the discovery of supersymmetry particles. So far, they have not shown up.

    What was the second superstring revolution about?

    The second superstring revolution occurred 10 years later, in the mid ’90s. What happened then is that string theorists discovered what happens when particle interactions become strong. Before, we had been studying weakly interacting systems. But as you crank up the strength of the interaction, a 10th dimension of space can emerge. New objects called branes also emerge. Strings are one dimensional; branes have all sorts of dimensions, ranging from zero to nine. An important class of these branes, called D-branes, was discovered by the late Joseph Polchinski [BS ’75]. Strings do have a special role, but when the system is strongly interacting, the strings become less fundamental. It’s possible that in the future the subject will get a new name, but until we understand better what the theory is, which we’re still struggling with, it’s premature to invent a new name.

    What can we say now about the future of string theory?

    It’s now over 30 years since a large community of scientists began pooling their talents, and there’s been enormous progress in those 30 years. But the more big problems we solve, the more new questions arise. So you don’t even know the right questions to ask until you’ve solved the previous ones. Interestingly, some of the biggest spin-offs of our efforts to find the most fundamental theory of nature are in pure mathematics.

    Do you think string theory will ultimately unify the forces of nature?

    Yes, but I don’t think we’ll have a final answer in my lifetime. The journey has been worth it, even if it did take some unusual twists and turns. I’m convinced that, in other intelligent civilizations throughout the galaxy, similar discoveries will occur, or already have occurred, in a different sequence than ours. We’ll find the same result and reach the same conclusions as other civilizations, but we’ll get there by a very different route.

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

    The California Institute of Technology (commonly referred to as Caltech) is a private research university located in Pasadena, California, United States. Caltech has six academic divisions with strong emphases on science and engineering. Its 124-acre (50 ha) primary campus is located approximately 11 mi (18 km) northeast of downtown Los Angeles. “The mission of the California Institute of Technology is to expand human knowledge and benefit society through research integrated with education. We investigate the most challenging, fundamental problems in science and technology in a singularly collegial, interdisciplinary atmosphere, while educating outstanding students to become creative members of society.”

    Caltech campus
