Tagged: Standard Model of Particle Physics

  • richardmitnick 12:00 pm on July 16, 2021
    Tags: "ATLAS Confirms Universality of Key Particle Interactions", Brookhaven Lab serves as the U.S. host laboratory for the ATLAS experiment., LHC’s predecessor—the Large Electron-Positron (LEP) collider, Standard Model of Particle Physics, Tension with the Standard Model resolved.

    From DOE’s Brookhaven National Laboratory (US) and From CERN (CH) ATLAS: “ATLAS Confirms Universality of Key Particle Interactions”

    From DOE’s Brookhaven National Laboratory (US)

    and

    From CERN (CH) ATLAS

    July 9, 2021
    Karen McNulty Walsh
    kmcnulty@bnl.gov

    Test demonstrates “lepton flavor universality” for interactions of muon and tau leptons with W bosons.

    A new paper by the ATLAS collaboration at the Large Hadron Collider (LHC) at the European Organization for Nuclear Research (CERN) provides evidence that two different types of leptons interact in a universal way with particles called W bosons. This result, just published in Nature Physics, supports “lepton flavor universality,” a key prediction of the Standard Model of particle physics.

    Standard Model of Particle Physics. Credit: Quantum Diaries.

    The Standard Model of particle physics is the reigning theory describing all known particles and their interactions. It includes three flavors of leptons: the familiar electron—which is central to our understanding of electricity—and two heavier cousins known as muons and tau particles. According to the Standard Model, each of these leptons should “couple,” or interact, with a W boson with equal strength, commonly referred to as lepton-flavor universality.

    Finding an experimental result in agreement with that longstanding prediction may not seem all that newsworthy. But decades ago, experiments at the LHC’s predecessor—the Large Electron-Positron (LEP) collider—had reported a hint of a discrepancy in the way muon and tau leptons behaved.

    That result, from the 1990s, generated tension with the Standard Model.

    Srini Rajagopalan, Program Manager for U.S. ATLAS and a physicist at Brookhaven National Laboratory.

    “The new ATLAS measurement, which has significantly higher precision than the LEP experiments, resolves the decades-old tension,” said Srini Rajagopalan, Program Manager for U.S. ATLAS and a physicist at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory. “It is an important measurement to demonstrate that different types of leptons behave in the same way.”

    Brookhaven Lab serves as the U.S. host laboratory for the ATLAS experiment. Brookhaven scientists play multiple roles in this international collaboration, from construction and project management to data storage, distribution, and analysis.

    The ATLAS team’s motivation for using the powerful LHC to study leptons’ interactions with the W boson stems from the earlier discrepancy at LEP, which was also located at CERN.

    LEP collided electrons and their anti-particles (positrons). These collisions provided a very clean environment for precision measurements of particle interactions and properties. The experiments measured a discrepancy in the frequency with which W bosons decayed to muon and tau leptons. The discrepancy suggested a difference in the strength of the W boson’s interactions with these two lepton flavors—a violation of lepton flavor universality. But LEP produced a relatively low number of W bosons, which limited the measurement’s statistical precision.

    The LHC, in contrast, collides high-energy protons. Compared with simple electrons and positrons, protons are more complex composite particles. Each proton is made of many quarks and gluons, and each collision between two of these composite particles produces many different types of particles. But among the multitudes, more than 100 million of these collisions produce pairs of so-called top quarks, which readily decay into pairs of W bosons and subsequently, in some cases, into leptons. Thus, the LHC provides a huge dataset for measuring W boson decays to leptons.

    But there’s an added challenge: Some muons come directly from the decay of W bosons; and some come from a tau lepton itself decaying into a muon plus two invisible particles called neutrinos. Fortunately, these two sources of muons have different lifetimes, which lead to different signatures in the detector.
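
    Schematically, the two muon sources the analysis must separate are (a minimal sketch of the decay chains described above; the displaced signature comes from the tau’s short but nonzero lifetime, roughly 2.9 × 10^-13 seconds):

    ```latex
    W^- \to \mu^- \bar{\nu}_\mu
    \quad \text{(prompt muon)}
    \qquad \text{vs.} \qquad
    W^- \to \tau^- \bar{\nu}_\tau,\;\;
    \tau^- \to \mu^- \bar{\nu}_\mu \nu_\tau
    \quad \text{(slightly displaced muon, plus two extra neutrinos)}
    ```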

    ATLAS Physics Coordinator Stephane Willocq, a physicist at the University of Massachusetts at Amherst (US). Credit: UMass Amherst.

    ATLAS is sensitive enough to search for these unique signatures and cancel out additional uncertainties in the process—a key feature that enables the high precision of the measurement.

    “This is a beautiful result that demonstrates that we can perform precision tests at the LHC, thanks to the huge datasets collected and the well-understood detector performance,” said ATLAS Physics Coordinator Stephane Willocq, a physicist at the University of Massachusetts at Amherst.

    The new result finds the ratio of the rate at which a W boson decays to a tau lepton to the rate at which it decays to a muon to be very close to 1. Such a measurement signifies that the two decays occur with equal frequency, implying that W bosons couple to each lepton with equal strength, just as the Standard Model predicts. With an uncertainty half the size of the LEP measurement, this new high-precision ATLAS measurement suggests the earlier tension between experiment and theory may have been due to a fluctuation.
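
    In symbols, the quantity in question is the ratio of branching fractions. For reference, the published numbers (quoted here from the ATLAS Nature Physics paper and the combined LEP results, not from the text above) are approximately:

    ```latex
    R(\tau/\mu) = \frac{\mathcal{B}(W \to \tau \nu_\tau)}{\mathcal{B}(W \to \mu \nu_\mu)};
    \qquad
    R_{\text{ATLAS}} = 0.992 \pm 0.013,
    \qquad
    R_{\text{LEP}} = 1.070 \pm 0.026 .
    ```

    The LEP value sat about 2.7 standard deviations above the Standard Model expectation of (essentially) 1; the ATLAS value, with half the uncertainty, is consistent with 1.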

    Brookhaven Lab’s role in this research was funded by the DOE Office of Science.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition


    One of ten national laboratories overseen and primarily funded by the DOE(US) Office of Science, DOE’s Brookhaven National Laboratory (US) conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University(US), the largest academic user of Laboratory facilities, and Battelle(US), a nonprofit, applied science and technology organization.

    Research at BNL specializes in nuclear and high energy physics, energy science and technology, environmental and bioscience, nanoscience, and national security. The 5,300-acre campus contains several large research facilities, including the Relativistic Heavy Ion Collider [below] and National Synchrotron Light Source II [below]. Seven Nobel Prizes have been awarded for work conducted at Brookhaven Lab.

    BNL is staffed by approximately 2,750 scientists, engineers, technicians, and support personnel, and hosts 4,000 guest investigators every year. The laboratory has its own police station, fire department, and ZIP code (11973). In total, the lab spans a 5,265-acre (21 km^2) area that is mostly coterminous with the hamlet of Upton, New York. BNL is served by a rail spur operated as-needed by the New York and Atlantic Railway. Co-located with the laboratory is the Upton, New York, forecast office of the National Weather Service.

    Major programs

    Although originally conceived as a nuclear research facility, Brookhaven Lab’s mission has greatly expanded. Its foci are now:

    Nuclear and high-energy physics
    Physics and chemistry of materials
    Environmental and climate research
    Nanomaterials
    Energy research
    Nonproliferation
    Structural biology
    Accelerator physics

    Operation

    Brookhaven National Lab was originally owned by the Atomic Energy Commission(US) and is now owned by that agency’s successor, the United States Department of Energy (DOE). DOE subcontracts the research and operation to universities and research organizations. It is currently operated by Brookhaven Science Associates LLC, an equal partnership of Stony Brook University(US) and Battelle Memorial Institute(US). From 1947 to 1998, it was operated by Associated Universities, Inc. (AUI) (US), but AUI lost its contract in the wake of two incidents: a 1994 fire at the facility’s High Flux Beam Reactor that exposed several workers to radiation, and reports in 1997 of a tritium leak into the groundwater of the Long Island Central Pine Barrens, on which the facility sits.

    Foundations

    Following World War II, the US Atomic Energy Commission was created to support government-sponsored peacetime research on atomic energy. The effort to build a nuclear reactor in the American northeast was fostered largely by physicists Isidor Isaac Rabi and Norman Foster Ramsey Jr., who during the war witnessed many of their colleagues at Columbia University leave for new remote research sites following the departure of the Manhattan Project from its campus. Their effort to house this reactor near New York City was rivalled by a similar effort at the Massachusetts Institute of Technology (US) to have a facility near Boston, Massachusetts(US). Involvement was quickly solicited from representatives of northeastern universities to the south and west of New York City such that this city would be at their geographic center. In March 1946 a nonprofit corporation was established that consisted of representatives from nine major research universities — Columbia University(US), Cornell University(US), Harvard University(US), Johns Hopkins University(US), Massachusetts Institute of Technology(US), Princeton University(US), University of Pennsylvania(US), University of Rochester(US), and Yale University(US).

    Out of 17 considered sites in the Boston-Washington corridor, Camp Upton on Long Island was eventually chosen as the most suitable in consideration of space, transportation, and availability. The camp had been a training center for the US Army during both World War I and World War II. After the latter war, Camp Upton was deemed no longer necessary and became available for reuse. A plan was conceived to convert the military camp into a research facility.

    On March 21, 1947, the Camp Upton site was officially transferred from the U.S. War Department to the new U.S. Atomic Energy Commission (AEC), predecessor to the U.S. Department of Energy (DOE).

    Research and facilities

    Reactor history

    In 1947 construction began on the first nuclear reactor at Brookhaven, the Brookhaven Graphite Research Reactor. This reactor, which opened in 1950, was the first reactor to be constructed in the United States after World War II. The High Flux Beam Reactor operated from 1965 to 1999. In 1959 Brookhaven built the first US reactor specifically tailored to medical research, the Brookhaven Medical Research Reactor, which operated until 2000.

    Accelerator history

    In 1952 Brookhaven began using its first particle accelerator, the Cosmotron. At the time the Cosmotron was the world’s highest energy accelerator, being the first to impart more than 1 GeV of energy to a particle.


    The Cosmotron was retired in 1966, after it was superseded in 1960 by the new Alternating Gradient Synchrotron (AGS).

    The AGS was used in research that resulted in three Nobel Prizes, awarded for the discovery of the muon neutrino, the charm quark, and CP violation.

    In 1970 BNL started the ISABELLE project to develop and build two intersecting proton storage rings.

    The groundbreaking for the project was in October 1978. In 1981, with the tunnel for the accelerator already excavated, problems with the superconducting magnets needed for the ISABELLE accelerator brought the project to a halt, and the project was eventually cancelled in 1983.

    The National Synchrotron Light Source (US) operated from 1982 to 2014 and was involved with two Nobel Prize-winning discoveries. It has since been replaced by the National Synchrotron Light Source II (US) [below].

    After ISABELLE’s cancellation, physicists at BNL proposed that the excavated tunnel and parts of the magnet assembly be used in another accelerator. In 1984 the first proposal for the accelerator now known as the Relativistic Heavy Ion Collider (RHIC) [below] was put forward. Construction was funded in 1991, and RHIC has been operational since 2000. One of the world’s only two operating heavy-ion colliders, RHIC is, as of 2010, the second-highest-energy collider after the Large Hadron Collider (CH). RHIC is housed in a tunnel 2.4 miles (3.9 km) long and is visible from space.

    On January 9, 2020, Paul Dabbar, undersecretary of the US Department of Energy Office of Science, announced that the BNL eRHIC design had been selected over the conceptual design put forward by DOE’s Thomas Jefferson National Accelerator Facility [JLab] (US) as the future Electron–Ion Collider (EIC) in the United States.

    In addition to the site selection, it was announced that the BNL EIC had acquired CD-0 (mission need) status from the Department of Energy. BNL’s eRHIC design proposes upgrading the existing Relativistic Heavy Ion Collider, which collides beams of ions from light to heavy, including polarized protons, with a polarized electron facility, to be housed in the same tunnel.

    Other discoveries

    In 1958, Brookhaven scientists created one of the world’s first video games, Tennis for Two. In 1968 Brookhaven scientists patented Maglev, a transportation technology that utilizes magnetic levitation.

    Major facilities

    Relativistic Heavy Ion Collider (RHIC), which was designed to research quark–gluon plasma and the sources of proton spin. Until 2009 it was the world’s most powerful heavy ion collider. It is the only collider of spin-polarized protons.
    Center for Functional Nanomaterials (CFN), used for the study of nanoscale materials.
    BNL National Synchrotron Light Source II(US), Brookhaven’s newest user facility, opened in 2015 to replace the National Synchrotron Light Source (NSLS), which had operated for 30 years. NSLS was involved in the work that won the 2003 and 2009 Nobel Prizes in Chemistry.
    Alternating Gradient Synchrotron, a particle accelerator that was used in three of the lab’s Nobel prizes.
    Accelerator Test Facility, generates, accelerates and monitors particle beams.
    Tandem Van de Graaff, once the world’s largest electrostatic accelerator.
    Computational Science resources, including access to a massively parallel Blue Gene series supercomputer that is among the fastest in the world for scientific research, run jointly by Brookhaven National Laboratory and Stony Brook University.
    Interdisciplinary Science Building, with unique laboratories for studying high-temperature superconductors and other materials important for addressing energy challenges.
    NASA Space Radiation Laboratory, where scientists use beams of ions to simulate cosmic rays and assess the risks of space radiation to human space travelers and equipment.

    Off-site contributions

    Brookhaven is a contributing partner to the ATLAS experiment, one of the four major detectors located at the Large Hadron Collider (LHC), currently operating at CERN near Geneva, Switzerland.

    Brookhaven was also responsible for the design of the accumulator ring for the Spallation Neutron Source (SNS) at DOE’s Oak Ridge National Laboratory (US) in Tennessee.

    Brookhaven plays a role in a range of neutrino research projects around the world, including the Daya Bay Neutrino Experiment (CN), located at a nuclear power plant approximately 52 kilometers northeast of Hong Kong and 45 kilometers east of Shenzhen, China.


     
  • richardmitnick 7:16 pm on May 2, 2021
    Tags: Large Hadron Collider’s LHCb detector, Many people would say supersymmetry is almost dead., Some solutions nevertheless exist that could miraculously fit both. One is the leptoquark—a hypothetical particle that could have the ability to transform a quark into either a muon or an electron., Standard Model of Particle Physics, The data that the LHC has produced so far suggest that typical superpartners-if they exist-cannot weigh less than 1000 protons., The LHCb muon anomalies suffer from the same problem as the new muon-magnetism finding: various possible explanations exist but they are all “ad hoc”, There is one other major contender that might reconcile both the LHCb and Muon g – 2 discrepancies. It is a particle called the Z′ boson because of its similarity with the Z boson.

    From Scientific American: “Muon Results Throw Physicists’ Best Theories into Confusion” 

    From Scientific American

    April 29, 2021
    Davide Castelvecchi

    The Large Hadron Collider’s LHCb detector reported anomalies in the behavior of muons, two weeks before the FNAL Muon g – 2 experiment announced a puzzling finding about muon magnetism.

    Physicists should be ecstatic right now. Taken at face value, the surprisingly strong magnetism of the elementary particles called muons, revealed by an experiment this month [Nature], suggests that the established theory of fundamental particles is incomplete. If the discrepancy pans out, it would be the first time that the theory has failed to account for observations since its inception five decades ago—and there is nothing physicists love more than proving a theory wrong.
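
    For orientation, the “magnetism” being measured is the muon’s anomalous magnetic moment. In symbols (the numerical values below are the widely quoted April 2021 experimental and theoretical figures, supplied for reference rather than taken from this article):

    ```latex
    a_\mu \equiv \frac{g_\mu - 2}{2};
    \qquad
    a_\mu^{\text{exp}} = 116\,592\,061(41) \times 10^{-11},
    \quad
    a_\mu^{\text{SM}} = 116\,591\,810(43) \times 10^{-11},
    ```

    a discrepancy of about 4.2 standard deviations.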

    The Muon g − 2 collaboration at the Fermi National Accelerator Laboratory (Fermilab) outside Chicago, Illinois, reported the latest measurements in a webcast on 7 April, and published them in Physical Review Letters. The results are “extremely encouraging” for those hoping to discover other particles, says Susan Gardner, a physicist at the University of Kentucky (US) in Lexington.

    But rather than pointing to a new and revolutionary theory, the result—announced on 7 April by the FNAL Muon g – 2 experiment near Chicago, Illinois—poses a riddle. It seems maddeningly hard to explain it in a way that is compatible with everything else physicists know about elementary particles. And additional anomalies in the muon’s behaviour, reported in March by a collider experiment [LHCb above], only make that task harder. The result is that researchers have to perform the theoretical-physics equivalent of a triple somersault to make an explanation work.

    Zombie models

    Take supersymmetry, or SUSY, a theory that many physicists once thought was the most promising for extending the current paradigm, the standard model of particle physics.

    Supersymmetry comes in many variants, but in general, it posits that every particle in the standard model has a yet-to-be-discovered heavier counterpart, called a superpartner. Superpartners could be among the ‘virtual particles’ that constantly pop in and out of the empty space surrounding the muon, a quantum effect that would help to explain why this particle’s magnetic field is stronger than expected.

    If so, these particles could solve two mysteries at once: muon magnetism and dark matter, the unseen stuff that, through its gravitational pull, seems to keep galaxies from flying apart.

    Until ten years ago, various lines of evidence had suggested that a superpartner weighing as much as a few hundred protons could constitute dark matter. Many expected that the collisions at the Large Hadron Collider (LHC) outside Geneva, Switzerland, would produce a plethora of these new particles, but so far none has materialized.

    “Many people would say supersymmetry is almost dead,” says Dominik Stöckinger, a theoretical physicist at the Dresden University of Technology [Technische Universität Dresden] (DE), who is a member of the Muon g – 2 collaboration. But he still sees it as a plausible way to explain his experiment’s findings. “If you look at it in comparison to any other ideas, it’s not worse than the others,” he says.

    The data that the LHC has produced so far suggest that typical superpartners, if they exist, cannot weigh less than 1,000 protons (the bounds can be higher depending on the type of superparticle and the flavour of supersymmetry theory).

    There is one way in which Muon g – 2 could resurrect supersymmetry and also provide evidence for dark matter, Stöckinger says. There could be not one superpartner, but two appearing in LHC collisions, both of roughly similar masses—say, around 550 and 500 protons. Collisions would create the more massive one, which would then rapidly decay into two particles: the lighter superpartner plus a run-of-the-mill, standard-model particle carrying away the 50 protons’ worth of mass difference.

    The LHC detectors are well-equipped to reveal this kind of decay as long as the ordinary particle—the one that carries away the mass difference between the two superpartners—is large enough. But a very light particle could escape unobserved. “This is well-known to be a blind spot for LHC,” says Michael Peskin, a theoretician at DOE’s SLAC National Accelerator Laboratory (US) at Stanford University (US) in Menlo Park, California.

    The trouble is that models that include two superpartners with similar masses also tend to predict that the Universe should contain a much larger amount of dark matter than astronomers observe. So an additional mechanism would be needed—one that can reduce the amount of predicted dark matter, Peskin explains. This adds complexity to the theory. For it to fit the observations, all its parts would have to work “just so”.

    Meanwhile, physicists have uncovered more hints that muons behave oddly. An experiment at the LHC, called LHCb, has found tentative evidence that muons occur significantly less often than electrons as the breakdown products of certain heavier particles called B mesons. According to the standard model, muons are supposed to be identical to electrons in every way except for their mass, which is 207 times larger. As a consequence, B mesons should produce electrons and muons at rates that are nearly equal.
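
    That comparison is usually expressed as a ratio of branching fractions. For reference, assuming the measurement referred to is LHCb’s March 2021 result, the value was approximately:

    ```latex
    R_K = \frac{\mathcal{B}(B^+ \to K^+ \mu^+ \mu^-)}{\mathcal{B}(B^+ \to K^+ e^+ e^-)}
        \approx 0.846^{\,+0.044}_{\,-0.041},
    ```

    about 3.1 standard deviations below the Standard Model expectation of very nearly 1.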

    The LHCb muon anomalies suffer from the same problem as the new muon-magnetism finding: various possible explanations exist but they are all “ad hoc”, says physicist Adam Falkowski, at the Paris-Saclay University [Université Paris-Saclay] (FR). “I’m quite appalled by this procession of zombie SUSY models dragged out of their graves,” says Falkowski.

    The task of explaining Muon g – 2’s results becomes even harder when researchers try to concoct a theory that fits both those findings and the LHCb results, physicists say. “Extremely few models could explain both simultaneously,” says Stöckinger. In particular, the supersymmetry model that explains Muon g – 2 and dark matter would do nothing for LHCb.

    Some solutions nevertheless exist that could miraculously fit both. One is the leptoquark—a hypothetical particle that could have the ability to transform a quark into either a muon or an electron (which are both examples of a lepton). Leptoquarks could resurrect an attempt made by physicists in the 1970s to achieve a ‘grand unification’ of particle physics, showing that its three fundamental forces—strong, weak and electromagnetic—are all aspects of the same force.

    Most of the grand-unification schemes of that era failed experimental tests, and the surviving leptoquark models have become more complicated—but they still have their fans. “Leptoquarks could solve another big mystery: why different families of particles have such different masses,” says Gino Isidori, a theoretician at the University of Zürich [Universität Zürich] (CH). One family is made of the lighter quarks—the constituents of protons and neutrons—and the electron. Another has heavier quarks and the muon, and a third family has even heavier counterparts.

    Apart from the leptoquark, there is one other major contender that might reconcile both the LHCb and Muon g – 2 discrepancies. It is a particle called the Z′ boson because of its similarity with the Z boson, which carries the ‘weak force’ responsible for nuclear decay. It, too, could help to solve the mystery of the three families, says Ben Allanach, a theorist at the University of Cambridge (UK). “We’re building models where some features come out very naturally, you can understand these hierarchies,” he says. He adds that both leptoquarks and the Z′ boson have an advantage: they still have not been completely ruled out by the LHC, but the machine should ultimately see them if they exist.

    The LHC is currently undergoing an upgrade, and it will start to smash protons together again in April 2022. The coming deluge of data could strengthen the muon anomalies and perhaps provide hints of the long-sought new particles (although a proposed electron–positron collider, primarily designed to study the Higgs boson, might be needed to address some of the LHC’s blind spots, Peskin says). Meanwhile, beginning next year, Muon g – 2 will release further measurements. Once it’s known more precisely, the size of the discrepancy between muon magnetism and theory could itself rule out some explanations and point to others.

    Unless, that is, the discrepancies disappear and the standard model wins again. A new calculation, reported this month, of the standard model’s prediction for muon magnetism gave a value much closer to the experimental result. So far, those who have bet against the standard model have always lost, which makes physicists cautious. “We are—maybe—at the beginning of a new era,” Stöckinger says.

    See the full article here.


    Please help promote STEM in your local schools.


    Stem Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
  • richardmitnick 4:11 pm on April 17, 2021
    Tags: "German National Supercomputing Centre Provides Computational Muscle to Look for Cracks in the Standard Model of Physics", Gauss Centre for Supercomputing [Gauß-Zentrum für Supercomputing] (DE), Jülich Supercomputing Centre [Forschungszentrum Jülich] (DE), Leibniz Supercomputing Centre [Leibniz-Rechenzentrum] (DE), Magnetic moment of subatomic particles called muons, Muon g-2 collaboration, Standard Model of Particle Physics

    From Gauss Centre for Supercomputing [Gauß-Zentrum für Supercomputing] (DE): “German National Supercomputing Centre Provides Computational Muscle to Look for Cracks in the Standard Model of Physics” 

    From Gauss Centre for Supercomputing [Gauß-Zentrum für Supercomputing] (DE)

    April 09, 2021
    Eric Gedenk

    Physicists have spent 20 years trying to more precisely measure the so-called “magnetic moment” of subatomic particles called muons. Findings published this week call into question long-standing assumptions of particle physics.

    Does the magnetic moment of muons fit into our understanding of the laws governing the physical world around us? Credit: Uni Wuppertal / thavis gmbh.

    Since the 1970s, the Standard Model of Physics has served as the basis from which particle physics is investigated.

    Standard Model of Particle Physics. Credit: Quantum Diaries.

    Both experimentalists and theoretical physicists have tested the Standard Model’s accuracy, and it has remained the law of the land when it comes to understanding how the subatomic world behaves.

    This week, cracks formed in that foundational set of assumptions. Researchers of the “Muon g-2” collaboration from the DOE’s Fermi National Accelerator Laboratory (US) published further experimental findings that show that muons—heavy subatomic relatives of electrons—may have a larger “magnetic moment” than earlier Standard Model estimates had predicted, indicating that an unknown particle or force might be influencing the muon. The work builds on anomalous results first uncovered 20 years ago at DOE’s Brookhaven National Laboratory, and calls into question whether the Standard Model needs to be rewritten.

    DOE’s Fermi National Accelerator Laboratory(US) Muon g-2 experiment. As muons race around the experiment’s storage ring, their spin axes twirl, reflecting the influence of unseen particles.

    Meanwhile, researchers in Germany have used Europe’s most powerful high-performance computing (HPC) infrastructure to run new and more precise lattice quantum chromodynamics (lattice QCD) calculations of muons in a magnetic field. The team found a different value for the Standard Model prediction of muon behaviour than what was previously accepted. The new theoretical value is in agreement with the FNAL experiment, suggesting that a revision of the Standard Model is not needed. The results are now published in Nature.
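
    The disputed quantity is the leading-order hadronic vacuum-polarization (LO-HVP) contribution to the muon’s anomalous magnetic moment. For reference, the lattice value from this group’s Nature paper and the earlier data-driven consensus were roughly (numbers quoted from the publications, not from this article):

    ```latex
    a_\mu^{\text{LO-HVP}}(\text{lattice}) \approx 707.5(5.5) \times 10^{-10},
    \qquad
    a_\mu^{\text{LO-HVP}}(\text{data-driven}) \approx 693.1(4.0) \times 10^{-10}.
    ```

    The larger lattice value pushes the Standard Model prediction for the muon’s magnetism up toward the measured value, which is why it eases the tension.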

    The team primarily used the supercomputer JUWELS at the Jülich Supercomputing Centre (JSC), with computational time provided by the Gauss Centre for Supercomputing (GCS), as well as JSC’s JURECA system, along with extensive computations performed at the other two GCS sites—on Hawk at the High-Performance Computing Center Stuttgart (HLRS) and on SuperMUC-NG at the Leibniz Supercomputing Centre (LRZ).

    JURECA supercomputer at Jülich Supercomputing Centre [Forschungszentrum Jülich ] (DE)

    SuperMUC-NG, GCS@LRZ, Lenovo supercomputer at the Leibniz Supercomputing Centre [Leibniz-Rechenzentrum] (DE)

    Both the experimentalists and theoretical physicists agreed that further research must be done to verify the results published this week. One thing is clear, however: the HPC resources provided by GCS were essential for the scientists to achieve the precision necessary to get these groundbreaking results.

    “For the first time, lattice results have a precision comparable to these experiments. Interestingly our result is consistent with the new FNAL experiment, as opposed to previous theory results, that are in strong disagreement with it,” said Prof. Kalman Szabo, leader of the Helmholtz research group, “Relativistic Quantum Field Theory” at JSC and co-author of the Nature publication. “Before deciding the fate of the Standard Model, one has to understand the theoretical differences, and new lattice QCD computations are inevitable for that.”

    Minor discrepancies, major implications

    When DOE’s Brookhaven National Laboratory(US) researchers recorded unexplained muon behaviour in 2001, the finding left physicists at a loss—the muon, a subatomic particle 200 times heavier than an electron, showed stronger magnetic properties than predicted by the Standard Model of Physics. While the initial finding suggested that muons may be interacting with previously unknown subatomic particles, the results were still not accurate enough to definitively claim a new discovery.

    Over the next 20 years, heavy investments in new, hyper-sensitive experiments done at particle accelerator facilities, as well as increasingly sophisticated approaches based in theory, have sought to confirm or refute the BNL group’s findings. During this time, a research group led by the University of Wuppertal [Universität Wuppertal] (DE)’s Prof. Zoltan Fodor, another co-author of the Nature paper, was making major strides in lattice QCD simulations on the supercomputers provided by GCS. “Though our results on the muon g-2 are new, and have to be thoroughly scrutinized by other groups, we have a long record of computing various physical phenomena in quantum chromodynamics,” said Prof. Fodor. “Our previous major achievements were computing the mass of the proton, the proton-neutron mass difference, the phase diagram of the early universe and a possible solution for the dark matter problem. These paved the way to our most recent result.”

    Lattice QCD calculations allow researchers to accurately plot subatomic particle movements and interactions with extremely fine time resolution. However, they are only as precise as computational power allows—in order to perform these calculations in a timely manner, researchers have had to limit some combination of simulation size, resolution, or time. As computational resources have gotten more powerful, researchers have been able to do more precise simulations.
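
    As a rough illustration of why precision tracks computational power, here is a hypothetical back-of-the-envelope cost model in Python. The scaling exponents are illustrative assumptions for a sketch, not figures from the article:

    ```python
    # Toy cost model for a lattice QCD campaign. The number of lattice sites in a
    # 4D box of extent L with spacing a scales as (L/a)^4, and finer spacings pay
    # an extra algorithmic penalty (modeled here, purely illustratively, as a^-2).

    def relative_cost(box_size_fm: float, spacing_fm: float,
                      sites_exponent: float = 4.0, spacing_penalty: float = 2.0) -> float:
        """Relative compute cost for an L^4 lattice with spacing a (both in fm)."""
        sites = (box_size_fm / spacing_fm) ** sites_exponent
        return sites * spacing_fm ** (-spacing_penalty)

    # Halving the lattice spacing at fixed box size multiplies the cost by
    # 2^4 * 2^2 = 64 in this toy model -- a hint of why more precise lattices
    # need much bigger machines.
    baseline = relative_cost(6.0, 0.10)
    finer = relative_cost(6.0, 0.05)
    print(f"cost ratio for halved spacing: {finer / baseline:.0f}x")  # ~64x
    ```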

    “This foundational work shows that Germany’s world-class HPC infrastructure is essential for doing world-class science in Europe”, said Prof. Thomas Lippert, Director of the Jülich Supercomputing Centre, Professor for Quantum Computing and Modular Supercomputing at Goethe University [Goethe-Universität] Frankfurt(DE), current Chairman of the GCS Board of Directors, and also co-author of the Nature paper. “The computational resources of GCS not only play a central role in deepening the discourse on muon measurements, but they help European scientists and engineers become leaders in many scientific, industrial, and societal research areas.”

    While Fodor, Lippert, Szabo, and the team who published the Nature paper currently use their calculations to cool the claims of physics beyond the Standard Model, the researchers are also excited to continue working with international colleagues to definitively solve the mystery surrounding muon magnetism. The team anticipates that even more powerful HPC systems will be necessary to prove the existence of physics beyond the Standard Model. “The DOE’s Fermi National Accelerator Laboratory(US) experiment will increase the precision by a factor of four in two years. We theorists have to keep up with this pace if we want to fully exploit the new physics discovery potential of muons.” Szabo said.

    Further Information:
    Physical Review Letters

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The Gauss Centre for Supercomputing (DE) combines the three national supercomputing centres HLRS (High Performance Computing Center Stuttgart [Hochleistungsrechnungszentrum Stuttgart] (DE)), JSC (Jülich Supercomputing Centre [Forschungszentrum Jülich] (DE)), and LRZ (Leibniz Supercomputing Centre [Leibniz-Rechenzentrum] (DE)) into Germany’s Tier-0 supercomputing institution. Each GCS member centre hosts supercomputers well beyond the 1 petaflops performance mark. Together, the three centres provide the largest and most powerful supercomputing infrastructure in all of Europe to serve a wide range of industrial and research activities in various disciplines. They also provide top-class training and education for the national as well as the European High Performance Computing (HPC) community.

    GCS is the German member of PRACE (Partnership for Advanced Computing in Europe), an international non-profit association consisting of 25 member countries, whose representative organizations create a pan-European supercomputing infrastructure, providing access to computing and data management resources and services for large-scale scientific and engineering applications at the highest performance level.

    Gauss Centre for Supercomputing LRZ – Leibniz Supercomputing Centre Garching

    GCS is jointly funded by the German Federal Ministry of Education and Research and the federal states of Baden-Württemberg, Bavaria and North Rhine-Westphalia.

    GCS has its headquarters in Berlin, Germany.

     
  • richardmitnick 11:56 am on March 5, 2021
    Tags: "Physicists Just Found 4 New Subatomic Particles That May Test The Laws of Nature", CERN(CH), Hadrons, Mesons, Protons and neutrons, Quarks and antiquarks, Standard Model of Particle Physics, Tetraquarks and pentaquarks, The four new particles we've discovered recently are all tetraquarks with a charm quark pair and two other quarks., The standard model is certainly not the last word in the understanding of particles., These models are crucial to achieve the ultimate goal of the LHC: find physics beyond the standard model.

    From CERN(CH) via Science Alert(AU): “Physicists Just Found 4 New Subatomic Particles That May Test The Laws of Nature” 


    From CERN(CH)

    via

    Science Alert (AU)

    5 MARCH 2021
    PATRICK KOPPENBURG
    Research Fellow in Particle Physics
    Dutch National Institute for Subatomic Physics, Dutch Research Council (NWO – Nederlandse Organisatie voor Wetenschappelijk Onderzoek)(NL)

    Harry Cliff
    Particle physicist
    University of Cambridge(UK).

    The Large Hadron Collider. Credit: CERN.

    This month is a time to celebrate. CERN has just announced the discovery of four brand new particles [3 March 2021: Observation of two cc̄us̄ tetraquarks and two cc̄ss̄ tetraquarks] at the Large Hadron Collider (LHC) in Geneva.

    This means that the LHC has now found a total of 59 new particles, in addition to the Nobel prize-winning Higgs boson, since it started colliding protons – particles that make up the atomic nucleus along with neutrons – in 2009.

    Excitingly, while some of these new particles were expected based on our established theories, some were altogether more surprising.

    The LHC’s goal is to explore the structure of matter at the shortest distances and highest energies ever probed in the lab – testing our current best theory of nature: the Standard Model of Particle Physics.

    Standard Model of Particle Physics (LATHAM BOYLE AND MARDUS OF WIKIMEDIA COMMONS).

    And the LHC has delivered the goods – it enabled scientists to discover the Higgs boson [below], the last missing piece of the model. That said, the theory is still far from being fully understood.

    One of its most troublesome features is its description of the strong interaction which holds the atomic nucleus together. The nucleus is made up of protons and neutrons, which are in turn each composed of three tiny particles called quarks (there are six different kinds of quarks: up, down, charm, strange, top and bottom).

    If we switched the strong force off for a second, all matter would immediately disintegrate into a soup of loose quarks – a state that existed for a fleeting instant at the beginning of the universe.

    Don’t get us wrong: the theory of the strong interaction, pretentiously called Quantum Chromodynamics, is on very solid footing. It describes how quarks interact through the strong interaction by exchanging particles called gluons. You can think of gluons as analogues of the more familiar photon, the particle of light and carrier of the electromagnetic interaction.

    However, the way gluons interact with quarks makes the strong interaction behave very differently from electromagnetism. While the electromagnetic interaction gets weaker as you pull two charged particles apart, the strong interaction actually gets stronger as you pull two quarks apart.
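
    A standard way to make the contrast concrete is to compare the potential energy between two electric charges with the commonly used “Cornell” form of the quark–antiquark potential (a textbook reference formula, not one quoted in this article). The linear term in the second expression keeps growing without bound as the quarks separate:

    ```latex
    V_{\text{EM}}(r) = -\frac{\alpha}{r},
    \qquad
    V_{\text{QCD}}(r) \approx -\frac{4}{3} \frac{\alpha_s}{r} + \kappa\, r,
    ```

    where the “string tension” κ is of order 1 GeV per femtometre.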

    As a result, quarks are forever locked up inside particles called hadrons – particles made of two or more quarks – which include protons and neutrons. Unless, of course, you smash them open at incredible speeds, as we are doing at CERN.

    To complicate matters further, all the particles in the standard model have antiparticles which are nearly identical to themselves but with the opposite charge (or other quantum property). If you pull a quark out of a proton, the force will eventually be strong enough to create a quark-antiquark pair, with the newly created quark going into the proton.

    You end up with a proton and a brand new “meson”, a particle made of a quark and an antiquark. This may sound weird but according to quantum mechanics, which rules the universe on the smallest of scales, particles can pop out of empty space.

    This has been shown repeatedly by experiments – we have never seen a lone quark. An unpleasant feature of the theory of the strong interaction is that calculations of what would be a simple process in electromagnetism can end up being impossibly complicated. We therefore cannot (yet) prove theoretically that quarks can’t exist on their own.

    Worse still, we can’t even calculate which combinations of quarks would be viable in nature and which would not.

    2
    Illustration of a tetraquark. Credit: CERN.

    When quarks were first discovered, scientists realized that several combinations should be possible in theory. This included pairs of quarks and antiquarks (mesons); three quarks (baryons); three antiquarks (antibaryons); two quarks and two antiquarks (tetraquarks); and four quarks and one antiquark (pentaquarks) – as long as the number of quarks minus antiquarks in each combination was a multiple of three.
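
    The counting rule in that list is simple enough to check in a few lines of code. A minimal, purely illustrative sketch:

    ```python
    # Enumerate small combinations of quarks and antiquarks, keeping those allowed
    # by the rule above: (number of quarks - number of antiquarks) must be a
    # multiple of three.
    NAMES = {
        (1, 1): "meson",
        (3, 0): "baryon",
        (0, 3): "antibaryon",
        (2, 2): "tetraquark",
        (4, 1): "pentaquark",
    }

    for n_q in range(5):
        for n_qbar in range(5):
            if n_q + n_qbar == 0:
                continue  # skip the empty combination
            if (n_q - n_qbar) % 3 == 0:
                label = NAMES.get((n_q, n_qbar), "allowed multi-quark state")
                print(f"{n_q} quarks + {n_qbar} antiquarks -> {label}")
    ```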

    For a long time, only baryons and mesons were seen in experiments. But in 2003, the Belle experiment in Japan discovered a particle that didn’t fit in anywhere.

    KEK Belle detector, at the High Energy Accelerator Research Organisation (KEK) in Tsukuba, Ibaraki Prefecture, Japan.

    Belle II KEK High Energy Accelerator Research Organization Tsukuba, Japan.

    It turned out to be the first of a long series of tetraquarks.

    In 2015, the LHCb experiment [below] at the LHC discovered two pentaquarks.

    Is a pentaquark tightly (above) or weakly bound (see image below)? Credit: CERN.

    The four new particles we’ve discovered recently are all tetraquarks with a charm quark pair and two other quarks. All these objects are particles in the same way as the proton and the neutron are particles. But they are not fundamental particles: quarks and electrons are the true building blocks of matter.

    Charming new particles

    The LHC has now discovered 59 new hadrons. These include the tetraquarks most recently discovered, but also new mesons and baryons. All these new particles contain heavy quarks such as “charm” and “bottom”.

    These hadrons are interesting to study. They tell us what nature considers acceptable as a bound combination of quarks, even if only for very short times.

    They also tell us what nature does not like. For example, why do all tetra- and pentaquarks contain a charm-quark pair (with just one exception)? And why are there no corresponding particles with strange-quark pairs? There is currently no explanation.

    Is a pentaquark a molecule? A meson (left) interacting with a proton (right). Credit: CERN.

    Another mystery is how these particles are bound together by the strong interaction. One school of theorists considers them to be compact objects, like the proton or the neutron.

    Others claim they are akin to “molecules” formed by two loosely bound hadrons. Each newly found hadron allows experiments to measure its mass and other properties, which tell us something about how the strong interaction behaves. This helps bridge the gap between experiment and theory. The more hadrons we can find, the better we can tune the models to the experimental facts.

    These models are crucial to achieve the ultimate goal of the LHC: find physics beyond the standard model. Despite its successes, the standard model is certainly not the last word in the understanding of particles. It is for instance inconsistent with cosmological models describing the formation of the universe.

    The LHC is searching for new fundamental particles that could explain these discrepancies. These particles could be visible at the LHC, but hidden in the background of particle interactions. Or they could show up as small quantum mechanical effects in known processes.

    In either case, a better understanding of the strong interaction is needed to find them. With each new hadron, we improve our knowledge of nature’s laws, leading us to a better description of the most fundamental properties of matter.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Meet CERN(CH) in a variety of places:

    Quantum Diaries

    Cern Courier(CH)

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS

    CERN ATLAS Image Claudia Marcelloni CERN/ATLAS


    ALICE

    CERN/ALICE Detector


    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    LHC

    CERN map

    CERN LHC Maximilien Brice and Julien Marius Ordan.


    SixTRack CERN LHC particles

    The European Organization for Nuclear Research (Organisation européenne pour la recherche nucléaire)(EU), known as CERN, is a European research organization that operates the largest particle physics laboratory in the world. Established in 1954, the organization is based in a northwest suburb of Geneva on the Franco–Swiss border and has 23 member states. Israel is the only non-European country granted full membership. CERN is an official United Nations Observer.

    The acronym CERN is also used to refer to the laboratory, which in 2019 had 2,660 scientific, technical, and administrative staff members, and hosted about 12,400 users from institutions in more than 70 countries. In 2016 CERN generated 49 petabytes of data.

    CERN’s main function is to provide the particle accelerators and other infrastructure needed for high-energy physics research – as a result, numerous experiments have been constructed at CERN through international collaborations. The main site at Meyrin hosts a large computing facility, which is primarily used to store and analyse data from experiments, as well as simulate events. Researchers need remote access to these facilities, so the lab has historically been a major wide area network hub. CERN is also the birthplace of the World Wide Web.

    The convention establishing CERN was ratified on 29 September 1954 by 12 countries in Western Europe. The acronym CERN originally represented the French words for Conseil Européen pour la Recherche Nucléaire (European Council for Nuclear Research), which was a provisional council for building the laboratory, established by 12 European governments in 1952. The acronym was retained for the new laboratory after the provisional council was dissolved, even though the name changed to the current Organisation Européenne pour la Recherche Nucléaire (European Organization for Nuclear Research)(EU) in 1954. According to Lew Kowarski, a former director of CERN, when the name was changed, the abbreviation could have become the awkward OERN, and Werner Heisenberg said that this could “still be CERN even if the name is [not]”.

    CERN’s first president was Sir Benjamin Lockspeiser. Edoardo Amaldi was the general secretary of CERN at its early stages when operations were still provisional, while the first Director-General (1954) was Felix Bloch.

    The laboratory was originally devoted to the study of atomic nuclei, but was soon applied to higher-energy physics, concerned mainly with the study of interactions between subatomic particles. Therefore, the laboratory operated by CERN is commonly referred to as the European laboratory for particle physics (Laboratoire européen pour la physique des particules), which better describes the research being performed there.

    Founding members

    At the sixth session of the CERN Council, which took place in Paris from 29 June – 1 July 1953, the convention establishing the organization was signed, subject to ratification, by 12 states. The convention was gradually ratified by the 12 founding Member States: Belgium, Denmark, France, the Federal Republic of Germany, Greece, Italy, the Netherlands, Norway, Sweden, Switzerland, the United Kingdom, and “Yugoslavia”.

    Scientific achievements

    Several important achievements in particle physics have been made through experiments at CERN. They include:

    1973: The discovery of neutral currents in the Gargamelle bubble chamber.
    1983: The discovery of W and Z bosons in the UA1 and UA2 experiments.
    1989: The determination of the number of light neutrino families at the Large Electron–Positron Collider (LEP) operating on the Z boson peak.
    1995: The first creation of antihydrogen atoms in the PS210 experiment.
    1999: The discovery of direct CP violation in the NA48 experiment.
    2010: The isolation of 38 atoms of antihydrogen.
    2011: Maintaining antihydrogen for over 15 minutes.
    2012: A boson with mass around 125 GeV/c^2 consistent with the long-sought Higgs boson.

    In September 2011, CERN attracted media attention when the OPERA Collaboration reported the detection of possibly faster-than-light neutrinos. Further tests showed that the results were flawed due to an incorrectly connected GPS synchronization cable.

    The 1984 Nobel Prize for Physics was awarded to Carlo Rubbia and Simon van der Meer for the developments that resulted in the discoveries of the W and Z bosons. The 1992 Nobel Prize for Physics was awarded to CERN staff researcher Georges Charpak “for his invention and development of particle detectors, in particular the multiwire proportional chamber”. The 2013 Nobel Prize for Physics was awarded to François Englert and Peter Higgs for the theoretical description of the Higgs mechanism in the year after the Higgs boson was found by CERN experiments.

    Computer science

    The World Wide Web began as a CERN project named ENQUIRE, initiated by Tim Berners-Lee in 1989 and Robert Cailliau in 1990. Berners-Lee and Cailliau were jointly honoured by the Association for Computing Machinery in 1995 for their contributions to the development of the World Wide Web.

    Current complex

    CERN operates a network of six accelerators and a decelerator. Each machine in the chain increases the energy of particle beams before delivering them to experiments or to the next more powerful accelerator. Currently (as of 2019) active machines are:

    The LINAC 3 linear accelerator generating low energy particles. It provides heavy ions at 4.2 MeV/u for injection into the Low Energy Ion Ring (LEIR).
    The Proton Synchrotron Booster increases the energy of particles generated by the proton linear accelerator before they are transferred to the other accelerators.
    The Low Energy Ion Ring (LEIR) accelerates the ions from the ion linear accelerator LINAC 3, before transferring them to the Proton Synchrotron (PS). This accelerator was commissioned in 2005, after having been reconfigured from the previous Low Energy Antiproton Ring (LEAR).
    The 28 GeV Proton Synchrotron (PS), built during 1954–1959 and still operating as a feeder to the more powerful SPS.
    The Super Proton Synchrotron (SPS), a circular accelerator with a diameter of 2 kilometres built in a tunnel, which started operation in 1976. It was designed to deliver an energy of 300 GeV and was gradually upgraded to 450 GeV. As well as having its own beamlines for fixed-target experiments (currently COMPASS and NA62), it has been operated as a proton–antiproton collider (the SppS collider), and for accelerating high energy electrons and positrons which were injected into the Large Electron–Positron Collider (LEP). Since 2008, it has been used to inject protons and heavy ions into the Large Hadron Collider (LHC).
    The On-Line Isotope Mass Separator (ISOLDE), which is used to study unstable nuclei. The radioactive ions are produced by the impact of protons at an energy of 1.0–1.4 GeV from the Proton Synchrotron Booster. It was first commissioned in 1967 and was rebuilt with major upgrades in 1974 and 1992.
    The Antiproton Decelerator (AD), which reduces the velocity of antiprotons to about 10% of the speed of light for research on antimatter. The AD machine was reconfigured from the previous Antiproton Collector (AC) machine.
    The AWAKE experiment, which is a proof-of-principle plasma wakefield accelerator.
    The CERN Linear Electron Accelerator for Research (CLEAR) accelerator research and development facility.

    Large Hadron Collider

    Many activities at CERN currently involve operating the Large Hadron Collider (LHC) and the experiments for it. The LHC represents a large-scale, worldwide scientific cooperation project.

    The LHC tunnel is located 100 metres underground, in the region between the Geneva International Airport and the nearby Jura mountains. The majority of its length is on the French side of the border. It uses the 27 km circumference circular tunnel previously occupied by the Large Electron–Positron Collider (LEP), which was shut down in November 2000. CERN’s existing PS/SPS accelerator complexes are used to pre-accelerate protons and lead ions which are then injected into the LHC.

    Eight experiments (CMS, ATLAS, LHCb, MoEDAL, TOTEM, LHCf, FASER and ALICE) are located along the collider; each of them studies particle collisions from a different aspect, and with different technologies. Construction for these experiments required an extraordinary engineering effort. For example, a special crane was rented from Belgium to lower pieces of the CMS detector into its cavern, since each piece weighed nearly 2,000 tons. The first of the approximately 5,000 magnets necessary for construction was lowered down a special shaft at 13:00 GMT on 7 March 2005.

    The LHC has begun to generate vast quantities of data, which CERN streams to laboratories around the world for distributed processing (making use of a specialized grid infrastructure, the LHC Computing Grid). During April 2005, a trial successfully streamed 600 MB/s to seven different sites across the world.

    The initial particle beams were injected into the LHC in August 2008. The first beam was circulated through the entire LHC on 10 September 2008, but the system failed 10 days later because of a faulty magnet connection, and it was stopped for repairs on 19 September 2008.

    The LHC resumed operation on 20 November 2009 by successfully circulating two beams, each with an energy of 3.5 teraelectronvolts (TeV). The challenge for the engineers was then to try to line up the two beams so that they smashed into each other. This is like “firing two needles across the Atlantic and getting them to hit each other” according to Steve Myers, director for accelerators and technology.

    On 30 March 2010, the LHC successfully collided two proton beams with 3.5 TeV of energy per proton, resulting in a 7 TeV collision energy. However, this was just the start of what was needed for the expected discovery of the Higgs boson. When the 7 TeV experimental period ended, the LHC revved to 8 TeV (4 TeV per proton) starting March 2012, and soon began particle collisions at that energy. In July 2012, CERN scientists announced the discovery of a new sub-atomic particle that was later confirmed to be the Higgs boson.

    CERN CMS Higgs Event May 27, 2012.


    CERN ATLAS Higgs Event June 12, 2012.


    Peter Higgs

    In March 2013, CERN announced that the measurements performed on the newly found particle allowed it to conclude that this is a Higgs boson. In early 2013, the LHC was deactivated for a two-year maintenance period, to strengthen the electrical connections between magnets inside the accelerator and for other upgrades.

    On 5 April 2015, after two years of maintenance and consolidation, the LHC restarted for a second run. The first ramp to the record-breaking energy of 6.5 TeV was performed on 10 April 2015. In 2016, the design collision rate was exceeded for the first time. A second two-year period of shutdown began at the end of 2018.

    Accelerators under construction

    As of October 2019, construction is ongoing to upgrade the LHC’s luminosity in a project called High Luminosity LHC (HL-LHC).

    This project should see the LHC accelerator upgraded by 2026 to an order of magnitude higher luminosity.

    As part of the HL-LHC upgrade project, other CERN accelerators and their subsystems are also receiving upgrades. Among other work, the LINAC 2 linear accelerator injector was decommissioned, replaced in 2020 by a new injector accelerator, LINAC 4.

    Possible future accelerators

    CERN, in collaboration with groups worldwide, is investigating two main concepts for future accelerators: a linear electron-positron collider with a new acceleration concept to increase the energy (CLIC), and a larger version of the LHC, a project currently named Future Circular Collider.

    CLIC collider

    CERN FCC Future Circular Collider: details of the proposed 100 km-circumference successor to the LHC.

    Not discussed or described above, but worthy of consideration, is the International Linear Collider (ILC), in the planning stages for construction in Japan.

    ILC schematic, being planned for the Kitakami highland, in the Iwate prefecture of northern Japan.

    Participation

    Since its foundation by 12 members in 1954, CERN has regularly accepted new members. All members have remained in the organization continuously since their accession, except Spain and Yugoslavia: Spain first joined CERN in 1961, withdrew in 1969, and rejoined in 1983; Yugoslavia was a founding member but left in 1961. Of the 23 current members, Israel, which joined as a full member on 6 January 2014, is the first (and so far only) non-European full member.

    Enlargement

    Associate Members, Candidates:

    Turkey signed an association agreement on 12 May 2014 and became an associate member on 6 May 2015.
    Pakistan signed an association agreement on 19 December 2014 and became an associate member on 31 July 2015.
    Cyprus signed an association agreement on 5 October 2012 and became an Associate Member in the pre-stage to membership on 1 April 2016.
    Ukraine signed an association agreement on 3 October 2013. The agreement was ratified on 5 October 2016.
    India signed an association agreement on 21 November 2016. The agreement was ratified on 16 January 2017.
    Slovenia was approved for admission as an Associate Member state in the pre-stage to membership on 16 December 2016. The agreement was ratified on 4 July 2017.
    Lithuania was approved for admission as an Associate Member state on 16 June 2017. The association agreement was signed on 27 June 2017 and ratified on 8 January 2018.
    Croatia was approved for admission as an Associate Member state on 28 February 2019. The agreement was ratified on 10 October 2019.
    Estonia was approved for admission as an Associate Member state in the pre-stage to membership on 19 June 2020. The agreement was ratified on 1 February 2021.

     
  • richardmitnick 3:14 pm on March 10, 2020 Permalink | Reply
    Tags: "Accounting for the Higgs", , , , , , , , Standard Model of Particle Physics,   

    From Symmetry: “Accounting for the Higgs” 

    Symmetry Mag
    From Symmetry

    03/10/20
    Sarah Charley

    Only a fraction of collision events that look like they produce a Higgs boson actually produce a Higgs boson. Luckily, it doesn’t matter.

    CERN CMS Higgs Event May 27, 2012

    LHC

    CERN map


    CERN LHC Maximilien Brice and Julien Marius Ordan


    CERN LHC particles

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS

    CERN ATLAS Image Claudia Marcelloni CERN/ATLAS

    ALICE

    CERN/ALICE Detector


    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    I’ll let you in on a little secret: Even though physicists have produced millions of Higgs bosons at the Large Hadron Collider, they’ve never actually seen one. Higgs bosons are fragile things that dissolve immediately after they’re born. But as they die, they produce other particles, which, if they’re created at the LHC, can travel through a particle detector and leave recognizable signatures.

    Here’s another secret: Higgs signatures are identical to the signatures of numerous other processes. In fact, every time the Higgs signs its name in a detector, there are many more background processes leaving the exact same marks.

    For instance, one of the Higgs boson’s cleanest signatures is two photons with a combined mass of around 125 billion electronvolts. But for every 10 diphoton events that look like a Higgs signature, only about one actually comes from a Higgs.
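
    To make the “combined mass” idea concrete: for two effectively massless photons, the invariant mass is fixed entirely by their energies and opening angle. Here is a minimal Python sketch with illustrative numbers (this is not ATLAS or CMS analysis code):

        import math

        def diphoton_mass(e1_gev, e2_gev, opening_angle_rad):
            # Invariant mass of two massless photons:
            # m^2 = 2 * E1 * E2 * (1 - cos(theta))
            return math.sqrt(2.0 * e1_gev * e2_gev * (1.0 - math.cos(opening_angle_rad)))

        # Two back-to-back 62.5 GeV photons reconstruct to the Higgs mass:
        print(diphoton_mass(62.5, 62.5, math.pi))  # ~125.0 GeV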

    So how can scientists study something that they cannot see and cannot isolate? They employ the same technique FBI agents use to uncover illegal money laundering schemes: accounting.

    In money laundering, “dirty” money (from illegal activities) is mixed with “clean” money from a legitimate business like a car wash. It all looks the same, so determining which Benjamins came from drugs versus which came from detailing is impossible. But agents don’t need to look at the individual dollars; they just need to look for suspiciously large spikes in profit that cannot be explained by regular business activities.

    In physics, the accounting comes from a much-loved set of equations called the Standard Model.

    Standard Model of Particle Physics, Quantum Diaries

    Physicists have spent decades building and perfecting the Standard Model, which tells them what percentage of the time different subatomic processes should happen. Scientists know which signatures are associated with which processes, so if they see a signature more often than expected, it means there is something happening outside the purview of the Standard Model: a new process.

    Clever accounting is how scientists originally discovered the Higgs boson in 2012. Theorists predicted what the Higgs signatures should look like, and when physicists went searching, they consistently saw some of these signatures more frequently than they could explain without the Higgs boson. When scientists added the mathematics for the Higgs boson into the equations, the predictions matched the data.
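
    The accounting itself fits in a few lines. Here is a deliberately crude Python sketch of the idea, with hypothetical numbers (real searches use full likelihood fits over many observables):

        import math

        def excess_significance(observed, expected_background):
            # Crude significance of a counting excess: (n - b) / sqrt(b).
            return (observed - expected_background) / math.sqrt(expected_background)

        # Hypothetical bookkeeping: the background-only prediction is 10,000
        # events in a mass window, but 10,500 are observed.
        print(f"{excess_significance(10_500, 10_000):.1f} sigma")  # 5.0 sigma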

    Today, physicists use this accounting method to search for new particles. Many of these new particles are predicted to be rarer than Higgs bosons (for reference, Higgs bosons are produced in about one in a billion collisions). Many processes are also less clear-cut, and just the act of standardizing the accounting is a challenge. (To return to the money laundering analogy, it would be like FBI agents investigating an upscale bar, where a sudden excess could be explained by a generous tip.)

    To find these complex and subtle signs of mischief, scientists need a huge amount of data and a finely tuned model. Future runs of the LHC will be dedicated to building up these enormous datasets so that scientists can dig through the books for numbers that the Standard Model cannot explain.

    See the full article here .



    Please help promote STEM in your local schools.


    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 2:11 pm on June 11, 2019 Permalink | Reply
    Tags: , , , , Future Circular Collider (FCC), If we don’t push the frontiers of physics we’ll never learn what lies beyond our current understanding., , Lepton collider, New accelerators explored, , , , Proton collider, Standard Model of Particle Physics   

    From Ethan Siegel: “Does Particle Physics Have A Future On Earth?” 

    From Ethan Siegel
    Jun 11, 2019

    1
    The inside of the LHC, where protons pass each other at 299,792,455 m/s, just 3 m/s shy of the speed of light. As powerful as the LHC is, the cancelled SSC could have been three times as powerful, and may have revealed secrets of nature that are inaccessible at the LHC. (CERN)

    If we don’t push the frontiers of physics, we’ll never learn what lies beyond our current understanding.

    At a fundamental level, what is our Universe made of? This question has driven physics forward for centuries. Even with all the advances we’ve made, we still don’t know it all. While the Large Hadron Collider discovered the Higgs boson and completed the Standard Model earlier this decade, the full suite of particles we know of makes up only 5% of the total energy in the Universe.

    CERN CMS Higgs Event


    CERN ATLAS Higgs Event

    Standard Model of Particle Physics

    We don’t know what dark matter is, but the indirect evidence for it is overwhelming.

    Fritz Zwicky discovered dark matter while observing the movement of the Coma Cluster. Vera Rubin, a woman in STEM denied the Nobel Prize, did most of the work on dark matter.

    Fritz Zwicky from http://palomarskies.blogspot.com

    Coma cluster via NASA/ESA Hubble

    Astronomer Vera Rubin at the Lowell Observatory in 1965, worked on Dark Matter (The Carnegie Institution for Science)


    Vera Rubin measuring spectra, worked on Dark Matter (Emilio Segre Visual Archives AIP SPL)


    Vera Rubin, with Department of Terrestrial Magnetism (DTM) image tube spectrograph attached to the Kitt Peak 84-inch telescope, 1970. https://home.dtm.ciw.edu

    Same deal with dark energy.

    Dark Energy Survey


    Dark Energy Camera [DECam], built at FNAL


    NOAO/CTIO Victor M. Blanco 4m Telescope, which houses DECam, at Cerro Tololo, Chile, at an altitude of 7,200 feet.

    Timeline of the Inflationary Universe WMAP

    The Dark Energy Survey (DES) is an international, collaborative effort to map hundreds of millions of galaxies, detect thousands of supernovae, and find patterns of cosmic structure that will reveal the nature of the mysterious dark energy that is accelerating the expansion of our Universe. DES began searching the Southern skies on August 31, 2013.

    According to Einstein’s theory of General Relativity, gravity should lead to a slowing of the cosmic expansion. Yet, in 1998, two teams of astronomers studying distant supernovae made the remarkable discovery that the expansion of the universe is speeding up. To explain cosmic acceleration, cosmologists are faced with two possibilities: either 70% of the universe exists in an exotic form, now called dark energy, that exhibits a gravitational force opposite to the attractive gravity of ordinary matter, or General Relativity must be replaced by a new theory of gravity on cosmic scales.

    DES is designed to probe the origin of the accelerating universe and help uncover the nature of dark energy by measuring the 14-billion-year history of cosmic expansion with high precision. More than 400 scientists from over 25 institutions in the United States, Spain, the United Kingdom, Brazil, Germany, Switzerland, and Australia are working on the project. The collaboration built and is using an extremely sensitive 570-Megapixel digital camera, DECam, mounted on the Blanco 4-meter telescope at Cerro Tololo Inter-American Observatory, high in the Chilean Andes, to carry out the project.

    Over six years (2013-2019), the DES collaboration used 758 nights of observation to carry out a deep, wide-area survey to record information from 300 million galaxies that are billions of light-years from Earth. The survey imaged 5000 square degrees of the southern sky in five optical filters to obtain detailed information about each galaxy. A fraction of the survey time was used to observe smaller patches of sky roughly once a week to discover and study thousands of supernovae and other astrophysical transients.

    Or questions like why the fundamental particles have the masses they do, or why neutrinos aren’t massless, or why our Universe is made of matter and not antimatter. Our current tools and searches have not answered these great existential puzzles of modern physics. Particle physics now faces an incredible dilemma: try harder, or give up.

    2
    The Standard Model of particle physics accounts for three of the four forces (excepting gravity), the full suite of discovered particles, and all of their interactions. Whether there are additional particles and/or interactions that are discoverable with colliders we can build on Earth is a debatable subject, but one we’ll only know the answer to if we explore past the known energy frontier. (CONTEMPORARY PHYSICS EDUCATION PROJECT / DOE / NSF / LBNL)

    The particles and interactions that we know of are all governed by the Standard Model of particle physics, plus gravity, dark matter, and dark energy. In particle physics experiments, however, it’s the Standard Model alone that matters. The six quarks, the charged leptons and neutrinos, the gluons, the photon, the gauge bosons, and the Higgs boson are all that it predicts, and each particle has not only been discovered, but its properties have been measured.

    As a result, the Standard Model is perhaps a victim of its own success. The masses, spins, lifetimes, interaction strengths, and decay ratios of every particle and antiparticle have all been measured, and they agree with the Standard Model’s predictions at every turn. There are enormous puzzles about our Universe, and particle physics has given us no experimental indications of where or how they might be solved.

    3
    The particles and antiparticles of the Standard Model have now all been directly detected, with the last holdout, the Higgs Boson, falling at the LHC earlier this decade. All of these particles can be created at LHC energies, and the masses of the particles lead to fundamental constants that are absolutely necessary to describe them fully. These particles can be well-described by the physics of the quantum field theories underlying the Standard Model, but they do not describe everything, like dark matter. (E. SIEGEL / BEYOND THE GALAXY)

    It might be tempting, therefore, to presume that building a superior particle collider would be a fruitless endeavor. Indeed, this could be the case. The Standard Model of particle physics has explicit predictions for the couplings that occur between particles. While there are a number of parameters that remain poorly determined at present, it’s conceivable that there are no new particles that a next-generation collider could reveal.

    The heaviest Standard Model particle is the top quark, which takes roughly 180 GeV of energy to create. While the Large Hadron Collider can reach energies of 14 TeV (about 80 times the energy needed to create a top quark), there might not be any new particles present to find unless we reach energies in excess of 1,000,000 times as great. This is the great fear of many: the possible existence of a so-called “energy desert” extending for many orders of magnitude.

    4
    There is certainly new physics beyond the Standard Model, but it might not show up until energies far, far greater than what a terrestrial collider could ever reach. Still, whether this scenario is true or not, the only way we’ll know is to look. In the meantime, properties of the known particles can be better explored with a future collider than any other tool. The LHC has failed to reveal, thus far, anything beyond the known particles of the Standard Model. (UNIVERSE-REVIEW.CA)

    But it’s also possible that there is new physics present at a modest scale beyond where we’ve presently probed. There are many theoretical extensions to the Standard Model that are quite generic, where deviations from the Standard Model’s predictions can be detected by a next-generation collider.

    If we want to know what the truth about our Universe is, we have to look, and that means pushing the present frontiers of particle physics into uncharted territory. Right now, the community is debating between multiple approaches, with each one having its pros and cons. The nightmare scenario, however, isn’t that we’ll look and won’t find anything. It’s that infighting and a lack of unity will doom experimental physics forever, and that we won’t get a next-generation collider at all.

    5
    A hypothetical new accelerator, either a long linear one or one inhabiting a large tunnel beneath the Earth, could dwarf the sensitivity to new particles that prior and current colliders can achieve. Even at that, there’s no guarantee we’ll find anything new, but we’re certain to find nothing new if we fail to try. (ILC COLLABORATION)

    When it comes to deciding what collider to build next, there are two generic approaches: a lepton collider (where electrons and positrons are accelerated and collided), and a proton collider (where protons are accelerated and collided). The lepton colliders have the advantages of:

    leptons are point particles, rather than composite particles;
    essentially 100% of the energy from electrons colliding with positrons can be converted into energy for new particles;
    the signal is clean and much easier to extract;
    and the energy is controllable, meaning we can choose to tune the energy to a specific value and maximize the chance of creating a specific particle.

    Lepton colliders, in general, are great for precision studies, and we haven’t had a cutting-edge one since LEP was operational nearly 20 years ago.
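
    The second advantage in the list above can be put into numbers: in a symmetric electron-positron collider, the full 2 × E_beam is available to make new particles, while in a proton collider each colliding parton carries only a fraction of its proton’s energy. A minimal sketch, with illustrative parton momentum fractions:

        import math

        def ee_sqrt_s(beam_energy_gev):
            # Symmetric e+e- collider: all of 2 * E_beam is available.
            return 2.0 * beam_energy_gev

        def pp_parton_sqrt_s(beam_energy_gev, x1, x2):
            # Proton collider: partons carry fractions x1, x2 of the beam
            # energy, so sqrt(s_hat) = sqrt(x1 * x2) * 2 * E_beam.
            return math.sqrt(x1 * x2) * 2.0 * beam_energy_gev

        print(ee_sqrt_s(125.0))                     # 250.0 GeV, all usable
        print(pp_parton_sqrt_s(7000.0, 0.1, 0.05))  # ~990 GeV of a 14 TeV machine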

    CERN LEP Collider

    5
    At various center-of-mass energies in electron/positron (lepton) colliders, various Higgs production mechanisms can be reached at explicit energies. While a circular collider can achieve much greater collision rates and production rates of W, Z, H, and t particles, a long-enough linear collider can conceivably reach higher energies, enabling us to probe Higgs production mechanisms that a circular collider cannot reach. This is the main advantage that linear lepton colliders possess; if they are low-energy only (like the proposed ILC), there is no reason not to go circular. (H. ABRAMOWICZ ET AL., EUR. PHYS. J. C 77, 475 (2017))

    It’s very unlikely, unless nature is extremely kind, that a lepton collider will directly discover a new particle, but it may be the best bet for indirectly discovering evidence of particles beyond the Standard Model. We’ve already discovered particles like the W and Z bosons, the Higgs boson, and the top quark, but a lepton collider could both produce them in great abundances and through a variety of channels.

    The more events of interest we create, the more deeply we can probe the Standard Model. The Large Hadron Collider, for example, will be able to tell whether the Higgs behaves consistently with the Standard Model down to about the 1% level. In a wide series of extensions to the Standard Model, ~0.1% deviations are expected, and the right future lepton collider will get you the best physics constraints possible.

    6
    The observed Higgs decay channels vs. the Standard Model agreement, with the latest data from ATLAS and CMS included. The agreement is astounding, and yet frustrating at the same time. By the 2030s, the LHC will have approximately 50 times as much data, but the precisions on many decay channels will still only be known to a few percent. A future collider could increase that precision by multiple orders of magnitude, revealing the existence of potential new particles. (ANDRÉ DAVID, VIA TWITTER)

    These precision studies could be incredibly sensitive to the presence of particles or interactions we haven’t yet discovered. When we create a particle, it has a certain set of branching ratios, or probabilities that it will decay in a variety of ways. The Standard Model makes explicit predictions for those ratios, so if we create a million, or a billion, or a trillion such particles, we can probe those branching ratios to unprecedented precisions.

    If you want better physics constraints, you need more data and better data. It isn’t just the technical considerations that should determine which collider comes next, but also where and how you can get the best personnel, the best infrastructure and support, and where you can build a strong experimental and theoretical physics community (or take advantage of an already-existing one).

    7
    The idea of a linear lepton collider has been bandied about in the particle physics community as the ideal machine to explore post-LHC physics for many decades, but that was under the assumption that the LHC would find a new particle other than the Higgs. If we want to do precision testing of Standard Model particles to indirectly search for new physics, a linear collider may be an inferior option to a circular lepton collider. (REY HORI/KEK)

    There are two general classes of proposals for a lepton collider: a circular collider and a linear collider. Linear colliders are simple: accelerate your particles in a straight line and collide them together in the center. With ideal accelerator technology, a linear collider 11 km long could reach energies of 380 GeV: enough to produce the W, Z, Higgs, or top in great abundance. With a 29 km linear collider, you could reach energies of 1.5 TeV, and with a 50 km collider, 3 TeV, although costs rise tremendously to accompany longer lengths.
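
    As a sanity check on those length/energy pairs, dividing the center-of-mass energy by the total length gives the average accelerating gradient each design implies. This is only a rough figure (actual RF gradients are higher, since the tunnel also houses sources, final-focus systems, and other non-accelerating hardware):

        designs = {  # quoted above: (length in km, center-of-mass energy in GeV)
            "380 GeV stage": (11, 380),
            "1.5 TeV stage": (29, 1500),
            "3 TeV stage": (50, 3000),
        }

        for name, (length_km, e_cm_gev) in designs.items():
            avg_mev_per_m = e_cm_gev * 1000.0 / (length_km * 1000.0)
            print(f"{name}: ~{avg_mev_per_m:.0f} MeV per meter on average")
        # prints roughly 35, 52, and 60 MeV per meter respectively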

    Linear colliders are slightly less expensive than circular colliders for the same energy, because you can dig a smaller tunnel to reach the same energies, and they don’t suffer energy losses due to synchrotron radiation, enabling them to reach potentially higher energies. However, the circular colliders offer an enormous advantage: they can produce much greater numbers of particles and collisions.
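
    The synchrotron-radiation penalty for circular electron machines is easy to estimate: a standard accelerator-physics formula gives the energy an electron radiates per turn as U0 [GeV] ≈ 8.85 × 10⁻⁵ × E⁴ [GeV⁴] / ρ [m]. A sketch with roughly LEP2-like numbers (assumed for illustration):

        def sync_loss_per_turn_gev(beam_energy_gev, bending_radius_m):
            # Energy radiated per turn by an electron in a circular machine.
            return 8.85e-5 * beam_energy_gev**4 / bending_radius_m

        # Roughly LEP2-like numbers: ~100 GeV beams, ~3,100 m bending radius.
        print(f"{sync_loss_per_turn_gev(100.0, 3100.0):.1f} GeV per turn")  # ~2.9 GeV

        # A proton of the same energy in the same ring radiates a factor of
        # (m_e / m_p)^4, about 1e-13, less; that is why the LHC can reuse a
        # tunnel that high-energy electrons cannot.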

    Future Circular Collider (FCC)Larger LHC


    The Future Circular Collider is a proposal to build, for the 2030s, a successor to the LHC with a circumference of up to 100 km: nearly four times the size of the present underground tunnels. This will enable, with current magnet technology, the creation of a lepton collider that can produce ~10⁴ times the number of W, Z, H, and t particles that have been produced by prior and current colliders. (CERN / FCC STUDY)

    While a linear collider might be able to produce 10 to 100 times as many collisions as a prior-generation lepton collider like LEP (dependent on energies), a circular version can surpass that easily: producing 10,000 times as many collisions at the energies required to create the Z boson.

    Circular colliders also have substantially higher event rates than linear colliders at the energies that produce Higgs particles, but they begin to lose that advantage at the energies required to produce top quarks, and cannot reach beyond that at all; there, linear colliders become dominant.

    Because all of the decay and production processes that occur in these heavy particles scale as either the number of collisions or the square root of the number of collisions, a circular collider has the potential to probe physics with many times the sensitivity of a linear collider.
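
    That square-root behavior is just counting statistics: the relative precision of a counting measurement improves as 1/√N, so 10,000 times the collisions buys a factor of 100 in precision. A two-line illustration:

        import math

        for n_events in (1e6, 1e10):  # e.g., a baseline sample vs. 10,000x more
            print(f"N = {n_events:.0e}: ~{1.0 / math.sqrt(n_events):.0e} relative precision")
        # 1e-03 vs. 1e-05: four orders of magnitude more data,
        # two orders of magnitude better precision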

    7
    A number of the various lepton colliders, with their luminosity (a measure of the collision rate and the number of detections one can make) as a function of center-of-mass collision energy. Note that the red line, which is a circular collider option, offers many more collisions than the linear version, but its advantage shrinks as energy increases. Beyond about 380 GeV, circular colliders cannot reach, and a linear collider like CLIC is the far superior option. (GRANADA STRATEGY MEETING SUMMARY SLIDES / LUCIE LINSSEN (PRIVATE COMMUNICATION))

    The proposed FCC-ee, or the lepton stage of the Future Circular Collider, would realistically discover indirect evidence for any new particles that coupled to the W, Z, Higgs, or top quark with masses up to 70 TeV: five times the maximum energy of the Large Hadron Collider.

    The flipside to a lepton collider is a proton collider, which — at these high energies — is essentially a gluon-gluon collider. This cannot be linear; it must be circular.

    8
    The scale of the proposed Future Circular Collider (FCC), compared with the LHC presently at CERN and the Tevatron, formerly operational at Fermilab. The Future Circular Collider is perhaps the most ambitious proposal for a next-generation collider to date, including both lepton and proton options as various phases of its proposed scientific programme. (PCHARITO / WIKIMEDIA COMMONS)

    There is really only one suitable site for this: CERN, since it not only needs a new, enormous tunnel, but all the infrastructure of the prior stages, which only exist at CERN. (The accelerators could be built elsewhere, but the cost would be higher than at a site where infrastructure like the LHC and earlier accelerators such as the SPS already exists.)

    The Super Proton Synchrotron (SPS), CERN’s second-largest accelerator.

    Just as the LHC is presently occupying the tunnel previously occupied by LEP, a circular lepton collider could be superseded by a next-generation circular proton collider, such as the proposed FCC-pp. However, you cannot run both an exploratory proton collider and a precision lepton collider simultaneously; you must decommission one to finish the other.

    9
    The CMS detector at CERN, one of the two most powerful particle detectors ever assembled. Every 25 nanoseconds, on average, a new particle bunch collides at the center-point of this detector. A next-generation detector, whether for a lepton or proton collider, may be able to record even more data, faster, and with higher-precision than the CMS or ATLAS detectors can at present. (CERN)

    It’s very important to make the right decision, as we do not know what secrets nature holds beyond the already-explored frontiers. Going to higher energies unlocks the potential for new direct discoveries, while going to higher precisions and greater statistics could provide even stronger indirect evidence for the existence of new physics.

    The first-stage linear colliders are going to cost between 5 and 7 billion dollars, including the tunnel, while a proton collider of four times the LHC’s radius, with magnets twice as strong, 10 times the collision rate and next-generation computing and cryogenics might cost a total of up to $22 billion, offering as big a leap over the LHC as the LHC was over the Tevatron. Some money could be saved if we build the circular lepton and proton colliders one after the other in the same tunnel, which would essentially provide a future for experimental particle physics after the LHC is done running at the end of the 2030s.

    10
    The Standard Model particles and their supersymmetric counterparts. Slightly under 50% of these particles have been discovered, and just over 50% have never shown a trace that they exist. Supersymmetry is an idea that hopes to improve on the Standard Model, but it has yet to make successful predictions about the Universe in attempting to supplant the prevailing theory. However, new colliders are not being proposed to find supersymmetry or dark matter, but to perform generic searches. Regardless of what they’ll find, we’ll learn something new about the Universe itself. (CLAIRE DAVID / CERN)

    The most important thing to remember in all of this is that we aren’t simply continuing to look for supersymmetry, dark matter, or any particular extension of the Standard Model. We have a slew of problems and puzzles that indicate that there must be new physics beyond what we currently understand, and our scientific curiosity compels us to look. In choosing what machine to build, it’s vital to choose the most performant machine: the one with the highest number of collisions at the energies we’re interested in probing.

    Regardless of which specific projects the community chooses, there will be trade-offs. A linear lepton collider can always reach higher energies than a circular one, while a circular one can always create more collisions and go to higher precisions. It can gather just as much data in a tenth the time, and probe for more subtle effects, at the cost of a lower energy reach.

    Will it be successful? Regardless of what we find, that answer is unequivocally yes. In experimental physics, success does not equate to finding something, as some might erroneously believe. Instead, success means knowing something, post-experiment, that you did not know before you did the experiment. To push beyond the presently known frontiers, we’d ideally want both a lepton and a proton collider, at the highest energies and collision rates we can achieve.

    There is no doubt that new technologies and spinoffs will come from whichever collider or colliders come next, but that’s not why we do it. We are after the deepest secrets of nature, the ones that will remain elusive even after the Large Hadron Collider finishes. We have the technical capabilities, the personnel, and the expertise to build it right at our fingertips. All we need is the political and financial will, as a civilization, to seek the ultimate truths about nature.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    “Starts With A Bang! is a blog/video blog about cosmology, physics, astronomy, and anything else I find interesting enough to write about. I am a firm believer that the highest good in life is learning, and the greatest evil is willful ignorance. The goal of everything on this site is to help inform you about our world, how we came to be here, and to understand how it all works. As I write these pages for you, I hope to not only explain to you what we know, think, and believe, but how we know it, and why we draw the conclusions we do. It is my hope that you find this interesting, informative, and accessible,” says Ethan

     
  • richardmitnick 12:30 pm on April 13, 2019 Permalink | Reply
    Tags: "Ask Ethan: What Is An Electron?", , Electrons are leptons and thus fermions, Electrons were the first fundamental particles discovered, , , Sometimes the simplest questions of all are the most difficult to meaningfully answer., Standard Model of Particle Physics   

    From Ethan Siegel: “Ask Ethan: What Is An Electron?” 

    From Ethan Siegel
    Apr 13, 2019

    1
    This artist’s illustration shows an electron orbiting an atomic nucleus, where the electron is a fundamental particle but the nucleus can be broken up into still smaller, more fundamental constituents. (NICOLLE RAGER FULLER, NSF)

    Sometimes, the simplest questions of all are the most difficult to meaningfully answer.

    If you were to take any tiny piece of matter in our known Universe and break it up into smaller and smaller constituents, you’d eventually reach a stage where what you were left with was indivisible. Everything on Earth is composed of atoms, which can further be divided into protons, neutrons, and electrons. While protons and neutrons can be divided still further, electrons cannot. They were the first fundamental particles discovered, and over 100 years later, we still know of no way to split electrons apart. But what, exactly, are they? That’s what Patreon supporter John Duffield wants to know, asking:

    “Please will you describe the electron… explaining what it is, and why it moves the way it does when it interacts with a positron. If you’d also like to explain why it moves the way that it does in an electric field, a magnetic field, and a gravitational field, that would be nice. An explanation of charge would be nice too, and an explanation of why the electron has mass.”

    Here’s what we know, at the deepest level, about one of the most common fundamental particles around.

    2
    The hydrogen atom, one of the most important building blocks of matter, exists in an excited quantum state with a particular magnetic quantum number. Even though its properties are well-defined, certain questions, like ‘where is the electron in this atom,’ only have probabilistically-determined answers. (WIKIMEDIA COMMONS USER BERNDTHALLER)

    In order to understand the electron, you have to first understand what it means to be a particle. In the quantum Universe, everything is both a particle and a wave simultaneously, where many of its exact properties cannot be perfectly known. The more you try to pin down a particle’s position, the more you destroy information about its momentum, and vice versa. If the particle is unstable, the duration of its lifetime will affect how well you’re able to know its mass or intrinsic energy. And if the particle has an intrinsic spin to it, measuring its spin in one direction destroys all the information you could know about how it’s spinning in the other directions.
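
    The position-momentum trade-off is quantitative: Δx · Δp ≥ ħ/2. A quick sketch of what that bound implies for an electron confined to an atom-sized region (illustrative numbers only):

        HBAR = 1.054571817e-34  # reduced Planck constant, J*s
        M_E = 9.1093837e-31     # electron mass, kg

        def min_momentum_spread(delta_x_m):
            # Heisenberg bound: delta_p >= hbar / (2 * delta_x)
            return HBAR / (2.0 * delta_x_m)

        # Confine an electron to ~1 angstrom, a hydrogen-atom-sized region:
        dp = min_momentum_spread(1e-10)
        print(f"minimum velocity spread ~ {dp / M_E:.1e} m/s")  # ~5.8e5 m/s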

    3
    Electrons, like all spin-1/2 fermions, have two possible spin orientations when placed in a magnetic field. Performing an experiment like this determines their spin orientation in one dimension, but destroys any information about their spin orientation in the other two dimensions as a result. This is a frustrating property inherent to quantum mechanics.(CK-12 FOUNDATION / WIKIMEDIA COMMONS)

    If you measure it at one particular moment in time, information about its future properties cannot be known to arbitrary accuracy, even if the laws governing it are completely understood. In the quantum Universe, many physical properties have a fundamental, inherent uncertainty to them.

    But that’s not true of everything. The quantum rules that govern the Universe are more complex than just the counterintuitive parts, like Heisenberg uncertainty.

    4
    An illustration of the inherent uncertainty between position and momentum at the quantum level. There is a limit to how well you can measure these two quantities simultaneously, and uncertainty shows up in places where people often least expect it. (E. SIEGEL / WIKIMEDIA COMMONS USER MASCHEN)

    The Universe is made up of quanta, which are those components of reality that cannot be further divided into smaller components. The most successful model of those smallest, fundamental components that compose our reality comes to us in the form of the creatively-named Standard Model.

    In the Standard Model, there are two separate classes of quanta:

    the particles that make up the matter and antimatter in our material Universe, and
    the particles responsible for the forces that govern their interactions.

    The former class of particles are known as fermions, while the latter class are known as bosons.

    5
    The particles of the standard model, with masses (in MeV) in the upper right. The fermions make up the three leftmost columns and possess half-integer spins; the bosons populate the two columns on the right and have integer spins. While all particles have a corresponding antiparticle, only the fermions can be matter or antimatter. (WIKIMEDIA COMMONS USER MISSMJ, PBS NOVA, FERMILAB, OFFICE OF SCIENCE, UNITED STATES DEPARTMENT OF ENERGY, PARTICLE DATA GROUP)

    Even though, in the quantum Universe, many properties have an intrinsic uncertainty to them, there are some properties that we can know exactly. We call these quantum numbers, which are conserved quantities not only in individual particles, but in the Universe as a whole. In particular, these include properties like:

    electric charge,
    color charge,
    magnetic charge,
    angular momentum,
    baryon number,
    lepton number,
    and lepton family number.

    These are properties that are always conserved, as far as we can tell.

    6
    The quarks, antiquarks, and gluons of the standard model have a color charge, in addition to all the other properties like mass and electric charge that other particles and antiparticles possess. All of these particles, to the best we can tell, are truly point-like, and come in three generations. At higher energies, it is possible that still additional types of particles will exist, but they would go beyond the Standard Model’s description. (E. SIEGEL / BEYOND THE GALAXY)

    In addition, there are a few other properties that are conserved in the strong and electromagnetic interactions, but whose conservation can be violated by the weak interactions. These include:

    weak hypercharge,
    weak isospin,
    and quark flavor numbers (like strangeness, charm, bottomness, or topness).

    Every quantum particle that exists has specific values for these quantum numbers that are allowed. Some of them, like electric charge, never change, as an electron will always have an electric charge of -1 and an up quark will always have an electric charge of +⅔. But others, like angular momentum, can take on various values, which can be either +½ or -½ for an electron, or -1, 0, or +1 for a W-boson.

    7
    The pattern of weak isospin, T3, and weak hypercharge, Y_W, and color charge of all known elementary particles, rotated by the weak mixing angle to show electric charge, Q, roughly along the vertical. The neutral Higgs field (gray square) breaks the electroweak symmetry and interacts with other particles to give them mass. (CJEAN42 OF WIKIMEDIA COMMONS)

    The particles that make up matter, known as the fermions, all have antimatter counterparts: the anti-fermions. The bosons, which are responsible for the forces and interactions between the particles, are neither matter nor antimatter, but can interact with either one, as well as themselves.

    The way we view these interactions is by exchanges of bosons between fermions and/or anti-fermions. You can have a fermion interact with a boson and give rise to another fermion; you can have a fermion and an anti-fermion interact and give rise to a boson; you can have an anti-fermion interact with a boson and give rise to another anti-fermion. As long as you conserve all the total quantum numbers you are required to conserve and obey the rules set forth by the Standard Model’s particles and interactions, anything that is not forbidden will inevitably occur with some finite probability.

    8
    The characteristic signal of positron/electron annihilation at low energies, a 511 keV photon line, has been thoroughly measured by the ESA’s INTEGRAL satellite. (J. KNÖDLSEDER (CESR) AND SPI TEAM; THE ESA’S INTEGRAL OBSERVATORY)

    ESA/Integral

    It’s important, before we enumerate what all the properties of the electron are, to note that this is merely the best understanding we have today of what the Universe is made of at a fundamental level. We do not know if there is a more fundamental description; we do not know if the Standard Model will someday be superseded by a more complete theory; we do not know if there are additional quantum numbers and when they might be (or might not be) conserved; we do not know how to incorporate gravity into the Standard Model.

    Although it should always go without saying, it warrants being stated explicitly here: these properties provide the best description of the electron as we know it today. In the future, they may turn out to be an incomplete description, or only an approximate description of what an electron (or a more fundamental entity that makes up our reality) truly is.

    9
    This diagram displays the structure of the standard model (in a way that displays the key relationships and patterns more completely, and less misleadingly, than in the more familiar image based on a 4×4 square of particles). In particular, this diagram depicts all of the particles in the Standard Model (including their letter names, masses, spins, handedness, charges, and interactions with the gauge bosons: i.e., with the strong and electroweak forces). (LATHAM BOYLE AND MARDUS OF WIKIMEDIA COMMONS)

    With that said, an electron is:

    a fermion (and not an antifermion),
    with an electric charge of -1 (in units of fundamental electric charge),
    with zero magnetic charge,
    and zero color charge,
    with a fundamental intrinsic angular momentum (or spin) of ½, meaning it can take on values of +½ or -½,
    with a baryon number of 0,
    with a lepton number of +1,
    with a lepton family number of +1 in the electron family, 0 in the muon family and 0 in the tau family,
    with a weak isospin of -½,
    and with a weak hypercharge of -1.

    Those are the quantum numbers of the electron. It does couple to the weak interaction (and hence, the W and Z bosons) and the electromagnetic interaction (and hence, the photon), and also the Higgs boson (and hence, it has a non-zero rest mass). It does not couple to the strong force, and therefore cannot interact with the gluons.
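
    To show that bookkeeping in code, here is a toy Python sketch (not a physics library) that encodes a few of the quantum numbers above and checks that electron-positron annihilation into two photons conserves them:

        from dataclasses import dataclass

        @dataclass(frozen=True)
        class Particle:
            name: str
            electric_charge: float  # in units of the fundamental charge
            lepton_number: int
            baryon_number: int

        electron = Particle("e-", -1.0, +1, 0)
        positron = Particle("e+", +1.0, -1, 0)
        photon = Particle("gamma", 0.0, 0, 0)

        def conserved(initial, final, quantum_number):
            total = lambda ps: sum(getattr(p, quantum_number) for p in ps)
            return total(initial) == total(final)

        # e- + e+ -> gamma + gamma conserves all three numbers:
        for q in ("electric_charge", "lepton_number", "baryon_number"):
            print(q, conserved([electron, positron], [photon, photon], q))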

    10
    The Positronium Beam experiment at University College London, shown here, combines electrons and positrons to create the quasi-atom known as positronium, which decays with a mean lifetime of approximately 1 microsecond. The decay products are well-predicted by the Standard Model, and usually proceed into 2 or 3 photons, depending on the relative spins of the electron and positron composing positronium. (UCL)

    If an electron and a positron (which has some of the same quantum numbers and some quantum numbers which are opposites) interact, there are finite probabilities that they will interact through either the electromagnetic or the weak force.

    Most interactions will be dominated by the possibility that electrons and positrons will attract one another, owing to their opposite electric charges. They can form an unstable atom-like entity known as positronium, in which they become bound together much as protons and electrons bind together, except that the electron and positron are of equal mass.
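
    The equal masses have a measurable consequence: hydrogen-like binding energies scale with the reduced mass of the two-body system, and for positronium that reduced mass is exactly half the electron mass, so every energy level is half as deep as in hydrogen. A minimal sketch:

        HYDROGEN_BINDING_EV = 13.6  # ground state, effectively infinite-mass nucleus

        def reduced_mass(m1, m2):
            return m1 * m2 / (m1 + m2)

        # In hydrogen the reduced mass is ~m_e (the proton is ~1836x heavier);
        # in positronium both bodies weigh m_e, so the reduced mass is m_e / 2.
        mu_over_me = reduced_mass(1.0, 1.0)  # masses in units of m_e
        print(f"positronium ground state: ~{HYDROGEN_BINDING_EV * mu_over_me:.1f} eV")  # ~6.8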

    However, because the electron is matter and the positron is antimatter, they can also annihilate. Depending on a number of factors, such as their relative spins, there are finite probabilities for how they will decay: into 2, 3, 4, 5, or greater numbers of photons. (But 2 or 3 are most common.)

    11
    The rest masses of the fundamental particles in the Universe determine when and under what conditions they can be created, and also describe how they will curve spacetime in General Relativity. The properties of particles, fields, and spacetime are all required to describe the Universe we inhabit. (FIG. 15–04A FROM UNIVERSE-REVIEW.CA)

    When you subject an electron to an electric or magnetic field, photons interact with it to change its momentum; in simple terms, that means they cause an acceleration. Because an electron also has a rest mass associated with it, courtesy of its interactions with the Higgs boson, it also accelerates in a gravitational field. However, the Standard Model cannot account for this, nor can any quantum theory we know of.

    Until we have a quantum theory of gravity, we have to take the mass and energy of an electron and put it into General Relativity: our non-quantum theory of gravitation. This is sufficient to give us the correct answer for every experiment we’ve been able to design, but it’s going to break down at some fundamental level. For example, if you ask what happens to the gravitational field of a single electron as it passes through a double slit, General Relativity has no answer.

    12
    The wave pattern for electrons passing through a double slit, one-at-a-time. If you measure “which slit” the electron goes through, you destroy the quantum interference pattern shown here. The rules of the Standard Model and of General Relativity do not tell us what happens to the gravitational field of an electron as it passes through a double slit; this would require something that goes beyond our current understanding, like quantum gravity. (DR. TONOMURA AND BELSAZAR OF WIKIMEDIA COMMONS)

    Electrons are incredibly important components of our Universe, as there are approximately 10⁸⁰ of them contained within our observable Universe. They are required for the assembly of atoms, which form molecules, humans, planets and more, and are used in our world for everything from magnets to computers to the macroscopic sensation of touch.

    But the reason they have the properties they do is because of the fundamental quantum rules that govern the Universe. The Standard Model is the best description we have of those rules today, and it also provides the best description of the ways that electrons can and do interact, as well as describing which interactions they cannot undergo.

    Why electrons have these particular properties is beyond the scope of the Standard Model, though. For all that we know, we can only describe how the Universe works. Why it works the way it does is still an open question that we have no satisfactory answer for. All we can do is continue to investigate, and work towards a more fundamental answer.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    “Starts With A Bang! is a blog/video blog about cosmology, physics, astronomy, and anything else I find interesting enough to write about. I am a firm believer that the highest good in life is learning, and the greatest evil is willful ignorance. The goal of everything on this site is to help inform you about our world, how we came to be here, and to understand how it all works. As I write these pages for you, I hope to not only explain to you what we know, think, and believe, but how we know it, and why we draw the conclusions we do. It is my hope that you find this interesting, informative, and accessible,” says Ethan

     
  • richardmitnick 5:34 pm on August 30, 2018 Permalink | Reply
    Tags: , Borexino observatory, , , , , DarkSide experiment, Davide D’Angelo-physical scientist, , , , , , , Possible dark matter candidates-axions gravitinos Massive Astrophysical Compact Halo Objects (MACHOs) and Weakly Interacting Massive Particles (WIMPs)), SABRE-Sodium Iodide with Active Background Rejection Experiment, , Solar neutrinos-recently caught at U Wisconsin IceCube at the South Pole, Standard Model of Particle Physics, , , , , WIMPs that go by names like the gravitino sneutrino and neutralino   

    From Gran Sasso via Motherboard: “The New Hunt for Dark Matter Is Taking Place Under a Mountain” 

    From Gran Sasso

    via

    Motherboard

    1

    Aug 30 2018
    Daniel Oberhaus

    Davide D’Angelo wasn’t always interested in dark matter, but now he’s at the forefront of the hunt to find the most elusive particle in the universe.

    About an hour outside of Rome there’s a dense cluster of mountains known as the Gran Sasso d’Italia. Renowned for their natural beauty, the Gran Sasso are a popular tourist destination year round, offering world-class skiing in the winter and plenty of hiking and swimming opportunities in the summer. For the 43-year-old Italian physicist Davide D’Angelo, these mountains are like a second home. Unlike most people who visit Gran Sasso, however, D’Angelo spends more time under the mountains than on top of them.

    It’s here, in a cavernous hall thousands of feet beneath the earth, that D’Angelo works on a new generation of experiments dedicated to the hunt for dark matter particles, an exotic form of matter whose existence has been hypothesized for decades but never proven experimentally.

    Dark matter is thought to make up about 27 percent of the universe and characterizing this elusive substance is one of the most profound problems in contemporary physics. Although D’Angelo is optimistic that a breakthrough will occur in his lifetime, so was the last generation of physicists. In fact, there’s a decent chance that the particles D’Angelo is looking for don’t exist at all. Yet for physicists probing the fundamental nature of the universe, the possibility that they might spend their entire career “hunting ghosts,” as D’Angelo put it, is the price of advancing science.

    WHAT’S UNDER THE ‘GREAT STONE’?

    In 1989, Italy’s National Institute for Nuclear Physics opened the Gran Sasso National Laboratory, the world’s largest underground laboratory dedicated to astrophysics. Gran Sasso’s three cavernous halls were purposely built for physics, which is something of a luxury as far as research centers go. Most other underground astrophysics laboratories like SNOLAB are ad hoc facilities that repurpose old or active mine shafts, which limits the amount of time that can be spent in the lab and the types of equipment that can be used.


    SNOLAB, Sudbury, Ontario, Canada.

    Buried nearly a mile underground to protect it from the noisy cosmic rays that bathe the Earth, Gran Sasso is home to a number of particle physics experiments that are probing the foundations of the universe. For the last few years, D’Angelo has divided his time between the Borexino observatory and the Sodium Iodide with Active Background Rejection Experiment (SABRE), which are investigating solar neutrinos and dark matter, respectively.

    Borexino Solar Neutrino detector

    SABRE experiment at INFN Gran Sasso

    2
    Davide D’Angelo with the SABRE proof of concept. Image: Xavier Aaronson/Motherboard

    Over the last 100 years, characterizing solar neutrinos and dark matter have been two of the most important tasks of particle physics. Today, the mystery of solar neutrinos is resolved, but the particles are still of great interest to physicists for the insight they provide into the fusion process occurring in our Sun and other stars. The composition of dark matter, however, is still considered to be one of the biggest questions in particle physics. Despite the radically different nature of the particles, they are united insofar as they both can only be discovered in environments where the background radiation is at a minimum: thousands of feet beneath the Earth’s surface.

    “The mountain acts as a shield so if you go below it, you have so-called ‘cosmic silence,’” D’Angelo said. “That’s the part of my research I like most: Going into the cave, putting my hands on the detector and trying to understand the signals I’m seeing.”

    After finishing grad school, D’Angelo got a job with Italy’s National Institute for Nuclear Physics where his research focused on solar neutrinos, a subatomic particle with no charge that is produced by fusion in the Sun. For the better part of four decades, solar neutrinos [recently caught at U Wisconsin IceCube at the South Pole] were at the heart of one of the largest mysteries in astrophysics.

    IceCube neutrino detector interior


    U Wisconsin ICECUBE neutrino detector at the South Pole

    The problem was that instruments measuring the energy from solar neutrinos returned results much lower than predicted by the Standard Model, the most accurate theory of fundamental particles in physics.

    Given how accurate the Standard Model had proven to be for other aspects of cosmology, physicists were reluctant to make alterations to it to account for the discrepancy. One possible explanation was that physicists had faulty models of the Sun and better measurements of its core pressure and temperature were needed. Yet after a string of observations in the 60s and 70s demonstrated that the models of the Sun were essentially correct, physicists sought alternative explanations by turning to the neutrino.

    A TALE OF THREE NEUTRINOS

    Ever since they were first proposed by the Austrian physicist Wolfgang Pauli in 1930, neutrinos have been called upon to patch holes in theories. In Pauli’s case, he first posited the existence of an extremely light, chargeless particle as a “desperate remedy” to explain why the law of the conservation of energy appeared to be violated during radioactive decay. Three years later, the Italian physicist Enrico Fermi gave these hypothetical particles a name. He called them “neutrinos,” Italian for “little neutrons.”

    A quarter of a century after Pauli posited their existence, two American physicists reported the first evidence of neutrinos produced in a fission reactor. The following year, in 1957, Bruno Pontecorvo, an Italian physicist working in the Soviet Union, developed a theory of neutrino oscillations. At the time, little was known about the properties of neutrinos and Pontecorvo suggested that there might be more than one type of neutrino. If this were the case, Pontecorvo theorized that it could be possible for the neutrinos to switch between types.

    By 1975, part of Pontecorvo’s theory had been proven correct. Three different types, or “flavors,” of neutrino had been discovered: electron neutrinos, muon neutrinos, and tau neutrinos. Importantly, observations from an experiment in a South Dakota mineshaft had confirmed that the Sun produced electron neutrinos. The only issue was that the experiment detected far fewer neutrinos than the Standard Model predicted.

    FNAL LBNF/DUNE from FNAL to SURF, Lead, South Dakota, USA


    FNAL DUNE Argon tank at SURF


    Surf-Dune/LBNF Caverns at Sanford



    SURF building in Lead SD USA

    Prior to the late 90s, there was scant indirect evidence that neutrinos could change from one flavor to another. In 1998, a group of researchers working in Japan’s Super-Kamiokande Observatory observed oscillations in atmospheric neutrinos, which are mostly produced by the interactions between photons and the Earth’s atmosphere.

    Super-Kamiokande experiment. located under Mount Ikeno near the city of Hida, Gifu Prefecture, Japan

    Three years later, Canada’s Sudbury Neutrino Observatory (SNO) provided the first direct evidence of oscillations from solar neutrinos.

    Sudbury Neutrino Observatory, no longer operating

    This was, to put it lightly, a big deal in cosmological physics. It effectively resolved the mystery of the missing solar neutrinos, or why experiments only observed about a third as many neutrinos radiating from the Sun compared to predictions made by the Standard Model. If neutrinos could oscillate between flavors, this means a neutrino that is emitted in the Sun’s core could be a different type of neutrino by the time it reaches Earth. Prior to the mid-80s, most experiments on Earth were only looking for electron neutrinos, which meant they were missing the other two flavors of neutrinos that were created en route from the Sun to the Earth.
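
    In the two-flavor vacuum approximation, the survival probability behind this story is P(νe → νe) = 1 − sin²(2θ) · sin²(1.27 Δm² L / E), with Δm² in eV², L in km, and E in GeV. A sketch with illustrative parameters (the real solar case involves three flavors and matter effects inside the Sun, which push the measured survival fraction down toward the observed one-third):

        import math

        def nu_e_survival(sin2_2theta, dm2_ev2, baseline_km, energy_gev):
            # Two-flavor vacuum oscillation formula.
            phase = 1.27 * dm2_ev2 * baseline_km / energy_gev
            return 1.0 - sin2_2theta * math.sin(phase) ** 2

        # Over the Sun-Earth distance the oscillation phase is enormous, so
        # sin^2 averages to 1/2 and the vacuum survival fraction tends toward:
        sin2_2theta = 0.85  # illustrative solar mixing
        print(1.0 - 0.5 * sin2_2theta)  # ~0.57 in vacuum; matter effects lower it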

    When SNO was dreamt up in the 80s, it was designed so that it would be capable of detecting all three types of neutrinos, instead of just electron neutrinos. This decision paid off. In 2015, the directors of the experiments at Super-Kamiokande and SNO shared the Nobel Prize in physics for resolving the mystery of the missing solar neutrinos.

    Although the mystery of solar neutrinos has been solved, there’s still plenty of science to be done to better understand them. Since 2007, Gran Sasso’s Borexino observatory has been refining the measurements of solar neutrino flux, which has given physicists unprecedented insight into the fusion process powering the Sun. From the outside, the Borexino observatory looks like a large metal sphere, but on the inside it looks like a technology transplanted from an alien world.

    Borexino detector. Image INFN

    In the center of the sphere is basically a large, transparent nylon sack that is almost 30 feet in diameter and only half a millimeter thick. This sack contains a liquid scintillator, a chemical mixture that releases energy when a neutrino passes through it. This nylon sphere is suspended in 1,000 metric tons of a purified buffer liquid and surrounded by 2,200 sensors to detect energy released by electrons that are freed when neutrinos interact with the liquid scintillator. Finally, an outer buffer of nearly 3,000 tons of ultrapure water helps provide additional shielding for the detector. Taken together, the Borexino observatory has the most protection from outside radiation interference of any liquid scintillator in the world.

    For the last decade, physicists at Borexino—including D’Angelo, who joined the project in 2011—have been using this one-of-a-kind device to observe low energy solar neutrinos produced by proton collisions during the fusion process in the Sun’s core. Given how difficult it is to detect these chargeless, ultralight particles that hardly ever interact with matter, detecting the low energy solar neutrinos would be virtually impossible without such a sensitive machine. When SNO directly detected the first solar neutrino oscillations, for instance, it could only observe the highest energy solar neutrinos due to interference from background radiation. This amounted to only about 0.01 percent of all the neutrinos emitted by the Sun. Borexino’s sensitivity allows it to observe solar neutrinos whose energy is a full order of magnitude lower than those detected by SNO, opening the door for an incredibly refined model of solar processes as well as more exotic events like supernovae.

    “It took physicists 40 years to understand solar neutrinos and it’s been one of the most interesting puzzles in particle physics,” D’Angelo told me. “It’s kind of like how dark matter is now.”

    SHINING A LIGHT ON DARK MATTER

    If neutrinos were the mystery particle of the twentieth century, then dark matter is the particle conundrum for the new millennium. Just like Pauli proposed neutrinos as a “desperate remedy” to explain why experiments seemed to be violating one of the most fundamental laws of nature, the existence of dark matter particles is inferred because cosmological observations just don’t add up.

    In the early 1930s, the American astronomer Fritz Zwicky was studying the movement of a handful of galaxies in the Coma cluster, a collection of over 1,000 galaxies approximately 320 million light years from Earth.

    Fritz Zwicky, the father of dark matter research. No image credit found after a long search.

    Vera Rubin did much of the work on proving the existence of Dark Matter. She and Fritz were both overlooked for the Nobel prize.

    Vera Rubin measuring spectra (Emilio Segre Visual Archives AIP SPL)


    Astronomer Vera Rubin at the Lowell Observatory in 1965. (The Carnegie Institution for Science)

    Using data published by Edwin Hubble, Zwicky calculated the mass of the entire Coma galaxy cluster.

    Coma cluster via NASA/ESA Hubble

    When he did, Zwicky noticed something odd about the velocity dispersion—the statistical distribution of the speeds of a group of objects—of the galaxies: the dispersion was about 12 times higher than it should have been given the amount of visible matter in the galaxies.
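    For readers who want a feel for the numbers, here is a minimal sketch, in Python, of the kind of virial-theorem estimate Zwicky made. The velocity dispersion and radius below are round, modern values for the Coma cluster, chosen for illustration rather than taken from Zwicky’s paper:

```python
# Illustrative virial mass estimate of the kind Zwicky performed.
# The inputs are modern round values for the Coma cluster, not
# Zwicky's original figures.
G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30         # solar mass, kg
MPC = 3.086e22           # one megaparsec in metres

sigma = 1.0e6            # velocity dispersion, m/s (~1000 km/s)
radius = 1.0 * MPC       # characteristic cluster radius, m

# Virial theorem (order of magnitude): M ~ sigma^2 * R / G
mass_dynamical = sigma**2 * radius / G
print(f"Dynamical mass: {mass_dynamical / M_SUN:.1e} solar masses")
# ~2e14 solar masses: far more than the luminous matter can account for.
```

    Run as written, this gives roughly 2 × 10¹⁴ solar masses, which is why the motion of the galaxies demands much more matter than the light reveals.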

    Inside Gran Sasso. Image: Xavier Aaronson/Motherboard

    This was a surprising calculation and its significance wasn’t lost on Zwicky. “If this would be confirmed,” he wrote, “we would get the surprising result that dark matter is present in much greater amount than luminous matter.”

    The idea that the universe is made up mostly of invisible matter was radical in Zwicky’s time and still is today. The main difference, however, is that astronomers now have much stronger empirical evidence pointing to its existence. This is mostly due to the American astronomer Vera Rubin, whose measurements of galactic rotation in the 1960s and 70s put the existence of dark matter beyond reasonable doubt. In fact, based on Rubin’s measurements and subsequent observations, physicists now think dark matter makes up about 27 percent of the “stuff” in the universe, more than five times the regular, baryonic matter we’re all familiar with. The burning question, then: what is it made of?

    Since Rubin’s pioneering observations, a number of dark matter candidate particles have been proposed, but so far all of them have eluded detection by some of the world’s most sensitive instruments. Part of the reason for this is that physicists aren’t exactly sure what they’re looking for. In fact, a small minority of physicists think dark matter might not be a particle at all and is just an exotic gravitational effect. This makes designing dark matter experiments kind of like finding a car key in a stadium parking lot and trying to track down the vehicle it pairs with. There’s a pretty good chance the car is somewhere in the parking lot, but you’re going to have to try a lot of doors before you find your ride—if it even exists.

    Candidates for dark matter include subatomic particles with goofy names like axions and gravitinos, as well as Massive Astrophysical Compact Halo Objects (MACHOs) and Weakly Interacting Massive Particles (WIMPs). D’Angelo and his colleagues at Gran Sasso have placed their bets on WIMPs, which until recently were considered the leading particle candidate for dark matter.

    Over the last few years, however, physicists have started to look at other possibilities after some critical tests failed to confirm the existence of WIMPs. WIMPs are a class of hypothetical elementary particles that hardly ever interact with regular baryonic matter and don’t emit light, which makes them exceedingly hard to detect. This problem is compounded by the fact that no one is really sure how to characterize a WIMP. Needless to say, it’s hard to find something if you’re not even really sure what you’re looking for.

    So why would physicists think WIMPs exist at all? In the 1970s, physicists conceptualized the Standard Model of particle physics, which posited that everything in the universe was made out of a handful of fundamental particles.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.


    Standard Model of Particle Physics from Symmetry Magazine

    The Standard Model works great at explaining almost everything the universe throws at it, but it’s still incomplete since it doesn’t incorporate gravity into the model.

    Gravity measured with two slightly different torsion pendulum setups, yielding slightly different results

    In the 1980s, an extension of the Standard Model called Supersymmetry emerged, which hypothesizes that each fundamental particle in the Standard Model has a partner.

    Standard model of Supersymmetry DESY

    These partners are known as supersymmetric particles, and they serve as the theoretical explanation for a number of mysteries in Standard Model physics, such as the mass of the Higgs boson and the existence of dark matter. Some of the most complex and expensive experiments in the world, like the Large Hadron Collider, were created in an effort to discover these supersymmetric particles, but so far there’s been no experimental evidence that they actually exist.

    LHC

    CERN map


    CERN LHC Tunnel

    CERN LHC particles

    Many of the lightest particles theorized in the supersymmetric model are WIMPs and go by names like the gravitino, sneutrino and neutralino. The latter is still considered to be the leading candidate for dark matter by many physicists and is thought to have formed in abundance in the early universe. Detecting evidence of this ancient theoretical particle is the goal of many dark matter experiments, including the one D’Angelo works on at Gran Sasso.

    D’Angelo told me he became interested in dark matter a few years after joining the Gran Sasso laboratory and began contributing to the laboratory’s DarkSide experiment, which seemed like a natural extension of his work on solar neutrinos. DarkSide is essentially a large tank filled with liquid argon and equipped with incredibly sensitive sensors. If WIMPs exist, physicists expect to detect the ionization produced when they collide with argon nuclei.

    Dark Side-50 Dark Matter Experiment at Gran Sasso
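    To get a sense of why these detectors must be so sensitive, here is a rough sketch of the elastic-scattering kinematics involved. The WIMP mass and speed below are illustrative assumptions, not measured values:

```python
# Maximum nuclear recoil energy for elastic WIMP-nucleus scattering:
#   E_max = 2 * mu^2 * v^2 / m_N, with mu the reduced mass.
# The WIMP mass and speed here are hypothetical, chosen for illustration.
C = 3.0e8                      # speed of light, m/s

m_wimp = 100.0                 # assumed WIMP mass, GeV/c^2
m_argon = 37.2                 # argon-40 nucleus mass, GeV/c^2
v = 230e3 / C                  # typical galactic speed as a fraction of c

mu = m_wimp * m_argon / (m_wimp + m_argon)   # reduced mass, GeV/c^2
e_max_gev = 2 * mu**2 * v**2 / m_argon       # max recoil energy, GeV
print(f"Max recoil energy: {e_max_gev * 1e6:.0f} keV")
# ~20 keV: a tiny deposit, which is why the detector must be so quiet.
```

    A deposit of a few tens of kiloelectronvolts at most, a few times a year, is the entire expected signal, so even trace radioactivity would swamp it without the rock overhead and the layers of shielding.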

    The DarkSide experiment has been running at Gran Sasso since 2013 and D’Angelo said it is expected to continue for several more years. These days, however, he’s found himself involved with a different dark matter experiment at Gran Sasso called SABRE [above], which will also look for direct evidence of dark matter particles, based on the light produced when energy is released through their collisions with sodium-iodide crystals.

    The setup of SABRE is deliberately similar to that of another experiment, called DAMA, that has been running at Gran Sasso since 1995. In 2003, the DAMA experiment began looking for a seasonal fluctuation in the rate of dark matter detections, predicted in the 1980s as a consequence of the motion of the Sun and Earth relative to the rest of the galaxy. The theory posited that the relative speed of any dark matter particles detected on Earth should peak in June and bottom out in December.

    Over the course of nearly 15 years, DAMA did in fact register seasonal fluctuations in its detectors that were in accordance with this theory and the expected signature of a dark matter particle. In short, it seemed as if DAMA was the first experiment in the world to detect a dark matter particle. The problem, however, was that DAMA couldn’t completely rule out the possibility that the signature it had detected was due to some other seasonal variation on Earth, rather than the ebb and flow of dark matter as the Earth revolved around the Sun.
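    The expected signature is a small yearly wiggle on top of a constant event rate. Here is a minimal sketch of that modulation; the mean rate and amplitude are made-up numbers for illustration, with only the June peak and December trough taken from the prediction described above:

```python
import math

# Expected annual-modulation form of a dark matter event rate:
#   R(t) = R0 + Rm * cos(2*pi * (t - t0) / T)
# with t0 in early June (when Earth's orbital velocity adds to the
# Sun's motion through the galaxy) and T = one year.
# R0 and Rm are illustrative values, not DAMA's measured numbers.
R0 = 1.00        # mean event rate (arbitrary units)
Rm = 0.02        # modulation amplitude, a few percent of the mean
T = 365.25       # period, days
t0 = 152.5       # phase: ~June 2, day of expected maximum

def rate(t_days):
    """Event rate on day t_days of the year."""
    return R0 + Rm * math.cos(2 * math.pi * (t_days - t0) / T)

print(f"Rate in early June:     {rate(152.5):.3f}")  # maximum
print(f"Rate in early December: {rate(335.0):.3f}")  # near minimum
```

    The catch, as noted above, is that plenty of mundane things on Earth also vary with the seasons, which is exactly the ambiguity SABRE is designed to break.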

    SABRE aims to remove the ambiguities in DAMA’s data. After all the kinks are worked out in the testing equipment, the Gran Sasso experiment will become one half of SABRE. The other half will be located in Australia in a converted gold mine. By having a laboratory in the northern hemisphere and another in the southern hemisphere, this should help eliminate any false positives that result from normal seasonal fluctuations. At the moment, the SABRE detector is still in a proof of principle phase and is expected to begin observations in both hemispheres within the next few years.

    When it comes to SABRE, it’s possible that the experiment may disprove the best evidence physicists have found so far for a dark matter particle. But as D’Angelo pointed out, this type of disappointment is a fundamental part of science.

    “Of course I am afraid that there might not be any dark matter there and we are hunting ghosts, but science is like this,” D’Angelo said. “Sometimes you spend several years looking for something and in the end it’s not there so you have to change the way you were thinking about things.”

    For D’Angelo, probing the subatomic world with neutrino and dark matter research from a cave in Italy is his way of connecting to the universe writ large.

    “The tiniest elements of nature are bonded to the most macroscopic phenomena, like the expansion of the universe,” D’Angelo said. “The infinitely small touches the infinitely big in this sense, and I find that fascinating. The physics I do, its goal is to push over the boundary of human knowledge.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Gran Sasso LABORATORI NAZIONALI del GRAN SASSO, located in the Abruzzo region of central Italy

    INFN Gran Sasso National Laboratory (LNGS) is the largest underground laboratory in the world devoted to neutrino and astroparticle physics, a worldwide research facility for scientists working in this field of research, where particle physics, cosmology and astrophysics meet. It is unequalled anywhere else, as it offers the most advanced underground infrastructures in terms of dimensions, complexity and completeness.

    LNGS is funded by the National Institute for Nuclear Physics (INFN), the Italian institution charged with coordinating and supporting research in elementary particle, nuclear and subnuclear physics.

    Located between L’Aquila and Teramo, about 120 kilometres from Rome, the underground structures lie to one side of the 10-kilometre highway tunnel that crosses the Gran Sasso massif (towards Rome); the underground complex consists of three huge experimental halls (each 100 metres long, 20 metres wide and 18 metres high) and bypass tunnels, for a total volume of about 180,000 m³.

    Access to the experimental halls is horizontal, made easier by the highway tunnel. The halls are equipped with all the technical and safety infrastructure necessary for the experimental activities and to ensure proper working conditions for the people involved.

    The 1,400 metres of rock above the laboratory act as natural shielding, reducing the cosmic ray flux by a factor of one million; moreover, the neutron flux in the underground halls is about a thousand times lower than at the surface, owing to the very small amounts of uranium and thorium in the Dolomite calcareous rock of the mountain.

    The shielding from cosmic radiation provided by the rock cover, together with the huge dimensions and the impressive basic infrastructure, makes the laboratory unmatched in the detection of weak or rare signals, which are relevant for astroparticle, subnuclear and nuclear physics.

    Outside, immersed in a National Park of exceptional environmental and naturalistic interest on the slopes of the Gran Sasso mountain chain, an area of more than 23 acres hosts laboratories and workshops, the Computing Centre, the Directorate and several other Offices.

    Currently 1,100 scientists from 29 different countries are taking part in the experimental activities of LNGS.
    LNGS research activities range from neutrino physics to dark matter search, to nuclear astrophysics, and also to earth physics, biology and fundamental physics.

     
    • Marco Pereira 2:43 pm on September 1, 2018 Permalink | Reply

      I created a theory called the Hypergeometrical Universe Theory (HU). This theory uses three hypotheses:
      a) The Universe is a lightspeed expanding hyperspherical hypersurface. This was later proven correct by observations by the Sloan Digital Sky Survey
      https://hypergeometricaluniverse.quora.com/Proof-of-an-Extra-Spatial-Dimension
      b) Matter is made directly and simply from coherences between stationary states of deformation of the local metric called Fundamental Dilator or FD.
      https://hypergeometricaluniverse.quora.com/The-Fundamental-Dilator
      c) FDs obey the Quantum Lagrangian Principle (QLP). Yves Couder had a physical implementation (approximation) of the Fundamental Dilator and was perplexed that it would behave Quantum Mechanically. FDs and the QLP are the reason for Quantum Mechanics. QLP replaces Newtonian Dynamics and allows for the derivation of Quantum Gravity or Gravity as applied to Black Holes.

      HU derives a new law of gravitation that is epoch-dependent. That makes Type Ia supernovae epoch-dependent (within the context of the theory). HU then derives the absolute luminosity of SN1a as a function of G and shows that absolute luminosity scales with G^{-3}.
      Once the photometrically determined SN1a distances are corrected, HU CORRECTLY PREDICTS all SN1a distances given their redshifts z.

      The extra dimension refutes all 4D spacetime theories, including General Relativity and L-CDM. HU also falsifies all Dark Matter evidence:
      https://www.quora.com/Are-dark-matter-and-dark-energy-falsifiable/answer/Marco-Pereira-1
      including the Spiral Galaxy Conundrum and the Coma Cluster Conundrum.

      Somehow, my theory is still being censored by the community as a whole (either directly or by omission).

      I hope this posting will help correct this situation.


  • richardmitnick 3:54 pm on December 12, 2017 Permalink | Reply
    Tags: How Neutrinos Could Solve The Three Greatest Open Questions In Physics, , Standard Model of cosmology, Standard Model of Cosmology Timeline, Standard Model of Particle Physics,   

    From Ethan Siegel: “How Neutrinos Could Solve The Three Greatest Open Questions In Physics” 

    Ethan Siegel
    Dec 12, 2017

    Dark matter, dark energy, and why there’s more matter than antimatter? There’s an experiment to explore if neutrinos could solve all three.

    1
    A detailed look at the Universe reveals that it’s made of matter and not antimatter, that dark matter and dark energy are required, and that we don’t know the origin of any of these mysteries. Image credit: Chris Blake and Sam Moorfield.

    When you take a look at the Universe in great detail, a few facts jump out at you that might be surprising. All the stars, galaxies, gas, and plasma out there are made of matter and not antimatter, even though the laws of nature appear symmetric between the two. In order to form the structures we see on the largest scales, we require a huge amount of dark matter: about five times as much as all the normal matter we possess. And to explain how the expansion rate has changed over time, we need a mysterious form of energy inherent to space itself that’s twice as important (as far as energy is concerned) as all the other forms combined: dark energy. These three puzzles may be the greatest cosmological problems for the 21st century, and yet the one particle that goes beyond the standard model — the neutrino — just might explain them all.

    2
    The particles and antiparticles of the Standard Model of particle physics are exactly in line with what experiments require, with only massive neutrinos providing a difficulty. Image credit: E. Siegel / Beyond the Galaxy.

    Standard Model of Particle Physics from Symmetry Magazine

    Here in the physical Universe, we have two types of Standard Model:

    The Standard Model of particle physics (above), with six flavors of quarks and leptons, their antiparticles, the gauge bosons, and the Higgs.
    The Standard Model of cosmology (below), with the inflationary Big Bang, matter and not antimatter, and a history of structure formation that leads to stars, galaxies, clusters, filaments, and the present-day Universe.

    4
    The matter and energy content in the Universe at the present time (left) and at earlier times (right). Note the presence of dark energy, dark matter, and the prevalence of normal matter over antimatter, which is so minute it does not contribute at any of the times shown. Image credit: NASA, modified by Wikimedia Commons user 老陳, modified further by E. Siegel.

    Both Standard Models are perfect in the sense that they explain everything we can observe, but both contain mysteries we cannot explain. From the particle physics side, there’s the mystery of why the particle masses have the values that they do, while on the cosmology side, there are the mysteries of what dark matter and dark energy are, and why (and how) they came to dominate the Universe.

    The Universe according to the Standard Model

    Standard Model of Cosmology Timeline

    The big problem in all of this is that the Standard Model of particle physics explains everything we’ve ever observed — every particle, interaction, decay, etc. — perfectly. We’ve never observed a single interaction in a collider, a cosmic ray, or any other experiment that runs counter to the Standard Model’s predictions. The only experimental hint we have that the Standard Model doesn’t give us everything we observe is the fact of neutrino oscillations: where one type of neutrino transforms into another as it passes through space, and through matter in particular. This can only happen if neutrinos have a tiny, non-zero mass, as opposed to the massless neutrinos predicted by the Standard Model.

    5
    If you begin with an electron neutrino (black) and allow it to travel through either empty space or matter, it will have a certain probability of oscillating into one of the other two types, something that can only happen if neutrinos have very small but non-zero masses. Image credit: Wikimedia Commons user Strait.
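    For the curious, the standard two-flavor vacuum oscillation formula makes this point concrete: the probability of changing flavor vanishes unless the mass-squared difference is non-zero. The mixing angle and mass splitting below are illustrative, roughly atmospheric-scale numbers:

```python
import math

def p_oscillation(l_km, e_gev, sin2_2theta, dm2_ev2):
    """Two-flavor vacuum oscillation probability.

    P(nu_a -> nu_b) = sin^2(2*theta) * sin^2(1.27 * dm^2 * L / E),
    with L in km, E in GeV and dm^2 in eV^2. A nonzero dm^2,
    i.e. massive neutrinos, is the only thing that makes P nonzero.
    """
    return sin2_2theta * math.sin(1.27 * dm2_ev2 * l_km / e_gev) ** 2

# Roughly atmospheric-sector numbers, for illustration:
print(p_oscillation(l_km=500, e_gev=1.0, sin2_2theta=1.0, dm2_ev2=2.5e-3))
# With dm^2 = 0 (massless neutrinos) the probability vanishes:
print(p_oscillation(l_km=500, e_gev=1.0, sin2_2theta=1.0, dm2_ev2=0.0))
```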

    So, then, why and how do neutrinos get their masses, and why are those masses so tiny compared to everything else?

    6
    The mass ratio between an electron, the lightest normal Standard Model particle, and the heaviest possible neutrino is more than a factor of 4,000,000, a gap even larger than that between the electron and the top quark. Image credit: Hitoshi Murayama.

    There’s even more bizarreness afoot when you take a closer look at these particles. You see, every neutrino we’ve ever observed is left-handed, meaning if you point your left-hand’s thumb in a certain direction, your fingers curl in the direction of the neutrino’s spin. Every anti-neutrino, on the other hand (literally), is right-handed: your right thumb points in its direction of motion and your fingers curl in the direction of the anti-neutrino’s spin. Every other fermion that exists has a symmetry between particles and antiparticles, including equal numbers of left- and right-handed types. This bizarre property suggests that neutrinos are Majorana (rather than normal Dirac) fermions, meaning they could behave as their own antiparticles.

    Why could this be? The simplest answer is through an idea known as the see-saw mechanism.

    If you had “normal” neutrinos with typical masses — comparable to the other Standard Model particles (or the electroweak scale) — that would be expected. Left-handed and right-handed neutrinos would be balanced, and would have a mass of around 100 GeV. But if there were very heavy particles at some ultra-high scale (around 10¹⁵ GeV, typical of the grand unification scale), they could land on one side of the see-saw. This mass would get mixed together with the “normal” neutrinos, and you’d get two types of particles out:

    • a stable, neutral, weakly interacting, ultra-heavy right-handed neutrino (around 10¹⁵ GeV), made heavy by the large mass that landed on one side of the see-saw, and
    • a light, neutral, weakly interacting left-handed neutrino whose mass is the “normal” mass squared over the heavy mass: about (100 GeV)²/(10¹⁵ GeV), or around 0.01 eV (see the quick check below).
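    A quick back-of-the-envelope check of that arithmetic, with the 100 GeV and 10¹⁵ GeV inputs taken straight from the see-saw scenario described above:

```python
# Quick check of the see-saw arithmetic quoted above.
GEV_TO_EV = 1e9

m_dirac = 100.0      # "normal" electroweak-scale mass, GeV
m_heavy = 1e15       # grand-unification-scale mass, GeV

# See-saw: the light state's mass is the Dirac mass squared
# over the heavy mass.
m_light_gev = m_dirac**2 / m_heavy
print(f"Light neutrino: {m_light_gev * GEV_TO_EV:.2g} eV")  # 0.01 eV
print(f"Heavy neutrino: {m_heavy:.0e} GeV")                 # 1e15 GeV
```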

    That first type of particle could easily be the dark matter particle we need: a member of a class of cold dark matter candidates known as WIMPzillas. It could successfully reproduce the large-scale structure and gravitational effects we need to recover the observed Universe. Meanwhile, the second number lines up extremely well with the actual, allowable mass ranges of the neutrinos we have in our Universe today. Given the uncertainties of one or two orders of magnitude, this could describe exactly how neutrinos work. It gives a dark matter candidate, an explanation for why neutrinos would be so light, and three other interesting things.

    7
    The expected fates of the Universe (top three illustrations) all correspond to a Universe where the matter and energy fights against the initial expansion rate. In our observed Universe, a cosmic acceleration is caused by some type of dark energy, which is hitherto unexplained. Image credit: E. Siegel / Beyond the Galaxy.

    Dark energy. If you try to calculate the zero-point energy, or vacuum energy, of the Universe, you get a ridiculous number: somewhere around Λ ~ (10¹⁹ GeV)⁴. If you’ve ever heard people say that the prediction for dark energy is too large by about 120 orders of magnitude, this is where they get that number. But if you replace that 10¹⁹ GeV with the mass of the neutrino, at 0.01 eV, you get a number right around Λ ~ (0.01 eV)⁴, which comes out to match the value we measure almost exactly. This isn’t a proof of anything, but it’s extremely suggestive.
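    That “120 orders of magnitude” figure is easy to verify yourself, since the naive vacuum-energy estimate scales as the fourth power of the relevant energy scale:

```python
import math

# The naive vacuum-energy estimate scales as (energy scale)^4, so the
# mismatch between the Planck scale and the neutrino-mass scale is:
planck_scale_ev = 1e19 * 1e9     # ~10^19 GeV, expressed in eV
neutrino_scale_ev = 0.01         # ~0.01 eV

ratio = (planck_scale_ev / neutrino_scale_ev) ** 4
print(f"Mismatch: 10^{math.log10(ratio):.0f}")   # 10^120
```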

    8
    When the electroweak symmetry breaks, the combination of CP-violation and baryon number violation can create a matter/antimatter asymmetry where there was none before, owing to the effect of sphaleron interactions working on a neutrino excess. Image credit: University of Heidelberg.

    A baryon asymmetry. We need a way to generate more matter than antimatter in the early Universe, and if we have this see-saw scenario, it gives us a viable way to do it. These mixed-state neutrinos can create more leptons than anti-leptons through the neutrino sector, giving rise to a Universe-wide asymmetry. When the electroweak symmetry breaks, a series of interactions known as sphaleron interactions can then give rise to a Universe with more baryons than leptons, since baryon number (B) and lepton number (L) aren’t individually conserved: just the combination B − L. Whatever lepton asymmetry you start with, they’ll get converted into equal parts baryon and lepton asymmetry. For example, if you start with a lepton asymmetry of X, these sphalerons will naturally give you a Universe with an “extra” amount of protons and neutrons that equals X/2, while giving you that same X/2 amount of electrons and neutrinos combined.

    9
    When a nucleus experiences a double neutron decay, two electrons and two neutrinos get emitted conventionally. If neutrinos obey this see-saw mechanism and are Majorana particles, neutrinoless double beta decay should be possible. Experiments are actively looking for this. Image credit: Ludwig Niedermeier, Universitat Tubingen / GERDA.

    U Washington Majorana Demonstrator Experiment at SURF


    SURF building in Lead SD USA

    A new type of decay: neutrinoless double beta decay. The theoretical idea of a source for dark matter, dark energy, and the baryon asymmetry is fascinating, but you need an experiment to detect it. Until we can directly measure neutrinos (and anti-neutrinos) left over from the Big Bang, a feat that’s practically impossible due to the low cross-section of these low-energy neutrinos, we won’t know how to test whether neutrinos have these properties (Majorana) or not (Dirac). But if a double beta decay that emits no neutrinos occurs, we’ll know that neutrinos do have these (Majorana) properties after all, and all of this suddenly could be real.

    Perhaps ironically, the greatest advance in particle physics — a great leap forward beyond the Standard Model — might not come from our greatest experiments and detectors at high-energies, but from a humble, patient look for an ultra-rare decay. We’ve constrained neutrinoless double beta decay to have a lifetime of more than 2 × 10²⁵ years, but the next decade or two of experiments should measure this decay if it exists. So far, neutrinos are the only hint of particle physics beyond the Standard Model. If neutrinoless double beta decay turns out to be real, it might be the future of fundamental physics. It could solve the biggest cosmic questions plaguing humanity today. Our only choice is to look. If nature is kind to us, the future won’t be supersymmetry, extra dimensions, or string theory. We just might have a neutrino revolution on our hands.
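    To see just how patient these experiments must be, consider a rough count for a hypothetical 100 kilograms of germanium-76 (the isotope used by searches such as the Majorana Demonstrator and GERDA), treating the quoted bound as a half-life:

```python
import math

# How rare is a decay with half-life > 2e25 years? A rough count for a
# hypothetical 100 kg of germanium-76. Expected decays per year:
#   rate = N_atoms * ln(2) / T_half
AVOGADRO = 6.022e23
half_life_yr = 2e25      # the current lower bound quoted above, years
mass_g = 100e3           # assumed 100 kg of Ge-76
molar_mass = 76.0        # g/mol

n_atoms = mass_g / molar_mass * AVOGADRO
decays_per_year = n_atoms * math.log(2) / half_life_yr
print(f"Atoms: {n_atoms:.1e}")
print(f"Decays per year at the bound: {decays_per_year:.0f}")
# Under ~30 decays per year at most, which must be picked out from
# backgrounds: hence ultra-quiet underground detectors and years of data.
```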

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    “Starts With A Bang! is a blog/video blog about cosmology, physics, astronomy, and anything else I find interesting enough to write about. I am a firm believer that the highest good in life is learning, and the greatest evil is willful ignorance. The goal of everything on this site is to help inform you about our world, how we came to be here, and to understand how it all works. As I write these pages for you, I hope to not only explain to you what we know, think, and believe, but how we know it, and why we draw the conclusions we do. It is my hope that you find this interesting, informative, and accessible,” says Ethan

     
  • richardmitnick 1:47 pm on July 19, 2017 Permalink | Reply
    Tags: , , , , , , Scientists Are Using the Universe as a "Cosmological Collider", Standard Model of Particle Physics   

    From CfA: “Scientists Are Using the Universe as a ‘Cosmological Collider’” 

    Harvard Smithsonian Center for Astrophysics


    Center For Astrophysics

    July 19, 2017
    Megan Watzke
    Harvard-Smithsonian Center for Astrophysics
    +1 617-496-7998
    mwatzke@cfa.harvard.edu

    Peter Edmonds
    Harvard-Smithsonian Center for Astrophysics
    +1 617-571-7279
    pedmonds@cfa.harvard.edu

    1

    Physicists are capitalizing on a direct connection between the largest cosmic structures and the smallest known objects to use the universe as a “cosmological collider” and investigate new physics.

    The three-dimensional map of galaxies throughout the cosmos and the leftover radiation from the Big Bang – called the cosmic microwave background (CMB) – are the largest structures in the universe that astrophysicists observe using telescopes.

    CMB per ESA/Planck

    ESA/Planck

    Subatomic elementary particles, on the other hand, are the smallest known objects in the universe that particle physicists study using particle colliders.

    LHC

    CERN/LHC Map

    CERN LHC Tunnel

    CERN LHC particles

    A team including Xingang Chen of the Harvard-Smithsonian Center for Astrophysics (CfA), Yi Wang from the Hong Kong University of Science and Technology (HKUST) and Zhong-Zhi Xianyu from the Center for Mathematical Sciences and Applications at Harvard University has used these extremes of size to probe fundamental physics in an innovative way. They have shown how the properties of the elementary particles in the Standard Model of particle physics may be inferred by studying the largest cosmic structures. This connection is made through a process called cosmic inflation.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    Inflationary Universe. NASA/WMAP

    Cosmic inflation is the most widely accepted theoretical scenario to explain what preceded the Big Bang. This theory predicts that the size of the universe expanded at an extraordinary and accelerating rate in the first fleeting fraction of a second after the universe was created.

    Lambda-Cold Dark Matter, Accelerated Expansion of the Universe, Big Bang-Inflation (timeline of the universe). Date: 2010. Credit: Alex Mittelmann, Coldcreation

    It was a highly energetic event, during which all particles in the universe were created and interacted with each other. This is similar to the environment physicists try to create in ground-based colliders, with the exception that its energy can be 10 billion times larger than that of any collider humans can build.

    Inflation was followed by the Big Bang, where the cosmos continued to expand for more than 13 billion years, but the expansion rate slowed down with time. Microscopic structures created in these energetic events got stretched across the universe, resulting in regions that were slightly denser or less dense than surrounding areas in the otherwise very homogeneous early universe. As the universe evolved, the denser regions attracted more and more matter due to gravity. Eventually, the initial microscopic structures seeded the large-scale structure of our universe, and determined the locations of galaxies throughout the cosmos.

    In ground-based colliders, physicists and engineers build instruments to read the results of the colliding events. The question is then how we should read the results of the cosmological collider.

    “Several years ago, Yi Wang and I, Nima Arkani-Hamed and Juan Maldacena from the Institute for Advanced Study, and several other groups, discovered that the results of this cosmological collider are encoded in the statistics of the initial microscopic structures. As time passes, they become imprinted in the statistics of the spatial distribution of the universe’s contents, such as galaxies and the cosmic microwave background, that we observe today,” said Xingang Chen. “By studying the properties of these statistics we can learn more about the properties of elementary particles.”

    As in ground-based colliders, before scientists explore new physics, it is crucial to understand the behavior of known fundamental particles in this cosmological collider, as described by the Standard Model of particle physics.

    “The relative number of fundamental particles that have different masses – what we call the mass spectrum – in the Standard Model has a special pattern, which can be viewed as the fingerprint of the Standard Model,” explained Zhong-Zhi Xianyu. “However, this fingerprint changes as the environment changes, and would have looked very different at the time of inflation from how it looks now.”

    The team showed what the mass spectrum of the Standard Model would look like for different inflation models. They also showed how this mass spectrum is imprinted in the appearance of the large-scale structure of our universe. This study paves the way for the future discovery of new physics.

    “The ongoing observations of the CMB and large-scale structure have achieved impressive precision from which valuable information about the initial microscopic structures can be extracted,” said Yi Wang. “In this cosmological collider, any observational signal that deviates from that expected for particles in the Standard Model would then be a sign of new physics.”

    The current research is only a small step towards an exciting era when precision cosmology will show its full power.

    “If we are lucky enough to observe these imprints, we would not only be able to study particle physics and fundamental principles in the early universe, but also better understand cosmic inflation itself. In this regard, there are still a whole universe of mysteries to be explored,” said Xianyu.

    This research is detailed in a paper published in the journal Physical Review Letters on June 29, 2017, and the preprint is available online.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The Center for Astrophysics combines the resources and research facilities of the Harvard College Observatory and the Smithsonian Astrophysical Observatory under a single director to pursue studies of those basic physical processes that determine the nature and evolution of the universe. The Smithsonian Astrophysical Observatory (SAO) is a bureau of the Smithsonian Institution, founded in 1890. The Harvard College Observatory (HCO), founded in 1839, is a research institution of the Faculty of Arts and Sciences, Harvard University, and provides facilities and substantial other support for teaching activities of the Department of Astronomy.

     