Tagged: Standard Model

  • richardmitnick 1:25 pm on April 14, 2017 Permalink | Reply
Tags: Standard Model

    From Ethan Siegel: “Can muons — which live for microseconds — save experimental particle physics?” 

    Ethan Siegel

    Apr 14, 2017

    You lose whether you use protons or electrons in your collider, for different reasons. Could the unstable muon solve both problems?

    A four-muon candidate event in the ATLAS detector at the Large Hadron Collider. The muon/anti-muon tracks are highlighted in red, as the long-lived muons travel farther than any other unstable particle. Image credit: ATLAS Collaboration / CERN.

    “It does not matter how slowly you go as long as you do not stop.” -Confucius

    High-energy physics is facing its greatest crisis ever. The Standard Model is complete, as all the particles our most successful physics theories have predicted have been discovered.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    The Large Hadron Collider at CERN, the most energetic particle collider ever developed (with more than six times the energies of any prior collider), discovered the long-sought-after Higgs boson, but nothing else.

    CERN/LHC Map

    CERN LHC Tube


    LHC at CERN

    Traditionally, the way to discover new particles has been to go to higher energies with one of two strategies:

    Collide electrons and positrons, getting a “clean” signal where 100% of the collider energy goes into producing new particles.
    Collide protons and either anti-protons or other protons, getting a messy signal but reaching higher energies due to the heavier mass of the proton.

    Both methods have their limitations, but one unstable particle might give us a third option to make the elusive breakthrough we desperately need: the muon.

    The known particles in the Standard Model. These are all the fundamental particles that have been directly discovered. Image credit: E. Siegel.

    The Standard Model is made up of all the fundamental particles and antiparticles we’ve ever discovered. They include six quarks and antiquarks, each in three colors, three charged leptons and three types of neutrino, along with their antiparticle counterparts, and the bosons: the photon, the weak bosons (W+, W-, Z0), the eight gluons (with color/anticolor combinations attached), and the Higgs boson. While countless different combinations of these particles exist in nature, only a precious few are stable. The electron, photon, proton (made of two up and one down quark), and, if they’re bound together in nuclei, the neutron (with two down and one up quark) are stable, along with their antimatter counterparts. That’s why all the normal matter we see in the Universe is made up of protons, neutrons, and electrons; nothing else with any significant interactions is stable.

    While many unstable particles, both fundamental and composite, can be produced in particle physics, only protons, neutrons (bound in nuclei) and the electron are stable, along with their antimatter counterparts and the photon. Everything else is short-lived. Image credit: Contemporary Physics Education Project (CPEP), U.S. Department of Energy / NSF / LBNL.

The way you create these unstable particles is by colliding the stable ones together at high enough energies. Because of a fundamental principle of nature — mass/energy equivalence, given by Einstein’s E = mc² — you can turn pure energy into mass if you have enough of it. (So long as you obey all the other conservation laws.) This is exactly the way we’ve created almost all the other particles of the Standard Model: by colliding particles into one another at enough energy that the energy you get out (E) is high enough to create the new particles (of mass m) you’re attempting to discover.

    The particle tracks emanating from a high energy collision at the LHC in 2014 show the creation of many new particles. It’s only because of the high-energy nature of this collision that new masses can be created.

    We know there are almost certainly more particles beyond the ones we’ve discovered; we expect there to be particle explanations for mysteries like the baryon asymmetry (why there’s more matter than antimatter), the missing mass problem in the Universe (what we suspect will be solved by dark matter), the neutrino mass problem (why they’re so incredibly light), the quantum nature of gravity (i.e., there should be a force-carrying particle for the gravitational interaction, like the graviton), and the strong-CP problem (why certain decays don’t happen), among others. But our colliders haven’t reached the energies necessary to uncover those new particles, if they even exist. What’s even worse: both of the current methods have severe drawbacks that may prohibit us from building colliders that go to significantly higher energies.

    The Large Hadron Collider is the current record-holder, accelerating protons up to energies of 6.5 TeV apiece before smashing them together. The energy you can reach is directly proportional to two things only: the radius of your accelerator (R) and the strength of the magnetic field used to bend the protons into a circle (B). Collide those two protons together, and they hit with an energy of 13 TeV. But you’ll never make a 13 TeV particle colliding two protons at the LHC; only a fraction of that energy is available to create new particles via E = mc². The reason? A proton is made of multiple, composite particles — quarks, gluons, and even quark/antiquark pairs inside — meaning that only a tiny fraction of that energy goes into making new, massive particles.
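Why only R and B? A ring can only hold a charged particle up to the momentum set by its magnetic rigidity: for a unit charge, roughly p [GeV/c] ≈ 0.3 × B [tesla] × R [metres]. A quick sketch (the field and bending-radius values below are approximate public LHC figures, used purely for illustration):

```python
# Magnetic rigidity: the momentum a ring of bending radius R and dipole field B can hold.
# For an ultra-relativistic particle of unit charge, p [GeV/c] ~ 0.3 * B [T] * R [m].
B_tesla = 8.33    # approximate LHC design dipole field
R_metres = 2804   # approximate effective bending radius of the LHC dipoles
p_TeV = 0.3 * B_tesla * R_metres / 1000
print(f"maximum proton energy ~ {p_TeV:.1f} TeV")  # ~7 TeV: the LHC design energy per beam
```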

    A candidate Higgs event in the ATLAS detector. Note how even with the clear signatures and transverse tracks, there is a shower of other particles; this is due to the fact that protons are composite particles. Image credit: The ATLAS collaboration / CERN.

    CERN ATLAS Higgs Event

    CERN/ATLAS detector

    You might think to use fundamental particles instead, then, like electrons and positrons. If you were to put them in the same ring (with the same R) and subject them to the same magnetic field (the same B), you might think you could reach the same energies, only this time, 100% of the energy could make new particles. And that would be true, if it weren’t for one factor: synchrotron radiation. You see, when you accelerate a charged particle in a magnetic field, it gives off radiation. Because a proton is so massive compared to its electric charge, that radiation is negligible, and you can take protons up to the highest energies we’ve ever reached without worrying about it. But electrons and positrons are only 1/1836th of a proton’s mass, and synchrotron radiation would limit them to only about 0.114 TeV of energy under the same conditions.
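The severity of that limit is easy to estimate: the energy radiated per turn grows as the fourth power of the beam energy and falls as the fourth power of the particle mass. A rough sketch (the constant and ring radius are approximate; this is an illustration, not a beam-physics calculation):

```python
# Synchrotron radiation loss per turn in a ring of bending radius R:
# U0 [GeV] ~ 8.85e-5 * E[GeV]^4 / R[m] for electrons; a particle k times heavier
# than the electron loses a factor k^4 less at the same energy.
R_m = 2804.0  # approximate LHC bending radius

def loss_per_turn_GeV(E_GeV, mass_ratio=1.0):
    """Energy radiated per turn; mass_ratio = particle mass / electron mass."""
    return 8.85e-5 * E_GeV**4 / R_m / mass_ratio**4

print(loss_per_turn_GeV(114.0))           # electrons at 0.114 TeV: ~5 GeV lost per turn already
print(loss_per_turn_GeV(6500.0, 1836.0))  # protons at 6.5 TeV: ~5e-6 GeV per turn -- negligible
```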

    Relativistic electrons and positrons can be accelerated to very high speeds, but will emit synchrotron radiation (blue) at high enough energies, preventing them from moving faster. Image credit: Chung-Li Dong, Jinghua Guo, Yang-Yuan Chen, and Chang Ching-Lin, ‘Soft-x-ray spectroscopy probes nanomaterial-based devices’.

    But there’s a third option that’s never been put into practice: use muons and anti-muons. A muon is just like an electron in the sense that it’s a fundamental particle, it’s charged, it’s a lepton, but it’s 206 times heavier than the electron. This is massive enough that synchrotron radiation doesn’t matter for muons or anti-muons, which is great! The only downside? The muon is unstable, with a mean lifetime of only 2.2 microseconds before decaying away.

    The prototype MICE 201-megahertz RF module, with the copper cavity mounted, is shown during assembly at Fermilab. This apparatus could focus and collimate a muon beam, enabling the muons to be accelerated and survive for much longer than 2.2 microseconds. Image credit: Y. Torun / IIT / Fermilab Today.

    That might be okay, though, because special relativity can rescue us! When you bring an unstable particle close to the speed of light, the amount of time that it lives increases dramatically, thanks to the relativistic phenomenon of time dilation. If you brought a muon all the way up to 6.5 TeV of energy, it would live for 135,000 microseconds: enough time to circle the Large Hadron Collider 1,500 times before decaying away. And this time, your hopes would be absolutely true: 100% of that energy, 6.5 TeV + 6.5 TeV = 13 TeV, would be available for particle creation.
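Those figures follow directly from the Lorentz factor γ = E/mc², which stretches the muon’s 2.2-microsecond clock. A quick check (constants are approximate):

```python
# Time dilation for a 6.5 TeV muon circulating in an LHC-sized ring.
muon_mass_GeV = 0.10566        # muon rest mass-energy
rest_lifetime_s = 2.2e-6       # mean lifetime at rest
E_GeV = 6500.0
circumference_m = 26659.0      # approximate LHC circumference

gamma = E_GeV / muon_mass_GeV                    # ~61,500
lab_lifetime_s = gamma * rest_lifetime_s         # ~0.135 s, i.e. ~135,000 microseconds
laps = lab_lifetime_s * 3.0e8 / circumference_m  # travelling at essentially c
print(f"gamma ~ {gamma:.0f}, lab lifetime ~ {lab_lifetime_s*1e6:.0f} us, ~{laps:.0f} laps")
```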

    A design plan for a full-scale muon-antimuon collider at Fermilab, the source of the world’s second-most powerful particle accelerator. Image credit: Fermilab.

    We can always build a bigger ring or invent stronger magnets, and we may well do exactly that. But there’s no cure for synchrotron radiation except to use heavier particles, and there’s no cure for energy spreading out among the components of composite particles other than not to use them at all. Muons are unstable and difficult to keep alive for a long time, but as we get to higher and higher energies, that task gets progressively easier. Muon colliders have long been touted as a mere pipe dream, but recent progress by the MICE collaboration — for Muon Ionization Cooling Experiment — has demonstrated that this may be possible after all. A circular muon/anti-muon collider may be the particle accelerator that takes us beyond the LHC’s reach, and, if we’re lucky, into the realm of the new physics we’re so desperately seeking.

See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    “Starts With A Bang! is a blog/video blog about cosmology, physics, astronomy, and anything else I find interesting enough to write about. I am a firm believer that the highest good in life is learning, and the greatest evil is willful ignorance. The goal of everything on this site is to help inform you about our world, how we came to be here, and to understand how it all works. As I write these pages for you, I hope to not only explain to you what we know, think, and believe, but how we know it, and why we draw the conclusions we do. It is my hope that you find this interesting, informative, and accessible,” says Ethan

     
  • richardmitnick 12:52 pm on March 16, 2017 Permalink | Reply
Tags: Nautilus, Standard Model, Supersymmetry

    From Nautilus: “A Brief History of the Grand Unified Theory of Physics” 

    Nautilus


    March 16, 2017
    Lawrence M. Krauss
    Paintings by Jonathan Feldschuh

    Particle physicists had two nightmares before the Higgs particle was discovered in 2012. The first was that the Large Hadron Collider (LHC) particle accelerator would see precisely nothing.


    CERN ATLAS Higgs Event

    CERN ATLAS detector


    CERN CMS Higgs Event


    CERN CMS detector




    LHC at CERN

    For if it did, it would likely be the last large accelerator ever built to probe the fundamental makeup of the cosmos. The second was that the LHC would discover the Higgs particle predicted by theoretical physicist Peter Higgs in 1964 … and nothing else.

    Each time we peel back one layer of reality, other layers beckon. So each important new development in science generally leaves us with more questions than answers. But it also usually leaves us with at least the outline of a road map to help us begin to seek answers to those questions. The successful discovery of the Higgs particle, and with it the validation of the existence of an invisible background Higgs field throughout space (in the quantum world, every particle like the Higgs is associated with a field), was a profound validation of the bold scientific developments of the 20th century.

    Particles #22

However, the words of Sheldon Glashow continue to ring true: The Higgs is like a toilet. It hides all the messy details we would rather not speak of. The Higgs field interacts with most elementary particles as they travel through space, producing a resistive force that slows their motion and makes them appear massive. Thus, the masses of elementary particles that we measure, and that make the world of our experience possible, are something of an illusion—an accident of our particular experience.

    As elegant as this idea might be, it is essentially an ad hoc addition to the Standard Model of physics—which explains three of the four known forces of nature, and how these forces interact with matter.


    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    It is added to the theory to do what is required to accurately model the world of our experience. But it is not required by the theory. The universe could have happily existed with massless particles and a long-range weak force (which, along with the strong force, gravity, and electromagnetism, make up the four known forces). We would just not be here to ask about them. Moreover, the detailed physics of the Higgs is undetermined within the Standard Model alone. The Higgs could have been 20 times heavier, or 100 times lighter.

    Why, then, does the Higgs exist at all? And why does it have the mass it does? (Recognizing that whenever scientists ask “Why?” we really mean “How?”) If the Higgs did not exist, the world we see would not exist, but surely that is not an explanation. Or is it? Ultimately to understand the underlying physics behind the Higgs is to understand how we came to exist. When we ask, “Why are we here?,” at a fundamental level we may as well be asking, “Why is the Higgs here?” And the Standard Model gives no answer to this question.

Some hints do exist, however, coming from a combination of theory and experiment. Shortly after the fundamental structure of the Standard Model became firmly established, in 1974, and well before the details were experimentally verified over the next decade, two different groups of physicists at Harvard, where both Sheldon Glashow and Steven Weinberg were working, noticed something interesting. Glashow, along with Howard Georgi, did what Glashow did best: They looked for patterns among the existing particles and forces and sought out new possibilities using the mathematics of group theory.

    In the Standard Model the weak and electromagnetic forces of nature are unified at a high-energy scale, into a single force that physicists call the “electroweak force.” This means that the mathematics governing the weak and electromagnetic forces are the same, both constrained by the same mathematical symmetry, and the two forces are different reflections of a single underlying theory. But the symmetry is “spontaneously broken” by the Higgs field, which interacts with the particles that convey the weak force, but not the particles that convey the electromagnetic force. This accident of nature causes these two forces to appear as two separate and distinct forces at scales we can measure—with the weak force being short-range and electromagnetism remaining long-range.

    Georgi and Glashow tried to extend this idea to include the strong force, and discovered that all of the known particles and the three non-gravitational forces could naturally fit within a single fundamental symmetry structure. They then speculated that this symmetry could spontaneously break at some ultrahigh energy scale (and short distance scale) far beyond the range of current experiments, leaving two separate and distinct unbroken symmetries left over—resulting in separate strong and electroweak forces. Subsequently, at a lower energy and larger distance scale, the electroweak symmetry would break, separating the electroweak force into the short-range weak and the long-range electromagnetic force.

    They called such a theory, modestly, a Grand Unified Theory (GUT).

    At around the same time, Weinberg and Georgi along with Helen Quinn noticed something interesting—following the work of Frank Wilczek, David Gross, and David Politzer. While the strong interaction got weaker at smaller distance scales, the electromagnetic and weak interactions got stronger.

It didn’t take a rocket scientist to wonder whether the strengths of the three different interactions might become identical at some small distance scale. When they did the calculations, they found (with the accuracy with which the interactions were then measured) that such a unification looked possible, but only if the scale of unification was about 15 orders of magnitude smaller than the size of the proton.
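The structure of that calculation is simple enough to sketch. At one loop, each inverse coupling runs linearly in the logarithm of the energy scale; the Python below uses standard approximate values for the couplings at the Z mass and the Standard Model beta coefficients (textbook approximations, not the original 1974 inputs):

```python
import numpy as np

# One-loop running: alpha_i^-1(mu) = alpha_i^-1(M_Z) - b_i/(2*pi) * ln(mu / M_Z)
M_Z = 91.19                                  # GeV
alpha_inv_MZ = np.array([59.0, 29.6, 8.45])  # U(1)_Y (GUT-normalized), SU(2)_L, SU(3)_c
b_SM = np.array([41/10, -19/6, -7])          # Standard Model one-loop coefficients

def crossing_scale(i, j):
    """Energy at which couplings i and j become equal."""
    t = 2 * np.pi * (alpha_inv_MZ[i] - alpha_inv_MZ[j]) / (b_SM[i] - b_SM[j])
    return M_Z * np.exp(t)

for i, j in [(0, 1), (0, 2), (1, 2)]:
    print(f"alpha_{i+1} meets alpha_{j+1} near {crossing_scale(i, j):.0e} GeV")
# The pairwise crossings land between ~1e13 and ~1e17 GeV: with the measured couplings
# and only Standard Model particles, the three forces almost -- but not quite -- unify.
```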

    This was good news if the unified theory was the one proposed by Howard Georgi and Glashow—because if all the particles we observe in nature got unified this way, then new particles (called gauge bosons) would exist that produce transitions between quarks (which make up protons and neutrons), and electrons and neutrinos. That would mean protons could decay into other lighter particles, which we could potentially observe. As Glashow put it, “Diamonds aren’t forever.”

Even then it was known that protons must have an incredibly long lifetime. Not just because we still exist almost 14 billion years after the big bang, but because we don’t all die of cancer as children. If protons decayed with an average lifetime smaller than about a billion billion years, then enough protons would decay in our bodies during our childhood to produce enough radiation to kill us. Remember that in quantum mechanics, processes are probabilistic. If an average proton lives a billion billion years, and if one has a billion billion protons, then on average one will decay each year. There are a lot more than a billion billion protons in our bodies.
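The estimate behind that statement takes two lines of arithmetic; a rough sketch (the body-composition numbers are loose assumptions made for illustration):

```python
# Expected proton decays per year in a human body, if the lifetime were 1e18 years.
proton_mass_kg = 1.67e-27
protons_in_body = 0.5 * 70 / proton_mass_kg         # assume a 70 kg body is ~half protons by mass
lifetime_years = 1e18                               # "a billion billion years"
decays_per_year = protons_in_body / lifetime_years  # expected decays ~ N * t / tau for t << tau
print(f"~{decays_per_year:.0e} decays per year")    # ~2e10: a steady internal radiation dose
```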

    However, with the incredibly small proposed distance scale and therefore the incredibly large mass scale associated with spontaneous symmetry breaking in Grand Unification, the new gauge bosons would get large masses. That would make the interactions they mediate be so short-range that they would be unbelievably weak on the scale of protons and neutrons today. As a result, while protons could decay, they might live, in this scenario, perhaps a million billion billion billion years before decaying. Still time to hold onto your growth stocks.

    With the results of Glashow and Georgi, and Georgi, Quinn, and Weinberg, the smell of grand synthesis was in the air. After the success of the electroweak theory, particle physicists were feeling ambitious and ready for further unification.

    How would one know if these ideas were correct, however? There was no way to build an accelerator to probe an energy scale a million billion times greater than the rest mass energy of protons. Such a machine would have to have a circumference of the moon’s orbit. Even if it was possible, considering the earlier debacle over the Superconducting Super Collider, no government would ever foot the bill.


    Superconducting Super Collider map, in the vicinity of Waxahachie, Texas.

    Happily, there was another way, using the kind of probability arguments I just presented that give limits to the proton lifetime. If the new Grand Unified Theory predicted a proton lifetime of, say, a thousand billion billion billion years, then if one could put a thousand billion billion billion protons in a single detector, on average one of them would decay each year.

    Where could one find so many protons? Simple: in about 3,000 tons of water.
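The count is straightforward Avogadro-number arithmetic; a quick sketch:

```python
# Protons in 3,000 metric tons of water: each H2O molecule carries 10 protons
# (8 in the oxygen nucleus, 1 in each hydrogen).
avogadro = 6.022e23
molar_mass_water_g = 18.0
mass_g = 3000 * 1e6   # 3,000 tons in grams
protons = 10 * (mass_g / molar_mass_water_g) * avogadro
print(f"~{protons:.0e} protons")  # ~1e33: a thousand billion billion billion
```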

    So all that was required was to get a tank of water, put it in the dark, make sure there were no radioactivity backgrounds, surround it with sensitive phototubes that can detect flashes of light in the detector, and then wait for a year to see a burst of light when a proton decayed. As daunting as this may seem, at least two large experiments were commissioned and built to do just this, one deep underground next to Lake Erie in a salt mine, and one in a mine near Kamioka, Japan. The mines were necessary to screen out incoming cosmic rays that would otherwise produce a background that would swamp any proton decay signal.

Both experiments began taking data around 1982–83. Grand Unification seemed so compelling that the physics community was confident a signal would soon appear, and that Grand Unification would mark the culmination of a decade of amazing change and discovery in particle physics—not to mention another Nobel Prize for Glashow and maybe some others.

    Unfortunately, nature was not so kind in this instance. No signals were seen in the first year, the second, or the third. The simplest elegant model proposed by Glashow and Georgi was soon ruled out. But once the Grand Unification bug had caught on, it was not easy to let it go. Other proposals were made for unified theories that might cause proton decay to be suppressed beyond the limits of the ongoing experiments.

    On Feb. 23, 1987, however, another event occurred that demonstrates a maxim I have found is almost universal: Every time we open a new window on the universe, we are surprised. On that day a group of astronomers observed, in photographic plates obtained during the night, the closest exploding star (a supernova) seen in almost 400 years.

    NASA is celebrating the 30th anniversary of SN 1987A by releasing new data.

    The star, about 160,000 light-years away, was in the Large Magellanic Cloud—a small satellite galaxy of the Milky Way observable in the southern hemisphere.


    Large Magellanic Cloud. Adrian Pingstone December 2003

If our ideas about exploding stars are correct, most of the energy released should be in the form of neutrinos, even though the visible light released is so great that supernovas are the brightest cosmic fireworks in the sky when they explode (at a rate of about one explosion per 100 years per galaxy). Rough estimates then suggested that the huge IMB (Irvine-Michigan-Brookhaven) and Kamiokande water detectors should see about 20 neutrino events.

Irvine-Michigan-Brookhaven detector


    Super Kamiokande detector

    When the IMB and Kamiokande experimentalists went back and reviewed their data for that day, lo and behold IMB displayed eight candidate events in a 10-second interval, and Kamiokande displayed 11 such events. In the world of neutrino physics, this was a flood of data. The field of neutrino astrophysics had suddenly reached maturity. These 19 events produced perhaps 1,900 papers by physicists, such as me, who realized that they provided an unprecedented window into the core of an exploding star, and a laboratory not just for astrophysics but also for the physics of neutrinos themselves.

    Spurred on by the realization that large proton-decay detectors might serve a dual purpose as new astrophysical neutrino detectors, several groups began to build a new generation of such dual-purpose detectors. The largest one in the world was again built in the Kamioka mine and was called Super-Kamiokande, and with good reason. This mammoth 50,000-ton tank of water, surrounded by 11,800 phototubes, was operated in a working mine, yet the experiment was maintained with the purity of a laboratory clean room. This was absolutely necessary because in a detector of this size one had to worry not only about external cosmic rays, but also about internal radioactive contaminants in the water that could swamp any signals being searched for.

    Meanwhile, interest in a related astrophysical neutrino signature also reached a new high during this period. The sun produces neutrinos due to the nuclear reactions in its core that power it, and over 20 years, using a huge underground detector, physicist Ray Davis had detected solar neutrinos, but had consistently found an event rate about a factor of three below what was predicted using the best models of the sun. A new type of solar neutrino detector was built inside a deep mine in Sudbury, Canada, which became known as the Sudbury Neutrino Observatory (SNO).


    SNOLAB, Sudbury, Ontario, Canada.

Super-Kamiokande has now been operating almost continuously, through various upgrades, for more than 20 years. No proton-decay signals have been seen, and no new supernovas observed. However, the precision observations of neutrinos at this huge detector, combined with complementary observations at SNO, definitively established that the solar neutrino deficit observed by Ray Davis is real, and moreover that it is not due to astrophysical effects in the sun but rather to the properties of neutrinos. The implication was that at least one of the three known types of neutrinos is not massless. Since the Standard Model does not accommodate neutrino masses, this was the first definitive observation that some new physics, beyond the Standard Model and beyond the Higgs, must be operating in nature.

Soon after this, observations of the higher-energy neutrinos that regularly bombard Earth (produced when high-energy cosmic-ray protons hit the atmosphere and create downward showers of particles, including neutrinos) demonstrated that yet a second type of neutrino has mass. This mass is somewhat larger, but still far smaller than the mass of the electron. For these results, team leaders at SNO and Super-Kamiokande were awarded the 2015 Nobel Prize in Physics—a week before I wrote the first draft of these words. To date these tantalizing hints of new physics are not explained by current theories.
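The behaviour of those atmospheric neutrinos is governed by the standard two-flavour oscillation formula, P = sin²(2θ)·sin²(1.27 Δm² L/E). A minimal sketch (the mixing parameters are approximate present-day values, added here for illustration rather than taken from Krauss’s text):

```python
import numpy as np

# Two-flavour vacuum oscillation probability, with dm2 in eV^2, L in km, E in GeV.
def p_oscillation(L_km, E_GeV, dm2_eV2=2.5e-3, sin2_2theta=1.0):
    return sin2_2theta * np.sin(1.27 * dm2_eV2 * L_km / E_GeV) ** 2

# A 1 GeV muon neutrino made overhead (~15 km away) vs. one coming up through
# the whole Earth (~12,700 km): the up-down difference is what Super-Kamiokande saw.
print(p_oscillation(15, 1.0))      # ~0.002: essentially unoscillated
print(p_oscillation(12700, 1.0))   # single-energy value; the rapid oscillation averages to ~0.5 over a real flux
```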

    The absence of proton decay, while disappointing, turned out to be not totally unexpected. Since Grand Unification was first proposed, the physics landscape had shifted slightly. More precise measurements of the actual strengths of the three non-gravitational interactions—combined with more sophisticated calculations of the change in the strength of these interactions with distance—demonstrated that if the particles of the Standard Model are the only ones existing in nature, the strength of the three forces will not unify at a single scale. In order for Grand Unification to take place, some new physics at energy scales beyond those that have been observed thus far must exist. The presence of new particles would not only change the energy scale at which the three known interactions might unify, it would also tend to drive up the Grand Unification scale and thus suppress the rate of proton decay—leading to predicted lifetimes in excess of a million billion billion billion years.

    As these developments were taking place, theorists were driven by new mathematical tools to explore a possible new type of symmetry in nature, which became known as supersymmetry.


    Standard model of Supersymmetry DESY

    This fundamental symmetry is different from any previous known symmetry, in that it connects the two different types of particles in nature, fermions (particles with half-integer spins) and bosons (particles with integer spins). The upshot of this is that if this symmetry exists in nature, then for every known particle in the Standard Model at least one corresponding new elementary particle must exist. For every known boson there must exist a new fermion. For every known fermion there must exist a new boson.

    Since we haven’t seen these particles, this symmetry cannot be manifest in the world at the level we experience it, and it must be broken, meaning the new particles will all get masses that could be heavy enough so that they haven’t been seen in any accelerator constructed thus far.

What could be so attractive about a symmetry that suddenly doubles all the particles in nature without any evidence of any of the new particles? In large part the seduction lay in the very fact of Grand Unification. If a Grand Unified Theory exists at a mass scale 15 to 16 orders of magnitude higher in energy than the rest-mass energy of the proton, that scale is also about 13 orders of magnitude higher than the scale of electroweak symmetry breaking. The big question is why and how such a huge difference in scales can exist for the fundamental laws of nature. In particular, if the Standard Model Higgs is the true last remnant of the Standard Model, then the question arises, Why is the energy scale of Higgs symmetry breaking 13 orders of magnitude smaller than the scale of symmetry breaking associated with whatever new field must be introduced to break the GUT symmetry into its separate component forces?

    ____________________________________________________________________________
    Following three years of LHC runs, there are no signs of supersymmetry whatsoever.
    ____________________________________________________________________________

The problem is a little more severe than it appears. When one considers the effects of virtual particles (which appear and disappear on timescales so short that their existence can only be probed indirectly), including particles of arbitrarily large mass, such as the gauge particles of a presumed Grand Unified Theory, these tend to drive up the mass and symmetry-breaking scale of the Higgs so that it essentially becomes close to, or identical to, the heavy GUT scale. This generates a problem that has become known as the naturalness problem. It is technically unnatural to have a huge hierarchy between the scale at which the electroweak symmetry is broken by the Higgs particle and the scale at which the GUT symmetry is broken by whatever new heavy scalar field breaks that symmetry.

    The mathematical physicist Edward Witten argued in an influential paper in 1981 that supersymmetry had a special property. It could tame the effect that virtual particles of arbitrarily high mass and energy have on the properties of the world at the scales we can currently probe. Because virtual fermions and virtual bosons of the same mass produce quantum corrections that are identical except for a sign, if every boson is accompanied by a fermion of equal mass, then the quantum effects of the virtual particles will cancel out. This means that the effects of virtual particles of arbitrarily high mass and energy on the physical properties of the universe on scales we can measure would now be completely removed.
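Schematically (in standard notation, not Krauss’s own equations), the quadratically divergent shifts to the Higgs mass from a fermion loop and from its scalar superpartner loop enter with opposite signs:

```latex
\delta m_H^2 \;\sim\; \frac{\Lambda^2}{8\pi^2}\left(\lambda_S - |\lambda_f|^2\right) \;+\; \text{(logarithmic terms)}
```

Here Λ is the high-energy cutoff. Unbroken supersymmetry enforces λ_S = |λ_f|² for each fermion/superpartner pair, so the Λ² piece cancels exactly.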

If, however, supersymmetry is itself broken (as it must be, or all the supersymmetric partners of ordinary matter would have the same mass as the observed particles, and we would already have observed them), then the quantum corrections will not quite cancel out. Instead they would yield contributions to masses of the same order as the supersymmetry-breaking scale. If that scale were comparable to the scale of the electroweak symmetry breaking, then it would explain why the Higgs mass scale is what it is.

    And it also means we should expect to begin to observe a lot of new particles—the supersymmetric partners of ordinary matter—at the scale currently being probed at the LHC.

    This would solve the naturalness problem because it would protect the Higgs boson masses from possible quantum corrections that could drive them up to be as large as the energy scale associated with Grand Unification. Supersymmetry could allow a “natural” large hierarchy in energy (and mass) separating the electroweak scale from the Grand Unified scale.

    That supersymmetry could in principle solve the hierarchy problem, as it has become known, greatly increased its stock with physicists. It caused theorists to begin to explore realistic models that incorporated supersymmetry breaking and to explore the other physical consequences of this idea. When they did so, the stock price of supersymmetry went through the roof. For if one included the possibility of spontaneously broken supersymmetry into calculations of how the three non-gravitational forces change with distance, then suddenly the strength of the three forces would naturally converge at a single, very small-distance scale. Grand Unification became viable again!

    Models in which supersymmetry is broken have another attractive feature. It was pointed out, well before the top quark was discovered, that if the top quark was heavy, then through its interactions with other supersymmetric partners, it could produce quantum corrections to the Higgs particle properties that would cause the Higgs field to form a coherent background field throughout space at its currently measured energy scale if Grand Unification occurred at a much higher, superheavy scale. In short, the energy scale of electroweak symmetry breaking could be generated naturally within a theory in which Grand Unification occurs at a much higher energy scale. When the top quark was discovered and indeed was heavy, this added to the attractiveness of the possibility that supersymmetry breaking might be responsible for the observed energy scale of the weak interaction.

    _____________________________________________________________________
    In order for Grand Unification to take place, some new physics at energy scales beyond those that have been observed thus far must exist.
    _____________________________________________________________________

    All of this comes at a cost, however. For the theory to work, there must be two Higgs bosons, not just one. Moreover, one would expect to begin to see the new supersymmetric particles if one built an accelerator such as the LHC, which could probe for new physics near the electroweak scale. Finally, in what looked for a while like a rather damning constraint, the lightest Higgs in the theory could not be too heavy or the mechanism wouldn’t work.

    As searches for the Higgs continued without yielding any results, accelerators began to push closer and closer to the theoretical upper limit on the mass of the lightest Higgs boson in supersymmetric theories. The value was something like 135 times the mass of the proton, with details to some extent depending on the model. If the Higgs could have been ruled out up to that scale, it would have suggested all the hype about supersymmetry was just that.

    Well, things turned out differently. The Higgs that was observed at the LHC has a mass about 125 times the mass of the proton. Perhaps a grand synthesis was within reach.

The answer at present is … not so clear. The signatures of new supersymmetric partners of ordinary particles should be so striking at the LHC, if they exist, that many of us thought that the LHC had a much greater chance of discovering supersymmetry than it did of discovering the Higgs. It didn’t turn out that way. Following three years of LHC runs, there are no signs of supersymmetry whatsoever. The situation is already beginning to look uncomfortable. The lower limits that can now be placed on the masses of supersymmetric partners of ordinary matter are getting higher. If they get too high, then the supersymmetry-breaking scale would no longer be close to the electroweak scale, and many of the attractive features of supersymmetry breaking for resolving the hierarchy problem would go away.

    But the situation is not yet hopeless, and the LHC has been turned on again, this time at higher energy. It could be that supersymmetric particles will soon be discovered.

    If they are, this will have another important consequence. One of the bigger mysteries in cosmology is the nature of the dark matter that appears to dominate the mass of all galaxies we can see.


Dark matter cosmic web and the large-scale structure it forms. The Millennium Simulation, V. Springel et al.

    There is so much of it that it cannot be made of the same particles as normal matter. If it were, for example, the predictions of the abundance of light elements such as helium produced in the big bang would no longer agree with observation. Thus physicists are reasonably certain that the dark matter is made of a new type of elementary particle. But what type?

    Well, the lightest supersymmetric partner of ordinary matter is, in most models, absolutely stable and has many of the properties of neutrinos. It would be weakly interacting and electrically neutral, so that it wouldn’t absorb or emit light. Moreover, calculations that I and others performed more than 30 years ago showed that the remnant abundance today of the lightest supersymmetric particle left over after the big bang would naturally be in the range so that it could be the dark matter dominating the mass of galaxies.

    In that case our galaxy would have a halo of dark matter particles whizzing throughout it, including through the room in which you are reading this. As a number of us also realized some time ago, this means that if one designs sensitive detectors and puts them underground, not unlike, at least in spirit, the neutrino detectors that already exist underground, one might directly detect these dark matter particles. Around the world a half dozen beautiful experiments are now going on to do just that. So far nothing has been seen, however.

So, we are in potentially the best of times or the worst of times. A race is going on between the detectors at the LHC and the underground direct dark matter detectors to see who might discover the nature of dark matter first. If either group reports a detection, it will herald the opening up of a whole new world of discovery, leading potentially to an understanding of Grand Unification itself. And if no discovery is made in the coming years, we might rule out the notion of a simple supersymmetric origin of dark matter—and in turn rule out the whole notion of supersymmetry as a solution of the hierarchy problem. In that case we would have to go back to the drawing board, except that if we don’t see any new signals at the LHC, we will have little guidance about which direction to head in order to derive a model of nature that might actually be correct.

    Things got more interesting when the LHC reported a tantalizing possible signal due to a new particle about six times heavier than the Higgs particle. This particle did not have the characteristics one would expect for any supersymmetric partner of ordinary matter. In general the most exciting spurious hints of signals go away when more data are amassed, and about six months after this signal first appeared, after more data were amassed, it disappeared. If it had not, it could have changed everything about the way we think about Grand Unified Theories and electroweak symmetry, suggesting instead a new fundamental force and a new set of particles that feel this force. But while it generated many hopeful theoretical papers, nature seems to have chosen otherwise.

The absence of clear experimental direction or confirmation of supersymmetry has thus far not bothered one group of theoretical physicists. The beautiful mathematical aspects of supersymmetry encouraged, in 1984, the resurrection of an idea that had been dormant since the 1960s, when Yoichiro Nambu and others tried to understand the strong force as if it were a theory of quarks connected by string-like excitations. When supersymmetry was incorporated in a quantum theory of strings, to create what became known as superstring theory, some amazingly beautiful mathematical results began to emerge, including the possibility of unifying not just the three non-gravitational forces, but all four known forces in nature into a single consistent quantum field theory.

    However, the theory requires a host of new spacetime dimensions to exist, none of which has been, as yet, observed. Also, the theory makes no other predictions that are yet testable with currently conceived experiments. And the theory has recently gotten a lot more complicated so that it now seems that strings themselves are probably not even the central dynamical variables in the theory.

    None of this dampened the enthusiasm of a hard core of dedicated and highly talented physicists who have continued to work on superstring theory, now called M-theory, over the 30 years since its heyday in the mid-1980s. Great successes are periodically claimed, but so far M-theory lacks the key element that makes the Standard Model such a triumph of the scientific enterprise: the ability to make contact with the world we can measure, resolve otherwise inexplicable puzzles, and provide fundamental explanations of how our world has arisen as it has. This doesn’t mean M-theory isn’t right, but at this point it is mostly speculation, although well-meaning and well-motivated speculation.

    It is worth remembering that if the lessons of history are any guide, most forefront physical ideas are wrong. If they weren’t, anyone could do theoretical physics. It took several centuries or, if one counts back to the science of the Greeks, several millennia of hits and misses to come up with the Standard Model.

    So this is where we are. Are great new experimental insights just around the corner that may validate, or invalidate, some of the grander speculations of theoretical physicists? Or are we on the verge of a desert where nature will give us no hint of what direction to search in to probe deeper into the underlying nature of the cosmos? We’ll find out, and we will have to live with the new reality either way.

See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Welcome to Nautilus. We are delighted you joined us. We are here to tell you about science and its endless connections to our lives. Each month we choose a single topic. And each Thursday we publish a new chapter on that topic online. Each issue combines the sciences, culture and philosophy into a single story told by the world’s leading thinkers and writers. We follow the story wherever it leads us. Read our essays, investigative reports, and blogs. Fiction, too. Take in our games, videos, and graphic stories. Stop in for a minute, or an hour. Nautilus lets science spill over its usual borders. We are science, connected.

     
  • richardmitnick 3:32 pm on February 17, 2017 Permalink | Reply
Tags: A minimal extension to the standard model of particle physics involves six new particles, Astrophysical observations suggest that the mysterious dark matter is more than five times as common, Model Tries to Solve Five Physics Problems at Once, Physical Review Letters, Standard Model, The particles are three heavy right-handed neutrinos and a color triplet fermion and a particle called rho that both gives mass to the right-handed neutrinos and drives cosmic inflation together with the Higgs boson

    From DESY: “Solving five big questions in particle physics in a SMASH” 

    DESY

    2017/02/16
    No writer credit found

    Extension of the standard model provides complete and consistent description of the history of the universe.

    The extremely successful standard model of particle physics has an unfortunate limitation: the current version is only able to explain about 15 percent of the matter found in the universe.

The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

Although it describes and categorises all the known fundamental particles and interactions, it does so only for the type of matter we are familiar with. However, astrophysical observations suggest that the mysterious dark matter is more than five times as common. An international team of theoretical physicists has now come up with an extension to the standard model which could not only explain dark matter but also solve five major problems faced by particle physics at a single stroke. Guillermo Ballesteros, from the University of Paris-Saclay, and his colleagues are presenting their SMASH (“Standard Model Axion Seesaw Higgs portal inflation”) model in the journal Physical Review Letters.

    The history of the universe according to SMASH, denoting the different phases and the dominant energies of the epochs since the Big Bang. Credit: DESY


    Model Tries to Solve Five Physics Problems at Once

    A minimal extension to the standard model of particle physics involves six new particles. http://physics.aps.org/synopsis-for/10.1103/PhysRevLett.118.071802

    The standard model has enjoyed a happy life. Ever since it was proposed four decades ago, it has passed all particle physics tests with flying colors. But it has several sticky problems. For instance, it doesn’t explain why there’s more matter than antimatter in the cosmos. A quartet of theorists from Europe has now taken a stab at solving five of these problems in one go. The solution is a model dubbed SMASH, which extends the standard model in a minimal fashion.

    SMASH adds six new particles to the seventeen fundamental particles of the standard model. The particles are three heavy right-handed neutrinos, a color triplet fermion, a particle called rho that both gives mass to the right-handed neutrinos and drives cosmic inflation together with the Higgs boson, and an axion, which is a promising dark matter candidate. With these six particles, SMASH does five things: produces the matter–antimatter imbalance in the Universe; creates the mysterious tiny masses of the known left-handed neutrinos; explains an unusual symmetry of the strong interaction that binds quarks in nuclei; accounts for the origin of dark matter; and explains inflation.

    The jury is out on whether the model will fly. For one thing, it doesn’t tackle the so-called hierarchy problem and the cosmological constant problem. On the plus side, it makes clear predictions, which the authors say can be tested with future data from observations of the cosmic microwave background and from experiments searching for axions. One prediction is that axions should have a mass between 50 and 200 μeV. Over to the experimentalists, then.

    This research is published in Physical Review Letters.

“SMASH was actually developed from the bottom up,” explains DESY’s Andreas Ringwald, who co-authored the study. “We started off with the standard model and added only as few new concepts as were necessary in order to answer the unresolved issues.” To do this, the scientists combined various existing theoretical approaches and came up with a simple, uniform model. SMASH adds a total of six new particles to the standard model: three heavy, right-handed neutrinos and an additional quark, as well as a so-called axion and the heavy rho (ρ) particle. The latter two form a new field which extends throughout the entire universe.

Using these extensions, the scientists were able to solve five problems: the axion is a candidate for dark matter, which astrophysical observations suggest is five times more abundant than the familiar matter described by the standard model. The heavy neutrinos explain the mass of the already known, very light neutrinos; and the rho interacts with the Higgs boson to produce so-called cosmic inflation, a period during which the entire young universe suddenly expanded by a factor of at least one hundred septillion for hitherto unknown reasons. In addition, SMASH explains why our universe contains so much more matter than antimatter, even though equal amounts must have been created during the big bang, and it reveals why no violation of so-called CP symmetry is observed in the strong force, one of the fundamental interactions.

    The particles of the standard model (SM, left) and of the extension SMASH (right). Credit: Carlos Tamarit, University of Durham

    “Overall, the resulting description of the history of the universe is complete and consistent, from the period of inflation to the present day. And unlike many older models, the individual important values can be calculated to a high level of precision, for example the time at which the universe starts heating up again after inflation,” emphasises Ringwald.

    Being able to calculate these values with such precision means that SMASH could potentially be tested experimentally within the next ten years. “The good thing about SMASH is that the theory is falsifiable. For example, it contains very precise predictions of certain features of the so-called cosmic microwave background. Future experiments that measure this radiation with even greater precision could therefore soon rule out SMASH – or else confirm its predictions,” explains Ringwald. A further test of the model is the search for axions. Here too, the model is able to make accurate predictions, and if axions do indeed account for the bulk of dark matter in the universe, then SMASH requires them to have a mass of 50 to 200 micro-electronvolts, in the units conventionally used in particle physics. Experiments that examine dark matter more precisely could soon test this prediction too.

    Javier Redondo from the University of Saragossa in Spain and Carlos Tamarit from the University of Durham in England were also involved in the study.

    Read the APS synopsis: http://physics.aps.org/synopsis-for/10.1103/PhysRevLett.118.071802

See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition


    DESY is one of the world’s leading accelerator centres. Researchers use the large-scale facilities at DESY to explore the microcosm in all its variety – from the interactions of tiny elementary particles and the behaviour of new types of nanomaterials to biomolecular processes that are essential to life. The accelerators and detectors that DESY develops and builds are unique research tools. The facilities generate the world’s most intense X-ray light, accelerate particles to record energies and open completely new windows onto the universe. 
That makes DESY not only a magnet for more than 3000 guest researchers from over 40 countries every year, but also a coveted partner for national and international cooperations. Committed young researchers find an exciting interdisciplinary setting at DESY. The research centre offers specialized training for a large number of professions. DESY cooperates with industry and business to promote new technologies that will benefit society and encourage innovations. This also benefits the metropolitan regions of the two DESY locations, Hamburg and Zeuthen near Berlin.

     
  • richardmitnick 2:09 pm on February 14, 2017 Permalink | Reply
Tags: An exceptional result on a very rare decay of a particle called Bs0, Standard Model

    From CERN: “The Standard Model stands its ground” 

    Cern New Bloc

    Cern New Particle Event

    CERN New Masthead

    CERN

    14 Feb 2017
    Stefania Pandolfi

    Event display of a typical Bs0 decay into two muons. The two muon tracks from the Bs0 decay are seen as a pair of green tracks traversing the whole detector. (Image: LHCb collaboration)

    Today, in a seminar at CERN, the LHCb collaboration has presented an exceptional result on a very rare decay of a particle called Bs0. This observation marks yet another victory for the Standard Model (SM) of particle physics – the model that explains, to the best of our knowledge, the behaviour of all fundamental particles in the universe – over all its principal theoretical alternatives.

CERN/LHCb

    The LHCb collaboration has reported the observation of the decay of the Bs0 meson – a heavy particle made of a bottom anti-quark and a strange quark – into a pair of muons. This decay is extremely rare, the rarest ever seen: according to the theoretical predictions, it should occur about 3 times in every billion total decays of that particle.

    Event display from the LHCb experiment shows examples of collisions that produced candidates for the rare decay of the Bs0 meson. Image credit: LHCb Collaboration.

    The decay of the Bs0 meson has been long regarded as a very promising place to look for cracks in the armour of the Standard Model, which, despite being our best available description of the subatomic world, leaves some questions unanswered. Therefore, over time, physicists came up with many alternatives or complementary theories. A large class of theories that extend the Standard Model into new physics, such as Supersymmetry, predicts significantly higher values for the Bs0 decay probability. Therefore, an observation of any significant deviation from the SM predicted value would suggest the presence of new, yet unknown, physics.

The experimental value found by the LHCb collaboration for this probability is in excellent agreement with the one predicted by the theory, and the result is confirmed to a very high level of reliability, at the level of 7.8 standard deviations: that is, the scientists are extremely sure that it hasn’t occurred just by chance. The LHCb collaboration obtained the first evidence of this phenomenon in November 2012, with a significance of 3.5 standard deviations. Three years later, together with the CMS collaboration, LHCb obtained the first confirmed observation in May 2015, with a significance of 6.2 standard deviations (for more information read the CERN press release and the paper published in Nature).
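For readers keeping score, the conversion from standard deviations to probability is a one-line calculation (using the usual one-sided Gaussian convention; a sketch, not part of the CERN announcement):

```python
from math import erfc, sqrt

# One-sided Gaussian tail probability for an "n sigma" significance.
for n_sigma in (3.5, 6.2, 7.8):
    p = 0.5 * erfc(n_sigma / sqrt(2))
    print(f"{n_sigma} sigma -> p ~ {p:.0e}")
# 3.5 sigma -> ~2e-04 ("evidence"); 6.2 sigma -> ~3e-10; 7.8 sigma -> ~3e-15
```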

    This new finding limits the room for action of other models of physics beyond the SM: all candidate models will have to demonstrate their compatibility with this important result.

    Further reading on the LHCb website.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Meet CERN in a variety of places:

    Quantum Diaries
    QuantumDiaries

    Cern Courier

    CernCourier
    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS
    CERN ATLAS New

    ALICE
    CERN ALICE New

    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    LHC

    CERN LHC Map
    CERN LHC Grand Tunnel

    CERN LHC particles

    Quantum Diaries

     
  • richardmitnick 8:28 am on September 24, 2016 Permalink | Reply
Tags: Leptons, Standard Model

    From FNAL: “Lepton flavor violation: the search for mismatched Higgs boson decays at CMS” 

    FNAL II photo

    FNAL Art Image
    FNAL Art Image by Angela Gonzales

Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

    September 23, 2016
    Bo Jayatilaka

The Standard Model allows for the Higgs boson to decay to identically flavored pairs of leptons, such as a pair of electrons or a pair of muons, but not to mixed pairings of lepton flavors. Evidence of the latter would be a sign of new physics.

The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    For a moment, forget everything you know about twins and imagine you were told “the only way two siblings could have been born on the same day is if they were identical twins.” You would go about life assuming that the only twins in the world were siblings who were the same age and looked exactly alike. Of course, in reality, there are fraternal twins, and the first time you encountered a pair of nonidentical siblings born on the same day, you’d have to assume that your initial information was at least incomplete. Physicists are trying to test a principle of the Standard Model by looking for a particle version of fraternal twins, or lepton flavor violation.

    The fundamental particles known as fermions that make up ordinary matter all seem to come in multiple flavors or generations. For example, the electron has a heavier cousin called the muon. Apart from its mass, a muon behaves much the same way as an electron in terms of having similar properties and interacting with the same forces. One key exception is the flavor itself, a quantity unique to a given flavor of particle. For example, an electron has an “electron number” of +1 while its antiparticle, the positron, has a corresponding number of -1. Muons, on the other hand, have an electron number of 0 but have corresponding “muon numbers.” The Standard Model requires that certain types of interactions, say the decay of a Higgs boson, always conserve lepton flavor. This means that a Higgs boson can decay into an electron and a positron (which would sum to an electron flavor of zero) or a muon and an antimuon (again, a muon flavor sum of zero), but not to an electron and an antimuon, the latter being an example of lepton flavor violation. In short, the Standard Model requires that identical twins of particles emerge from Higgs boson decays and expressly forbids fraternal twins.
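That bookkeeping can be made concrete with a toy check; the sketch below is illustrative only (the names and table are hypothetical, not CMS analysis code):

```python
# Lepton flavor numbers: (electron number, muon number, tau number) per particle.
LEPTON_NUMBERS = {
    "e-":   (+1, 0, 0), "e+":   (-1, 0, 0),
    "mu-":  (0, +1, 0), "mu+":  (0, -1, 0),
    "tau-": (0, 0, +1), "tau+": (0, 0, -1),
}

def conserves_lepton_flavor(products):
    """A decay of a flavorless particle (like the Higgs) conserves lepton flavor
    if every flavor number sums to zero over the decay products."""
    return all(sum(LEPTON_NUMBERS[p][i] for p in products) == 0 for i in range(3))

print(conserves_lepton_flavor(["mu-", "mu+"]))  # True: an allowed Higgs decay
print(conserves_lepton_flavor(["e-", "mu+"]))   # False: lepton flavor violation
```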

    Thus, observing decays of Higgs boson into fraternal twins of lepton pairs, say an electron and a muon, would be a strong sign of physics beyond the Standard Model. CMS physicists searched for evidence of such decays, specifically for Higgs boson decays to electron-muon and electron-tau lepton pairs. The search, performed in the dataset accumulated by CMS in 2012 and reported in a paper submitted to Physics Letters B, yielded no evidence of either type of decay. The results did place the tightest bounds yet on the possible rates of such decays and allowed physicists to place constraints on some models of physics beyond the Standard Model.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition


    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics. Fermilab is America’s premier laboratory for particle physics and accelerator research, funded by the U.S. Department of Energy. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.

     
  • richardmitnick 11:36 am on August 7, 2016 Permalink | Reply
    Tags: , , , , , Standard Model   

    From physicsworld.com: “And so to bed for the 750 GeV bump” 

    physicsworld.com

    Aug 5, 2016
    Tushna Commissariat

    No bumps: ATLAS diphoton data – the solid black line shows the 2015 and 2016 data combined. (Courtesy: ATLAS Experiment/CERN)

    Smooth dips: CMS diphoton data – blue lines show 2015 data, red are 2016 data and black are the combined result. (Courtesy: CMS collaboration/CERN)

    After months of rumours, speculation and some 500 papers posted to the arXiv in an attempt to explain it, the ATLAS and CMS collaborations have confirmed that the small excess of diphoton events, or “bump”, at 750 GeV detected in their preliminary data is a mere statistical fluctuation that has disappeared in the light of more data. Most folks in the particle-physics community will have been unsurprised if a bit disappointed by today’s announcement at the International Conference on High Energy Physics (ICHEP) 2016, currently taking place in Chicago.

    The story began around this time last year, soon after the LHC was rebooted and began its impressive 13 TeV run, when the ATLAS collaboration saw more events than expected around the 750 GeV mass window. This bump immediately caught the interest of physicists the world over, simply because there was a sniff of “new physics” about it, meaning that the Standard Model of particle physics did not predict the existence of a particle at that energy. It was also the first interesting data to emerge from the LHC after its momentous discovery of the Higgs boson in 2012, and if it had held, it would have been one of the most exciting discoveries in modern particle physics.

    According to ATLAS, “Last year’s result triggered lively discussions in the scientific communities about possible explanations in terms of new physics and the possible production of a new, beyond-Standard-Model particle decaying to two photons. However, with the modest statistical significance from 2015, only more data could give a conclusive answer.”

    And that is precisely what both ATLAS and CMS did, by analysing the 2016 dataset that is nearly four times larger than that of last year. Sadly, both years’ data taken together reveal that the excess is not large enough to be an actual particle. “The compatibility of the 2015 and 2016 datasets, assuming a signal with mass and width given by the largest 2015 excess, is on the level of 2.7 sigma. This suggests that the observation in the 2015 data was an upward statistical fluctuation.” The CMS statement is succinctly similar: “No significant excess is observed over the Standard Model predictions.”
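    For a sense of what a statement like “2.7 sigma” means numerically, the sketch below converts a significance quoted in standard deviations into a one-sided Gaussian p-value. This is a generic statistical illustration, not the collaborations’ actual compatibility test, which involves full likelihoods and the look-elsewhere effect.

```python
# Convert a significance in sigma to a one-sided Gaussian p-value.
# Generic illustration only; real analyses use full likelihood machinery.
from scipy.stats import norm

def p_value(sigma):
    """One-sided tail probability of a standard normal beyond `sigma`."""
    return norm.sf(sigma)

print(f"2.7 sigma -> p ~ {p_value(2.7):.4f}")  # ~0.0035
print(f"5.0 sigma -> p ~ {p_value(5.0):.1e}")  # ~2.9e-07, the discovery threshold
```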

    Tommaso Dorigo, blogger and CMS collaboration member, tells me that it is wisest to “never completely believe in a new physics signal until the data are confirmed over a long time” – preferably by multiple experiments. More interestingly, he tells me that the 750 GeV bump seemed to be a “similar signal” to the early Higgs-to-gamma-gamma data the LHC physicists saw in 2011, when they were still chasing the particle. In much the same way, more data were obtained and the Higgs “bump” went on to be an official discovery. With the 750 GeV bump, the opposite is true. “Any new physics requires really, really strong evidence to be believed, because your belief in the Standard Model is so high and you have seen so many fluctuations go away,” says Dorigo.

    And this is precisely what Columbia University’s Peter Woit – who blogs at Not Even Wrong – told me in March this year when I asked him how he thought the bump would play out. Woit pointed out that particle physics has a long history of “bumps” that may look intriguing at first glance but will most likely be nothing. “If I had to guess, this will disappear,” he said, adding that the real surprise for him was that “there aren’t more bumps”, considering how good the LHC team is at analysing its data and teasing out any possibilities.

    It may be fair to wonder just why so many theorists decided to work with the unconfirmed data from last year and look for a possible explanation of what kind of particle it may have been and indeed, Dorigo says that “theorists should have known better”. But on the flip-side, the Standard Model predicted many a particle long before it was eventually discovered and so it is easy to see why many were keen to come up with the perfect new model.

    Despite the hype and the eventual letdown, Dorigo is glad that this bump has got folks talking about high-energy physics. “It doesn’t matter even if it fizzles out; it’s important to keep asking ourselves these questions,” he says. The main reason for this, Dorigo explains, is that “we are at a very special junction in particle physics as we decide what new machine to build” and some input from current colliders is necessary. “Right now there is no clear direction,” he says. In light of the fact that there has been no new physics (or any hint of supersymmetry) from the LHC to date, the most likely future devices would be an electron–positron collider or, in the long term, a muon collider. But a much clearer indication is necessary before these choices are made, and for now, much more data are needed.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    Physics World is a publication of the Institute of Physics. The Institute of Physics is a leading scientific society. We are a charitable organisation with a worldwide membership of more than 50,000, working together to advance physics education, research and application.

    We engage with policymakers and the general public to develop awareness and understanding of the value of physics and, through IOP Publishing, we are world leaders in professional scientific communications.

     
  • richardmitnick 10:32 am on July 25, 2016 Permalink | Reply
    Tags: , , , , Possible fifth force?, Standard Model   

    From Don Lincoln of FNAL on livescience: “A Fifth Force: Fact or Fiction” 

    Livescience

    Don Lincoln

    July 5, 2016

    Has a Hungarian lab really found evidence of a fifth force of nature? Credit: Jurik Peter / Shutterstock.com

    Science and the internet have an uneasy relationship: science tends to move forward through a careful and tedious evaluation of data and theory, and the process can take years to complete. In contrast, the internet community generally has the attention span of Dory, the absent-minded fish of Finding Nemo (and now Finding Dory) — a meme here, a celebrity picture there — oh, look … a funny cat video.

    Thus people who are interested in serious science should be extremely cautious when they read an online story that purports to be a paradigm-shifting scientific discovery. A recent example is one suggesting that a new force of nature might have been discovered. If true, that would mean that we have to rewrite the textbooks.

    A fifth force

    So what has been claimed?

    In an article submitted on April 7, 2015, to the arXiv repository of physics papers, a group of Hungarian researchers reported on a study in which they focused an intense beam of protons (particles found in the center of atoms) on thin lithium targets. The collisions created excited nuclei of beryllium-8, which decayed into ordinary beryllium-8 and pairs of electron-positron particles. (The positron is the antimatter equivalent of the electron.)

    The Standard Model is the collection of theories that describe the smallest experimentally observed particles of matter and the interactions between energy and matter. Credit: Karl Tate, LiveScience Infographic Artist

    They claimed that their data could not be explained by known physical phenomena in the Standard Model, the reigning model governing particle physics. But, they purported, they could explain the data if a new particle existed with a mass of approximately 17 million electron volts, about 33 times heavier than an electron and just shy of 2 percent the mass of a proton. The particles that emerge at this energy range, which is relatively low by modern standards, have been well studied. And so it would be very surprising if a new particle were discovered in this energy regime.
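    Those comparisons are quick to verify (a back-of-envelope sketch; the electron and proton masses are the standard values):

```python
# Check the quoted mass comparisons for the proposed 17 MeV particle.
m_new = 17.0        # MeV, the claimed particle mass
m_electron = 0.511  # MeV
m_proton = 938.3    # MeV

print(m_new / m_electron)      # ~33.3 electron masses
print(100 * m_new / m_proton)  # ~1.8 percent of the proton mass
```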

    However, the measurement survived peer review and was published on Jan. 26, 2016, in the journal Physical Review Letters, which is one of the most prestigious physics journals in the world. In this publication, the researchers, and this research, cleared an impressive hurdle.

    Their measurement received little attention until a group of theoretical physicists from the University of California, Irvine (UCI), turned their attention to it. As theorists commonly do with a controversial physics measurement, the team compared it with the body of work that has been assembled over the last century or so, to see if the new data are consistent or inconsistent with the existing body of knowledge. In this case, they looked at about a dozen published studies.

    What they found is that though the measurement didn’t conflict with any past studies, it seemed to be something never before observed — and something that couldn’t be explained by the Standard Model.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    New theoretical framework

    To make sense of the Hungarian measurement, then, this group of UCI theorists invented a new theory.

    The theory invented by the Irvine group is really quite exotic. They start with the very reasonable premise that the possible new particle is something that is not described by existing theory. This makes sense because the possible new particle is very low mass and would have been discovered before if it were governed by known physics. If this were a new particle governed by new physics, perhaps a new force is involved. Since traditionally physicists speak of four known fundamental forces (gravity, electromagnetism and the strong and weak nuclear forces), this hypothetical new force has been dubbed “the fifth force.”

    Theories and discoveries of a fifth force have a checkered history, going back decades, with measurements and ideas arising and disappearing with new data. On the other hand, there are mysteries not explained by ordinary physics like, for example, dark matter. While dark matter has historically been modeled as a single form of a stable and massive particle that experiences gravity and none of the other known forces, there is no reason that dark matter couldn’t experience forces that ordinary matter doesn’t experience. After all, ordinary matter experiences forces that dark matter doesn’t, so the hypothesis isn’t so silly.

    There is no reason dark matter couldn’t experience forces that ordinary matter doesn’t experience. Here, in the galaxy cluster Abell 3827, dark matter was observed interacting with itself during a galaxy collision. Credit: ESO

    There are many ideas about forces that affect only dark matter; the term for this basic idea is “complex dark matter.” One common idea is that there is a dark photon that interacts with a dark charge carried only by dark matter. This particle is a dark matter analog of the photon of ordinary matter, which interacts with familiar electrical charge, with one exception: some theories of complex dark matter imbue dark photons with mass, in stark contrast with ordinary photons.

    If dark photons exist, they can couple with ordinary matter (and ordinary photons) and decay into electron-positron pairs, which is what the Hungarian research group was investigating. Because dark photons don’t interact with ordinary electric charge, this coupling can only occur because of the vagaries of quantum mechanics. But if scientists started seeing an increase in electron-positron pairs, that might mean they were observing a dark photon.

    The Irvine group found a model that included a “protophobic” particle that was not ruled out by earlier measurements and would explain the Hungarian result. Particles that are “protophobic,” which literally means “fear of protons,” rarely or never interact with protons but can interact with neutrons (neutrophilic).

    The particle proposed by the Irvine group experiences a fifth and unknown force with a range of about 12 femtometers, roughly 12 times bigger than a proton. The particle is protophobic and neutrophilic, has a mass of 17 million electron volts and can decay into electron-positron pairs. In addition to explaining the Hungarian measurement, such a particle would help explain some discrepancies seen by other experiments. This last consequence adds some weight to the idea.

    Paradigm-shifting force?

    So this is the status.

    What is likely to be true? Obviously, data is king. Other experiments will need to confirm or refute the measurement; nothing else really matters. But that will take a year or so, and having some idea before then might be nice. The best way to estimate the likelihood that the finding is real is to look at the reputations of the various researchers involved. This is clearly a shoddy way to do science, but it will help shade your expectations.

    So let’s start with the Irvine group. Many of them (the senior ones, typically) are well-regarded and established members of the field, with substantive and solid papers in their past. The group includes a spectrum of ages, with both senior and junior members. In the interest of full disclosure, I know some of them personally and, indeed, two of them have read the theoretical portions of chapters of books I have written for the public to ensure that I didn’t say anything stupid. (By the way, they didn’t find any gaffes, but they certainly helped clarify certain points.) That certainly demonstrates my high regard for members of the Irvine group, but possibly taints my opinion. In my judgment, they almost certainly did a thorough and professional job of comparing their new model to existing data. They have found a small and unexplored region of possible theories that could exist.

    On the other hand, the theory is pretty speculative and highly improbable. This isn’t an indictment … all proposed theories could be labeled in this way. After all, the Standard Model, which governs particle physics, is nearly a half century old and has been thoroughly explored. In addition, ALL new theoretical ideas are speculative and improbable and almost all of them are wrong. This also isn’t an indictment. There are many ways to add possible modifications to existing theories to account for new phenomena. They can’t all be right. Sometimes none of the proposed ideas are right.

    However, we can conclude from the reputation of the group’s members that they have generated a new idea and have compared it to all relevant existing data. The fact that they released their model means that it survived their tests and thus it remains a credible, if improbable, possibility.

    What about the Hungarian group? I know none of them personally, but the article was published in Physical Review Letters — a chalk mark in the win column. However, the group has also published two previous papers in which comparable anomalies were observed, including a possible particle with a mass of 12 million electron volts and a second publication claiming the discovery of a particle with a mass of about 14 million electron volts. Both of these claims were subsequently falsified by other experiments.

    Further, the Hungarian group has never satisfactorily disclosed what error was made that resulted in these erroneous claims. Another possible red flag is that the group rarely publishes data that doesn’t claim anomalies. That is improbable. In my own research career, most publications were confirmations of existing theories. Anomalies that persist are very, very rare.

    So what’s the bottom line? Should you be excited about this new possible discovery? Well … sure … possible discoveries are always exciting. The Standard Model has stood the test of time for half a century, but there are unexplained mysteries, and the scientific community is always looking for the discovery that points us in the direction of a new and improved theory. But what are the odds that this measurement and theory will lead to the scientific world accepting a new force with a range of 12 fm and a particle that shuns protons? My sense is that this is a long shot. I am not so sanguine as to the chances of this outcome.

    Of course, this opinion is only that…an opinion, albeit an informed one. Other experiments will also be looking for dark photons because, even if the Hungarian measurement doesn’t stand up to scrutiny, there is still a real problem with dark matter. Many experiments looking for dark photons will explore the same parameter space (e.g. energy, mass and decay modes) in which the Hungarian researchers claim to have found an anomaly. We will soon (within a year) know if this anomaly is a discovery or just another bump in the data that temporarily excited the community, only to be discarded as better data is recorded. And, no matter the outcome, good and better science will be the eventual result.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

     
  • richardmitnick 5:11 pm on June 17, 2016 Permalink | Reply
    Tags: , , , , , Standard Model   

    From Don Lincoln at FNAL: “The triumphant Standard Model” 

    FNAL Art Image by Angela Gonzales

    Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

    June 17, 2016

    Don Lincoln

    In high-end research, there are a couple of deeply compelling types of data analyses that scientists do. There are those that break the existing scientific understanding and rewrite the textbooks. Those are exciting. But there are also those in which a highly successful theory is tested in a regime never before explored. There can also be two types of outcome. If the theory fails to explain the data, we have a discovery of the type I mentioned first. But it is also possible that the theory explains the data perfectly well. If so, that means that you’ve proven that the existing theory is even more successful than was originally known. That’s a different kind of success. It means that predictions made in one realm taught scientists enough to understand far more.

    In the LHC, pairs of protons are collided together with the unprecedented energy of 13 trillion electronvolts.


    Before 2015, when the data in this analysis was recorded, the highest energy ever studied by humanity was only 8 trillion electronvolts. So, already we know that the new data is 63 percent higher in terms of energy reach as compared to the old data. To get a visceral sense of what that means, imagine that your bank told you that they made a mistake and that for every dollar you thought you had in your account, you actually had $1.63. I’m guessing you’d start planning for an awesome vacation or perhaps an earlier retirement.

    When the protons collide, most commonly, a quark or gluon from each proton hits a quark or gluon from the other proton and knocks them out of the collision area into the detector. As the quarks and gluons leave the collision area, they convert into sprays of particles that travel in roughly the same direction. These are called jets. Physicists study the location and energy of the jets in the detector and compare them to the predicted distribution.

    CMS scientists studied the production patterns of jets at a collision energy of 13 trillion electronvolts and found that they agreed with the predictions of the Standard Model with the same level of precision seen at lower energy measurements.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    This result comes with a small sadness because this means that new physics hasn’t been discovered. On the other hand, it is a resounding endorsement of the theory of quantum chromodynamics, or QCD, which is the portion of the Standard Model that deals explicitly with quark and gluon scattering. QCD, first worked out nearly half a century ago, continues its decades-long track record of success.

    Scientists are constantly exploring the universe, seeing what happens when existing theories are tested in new realms. In today’s analysis, scientists put the leading theory of quark scattering to the test, studying what happens when it is compared to data taken at energies over 60 percent higher than ever before achieved.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition


    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics. Fermilab is America’s premier laboratory for particle physics and accelerator research, funded by the U.S. Department of Energy. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.

     
  • richardmitnick 4:47 pm on May 9, 2016 Permalink | Reply
    Tags: , , , Standard Model   

    From COSMOS: “Particle physics: a primer to the theory of (almost) everything” 

    COSMOS

    9 May 2016
    Cathal O’Connell

    Are you a boson bozo? Do quarks leave you quizzical? Do gluons get you unstuck? Cathal O’Connell has a guide to the zoo of particles, known as the Standard Model of particle physics.

    Graphic of a transverse section through a detector showing one of the numerous particle collision events recorded during the search for the Higgs boson. Credit: ATLAS COLLABORATION/CERN


    Around the turn of the 4th century BC, the Greek philosopher Democritus caught the smell of baking and thought that little bits of bread must be floating through the air and into his nose. He called the little bits “atoms” (meaning “uncuttable”) and imagined them as tiny spherical balls.

    But atoms are not little solid spheres. They are made of even smaller bits, called particles.

    Scientists’ best description of those particles and the forces that govern their behaviour is called the Standard Model of particle physics, or just “The Standard Model”.

    The Standard Model categorises all of the particles of nature, in the same way that the periodic table categorises the elements. The theory is called the Standard Model because it is so successful it has become “standard”.

    And no, there is no Economy Model, nor a Deluxe one.

    There are, however, still a few kinks to be ironed out (as well as a couple of whopping omissions). That’s why it is sometimes called the “Theory of Almost Everything”.

    How did it all kick off?

    Back in the early 20th century, scientists thought there were only three fundamental particles in nature: protons and neutrons, which make up the nucleus of an atom, and electrons that whizz round it.

    But in the 1950s and 1960s physicists started smashing these particles together and some of them broke. It turned out the protons and neutrons had even smaller particles inside them.

    Many dozens of new particles were discovered – and for a while, nobody could explain them. Physicists called it the “particle zoo”.

    In the 1960s, physicists such as Murray Gell-Mann found an order amongst the chaos. The step they took was similar to the one Russian chemist Dmitri Mendeleev took to find an order to the chemical elements in his periodic table.

    The new ordering of the particles explained many of the properties of the newly discovered particles, as well as correctly predicting some new ones.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    Meet the family

    The particles of the Standard Model make up one big family. Your first introduction can be daunting, a bit like attending a gathering with a lot of distant cousins you’ve never heard of. No matter how strange these cousins are, it is important to remember that they are all related.

    The basics

    Gell-Mann and others placed the particles in two main categories: fermions and bosons.

    Fermions, such as the electron, make up the stuff we call matter. Bosons, such as the photon, transmit forces.

    Fermions are subdivided again into two kinds of particles, depending on the forces they feel. These are the quarks and the leptons (see below).

    Forces of nature

    Particles communicate with one another via four forces: electromagnetism, the strong force, the weak force and gravity.

    The Standard Model describes the first three (gravity does not feature in the Standard Model, as explained below).

    Different particles communicate through different forces, similar to the way people can communicate in different languages. For example, only the quarks speak “gluon”, while electrons can speak “photon” as well as “W boson” and “Z boson”.

    Electromagnetism is the force that holds electrons in an atom. It is communicated by photons.

    The strong force keeps the nuclei of atoms together. Without it, every atom in the universe would spontaneously explode. It is communicated by gluons.

    The weak force causes radioactive decay. It’s transmitted by W and Z bosons.

    The fundamental particles

    All matter is made of two types of particles known as quarks and leptons.

    Quarks: (the purple particles in the figure) come in six “flavours”, all with weird names. It’s useful to see them as coming in pairs to make three generations. These are “up” and “down” (first generation), “charmed” and “strange” (second generation) and “top” and “bottom” (third generation).

    Only the up and down quarks are important in day-to-day life because they make protons and neutrons.

    The others make only “exotic” matter, which is too unstable to form atoms. Physicists can create exotic matter in particle accelerators, but it usually only lasts a fraction of a second before decaying.

    Leptons: there are six leptons, the best known of which is the electron, a tiny fundamental particle with a negative charge.

    The muon (second generation) and tau (third generation) particles are like fatter versions of the electron. They also have negative electric charge, but they are too unstable to feature in ordinary matter.

    And each of these particles has a corresponding neutrino, with no charge.

    Neutrinos deserve a special mention because they are perhaps the least understood of all the particles in the Standard Model.

    They are fast but interact only through the weak force, which means they can easily zip straight through a planet. They are created in nuclear reactions, such as those powering the Sun’s core.
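    As a compact recap of the fermion family described above, here is one way to lay the three generations out as a small data structure (an illustrative sketch, not any standard library):

```python
# The twelve fundamental fermions of the Standard Model, by generation.
GENERATIONS = [
    {"quarks": ["up", "down"],
     "leptons": ["electron", "electron neutrino"]},  # everyday matter
    {"quarks": ["charm", "strange"],
     "leptons": ["muon", "muon neutrino"]},          # heavier, unstable
    {"quarks": ["top", "bottom"],
     "leptons": ["tau", "tau neutrino"]},            # heavier still
]

for i, gen in enumerate(GENERATIONS, start=1):
    print(f"Generation {i}:", ", ".join(gen["quarks"] + gen["leptons"]))
```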

    Hadrons: the composite particles

    Now that we know the fundamental particles of nature, we can begin to stack them together in different ways to make bigger particles.

    The most important composite particles are the baryons, made of three quarks. Protons and neutrons are both kinds of baryon.
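    The quark charges add up neatly: in units of the electron’s charge magnitude, up-type quarks carry +2/3 and down-type quarks −1/3, so three of them can combine to the familiar integer charges. A quick check with exact fractions:

```python
# Electric charge of baryons from their quark content, in units of e.
from fractions import Fraction

CHARGE = {"up": Fraction(2, 3), "down": Fraction(-1, 3)}

proton = ["up", "up", "down"]
neutron = ["up", "down", "down"]

print(sum(CHARGE[q] for q in proton))   # 1 -> the proton's charge
print(sum(CHARGE[q] for q in neutron))  # 0 -> the neutron is neutral
```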

    The European Organisation for Nuclear Research’s (CERN) biggest particle collider smashes protons together. Because protons are a kind of hadron, it’s called the Large Hadron Collider, or LHC.


    Antimatter: double or nothing?

    As far as we know, all quarks and leptons have twin particles of antimatter. Antimatter is like matter except it has the opposite charge. For example, the electron has a counterpart that’s exactly the same mass, except with positive charge instead of negative. When a particle of matter meets its antimatter twin, they both annihilate in a burst of pure energy.

    Antimatter is incredibly rare in the Universe, although it does have some important roles in technology. Positron emission tomography (PET) scanners, for instance, use the annihilation of positrons to see inside the body.

    One of the great mysteries of physics is why the Universe is made almost entirely of matter. Many particle physicists are striving to answer it.

    Atoms: composites of composites

    The bread that Democritus sniffed is made of only the first generation of fundamental particles.

    Up and down quarks bind together through the strong force to make protons and neutrons, and the strong force also sticks them together to form the nucleus of an atom.

    Electrons orbit the nucleus in arrangements determined by quantum mechanics (see our primer Quantum physics for the terminally confused).

    The Higgs: the god particle

    You probably noticed the loner off to the right side of the particle table – the Higgs boson. The Higgs is a special kind of particle that gives the other fundamental particles their mass.

    The idea is that there is a field existing everywhere in space. And when particles move through space, they tend to bump into this field, and this interaction slows them down (similar to how it’s more difficult to move through water than air). This interaction is what gives fundamental particles their mass.

    Some particles such as photons and gluons don’t interact with the Higgs field, so are massless.

    Just as photons communicate the electromagnetic force, the Higgs boson communicates the Higgs field.

    The Higgs boson was a theoretical particle until 2012, when CERN announced that it had been discovered at last, although scientists are still uncovering its properties.

    What’s missing?

    Gravity

    The biggest hole in the Standard Model is the lack of gravity. The fourth force of nature just does not fit into the current picture.

    Gravity is also incredibly weak compared to the other forces (the strong force is 100,000,000,000,000,000,000,000,000,000,000,000,000 times stronger than gravity, for example).
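    One version of that weakness is easy to compute directly: compare the electric and gravitational attractions between a proton and an electron, since the separation cancels in the ratio. This is a back-of-envelope sketch with standard constants; the strong-force figure quoted above is a separate, rougher comparison.

```python
# How weak is gravity? Ratio of electric to gravitational force
# between a proton and an electron; the distance cancels out.
k = 8.988e9      # Coulomb constant, N m^2 / C^2
G = 6.674e-11    # gravitational constant, N m^2 / kg^2
e = 1.602e-19    # elementary charge, C
m_p = 1.673e-27  # proton mass, kg
m_e = 9.109e-31  # electron mass, kg

ratio = (k * e**2) / (G * m_p * m_e)
print(f"{ratio:.1e}")  # ~2.3e+39: electromagnetism wins by 39 orders of magnitude
```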

    Some physicists think gravity is also transmitted by a kind of particle, called a graviton, but so far there is no evidence that this particle exists.

    Neutrino mass

    The neutrino is so light compared to all the other particles that its mass really begs an explanation. It’s possible that the neutrino doesn’t get its mass from the Higgs in the same way other particles do.

    Dark matter: From observing the Universe, it looks like a huge portion of it is made of Dark Matter – a new kind of stuff that doesn’t interact with regular matter and so is probably missing from the Standard Model entirely.

    Supersymmetry

    Some physicists are looking for extensions to the Standard Model to explain these mysteries. Supersymmetry is one extension where every particle has another twin with higher mass.

    Some of these particles would interact very weakly with ordinary stuff and so could be good candidates for Dark Matter.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

     
  • richardmitnick 10:57 am on August 5, 2015 Permalink | Reply
    Tags: , , Standard Model,   

    From Symmetry: “The mystery of particle generations” 

    Symmetry

    August 05, 2015
    Matthew R. Francis

    Why are there three almost identical copies of each particle of matter?

    Artwork by Sandbox Studio, Chicago

    The Standard Model of particles and interactions is remarkably successful for a theory everyone knows is missing big pieces.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    It accounts for the everyday stuff we know like protons, neutrons, electrons and photons, and even exotic stuff like Higgs bosons and top quarks. But it isn’t complete; it doesn’t explain phenomena such as dark matter and dark energy.

    The Standard Model is successful because it is a useful guide to the particles of matter we see. One convenient pattern that has proven valuable is generations. Each particle of matter seems to come in three different versions, differentiated only by mass.

    Scientists wonder whether that pattern has a deeper explanation or if it’s just convenient for now, to be superseded by a deeper truth.

    The next generations

    The Standard Model is a menu listing all of the known fundamental particles: particles that cannot be broken down into constituent parts. It distinguishes between the fermions, which are particles of matter, and the bosons, which carry forces.

    The matter particles include six quarks and six leptons. The six quarks are called the up, down, charm, strange, top and bottom quark. Quarks typically don’t exist as single particles but lump together to form heavier particles such as protons and neutrons. Leptons include electrons and their cousins the muons and tau particles, along with the three types of neutrinos.

    All of these matter particles fall into three “generations.”

    “The three generations are literally copy-paste of the first generation,” says Carleton University physicist Heather Logan. The up, charm and top quarks have the same electric charge, along with the same weak and strong interactions — they differ primarily in mass, which comes from the Higgs field. The same thing holds for the down, strange and bottom quarks, along with the electron, muon and tau leptons.

    “The fact that the three generations couple differently to the Higgs sector is maybe telling us something, but we don’t really know what yet,” Logan says. Most of the generations differ in mass by a lot. For example, the tau lepton is roughly 3500 times more massive than the electron, and the top quark is roughly 80,000 times heavier than the up quark. That difference manifests itself in stability: the heavier generations decay into the lighter generations, until they reach the lightest, which are (as far as we can tell) stable forever.
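    Those ratios follow directly from the measured masses (rounded Particle Data Group values; note the up-quark mass itself is quite uncertain):

```python
# Approximate fermion masses in MeV (rounded PDG values).
masses = {
    "electron": 0.511, "muon": 105.7, "tau": 1776.9,
    "up": 2.2, "charm": 1270.0, "top": 173000.0,
}

print(masses["tau"] / masses["electron"])  # ~3480: tau vs electron
print(masses["top"] / masses["up"])        # ~79000: top vs up
```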

    The generations play a big role in experiments. The Higgs boson, for instance, is an unstable particle that decays into a variety of other particles, including tau leptons. “Since the tau is the heaviest, the Higgs [boson] prefers to change into taus more than electrons or muons,” says Clara Nellist, an experimental particle physicist at the Laboratoire de l’Accélérateur Linéaire (LAL) in Orsay, France. “The best way to study how the Higgs interacts with leptons is by looking at a Higgs changing into two taus.”

    That sort of observation is the heart of Standard Model physics: crash two or more particles together, watch what new particles are born, look for patterns in the detritus, and — if we’re really lucky — see what doesn’t fit into the map we have.

    Roads outward

    While some stuff like dark matter obviously lies outside the charts, the Standard Model itself has a few problems. For example, neutrinos should be massless according to the Standard Model, but real-world experiments show they have very tiny masses. And unlike quarks and electrically charged leptons, the mass differences between the generations of neutrinos are very small, which is why we see them oscillating from one type to another.
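    That flavor-changing behavior has a famous compact description: in the two-flavor approximation, the probability that a neutrino born as one flavor is detected as another is sin²(2θ) · sin²(1.27 Δm² L/E), with Δm² in eV², the baseline L in km and the energy E in GeV. Below is a hedged sketch with illustrative numbers, chosen to be roughly in the atmospheric-oscillation ballpark.

```python
# Two-flavor neutrino oscillation probability (standard approximation):
# P = sin^2(2*theta) * sin^2(1.27 * dm2 * L / E),
# with dm2 in eV^2, L in km, E in GeV.
import math

def oscillation_probability(sin2_2theta, dm2_ev2, L_km, E_GeV):
    phase = 1.27 * dm2_ev2 * L_km / E_GeV
    return sin2_2theta * math.sin(phase) ** 2

# Illustrative: a 1 GeV muon neutrino crossing the Earth's diameter.
print(oscillation_probability(1.0, 2.5e-3, 12742, 1.0))
```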

    Without mass, neutrinos are exactly identical; with the mass, they’re different. And that generational difference is puzzling to theorist Richard Ruiz of the University of Pittsburgh. “There is a pattern here staring at us but we cannot quite figure out how to make sense of it.”

    Even if there is only the one Standard Model Higgs, we can learn a lot by how it interacts and decays. For instance, Nellist says, “by studying how often the Higgs boson changes into taus compared to other particles, we can test the validity of the Standard Model and see if there are hints of other generations.”

    It’s unlikely, since any fourth generation quark would need to be far more massive even than the top quark. But any anomaly in Higgs decay could tell us a lot.

    “Nobody knows why there are three generations,” Logan says. However, the structure of the Standard Model is a clue to what might be beyond, including the theory known as Supersymmetry: “If there are supersymmetric partners of the fermions, they should also fall into the three generations. How their masses are set might give us clues to understanding how the masses of the Standard Model fermions are set and why we have those patterns.”

    No matter how many there are, nobody knows why there are generations to begin with. “‘Generations’ is just a conventional organization of the Standard Model’s matter content,” Ruiz says. That organization might survive in a deeper theory (for instance, theories in which quarks are made up of smaller particles called “preons”, which are unlikely based on present data), but new ideas would have to explain why the quarks and leptons seem to fall into the patterns they do.

    Ultimately, even though the Standard Model is not the final description of the cosmos, it’s been a good guide so far. As we look for the edges of the map it provides, we get closer to a true and accurate chart of all the particles and their interactions.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     