Tagged: Standard Model

  • richardmitnick 12:32 pm on April 18, 2019
    Tags: "When Beauty Gets in the Way of Science", Standard Model

    From Nautilus: “When Beauty Gets in the Way of Science” 


    April 18, 2019
    Sabine Hossenfelder

    Insisting that new ideas must be beautiful blocks progress in particle physics.


    The biggest news in particle physics is no news. In March, one of the most important conferences in the field, Rencontres de Moriond, took place. It is an annual meeting at which experimental collaborations present preliminary results. But the recent data from the Large Hadron Collider (LHC), currently the world’s largest particle collider, has not revealed anything new.

    LHC at CERN (images: CERN map, LHC tunnel, LHC particles)

    Forty years ago, particle physicists thought themselves close to a final theory for the structure of matter. At that time, they formulated the Standard Model of particle physics to describe the elementary constituents of matter and their interactions.

    Standard Model of Particle Physics (LATHAM BOYLE AND MARDUS OF WIKIMEDIA COMMONS)

    After that, they searched for the predicted, but still missing, particles of the Standard Model. In 2012, they confirmed the last missing particle, the Higgs boson.

    CERN CMS Higgs Event

    CERN ATLAS Higgs Event

    The Higgs boson is necessary to make sense of the rest of the Standard Model. Without it, the other particles would not have masses, and probabilities would not properly add up to one. Now, with the Higgs in the bag, the Standard Model is complete; all Pokémon caught.

    HIGGS HANGOVER: After the Large Hadron Collider (above) confirmed the Higgs boson, which validated the Standard Model, it’s produced nothing newsworthy, and is unlikely to, says physicist Sabine Hossenfelder. Credit: Shutterstock

    The Standard Model may be physicists’ best shot at the structure of fundamental matter, but it leaves them wanting. Many particle physicists think it is simply too ugly to be nature’s last word. The 25 particles of the Standard Model can be classified by three types of symmetries that correspond to three fundamental forces: the electromagnetic force, and the strong and weak nuclear forces. Physicists, however, would rather there were only one unified force. They would also like to see an entirely new type of symmetry, the so-called “supersymmetry,” because that would be more appealing.

    Supersymmetry builds on the Standard Model, with many new supersymmetric particles, represented here with a tilde (~) on them. (From the movie “Particle Fever,” reproduced by Mark Levinson)

    Oh, and additional dimensions of space would be pretty. And maybe also parallel universes. Their wish list is long.

    It has become common practice among particle physicists to use arguments from beauty to select the theories they deem worthy of further study. These criteria of beauty are subjective and not evidence-based, but they are widely believed to be good guides to theory development. The most often used criteria of beauty in the foundations of physics are presently simplicity and naturalness.

    By “simplicity,” I don’t mean relative simplicity, the idea that the simplest theory is the best (a.k.a. “Occam’s razor”). Relying on relative simplicity is good scientific practice. The desire that a theory be simple in absolute terms, in contrast, is a criterion from beauty: There is no deep reason that the laws of nature should be simple. In the foundations of physics, this desire for absolute simplicity presently shows in physicists’ hope for unification or, if you push it one level further, in the quest for a “Theory of Everything” that would merge the three forces of the Standard Model with gravity.

    The other criterion of beauty, naturalness, requires that pure numbers that appear in a theory (i.e., those without units) should be neither very large nor very small; instead, these numbers should be close to one. Exactly how close these numbers should be to one is debatable, which is already an indicator of the non-scientific nature of this argument. Indeed, the inability of particle physicists to quantify just when a lack of naturalness becomes problematic highlights the fact that an unnatural theory is utterly unproblematic. It is just not beautiful.
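
    For a concrete illustration (an added example, not from Hossenfelder’s essay): the most cited “unnatural” number is the squared ratio of the Higgs mass to the Planck mass,

```latex
\[
  \frac{m_H^{2}}{M_{\mathrm{Pl}}^{2}}
  \approx \left(\frac{125\ \mathrm{GeV}}{1.2\times10^{19}\ \mathrm{GeV}}\right)^{2}
  \approx 10^{-34},
\]
```

    a pure number spectacularly far from one. By the naturalness criterion it cries out for explanation, even though no observation is in conflict with it.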

    Anyone who has a look at the literature of the foundations of physics will see that relying on such arguments from beauty has been a major current in the field for decades. It has been propagated by big players in the field, including Steven Weinberg, Frank Wilczek, Edward Witten, Murray Gell-Mann, and Sheldon Glashow. Countless books popularized the idea that the laws of nature should be beautiful, written, among others, by Brian Greene, Dan Hooper, Gordon Kane, and Anthony Zee. Indeed, this talk about beauty has been going on for so long that at this point it seems likely most people presently in the field were attracted by it in the first place. Little surprise, then, they can’t seem to let go of it.

    Trouble is, relying on beauty as a guide to new laws of nature is not working.

    Since the 1980s, dozens of experiments have looked for evidence of unified forces, supersymmetric particles, and other particles invented to beautify the Standard Model. Physicists have conjectured hundreds of hypothetical particles, from “gluinos” and “wimps” to “branons” and “cuscutons,” each of which they invented to remedy a perceived lack of beauty in the existing theories. These particles are supposed to aid beauty, for example, by increasing the amount of symmetries, by unifying forces, or by explaining why certain numbers are small. Unfortunately, not a single one of those particles has ever been seen. Measurements have merely confirmed the Standard Model over and over again. And a theory of everything, if it exists, is as elusive today as it was in the 1970s. The Large Hadron Collider is only the most recent in a long series of searches that failed to confirm those beauty-based predictions.

    These decades of failure show that postulating new laws of nature just because they are beautiful according to human standards is not a good way to put forward scientific hypotheses. It’s not the first time this has happened. Historical precedents are not difficult to find. Relying on beauty did not work for Kepler’s Platonic solids, it did not work for Einstein’s idea of an eternally unchanging universe, and it did not work for the oh-so-pretty idea, popular at the end of the 19th century, that atoms are knots in an invisible ether. All of these theories were once considered beautiful, but are today known to be wrong. Physicists have repeatedly told me about beautiful ideas that didn’t turn out to be beautiful at all. Such hindsight is not evidence that arguments from beauty work, but rather that our perception of beauty changes over time.

    That beauty is subjective is hardly a breakthrough insight, but physicists are slow to learn the lesson—and that has consequences. Experiments that test ill-motivated hypotheses are at high risk of finding only null results; i.e., of confirming the existing theories and not seeing evidence of new effects. This is what has happened in the foundations of physics for 40 years now. And with the new LHC results, it happened once again.

    The data analyzed so far shows no evidence for supersymmetric particles, extra dimensions, or any other physics that would not be compatible with the Standard Model. In the past two years, particle physicists were excited about an anomaly in the interaction rates of different leptons. The Standard Model predicts these rates should be identical, but the data demonstrates a slight difference. This “lepton anomaly” has persisted in the new data, but—against particle physicists’ hopes—it did not increase in significance, and is hence not a sign of new particles. The LHCb collaboration succeeded in measuring a violation of symmetry in the decay of composite particles called “D-mesons,” but the measured effect is, once again, consistent with the Standard Model. The data stubbornly repeat: Nothing new to see here.

    Of course it’s possible there is something to find in the data yet to be analyzed. But at this point we already know that all previously made predictions for new physics were wrong, meaning that there is now no reason to expect anything new to appear.

    Yes, null results—like the recent LHC measurements—are also results. They rule out some hypotheses. But null results are not very useful results if you want to develop a new theory. A null result says: “Let’s not go this way.” A result says: “Let’s go that way.” If there are many ways to go, discarding some of them does not help much.

    To find the way forward in the foundations of physics, we need results, not null-results. When testing new hypotheses takes decades of construction time and billions of dollars, we have to be careful what to invest in. Experiments have become too costly to rely on serendipitous discoveries. Beauty-based methods have historically not worked. They still don’t work. It’s time that physicists take note.

    And it’s not like the lack of beauty is the only problem with the current theories in the foundations of physics. There are good reasons to think physics is not done. The Standard Model cannot be the last word, notably because it does not contain gravity and fails to account for the masses of neutrinos. It also describes neither dark matter nor dark energy, which are necessary to explain galactic structures.

    So, clearly, the foundations of physics have problems that require answers. Physicists should focus on those. And we currently have no reason to think that colliding particles at the next higher energies will help solve any of the existing problems. New effects may not appear until energies are a billion times higher than what even the next larger collider could probe. To make progress, then, physicists must, first and foremost, learn from their failed predictions.

    So far, they have not. In 2016, the particle physicists Howard Baer, Vernon Barger, and Jenny List wrote an essay for Scientific American arguing that we need a larger particle collider to “save physics.” The reason? A theory the authors had proposed themselves, that is natural (beautiful!) in a specific way, predicts such a larger collider should see new particles. This March, Kane, a particle physicist, used similar beauty-based arguments in an essay for Physics Today. And a recent comment in Nature Reviews Physics about a big, new particle collider planned in Japan once again drew on the same motivations from naturalness that have already not worked for the LHC. Even the particle physicists who have admitted their predictions failed do not want to give up beauty-based hypotheses. Instead, they have argued we need more experiments to test just how wrong they are.

    Will this latest round of null-results finally convince particle physicists that they need new methods of theory-development? I certainly hope so.

    As an ex-particle physicist myself, I understand very well the desire to have an all-encompassing theory for the structure of matter. I can also relate to the appeal of theories such as supersymmetry or string theory. And, yes, I quite like the idea that we live in one of infinitely many universes that together make up the “multiverse.” But, as the latest LHC results drive home once again, the laws of nature care heartily little about what humans find beautiful.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Welcome to Nautilus. We are delighted you joined us. We are here to tell you about science and its endless connections to our lives. Each month we choose a single topic. And each Thursday we publish a new chapter on that topic online. Each issue combines the sciences, culture and philosophy into a single story told by the world’s leading thinkers and writers. We follow the story wherever it leads us. Read our essays, investigative reports, and blogs. Fiction, too. Take in our games, videos, and graphic stories. Stop in for a minute, or an hour. Nautilus lets science spill over its usual borders. We are science, connected.

     
  • richardmitnick 10:44 am on October 18, 2018
    Tags: Standard Model

    From Northwestern University: “Unprecedented look at electron brings us closer to understanding the universe” 

    From Northwestern University

    October 17, 2018
    Amanda Morris

    Study supports Standard Model of particle physics, excludes alternative models.

    An artist’s representation of an electron orbiting an atom’s nucleus, spinning about its axis as a cloud of other subatomic particles pops in and out of existence. (No image credit provided.)

    The scientific community can relax. The electron is still round.

    At least for now.

    In a new study, researchers at Northwestern, Harvard and Yale universities examined the shape of an electron’s charge with unprecedented precision to confirm that it is perfectly spherical. A slightly squashed charge could have indicated unknown, hard-to-detect heavy particles in the electron’s presence, a discovery that could have upended the global physics community.

    “If we had discovered that the shape wasn’t round, that would be the biggest headline in physics for the past several decades,” said Gerald Gabrielse, who led the research at Northwestern. “But our finding is still just as scientifically significant because it strengthens the Standard Model of particle physics and excludes alternative models.”

    The study will be published Oct. 18 in the journal Nature. In addition to Gabrielse, the research was led by John Doyle, the Henry B. Silsbee Professor of Physics at Harvard, and David DeMille, professor of physics at Yale. The trio leads the National Science Foundation (NSF)-funded Advanced Cold Molecule Electron (ACME) Electric Dipole Moment Search.

    The sub-standard Standard Model

    A longstanding theory, the Standard Model of particle physics describes most of the fundamental forces and particles in the universe.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.


    Standard Model of Particle Physics from Symmetry Magazine

    The model is a mathematical picture of reality, and no laboratory experiments yet performed have contradicted it.

    This lack of contradiction has been puzzling physicists for decades.

    “The Standard Model as it stands cannot possibly be right because it cannot predict why the universe exists,” said Gabrielse, the Board of Trustees Professor of Physics in Northwestern’s Weinberg College of Arts and Sciences. “That’s a pretty big loophole.”

    Gabrielse and his ACME colleagues have spent their careers trying to close this loophole by examining the Standard Model’s predictions and then trying to confirm them through table-top experiments in the lab.

    Attempting to “fix” the Standard Model, many alternative models predict that an electron’s seemingly uniform sphere is actually asymmetrically squished. One such model, called the Supersymmetric Model, posits that unknown, heavy subatomic particles influence the electron to alter its perfectly spherical shape — an unproven phenomenon called the “electric dipole moment.”

    Standard model of Supersymmetry DESY

    These undiscovered, heavier particles could be responsible for some of the universe’s most glaring mysteries and could possibly explain why the universe is made from matter instead of antimatter.

    “Almost all of the alternative models say the electron charge may well be squished, but we just haven’t looked sensitively enough,” said Gabrielse, the founding director of Northwestern’s new Center for Fundamental Physics. “That’s why we decided to look there with a higher precision than ever realized before.”

    Squashing the alternative theories

    The ACME team probed this question by firing a beam of cold thorium-oxide molecules into a chamber the size of a large desk. Researchers then studied the light emitted from the molecules. Twisting light would indicate an electric dipole moment. When the light did not twist, the research team concluded that the electron’s shape was, in fact, round, confirming the Standard Model’s prediction. No evidence of an electric dipole moment means no evidence of those hypothetical heavier particles. If these particles do exist at all, their properties differ from those predicted by theorists.
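
    For reference, the bound published in the Nature paper (not quoted in this article’s text) is:

```latex
% ACME II limit on the electron's electric dipole moment,
% ACME Collaboration, Nature 562, 355 (2018):
\[
  |d_e| < 1.1 \times 10^{-29}\ e\cdot\mathrm{cm} \qquad (90\%\ \text{confidence}).
\]
```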

    “Our result tells the scientific community that we need to seriously rethink some of the alternative theories,” DeMille said.

    In 2014, the ACME team performed the same measurement with a simpler apparatus. By using improved laser methods and different laser frequencies, the current experiment was an order of magnitude more sensitive than its predecessor.

    “If an electron were the size of Earth, we could detect if the Earth’s center was off by a distance a million times smaller than a human hair,” Gabrielse explained. “That’s how sensitive our apparatus is.”

    Gabrielse, DeMille, Doyle and their teams plan to keep tuning their instrument to make more and more precise measurements. Until researchers find evidence to the contrary, the electron’s round shape — and the universe’s mysteries — will remain.

    “We know the Standard Model is wrong, but we can’t seem to find where it’s wrong. It’s like a huge mystery novel,” Gabrielse said. “We should be very careful about making assumptions that we’re getting closer to solving the mystery, but I do have considerable hope that we’re getting closer at this level of precision.”

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    Northwestern South Campus

    On May 31, 1850, nine men gathered to begin planning a university that would serve the Northwest Territory.

    Given that they had little money, no land and limited higher education experience, their vision was ambitious. But through a combination of creative financing, shrewd politicking, religious inspiration and an abundance of hard work, the founders of Northwestern University were able to make that dream a reality.

    In 1853, the founders purchased a 379-acre tract of land on the shore of Lake Michigan 12 miles north of Chicago. They established a campus and developed the land near it, naming the surrounding town Evanston in honor of one of the University’s founders, John Evans. After completing its first building in 1855, Northwestern began classes that fall with two faculty members and 10 students.

    Twenty-one presidents have presided over Northwestern in the years since. The University has grown to include 12 schools and colleges, with additional campuses in Chicago and Doha, Qatar.

    Northwestern is recognized nationally and internationally for its educational programs.

     
  • richardmitnick 1:04 pm on August 14, 2018
    Tags: Brute-force approach to particle hunt, Standard Model

    From Nature: “LHC physicists embrace brute-force approach to particle hunt” 

    From Nature

    14 August 2018
    Davide Castelvecchi

    The world’s most powerful particle collider has yet to turn up new physics [since Higgs] — now some physicists are turning to a different strategy.

    The ATLAS detector at the Large Hadron Collider near Geneva, Switzerland. Credit: Stefano Dal Pozzolo/Contrasto/eyevine

    A once-controversial approach to particle physics has entered the mainstream at the Large Hadron Collider (LHC).

    LHC at CERN (images: CERN map, LHC tunnel, LHC particles)

    The LHC’s major ATLAS experiment has officially thrown its weight behind the method — an alternative way to hunt through the reams of data created by the machine — as the collider’s best hope for detecting behaviour that goes beyond the standard model of particle physics. Conventional techniques have so far come up empty-handed.

    So far, almost all studies at the LHC — at CERN, Europe’s particle-physics laboratory near Geneva, Switzerland — have involved ‘targeted searches’ for signatures of favoured theories. The ATLAS collaboration now describes its first all-out ‘general’ search of the detector’s data, in a preprint posted on the arXiv server last month and submitted to the European Physical Journal C. Another major LHC experiment, CMS, is working on a similar project.

    “My goal is to try to come up with a really new way to look for new physics” — one driven by the data rather than by theory, says Sascha Caron of Radboud University Nijmegen in the Netherlands, who has led the push for the approach at ATLAS. General searches are to the targeted ones what spell checking an entire text is to searching that text for a particular word. These broad searches could realize their full potential in the near future, when combined with increasingly sophisticated artificial-intelligence (AI) methods.

    LHC researchers hope that the methods will lead them to their next big discovery — something that hasn’t happened since the detection of the Higgs boson in 2012, which put in place the final piece of the standard model. Developed in the 1960s and 1970s, the model describes all known subatomic particles, but physicists suspect that there is more to the story — the theory doesn’t account for dark matter, for instance. But big experiments such as the LHC have yet to find evidence for such behaviour. That means it’s important to try new things, including general searches, says Gian Giudice, who heads CERN’s theory department and is not involved in any of the experiments. “This is the right approach, at this point.”

    Collision course

    The LHC smashes together millions of protons per second at colossal energies to produce a profusion of decay particles, which are recorded by detectors such as ATLAS and CMS. Many different types of particle interaction can produce the same debris. For example, the decay of a Higgs might produce a pair of photons, but so do other, more common, processes. So, to search for the Higgs, physicists first ran simulations to predict how many of those ‘impostor’ pairs to expect. They then counted all photon pairs recorded in the detector and compared them to their simulations. The difference — a slight excess of photon pairs within a narrow range of energies — was evidence that the Higgs existed.
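
    The counting logic can be sketched in a few lines (an illustration with invented numbers, not ATLAS code; real analyses use full likelihood fits with systematic uncertainties):

```python
import math

def excess_significance(n_observed: float, n_background: float) -> float:
    """Gaussian approximation to the significance of a counting excess:
    (observed - expected) / sqrt(expected)."""
    return (n_observed - n_background) / math.sqrt(n_background)

# Invented numbers: simulation predicts 10,000 'impostor' photon pairs in a
# narrow energy window; the detector records 10,500.
z = excess_significance(10_500, 10_000.0)
print(f"excess significance: {z:.1f} sigma")  # -> 5.0 sigma
```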

    ATLAS and CMS have run hundreds more of these targeted searches to look for particles that do not appear in the standard model.

    CERN/ATLAS detector


    CERN/CMS Detector

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.


    Standard Model of Particle Physics from Symmetry Magazine

    Many searches have looked for various flavours of supersymmetry, a theorized extension of the model that includes hypothesized particles such as the neutralino, a candidate for dark matter. But these searches have come up empty so far.

    Standard model of Supersymmetry DESY

    This leaves open the possibility that there are exotic particles that produce signatures no one has thought of — something that general searches have a better chance of finding. Physicists have yet to look, for example, at events that produced three photons instead of two, Caron says. “We have hundreds of people looking at Higgs decay and supersymmetry, but maybe we are missing something nobody thought of,” says Arnd Meyer, a CMS member at Aachen University in Germany.

    Whereas targeted searches typically look at only a handful of the many types of decay product, the latest study looked at more than 700 types at once. The study analysed data collected in 2015, the first year after an LHC upgrade raised the energy of proton collisions in the collider from 8 teraelectronvolts (TeV) to 13 TeV. At CMS, Meyer and a few collaborators have conducted a proof-of-principle study, which hasn’t been published, on a smaller set of data from the 8 TeV run.

    Neither experiment has found significant deviations so far. This was not surprising, the teams say, because the data sets were relatively small. Both ATLAS and CMS are now searching the data collected in 2016 and 2017, a trove tens of times larger.

    Statistical cons

    The approach “has clear advantages, but also clear shortcomings”, says Markus Klute, a physicist at the Massachusetts Institute of Technology in Cambridge. Klute is part of CMS and has worked on general searches at previous experiments, but he was not directly involved in the more recent studies. One limitation is statistical power. If a targeted search finds a positive result, there are standard procedures for calculating its significance; when casting a wide net, however, some false positives are bound to arise. That was one reason that general searches had not been favoured in the past: many physicists feared that they could lead down too many blind alleys. But the teams say they have put a lot of work into making their methods more solid. “I am excited this came forward,” says Klute.
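
    That worry is easy to quantify with a simple trials-factor estimate (an illustration, not from the article):

```python
# Look-elsewhere effect: with many independent event classes, a large
# background fluctuation is almost guaranteed to appear somewhere.
p_single = 1.35e-3  # one-sided probability of a >= 3 sigma fluctuation in one class
n_classes = 700     # roughly the number of event classes the ATLAS search scanned

p_any = 1.0 - (1.0 - p_single) ** n_classes
print(f"P(at least one fake 3-sigma excess in {n_classes} classes) = {p_any:.0%}")
# -> about 61%: a lone '3 sigma' bump somewhere is expected, not a discovery
```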

    Most of the people power and resources at the LHC experiments still go into targeted searches, and that might not change anytime soon. “Some people doubt the usefulness of such general searches, given that we have so many searches that exhaustively cover much of the parameter space,” says Tulika Bose of Boston University in Massachusetts, who helps to coordinate the research programme at CMS.

    Many researchers who work on general searches say that they eventually want to use AI to do away with standard-model simulations altogether. Proponents of this approach hope to use machine learning to find patterns in the data without any theoretical bias. “We want to reverse the strategy — let the data tell us where to look next,” Caron says. Computer scientists are also pushing towards this type of ‘unsupervised’ machine learning — compared with the supervised type, in which the machine ‘learns’ from going through data that have been tagged previously by humans.
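
    To give a flavour of what ‘unsupervised’ means here, a minimal sketch on toy data (an illustration; real LHC studies use far more sophisticated models):

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Toy 'events': rows are events, columns are reconstructed features
# (say, total energy and missing momentum). Mostly background...
background = rng.normal(loc=0.0, scale=1.0, size=(10_000, 2))
# ...plus a handful of anomalous events far from the bulk.
anomalies = rng.normal(loc=6.0, scale=0.5, size=(10, 2))
events = np.vstack([background, anomalies])

# No labels, no theory: the model learns what 'typical' looks like and
# flags events that are easy to isolate from the rest.
model = IsolationForest(contamination=0.001, random_state=0).fit(events)
flags = model.predict(events)  # -1 = anomalous, +1 = typical
print(f"flagged {np.sum(flags == -1)} candidate anomalies out of {len(events)}")
```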

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Nature is a weekly international journal publishing the finest peer-reviewed research in all fields of science and technology on the basis of its originality, importance, interdisciplinary interest, timeliness, accessibility, elegance and surprising conclusions. Nature also provides rapid, authoritative, insightful and arresting news and interpretation of topical and coming trends affecting science, scientists and the wider public.

     
  • richardmitnick 12:49 pm on August 8, 2018
    Tags: Standard Model

    From Ethan Siegel: “What Was It Like When The Higgs Gave Mass To The Universe?” 

    From Ethan Siegel
    Aug 8, 2018

    A candidate Higgs event in the ATLAS detector. Note how even with the clear signatures and transverse tracks, there is a shower of other particles; this is due to the fact that protons are composite particles. This is only the case because the Higgs gives mass to the fundamental constituents that compose these particles. (THE ATLAS COLLABORATION / CERN)

    CERN CMS Higgs Event

    CERN ATLAS Higgs Event

    One moment, every particle in the Universe was massless. Then, they weren’t anymore. Here’s how it happened.

    In the earliest stages of the hot Big Bang, the Universe was filled with all the particles, antiparticles, and quanta of radiation it had the energy to create. As the Universe expanded, it cooled: the stretching fabric of space also stretched the wavelengths of all the radiation within it to longer wavelengths, which equates to lower energies.

    If there are any particles (and antiparticles) that exist at higher energies that are yet to be discovered, they were likely created in the hot Big Bang, so long as there was enough energy (E) available to create a massive (m) particle via Einstein’s E = mc². It’s possible that a slew of puzzles about our Universe, including the origin of the matter-antimatter asymmetry and the creation of dark matter, are solved by new physics at these early times. But the massive particles we know today are foreign to us. At these early stages, they have no mass.

    The particles and antiparticles of the Standard Model are easy to create, even as the Universe cools and the fractions of a second tick by. The Universe might start off at energies as large as 10¹⁵ or 10¹⁶ GeV; even by the time it’s dropped to 1000 (10³) GeV, no Standard Model particle is threatened. At the energies achievable by the LHC, we can create the full suite of particle-antiparticle pairs that are known to physics.

    But at this point, unlike today, they’re all massless. If they have no rest mass, they have no choice but to move at the speed of light. The reason particles are in this strange, bizarre state that’s so different from how they exist today? It’s because the fundamental symmetry that gives rise to the Higgs boson — the electroweak symmetry — has not yet broken in the Universe.

    The particles and antiparticles of the Standard Model have now all been directly detected, with the last holdout, the Higgs boson, falling at the LHC earlier this decade. Today, only the gluons and photons are massless; everything else has a non-zero rest mass. (E. SIEGEL / BEYOND THE GALAXY)

    When we look at the Standard Model today, it’s arranged as follows:

    six quarks, each of which comes in three colors, and their antiquark counterparts,
    three charged leptons (e, μ, τ) and three neutral ones (ν_e, ν_μ, ν_τ), and their antimatter counterparts,
    the eight massless gluons that mediate the strong force between the quarks,
    the three heavy, weak bosons (W+, W-, and Z_0) that mediate the weak nuclear force,
    and the photon (γ), the massless mediator of the electromagnetic force.

    But there’s a symmetry that’s broken at today’s low-energy scale: the electroweak symmetry. This symmetry was restored in the early days of the Universe. And when it’s restored versus when it’s broken, it fundamentally changes the Standard Model picture.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.


    Standard Model of Particle Physics from Symmetry Magazine

    Instead of the weak and electromagnetic bosons (W+, W-, Z_0, γ), where the first three are very massive and the last is massless, we have four new bosons for the electroweak force (W_1, W_2, W_3, B), and all of them have no mass at all. The other particles are all the same, except for the fact that they, too, have no mass yet. This is what’s floating around in the early Universe, colliding, annihilating, and spontaneously being created, all in motion at the speed of light.

    As the Universe expands and cools, all of this continues. So long as the energy of your Universe is above a certain value, you can think about the Higgs field as floating atop the liquid in a soda (or wine) bottle. As the level of the liquid drops, the Higgs field remains atop the liquid, and everything stays massless. This is what we call a restored-symmetry state.

    When a wine bottle is either completely or partially filled, a drop of oil or a ping pong ball will float on the wine’s surface inside the bottle. At any location, the wine-level, and hence what’s floating atop it, will remain at the same level. This corresponds to a restored-symmetry state. (EVAN SWIGART FROM CHICAGO, USA)

    But below a certain liquid level, the bottom of the container starts to show itself. And the field can no longer remain in the center; more generally, it can’t take on simply any old value. It has to go to where the liquid level is, and that means down into the divot(s) at the bottom of the bottle. This is what we call a broken-symmetry state.

    When this symmetry breaks, the Higgs field settles into the bottom, lowest-energy, equilibrium state. But that energy state isn’t quite zero: it has a finite, non-zero value known as its vacuum expectation value. Whereas the restored-symmetry state yielded only massless particles, the broken symmetry state changes everything.
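
    In standard notation (added here, not in Siegel’s original), the “bottle” is the Higgs potential, whose minimum sits away from zero field:

```latex
% Symmetry-breaking ("Mexican hat") potential and its minimum:
\[
  V(\phi) = -\mu^{2}\,|\phi|^{2} + \lambda\,|\phi|^{4},
  \qquad
  \langle\phi\rangle = \frac{v}{\sqrt{2}}, \quad
  v = \sqrt{\frac{\mu^{2}}{\lambda}} \approx 246\ \mathrm{GeV},
\]
% where v is the vacuum expectation value referred to in the text.
```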

    When a wine bottle is completely empty, any ball or drop of oil inside will slide all the way down to the lowest-level ‘ring’ at the bottom. This corresponds to a broken symmetry state, since all values (i.e., locations) are no longer equivalent. (PATRICK HEUSSER, X8ING.COM)

    Once the symmetry breaks, the Higgs field has four mass-containing consequences: two are charged (one positive and one negative) and two are neutral. Then, the following things all happen at once:

    The W_1 and W_2 particles “eat” the charged, broken-symmetry consequences of the Higgs, becoming the W+ and W- particles.
    The W_3 and B particles mix together, with one combination eating the uncharged broken-symmetry consequence of the Higgs, becoming the Z_0, and with the other combination eating nothing, to remain the massless photon (γ).
    The last neutral broken-symmetry consequence of the Higgs gains mass, and becomes the Higgs boson.
    At last, the Higgs boson couples to all the other particles of the Standard Model, giving mass to the Universe.

    This is the origin of mass in the Universe.
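
    In the usual textbook notation (added here), with the weak mixing angle θ_W fixed by the two gauge couplings g and g′, the photon and Z_0 are the two orthogonal mixtures of B and W_3, and the masses follow from the vacuum expectation value v:

```latex
\[
  \begin{aligned}
    \gamma &= B\cos\theta_W + W_3\sin\theta_W,
    &\qquad
    Z_0 &= W_3\cos\theta_W - B\sin\theta_W,\\[2pt]
    m_W &= \tfrac{1}{2}\,g\,v,
    &\qquad
    m_Z &= \frac{m_W}{\cos\theta_W},
  \end{aligned}
  \qquad \tan\theta_W = \frac{g'}{g}.
\]
```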

    When the electroweak symmetry is broken, the W+ gets its mass by eating the positively charged Higgs, the W- by eating the negatively charged Higgs, and the Z_0 by eating the neutral Higgs. The other neutral Higgs becomes the Higgs boson, detected and discovered earlier this decade at the LHC. The photon, the other combination of the W_3 and the B boson, remains massless. (FLIP TANEDO / QUANTUM DIARIES)

    This whole process is called spontaneous symmetry breaking. And for the quarks and leptons in the Standard Model, when this Higgs symmetry is broken, every particle gets a mass due to two things:

    The expectation value of the Higgs field, and
    A coupling constant.

    And this is kind of the problem. The expectation value of the Higgs field is the same for all of these particles, and not too difficult to determine. But that coupling constant? Not only is it different for every particle, but — in the standard model — it’s arbitrary.
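
    Written out (added notation, not Siegel’s), each fermion’s mass is simply the product of those two ingredients:

```latex
% Fermion masses from the Higgs vacuum expectation value v and a
% per-particle Yukawa coupling y_f:
\[
  m_f = \frac{y_f\, v}{\sqrt{2}}, \qquad v \approx 246\ \mathrm{GeV}.
\]
% E.g. the top quark needs y_t ~ 1 (m_t ~ 173 GeV) while the electron needs
% y_e ~ 3e-6; the Standard Model offers no reason for either value.
```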

    The Higgs boson, now with mass, couples to the quarks, leptons, and W-and-Z bosons of the Standard Model, which gives them mass. That it doesn’t couple to the photon and gluons means those particles remain massless. (TRITERTBUTOXY AT ENGLISH WIKIPEDIA)

    We know that the particles have mass; we know how they get mass; we’ve discovered the particles responsible for mass. But we still have no idea why the particles have the values of the masses they do. We have no idea why the coupling constants have the couplings that they do. The Higgs boson is real; the gauge bosons are real; the quarks and leptons are real. We can create, detect, and measure their properties exquisitely. Yet, when it comes to understanding why they have the values that they do, that’s a puzzle we cannot yet solve. We do not have the answer.

    The masses of the fundamental particles in the Universe, once the electroweak symmetry is broken, span many orders of magnitude, with the neutrinos being the lightest massive particles and the top quark being the heaviest. We do not understand why the coupling constants have the values they do, and hence, why the particles have the masses they do. (FIG. 15–04A FROM UNIVERSE-REVIEW.CA)

    Before the breaking of the electroweak symmetry, everything that is known to exist in the Universe today is massless, and moves at the speed of light. Once the Higgs symmetry breaks, it gives mass to the quarks and leptons of the Universe, the W and Z bosons, and the Higgs boson itself. Suddenly, with huge mass differences between light particles and heavy ones, the heavy ones spontaneously decay into the lighter ones on very short timescales, especially when the energy (E) of the Universe drops below the mass equivalent (m) needed to create these unstable particles via E = mc².

    A visual history of the expanding Universe includes the hot, dense state known as the Big Bang and the growth and formation of structure subsequently. Without the Higgs giving mass to the particles in the Universe at a very early, hot stage, none of this would have been possible. (NASA / CXC / M. WEISS)

    Without this critical gauge symmetry associated with electroweak symmetry breaking, existence wouldn’t be possible, as we do not have stable, bound states made purely of massless particles. But with fundamental masses for the quarks and charged leptons, the Universe can now do something it’s never done before. It can cool and create bound states like protons and neutrons. It can cool further and create atomic nuclei and, eventually, neutral atoms. And when enough time goes by, it can give rise to stars, galaxies, planets, and human beings. Without the Higgs to give mass to the Universe, none of this would be possible. The Higgs, despite the fact that it took 50 years to discover, has been making the Universe possible for 13.8 billion years.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    See the full article here.

    “Starts With A Bang! is a blog/video blog about cosmology, physics, astronomy, and anything else I find interesting enough to write about. I am a firm believer that the highest good in life is learning, and the greatest evil is willful ignorance. The goal of everything on this site is to help inform you about our world, how we came to be here, and to understand how it all works. As I write these pages for you, I hope to not only explain to you what we know, think, and believe, but how we know it, and why we draw the conclusions we do. It is my hope that you find this interesting, informative, and accessible,” says Ethan

     
  • richardmitnick 12:00 pm on August 8, 2018
    Tags: Could a new type of quark fix the “unnaturalness” of the Standard Model?, Standard Model

    From CERN ATLAS: “Could a new type of quark fix the “unnaturalness” of the Standard Model?” 

    From CERN ATLAS

    8th August 2018
    ATLAS Collaboration

    Figure 1: One of the Feynman diagrams for T pair production at the LHC. (Image: ATLAS Collaboration © CERN 2018)

    While the discovery of the Higgs boson at the Large Hadron Collider (LHC) in 2012 confirmed many Standard Model predictions, it has raised as many questions as it has answered. For example, interactions at the quantum level between the Higgs boson and the top quark ought to lead to a huge Higgs boson mass, possibly as large as the Planck mass (>10¹⁸ GeV). So why is it only 125 GeV? Is there a mechanism at play to cancel these large quantum corrections caused by the top quark (t)? Finding a way to explain the lightness of the Higgs boson is one of the top (no pun intended) questions in particle physics.

    A wide range of solutions have been proposed and a common feature in many of them is the existence of vector-like quarks – in particular, a vector-like top quark (T). Like other quarks, vector-like quarks would be spin-½ particles that interact via the strong force. While all spin-½ particles have left- and right-handed components, the weak force only interacts with the left-handed components of Standard Model particles. However, vector-like quarks would have “ambidextrous” interactions with the weak force, giving them a bit more leeway in how they decay. While the Standard Model top quark always decays to a bottom quark (b) by emitting a W boson (t→Wb), a vector-like top can decay three different ways: T→Wb, T→Zt or T→Ht (Figure 1).

    Figure 2: Lower limit (scale on right axis) on the mass of a vector-like top as a function of the branching ratio to Wb and Ht (bottom and left axes). (Image: ATLAS Collaboration © CERN 2018)

    The ATLAS collaboration uses a custom-built programme to search for vector-like top pairs in LHC data. It utilizes data from several dedicated analyses, each of them sensitive to various experimental signatures (involving leptons, boosted objects and/or large missing transverse momentum). This allows ATLAS to look for all possible decays, increasing the chance of discovery.

    ATLAS has now gone one step further by performing a combination of all of the individual searches. While individual analyses are designed to study particular sets of decays, the combined results provide sensitivity to all possible sets of decays. These have allowed ATLAS to search for vector-like tops with masses over 1200 GeV. It appears, however, that vector-like tops are so far nowhere to be found. On the bright side, the combination allows ATLAS to set the most stringent lower limits on the mass of a vector-like top for arbitrary sets of branching ratios to the three decay modes (Figure 2).
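
    To illustrate how a limit “for arbitrary sets of branching ratios” is organized (a sketch, not the ATLAS analysis; the 10% grid spacing is invented), the three decay modes live on a two-dimensional triangle, because the ratios must sum to one:

```python
# Enumerate branching-ratio hypotheses for T -> Wb / Zt / Ht on a grid.
# At each point, an analysis would recompute its mass limit; here we only
# list the valid combinations.
step = 0.1
points = []
for i in range(11):
    for j in range(11 - i):
        br_wb = i * step
        br_ht = j * step
        br_zt = 1.0 - br_wb - br_ht  # fixed: the three ratios must sum to 1
        points.append((round(br_wb, 1), round(br_ht, 1), round(br_zt, 1)))

print(f"{len(points)} branching-ratio hypotheses")  # -> 66 grid points
```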

    Between these limits on vector-like top quarks and those on other theories that could offer a solution (like supersymmetry), the case for a naturally light Higgs boson is not looking good… but Nature probably still has a few tricks up its sleeve for us to uncover.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.


    Standard Model of Particle Physics from Symmetry Magazine

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    LHC at CERN (images: CERN map, LHC Grand Tunnel, LHC particles)


    CERN Courier

    Quantum Diaries

     
  • richardmitnick 1:32 pm on July 25, 2017
    Tags: Hidden-sector particles, Standard Model

    From FNAL: “The MiniBooNE search for dark matter” 

    FNAL Art Image by Angela Gonzales

    Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

    July 18, 2017
    Ranjan Dharmapalan
    Tyler Thornton

    FNAL/MiniBooNE

    This schematic shows the experimental setup for the dark matter search. Protons (blue arrow on the left) generated by the Fermilab accelerator chain strike a thick steel block. This interaction produces secondary particles, some of which are absorbed by the block. Others, including photons and perhaps dark-sector photons, symbolized by V, are unaffected. These dark photons decay into dark matter, shown as χ, and travel to the MiniBooNE detector, depicted as the sphere on the right.

    Particle physicists are in a quandary. On one hand, the Standard Model accurately describes most of the known particles and forces of interaction between them.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    On the other, we know that the Standard Model accounts for less than 5 percent of the universe. About 26 percent of the universe is composed of mysterious dark matter, and the remaining 68 percent of even more mysterious dark energy.

    Some theorists speculate that dark matter particles could belong to a “hidden sector” and that there may be portals to this hidden sector from the Standard Model. The portals allow hidden-sector particles to trickle into Standard Model interactions. A large sensitive particle detector, placed in an intense particle beam and equipped with a mechanism to suppress the Standard Model interactions, could unveil these new particles.

    Fermilab is home to a number of proton beams and large, extremely sensitive detectors, initially built to detect neutrinos. These devices, such as the MiniBooNE detector, are ideal places to search for hidden-sector particles.

    In 2012, the MiniBooNE-DM collaboration teamed up with theorists who proposed new ways to search for dark matter particles. One of these proposals [FNAL PAC Oct 15 2012] involved the reconfiguration of the existing neutrino experiment. This was a pioneering effort that involved close coordination between the experimentalists, accelerator scientists, beam alignment experts and numerous technicians.

    Results of this MiniBooNE-DM search for dark matter scattering off of nucleons. The plot shows the confidence limits and sensitivities (with 1σ and 2σ errors) resulting from this analysis compared to other experimental results, as a function of Y (a parameter describing the dark photon mass, dark matter mass and the couplings to the Standard Model) and Mχ (the dark matter mass). For details see the Physical Review Letters paper.

    For the neutrino experiment, the 8-GeV proton beam from the Fermilab Booster hit a beryllium target to produce a secondary beam of charged particles that decayed further downstream, in a decay pipe, into neutrinos. MiniBooNE ran in this mode for about a decade to measure neutrino oscillations and interactions.

    In the dark matter search mode, however, the proton beam was steered past the beryllium target. The beam instead struck a thick steel block at the end of the decay pipe. The resulting charged secondary particles (mostly pions) were absorbed in the steel block, reducing the number of subsequent neutrinos, while the neutral secondary particles remained unaffected. The photons resulting from the decay of neutral pions may have transformed into hidden-sector photons that in turn might have decayed into dark matter, which would travel to the MiniBooNE detector 450 meters away. The experiment ran in this mode for nine months for a dedicated dark matter search.

    Using the previous 10 years’ worth of data as a baseline, MiniBooNE-DM looked for scattered protons and neutrons in the detector. If they found more scattered protons or neutrons than predicted, the excess could indicate a new particle, maybe dark matter, being produced in the steel block. Scientists analyzed multiple types of neutrino interactions at the same time, reducing the error on the signal data set by more than half.

    Analysts concluded that the data was consistent with the Standard Model prediction, enabling the experimenters to set a limit on a specific model of dark matter, called vector portal dark matter. To set the limit, scientists developed a detailed simulation that estimated the predicted proton or neutron response in the detector from scattered dark matter particles. The new limit extends from the low-mass edge of direct-detection experiments down to masses about 1,000 times smaller. Additionally, the result rules out this particular model as a description of the anomalous behavior of the muon seen in the Muon g-2 experiment at Brookhaven, which was one of the goals of the MiniBooNE-DM proposal. Incidentally, researchers at Fermilab will make a more precise measurement of the muon — and verify the Brookhaven result — in an experiment that started up this year.
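
    For context (added here; sign and normalization conventions vary between papers), “vector portal” models couple a dark photon A′ to the ordinary photon through kinetic mixing, and the Y parameter on the limit plot is the conventional combination of the model parameters:

```latex
% Kinetic-mixing ("vector") portal and the standard Y variable:
\[
  \mathcal{L} \supset \frac{\epsilon}{2}\, F_{\mu\nu} F'^{\mu\nu},
  \qquad
  Y \equiv \epsilon^{2}\,\alpha_D \left(\frac{m_\chi}{m_{A'}}\right)^{4},
\]
% where epsilon is the mixing strength, alpha_D the dark-sector coupling,
% m_chi the dark matter mass and m_A' the dark photon mass.
```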

    This result from MiniBooNE, a dedicated proton beam dump search for dark matter, was published in Physical Review Letters and was highlighted as an “Editor’s suggestion.”

    What’s next? The experiment will continue to analyze the collected data set. It is possible that the dark matter or hidden-sector particles may prefer to scatter off of the lepton family of particles, which includes electrons, rather than off of quarks, which are the constituents of protons and neutrons. Different interaction channels probe different possibilities.

    If the portals to the hidden sector are narrow — that is, if they are weakly coupled — researchers will need to collect more data or implement new ideas to suppress the Standard Model interactions.

    The first results from MiniBooNE-DM show that Fermilab could be at the forefront of searching for hidden-sector particles. Upcoming experiments in Fermilab’s Short-Baseline Neutrino program will use higher-resolution detectors — specifically, liquid-argon time projection chamber technology — expanding the search regions and possibly leading to discovery.

    Ranjan Dharmapalan is a postdoc at Argonne National Laboratory. Tyler Thornton is a graduate student at Indiana University Bloomington.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    Fermilab Campus

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics. Fermilab is America’s premier laboratory for particle physics and accelerator research, funded by the U.S. Department of Energy. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.

     
  • richardmitnick 2:35 pm on May 9, 2017
    Tags: Standard Model

    From ATLAS: “New insight into the Standard Model” 

    From CERN ATLAS

    9th May 2017
    ATLAS Collaboration

    ATLAS releases the first study of a pair of neutral bosons produced in association with a high-mass dijet system.

    Figure 1: Distribution of (a) the centrality of the Z boson-photon (Zγ) system and (b) the transverse energy of the photon. These studies show data collected by ATLAS in 2012 (black points) compared to Standard Model predictions (coloured histograms). The signal that is looked for is displayed as the dark red histogram and the main background is shown as the light blue one. The bottom panels show the ratio of the data to the sum of all the predictions. The error band (blue) shows the total uncertainty on these predictions. A sign of new physics could appear as an enhancement at large momentum, as shown by the dotted blue line in (b). (Image: ATLAS Collaboration/CERN)

    Figure 2: Feynman diagram of the signal process, the electroweak production of a Z boson, a photon (γ) and two high-energy jets. (Image: ATLAS Collaboration/CERN)

    Ever since the LHC collided its first protons in 2009, the ATLAS Collaboration has been persistently studying their interactions with increasing precision. To this day, it has always observed them to be as expected by the Standard Model.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    Though it remains unrefuted, physicists are convinced that a better theory must exist to explain certain fundamental questions: What is the nature of dark matter? Why is the gravitational force so weak compared to the other forces?

    Answers may be found by looking at a very rare process that had previously never been studied by ATLAS: the interaction of four bosons, whose signature is the presence of a Z boson, a photon and two high-energy jets. This is an excellent probe of the electroweak sector of the Standard Model and is very sensitive to new physics models. However, this process is very difficult to detect, given its rarity and the large number of different processes that can mimic its signature (known as “background”). The main background comes from the production of a Z boson and a photon accompanied by two jets, which, unlike the electroweak process we are interested in, is produced via strong interactions.

    This leads to differences in the kinematics of the observed jets, which are described in a paper recently submitted to the Journal of High Energy Physics [no link found], in which ATLAS presents a search for such events using 8 TeV data. Utilizing the knowledge that the recoiling quarks (see Figure 2) will produce jets that have a very large invariant mass and are widely separated in the detector, ATLAS has been able to reduce the background and mitigate the large experimental uncertainties in order to extract the signal.

    The background is suppressed by selecting events where the two jets have an invariant mass larger than 500 GeV. The signal and main background are further separated by quantifying the centrality of the Z-photon system with respect to the two jets. Events with low centrality are more likely to be produced via the electroweak signal process while those with high centrality are more likely to come from strong interactions. This is illustrated in Figure 1(a), where a small excess of events above the predicted background is observed, with a statistical significance of 2σ.
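
    For reference (a sketch with invented four-momenta, not ATLAS code), the dijet invariant mass used in this selection follows directly from the two jets’ four-momenta:

```python
import math

def invariant_mass(jet1, jet2):
    """Invariant mass of a two-jet system from (E, px, py, pz) in GeV."""
    E  = jet1[0] + jet2[0]
    px = jet1[1] + jet2[1]
    py = jet1[2] + jet2[2]
    pz = jet1[3] + jet2[3]
    return math.sqrt(max(E**2 - px**2 - py**2 - pz**2, 0.0))

# Two invented, widely separated jets; the pair lands well above 500 GeV.
j1 = (400.0,  350.0,  50.0,  180.0)
j2 = (450.0, -300.0, -60.0, -320.0)
print(f"m_jj = {invariant_mass(j1, j2):.0f} GeV")  # -> ~837 GeV
```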

    The centrality is used to measure the event rate (cross section) of the signal alone, and of the sum of the signal and the major background. Both were found to be in agreement with Standard Model predictions within the large statistical uncertainty. Anomalies on the coupling of four bosons have also been searched for, by looking at the tails of the photon transverse energy spectrum that may be enhanced by new physics contributions (blue dotted line in Figure 1(b)). No deviation from the Standard Model has been seen and stringent limits are set on the presence of new physics in this region.

    The Standard Model will continue to keep its secrets… until the next set of results!

    Links:

    Studies of Zγ electroweak production in association with a high-mass dijet system in pp collisions at 8 TeV with the ATLAS detector (arXiv: 1705.01966, see figures)
    See also the full lists of ATLAS Conference Notes and ATLAS Physics Papers.

    See the full article here.

    LHC at CERN (images: CERN LHC map, LHC Grand Tunnel, LHC particles)

    Please help promote STEM in your local schools.


    Stem Education Coalition

     
  • richardmitnick 1:25 pm on April 14, 2017
    Tags: Standard Model

    From Ethan Siegel: “Can muons — which live for microseconds — save experimental particle physics?” 

    Ethan Siegel

    Apr 14, 2017

    You lose whether you use protons or electrons in your collider, for different reasons. Could the unstable muon solve both problems?

    A four-muon candidate event in the ATLAS detector at the Large Hadron Collider. The muon/anti-muon tracks are highlighted in red, as the long-lived muons travel farther than any other unstable particle. Image credit: ATLAS Collaboration / CERN.

    “It does not matter how slowly you go as long as you do not stop.” -Confucius

    High-energy physics is facing its greatest crisis ever. The Standard Model is complete, as all the particles our most successful physics theories have predicted have been discovered.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    The Large Hadron Collider at CERN, the most energetic particle collider ever developed (with more than six times the energies of any prior collider), discovered the long-sought-after Higgs boson, but nothing else.

    LHC at CERN (images: CERN/LHC map, LHC tube)

    Traditionally, the way to discover new particles has been to go to higher energies with one of two strategies:

    Collide electrons and positrons, getting a “clean” signal where 100% of the collider energy goes into producing new particles.
    Collide protons and either anti-protons or other protons, getting a messy signal but reaching higher energies due to the heavier mass of the proton.

    Both methods have their limitations, but one unstable particle might give us a third option to make the elusive breakthrough we desperately need: the muon.

    The known particles in the Standard Model. These are all the fundamental particles that have been directly discovered. Image credit: E. Siegel.

    The Standard Model is made up of all the fundamental particles and antiparticles we’ve ever discovered. They include six quarks and antiquarks, each in three colors, three charged leptons and three types of neutrino, along with their antiparticle counterparts, and the bosons: the photon, the weak bosons (W+, W-, Z0), the eight gluons (with color/anticolor combinations attached), and the Higgs boson. While countless different combinations of these particles exist in nature, only a precious few are stable. The electron, photon, proton (made of two up and one down quark), and, if they’re bound together in nuclei, the neutron (with two down and one up quark) are stable, along with their antimatter counterparts. That’s why all the normal matter we see in the Universe is made up of protons, neutrons, and electrons; nothing else with any significant interactions is stable.

    While many unstable particles, both fundamental and composite, can be produced in particle physics, only protons, neutrons (bound in nuclei) and the electron are stable, along with their antimatter counterparts and the photon. Everything else is short-lived. Image credit: Contemporary Physics Education Project (CPEP), U.S. Department of Energy / NSF / LBNL.

    The way you create these unstable particles is by colliding the stable ones together at high enough energies. Because of a fundamental principle of nature — mass/energy equivalence, given by Einstein’s E = mc² — you can turn pure energy into mass if you have enough of it. (So long as you obey all the other conservation laws.) This is exactly the way we’ve created almost all the other particles of the Standard Model: by colliding particles into one another at enough energy that the energy you get out (E) is high enough to create the new particles (of mass m) you’re attempting to discover.

    The particle tracks emanating from a high energy collision at the LHC in 2014 show the creation of many new particles. It’s only because of the high-energy nature of this collision that new masses can be created.

    We know there are almost certainly more particles beyond the ones we’ve discovered; we expect there to be particle explanations for mysteries like the baryon asymmetry (why there’s more matter than antimatter), the missing mass problem in the Universe (what we suspect will be solved by dark matter), the neutrino mass problem (why they’re so incredibly light), the quantum nature of gravity (i.e., there should be a force-carrying particle for the gravitational interaction, like the graviton), and the strong-CP problem (why certain decays don’t happen), among others. But our colliders haven’t reached the energies necessary to uncover those new particles, if they even exist. What’s even worse: both of the current methods have severe drawbacks that may prohibit us from building colliders that go to significantly higher energies.

    The Large Hadron Collider is the current record-holder, accelerating protons up to energies of 6.5 TeV apiece before smashing them together. The energy you can reach is directly proportional to two things only: the radius of your accelerator (R) and the strength of the magnetic field used to bend the protons into a circle (B). Collide those two protons together, and they hit with an energy of 13 TeV. But you’ll never make a 13 TeV particle colliding two protons at the LHC; only a fraction of that energy is available to create new particles via E = mc². The reason? A proton is made of multiple, composite particles — quarks, gluons, and even quark/antiquark pairs inside — meaning that only a tiny fraction of that energy goes into making new, massive particles.
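
    The proportionality to R and B can be made concrete with the standard bending relation for ultra-relativistic particles, p [GeV/c] ≈ 0.3 · B [T] · ρ [m]. The numbers below are rough, illustrative values (in particular the ~2800 m effective bending radius is an approximation), not official LHC parameters:

```python
def beam_energy_tev(b_field_tesla, bending_radius_m):
    """Beam energy for an ultra-relativistic particle bent into a circle:
    p [GeV/c] ≈ 0.3 * B [T] * rho [m], and E ≈ p*c at these energies."""
    return 0.3 * b_field_tesla * bending_radius_m / 1000.0  # convert GeV to TeV

# Rough LHC-like inputs: ~8.3 T dipole field, ~2800 m effective bending radius.
print(beam_energy_tev(8.3, 2800.0))  # ≈ 7 TeV per beam, i.e. ~14 TeV per collision
```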

    A candidate Higgs event in the ATLAS detector. Note how even with the clear signatures and transverse tracks, there is a shower of other particles; this is due to the fact that protons are composite particles. Image credit: The ATLAS collaboration / CERN.

    CERN ATLAS Higgs Event

    CERN/ATLAS detector

    You might think to use fundamental particles instead, then, like electrons and positrons. If you were to put them in the same ring (with the same R) and subject them to the same magnetic field (the same B), you might think you could reach the same energies, only this time, 100% of the energy could make new particles. And that would be true, if it weren’t for one factor: synchrotron radiation. You see, when you accelerate a charged particle in a magnetic field, it gives off radiation. Because a proton is so massive compared to its electric charge, that radiation is negligible, and you can take protons up to the highest energies we’ve ever reached without worrying about it. But electrons and positrons are only 1/1836th of a proton’s mass, and synchrotron radiation would limit them to only about 0.114 TeV of energy under the same conditions.
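
    That steep penalty follows from the energy radiated per turn, which for electrons is roughly U₀ ≈ 8.85e-5 · E⁴/ρ (E in GeV, ρ in m) and is suppressed by (mₑ/m)⁴ for heavier particles. A back-of-the-envelope sketch, using the standard electron coefficient and round illustrative numbers:

```python
def loss_per_turn_gev(energy_gev, bending_radius_m, mass_in_electron_masses=1.0):
    """Synchrotron energy loss per turn. The 8.85e-5 coefficient is the standard
    electron value; heavier particles radiate less by a factor (m_e/m)^4."""
    return 8.85e-5 * energy_gev**4 / bending_radius_m / mass_in_electron_masses**4

# Electrons at ~100 GeV in a ~3100 m bending-radius ring lose GeV per turn...
print(loss_per_turn_gev(100.0, 3100.0))          # ≈ 2.9 GeV per turn
# ...while muons (206.77 electron masses) at the same energy lose almost nothing.
print(loss_per_turn_gev(100.0, 3100.0, 206.77))  # ≈ 1.6e-9 GeV per turn
```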

    Relativistic electrons and positrons can be accelerated to very high speeds, but will emit synchrotron radiation (blue) at high enough energies, preventing them from moving faster. Image credit: Chung-Li Dong, Jinghua Guo, Yang-Yuan Chen, and Chang Ching-Lin, ‘Soft-x-ray spectroscopy probes nanomaterial-based devices’.

    But there’s a third option that’s never been put into practice: use muons and anti-muons. A muon is just like an electron in the sense that it’s a fundamental particle, it’s charged, it’s a lepton, but it’s 206 times heavier than the electron. This is massive enough that synchrotron radiation doesn’t matter for muons or anti-muons, which is great! The only downside? The muon is unstable, with a mean lifetime of only 2.2 microseconds before decaying away.

    The prototype MICE 201-megahertz RF module, with the copper cavity mounted, is shown during assembly at Fermilab. This apparatus could focus and collimate a muon beam, enabling the muons to be accelerated and survive for much longer than 2.2 microseconds. Image credit: Y. Torun / IIT / Fermilab Today.

    That might be okay, though, because special relativity can rescue us! When you bring an unstable particle close to the speed of light, the amount of time that it lives increases dramatically, thanks to the relativistic phenomenon of time dilation. If you brought a muon all the way up to 6.5 TeV of energy, it would live for 135,000 microseconds: enough time to circle the Large Hadron Collider 1,500 times before decaying away. And this time, your hopes would be absolutely true: 100% of that energy, 6.5 TeV + 6.5 TeV = 13 TeV, would be available for particle creation.
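
    Those numbers are quick to reproduce with γ = E/mc² as the time-dilation factor; a minimal sketch (circumference and lifetime values approximate):

```python
MUON_MASS_GEV = 0.10566        # muon rest energy, GeV
MUON_LIFETIME_S = 2.2e-6       # mean lifetime at rest, as quoted above
RING_KM = 26.7                 # approximate LHC circumference
C_KM_S = 299792.458            # speed of light

def surviving_laps(energy_gev):
    """Average number of ring circuits a muon completes before decaying."""
    gamma = energy_gev / MUON_MASS_GEV       # relativistic time-dilation factor
    lab_lifetime = gamma * MUON_LIFETIME_S   # lifetime seen in the lab frame
    return lab_lifetime * C_KM_S / RING_KM   # distance travelled / circumference

print(surviving_laps(6500.0))  # ≈ 1,500 laps, matching the estimate above
```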

    A design plan for a full-scale muon-antimuon collider at Fermilab, the source of the world’s second-most powerful particle accelerator. Image credit: Fermilab.

    We can always build a bigger ring or invent stronger magnets, and we may well do exactly that. But there’s no cure for synchrotron radiation except to use heavier particles, and there’s no cure for energy spreading out among the components of composite particles other than not to use them at all. Muons are unstable and difficult to keep alive for a long time, but as we get to higher and higher energies, that task gets progressively easier. Muon colliders have long been touted as a mere pipe dream, but recent progress by the MICE collaboration — for Muon Ionization Cooling Experiment — has demonstrated that this may be possible after all. A circular muon/anti-muon collider may be the particle accelerator that takes us beyond the LHC’s reach, and, if we’re lucky, into the realm of the new physics we’re so desperately seeking.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    “Starts With A Bang! is a blog/video blog about cosmology, physics, astronomy, and anything else I find interesting enough to write about. I am a firm believer that the highest good in life is learning, and the greatest evil is willful ignorance. The goal of everything on this site is to help inform you about our world, how we came to be here, and to understand how it all works. As I write these pages for you, I hope to not only explain to you what we know, think, and believe, but how we know it, and why we draw the conclusions we do. It is my hope that you find this interesting, informative, and accessible,” says Ethan

     
  • richardmitnick 12:52 pm on March 16, 2017 Permalink | Reply
    Tags: Nautilus, Standard Model, Supersymmetry

    From Nautilus: “A Brief History of the Grand Unified Theory of Physics” 

    Nautilus


    March 16, 2017
    Lawrence M. Krauss
    Paintings by Jonathan Feldschuh

    Particle physicists had two nightmares before the Higgs particle was discovered in 2012. The first was that the Large Hadron Collider (LHC) particle accelerator would see precisely nothing.


    CERN ATLAS Higgs Event

    CERN ATLAS detector


    CERN CMS Higgs Event


    CERN CMS detector




    LHC at CERN

    For if it did, it would likely be the last large accelerator ever built to probe the fundamental makeup of the cosmos. The second was that the LHC would discover the Higgs particle predicted by theoretical physicist Peter Higgs in 1964 … and nothing else.

    Each time we peel back one layer of reality, other layers beckon. So each important new development in science generally leaves us with more questions than answers. But it also usually leaves us with at least the outline of a road map to help us begin to seek answers to those questions. The successful discovery of the Higgs particle, and with it the validation of the existence of an invisible background Higgs field throughout space (in the quantum world, every particle like the Higgs is associated with a field), was a profound validation of the bold scientific developments of the 20th century.

    Particles #22

    However, the words of Sheldon Glashow continue to ring true: The Higgs is like a toilet. It hides all the messy details we would rather not speak of. The Higgs field interacts with most elementary particles as they travel through space, producing a resistive force that slows their motion and makes them appear massive. Thus, the masses of elementary particles that we measure, and that make the world of our experience possible, are something of an illusion—an accident of our particular experience.

    As elegant as this idea might be, it is essentially an ad hoc addition to the Standard Model of physics—which explains three of the four known forces of nature, and how these forces interact with matter.


    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    It is added to the theory to do what is required to accurately model the world of our experience. But it is not required by the theory. The universe could have happily existed with massless particles and a long-range weak force (which, along with the strong force, gravity, and electromagnetism, make up the four known forces). We would just not be here to ask about them. Moreover, the detailed physics of the Higgs is undetermined within the Standard Model alone. The Higgs could have been 20 times heavier, or 100 times lighter.

    Why, then, does the Higgs exist at all? And why does it have the mass it does? (Recognizing that whenever scientists ask “Why?” we really mean “How?”) If the Higgs did not exist, the world we see would not exist, but surely that is not an explanation. Or is it? Ultimately to understand the underlying physics behind the Higgs is to understand how we came to exist. When we ask, “Why are we here?,” at a fundamental level we may as well be asking, “Why is the Higgs here?” And the Standard Model gives no answer to this question.

    Some hints do exist, however, coming from a combination of theory and experiment. Shortly after the fundamental structure of the Standard Model became firmly established, in 1974, and well before the details were experimentally verified over the next decade, two different groups of physicists at Harvard, where both Sheldon Glashow and Steven Weinberg were working, noticed something interesting. Glashow, along with Howard Georgi, did what Glashow did best: They looked for patterns among the existing particles and forces and sought out new possibilities using the mathematics of group theory.

    In the Standard Model the weak and electromagnetic forces of nature are unified at a high-energy scale, into a single force that physicists call the “electroweak force.” This means that the mathematics governing the weak and electromagnetic forces are the same, both constrained by the same mathematical symmetry, and the two forces are different reflections of a single underlying theory. But the symmetry is “spontaneously broken” by the Higgs field, which interacts with the particles that convey the weak force, but not the particles that convey the electromagnetic force. This accident of nature causes these two forces to appear as two separate and distinct forces at scales we can measure—with the weak force being short-range and electromagnetism remaining long-range.

    Georgi and Glashow tried to extend this idea to include the strong force, and discovered that all of the known particles and the three non-gravitational forces could naturally fit within a single fundamental symmetry structure. They then speculated that this symmetry could spontaneously break at some ultrahigh energy scale (and short distance scale) far beyond the range of current experiments, leaving two separate and distinct unbroken symmetries left over—resulting in separate strong and electroweak forces. Subsequently, at a lower energy and larger distance scale, the electroweak symmetry would break, separating the electroweak force into the short-range weak and the long-range electromagnetic force.

    They called such a theory, modestly, a Grand Unified Theory (GUT).

    At around the same time, Weinberg and Georgi along with Helen Quinn noticed something interesting—following the work of Frank Wilczek, David Gross, and David Politzer. While the strong interaction got weaker at smaller distance scales, the electromagnetic and weak interactions got stronger.

    It didn’t take a rocket scientist to wonder whether the strength of the three different interactions might become identical at some small distance scale. When they did the calculations, they found (with the accuracy with which the interactions were then measured) that such a unification looked possible, but only if the scale of unification was about 15 orders of magnitude smaller than the size of the proton.

    This was good news if the unified theory was the one proposed by Howard Georgi and Glashow—because if all the particles we observe in nature got unified this way, then new particles (called gauge bosons) would exist that produce transitions between quarks (which make up protons and neutrons), and electrons and neutrinos. That would mean protons could decay into other lighter particles, which we could potentially observe. As Glashow put it, “Diamonds aren’t forever.”

    Even then it was known that protons must have an incredibly long lifetime. Not just because we still exist almost 14 billion years after the big bang, but because we don’t all die of cancer as children. If protons decayed with an average lifetime smaller than about a billion billion years, then enough protons would decay in our bodies during our childhood to produce enough radiation to kill us. Remember that in quantum mechanics, processes are probabilistic. If an average proton lives a billion billion years, and if one has a billion billion protons, then on average one will decay each year. There are a lot more than a billion billion protons in our bodies.
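
    The arithmetic here is simple exponential decay: for a mean lifetime τ much longer than a year, a sample of N protons yields about N/τ decays per year. A rough sketch (the proton count for a human body is my own order-of-magnitude assumption, not a figure from the article):

```python
def decays_per_year(n_protons, lifetime_years):
    """Expected decays per year when the mean lifetime far exceeds one year."""
    return n_protons / lifetime_years

print(decays_per_year(1e18, 1e18))  # a billion billion protons, tau = 1e18 yr: ~1/yr
# A ~70 kg body holds very roughly 2e28 protons (about half of its ~4e28 nucleons),
# so a lifetime of 1e18 years would mean tens of billions of decays per year:
print(decays_per_year(2e28, 1e18))  # ~2e10 decays per year inside the body
```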

    However, with the incredibly small proposed distance scale and therefore the incredibly large mass scale associated with spontaneous symmetry breaking in Grand Unification, the new gauge bosons would get large masses. That would make the interactions they mediate so short-range that they would be unbelievably weak on the scale of protons and neutrons today. As a result, while protons could decay, they might live, in this scenario, perhaps a million billion billion billion years before decaying. Still time to hold onto your growth stocks.

    With the results of Glashow and Georgi, and Georgi, Quinn, and Weinberg, the smell of grand synthesis was in the air. After the success of the electroweak theory, particle physicists were feeling ambitious and ready for further unification.

    How would one know if these ideas were correct, however? There was no way to build an accelerator to probe an energy scale a million billion times greater than the rest mass energy of protons. Such a machine would have to have a circumference of the moon’s orbit. Even if it were possible, considering the earlier debacle over the Superconducting Super Collider, no government would ever foot the bill.


    Superconducting Super Collider map, in the vicinity of Waxahachie, Texas.

    Happily, there was another way, using the kind of probability arguments I just presented that give limits to the proton lifetime. If the new Grand Unified Theory predicted a proton lifetime of, say, a thousand billion billion billion years, then if one could put a thousand billion billion billion protons in a single detector, on average one of them would decay each year.

    Where could one find so many protons? Simple: in about 3,000 tons of water.
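
    That round number is easy to check: water has a molar mass of 18 grams per mole, and each H2O molecule carries 10 protons (8 in the oxygen nucleus plus 2 hydrogen nuclei). A quick sketch:

```python
AVOGADRO = 6.022e23
WATER_MOLAR_MASS_G = 18.0   # grams per mole of H2O
PROTONS_PER_MOLECULE = 10   # 8 protons in oxygen + 2 hydrogen nuclei

def protons_in_water(mass_tonnes):
    """Total number of protons in a given mass of water."""
    moles = mass_tonnes * 1e6 / WATER_MOLAR_MASS_G  # tonnes -> grams -> moles
    return moles * AVOGADRO * PROTONS_PER_MOLECULE

print(f"{protons_in_water(3000):.1e}")  # ≈ 1.0e33: a thousand billion billion billion
```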

    So all that was required was to get a tank of water, put it in the dark, make sure there were no radioactivity backgrounds, surround it with sensitive phototubes that can detect flashes of light in the detector, and then wait for a year to see a burst of light when a proton decayed. As daunting as this may seem, at least two large experiments were commissioned and built to do just this, one deep underground next to Lake Erie in a salt mine, and one in a mine near Kamioka, Japan. The mines were necessary to screen out incoming cosmic rays that would otherwise produce a background that would swamp any proton decay signal.

    Both experiments began taking data around 1982–83. Grand Unification seemed so compelling that the physics community was confident a signal would soon appear, and that Grand Unification would mean the culmination of a decade of amazing change and discovery in particle physics—not to mention another Nobel Prize for Glashow and maybe some others.

    Unfortunately, nature was not so kind in this instance. No signals were seen in the first year, the second, or the third. The simplest elegant model proposed by Glashow and Georgi was soon ruled out. But once the Grand Unification bug had caught on, it was not easy to let it go. Other proposals were made for unified theories that might cause proton decay to be suppressed beyond the limits of the ongoing experiments.

    On Feb. 23, 1987, however, another event occurred that demonstrates a maxim I have found is almost universal: Every time we open a new window on the universe, we are surprised. On that day a group of astronomers observed, in photographic plates obtained during the night, the closest exploding star (a supernova) seen in almost 400 years.

    NASA is celebrating the 30th anniversary of SN 1987A by releasing new data.

    The star, about 160,000 light-years away, was in the Large Magellanic Cloud—a small satellite galaxy of the Milky Way observable in the southern hemisphere.


    Large Magellanic Cloud. Adrian Pingstone December 2003

    If our ideas about exploding stars are correct, most of the energy released should be in the form of neutrinos, even though the visible light released is so great that supernovas are the brightest cosmic fireworks in the sky when they explode (at a rate of about one explosion per 100 years per galaxy). Rough estimates then suggested that the huge IMB (Irvine-Michigan-Brookhaven) and Kamiokande water detectors should see about 20 neutrino events.

    Irvine-Michigan-Brookhaven detector


    Super Kamiokande detector

    When the IMB and Kamiokande experimentalists went back and reviewed their data for that day, lo and behold IMB displayed eight candidate events in a 10-second interval, and Kamiokande displayed 11 such events. In the world of neutrino physics, this was a flood of data. The field of neutrino astrophysics had suddenly reached maturity. These 19 events produced perhaps 1,900 papers by physicists, such as me, who realized that they provided an unprecedented window into the core of an exploding star, and a laboratory not just for astrophysics but also for the physics of neutrinos themselves.

    Spurred on by the realization that large proton-decay detectors might serve a dual purpose as new astrophysical neutrino detectors, several groups began to build a new generation of such dual-purpose detectors. The largest one in the world was again built in the Kamioka mine and was called Super-Kamiokande, and with good reason. This mammoth 50,000-ton tank of water, surrounded by 11,800 phototubes, was operated in a working mine, yet the experiment was maintained with the purity of a laboratory clean room. This was absolutely necessary because in a detector of this size one had to worry not only about external cosmic rays, but also about internal radioactive contaminants in the water that could swamp any signals being searched for.

    Meanwhile, interest in a related astrophysical neutrino signature also reached a new high during this period. The sun produces neutrinos due to the nuclear reactions in its core that power it, and over 20 years, using a huge underground detector, physicist Ray Davis had detected solar neutrinos, but had consistently found an event rate about a factor of three below what was predicted using the best models of the sun. A new type of solar neutrino detector was built inside a deep mine in Sudbury, Canada, which became known as the Sudbury Neutrino Observatory (SNO).


    SNOLAB, Sudbury, Ontario, Canada.

    Super-Kamiokande has now been operating almost continuously, through various upgrades, for more than 20 years. No proton-decay signals have been seen, and no new supernovas observed. However, the precision observations of neutrinos at this huge detector, combined with complementary observations at SNO, definitively established that the solar neutrino deficit observed by Ray Davis is real, and moreover that it is not due to astrophysical effects in the sun but rather to the properties of neutrinos. The implication was that at least one of the three known types of neutrinos is not massless. Since the Standard Model does not accommodate neutrino masses, this was the first definitive observation that some new physics, beyond the Standard Model and beyond the Higgs, must be operating in nature.

    Soon after this, observations of higher-energy neutrinos, which are produced when high-energy cosmic-ray protons hit the atmosphere and create a downward shower of particles, demonstrated that a second type of neutrino also has mass. This mass is somewhat larger, but still far smaller than the mass of the electron. For these results, team leaders at SNO and Super-Kamiokande were awarded the 2015 Nobel Prize in Physics—a week before I wrote the first draft of these words. To date these tantalizing hints of new physics are not explained by current theories.

    The absence of proton decay, while disappointing, turned out to be not totally unexpected. Since Grand Unification was first proposed, the physics landscape had shifted slightly. More precise measurements of the actual strengths of the three non-gravitational interactions—combined with more sophisticated calculations of the change in the strength of these interactions with distance—demonstrated that if the particles of the Standard Model are the only ones existing in nature, the strength of the three forces will not unify at a single scale. In order for Grand Unification to take place, some new physics at energy scales beyond those that have been observed thus far must exist. The presence of new particles would not only change the energy scale at which the three known interactions might unify, it would also tend to drive up the Grand Unification scale and thus suppress the rate of proton decay—leading to predicted lifetimes in excess of a million billion billion billion years.
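
    The statement that the three couplings fail to meet with Standard Model particle content alone, but can meet once new particles are added, can be illustrated with the textbook one-loop running of the inverse couplings, 1/α_i(μ) = 1/α_i(M_Z) − (b_i/2π) ln(μ/M_Z). The sketch below uses approximate input values and ignores two-loop terms and threshold effects, so it is a caricature rather than a precision calculation; the supersymmetric coefficients anticipate the discussion later in the article.

```python
import math

M_Z = 91.19                          # Z boson mass in GeV, the starting scale
ALPHA_INV_MZ = (59.0, 29.6, 8.45)    # approximate 1/alpha_i at M_Z (GUT-normalized U(1))
B_SM = (41 / 10, -19 / 6, -7)        # one-loop coefficients, Standard Model only
B_SUSY = (33 / 5, 1, -3)             # coefficients with supersymmetric partners added

def alpha_inv(mu_gev, b):
    """One-loop running: 1/alpha_i(mu) = 1/alpha_i(M_Z) - b_i/(2*pi) * ln(mu/M_Z)."""
    t = math.log(mu_gev / M_Z)
    return [a - bi / (2 * math.pi) * t for a, bi in zip(ALPHA_INV_MZ, b)]

print(alpha_inv(2e16, B_SM))    # ≈ [37, 46, 45]: the three couplings miss each other
print(alpha_inv(2e16, B_SUSY))  # ≈ [24, 24, 24]: they nearly meet at a single scale
```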

    As these developments were taking place, theorists were driven by new mathematical tools to explore a possible new type of symmetry in nature, which became known as supersymmetry.


    Standard model of Supersymmetry DESY

    This fundamental symmetry is different from any previous known symmetry, in that it connects the two different types of particles in nature, fermions (particles with half-integer spins) and bosons (particles with integer spins). The upshot of this is that if this symmetry exists in nature, then for every known particle in the Standard Model at least one corresponding new elementary particle must exist. For every known boson there must exist a new fermion. For every known fermion there must exist a new boson.

    Since we haven’t seen these particles, this symmetry cannot be manifest in the world at the level we experience it, and it must be broken, meaning the new particles will all get masses that could be heavy enough so that they haven’t been seen in any accelerator constructed thus far.

    What could be so attractive about a symmetry that suddenly doubles all the particles in nature without any evidence of any of the new particles? In large part the seduction lay in the very fact of Grand Unification. Because if a Grand Unified theory exists at a mass scale of 15 to 16 orders of magnitude higher energy than the rest mass of the proton, this is also about 13 orders of magnitude higher than the scale of electroweak symmetry breaking. The big question is why and how such a huge difference in scales can exist for the fundamental laws of nature. In particular, if the Standard Model Higgs is the true last remnant of the Standard Model, then the question arises, Why is the energy scale of Higgs symmetry breaking 13 orders of magnitude smaller than the scale of symmetry breaking associated with whatever new field must be introduced to break the GUT symmetry into its separate component forces?

    ____________________________________________________________________________
    Following three years of LHC runs, there are no signs of supersymmetry whatsoever.
    ____________________________________________________________________________

    The problem is a little more severe than it appears. When one considers the effects of virtual particles (which appear and disappear on timescales so short that their existence can only be probed indirectly), including particles of arbitrarily large mass, such as the gauge particles of a presumed Grand Unified Theory, these tend to drive up the mass and symmetry-breaking scale of the Higgs so that it essentially becomes close to, or identical to, the heavy GUT scale. This generates a problem that has become known as the naturalness problem. It is technically unnatural to have a huge hierarchy between the scale at which the electroweak symmetry is broken by the Higgs particle and the scale at which the GUT symmetry is broken by whatever new heavy scalar field breaks that symmetry.

    The mathematical physicist Edward Witten argued in an influential paper in 1981 that supersymmetry had a special property. It could tame the effect that virtual particles of arbitrarily high mass and energy have on the properties of the world at the scales we can currently probe. Because virtual fermions and virtual bosons of the same mass produce quantum corrections that are identical except for a sign, if every boson is accompanied by a fermion of equal mass, then the quantum effects of the virtual particles will cancel out. This means that the effects of virtual particles of arbitrarily high mass and energy on the physical properties of the universe on scales we can measure would now be completely removed.

    If, however, supersymmetry is itself broken (as it must be or all the supersymmetric partners of ordinary matter would have the same mass as the observed particles and we would have observed them), then the quantum corrections will not quite cancel out. Instead they would yield contributions to masses that are the same order as the supersymmetry-breaking scale. If it was comparable to the scale of the electroweak symmetry breaking, then it would explain why the Higgs mass scale is what it is.

    And it also means we should expect to begin to observe a lot of new particles—the supersymmetric partners of ordinary matter—at the scale currently being probed at the LHC.

    This would solve the naturalness problem because it would protect the Higgs boson masses from possible quantum corrections that could drive them up to be as large as the energy scale associated with Grand Unification. Supersymmetry could allow a “natural” large hierarchy in energy (and mass) separating the electroweak scale from the Grand Unified scale.

    That supersymmetry could in principle solve the hierarchy problem, as it has become known, greatly increased its stock with physicists. It caused theorists to begin to explore realistic models that incorporated supersymmetry breaking and to explore the other physical consequences of this idea. When they did so, the stock price of supersymmetry went through the roof. For if one included the possibility of spontaneously broken supersymmetry into calculations of how the three non-gravitational forces change with distance, then suddenly the strength of the three forces would naturally converge at a single, very small-distance scale. Grand Unification became viable again!

    Models in which supersymmetry is broken have another attractive feature. It was pointed out, well before the top quark was discovered, that if the top quark was heavy, then through its interactions with other supersymmetric partners, it could produce quantum corrections to the Higgs particle properties that would cause the Higgs field to form a coherent background field throughout space at its currently measured energy scale if Grand Unification occurred at a much higher, superheavy scale. In short, the energy scale of electroweak symmetry breaking could be generated naturally within a theory in which Grand Unification occurs at a much higher energy scale. When the top quark was discovered and indeed was heavy, this added to the attractiveness of the possibility that supersymmetry breaking might be responsible for the observed energy scale of the weak interaction.

    _____________________________________________________________________
    In order for Grand Unification to take place, some new physics at energy scales beyond those that have been observed thus far must exist.
    _____________________________________________________________________

    All of this comes at a cost, however. For the theory to work, there must be two Higgs bosons, not just one. Moreover, one would expect to begin to see the new supersymmetric particles if one built an accelerator such as the LHC, which could probe for new physics near the electroweak scale. Finally, in what looked for a while like a rather damning constraint, the lightest Higgs in the theory could not be too heavy or the mechanism wouldn’t work.

    As searches for the Higgs continued without yielding any results, accelerators began to push closer and closer to the theoretical upper limit on the mass of the lightest Higgs boson in supersymmetric theories. The value was something like 135 times the mass of the proton, with details to some extent depending on the model. If the Higgs could have been ruled out up to that scale, it would have suggested all the hype about supersymmetry was just that.

    Well, things turned out differently. The Higgs that was observed at the LHC has a mass about 125 times the mass of the proton. Perhaps a grand synthesis was within reach.

    The answer at present is … not so clear. The signatures of new supersymmetric partners of ordinary particles should be so striking at the LHC, if they exist, that many of us thought that the LHC had a much greater chance of discovering supersymmetry than it did of discovering the Higgs. It didn’t turn out that way. Following three years of LHC runs, there are no signs of supersymmetry whatsoever. The situation is already beginning to look uncomfortable. The lower limits that can now be placed on the masses of supersymmetric partners of ordinary matter are getting higher. If they get too high, then the supersymmetry-breaking scale would no longer be close to the electroweak scale, and many of the attractive features of supersymmetry breaking for resolving the hierarchy problem would go away.

    But the situation is not yet hopeless, and the LHC has been turned on again, this time at higher energy. It could be that supersymmetric particles will soon be discovered.

    If they are, this will have another important consequence. One of the bigger mysteries in cosmology is the nature of the dark matter that appears to dominate the mass of all galaxies we can see.


    Dark matter cosmic web and the large-scale structure it forms. The Millennium Simulation, V. Springel et al.

    There is so much of it that it cannot be made of the same particles as normal matter. If it were, for example, the predictions of the abundance of light elements such as helium produced in the big bang would no longer agree with observation. Thus physicists are reasonably certain that the dark matter is made of a new type of elementary particle. But what type?

    Well, the lightest supersymmetric partner of ordinary matter is, in most models, absolutely stable and has many of the properties of neutrinos. It would be weakly interacting and electrically neutral, so that it wouldn’t absorb or emit light. Moreover, calculations that I and others performed more than 30 years ago showed that the remnant abundance today of the lightest supersymmetric particle left over after the big bang would naturally be in the range so that it could be the dark matter dominating the mass of galaxies.

    In that case our galaxy would have a halo of dark matter particles whizzing throughout it, including through the room in which you are reading this. As a number of us also realized some time ago, this means that if one designs sensitive detectors and puts them underground, not unlike, at least in spirit, the neutrino detectors that already exist underground, one might directly detect these dark matter particles. Around the world a half dozen beautiful experiments are now going on to do just that. So far nothing has been seen, however.

    So, we are in potentially the best of times or the worst of times. A race is going on between the detectors at the LHC and the underground direct dark matter detectors to see who might discover the nature of dark matter first. If either group reports a detection, it will herald the opening up of a whole new world of discovery, leading potentially to an understanding of Grand Unification itself. And if no discovery is made in the coming years, we might rule out the notion of a simple supersymmetric origin of dark matter—and in turn rule out the whole notion of supersymmetry as a solution of the hierarchy problem. In that case we would have to go back to the drawing board, except that if we don’t see any new signals at the LHC, we will have little guidance about which direction to head in order to derive a model of nature that might actually be correct.

    Things got more interesting when the LHC reported a tantalizing possible signal due to a new particle about six times heavier than the Higgs particle. This particle did not have the characteristics one would expect for any supersymmetric partner of ordinary matter. In general the most exciting spurious hints of signals go away when more data are amassed, and about six months after this signal first appeared, after more data were amassed, it disappeared. If it had not, it could have changed everything about the way we think about Grand Unified Theories and electroweak symmetry, suggesting instead a new fundamental force and a new set of particles that feel this force. But while it generated many hopeful theoretical papers, nature seems to have chosen otherwise.

    The absence of clear experimental direction or confirmation of supersymmetry has thus far not bothered one group of theoretical physicists. The beautiful mathematical aspects of supersymmetry encouraged, in 1984, the resurrection of an idea that had been dormant since the 1960s when Yoichiro Nambu and others tried to understand the strong force as if it were a theory of quarks connected by string-like excitations. When supersymmetry was incorporated in a quantum theory of strings, to create what became known as superstring theory, some amazingly beautiful mathematical results began to emerge, including the possibility of unifying not just the three non-gravitational forces, but all four known forces in nature into a single consistent quantum field theory.

    However, the theory requires a host of new spacetime dimensions to exist, none of which has been, as yet, observed. Also, the theory makes no other predictions that are yet testable with currently conceived experiments. And the theory has recently gotten a lot more complicated so that it now seems that strings themselves are probably not even the central dynamical variables in the theory.

    None of this dampened the enthusiasm of a hard core of dedicated and highly talented physicists who have continued to work on superstring theory, now called M-theory, over the 30 years since its heyday in the mid-1980s. Great successes are periodically claimed, but so far M-theory lacks the key element that makes the Standard Model such a triumph of the scientific enterprise: the ability to make contact with the world we can measure, resolve otherwise inexplicable puzzles, and provide fundamental explanations of how our world has arisen as it has. This doesn’t mean M-theory isn’t right, but at this point it is mostly speculation, although well-meaning and well-motivated speculation.

    It is worth remembering that if the lessons of history are any guide, most forefront physical ideas are wrong. If they weren’t, anyone could do theoretical physics. It took several centuries or, if one counts back to the science of the Greeks, several millennia of hits and misses to come up with the Standard Model.

    So this is where we are. Are great new experimental insights just around the corner that may validate, or invalidate, some of the grander speculations of theoretical physicists? Or are we on the verge of a desert where nature will give us no hint of what direction to search in to probe deeper into the underlying nature of the cosmos? We’ll find out, and we will have to live with the new reality either way.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Welcome to Nautilus. We are delighted you joined us. We are here to tell you about science and its endless connections to our lives. Each month we choose a single topic. And each Thursday we publish a new chapter on that topic online. Each issue combines the sciences, culture and philosophy into a single story told by the world’s leading thinkers and writers. We follow the story wherever it leads us. Read our essays, investigative reports, and blogs. Fiction, too. Take in our games, videos, and graphic stories. Stop in for a minute, or an hour. Nautilus lets science spill over its usual borders. We are science, connected.

     
  • richardmitnick 3:32 pm on February 17, 2017 Permalink | Reply
    Tags: A minimal extension to the standard model of particle physics involves six new particles, Astrophysical observations suggest that the mysterious dark matter is more than five times as common, Model Tries to Solve Five Physics Problems at Once, Physical Review Letters, Standard Model, The particles are three heavy right-handed neutrinos and a color triplet fermion and a particle called rho that both gives mass to the right-handed neutrinos and drives cosmic inflation together with the Higgs boson

    From DESY: “Solving five big questions in particle physics in a SMASH” 

    DESY

    2017/02/16
    No writer credit found

    Extension of the standard model provides complete and consistent description of the history of the universe.

    The extremely successful standard model of particle physics has an unfortunate limitation: the current version is only able to explain about 15 percent of the matter found in the universe.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    Although it describes and categorises all the known fundamental particles and interactions, it does so only for the type of matter we are familiar with. However, astrophysical observations suggest that the mysterious dark matter is more than five times as common. An international team of theoretical physicists has now come up with an extension to the standard model which could not only explain dark matter but at the same time solve five major problems faced by particle physics at one stroke. Guillermo Ballesteros, from the University of Paris-Saclay, and his colleagues are presenting their SMASH model (“Standard Model Axion Seesaw Higgs portal inflation” model) in the journal Physical Review Letters.

    The history of the universe according to SMASH, denoting the different phases and the dominant energies of the epochs since the Big Bang. Credit: DESY


    Model Tries to Solve Five Physics Problems at Once

    A minimal extension to the standard model of particle physics involves six new particles. http://physics.aps.org/synopsis-for/10.1103/PhysRevLett.118.071802

    The standard model has enjoyed a happy life. Ever since it was proposed four decades ago, it has passed all particle physics tests with flying colors. But it has several sticky problems. For instance, it doesn’t explain why there’s more matter than antimatter in the cosmos. A quartet of theorists from Europe has now taken a stab at solving five of these problems in one go. The solution is a model dubbed SMASH, which extends the standard model in a minimal fashion.

    SMASH adds six new particles to the seventeen fundamental particles of the standard model. The particles are three heavy right-handed neutrinos, a color triplet fermion, a particle called rho that both gives mass to the right-handed neutrinos and drives cosmic inflation together with the Higgs boson, and an axion, which is a promising dark matter candidate. With these six particles, SMASH does five things: produces the matter–antimatter imbalance in the Universe; creates the mysterious tiny masses of the known left-handed neutrinos; explains an unusual symmetry of the strong interaction that binds quarks in nuclei; accounts for the origin of dark matter; and explains inflation.

    The jury is out on whether the model will fly. For one thing, it doesn’t tackle the so-called hierarchy problem and the cosmological constant problem. On the plus side, it makes clear predictions, which the authors say can be tested with future data from observations of the cosmic microwave background and from experiments searching for axions. One prediction is that axions should have a mass between 50 and 200 μeV. Over to the experimentalists, then.

    This research is published in Physical Review Letters.

    “SMASH was actually developed from the bottom up,” explains DESY’s Andreas Ringwald, who co-authored the study. “We started off with the standard model and only added as few new concepts as were necessary in order to answer the unresolved issues.” To do this, the scientists combined various different existing theoretical approaches and came up with a simple, uniform model. SMASH adds a total of six new particles to the standard model: three heavy, right-handed neutrinos and an additional quark, as well as a so-called axion and the heavy rho (ρ) particle. The latter two form a new field which extends throughout the entire universe.

    Using these extensions, the scientists were able to solve five problems: the axion is a candidate for dark matter, which astrophysical observations suggest is five times more ubiquitous than the familiar matter described by the standard model. The heavy neutrinos explain the masses of the already known, very light neutrinos; and the rho interacts with the Higgs boson to produce so-called cosmic inflation, a period during which the entire young universe suddenly expanded by a factor of at least one hundred septillion for hitherto unknown reasons. In addition, SMASH explains why our universe contains so much more matter than antimatter, even though equal amounts must have been created during the big bang, and it reveals why no violation of so-called CP symmetry is observed in the strong force, one of the fundamental interactions.

    The particles of the standard model (SM, left) and of the extension SMASH (right). Credit: Carlos Tamarit, University of Durham

    “Overall, the resulting description of the history of the universe is complete and consistent, from the period of inflation to the present day. And unlike many older models, the individual important values can be calculated to a high level of precision, for example the time at which the universe starts heating up again after inflation,” emphasises Ringwald.

    Being able to calculate these values with such precision means that SMASH could potentially be tested experimentally within the next ten years. “The good thing about SMASH is that the theory is falsifiable. For example, it contains very precise predictions of certain features of the so-called cosmic microwave background. Future experiments that measure this radiation with even greater precision could therefore soon rule out SMASH – or else confirm its predictions,” explains Ringwald. A further test of the model is the search for axions. Here too, the model is able to make accurate predictions, and if axions do indeed account for the bulk of dark matter in the universe, then SMASH requires them to have a mass of 50 to 200 micro-electronvolts, in the units conventionally used in particle physics. Experiments that examine dark matter more precisely could soon test this prediction too.

    Javier Redondo from the University of Saragossa in Spain and Carlos Tamarit from the University of Durham in England were also involved in the study.

    Read the APS synopsis: http://physics.aps.org/synopsis-for/10.1103/PhysRevLett.118.071802

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition


    DESY is one of the world’s leading accelerator centres. Researchers use the large-scale facilities at DESY to explore the microcosm in all its variety – from the interactions of tiny elementary particles and the behaviour of new types of nanomaterials to biomolecular processes that are essential to life. The accelerators and detectors that DESY develops and builds are unique research tools. The facilities generate the world’s most intense X-ray light, accelerate particles to record energies and open completely new windows onto the universe. 
That makes DESY not only a magnet for more than 3000 guest researchers from over 40 countries every year, but also a coveted partner for national and international cooperations. Committed young researchers find an exciting interdisciplinary setting at DESY. The research centre offers specialized training for a large number of professions. DESY cooperates with industry and business to promote new technologies that will benefit society and encourage innovations. This also benefits the metropolitan regions of the two DESY locations, Hamburg and Zeuthen near Berlin.

     