Tagged: Supersymmetry (SUSY)

  • richardmitnick 12:10 pm on April 23, 2019
    Tags: "Falsifiability and physics", , , , , , , , Karl Popper (1902-1994) "The Logic of Scientific Discovery", , Supersymmetry (SUSY),   

    From Symmetry: “Falsifiability and physics” 

    From Symmetry

    04/23/19
    Matthew R. Francis

    Illustration by Sandbox Studio, Chicago with Corinne Mucha

    Can a theory that isn’t completely testable still be useful to physics?

    What determines if an idea is legitimately scientific or not? This question has been debated by philosophers and historians of science, working scientists, and lawyers in courts of law. That’s because it’s not merely an abstract notion: What makes something scientific or not determines if it should be taught in classrooms or supported by government grant money.

    The answer is relatively straightforward in many cases: Despite conspiracy theories to the contrary, the Earth is not flat. Literally all evidence is in favor of a round and rotating Earth, so statements based on a flat-Earth hypothesis are not scientific.

    In other cases, though, people actively debate where and how the demarcation line should be drawn. One such criterion was proposed by philosopher of science Karl Popper (1902-1994), who argued that scientific ideas must be subject to “falsification.”

    Popper wrote in his classic book The Logic of Scientific Discovery that a theory that cannot be proven false—that is, a theory flexible enough to encompass every possible experimental outcome—is scientifically useless. He wrote that a scientific idea must contain the key to its own downfall: It must make predictions that can be tested and, if those predictions are proven false, the theory must be jettisoned.

    When writing this, Popper was less concerned with physics than he was with theories like Freudian psychology and Stalinist history. These, he argued, were not falsifiable because they were vague or flexible enough to incorporate all the available evidence and therefore immune to testing.

    But where does this falsifiability requirement leave certain areas of theoretical physics? String theory, for example, involves physics on extremely small length scales unreachable by any foreseeable experiment.

    String theory depiction: cross section of the quintic Calabi–Yau manifold. Image: Jbourjai (using Mathematica output)

    Cosmic inflation, a theory that explains much about the properties of the observable universe, may itself be untestable through direct observations.

    Some critics believe these theories are unfalsifiable and, for that reason, are of dubious scientific value.

    At the same time, many physicists align with philosophers of science who identified flaws in Popper’s model, saying falsification is most useful in identifying blatant pseudoscience (the flat-Earth hypothesis, again) but relatively unimportant for judging theories growing out of established paradigms in science.

    “I think we should be worried about being arrogant,” says Chanda Prescod-Weinstein of the University of New Hampshire. “Falsifiability is important, but so is remembering that nature does what it wants.”

    Prescod-Weinstein is both a particle cosmologist and researcher in science, technology, and society studies, interested in analyzing the priorities scientists have as a group. “Any particular generation deciding that they’ve worked out all that can be worked out seems like the height of arrogance to me,” she says.

    Tracy Slatyer of MIT agrees, and argues that stringently worrying about falsification can prevent new ideas from germinating, stifling creativity. “In theoretical physics, the vast majority of all the ideas you ever work on are going to be wrong,” she says. “They may be interesting ideas, they may be beautiful ideas, they may be gorgeous structures that are simply not realized in our universe.”

    Particles and practical philosophy

    Take, for example, supersymmetry. SUSY is an extension of the Standard Model in which each known particle is paired with a supersymmetric partner.

    Standard Model of Supersymmetry via DESY

    The theory is a natural outgrowth of a mathematical symmetry of spacetime, in ways similar to the Standard Model itself. It’s well established within particle physics, even though supersymmetric particles, if they exist, may be out of scientists’ experimental reach.

    SUSY could potentially resolve some major mysteries in modern physics. For one, all of those supersymmetric particles could be the reason the mass of the Higgs boson is smaller than quantum mechanics says it should be.

    CERN CMS Higgs Event


    CERN ATLAS Higgs Event

    “Quantum mechanics says that [the Higgs boson] mass should blow up to the largest mass scale possible,” says Howard Baer of the University of Oklahoma. That’s because masses in quantum theory are the result of contributions from many different particles involved in interactions—and the Higgs field, which gives other particles mass, racks up a lot of these interactions. But the Higgs mass isn’t huge, which requires an explanation.

    “Something else would have to be tuned to a huge negative [value] in order to cancel [the huge positive value of those interactions] and give you the observed value,” Baer says. That level of coincidence, known as a “fine-tuning problem,” makes physicists itchy. “It’s like trying to play the lottery. It’s possible you might win, but really you’re almost certain to lose.”

    If SUSY particles turn up in a certain mass range, their contributions to the Higgs mass “naturally” solve this problem, which has been an argument in favor of the theory of supersymmetry. So far, the Large Hadron Collider has not turned up any SUSY particles in the range of “naturalness.”

    [Images: the LHC, CERN accelerator map, the LHC tunnel, and LHC particle collisions]

    However, the broad framework of supersymmetry can accommodate even more massive SUSY particles, which may or may not be detectable using the LHC. In fact, if naturalness is abandoned, SUSY doesn’t provide an obvious mass scale at all, meaning SUSY particles might be out of range for discovery with any earthly particle collider. That point has made some critics queasy: If there’s no obvious mass scale at which colliders can hunt for SUSY, is the theory falsifiable?

    A related problem confronts dark matter researchers: Despite strong indirect evidence for a large amount of mass invisible to all forms of light, particle experiments have yet to find any dark matter particles. It could be that dark matter particles are just impossible to directly detect. A small but vocal group of researchers has argued that we need to consider alternative theories of gravity instead.

    Fritz Zwicky, the father of dark matter research. No image credit found after long search.

    Astronomer Vera Rubin, who worked on dark matter, at the Lowell Observatory in 1965. (The Carnegie Institution for Science)

    U Washington ADMX Axion Dark Matter Experiment

    The DEAP-3600 dark matter detector, suspended in SNOLAB, deep in Sudbury’s Creighton Mine

    Dark Side-50 Dark Matter Experiment at Gran Sasso

    Slatyer, whose research involves looking for dark matter, considers the criticism partly as a problem of language. “When you say ‘dark matter,’ [you need] to distinguish dark matter from specific scenarios for what dark matter could be,” she says. “The community has not always done that well.”

    In other words, specific models for dark matter can stand or fall, but the dark matter paradigm as a whole has withstood all tests so far. But as Slatyer points out, no alternative theory of gravity can explain all the phenomena that a simple dark matter model can, from the behavior of galaxies to the structure of the cosmic microwave background.

    Prescod-Weinstein argues that we’re a long way from ruling out all dark matter possibilities. “How will we prove that the dark matter, if it exists, definitively doesn’t interact with the Standard Model?” she says. “Astrophysics is always a bit of a detective game. Without laboratory [detection of] dark matter, it’s hard to make definitive statements about its properties. But we can construct likely narratives based on what we know about its behavior.”

    Similarly, Baer thinks that we haven’t exhausted all the SUSY possibilities yet. “People say, ‘you’ve been promising supersymmetry for 20 or 30 years,’ but it was based on overly optimistic naturalness calculations,” he says. “I think if one evaluates the naturalness properly, then you find that supersymmetry is still even now very natural. But you’re going to need either an energy upgrade of LHC or an ILC [International Linear Collider] in order to discover it.”

    ILC schematic, being planned for the Kitakami highland, in the Iwate prefecture of northern Japan

    Beyond falsifiability of dark matter or SUSY, physicists are motivated by more mundane concerns. “Even if these individual scenarios are in principle falsifiable, how much money would [it] take and how much time would it take?” Slatyer says. In other words, rather than try to demonstrate or rule out SUSY as a whole, physicists focus on particle experiments that can be performed within a certain number of budgetary cycles. It’s not romantic, but it’s true nevertheless.

    Illustration by Sandbox Studio, Chicago with Corinne Mucha

    Is it science? Who decides?

    Historically, sometimes theories that seem untestable turn out to just need more time. For example, 19th century physicist Ludwig Boltzmann and colleagues showed they could explain many results in thermal physics and chemistry if everything were made up of “atoms”—what we call particles, atoms, and molecules today—governed by Newtonian physics.

    Since atoms were out of reach of experiments of the day, prominent philosophers of science argued that the atomic hypothesis was untestable in principle, and therefore unscientific.

    However, the atomists eventually won the day: J. J. Thomson demonstrated the existence of electrons, while Albert Einstein showed that collisions with water molecules could make grains of pollen suspended in water dance.

    Atoms provide a case study for how falsifiability proved to be the wrong criterion. Many other cases are trickier.

    For instance, Einstein’s theory of general relativity is one of the best-tested theories in all of science. At the same time, it allows for physically unrealistic “universes,” such as a “rotating” cosmos where movement back and forth in time is possible, which are contradicted by all observations of the reality we inhabit.

    General relativity also makes predictions about things that are untestable by definition, like how particles move inside the event horizon of a black hole: No information about these trajectories can be determined by experiment.

    The first image of a black hole, in Messier 87. Credit: Event Horizon Telescope Collaboration, via NSF, 4/10/19

    Yet no knowledgeable physicist or philosopher of science would argue that general relativity is unscientific. The success of the theory is due to enough of its predictions being testable.

    The Eddington expedition’s observation of gravitational lensing during the solar eclipse of 29 May 1919, which confirmed Einstein’s prediction

    Another type of theory may be mostly untestable, but have important consequences. One such theory is cosmic inflation, which (among other things) explains why we don’t see isolated magnetic monopoles and why the universe is a nearly uniform temperature everywhere we look.

    The key property of inflation—the extremely rapid expansion of spacetime during a tiny split second after the Big Bang—cannot be tested directly. Cosmologists look for indirect evidence for inflation, but in the end it may be difficult or impossible to distinguish between different inflationary models, simply because scientists can’t get the data. Does that mean it isn’t scientific?

    Alan Guth, from Highland Park High School and M.I.T., who first proposed cosmic inflation

    Lambda-Cold Dark Matter, Accelerated Expansion of the Universe, Big Bang-Inflation (timeline of the universe). Date: 2010. Credit: Alex Mittelmann, Coldcreation

    Alan Guth’s notes on inflation

    “A lot of people have personal feelings about inflation and the aesthetics of physical theories,” Prescod-Weinstein says. She’s willing to entertain alternative ideas which have testable consequences, but inflation works well enough for now to keep it around. “It’s also the case that the majority of the cosmology community continues to take inflation seriously as a model, so I have to shrug a little when someone says it’s not science.”

    On that note, Caltech cosmologist Sean M. Carroll argues that many very useful theories have both falsifiable and unfalsifiable predictions. Some aspects may be testable in principle, but not by any experiment or observation we can perform with existing technology. Many particle physics models fall into that category, but that doesn’t stop physicists from finding them useful. SUSY as a concept may not be falsifiable, but many specific models within the broad framework certainly are. All the evidence we have for the existence of dark matter is indirect, which won’t go away even if laboratory experiments never find dark matter particles. Physicists accept the concept of dark matter because it works.

    Slatyer is a practical dark matter hunter. “The questions I’m most interested in asking are not even just questions that are in principle falsifiable, but questions that in principle can be tested by data on the timescale of less than my lifetime,” she says. “But it’s not that only problems that can be tested by data on a timescale of ‘less than Tracy’s lifetime’ are good scientific questions!”

    Prescod-Weinstein agrees, and argues for keeping an open mind. “There’s a lot we don’t know about the universe, including what’s knowable about it. We are a curious species, and I think we should remain curious.”

    See the full article here.



    Please help promote STEM in your local schools.


    STEM Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 12:32 pm on April 18, 2019
    Tags: "When Beauty Gets in the Way of Science", , , , , , , , , , , Supersymmetry (SUSY)   

    From Nautilus: “When Beauty Gets in the Way of Science” 

    From Nautilus

    April 18, 2019
    Sabine Hossenfelder

    Insisting that new ideas must be beautiful blocks progress in particle physics.


    The biggest news in particle physics is no news. In March, one of the most important conferences in the field, Rencontres de Moriond, took place. It is an annual meeting at which experimental collaborations present preliminary results. But the recent data from the Large Hadron Collider (LHC), currently the world’s largest particle collider, has not revealed anything new.

    [Images: the LHC, CERN accelerator map, the LHC tunnel, and LHC particle collisions]

    Forty years ago, particle physicists thought themselves close to a final theory for the structure of matter. At that time, they formulated the Standard Model of particle physics to describe the elementary constituents of matter and their interactions.

    Standard Model of Particle Physics (LATHAM BOYLE AND MARDUS OF WIKIMEDIA COMMONS)

    After that, they searched for the predicted, but still missing, particles of the Standard Model. In 2012, they confirmed the last missing particle, the Higgs boson.

    CERN CMS Higgs Event

    CERN ATLAS Higgs Event

    The Higgs boson is necessary to make sense of the rest of the Standard Model. Without it, the other particles would not have masses, and probabilities would not properly add up to one. Now, with the Higgs in the bag, the Standard Model is complete; all Pokémon caught.

    HIGGS HANGOVER: After the Large Hadron Collider (above) confirmed the Higgs boson, which validated the Standard Model, it’s produced nothing newsworthy, and is unlikely to, says physicist Sabine Hossenfelder. Photo: Shutterstock

    The Standard Model may be physicists’ best shot at the structure of fundamental matter, but it leaves them wanting. Many particle physicists think it is simply too ugly to be nature’s last word. The 25 particles of the Standard Model can be classified by three types of symmetries that correspond to three fundamental forces: The electromagnetic force, and the strong and weak nuclear forces. Physicists, however, would rather there was only one unified force. They would also like to see an entirely new type of symmetry, the so-called “supersymmetry,” because that would be more appealing.

    Supersymmetry builds on the Standard Model, with many new supersymmetric particles, represented here with a tilde (~) on them. (From the movie “Particle Fever,” reproduced by Mark Levinson)

    Oh, and additional dimensions of space would be pretty. And maybe also parallel universes. Their wish list is long.

    It has become common practice among particle physicists to use arguments from beauty to select the theories they deem worthy of further study. These criteria of beauty are subjective and not evidence-based, but they are widely believed to be good guides to theory development. The most often used criteria of beauty in the foundations of physics are presently simplicity and naturalness.

    By “simplicity,” I don’t mean relative simplicity, the idea that the simplest theory is the best (a.k.a. “Occam’s razor”). Relying on relative simplicity is good scientific practice. The desire that a theory be simple in absolute terms, in contrast, is a criterion from beauty: There is no deep reason that the laws of nature should be simple. In the foundations of physics, this desire for absolute simplicity presently shows in physicists’ hope for unification or, if you push it one level further, in the quest for a “Theory of Everything” that would merge the three forces of the Standard Model with gravity.

    The other criterion of beauty, naturalness, requires that pure numbers that appear in a theory (i.e., those without units) should be neither very large nor very small; instead, these numbers should be close to one. Exactly how close these numbers should be to one is debatable, which is already an indicator of the non-scientific nature of this argument. Indeed, the inability of particle physicists to quantify just when a lack of naturalness becomes problematic highlights the fact that an unnatural theory is utterly unproblematic. It is just not beautiful.

    Anyone who has a look at the literature of the foundations of physics will see that relying on such arguments from beauty has been a major current in the field for decades. It has been propagated by big players in the field, including Steven Weinberg, Frank Wilczek, Edward Witten, Murray Gell-Mann, and Sheldon Glashow. Countless books popularized the idea that the laws of nature should be beautiful, written, among others, by Brian Greene, Dan Hooper, Gordon Kane, and Anthony Zee. Indeed, this talk about beauty has been going on for so long that at this point it seems likely most people presently in the field were attracted by it in the first place. Little surprise, then, they can’t seem to let go of it.

    Trouble is, relying on beauty as a guide to new laws of nature is not working.

    Since the 1980s, dozens of experiments have looked for evidence of unified forces, supersymmetric particles, and other particles invented to beautify the Standard Model. Physicists have conjectured hundreds of hypothetical particles, from “gluinos” and “wimps” to “branons” and “cuscutons,” each of which they invented to remedy a perceived lack of beauty in the existing theories. These particles are supposed to aid beauty, for example, by increasing the amount of symmetries, by unifying forces, or by explaining why certain numbers are small. Unfortunately, not a single one of those particles has ever been seen. Measurements have merely confirmed the Standard Model over and over again. And a theory of everything, if it exists, is as elusive today as it was in the 1970s. The Large Hadron Collider is only the most recent in a long series of searches that have failed to confirm those beauty-based predictions.

    These decades of failure show that postulating new laws of nature just because they are beautiful according to human standards is not a good way to put forward scientific hypotheses. It’s not the first time this has happened. Historical precedents are not difficult to find. Relying on beauty did not work for Kepler’s Platonic solids, it did not work for Einstein’s idea of an eternally unchanging universe, and it did not work for the oh-so-pretty idea, popular at the end of the 19th century, that atoms are knots in an invisible ether. All of these theories were once considered beautiful, but are today known to be wrong. Physicists have repeatedly told me about beautiful ideas that didn’t turn out to be beautiful at all. Such hindsight is not evidence that arguments from beauty work, but rather that our perception of beauty changes over time.

    That beauty is subjective is hardly a breakthrough insight, but physicists are slow to learn the lesson—and that has consequences. Experiments that test ill-motivated hypotheses are at high risk to only find null results; i.e., to confirm the existing theories and not see evidence of new effects. This is what has happened in the foundations of physics for 40 years now. And with the new LHC results, it happened once again.

    The data analyzed so far shows no evidence for supersymmetric particles, extra dimensions, or any other physics that would not be compatible with the Standard Model. In the past two years, particle physicists were excited about an anomaly in the interaction rates of different leptons. The Standard Model predicts these rates should be identical, but the data demonstrates a slight difference. This “lepton anomaly” has persisted in the new data, but—against particle physicists’ hopes—it did not increase in significance, and is hence not a sign of new particles. The LHC collaborations succeeded in measuring the violation of symmetry in the decay of composite particles called “D-mesons,” but the measured effect is, once again, consistent with the Standard Model. The data stubbornly repeat: Nothing new to see here.

    Of course it’s possible there is something to find in the data yet to be analyzed. But at this point we already know that all previously made predictions for new physics were wrong, meaning that there is now no reason to expect anything new to appear.

    Yes, null results—like the recent LHC measurements—are also results. They rule out some hypotheses. But null results are not very useful results if you want to develop a new theory. A null-result says: “Let’s not go this way.” A result says: “Let’s go that way.” If there are many ways to go, discarding some of them does not help much.

    To find the way forward in the foundations of physics, we need results, not null-results. When testing new hypotheses takes decades of construction time and billions of dollars, we have to be careful what to invest in. Experiments have become too costly to rely on serendipitous discoveries. Beauty-based methods have historically not worked. They still don’t work. It’s time that physicists take note.

    And it’s not like the lack of beauty is the only problem with the current theories in the foundations of physics. There are good reasons to think physics is not done. The Standard Model cannot be the last word, notably because it does not contain gravity and fails to account for the masses of neutrinos. It also describes neither dark matter nor dark energy, which are needed to explain galactic structure and the accelerating expansion of the universe, respectively.

    So, clearly, the foundations of physics have problems that require answers. Physicists should focus on those. And we currently have no reason to think that colliding particles at the next higher energies will help solve any of the existing problems. New effects may not appear until energies are a billion times higher than what even the next larger collider could probe. To make progress, then, physicists must, first and foremost, learn from their failed predictions.

    So far, they have not. In 2016, the particle physicists Howard Baer, Vernon Barger, and Jenny List wrote an essay for Scientific American arguing that we need a larger particle collider to “save physics.” The reason? A theory the authors had proposed themselves, that is natural (beautiful!) in a specific way, predicts such a larger collider should see new particles. This March, Kane, a particle physicist, used similar beauty-based arguments in an essay for Physics Today. And a recent comment in Nature Reviews Physics about a big, new particle collider planned in Japan once again drew on the same motivations from naturalness that have already not worked for the LHC. Even the particle physicists who have admitted their predictions failed do not want to give up beauty-based hypotheses. Instead, they have argued we need more experiments to test just how wrong they are.

    Will this latest round of null-results finally convince particle physicists that they need new methods of theory-development? I certainly hope so.

    As an ex-particle physicist myself, I understand very well the desire to have an all-encompassing theory for the structure of matter. I can also relate to the appeal of theories such as supersymmetry or string theory. And, yes, I quite like the idea that we live in one of infinitely many universes that together make up the “multiverse.” But, as the latest LHC results drive home once again, the laws of nature care heartily little about what humans find beautiful.

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

    Welcome to Nautilus. We are delighted you joined us. We are here to tell you about science and its endless connections to our lives. Each month we choose a single topic. And each Thursday we publish a new chapter on that topic online. Each issue combines the sciences, culture and philosophy into a single story told by the world’s leading thinkers and writers. We follow the story wherever it leads us. Read our essays, investigative reports, and blogs. Fiction, too. Take in our games, videos, and graphic stories. Stop in for a minute, or an hour. Nautilus lets science spill over its usual borders. We are science, connected.

     
  • richardmitnick 8:17 am on April 5, 2019
    Tags: In string theory a “solution” implies a vacuum of spacetime that is governed by Einstein’s theory of gravity coupled to a quantum field theory., In the past two decades a new branch of string theory called F-theory has allowed physicists to work with strongly interacting or strongly coupled strings, String theorists can use algebraic geometry to analyze the various ways of compactifying extra dimensions in F-theory and to find solutions., Supersymmetry (SUSY)

    From Scientific American: “Found: A Quadrillion Ways for String Theory to Make Our Universe” 

    From Scientific American

    Mar 29, 2019
    Anil Ananthaswamy

    Stemming from the “F-theory” branch of string theory, each solution replicates key features of the standard model of particle physics.

    Photo: dianaarturovna/Getty Images

    Physicists who have been roaming the “landscape” of string theory — the space of zillions and zillions of mathematical solutions of the theory, where each solution provides the kinds of equations physicists need to describe reality — have stumbled upon a subset of such equations that have the same set of matter particles as exists in our universe.

    String theory depiction: cross section of the quintic Calabi–Yau manifold. Image: Jbourjai (using Mathematica output)

    Standard Model of Supersymmetry via DESY

    But this is no small subset: there are at least a quadrillion such solutions, making it the largest such set ever found in string theory.

    According to string theory, all particles and fundamental forces arise from the vibrational states of tiny strings. For mathematical consistency, these strings vibrate in 10-dimensional spacetime. And for consistency with our familiar everyday experience of the universe, with three spatial dimensions and the dimension of time, the additional six dimensions are “compactified” so as to be undetectable.

    Different compactifications lead to different solutions. In string theory, a “solution” implies a vacuum of spacetime that is governed by Einstein’s theory of gravity coupled to a quantum field theory. Each solution describes a unique universe, with its own set of particles, fundamental forces and other such defining properties.

    Some string theorists have focused their efforts on trying to find ways to connect string theory to properties of our known, observable universe — particularly the standard model of particle physics, which describes all known particles and all their mutual forces except gravity.

    Much of this effort has involved a version of string theory in which the strings interact weakly. However, in the past two decades, a new branch of string theory called F-theory has allowed physicists to work with strongly interacting, or strongly coupled, strings.

    ____________________________________________________
    F-theory is a branch of string theory developed by Cumrun Vafa. The new vacua described by F-theory were discovered by Vafa and allowed string theorists to construct new realistic vacua — in the form of F-theory compactified on elliptically fibered Calabi–Yau four-folds. The letter “F” supposedly stands for “Father”.

    F-theory is formally a 12-dimensional theory, but the only way to obtain an acceptable background is to compactify this theory on a two-torus. By doing so, one obtains type IIB superstring theory in 10 dimensions. The SL(2,Z) S-duality symmetry of the resulting type IIB string theory is manifest because it arises as the group of large diffeomorphisms of the two-dimensional torus.

    More generally, one can compactify F-theory on an elliptically fibered manifold (elliptic fibration), i.e. a fiber bundle whose fiber is a two-dimensional torus (also called an elliptic curve). For example, a subclass of the K3 manifolds is elliptically fibered, and F-theory on a K3 manifold is dual to heterotic string theory on a two-torus. Also, the moduli spaces of those theories should be isomorphic.

    The large number of semirealistic solutions to string theory, referred to as the string theory landscape, with some 10^272,000 elements, is dominated by F-theory compactifications on Calabi–Yau four-folds. There are about 10^15 of those solutions consistent with the Standard Model of particle physics.

    -Wikipedia

    ____________________________________________________

    “An intriguing, surprising result is that when the coupling is large, we can start describing the theory very geometrically,” says Mirjam Cvetic of the University of Pennsylvania in Philadelphia.

    This means that string theorists can use algebraic geometry — which uses algebraic techniques to tackle geometric problems — to analyze the various ways of compactifying extra dimensions in F-theory and to find solutions. Mathematicians have been independently studying some of the geometric forms that appear in F-theory. “They provide us physicists a vast toolkit”, says Ling Lin, also of the University of Pennsylvania. “The geometry is really the key… it is the ‘language’ that makes F-theory such a powerful framework.”

    Now, Cvetic, Lin, James Halverson of Northeastern University in Boston, and their colleagues have used such techniques to identify a class of solutions with string vibrational modes that lead to a similar spectrum of fermions (or, particles of matter) as is described by the standard model — including the property that all fermions come in three generations (for example, the electron, muon and tau are the three generations of one type of fermion).

    Standard Model of Particle Physics (LATHAM BOYLE AND MARDUS OF WIKIMEDIA COMMONS)

    The F-theory solutions found by Cvetic and colleagues have particles that also exhibit the handedness, or chirality, of the standard model particles. In particle physics lingo, the solutions reproduce the exact “chiral spectrum” of standard model particles. For example, the quarks and leptons in these solutions come in left- and right-handed versions, as they do in our universe.

    The new work shows that there are at least a quadrillion solutions in which particles have the same chiral spectrum as the standard model, which is 10 orders of magnitude more solutions than had been found within string theory until now. “This is by far the largest domain of standard model solutions,” Cvetic says. “It’s somehow surprising and actually also rewarding that it turns out to be in the strongly coupled string theory regime, where geometry helped us.”

    A quadrillion — while it’s much, much smaller than the size of the landscape of solutions in F-theory (which at last count was shown to be of the order of 10^272,000) — is a tremendously large number. “And because it’s a tremendously large number, and it gets something nontrivial in real world particle physics correct, we should take it seriously and study it further,” Halverson says.

    Further study would involve uncovering stronger connections with the particle physics of the real world. The researchers still have to work out the couplings or interactions between particles in the F-theory solutions — which again depend on the geometric details of the compactifications of the extra dimensions.

    It could be that within the space of the quadrillion solutions, there are some with couplings that could cause the proton to decay within observable timescales. This would clearly be at odds with the real world, as experiments have yet to see any sign of protons decaying. Alternatively, physicists could search for solutions that realize the spectrum of standard model particles that preserve a mathematical symmetry called R-parity. “This symmetry forbids certain proton decay processes and would be very attractive from a particle physics point of view, but is missing in our current models,” Lin says.

    Also, the work assumes supersymmetry, which means that all the standard model particles have partner particles. String theory needs this symmetry in order to ensure the mathematical consistency of solutions.

    But in order for any supersymmetric theory to tally with the observable universe, the symmetry has to be broken (much like how a diner’s selection of cutlery and drinking glass on her left or right side will “break” the symmetry of the table setting at a round dinner table). Else, the partner particles would have the same mass as standard model particles — and that is clearly not the case, since we don’t observe any such partner particles in our experiments.

    Crucially, experiments at the Large Hadron Collider (LHC) have also shown that supersymmetry — if it is the correct description of nature — must be broken at energy scales beyond those the LHC has probed, given that the LHC has yet to find any supersymmetric particles.

    String theorists think that supersymmetry might be broken only at extremely high energies that are not within experimental reach anytime soon. “The expectation in string theory is that high-scale [supersymmetry] breaking, which is fully consistent with LHC data, is completely possible,” Halverson says. “It requires further analysis to determine whether or not it happens in our case.”

    Despite these caveats, other string theorists are approving of the new work. “This is definitely a step forward in demonstrating that string theory gives rise to many solutions with features of the standard model,” says string theorist Washington Taylor of MIT.

    “It’s very nice work,” says Cumrun Vafa, one of the developers of F-theory, at Harvard University. “The fact you can arrange the geometry and topology to fit with not only Einstein’s equations, but also with the [particle] spectrum that we want, is not trivial. It works out nicely here.”

    But Vafa and Taylor both caution that these solutions are far from matching perfectly with the standard model. Getting solutions to match exactly with the particle physics of our world is one of the ultimate goals of string theory. Vafa is among those who think that, despite the immensity of the landscape of solutions, there exists a unique solution that matches our universe. “I bet there is exactly one,” he says. But, “to pinpoint this is not going to be easy.”

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
  • richardmitnick 5:34 pm on August 30, 2018
    Tags: Borexino observatory, DarkSide experiment, Davide D’Angelo-physical scientist, Possible dark matter candidates: axions, gravitinos, Massive Astrophysical Compact Halo Objects (MACHOs) and Weakly Interacting Massive Particles (WIMPs), SABRE-Sodium Iodide with Active Background Rejection Experiment, Solar neutrinos-recently caught at U Wisconsin IceCube at the South Pole, Supersymmetry (SUSY), WIMPs that go by names like the gravitino sneutrino and neutralino

    From Gran Sasso via Motherboard: “The New Hunt for Dark Matter Is Taking Place Under a Mountain” 

    From Gran Sasso

    via

    Motherboard


    Aug 30 2018
    Daniel Oberhaus

    Davide D’Angelo wasn’t always interested in dark matter, but now he’s at the forefront of the hunt to find the most elusive particle in the universe.

    About an hour outside of Rome there’s a dense cluster of mountains known as the Gran Sasso d’Italia. Renowned for their natural beauty, the Gran Sasso are a popular tourist destination year round, offering world-class skiing in the winter and plenty of hiking and swimming opportunities in the summer. For the 43-year-old Italian physicist Davide D’Angelo, these mountains are like a second home. Unlike most people who visit Gran Sasso, however, D’Angelo spends more time under the mountains than on top of them.

    It’s here, in a cavernous hall thousands of feet beneath the earth, that D’Angelo works on a new generation of experiments dedicated to the hunt for dark matter particles, an exotic form of matter whose existence has been hypothesized for decades but never proven experimentally.

    Dark matter is thought to make up about 27 percent of the universe and characterizing this elusive substance is one of the most profound problems in contemporary physics. Although D’Angelo is optimistic that a breakthrough will occur in his lifetime, so was the last generation of physicists. In fact, there’s a decent chance that the particles D’Angelo is looking for don’t exist at all. Yet for physicists probing the fundamental nature of the universe, the possibility that they might spend their entire career “hunting ghosts,” as D’Angelo put it, is the price of advancing science.

    WHAT’S UNDER THE ‘GREAT STONE’?

    In 1989, Italy’s National Institute for Nuclear Physics opened the Gran Sasso National Laboratory, the world’s largest underground laboratory dedicated to astrophysics. Gran Sasso’s three cavernous halls were purposely built for physics, which is something of a luxury as far as research centers go. Most other underground astrophysics laboratories like SNOLAB are ad hoc facilities that repurpose old or active mine shafts, which limits the amount of time that can be spent in the lab and the types of equipment that can be used.


    SNOLAB, Sudbury, Ontario, Canada.

    Buried nearly a mile underground to protect it from the noisy cosmic rays that bathe the Earth, Gran Sasso is home to a number of particle physics experiments that are probing the foundations of the universe. For the last few years, D’Angelo has divided his time between the Borexino observatory and the Sodium Iodide with Active Background Rejection Experiment (SABRE), which are investigating solar neutrinos and dark matter, respectively.

    Borexino Solar Neutrino detector

    SABRE experiment at INFN Gran Sasso

    Davide D’Angelo with the SABRE proof of concept. Image: Xavier Aaronson/Motherboard

    Over the last 100 years, characterizing solar neutrinos and dark matter have been considered two of the most important tasks of particle physics. Today, the mystery of solar neutrinos is resolved, but the particles are still of great interest to physicists for the insight they provide into the fusion process occurring in our Sun and other stars. The composition of dark matter, however, is still considered to be one of the biggest questions in particle physics. Despite the radically different nature of the particles, they are united insofar as they both can only be discovered in environments where the background radiation is at a minimum: thousands of feet beneath the Earth’s surface.

    “The mountain acts as a shield so if you go below it, you have so-called ‘cosmic silence,’” D’Angelo said. “That’s the part of my research I like most: Going into the cave, putting my hands on the detector and trying to understand the signals I’m seeing.”

    After finishing grad school, D’Angelo got a job with Italy’s National Institute for Nuclear Physics where his research focused on solar neutrinos, a subatomic particle with no charge that is produced by fusion in the Sun. For the better part of four decades, solar neutrinos [recently caught at U Wisconsin IceCube at the South Pole] were at the heart of one of the largest mysteries in astrophysics.

    IceCube neutrino detector interior


    U Wisconsin ICECUBE neutrino detector at the South Pole

    The problem was that instruments measuring the energy from solar neutrinos returned results much lower than predicted by the Standard Model, the most accurate theory of fundamental particles in physics.

    Given how accurate the Standard Model had proven to be for other aspects of cosmology, physicists were reluctant to make alterations to it to account for the discrepancy. One possible explanation was that physicists had faulty models of the Sun and better measurements of its core pressure and temperature were needed. Yet after a string of observations in the 60s and 70s demonstrated that the models of the sun were essentially correct, physicists sought alternative explanations by turning to the neutrino.

    A TALE OF THREE NEUTRINOS

    Ever since they were first proposed by the Austrian physicist Wolfgang Pauli in 1930, neutrinos have been called upon to patch holes in theories. In Pauli’s case, he first posited the existence of an extremely light, chargeless particle as a “desperate remedy” to explain why the law of the conservation of energy appeared to be violated during radioactive decay. Three years later, the Italian physicist Enrico Fermi gave these hypothetical particles a name. He called them “neutrinos,” Italian for “little neutrons.”

    A quarter of a century after Pauli posited their existence, two American physicists reported the first evidence of neutrinos produced in a fission reactor. The following year, in 1957, Bruno Pontecorvo, an Italian physicist working in the Soviet Union, developed a theory of neutrino oscillations. At the time, little was known about the properties of neutrinos and Pontecorvo suggested that there might be more than one type of neutrino. If this were the case, Pontecorvo theorized that it could be possible for the neutrinos to switch between types.

    By 1975, part of Pontecorvo’s theory had been proven correct. Three different types, or “flavors,” of neutrino had been discovered: electron neutrinos, muon neutrinos, and tau neutrinos. Importantly, observations from an experiment in a South Dakota mineshaft had confirmed that the Sun produced electron neutrinos. The only issue was that the experiment detected far fewer neutrinos than the Standard Model predicted.

    [Images: the Sanford Underground Research Facility (SURF) in Lead, South Dakota, site of the Homestake mine and now home to the FNAL LBNF/DUNE project: the DUNE argon tank at SURF, the LBNF/DUNE caverns at Sanford, and the SURF building in Lead]

    Prior to the late 90s, there was scant indirect evidence that neutrinos could change from one flavor to another. In 1998, a group of researchers working in Japan’s Super-Kamiokande Observatory observed oscillations in atmospheric neutrinos, which are mostly produced by the interactions between photons and the Earth’s atmosphere.

    Super-Kamiokande experiment, located under Mount Ikeno near the city of Hida, Gifu Prefecture, Japan

    Three years later, Canada’s Sudbury Neutrino Observatory (SNO) provided the first direct evidence of oscillations from solar neutrinos.

    Sudbury Neutrino Observatory, no longer operating

    This was, to put it lightly, a big deal in cosmological physics. It effectively resolved the mystery of the missing solar neutrinos, or why experiments only observed about a third as many neutrinos radiating from the Sun compared to predictions made by the Standard Model. If neutrinos could oscillate between flavors, this means a neutrino that is emitted in the Sun’s core could be a different type of neutrino by the time it reaches Earth. Prior to the mid-80s, most experiments on Earth were only looking for electron neutrinos, which meant they were missing the other two flavors of neutrinos that were created en route from the Sun to the Earth.

    When SNO was dreamt up in the 80s, it was designed so that it would be capable of detecting all three types of neutrinos, instead of just electron neutrinos. This decision paid off. In 2015, the directors of the experiments at Super-Kamiokande and SNO shared the Nobel Prize in physics for resolving the mystery of the missing solar neutrinos.

    Although the mystery of solar neutrinos has been solved, there’s still plenty of science to be done to better understand them. Since 2007, Gran Sasso’s Borexino observatory has been refining the measurements of solar neutrino flux, which has given physicists unprecedented insight into the fusion process powering the Sun. From the outside, the Borexino observatory looks like a large metal sphere, but on the inside it looks like a technology transplanted from an alien world.

    Borexino detector. Image: INFN

    In the center of the sphere is basically a large, transparent nylon sack that is almost 30 feet in diameter and only half a millimeter thick. This sack contains a liquid scintillator, a chemical mixture that releases energy when a neutrino passes through it. This nylon sphere is suspended in 1,000 metric tons of a purified buffer liquid and surrounded by 2,200 sensors to detect energy released by electrons that are freed when neutrinos interact with the liquid scintillator. Finally, an outer buffer of nearly 3,000 tons of ultrapure water helps provide additional shielding for the detector. Taken together, the Borexino observatory has the most protection from outside radiation interference of any liquid scintillator in the world.

    For the last decade, physicists at Borexino—including D’Angelo, who joined the project in 2011—have been using this one-of-a-kind device to observe low energy solar neutrinos produced by proton collisions during the fusion process in the Sun’s core. Given how difficult it is to detect these chargeless, ultralight particles that hardly ever interact with matter, detecting the low energy solar neutrinos would be virtually impossible without such a sensitive machine. When SNO directly detected the first solar neutrino oscillations, for instance, it could only observe the highest energy solar neutrinos due to interference from background radiation. This amounted to only about 0.01 percent of all the neutrinos emitted by the Sun. Borexino’s sensitivity allows it to observe solar neutrinos whose energy is a full order of magnitude lower than those detected by SNO, opening the door for an incredibly refined model of solar processes as well as more exotic events like supernovae.

    “It took physicists 40 years to understand solar neutrinos and it’s been one of the most interesting puzzles in particle physics,” D’Angelo told me. “It’s kind of like how dark matter is now.”

    SHINING A LIGHT ON DARK MATTER

    If neutrinos were the mystery particle of the twentieth century, then dark matter is the particle conundrum for the new millennium. Just like Pauli proposed neutrinos as a “desperate remedy” to explain why experiments seemed to be violating one of the most fundamental laws of nature, the existence of dark matter particles is inferred because cosmological observations just don’t add up.

    In the early 1930s, the American astronomer Fritz Zwicky was studying the movement of a handful of galaxies in the Coma cluster, a collection of over 1,000 galaxies approximately 320 million light years from Earth.

    Fritz Zwicky, the father of dark matter research. No image credit found after long search.

    Vera Rubin did much of the work on proving the existence of Dark Matter. She and Fritz were both overlooked for the Nobel prize.

    Vera Rubin measuring spectra (Emilio Segre Visual Archives AIP SPL)


    Astronomer Vera Rubin at the Lowell Observatory in 1965. (The Carnegie Institution for Science)

    Using data published by Edwin Hubble, Zwicky calculated the mass of the entire Coma galaxy cluster.

    Coma cluster via NASA/ESA Hubble

    When he did, Zwicky noticed something odd about the velocity dispersion—the statistical distribution of the speeds of a group of objects—of the galaxies: The velocity dispersion was about 12 times higher than it should have been based on the amount of visible matter in the galaxies.

    Inside Gran Sasso. Image: Xavier Aaronson/Motherboard

    This was a surprising calculation and its significance wasn’t lost on Zwicky. “If this would be confirmed,” he wrote, “we would get the surprising result that dark matter is present in much greater amount than luminous matter.”

    The idea that the universe was made up mostly of invisible matter was a radical idea in Zwicky’s time and still is today. The main difference, however, is that astronomers now have much stronger empirical evidence pointing to its existence. This is mostly due to the American astronomer Vera Rubin, whose measurement of galactic rotations in the 1960s and 70s put the existence of dark matter beyond a doubt. In fact, based on Rubin’s measurements and subsequent observations, physicists now think dark matter makes up about 27 percent of the “stuff” in the universe, roughly five times more than the regular, baryonic matter we’re all familiar with. The burning question, then, is what is it made of?

    Since Rubin’s pioneering observations, a number of dark matter candidate particles have been proposed, but so far all of them have eluded detection by some of the world’s most sensitive instruments. Part of the reason for this is that physicists aren’t exactly sure what they’re looking for. In fact, a small minority of physicists think dark matter might not be a particle at all and is just an exotic gravitational effect. This makes designing dark matter experiments kind of like finding a car key in a stadium parking lot and trying to track down the vehicle it pairs with. There’s a pretty good chance the car is somewhere in the parking lot, but you’re going to have to try a lot of doors before you find your ride—if it even exists.

    Among the candidates for dark matter are subatomic particles with goofy names like axions, gravitinos, Massive Astrophysical Compact Halo Objects (MACHOs), and Weakly Interacting Massive Particles (WIMPs). D’Angelo and his colleagues at Gran Sasso have placed their bets on WIMPs, which until recently were considered to be the leading particle candidate for dark matter.

    Over the last few years, however, physicists have started to look at other possibilities after some critical tests failed to confirm the existence of WIMPs. WIMPs are a class of hypothetical elementary particles that hardly ever interact with regular baryonic matter and don’t emit light, which makes them exceedingly hard to detect. This problem is compounded by the fact that no one is really sure how to characterize a WIMP. Needless to say, it’s hard to find something if you’re not even really sure what you’re looking for.

    So why would physicists think WIMPs exist at all? In the 1970s, physicists conceptualized the Standard Model of particle physics, which posited that everything in the universe was made out of a handful of fundamental particles.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.


    Standard Model of Particle Physics from Symmetry Magazine

    The Standard Model works great at explaining almost everything the universe throws at it, but it’s still incomplete since it doesn’t incorporate gravity into the model.

    Gravity measured with two slightly different torsion pendulum setups, yielding slightly different results

    In the 1980s, an extension of the Standard Model called Supersymmetry emerged, which hypothesizes that each fundamental particle in the Standard Model has a partner.

    Standard model of Supersymmetry DESY

    These particle pairs are known as supersymmetric particles and are used as the theoretical explanation for a number of mysteries in Standard Model physics, such as the mass of the Higgs boson and the existence of dark matter. Some of the most complex and expensive experiments in the world, like the Large Hadron Collider particle accelerator, were created in an effort to discover these supersymmetric particles, but so far there’s been no experimental evidence that these particles actually exist.

    [Images: the LHC, CERN accelerator map, the LHC tunnel, and LHC particle collisions]

    Many of the lightest particles theorized in the supersymmetric model are WIMPs and go by names like the gravitino, sneutrino and neutralino. The latter is still considered to be the leading candidate for dark matter by many physicists and is thought to have formed in abundance in the early universe. Detecting evidence of this ancient theoretical particle is the goal of many dark matter experiments, including the one D’Angelo works on at Gran Sasso.

    D’Angelo told me he became interested in dark matter a few years after joining the Gran Sasso laboratory and began contributing to the laboratory’s DarkSide experiment, which seemed like a natural extension of his work on solar neutrinos. DarkSide is essentially a large tank filled with liquid argon and equipped with incredibly sensitive sensors. If WIMPs exist, physicists expect to detect them from the ionization produced through their collision with the argon nuclei.

    Dark Side-50 Dark Matter Experiment at Gran Sasso

    The DarkSide experiment has been running at Gran Sasso since 2013 and D’Angelo said it is expected to continue for several more years. These days, however, he’s found himself involved with a different dark matter experiment at Gran Sasso called SABRE [above], which will also look for direct evidence of dark matter particles based on the light produced when energy is released through their collision with sodium-iodide crystals.

    The setup of the SABRE experiment is deliberately similar to another experiment that has been running at Gran Sasso since 1995 called DAMA. In 2003, the DAMA experiment began looking for seasonal fluctuations in dark matter particles that were predicted in the 1980s as a consequence of the relative motion of the Sun and Earth with respect to the rest of the galaxy. The theory posited that the relative speed of any dark matter particles detected on Earth should peak in June and bottom out in December.

    Over the course of nearly 15 years, DAMA did in fact register seasonal fluctuations in its detectors that were in accordance with this theory and the expected signature of a dark matter particle. In short, it seemed as if DAMA was the first experiment in the world to detect a dark matter particle. The problem, however, was that DAMA couldn’t completely rule out the possibility that the signature it had detected was in fact due to some other seasonal variation on Earth, rather than the ebb and flow of dark matter as the Earth revolved around the Sun.

    SABRE aims to remove the ambiguities in DAMA’s data. Once the kinks are worked out in the testing equipment, the Gran Sasso experiment will become one half of SABRE; the other half will be located in a converted gold mine in Australia. Running one detector in the northern hemisphere and one in the southern hemisphere should help eliminate false positives caused by ordinary seasonal fluctuations. At the moment, the SABRE detector is still in a proof-of-principle phase and is expected to begin observations in both hemispheres within the next few years.

    When it comes to SABRE, it’s possible that the experiment may disprove the best evidence physicists have found so far for a dark matter particle. But as D’Angelo pointed out, this type of disappointment is a fundamental part of science.

    “Of course I am afraid that there might not be any dark matter there and we are hunting ghosts, but science is like this,” D’Angelo said. “Sometimes you spend several years looking for something and in the end it’s not there so you have to change the way you were thinking about things.”

    For D’Angelo, probing the subatomic world with neutrino and dark matter research from a cave in Italy is his way of connecting to the universe writ large.

    “The tiniest elements of nature are bonded to the most macroscopic phenomena, like the expansion of the universe,” D’Angelo said. “The infinitely small touches the infinitely big in this sense, and I find that fascinating. The physics I do, its goal is to push over the boundary of human knowledge.”

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Gran Sasso LABORATORI NAZIONALI del GRAN SASSO, located in the Abruzzo region of central Italy

    INFN’s Gran Sasso National Laboratory (LNGS) is the largest underground laboratory in the world devoted to neutrino and astroparticle physics, a worldwide research facility for scientists working where particle physics, cosmology and astrophysics meet. It is unequalled anywhere else, offering the most advanced underground infrastructure in terms of dimensions, complexity and completeness.

    LNGS is funded by the National Institute for Nuclear Physics (INFN), the Italian institution charged with coordinating and supporting research in elementary particle, nuclear and subnuclear physics.

    Located between L’Aquila and Teramo, about 120 kilometres from Rome, the underground structures sit alongside the 10-kilometre highway tunnel that crosses the Gran Sasso massif (towards Rome); the underground complex consists of three huge experimental halls (each 100 metres long, 20 metres wide and 18 metres high) and bypass tunnels, for a total volume of about 180,000 m³.

    Access to the experimental halls is horizontal, made easier by the highway tunnel. The halls are equipped with all the technical and safety equipment and plants necessary for the experimental activities and for proper working conditions for the people involved.

    The 1400 metres of rock above the Laboratory provide natural shielding that reduces the cosmic ray flux by a factor of one million; moreover, the neutron flux in the underground halls is about a thousand times lower than at the surface, owing to the very small amount of uranium and thorium in the dolomite limestone of the mountain.
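
    As a sanity check on that factor of a million, one can start from the textbook sea-level muon flux of roughly one muon per square centimetre per minute (an assumed round number, not a figure from the article) and see what survives underground.

```python
# Rough consistency check of the quoted cosmic-ray reduction factor.
surface_flux = 1.0 / 60.0   # ~1 muon per cm^2 per minute at sea level (assumed)
reduction = 1e6             # shielding factor quoted for the Gran Sasso rock

underground = surface_flux / reduction        # muons per cm^2 per second
per_m2_per_hour = underground * 1e4 * 3600    # convert to muons per m^2 per hour
print(f"~{per_m2_per_hour:.1f} muons per m^2 per hour underground")  # ~0.6
```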

    This shielding from cosmic radiation, together with the huge dimensions and the impressive basic infrastructure, makes the Laboratory unmatched in the detection of the weak or rare signals relevant to astroparticle, subnuclear and nuclear physics.

    Outside, immersed in a National Park of exceptional environmental and naturalistic interest on the slopes of the Gran Sasso mountain chain, an area of more than 23 acres hosts laboratories and workshops, the Computing Centre, the Directorate and several other offices.

    Currently 1100 scientists from 29 countries take part in the experimental activities of LNGS.
    LNGS research activities range from neutrino physics to the dark matter search and nuclear astrophysics, and extend to earth physics, biology and fundamental physics.

     
    • Marco Pereira 2:43 pm on September 1, 2018 Permalink | Reply

      I created a theory called the Hypergeometrical Universe Theory (HU). This theory uses three hypotheses:
      a) The Universe is a lightspeed-expanding hyperspherical hypersurface. This was later proven correct by observations from the Sloan Digital Sky Survey
      https://hypergeometricaluniverse.quora.com/Proof-of-an-Extra-Spatial-Dimension
      b) Matter is made directly and simply from coherences between stationary states of deformation of the local metric called Fundamental Dilator or FD.
      https://hypergeometricaluniverse.quora.com/The-Fundamental-Dilator
      c) FDs obey the Quantum Lagrangian Principle (QLP). Yves Couder had a physical implementation (approximation) of the Fundamental Dilator and was perplexed that it behaved quantum mechanically. FDs and the QLP are the reason for Quantum Mechanics. The QLP replaces Newtonian dynamics and allows for the derivation of Quantum Gravity, or Gravity as applied to Black Holes.

      HU derives a new law of gravitation that is epoch-dependent. That makes Type Ia supernovae epoch-dependent (within the context of the theory). HU then derives the absolute luminosity of SN1a as a function of G and shows that absolute luminosity scales as G^{-3}.
      Once the photometrically determined SN1a distances are corrected, HU correctly predicts all SN1a distances given their redshifts z.

      The extra dimension refutes all 4D spacetime theories, including General Relativity and ΛCDM. HU also falsifies all dark matter evidence:
      https://www.quora.com/Are-dark-matter-and-dark-energy-falsifiable/answer/Marco-Pereira-1
      including the Spiral Galaxy Conundrum and the Coma Cluster Conundrum.

      Somehow, my theory is still being censored by the community as a whole (either directly or by omission).

      I hope this posting will help correct this situation.


  • richardmitnick 4:33 pm on August 20, 2018 Permalink | Reply
    Tags: , Anomalies, Bosons and fermions, Branes, , , , , , Parity violation, , , , , , Supersymmetry (SUSY), The second superstring revolution, Theorist John Schwarz   

    From Caltech: “Long and Winding Road: A Conversation with String Theory Pioneer John Schwarz” 

    Caltech Logo

    From Caltech

    08/20/2018

    Whitney Clavin
    (626) 395-1856
    wclavin@caltech.edu

    John Schwarz discusses the history and evolution of superstring theory.

    1
    John Schwarz. Credit: Seth Hansen for Caltech

    The decades-long quest for a theory that would unify all the known forces—from the microscopic quantum realm to the macroscopic world where gravity dominates—has had many twists and turns. The current leading theory, known as superstring theory and more informally as string theory, grew out of an approach to theoretical particle physics, called S-matrix theory, which was popular in the 1960s. Caltech’s John H. Schwarz, the Harold Brown Professor of Theoretical Physics, Emeritus, began working on the problem in 1971, while a junior faculty member at Princeton University. He moved to Caltech in 1972, where he continued his research with various collaborators from other universities. Their studies in the 1970s and 1980s would dramatically shift the evolution of the theory and, in 1984, usher in what’s known as the first superstring revolution.

    Essentially, string theory postulates that our universe is made up, at its most fundamental level, of infinitesimally tiny vibrating strings and contains 10 dimensions—three for space, one for time, and six other spatial dimensions curled up in such a way that we don’t perceive them in everyday life or even with the most sensitive experimental searches to date. One of the many states of a string is thought to correspond to the particle that carries the gravitational force, the graviton, thereby linking the two pillars of fundamental physics—quantum mechanics and the general theory of relativity, which includes gravity.

    We sat down with Schwarz to discuss the history and evolution of string theory and how the theory itself might have moved past strings.

    What are the earliest origins of string theory?

    The first study often regarded as the beginning of string theory came from an Italian physicist named Gabriele Veneziano in 1968. He discovered a mathematical formula that had many of the properties that people were trying to incorporate in a fundamental theory of the strong nuclear force [a fundamental force that holds nuclei together]. This formula was kind of pulled out of the blue, and ultimately Veneziano and others realized, within a couple years, that it was actually describing a quantum theory of a string—a one-dimensional extended object.

    How did the field grow after this paper?

    In the early ’70s, there were several hundred people worldwide working on string theory. But then everything changed when quantum chromodynamics, or QCD—which was developed by Caltech’s Murray Gell-Mann [Nobel Laureate, 1969] and others—became the favored theory of the strong nuclear force. Almost everyone was convinced QCD was the right way to go and stopped working on string theory. The field shrank down to just a handful of people in the course of a year or two. I was one of the ones who remained.

    How did Gell-Mann become interested in your work?

    Gell-Mann is the one who brought me to Caltech and was very supportive of my work. He took an interest in studies I had done with a French physicist, André Neveu, when we were at Princeton. Neveu and I introduced a second string theory. The initial Veneziano version had many problems. There are two kinds of fundamental particles called bosons and fermions, and the Veneziano theory only described bosons. The one I developed with Neveu included fermions. And not only did it include fermions but it led to the discovery of a new kind of symmetry that relates bosons and fermions, which is called supersymmetry. Because of that discovery, this version of string theory is called superstring theory.

    When did the field take off again?

    A pivotal change happened after work I did with another French physicist, Joël Scherk, whom Gell-Mann and I had brought to Caltech as a visitor in 1974. During that period, we realized that many of the problems we were having with string theory could be turned into advantages if we changed the purpose. Instead of insisting on constructing a theory of the strong nuclear force, we took this beautiful theory and asked what it was good for. And it turned out it was good for gravity. Neither of us had worked on gravity. It wasn’t something we were especially interested in but we realized that this theory, which was having trouble describing the strong nuclear force, gives rise to gravity. Once we realized this, I knew what I would be doing for the rest of my career. And I believe Joël felt the same way. Unfortunately, he died six years later. He made several important discoveries during those six years, including a supergravity theory in 11 dimensions.

    Surprisingly, the community didn’t respond very much to our papers and lectures. We were generally respected and never had a problem getting our papers published, but there wasn’t much interest in the idea. We were proposing a quantum theory of gravity, but in that era physicists who worked on quantum theory weren’t interested in gravity, and physicists who worked on gravity weren’t interested in quantum theory.

    That changed after I met Michael Green [a theoretical physicist then at the University of London and now at the University of Cambridge], at the CERN cafeteria in Switzerland in the summer of 1979. Our collaboration was very successful, and Michael visited Caltech for several extended visits over the next few years. We published a number of papers during that period, which are much cited, but our most famous work was something we did in 1984, which had to do with a problem known as anomalies.

    What are anomalies in string theory?

    One of the facts of nature is that there is what’s called parity violation, which means that the fundamental laws are not invariant under mirror reflection. For example, a neutrino always spins clockwise and not counterclockwise, so it would look wrong viewed in a mirror. When you try to write down a fundamental theory with parity violation, mathematical inconsistencies often arise when you take account of quantum effects. This is referred to as the anomaly problem. It appeared that one couldn’t make a theory based on strings without encountering these anomalies, which, if that were the case, would mean strings couldn’t give a realistic theory. Green and I discovered that these anomalies cancel one another in very special situations.
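
    For the record, the “very special situations” Schwarz mentions can be stated compactly. The following is standard textbook material rather than part of the interview: the Green-Schwarz mechanism requires the ten-dimensional gauge group to have exactly 496 generators.

```latex
% Schematic statement of the Green-Schwarz condition (textbook material):
% the gauge and gravitational anomalies cancel only if the anomaly
% polynomial factorizes, which forces the gauge group G to satisfy
\[
  \dim G = 496 ,
\]
% realized by the two choices that give consistent string theories,
\[
  G = SO(32) \qquad \text{or} \qquad G = E_8 \times E_8 .
\]
```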

    When we released our results in 1984, the field exploded. That’s when Edward Witten [a theoretical physicist at the Institute for Advanced Study in Princeton], probably the most influential theoretical physicist in the world, got interested. Witten and three collaborators wrote a paper early in 1985 making a particular proposal for what to do with the six extra dimensions, the ones other than the four for space and time. That proposal looked, at the time, as if it could give a theory that is quite realistic. These developments, together with the discovery of another version of superstring theory, constituted the first superstring revolution.

    Richard Feynman was here at Caltech during that time, before he passed away in 1988. What did he think about string theory?

    After the 1984 to 1985 breakthroughs in our understanding of superstring theory, the subject no longer could be ignored. At that time it acquired some prominent critics, including Richard Feynman and Stephen Hawking. Feynman’s skepticism of superstring theory was based mostly on the concern that it could not be tested experimentally. This was a valid concern, which my collaborators and I shared. However, Feynman did want to learn more, so I spent several hours explaining the essential ideas to him. Thirty years later, it is still true that there is no smoking-gun experimental confirmation of superstring theory, though it has proved its value in other ways. The most likely possibility for experimental support in the foreseeable future would be the discovery of supersymmetry particles. So far, they have not shown up.

    What was the second superstring revolution about?

    The second superstring revolution occurred 10 years later in the mid ’90s. What happened then is that string theorists discovered what happens when particle interactions become strong. Before, we had been studying weakly interacting systems. But as you crank up the strength of the interaction, a 10th dimension of space can emerge. New objects called branes also emerge. Strings are one dimensional; branes have all sorts of dimensions ranging from zero to nine. An important class of these branes, called D-branes, was discovered by the late Joseph Polchinski [BS ’75]. Strings do have a special role, but when the system is strongly interacting, then the strings become less fundamental. It’s possible that in the future the subject will get a new name but until we understand better what the theory is, which we’re still struggling with, it’s premature to invent a new name.

    What can we say now about the future of string theory?

    It’s now over 30 years since a large community of scientists began pooling their talents, and there’s been enormous progress in those 30 years. But the more big problems we solve, the more new questions arise. So, you don’t even know the right questions to ask until you solve the previous questions. Interestingly, some of the biggest spin-offs of our efforts to find the most fundamental theory of nature are in pure mathematics.

    Do you think string theory will ultimately unify the forces of nature?

    Yes, but I don’t think we’ll have a final answer in my lifetime. The journey has been worth it, even if it did take some unusual twists and turns. I’m convinced that, in other intelligent civilizations throughout the galaxy, similar discoveries will occur, or already have occurred, in a different sequence than ours. We’ll find the same result and reach the same conclusions as other civilizations, but we’ll get there by a very different route.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The California Institute of Technology (commonly referred to as Caltech) is a private research university located in Pasadena, California, United States. Caltech has six academic divisions with strong emphases on science and engineering. Its 124-acre (50 ha) primary campus is located approximately 11 mi (18 km) northeast of downtown Los Angeles. “The mission of the California Institute of Technology is to expand human knowledge and benefit society through research integrated with education. We investigate the most challenging, fundamental problems in science and technology in a singularly collegial, interdisciplinary atmosphere, while educating outstanding students to become creative members of society.”

    Caltech campus

     
  • richardmitnick 11:35 am on May 8, 2018 Permalink | Reply
    Tags: , , , , , , , Supersymmetry (SUSY)   

    From CERN ATLAS: “Charming SUSY: running out of places to hide” 

    CERN ATLAS Higgs Event

    CERN/ATLAS
    From CERN ATLAS

    7th May 2018
    ATLAS Collaboration

    1
    Figure 1: Limits on pair production of stop and scharm particles. The horizontal axis shows the mass of the stop or scharm, while the vertical axis corresponds to the mass of the lightest superpartner. The red line shows the observed limit, while the blue line and the yellow band show the expected limit and its uncertainty. The filled blue region shows models excluded by previous ATLAS searches. (Image: ATLAS Collaboration/CERN)

    Why is gravity so much weaker than the other forces of nature? This fundamental discrepancy, known as the “hierarchy problem”, has long been a source of puzzlement. Since the discovery of a scalar particle, the Higgs boson, with a mass of 125 GeV, near that of the W and Z bosons that mediate the weak force, the hierarchy problem is more acute than ever. Due to large quantum corrections, the most natural mass of the Higgs boson should be many orders of magnitude above the one observed, potentially as large as the Planck mass of order 10^19 GeV, the energy at which gravity is expected to become as strong as the other forces.

    Supersymmetry addresses the hierarchy problem by introducing partners of the known elementary particles that cancel these detrimental quantum corrections to the Higgs mass. For this solution to work, however, the supersymmetric partner of the top quark (known as the “stop”) must have a mass not too different from that of the top quark itself. When this is the case, supersymmetry “stabilises” the mass of the Higgs boson because the top and stop contribute with opposite signs to the quantum corrections.
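
    Schematically, the cancellation looks like this (a standard SUSY-primer picture, with a hard cutoff Λ standing in for the new-physics scale, and correct only up to O(1) factors):

```latex
% Top-quark loop: drives the Higgs mass-squared up to the cutoff scale.
\[
  \delta m_H^2 \big|_{\text{top}} \simeq -\frac{3 y_t^2}{8\pi^2}\,\Lambda^2
\]
% Adding the stop loop: supersymmetry relates the couplings, the quadratic
% pieces cancel, and only a logarithm of the stop mass remains, which is
% why the stop cannot be too much heavier than the top.
\[
  \delta m_H^2 \big|_{\text{top}+\text{stop}}
  \sim \frac{3 y_t^2}{8\pi^2}\, m_{\tilde t}^2
  \ln\!\frac{\Lambda^2}{m_{\tilde t}^2}
\]
```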

    Similar to the top quark itself, the stop is generally expected to decay via a bottom quark. Many attempts have been made to discover the stop in such decays. However, an intriguing theoretical possibility is that the stop might instead preferentially decay to final states containing a charm instead of a bottom quark.

    2
    Figure 2: Summary of the current status of searches for pair production of stops in ATLAS, not including the latest results described in Figure 1. The horizontal axis shows the mass of the stop, while the vertical axis corresponds to the mass of the lightest superpartner. The filled regions show observed limits on the superpartner masses. Different colours correspond to different search strategies, while the light blue area shows the results obtained with the 8 TeV dataset. Dashed lines correspond to expected limits. (Image: ATLAS Collaboration/CERN)

    ATLAS has released a new search for stops decaying to charm quarks, significantly improving on previous results that used 8 TeV proton-proton collision data. One of the main challenges in the search is identifying the presence of charm quarks in proton-proton collision events. Particles containing a charm quark travel just a fraction of a millimetre before decaying. This is unlike most particles composed of lighter quarks, which are either nearly stable or decay almost instantly after they are produced.

    Thanks to ATLAS’ superb particle-tracking capabilities, physicists were able to pick out the small displacements from charm quarks amongst hundreds of other tracks in the collision. Performing this feat is more challenging for charm quarks than for bottom quarks, since particles containing bottom quarks travel longer on average before decaying, making them easier to identify. The new ATLAS search is also able to detect signals from the supersymmetric partner of the charm quark (the “s-charm” or “scharm”). Such particles could be lighter than other quark superpartners and thus may be more copiously produced at the LHC. Their signal in the detector would be identical to that of the stops considered by the search.
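
    That difference in flight distance is what makes charm tagging the harder problem. A minimal numerical sketch, using PDG-ballpark lifetimes and an illustrative boost (both assumptions, not ATLAS analysis parameters):

```python
# Mean lab-frame decay length L = (beta*gamma) * c*tau for typical
# charm and bottom hadrons; all inputs are ballpark assumptions.
c_tau_D0 = 123e-6   # D0 meson c*tau in metres (~123 micrometres)
c_tau_B0 = 455e-6   # B0 meson c*tau in metres (~455 micrometres)
boost = 20.0        # illustrative beta*gamma for a hadron inside an LHC jet

for name, c_tau in [("charm (D0)", c_tau_D0), ("bottom (B0)", c_tau_B0)]:
    L_mm = boost * c_tau * 1e3  # mean decay length in millimetres
    print(f"{name}: mean decay length ~ {L_mm:.1f} mm")
# Charm hadrons travel a few mm while bottom hadrons travel several mm,
# so charm decay vertices sit much closer to the crowd of prompt tracks.
```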

    Though the ATLAS search found no signals of stops or scharms, new limits were set on the masses of these hypothetical particles, as shown in Figure 1. These results complement those from other ATLAS searches for the stop, summarised in Figure 2. There appears to be little room left for stops of similar mass to the top quark, and the hierarchy puzzle remains unsolved.

    See the full article here .


    Please help promote STEM in your local schools.


    Stem Education Coalition


     
  • richardmitnick 6:10 pm on February 13, 2018 Permalink | Reply
    Tags: , , , , , , , Supersymmetry (SUSY)   

    From CERN Courier: “ATLAS extends searches for natural supersymmetry” 


    CERN Courier

    Jan 15, 2018

    1
    Exclusion limits

    Despite many negative searches during the last decade and more, supersymmetry (SUSY) remains a popular extension of the Standard Model (SM). Not only can SUSY accommodate dark matter and gauge–force unification at high energy, it offers a natural explanation for why the Higgs boson is so light compared to the Planck scale. In the SM, the Higgs boson mass can be decomposed into a “bare” mass and a modification due to quantum corrections. Without SUSY, but in the presence of a high-energy new physics scale, these two numbers are extremely large and thus must almost exactly oppose one another – a peculiar coincidence called the hierarchy problem. SUSY introduces a set of new particles that each balances the mass correction of its SM partner, providing a “natural” explanation for the Higgs boson mass.

    Thanks to searches at the LHC and previous colliders, we know that SUSY particles must be heavier than their SM counterparts. But if this difference in mass becomes too large, particularly for the particles that produce the largest corrections to the Higgs boson mass, SUSY would not provide a natural solution of the hierarchy problem.

    New SUSY searches from ATLAS using data recorded at an energy of 13 TeV in 2015 and 2016 (some of which were shown for the first time at SUSY 2017 in Mumbai from 11–15 December) have extended existing bounds on the masses of the top squark and higgsinos, the SUSY partners of the top quark and Higgs bosons, respectively, that are critical for natural SUSY. For SUSY to remain natural, the mass of the top squark should be below around 1 TeV and that of the higgsinos below a few hundred GeV.
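
    The higgsino bound has a compact origin: at tree level in the MSSM, minimising the Higgs potential ties the Z boson mass directly to the higgsino mass parameter μ. Schematically (valid up to loop corrections, for moderate-to-large tan β):

```latex
% Tree-level electroweak symmetry breaking condition in the MSSM
% (moderate-to-large tan(beta)); a natural, untuned m_Z requires |mu|,
% and hence the higgsino masses, to sit not far above the weak scale.
\[
  \frac{m_Z^2}{2} \;\simeq\; -\,|\mu|^2 \;-\; m_{H_u}^2
\]
```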

    ATLAS has now completed a set of searches for the top squark that push the mass limits up to 1 TeV. With no sign of SUSY yet, these searches have begun to focus on scenarios that are more difficult to detect, in which SUSY could hide amongst the SM background. Sophisticated techniques, including machine learning, are employed to ensure no signal is missed.

    First ATLAS results have also been released for higgsino searches. If the lightest SUSY particles are higgsino-like, their masses will often be close together and such “compressed” scenarios lead to the production of low-momentum particles. One new search at ATLAS targets scenarios with leptons reconstructed at the lowest momenta still detectable. If the SUSY mass spectrum is extremely compressed, the lightest charged SUSY particle will have an extended lifetime, decay invisibly, and leave an unusual detector signature known as a “disappearing track”.

    Such a scenario is targeted by another new ATLAS analysis. These searches extend for the first time the limits on the lightest higgsino set by the Large Electron Positron (LEP) collider 15 years ago. The search for higgsinos remains among the most challenging and important for natural SUSY. With more data and new ideas, it may well be possible to discover, or exclude, natural SUSY in the coming years.

    See the full article here .

    Please help promote STEM in your local schools.


    Stem Education Coalition
    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS

    ALICE

    CMS

    LHCb

     