Tagged: CERN LHC

  • richardmitnick 8:35 pm on August 29, 2019
    Tags: "Forget About Electrons And Protons; The Unstable Muon Could Be The Future Of Particle Physics", , CERN LHC, , , , , MICE collaboration — which stands for Muon Ionization Cooling Experiment — continues to push this technology to new heights and may make a muon collider a real possibility for the future.,   

    From Ethan Siegel: “Forget About Electrons And Protons; The Unstable Muon Could Be The Future Of Particle Physics” 

    From Ethan Siegel
    Aug 29, 2019

    The particle tracks emanating from a high energy collision at the LHC in 2014 show the creation of many new particles. It’s only because of the high-energy nature of this collision that new masses can be created. (WIKIMEDIA COMMONS USER PCHARITO)

    Electron-positron or proton-proton colliders are all the rage. But the unstable muon might be the key to unlocking the next frontier.

    If you want to probe the frontiers of fundamental physics, you have to collide particles at very high energies: with enough energy that you can create the unstable particles and states that don’t exist in our everyday, low-energy Universe. So long as you obey the Universe’s conservation laws and have enough free energy at your disposal, you can create any massive particle (and/or its antiparticle) from that energy via Einstein’s E = mc².
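
To make that bookkeeping concrete, here is a minimal sketch in Python (the 125 GeV Higgs mass is an assumed round value, not a figure from the article): in a symmetric collider, two beams of energy E can create a particle with rest-mass energy up to 2E.

```python
# Minimal sketch of creating mass from energy via E = mc^2 (illustrative only).
HIGGS_MASS_GEV = 125.0  # assumed approximate Higgs rest-mass energy, in GeV

def min_beam_energy_gev(particle_mass_gev: float) -> float:
    """Minimum per-beam energy for a symmetric collider to create the particle."""
    return particle_mass_gev / 2.0

print(f"Per-beam energy needed for a Higgs: {min_beam_energy_gev(HIGGS_MASS_GEV):.1f} GeV")
# -> 62.5 GeV per beam; real searches need more, e.g. for associated production
```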

    Traditionally, there have been two strategies to do this.

    Collide electrons moving in one direction with positrons moving in the opposite direction, tuning your beams to whatever energy corresponds to the mass of particles you wish to produce.
    Collide protons in one direction with either other protons or anti-protons in the other, reaching higher energies but creating a much messier, less controllable signal to extract.

    One Nobel Laureate, Carlo Rubbia, has called for physicists to build something entirely novel: a muon collider.

    Carlo Rubbia at the 62nd Lindau Nobel Laureate Meeting on July 4, 2012. Markus Pössel (user name: Mapos)

    It’s ambitious and presently impractical, but it just might be the future of particle physics.

    The particles and antiparticles of the Standard Model have now all been directly detected, with the last holdout, the Higgs boson, falling at the LHC earlier this decade.

    Standard Model of Particle Physics

    CERN CMS Higgs Event


    CERN ATLAS Higgs Event

    All of these particles can be created at LHC energies, and the masses of the particles lead to fundamental constants that are absolutely necessary to describe them fully. These particles can be well-described by the physics of the quantum field theories underlying the Standard Model, but they do not describe everything, like dark matter. (E. SIEGEL / BEYOND THE GALAXY)

    Above, you can see the particles and antiparticles of the Standard Model, which have now all been discovered. The Large Hadron Collider (LHC) at CERN discovered the Higgs boson, the long-sought-after last holdout, earlier this decade.

    While there’s still much science left to be done at the LHC — it has collected only 2% of all the data it will acquire by the end of the 2030s — particle physicists are already looking ahead to the next generation of future colliders.

    A hypothetical new accelerator, either a long linear one or one inhabiting a large tunnel beneath the Earth, could dwarf the sensitivity to new particles that prior and current colliders can achieve. Even at that, there’s no guarantee we’ll find anything new, but we’re certain to find nothing new if we fail to try. ILC collaboration

    All of the plans put forth involve scaled-up versions of existing technologies that have been used in past and/or current accelerators. We know how to accelerate electrons, positrons, and protons in a straight line. We know how to bend them into a circle, and maximize both the energy of the collisions and the number of particles colliding per second. Larger, more energetic versions of existing technologies are the simplest approach.

    FNAL/Tevatron map

    CERN map

    Future Circular Collider (FCC) Larger LHC

    CERN FCC Future Circular Collider map

    CERN Future Circular Collider

    The scale of the proposed Future Circular Collider (FCC), compared with the LHC presently at CERN and the Tevatron, formerly operational at Fermilab. The Future Circular Collider is perhaps the most ambitious proposal for a next-generation collider to date, including both lepton and proton options as various phases of its proposed scientific programme. (PCHARITO / WIKIMEDIA COMMONS)

    Of course, there are both benefits and drawbacks to each method we could use. You can build a linear collider, but the energy you can reach will be limited by how much energy you can impart to the particles per unit distance and by how long you build your accelerator. A further drawback is that, without a continuous injection of circulating particles, linear colliders have lower collision rates and take longer to collect the same amount of data.

    The other main style of collider is the style currently used at CERN: circular colliders. Instead of only getting one continuous shot to accelerate your particles before giving them the opportunity to collide, you speed them up while bending them in a circle, adding more and more particles to each clockwise and counterclockwise beam with every revolution. You set up your detectors at designated collision points, and measure what comes out.

    A candidate Higgs event in the ATLAS detector. Note how even with the clear signatures and transverse tracks, there is a shower of other particles; this is due to the fact that protons are composite particles. This is only the case because the Higgs gives mass to the fundamental constituents that compose these particles. At high enough energies, the currently most-fundamental particles known may yet split apart themselves. (THE ATLAS COLLABORATION / CERN)

    CERN ATLAS Image Claudia Marcelloni

    This is the preferred method, so long as your tunnel is long enough and your magnets are strong enough, for both electron/positron and proton/proton colliders. Compared to linear colliders, with a circular collider, you get

    greater numbers of particles inside the beam at any one time,
    second and third and thousandth chances for particles that missed one another on the prior pass through,
    and much greater collision rates overall, particularly for lower-energy heavy particles like the Z-boson.

    In general, electron/positron colliders are better for precision studies of known particles, while proton/proton colliders are better for probing the energy frontier.

    A four-muon candidate event in the ATLAS detector at the Large Hadron Collider. The muon/anti-muon tracks are highlighted in red, as the long-lived muons travel farther than any other unstable particle. The energies achieved by the LHC are sufficient for creating Higgs bosons; previous electron-positron colliders could not achieve the necessary energies. (ATLAS COLLABORATION/CERN)

    In fact, if you compare the LHC — which collides protons with protons — with the previous collider in the same tunnel (LEP, which collided electrons with positrons), you’d find something that surprises most people: the particles inside LEP went much, much faster than the ones inside the LHC!

    CERN LEP Collider

    Everything in this Universe is limited by the speed of light in a vacuum: 299,792,458 m/s. It’s impossible to accelerate any massive particle to that speed, much less past it. At the LHC, particles get accelerated up to extremely high energies of 7 TeV per particle. Considering that a proton’s rest energy is only 938 MeV (or 0.000938 TeV), it’s easy to see how it reaches a speed of 299,792,455 m/s.

    But the electrons and positrons at LEP went even faster: 299,792,457.9964 m/s. Yet despite these enormous speeds, they only reached energies of ~110 GeV, or 1.6% of the energy achieved at the LHC.
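
A small Python check of the speeds quoted above; the relation is v = c·sqrt(1 − (mc²/E)²) for total energy E. The 104.5 GeV per-electron figure is an assumption (the LEP2 beam energy) chosen because it reproduces the quoted speed.

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def speed_m_per_s(total_energy_gev: float, rest_mass_gev: float) -> float:
    """Speed of a particle with the given total energy: v = c*sqrt(1 - (m/E)^2)."""
    ratio = rest_mass_gev / total_energy_gev
    return C * math.sqrt(1.0 - ratio**2)

print(f"proton at 7 TeV:       {speed_m_per_s(7000.0, 0.938272):,.1f} m/s")
print(f"electron at 104.5 GeV: {speed_m_per_s(104.5, 0.000511):,.4f} m/s")
# -> about 299,792,455 m/s and 299,792,457.9964 m/s, matching the text
```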

    Let’s understand how colliding particles create new ones. First, the energy available for creating new particles — the “E” in E = mc² — comes from the center-of-mass energy of the two colliding particles. In a proton-proton collision, it’s the internal structures that collide: quarks and gluons. The energy of each proton is divided up among many constituent particles, and these particles zip around inside the proton as well. When two of them collide, the energy available for creating new particles might still be large (up to 2 or 3 TeV), but isn’t the full-on 14 TeV.

    But the electron-positron idea is a lot cleaner: they’re not composite particles, and they don’t have internal structure or energy divided among constituents. Accelerate an electron and positron to the same speed in opposite directions, and 100% of that energy goes into creating new particles. But it won’t be anywhere near 14 TeV.

    A number of the various lepton colliders, with their luminosity (a measure of the collision rate and the number of detections one can make) as a function of center-of-mass collision energy. Note that the red line, which is a circular collider option, offers many more collisions than the linear version, but its advantage shrinks as energy increases. Beyond about 380 GeV, circular colliders cannot achieve those energies, and a linear collider like CLIC is the far superior option. (GRANADA STRATEGY MEETING SUMMARY SLIDES / LUCIE LINSSEN (PRIVATE COMMUNICATION))

    Even though electrons and positrons go much faster than protons do, the total amount of energy a particle possesses is determined by both its speed and its rest mass. Even though the electrons and positrons are much closer to the speed of light, it takes nearly 2,000 of them to make up as much rest mass as a proton. They have a greater speed but a much lower rest mass, and hence, a lower energy overall.

    There’s a good physics reason why, even with the same radius ring and the same strong magnetic fields to bend them into a circle, electrons won’t reach the same energy as protons: synchrotron radiation. When you accelerate a charged particle with a magnetic field, it gives off radiation, which means it carries energy away.

    Relativistic electrons and positrons can be accelerated to very high speeds, but will emit synchrotron radiation (blue) at high enough energies, preventing them from moving faster. This synchrotron radiation is the relativistic analog of the radiation predicted by Rutherford so many years ago, and has a gravitational analogy if you replace the electromagnetic fields and charges with gravitational ones. (CHUNG-LI DONG, JINGHUA GUO, YANG-YUAN CHEN, AND CHANG CHING-LIN, ‘SOFT-X-RAY SPECTROSCOPY PROBES NANOMATERIAL-BASED DEVICES’)

    The amount of energy radiated away is dependent on the field strength (squared), the energy of the particle (squared), but also on the inherent charge-to-mass ratio of the particle (to the fourth power). Since electrons and positrons have the same charge as the proton, but just 1/1836th of a proton’s mass, that synchrotron radiation is the limiting factor for electron-positron systems in a circular collider. You’d need a circular collider 100 km around just to be able to create a pair of top-antitop quarks in a next-generation particle accelerator using electrons and positrons.
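
A rough numerical sketch of that scaling: at fixed beam energy and bending radius, the energy radiated per turn grows as the fourth power of the charge-to-mass ratio, so the electron-to-proton ratio is simply (m_p/m_e)⁴.

```python
# Synchrotron loss per turn scales as E^4 / (m^4 * rho); at equal energy and
# bending radius, the electron/proton ratio reduces to (m_p / m_e)^4.
PROTON_TO_ELECTRON_MASS_RATIO = 1836.15

ratio = PROTON_TO_ELECTRON_MASS_RATIO ** 4
print(f"an electron radiates {ratio:.2e} times more than a proton per turn")
# -> ~1.1e13, which is why circular e+e- machines top out far below proton machines
```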

    This is where the big idea of using muons comes in. Muons (and anti-muons) are the cousins of electrons (and positrons):

    they are fundamental (not composite) particles,
    they are 206 times as massive as the electron (giving them a much smaller charge-to-mass ratio and far less synchrotron radiation),
    and, unlike electrons or positrons, they are fundamentally unstable.

    That last difference is the present dealbreaker: muons have a mean lifetime of just 2.2 microseconds before decaying away.

    An earlier design plan (now defunct) for a full-scale muon-antimuon collider at Fermilab, the source of the world’s second-most powerful particle accelerator behind the LHC at CERN. (FERMILAB)

    In the future, however, we might be able to work around that anyway. You see, Einstein’s special relativity tells us that as particles move closer and closer to the speed of light, time dilates for that particle in the observer’s reference frame. In other words, if we make this muon move fast enough, we can dramatically increase the time it lives before decaying; this is the same physics behind why cosmic ray muons pass through us all the time!

    If we could accelerate a muon up to the same 6.5 TeV in energy that LHC protons achieved during their prior data-taking run, that muon would live for 135,000 microseconds instead of 2.2 microseconds: enough time to circle the LHC some 1,500 times before decaying away. If you could collide a muon/anti-muon pair at those speeds, you’d have 100% of that energy — all 13 TeV of it — available for particle creation.
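
A minimal check of those numbers, assuming the standard muon mass and lifetime and a 26.7 km LHC circumference:

```python
MUON_MASS_GEV = 0.10566       # muon rest-mass energy
TAU_0_S = 2.2e-6              # muon mean lifetime at rest, seconds
C = 299_792_458.0             # speed of light, m/s
LHC_CIRCUMFERENCE_M = 26_700.0

gamma = 6500.0 / MUON_MASS_GEV            # Lorentz factor of a 6.5 TeV muon
lab_lifetime_s = gamma * TAU_0_S          # time-dilated lifetime in the lab frame
distance_m = C * lab_lifetime_s           # v is effectively c at this energy

print(f"gamma ~ {gamma:,.0f}; lifetime ~ {lab_lifetime_s * 1e6:,.0f} microseconds")
print(f"~ {distance_m / LHC_CIRCUMFERENCE_M:,.0f} laps of the LHC before decaying")
# -> roughly 135,000 microseconds and ~1,500 laps, as quoted above
```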

    The prototype MICE 201-megahertz RF module, with the copper cavity mounted, is shown during assembly at Fermilab. This apparatus could focus and collimate a muon beam, enabling the muons to be accelerated and survive for much longer than 2.2 microseconds. (Y. TORUN / IIT / FERMILAB TODAY)

    Humanity can always choose to build a bigger ring or invest in producing stronger-field magnets; those are easy ways to go to higher energies in particle physics. But there’s no cure for synchrotron radiation with electrons and positrons; you’d have to use heavier particles instead. There’s no cure for energy being distributed among multiple constituent particles inside a proton; you’d have to use fundamental particles instead.

    The muon is the one particle that could solve both of these issues. The only drawback is that they’re unstable, and difficult to keep alive for a long time. However, they’re easy to make: smash a proton beam into a piece of acrylic and you’ll produce pions, which will decay into both muons and anti-muons. Accelerate those muons to high energy and collimate them into beams, and you can put them in a circular collider.

    While many unstable particles, both fundamental and composite, can be produced in particle physics, only protons, neutrons (bound in nuclei) and the electron are stable, along with their antimatter counterparts and the photon. Everything else is short-lived, but if muons can be kept at high enough speeds, they might live long enough to serve as the basis for a next-generation particle collider. (CONTEMPORARY PHYSICS EDUCATION PROJECT (CPEP), U.S. DEPARTMENT OF ENERGY / NSF / LBNL)

    The MICE collaboration — which stands for Muon Ionization Cooling Experiment — continues to push this technology to new heights, and may make a muon collider a real possibility for the future. The goal is to reveal whatever secrets nature might have waiting in store for us, and these are secrets we cannot predict. As Carlo Rubbia himself said,

    “…these fundamental choices are coming from nature, not from individuals. Theorists can do what they like, but nature is the one deciding in the end….”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    “Starts With A Bang! is a blog/video blog about cosmology, physics, astronomy, and anything else I find interesting enough to write about. I am a firm believer that the highest good in life is learning, and the greatest evil is willful ignorance. The goal of everything on this site is to help inform you about our world, how we came to be here, and to understand how it all works. As I write these pages for you, I hope to not only explain to you what we know, think, and believe, but how we know it, and why we draw the conclusions we do. It is my hope that you find this interesting, informative, and accessible,” says Ethan.

     
  • richardmitnick 11:31 am on August 20, 2019
    Tags: "With open data scientists share their work", , , CERN LHC, Gran Sasso, ,   

    From Symmetry: “With open data, scientists share their work” 

    Symmetry Mag
    From Symmetry

    08/20/19
    Meredith Fore

    Illustration by Sandbox Studio, Chicago

    There are barriers to making scientific data open, but doing so has already contributed to scientific progress.

    It could be said that astronomy, one of the oldest sciences, was one of the first fields to have open data. The open records of Chinese astronomers from 1054 A.D. allowed astronomer Carlo Otto Lampland to identify the Crab Nebula as the remnant of a supernova in 1921.

    Supernova remnant Crab nebula. NASA/ESA Hubble

    In 1705, Edmond Halley used the previous observations of Johannes Kepler and Petrus Apianus—who did their work before Halley was old enough to use a telescope—to deduce the orbit of his eponymous comet.

    Comet 1P/Halley as taken March 8, 1986 by W. Liller, Easter Island, part of the International Halley Watch (IHW) Large Scale Phenomena Network.
    NASA/W. Liller

    In science, making data open means making available, free of charge, the observations or other information collected in a scientific study for the purpose of allowing other researchers to examine it for themselves, either to verify it or to conduct new analyses.

    Scientists continue to use open data to make new discoveries today. In 2010, a team of scientists led by Professor Doug Finkbeiner at Harvard University found vast gamma-ray bubbles above and below the Milky Way. The accomplishment was compared to the discovery of a new continent on Earth. The scientists didn’t find the bubbles by making their own observations; they did it by analyzing publicly available data from the Fermi Gamma-ray Space Telescope.

    NASA/Fermi LAT


    NASA/Fermi Gamma Ray Space Telescope

    “Open data often can be used to answer other kinds of questions that the people who collected the data either weren’t interested in asking, or they just never thought to ask,” says Kyle Cranmer, a professor at New York University. By making scientific data available, “you’re enabling a lot of new science by the community to go forward in a more efficient and powerful way.”

    Cranmer is a member of ATLAS, one of the two general-purpose experiments that, among other things, co-discovered the Higgs boson at the Large Hadron Collider at CERN.

    CERN ATLAS Image Claudia Marcelloni

    CERN ATLAS Higgs Event

    He and other CERN researchers recently published a letter in Nature Physics titled “Open is not enough,” which shares lessons learned about providing open data in high-energy physics. The CERN Open Data Portal, which facilitates public access of datasets from CERN experiments, now contains more than two petabytes of information.

    Computing at CERN

    The fields of both particle physics and astrophysics have seen rapid developments in the use and spread of open data, says Ulisses Barres, an astrophysicist at the Brazilian Center for Research in Physics. “Astronomy is going to, in the next decade, increase the amount of data that it produces by a factor of hundreds,” he says. “As the amount of data grows, there is more pressure for increasing our capacity to convert information into knowledge.”

    The Square Kilometer Array Telescope—being built in Australia and South Africa and set to turn on in the 2020s—is expected to produce about 600 terabytes of data per year.

    SKA Square Kilometer Array


    SKA South Africa

    Raw data from studies conducted during the site selection process are already available on the SKA website, with a warning that “these files are very large indeed, and before you download them you should check whether your local file system will be able to handle them.”

    Barres sees the growth in open data as an opportunity for developing nations to participate in the global science community in new ways. He and a group of fellow astrophysicists helped develop something called the Open Universe Initiative “with the objective of stimulating a dramatic increase in the availability and usability of space science data, extending the potential of scientific discovery to new participants in all parts of the world and empowering global educational services.”

    The initiative, proposed by the government of Italy, is currently in the “implementation” phase within the United Nations Office for Outer Space Affairs.

    “I think that data is this proper entry point for science development in places that don’t have much science developed yet,” Barres says. “Because it’s there, it’s available, there is much more data than we can properly analyze.”

    There are barriers to implementing open data. One is the concept of ownership—a lab might not want to release data that they could use for another project or might worry about proper credit and attribution. Another is the natural human fear of being accused of being wrong or having your data used irresponsibly.

    But one of the biggest barriers, according to physics professor Jesse Thaler of MIT, is making the data understandable. “From the user perspective, every single aspect of using public data is challenging,” Thaler says.

    Think of a high school student’s chemistry lab notebook. A student might mark certain measurements in her data table with a star, to remind herself that she used a different instrument to take those measurements. Or she may use acronyms to name different samples. Unless she writes these schemes down, another student wouldn’t know the star’s significance and wouldn’t be able to know what the samples were.
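
As a hypothetical illustration of that point (the field names below are invented, not drawn from any real experiment’s schema), the fix is to ship the private conventions along with the data as machine-readable metadata:

```python
# One measurement, recorded with private conventions a stranger couldn't decode.
measurement = {"sample": "A3", "value_mL": 12.4, "flag": "*"}

# Self-describing metadata turns those private conventions into shared ones.
metadata = {
    "flag_meanings": {"*": "taken with the backup instrument, not the primary one"},
    "sample_naming": "letter = reagent batch, digit = trial number",
    "units": {"value_mL": "millilitres"},
}

print(metadata["flag_meanings"][measurement["flag"]])
```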

    This has been a challenge for the CERN Open Data Portal, Cranmer says. “It’s very well curated, but it’s hard to use, because the data has got a lot of structure to it. It’s very complicated. You have to put additional effort to make it more usable.”

    And for a lot of scientists already working to manage gigantic projects, doing extra work to make their data usable to outside groups—well, “that’s just not mission critical,” he says. But Thaler adds that the CMS experiment has been very responsive to the needs of outside users.


    CERN CMS Higgs Event

    “Figuring out how to release data is challenging because you want to provide as much relevant information to outside users as possible,” Thaler says. “But it’s often not obvious, until outside users actually get their hands on the data, what information is relevant.”

    Still, there are many examples of open data benefiting astrophysics and particle physics. Members of the wider scientific community have discovered exoplanets through public data from the Kepler Space Telescope. When the Gaia spacecraft mapped the positions of 1.7 billion stars and released them as open data, scientists flocked to hackathons hosted by the Flatiron Institute to interpret it and produced about 20 papers’ worth of research.

    Open data policies have allowed for more accountability. The physics community was able to thoroughly check data from the first black hole collisions detected by LIGO and question a proposed dark-matter signal from the DAMA/LIBRA experiment.

    DAMA-LIBRA at Gran Sasso


    Gran Sasso LABORATORI NAZIONALI del GRAN SASSO, located in the Abruzzo region of central Italy

    Open data has also allowed for new collaborations and has nourished existing ones. Thaler, who is a theorist, says the dialogue between experimentalists and theorists has always been strong, but “open data is an opportunity to accelerate that conversation.”

    For Cari Cesarotti, a graduate student who uses CMS Open Data for research in particle physics theory at Harvard, one of the most important benefits of open data is how it maximizes the scientific value of data experimentalists have to work very hard to obtain.

    “Colliders are really expensive and quite laborious to build and test,” she says. “So the more that we can squeeze out utility using the tools that we already have—to me, that’s the right thing to do, to try to get as much mileage as we possibly can out of the data set.”

    See the full article here.



    Please help promote STEM in your local schools.


    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 11:37 am on August 15, 2019
    Tags: Azure ML, CERN LHC, Fermilab is the lead U.S. laboratory for the CMS experiment., The challenge: more data more computing power

    From Fermi National Accelerator Lab: “A glimpse into the future: accelerated computing for accelerated particles”

    FNAL Art Image
    FNAL Art Image by Angela Gonzales

    From Fermi National Accelerator Lab, an enduring source of strength for the US contribution to scientific research worldwide.

    August 15, 2019
    Leah Hesla

    Every proton collision at the Large Hadron Collider is different, but only a few are special. The special collisions generate particles in unusual patterns — possible manifestations of new, rule-breaking physics — or help fill in our incomplete picture of the universe.

    Finding these collisions is harder than the proverbial search for the needle in the haystack. But game-changing help is on the way. Fermilab scientists and other collaborators successfully tested a prototype machine-learning technology that speeds up processing by 30 to 175 times compared to traditional methods.

    Confronting 40 million collisions every second, scientists at the LHC use powerful, nimble computers to pluck the gems — whether it’s a Higgs particle or hints of dark matter — from the vast static of ordinary collisions.

    Rifling through simulated LHC collision data, the machine learning technology successfully learned to identify a particular postcollision pattern — a particular spray of particles flying through a detector — as it flipped through an astonishing 600 images per second. Traditional methods process less than one image per second.

    The technology could even be offered as a service on external computers. Using this offloading model would allow researchers to analyze more data more quickly and leave more LHC computing space available to do other work.

    It is a promising glimpse into how machine learning services are supporting a field in which already enormous amounts of data are only going to get bigger.

    Particles emerging from proton collisions at CERN’s Large Hadron Collider travel through this stories-high, many-layered instrument, the CMS detector. In 2026, the LHC will produce 20 times the data it does currently, and CMS is currently undergoing upgrades to read and process the data deluge. Photo: Maximilien Brice, CERN

    The challenge: more data, more computing power

    Researchers are currently upgrading the LHC to smash protons at five times its current rate.

    By 2026, the 17-mile circular underground machine at the European laboratory CERN will produce 20 times more data than it does now.

    LHC

    CERN map


    CERN LHC Maximilien Brice and Julien Marius Ordan


    CERN LHC particles

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS

    CERN ATLAS Image Claudia Marcelloni CERN/ATLAS

    ALICE

    CERN/ALICE Detector

    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    CMS is one of the particle detectors at the Large Hadron Collider, and CMS collaborators are in the midst of some upgrades of their own, enabling the intricate, stories-high instrument to take more sophisticated pictures of the LHC’s particle collisions. Fermilab is the lead U.S. laboratory for the CMS experiment.

    If LHC scientists wanted to save all the raw collision data they’d collect in a year from the High-Luminosity LHC, they’d have to find a way to store about 1 exabyte (about a million terabyte-sized personal external hard drives), of which only a sliver may unveil new phenomena. LHC computers are programmed to select this tiny fraction, making split-second decisions about which data is valuable enough to be sent downstream for further study.

    Currently, the LHC’s computing system keeps roughly one in every 100,000 particle events. But current storage protocols won’t be able to keep up with the future data flood, which will accumulate over decades of data taking. And the higher-resolution pictures captured by the upgraded CMS detector won’t make the job any easier. It all translates into a need for more than 10 times the computing resources than the LHC has now.
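
A back-of-envelope version of that culling (the 10⁷ seconds of effective data-taking per year is an assumed nominal figure; the rate and keep-fraction come from the text):

```python
COLLISION_RATE_HZ = 40e6        # bunch collisions per second at the LHC
KEEP_FRACTION = 1.0 / 100_000   # events retained by the trigger chain
SECONDS_PER_YEAR = 1e7          # assumed effective running time per year

kept_per_second = COLLISION_RATE_HZ * KEEP_FRACTION
kept_per_year = kept_per_second * SECONDS_PER_YEAR
print(f"{kept_per_second:,.0f} events/s kept -> {kept_per_year:.1e} events/year")
# -> 400 events/s, i.e. ~4e9 stored events a year even after discarding 99.999%
```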

    The recent prototype test shows that, with advances in machine learning and computing hardware, researchers expect to be able to winnow the data emerging from the upcoming High-Luminosity LHC when it comes online.

    “The hope here is that you can do very sophisticated things with machine learning and also do them faster,” said Nhan Tran, a Fermilab scientist on the CMS experiment and one of the leads on the recent test. “This is important, since our data will get more and more complex with upgraded detectors and busier collision environments.”

    Particle physicists are exploring the use of computers with machine learning capabilities for processing images of particle collisions at CMS, teaching them to rapidly identify various collision patterns. Image: Eamonn Maguire/Antarctic Design

    Machine learning to the rescue: the inference difference

    Machine learning in particle physics isn’t new. Physicists use machine learning for every stage of data processing in a collider experiment.

    But with machine learning technology that can chew through LHC data up to 175 times faster than traditional methods, particle physicists are ascending a game-changing step on the collision-computation course.

    The rapid rates are thanks to cleverly engineered hardware in the platform, Microsoft’s Azure ML, which speeds up a process called inference.

    To understand inference, consider an algorithm that’s been trained to recognize the image of a motorcycle: The object has two wheels and two handles that are attached to a larger metal body. The algorithm is smart enough to know that a wheelbarrow, which has similar attributes, is not a motorcycle. As the system scans new images of other two-wheeled, two-handled objects, it predicts — or infers — which are motorcycles. And as the algorithm’s prediction errors are corrected, it becomes pretty deft at identifying them. A billion scans later, it’s on its inference game.
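
A toy sketch of that training/inference split, with invented two-feature “objects” standing in for images (nothing here reflects Azure ML’s actual interface): training adjusts the weights as prediction errors are corrected; inference is then a cheap, frozen forward pass per new object.

```python
import numpy as np

# Each object is summarized by two made-up features: (number of wheels, handle span).
X_train = np.array([[2.0, 0.7], [2.0, 0.6], [2.0, 0.1], [1.0, 0.1]])
y_train = np.array([1, 1, 0, 0])  # 1 = motorcycle, 0 = not (e.g. wheelbarrow)

# "Training": gradient descent on logistic-regression loss, correcting errors.
w, b = np.zeros(2), 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(X_train @ w + b)))
    w -= 0.5 * X_train.T @ (p - y_train) / len(y_train)
    b -= 0.5 * np.mean(p - y_train)

# "Inference": classify unseen objects using the frozen weights.
X_new = np.array([[2.0, 0.65], [2.0, 0.05]])  # motorcycle-like, wheelbarrow-like
print(1.0 / (1.0 + np.exp(-(X_new @ w + b))))  # probabilities of "motorcycle"
```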

    Most machine learning platforms are built to understand how to classify images, but not physics-specific images. Physicists have to teach them the physics part, such as recognizing tracks created by the Higgs boson or searching for hints of dark matter.

    Researchers at Fermilab, CERN, MIT, the University of Washington and other collaborators trained Azure ML to identify pictures of top quarks — a short-lived elementary particle that is about 180 times heavier than a proton — from simulated CMS data. Specifically, Azure was to look for images of top quark jets, clouds of particles pulled out of the vacuum by a single top quark zinging away from the collision.

    “We sent it the images, training it on physics data,” said Fermilab scientist Burt Holzman, a lead on the project. “And it exhibited state-of-the-art performance. It was very fast. That means we can pipeline a large number of these things. In general, these techniques are pretty good.”

    One of the techniques behind inference acceleration is to combine traditional with specialized processors, a marriage known as heterogeneous computing architecture.

    Different platforms use different architectures. The traditional processors are CPUs (central processing units). The best known specialized processors are GPUs (graphics processing units) and FPGAs (field programmable gate arrays). Azure ML combines CPUs and FPGAs.

    “The reason that these processes need to be accelerated is that these are big computations. You’re talking about 25 billion operations,” Tran said. “Fitting that onto an FPGA, mapping that on, and doing it in a reasonable amount of time is a real achievement.”

    And it’s starting to be offered as a service, too. The test was the first time anyone has demonstrated how this kind of heterogeneous, as-a-service architecture can be used for fundamental physics.

    Data from particle physics experiments are stored on computing farms like this one, the Grid Computing Center at Fermilab. Outside organizations offer their computing farms as a service to particle physics experiments, making more space available on the experiments’ servers. Photo: Reidar Hahn

    At your service

    In the computing world, using something “as a service” has a specific meaning. An outside organization provides resources — machine learning or hardware — as a service, and users — scientists — draw on those resources when needed. It’s similar to how your video streaming company provides hours of binge-watching TV as a service. You don’t need to own your own DVDs and DVD player. You use their library and interface instead.

    Data from the Large Hadron Collider is typically stored and processed on computer servers at CERN and partner institutions such as Fermilab. With machine learning offered up as easily as any other web service might be, intensive computations can be carried out anywhere the service is offered — including off site. This bolsters the labs’ capabilities with additional computing power and resources while sparing them from having to furnish their own servers.

    “The idea of doing accelerated computing has been around decades, but the traditional model was to buy a computer cluster with GPUs and install it locally at the lab,” Holzman said. “The idea of offloading the work to a farm off site with specialized hardware, providing machine learning as a service — that worked as advertised.”

    The Azure ML farm is in Virginia. It takes only 100 milliseconds for computers at Fermilab near Chicago, Illinois, to send an image of a particle event to the Azure cloud, process it, and return it. That’s a 2,500-kilometer, data-dense trip in the blink of an eye.
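
A quick sanity check on that figure (the fiber slowdown factor is an assumed typical value): the speed of light sets a floor of only a few milliseconds over 2,500 km, so most of the 100 ms budget goes to routing and the computation itself.

```python
C_KM_PER_S = 299_792.458
TRIP_KM = 2_500.0        # round-trip distance quoted in the text
FIBER_FACTOR = 1.47      # light in optical fiber travels ~1/1.47 of vacuum speed

vacuum_ms = TRIP_KM / C_KM_PER_S * 1e3
fiber_ms = vacuum_ms * FIBER_FACTOR
print(f"light-travel floor: {vacuum_ms:.1f} ms in vacuum, {fiber_ms:.1f} ms in fiber")
# -> ~8.3 ms and ~12.3 ms, comfortably inside the measured ~100 ms round trip
```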

    “The plumbing that goes with all of that is another achievement,” Tran said. “The concept of abstracting that data as a thing you just send somewhere else, and it just comes back, was the most pleasantly surprising thing about this project. We don’t have to replace everything in our own computing center with a whole bunch of new stuff. We keep all of it, send the hard computations off and get it to come back later.”

    Scientists look forward to scaling the technology to tackle other big-data challenges at the LHC. They also plan to test other platforms, such as Amazon AWS, Google Cloud and IBM Cloud, as they explore what else can be accomplished through machine learning, which has seen rapid evolution over the past few years.

    “The models that were state-of-the-art for 2015 are standard today,” Tran said.

    As a tool, machine learning continues to give particle physics new ways of glimpsing the universe. It’s also impressive in its own right.

    “That we can take something that’s trained to discriminate between pictures of animals and people, do some modest amount of computation, and have it tell me the difference between a top quark jet and background?” Holzman said. “That’s something that blows my mind.”

    This work is supported by the DOE.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    FNAL Icon

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics. Fermilab is America’s premier laboratory for particle physics and accelerator research, funded by the U.S. Department of Energy. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.

     
  • richardmitnick 12:35 pm on August 10, 2019
    Tags: "Physicists Working to Discover New Particles, , , CERN LHC, , , , Texas Tech, The LDMX Experiment   

    From Texas Tech via FNAL: “Physicists Working to Discover New Particles, Dark Matter” 


    From TEXAS TECH UNIVERSITY

    via

    FNAL Art Image
    FNAL Art Image by Angela Gonzales

    Fermi National Accelerator Lab, an enduring source of strength for the US contribution to scientific research worldwide.

    August 5, 2019
    Glenys Young, Texas Tech

    Faculty recently presented their work at the European Physical Society’s 2019 Conference on High Energy Physics.

    Texas Tech University is well known for its research on topics that hit close to home for us here on the South Plains, like agriculture, water use and climate. But Texas Tech also is making its name known among those who study the farthest reaches of space and the mysteries of matter.

    Faculty from the Texas Tech Department of Physics & Astronomy recently presented at the European Physical Society’s 2019 Conference on High Energy Physics on the search for dark matter and other new particles that could help unlock the history and nature of the universe.

    New ways to approach the most classical search for new particles.

    Texas Tech, led by professor and department chair Sung-Won Lee, has been playing a leading role in the new-particle hunt for more than a decade. As part of the Compact Muon Solenoid (CMS) experiment, which investigates a wide range of physics, including the search for extra dimensions and particles that could make up dark matter, Lee has led the new-particle search at the European Organization for Nuclear Research (CERN).

    Lee

    “Basically, we’re looking for any experimental evidence of new particles that could open the door to whole new realms of physics that researchers believe could be there,” Lee said. “Researchers at Texas Tech are continuing to look for elusive new particles in the CMS experiment at CERN’s Large Hadron Collider (LHC), and if found, we could answer some of the most profound questions about the structure of matter and the evolution of the early universe.”

    The LHC essentially bounces around tiny particles at incredibly high speeds to see what happens when the particles collide. Lee’s search focuses on identifying possible hints of new physics that could add more subatomic particles to the Standard Model of particle physics.

    LHC

    CERN map


    CERN LHC Maximilien Brice and Julien Marius Ordan


    CERN LHC particles

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS

    CERN ATLAS Image Claudia Marcelloni CERN/ATLAS

    ALICE

    CERN/ALICE Detector


    CMS

    CERN CMS New

    LHCb
    CERN LHCb New II

    “The Standard Model has been enormously successful, but it leaves many important questions unanswered,” Lee said.

    Standard Model of Particle Physics

    “It is also widely acknowledged that, from the theoretical standpoint, the Standard Model must be part of a larger theory, ‘Beyond the Standard Model’ (BSM), which is yet to be experimentally confirmed.”

    Some BSM theories suggest that the production and decay of new particles could be observed in the LHC by the resulting highly energetic jets that shoot out in opposite directions (dijets) and the resonances they leave. Thus the search for new particles depends on the search for these resonances. In some ways, it’s like trying to trace air movements to find a fan you can’t see, hear or touch.

    In 2018-19, in collaboration with the CMS group, Texas Tech’s team performed a search for narrow dijet resonances using a newly available dataset at the LHC. The data were consistent with the Standard Model predictions, and no significant deviations from the pure background hypothesis were observed. But one spectacular collision was recorded in which the masses of the two jets were the same. This evidence allows for the possibility that the jets originated from BSM-hypothesized particle decay.
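
For illustration, a minimal sketch of the quantity such a search histograms, the invariant mass of the two-jet system, m² = (E₁+E₂)² − |p₁+p₂|² in c = 1 units (the jet four-vectors below are invented):

```python
import math

def dijet_mass_gev(jet1, jet2):
    """Jets given as (E, px, py, pz) in GeV; returns the pair's invariant mass."""
    E = jet1[0] + jet2[0]
    px, py, pz = (jet1[i] + jet2[i] for i in range(1, 4))
    return math.sqrt(max(E**2 - (px**2 + py**2 + pz**2), 0.0))

# Two roughly back-to-back, high-energy jets (hypothetical values):
j1 = (2900.0, 2500.0, 1200.0, 800.0)
j2 = (3150.0, -2400.0, -1300.0, 1500.0)
print(f"dijet invariant mass: {dijet_mass_gev(j1, j2):.0f} GeV")
# A resonance would appear as a bump in the histogram of this mass over many events.
```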

    “Since the LHC is the highest energy collider currently in operation, it is crucial to pay special attention to the highest-dijet-mass events where first hints of new physics at higher energies could start to appear,” Lee said. “This unusual high-mass event could likely be a collision created by the Standard Model background or possibly the first hint of new physics, but with only one event in hand, it is not possible to say which.”

    For now, Lee, postdoctoral research fellow Federico De Guio and doctoral student Zhixing (Tyler) Wang are working to update the dijet resonance search using the full LHC dataset and extend the scope of the analysis.

    “This extension of the search could help prove space-time-matter theory, which requires the existence of several extra spatial dimensions to the universe,” Lee said. “I believe that, with our extensive research experience, Texas Tech’s High Energy Physics group can contribute to making such discoveries.”

    Enhancing the missing momentum microscope

    Included in the ongoing new-particle search using the LHC is the pursuit of dark matter, an elusive, invisible form of matter that dominates the matter content of the universe.

    “Currently, the LHC is producing the highest-energy collisions from an accelerator in the world, and my primary research interest is in understanding whether or not new states of matter are being produced in these collisions,” said Andrew Whitbeck, an assistant professor in the Department of Physics & Astronomy.

    Whitbeck

    “Specifically, we are looking for dark matter produced in association with quarks, the constituents of the proton and neutron. These signatures are important for both understanding the nature of dark matter, but also the nature of the Higgs boson, a cornerstone of our theory for how elementary particles interact.”

    The discovery of the Higgs boson at the LHC in 2012 was a widely celebrated accomplishment of the LHC and the detector collaborations involved.

    Peter Higgs


    CERN CMS Higgs Event


    CERN ATLAS Higgs Event

    However, the mere existence of the Higgs boson has provoked a lot of questions about whether there are new particles that could help us better understand the Higgs boson and other questions, like why gravity is so weak compared to other forces.

    As an offshoot of that finding, Whitbeck has been working to better understand particles called neutrinos.

    “Neutrinos are a unique particle in the catalog of known particles in that they are the lightest matter particles, and they only can interact with particles via the Weak force, which, as its name suggests, only produces a feeble force between neutrinos and other matter,” Whitbeck said. “Neutrinos are so weakly interacting at the energies produced by the LHC that it is very likely a neutrino travels through the entire earth without deviating from its initial trajectory.

    “Dark matter is expected to behave similarly given that, despite being all around us, we don’t directly see it. This means that in looking for dark matter produced in proton-proton collisions, we often find lots of neutrinos. Understanding how many events with neutrinos there are is an important first step to understanding if there are events with dark matter.”

    Since the discovery of the Higgs boson, many of the most obvious signatures have come up empty for any signs of dark matter, and the latest results are some of the most sensitive measurements done to date. However, Whitbeck and his fellow scientists will continue to look for many more subtle signatures as well as a very powerful signature in which dark matter hypothetically is produced almost by itself, with only one lonely proton fragment visible in the event. The strategy provides powerful constraints for the most difficult-to-see models of dark matter.

    “With all of the traditional ways of searching for dark matter in proton-proton collisions turning up empty, I have also been working to design a new experiment, the Light Dark Matter eXperiment (LDMX), that will employ detector technology and techniques similar to what is used at CMS to look for dark matter,” Whitbeck said.

    Texas Tech The LDMX Experiment schematic

    “One significant difference is that LDMX will look at electrons bombarding a target. If the mass of dark matter is somewhere between the mass of the electron and the mass of the proton, this experiment will likely be able to see it.”

    Texas Tech also is working to upgrade the CMS detector so it can handle much higher rates of collisions after the LHC undergoes some upgrades of its own. The hope is that with higher rates, they’ll be able to see not only new massive particles but also the rarest of processes, such as the production of two Higgs bosons. This detector construction is ramping up now at Texas Tech’s new Advanced Physics Detector Laboratory at Reese Technology Center.

    Besides being a background for dark matter searches, neutrinos also are a growing focus of research in particle physics. Even now, the Fermi National Accelerator Laboratory is able to produce intense beams of neutrinos that can be used to study their idiosyncrasies, but there are plans to upgrade the facility to produce the most intense beams of neutrinos ever and to place the most sensitive neutrino detectors nearby, making the U.S. the center of neutrino physics.

    FNAL/NOvA experiment map

    FNAL LBNF/DUNE from FNAL to SURF, Lead, South Dakota, USA

    Measurements done with these neutrinos could unlock whether these particles play a big role in the creation of a matter-dominated universe.

    Texas Tech’s High Energy Physics group hopes that, in the near future, it can help tackle some of the challenges this endeavor presents.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    FNAL Icon

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics. Fermilab is America’s premier laboratory for particle physics and accelerator research, funded by the U.S. Department of Energy. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.

     
  • richardmitnick 12:10 pm on July 15, 2019
    Tags: CERN LHC

    From CERN: “Exploring the Higgs boson ‘discovery channels’”

    Cern New Bloc

    Cern New Particle Event


    From CERN

    12th July 2019
    ATLAS Collaboration

    Event display of a two-electron two-muon ZH candidate. The Higgs candidate can be seen on the left with the two leading electrons represented by green tracks and green EM calorimeter deposits (pT = 22 and 120 GeV), and two subleading muons indicated by two red tracks (pT = 34 and 43 GeV). Recoiling against the four lepton candidate in the left hemisphere is a dimuon pair in the right hemisphere indicated by two red tracks (pT = 139 and 42 GeV) and an invariant mass of 91.5 GeV, which agrees well with the mass of the Z boson. (Image: ATLAS Collaboration/CERN)

    At the 2019 European Physical Society’s High-Energy Physics conference (EPS-HEP) taking place in Ghent, Belgium, the ATLAS and CMS collaborations presented a suite of new results. These include several analyses using the full dataset from the second run of CERN’s Large Hadron Collider (LHC), recorded at a collision energy of 13 TeV between 2015 and 2018. Among the highlights are the latest precision measurements involving the Higgs boson. In only seven years since its discovery, scientists have carefully studied several of the properties of this unique particle, which is increasingly becoming a powerful tool in the search for new physics.

    The results include new searches for transformations (or “decays”) of the Higgs boson into pairs of muons and into pairs of charm quarks. Both ATLAS and CMS also measured previously unexplored properties of decays of the Higgs boson that involve electroweak bosons (the W, the Z and the photon) and compared these with the predictions of the Standard Model (SM) of particle physics. ATLAS and CMS will continue these studies over the course of the LHC’s Run 3 (2021 to 2023) and in the era of the High-Luminosity LHC (from 2026 onwards).

    The Higgs boson is the quantum manifestation of the all-pervading Higgs field, which gives mass to elementary particles it interacts with, via the Brout-Englert-Higgs mechanism. Scientists look for such interactions between the Higgs boson and elementary particles, either by studying specific decays of the Higgs boson or by searching for instances where the Higgs boson is produced along with other particles. The Higgs boson decays almost instantly after being produced in the LHC and it is by looking through its decay products that scientists can probe its behaviour.

    In the LHC’s Run 1 (2010 to 2012), decays of the Higgs boson involving pairs of electroweak bosons were observed. Now, the complete Run 2 dataset – around 140 inverse femtobarns each, the equivalent of over 10 000 trillion collisions – provides a much larger sample of Higgs bosons to study, allowing measurements of the particle’s properties to be made with unprecedented precision. ATLAS and CMS have measured the so-called “differential cross-sections” of the bosonic decay processes, which look at not just the production rate of Higgs bosons but also the distribution and orientation of the decay products relative to the colliding proton beams. These measurements provide insight into the underlying mechanism that produces the Higgs bosons. Both collaborations determined that the observed rates and distributions are compatible with those predicted by the Standard Model, within the current statistical uncertainties.

    Since the strength of the Higgs boson’s interaction is proportional to the mass of elementary particles, it interacts most strongly with the heaviest generation of fermions, the third. Previously, ATLAS and CMS had each observed these interactions. However, interactions with the lighter second-generation fermions – muons, charm quarks and strange quarks – are considerably rarer. At EPS-HEP, both collaborations reported on their searches for the elusive second-generation interactions.
    ATLAS presented their first result from searches for Higgs bosons decaying to pairs of muons (H→μμ) with the full Run 2 dataset. This search is complicated by the large background of more typical SM processes that produce pairs of muons. “This result shows that we are now close to the sensitivity required to test the Standard Model’s predictions for this very rare decay of the Higgs boson,” says Karl Jakobs, the ATLAS spokesperson. “However, a definitive statement on the second generation will require the larger datasets that will be provided by the LHC in Run 3 and by the High-Luminosity LHC.”

    CMS presented their first result on searches for decays of Higgs bosons to pairs of charm quarks (H→cc). When a Higgs boson decays into quarks, these elementary particles immediately produce jets of particles. “Identifying jets formed by charm quarks and isolating them from other types of jets is a huge challenge,” says Roberto Carlin, spokesperson for CMS. “We’re very happy to have shown that we can tackle this difficult decay channel. We have developed novel machine-learning techniques to help with this task.”

    An event recorded by CMS showing a candidate for a Higgs boson produced in association with two top quarks. The Higgs boson and top quarks decay leading to a final state with seven jets (orange cones), an electron (green line), a muon (red line) and missing transverse energy (pink line) (Image: CMS/CERN)

    The Higgs boson also acts as a mediator of physics processes in which electroweak bosons scatter or bounce off each other. Studies of these processes with very high statistics serve as powerful tests of the Standard Model. ATLAS presented the first-ever measurement of the scattering of two Z bosons. Observing this scattering completes the picture for the W and Z bosons, as ATLAS had previously observed the WZ scattering process and both collaborations had observed WW scattering. CMS presented the first observation of electroweak-boson scattering that results in the production of a Z boson and a photon.

    “The experiments are making big strides in the monumental task of understanding the Higgs boson,” says Eckhard Elsen, CERN’s Director of Research and Computing. “After observation of its coupling to the third-generation fermions, the experiments have now shown that they have the tools at hand to address the even more challenging second generation. The LHC’s precision physics programme is in full swing.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Meet CERN in a variety of places:

    Quantum Diaries
    QuantumDiaries

    Cern Courier

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS

    CERN ATLAS Image Claudia Marcelloni CERN/ATLAS


    ALICE

    CERN/ALICE Detector


    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    LHC

    CERN map

    CERN LHC Grand Tunnel

    CERN LHC particles

     
  • richardmitnick 7:29 am on May 23, 2019
    Tags: "Atom smasher could be making new particles that are hiding in plain sight", , , CERN FASER experiment, CERN LHC, Compact Detector for Exotics at LHCb, ,   

    From Science Magazine: “Atom smasher could be making new particles that are hiding in plain sight” 

    AAAS
    From Science Magazine

    May 22, 2019
    Adrian Cho

    In a simulated event, the track of a decay particle called a muon (red), displaced slightly from the center of particle collisions, could be a sign of new physics.
    ATLAS EXPERIMENT © 2019 CERN

    Are new particles materializing right under physicists’ noses and going unnoticed? The world’s great atom smasher, the Large Hadron Collider (LHC), could be making long-lived particles that slip through its detectors, some researchers say.

    LHC

    CERN map


    CERN LHC Maximilien Brice and Julien Marius Ordan


    CERN LHC particles

    Next week, they will gather at the LHC’s home, CERN, the European particle physics laboratory near Geneva, Switzerland, to discuss how to capture them.


    They argue the LHC’s next run should emphasize such searches, and some are calling for new detectors that could sniff out the fugitive particles.

    It’s a push born of anxiety. In 2012, experimenters at the $5 billion LHC discovered the Higgs boson, the last particle predicted by the standard model of particles and forces, and the key to explaining how fundamental particles get their masses.

    CERN CMS Higgs Event


    CERN ATLAS Higgs Event

    But the LHC has yet to blast out anything beyond the standard model.

    Standard Model of Particle Physics (LATHAM BOYLE AND MARDUS OF WIKIMEDIA COMMONS)

    “We haven’t found any new physics with the assumptions we started with, so maybe we need to change the assumptions,” says Juliette Alimena, a physicist at Ohio State University (OSU) in Columbus who works with the Compact Muon Solenoid (CMS), one of the two main particle detectors fed by the LHC.

    CERN/CMS Detector


    For decades, physicists have relied on a simple strategy to look for new particles: Smash together protons or electrons at ever-higher energies to produce heavy new particles and watch them decay instantly into lighter, familiar particles within the huge, barrel-shaped detectors. That’s how CMS and its rival detector, A Toroidal LHC Apparatus (ATLAS), spotted the Higgs, which in a trillionth of a nanosecond can decay into, among other things, a pair of photons or two “jets” of lighter particles.

    CERN ATLAS Credit CERN SCIENCE PHOTO LIBRARY

    Long-lived particles, however, would zip through part or all of the detector before decaying. That idea is more than a shot in the dark, says Giovanna Cottin, a theorist at National Taiwan University in Taipei. “Almost all the frameworks for beyond-the-standard-model physics predict the existence of long-lived particles,” she says. For example, a scheme called supersymmetry posits that every standard model particle has a heavier superpartner, some of which could be long-lived. Long-lived particles also emerge in “dark sector” theories that envision undetectable particles that interact with ordinary matter only through “portal” particles, such as a dark photon that every so often would replace an ordinary photon in a particle interaction.

    CMS and ATLAS, however, were designed to detect particles that decay instantaneously. Like an onion, each detector contains layers of subsystems—trackers that trace charged particles, calorimeters that measure particle energies, and chambers that detect penetrating and particularly handy particles called muons—all arrayed around a central point where the accelerator’s proton beams collide. Particles that fly even a few millimeters before decaying would leave unusual signatures: kinked or offset tracks, or jets that emerge gradually instead of all at once.
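
    How far "a few millimeters" stretches depends on the particle's lifetime and boost. A small sketch under stated assumptions (the lifetime and Lorentz factor below are hypothetical; real detector geometry is far more involved):

```python
import math
import random

C = 299_792_458.0  # speed of light, m/s

def mean_decay_length(tau_s, gamma):
    """Mean lab-frame decay length: lambda = beta * gamma * c * tau."""
    beta = math.sqrt(1.0 - 1.0 / gamma**2)
    return beta * gamma * C * tau_s

# Hypothetical long-lived particle: proper lifetime 1 ns, gamma = 10.
lam = mean_decay_length(1e-9, 10.0)  # about 3 m
decays = [random.expovariate(1.0 / lam) for _ in range(100_000)]
displaced = sum(0.001 < d < 1.0 for d in decays) / len(decays)
print(f"mean decay length: {lam:.2f} m")
print(f"fraction decaying 1 mm to 1 m from the beamline: {displaced:.1%}")
```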

    Standard data analysis often assumes such oddities are mistakes and junk, notes Tova Holmes, an ATLAS member from the University of Chicago in Illinois who is searching for the displaced tracks of decays from long-lived supersymmetric particles. “It’s a bit of a challenge because the way we’ve designed things, and the software people have written, basically rejects these things,” she says. So Holmes and colleagues had to rewrite some of that software.

    More important is ensuring that the detectors record the odd events in the first place. The LHC smashes bunches of protons together 400 million times a second. To avoid data overload, trigger systems on CMS and ATLAS sift interesting collisions from dull ones and immediately discard the data from about 1,999 of every 2,000 collisions. The culling can inadvertently toss out long-lived particles. Alimena and colleagues wanted to look for particles that live long enough to get stuck in CMS’s calorimeter and decay only later. So they had to put in a special trigger that occasionally reads out the entire detector between the proton collisions.
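
    The scale of that culling follows directly from the numbers in the paragraph above; a back-of-the-envelope sketch (real trigger menus are far more elaborate):

```python
collisions_per_second = 400_000_000  # collision rate quoted above
kept_fraction = 1 / 2000             # the trigger keeps ~1 in 2,000
kept_per_second = collisions_per_second * kept_fraction
print(f"{kept_per_second:,.0f} events/s written out")  # 200,000
# Everything else is discarded forever, which is why a long-lived
# particle that never fires a trigger is simply lost.
```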

    Long-lived particle searches had been fringe efforts, says James Beacham, an ATLAS experimenter from OSU. “It’s always been one guy working on this thing,” he says. “Your support group was you in your office.” Now, researchers are joining forces. In March, 182 of them released a 301-page white paper on how to optimize their searches.

    Some want ATLAS and CMS to dedicate more triggers to long-lived particle searches in the next LHC run, from 2021 through 2023. In fact, the next run “is probably our last chance to look for unusual rare events,” says Livia Soffi, a CMS member from the Sapienza University of Rome. Afterward, an upgrade will increase the intensity of the LHC’s beams, requiring tighter triggers.

    Others have proposed a half-dozen new detectors to search for particles so long-lived that they escape the LHC’s existing detectors altogether. Jonathan Feng, a theorist at the University of California, Irvine, and colleagues have won CERN approval for the Forward Search Experiment (FASER), a small tracker to be placed in a service tunnel 480 meters down the beamline from ATLAS.

    CERN FASER experiment schematic

    Supported by $2 million from private foundations and built of borrowed parts, FASER will look for low-mass particles such as dark photons, which could spew from ATLAS, zip through the intervening rock, and decay into electron-positron pairs.
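
    The placement trade-off can be captured in one standard exponential-decay estimate: the probability that a particle survives the trip through the rock and then decays inside the detector, given its mean lab-frame decay length (FASER's 480 m is plugged in for illustration):

```latex
P_{\mathrm{decay}} \;=\; e^{-L/\lambda}\left(1 - e^{-\Delta L/\lambda}\right),
\qquad \lambda = \beta\gamma c\tau,
\qquad L \approx 480\ \mathrm{m}.
```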

    Another proposal calls for a tracking chamber in an empty hall next to the LHCb, a smaller detector fed by the LHC.

    CERN/LHCb detector

    The Compact Detector for Exotics at LHCb would look for long-lived particles, especially those born in Higgs decays, says Vladimir Gligorov, an LHCb member from the Laboratory for Nuclear Physics and High Energies in Paris.

    The Compact Detector for Exotics at LHCb. https://indico.cern.ch/event/755856/contributions/3263683/attachments/1779990/2897218/PBC2019_CERN_CodexB_report.pdf

    Even more ambitious would be a detector called MATHUSLA, essentially a large, empty building on the surface above the subterranean CMS detector.

    MATHUSLA. http://cds.cern.ch/record/2653848

    Tracking chambers in the ceiling would detect jets spraying up from the decays of long-lived particles created 70 meters below, says David Curtin, a theorist at the University of Toronto in Canada and project co-leader. Curtin is “optimistic” MATHUSLA would cost less than €100 million. “Given that it has sensitivity to this broad range of signatures—and that we haven’t seen anything else—I’d say it’s a no-brainer.”

    Physicists have a duty to look for the odd particles, Beacham says. “The nightmare scenario is that in 20 years, Jill Theorist says, ‘The reason you didn’t see anything is you didn’t keep the right events and do the right search.’”

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 12:04 pm on May 14, 2019 Permalink | Reply
    Tags: >Model-dependent vs model-independent research, , , , CERN LHC, , , , , , ,   

    From Symmetry: “Casting a wide net” 

    Symmetry Mag
    From Symmetry

    05/14/19
    Jim Daley

    Illustration by Sandbox Studio, Chicago

    In their quest to discover physics beyond the Standard Model, physicists weigh the pros and cons of different search strategies.

    On October 30, 1975, theorists John Ellis, Mary K. Gaillard and D.V. Nanopoulos published a paper [Science Direct] titled “A Phenomenological Profile of the Higgs Boson.” They ended their paper with a note to their fellow scientists.

    “We should perhaps finish with an apology and a caution,” it said. “We apologize to experimentalists for having no idea what is the mass of the Higgs boson… and for not being sure of its couplings to other particles, except that they are probably all very small.

    “For these reasons, we do not want to encourage big experimental searches for the Higgs boson, but we do feel that people performing experiments vulnerable to the Higgs boson should know how it may turn up.”

    What the theorists were cautioning against was a model-dependent search, a search for a particle predicted by a certain model—in this case, the Standard Model of particle physics.

    Standard Model of Particle Physics

    It shouldn’t have been too much of a worry. Around then, most particle physicists’ experiments were general searches, not based on predictions from a particular model, says Jonathan Feng, a theoretical particle physicist at the University of California, Irvine.

    Using early particle colliders, physicists smashed electrons and protons together at high energies and looked to see what came out. For example, Samuel Ting and Burton Richter, who shared the 1976 Nobel Prize in Physics for the discovery of the charm quark, were not looking for the particle with any theoretical prejudice, Feng says.

    That began to change in the 1980s and ’90s. That’s when physicists began exploring elegant new theories such as supersymmetry, which could tie up many of the Standard Model’s theoretical loose ends—and which predict the existence of a whole slew of new particles for scientists to try to find.

    Of course, there was also the Higgs boson. Even though scientists didn’t have a good prediction of its mass, they had good motivations for thinking it was out there waiting to be discovered.

    And it was. Almost 40 years after the theorists’ tongue-in-cheek warning about searching for the Higgs, Ellis found himself sitting in the main auditorium at CERN next to experimentalist Fabiola Gianotti, the spokesperson of the ATLAS experiment at the Large Hadron Collider who, along with CMS spokesperson Joseph Incandela, had just co-announced the discovery of the particle he had once so pessimistically described.

    [Image captions: CERN CMS Higgs Event; CERN ATLAS Higgs Event]

    Model-dependent vs model-independent

    Scientists’ searches for particles predicted by certain models continue, but in recent years, searches for new physics independent of those models have begun to enjoy a resurgence as well.

    “A model-independent search is supposed to distill the essence from a whole bunch of specific models and look for something that’s independent of the details,” Feng says. The goal is to find an interesting common feature of those models, he explains. “And then I’m going to just look for that phenomenon, irrespective of the details.”

    Particle physicist Sara Alderweireldt uses model-independent searches in her work on the ATLAS experiment at the Large Hadron Collider.

    CERN ATLAS Image Claudia Marcelloni CERN/ATLAS

    Alderweireldt says that while many high-energy particle physics experiments are designed to make very precise measurements of a specific aspect of the Standard Model, a model-independent search allows physicists to take a wider view and search more generally for new particles or interactions. “Instead of zooming in, we try to look in as many places as possible in a consistent way.”

    Such a search makes room for the unexpected, she says. “You’re not dependent on the prior interpretation of something you would be looking for.”

    Theorist Patrick Fox and experimentalist Anadi Canepa, both at Fermilab, collaborate on searches for new physics.


    In Canepa’s work on the CMS experiment, the other general-purpose particle detector at the LHC, many of the searches are model-independent.

    While the nature of these searches allows them to “cast a wider net,” Fox says, “they are in some sense shallower, because they don’t manage to strongly constrain any one particular model.”

    At the same time, “by combining the results from many independent searches, we are getting closer to one dedicated search,” Canepa says. “Developing both model-dependent and model-independent searches is the approach adopted by the CMS and ATLAS experiments to fully exploit the unprecedented potential of the LHC.”

    Driven by data and powered by machine learning

    Model-dependent searches focus on a single assumption or look for evidence of a specific final state following an experimental particle collision. Model-independent searches are far broader—and how broad is largely driven by the speed at which data can be processed.

    “We have better particle detectors, and more advanced algorithms and statistical tools that are enabling us to understand searches in broader terms,” Canepa says.

    One reason model-independent searches are gaining prominence is because now there is enough data to support them. Particle detectors are recording vast quantities of information, and modern computers can run simulations faster than ever before, she says. “We are able to do model-independent searches because we are able to better understand much larger amounts of data and extreme regions of parameter and phase space.”
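
    One way to picture such a search: scan many distributions at once and flag any histogram bin where the observed count is improbably far above the Standard Model expectation. A toy sketch with invented numbers (a real analysis must also correct for the many bins inspected, the so-called look-elsewhere effect):

```python
import numpy as np
from scipy.stats import poisson

# Standard Model expectations and "observed" counts, both invented:
expected = np.array([1000.0, 400.0, 150.0, 60.0, 25.0, 10.0, 4.0])
observed = np.array([1012, 385, 160, 58, 24, 22, 3])

# P(N >= observed | SM), the chance of a fluctuation at least this big:
p_up = poisson.sf(observed - 1, expected)
for i, (e, o, p) in enumerate(zip(expected, observed, p_up)):
    flag = "  <-- excess worth a closer look" if (o > e and p < 1e-3) else ""
    print(f"bin {i}: expected {e:7.1f}  observed {o:5d}  p = {p:.2e}{flag}")
```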

    Machine learning is a key part of this processing power, Canepa says. “That’s really a change of paradigm, because it really made us make a major leap forward in terms of sensitivity [to new signals]. It really allows us to benefit from understanding the correlations that we didn’t capture in a more classical approach.”

    These broader searches are an important part of modern particle physics research, Fox says.

    “At a very basic level, our job is to bequeath to our descendants a better understanding of nature than we got from our ancestors,” he says. “One way to do that is to produce lots of information that will stand the test of time, and one way of doing that is with model-independent searches.”

    Models go in and out of fashion, he adds. “But model-independent searches don’t feel like they will.”

    See the full article here.



    Please help promote STEM in your local schools.


    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 11:26 am on May 5, 2019 Permalink | Reply
    Tags: 'Where Does A Proton’s Mass Come From?', 99.8% of the proton’s mass comes from gluons, , Antiquarks, Asymptotic freedom: the particles that mediate this force are known as gluons., , , CERN LHC, , , , , , , , The production of Higgs bosons is dominated by gluon-gluon collisions at the LHC, , The strong interaction is the most powerful interaction in the entire known Universe.   

    From Ethan Siegel: “Ask Ethan: ‘Where Does A Proton’s Mass Come From?'” 

    From Ethan Siegel
    May 4, 2019

    The three valence quarks of a proton contribute to its spin, but so do the gluons, sea quarks and antiquarks, and orbital angular momentum as well. The electrostatic repulsion and the attractive strong nuclear force, in tandem, are what give the proton its size, and the properties of quark mixing are required to explain the suite of free and composite particles in our Universe. (APS/ALAN STONEBRAKER)

    The whole should equal the sum of its parts, but doesn’t. Here’s why.

    The whole is equal to the sum of its constituent parts. That’s how everything works, from galaxies to planets to cities to molecules to atoms. If you take all the components of any system and look at them individually, you can clearly see how they all fit together to add up to the entire system, with nothing missing and nothing left over. The total amount you have is equal to the amounts of all the different parts of it added together.

    So why isn’t that the case for the proton? It’s made of three quarks, but if you add up the quark masses, they not only don’t equal the proton’s mass, they don’t come close. This is the puzzle that Barry Duffey wants us to address, asking:

    “What’s happening inside protons? Why does [its] mass so greatly exceed the combined masses of its constituent quarks and gluons?”

    In order to find out, we have to take a deep look inside.

    The composition of the human body, by atomic number and by mass. The whole of our bodies is equal to the sum of its parts, until you get down to an extremely fundamental level. At that point, we can see that we’re actually more than the sum of our constituent components. (ED UTHMAN, M.D., VIA WEB2.AIRMAIL.NET/UTHMAN (L); WIKIMEDIA COMMONS USER ZHAOCAROL (R))

    There’s a hint that comes just from looking at your own body. If you were to divide yourself up into smaller and smaller bits, you’d find — in terms of mass — the whole was equal to the sum of its parts. Your body’s bones, fat, muscles and organs sum up to an entire human being. Breaking those down further, into cells, still allows you to add them up and recover the same mass you have today.

    Cells can be divided into organelles, organelles are composed of individual molecules, molecules are made of atoms; at each stage, the mass of the whole is no different than that of its parts. But when you break atoms into protons, neutrons and electrons, something interesting happens. At that level, there’s a tiny but noticeable discrepancy: the individual protons, neutrons and electrons are off by right around 1% from an entire human. The difference is real.

    From macroscopic scales down to subatomic ones, the sizes of the fundamental particles play only a small role in determining the sizes of composite structures. Whether the building blocks are truly fundamental and/or point-like particles is still not known. (MAGDALENA KOWALSKA / CERN / ISOLDE TEAM)

    CERN ISOLDE

    Like all known organisms, human beings are carbon-based life forms. Carbon atoms are made up of six protons and six neutrons, but if you look at the mass of a carbon atom, it’s approximately 0.8% lighter than the sum of the individual component particles that make it up. The culprit here is nuclear binding energy; when you have atomic nuclei bound together, their total mass is smaller than the mass of the protons and neutrons that comprise them.
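
    The 0.8% figure is easy to verify from tabulated masses (rounded values in atomic mass units; a back-of-the-envelope check, not a precision calculation):

```python
# Masses in atomic mass units (u), rounded:
m_p, m_n, m_e = 1.007276, 1.008665, 0.000549
parts = 6 * (m_p + m_n + m_e)  # six protons, six neutrons, six electrons
carbon_12 = 12.0               # exact, by definition of the unit
deficit = (parts - carbon_12) / parts
print(f"sum of parts: {parts:.5f} u, mass deficit: {deficit:.2%}")  # ~0.82%
```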

    The way carbon is formed is through the nuclear fusion of hydrogen into helium and then helium into carbon; the energy released is what powers most types of stars in both their normal and red giant phases. That “lost mass” is where the energy powering stars comes from, thanks to Einstein’s E = mc². As stars burn through their fuel, they produce more tightly-bound nuclei, releasing the energy difference as radiation.

    In between the 2nd and 3rd brightest stars of the constellation Lyra, the blue giant stars Sheliak and Sulafat, the Ring Nebula shines prominently in the night skies. Throughout all phases of a star’s life, including the giant phase, nuclear fusion powers them, with the nuclei becoming more tightly bound and the energy emitted as radiation coming from the transformation of mass into energy via E = mc². (NASA, ESA, DIGITIZED SKY SURVEY 2)

    [Image captions: NASA/ESA Hubble Telescope; ESO Online Digitized Sky Survey Telescopes; Caltech Palomar Samuel Oschin 48-inch Telescope, San Diego County, California, altitude 1,712 m; Australian Astronomical Observatory, Siding Spring Observatory, 1.2 m UK Schmidt Telescope, altitude 1,165 m; from http://archive.eso.org/dss/dss]

    This is how most types of binding energy work: the reason it’s harder to pull apart multiple things that are bound together is because they released energy when they were joined, and you have to put energy in to free them again. That’s why it’s such a puzzling fact that when you take a look at the particles that make up the proton — the up, up, and down quarks at the heart of them — their combined masses are only 0.2% of the mass of the proton as a whole. But the puzzle has a solution that’s rooted in the nature of the strong force itself.

    The way quarks bind into protons is fundamentally different from all the other forces and interactions we know of. Instead of the force getting stronger when objects get closer, like the gravitational, electric, or magnetic forces, the attractive force goes down to zero when quarks get arbitrarily close. And instead of the force getting weaker when objects get farther away, the force pulling quarks back together gets stronger the farther away they get.

    The internal structure of a proton, with quarks, gluons, and quark spin shown. The nuclear force acts like a spring, with negligible force when unstretched but large, attractive forces when stretched to large distances. (BROOKHAVEN NATIONAL LABORATORY)

    This property of the strong nuclear force is known as asymptotic freedom, and the particles that mediate this force are known as gluons. Somehow, the energy binding the proton together, responsible for the other 99.8% of the proton’s mass, comes from these gluons. The whole of matter, somehow, weighs much, much more than the sum of its parts.

    This might sound like an impossibility at first, as the gluons themselves are massless particles. But you can think of the forces they give rise to as springs: asymptoting to zero when the springs are unstretched, but becoming very large the greater the amount of stretching. In fact, the amount of energy between two quarks whose distance gets too large can become so great that it’s as though additional quark/antiquark pairs exist inside the proton: sea quarks.
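
    The spring picture has a standard quantitative form: the phenomenological "Cornell potential" for a quark-antiquark pair, Coulomb-like at short range but rising linearly, like a stretched spring, at large separation (the string tension value below is a typical fitted number, quoted for illustration):

```latex
V(r) \;\approx\; -\frac{4}{3}\,\frac{\alpha_s}{r} \;+\; \sigma r,
\qquad \sigma \approx 0.18\ \mathrm{GeV}^2 \approx 0.9\ \mathrm{GeV/fm}.
```

    At that tension, separating the pair by an extra femtometer costs roughly 1 GeV, more than enough energy to materialize a new quark/antiquark pair from the vacuum.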

    When two protons collide, it isn’t just the quarks making them up that can collide, but the sea quarks, gluons, and beyond that, field interactions. All can provide insights into the spin of the individual components, and allow us to create potentially new particles if high enough energies and luminosities are reached. (CERN / CMS COLLABORATION)

    Those of you familiar with quantum field theory might have the urge to dismiss the gluons and the sea quarks as just being virtual particles: calculational tools used to arrive at the right result. But that’s not true at all, and we’ve demonstrated that with high-energy collisions between either two protons or a proton and another particle, like an electron or photon.

    The collisions performed at the Large Hadron Collider at CERN are perhaps the greatest test of all for the internal structure of the proton. When two protons collide at these ultra-high energies, most of them simply pass by one another, failing to interact. But when two internal, point-like particles collide, we can reconstruct exactly what it was that smashed together by looking at the debris that comes out.

    A Higgs boson event as seen in the Compact Muon Solenoid detector at the Large Hadron Collider. This spectacular collision is 15 orders of magnitude below the Planck energy, but it’s the precision measurements of the detector that allow us to reconstruct what happened back at (and near) the collision point. Theoretically, the Higgs gives mass to the fundamental particles; however, the proton’s mass is not due to the mass of the quarks and gluons that compose it. (CERN / CMS COLLABORATION)

    Under 10% of the collisions occur between two quarks; the overwhelming majority are gluon-gluon collisions, with quark-gluon collisions making up the remainder. Moreover, not every quark-quark collision in protons occurs between either up or down quarks; sometimes a heavier quark is involved.

    Although it might make us uncomfortable, these experiments teach us an important lesson: the particles that we use to model the internal structure of protons are real. In fact, the discovery of the Higgs boson itself was only possible because of this, as the production of Higgs bosons is dominated by gluon-gluon collisions at the LHC. If all we had were the three valence quarks to rely on, we would have seen different rates of production of the Higgs than we did.

    Before the mass of the Higgs boson was known, we could still calculate the expected production rates of Higgs bosons from proton-proton collisions at the LHC. The top channel is clearly production by gluon-gluon collisions. I (E. Siegel) have added the yellow highlighted region to indicate where the Higgs boson was discovered. (CMS COLLABORATION (DORIGO, TOMMASO FOR THE COLLABORATION) ARXIV:0910.3489)

    As always, though, there’s still plenty more to learn. We presently have a solid model of the average gluon density inside a proton, but if we want to know where the gluons are actually more likely to be located, that requires more experimental data, as well as better models to compare the data against. Recent advances by theorists Björn Schenke and Heikki Mäntysaari may be able to provide those much needed models. As Mäntysaari detailed:

    “It is very accurately known how large the average gluon density is inside a proton. What is not known is exactly where the gluons are located inside the proton. We model the gluons as located around the three [valence] quarks. Then we control the amount of fluctuations represented in the model by setting how large the gluon clouds are, and how far apart they are from each other. […] The more fluctuations we have, the more likely this process [producing a J/ψ meson] is to happen.”

    A schematic of the world’s first electron-ion collider (EIC). Adding an electron ring (red) to the Relativistic Heavy Ion Collider (RHIC) at Brookhaven would create the eRHIC: a proposed deep inelastic scattering experiment that could improve our knowledge of the internal structure of the proton significantly. (BROOKHAVEN NATIONAL LABORATORY-CAD ERHIC GROUP)

    The combination of this new theoretical model and the ever-improving LHC data will better enable scientists to understand the internal, fundamental structure of protons, neutrons and nuclei in general, and hence to understand where the mass of the known objects in the Universe comes from. From an experimental point of view, the greatest boon would be a next-generation electron-ion collider, which would enable us to perform deep inelastic scattering experiments to reveal the internal makeup of these particles as never before.

    But there’s another theoretical approach that can take us even farther into the realm of understanding where the proton’s mass comes from: Lattice QCD.

    A better understanding of the internal structure of a proton, including how the “sea” quarks and gluons are distributed, has been achieved through both experimental improvements and new theoretical developments in tandem. (BROOKHAVEN NATIONAL LABORATORY)

    The difficult part with the quantum field theory that describes the strong force — quantum chromodynamics (QCD) — is that the standard approach we take to doing calculations is no good. Typically, we’d look at the effects of particle couplings: the charged quarks exchange a gluon and that mediates the force. They could exchange gluons in a way that creates a particle-antiparticle pair or an additional gluon, and that should be a correction to a simple one-gluon exchange. They could create additional pairs or gluons, which would be higher-order corrections.

    We call this approach taking a perturbative expansion in quantum field theory, with the idea that calculating higher and higher-order contributions will give us a more accurate result.
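
    Schematically, such a prediction is a power series in the coupling, with each coefficient collecting the diagrams that have that many extra vertices (a standard textbook form, shown here for orientation):

```latex
\mathcal{O} \;=\; c_0 \;+\; c_1\,\alpha_s \;+\; c_2\,\alpha_s^2 \;+\; c_3\,\alpha_s^3 \;+\; \dots
```

    The series is only useful when the coupling is small, so that each successive term matters less than the one before it.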

    Today, Feynman diagrams are used in calculating every fundamental interaction spanning the strong, weak, and electromagnetic forces, including in high-energy and low-temperature/condensed conditions. But this approach, which relies on a perturbative expansion, is only of limited utility for the strong interactions, as this approach diverges, rather than converges, when you add more and more loops for QCD. (DE CARVALHO, VANUILDO S. ET AL. NUCL.PHYS. B875 (2013) 738–756)

    Richard Feynman © Open University

    But this approach, which works so well for quantum electrodynamics (QED), fails spectacularly for QCD. The strong force works differently, and so these corrections get very large very quickly. Adding more terms, instead of converging towards the correct answer, diverges and takes you away from it. Fortunately, there is another way to approach the problem: non-perturbatively, using a technique called Lattice QCD.

    By treating space and time as a grid (or lattice of points) rather than a continuum, where the lattice is arbitrarily large and the spacing is arbitrarily small, you overcome this problem in a clever way. Whereas in standard, perturbative QCD, the continuous nature of space means that you lose the ability to calculate interaction strengths at small distances, the lattice approach means there’s a cutoff at the size of the lattice spacing. Quarks exist at the intersections of grid lines; gluons exist along the links connecting grid points.
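
    In the standard Wilson formulation, the gluon field is encoded as SU(3) matrices on the links, and the simplest gauge action sums over the elementary squares ("plaquettes") of the grid, recovering continuum QCD as the spacing shrinks to zero (a textbook form, quoted for orientation):

```latex
S_W \;=\; \beta \sum_{x,\;\mu<\nu} \left(1 - \tfrac{1}{3}\,\mathrm{Re}\,\mathrm{Tr}\,U_{\mu\nu}(x)\right),
\qquad
U_{\mu\nu}(x) \;=\; U_\mu(x)\,U_\nu(x+\hat\mu)\,U_\mu^\dagger(x+\hat\nu)\,U_\nu^\dagger(x),
\qquad \beta = \frac{6}{g^2}.
```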

    As your computing power increases, you can make the lattice spacing smaller, which improves your calculational accuracy. Over the past three decades, this technique has led to an explosion of solid predictions, including the masses of light nuclei and the reaction rates of fusion under specific temperature and energy conditions. The mass of the proton, from first principles, can now be theoretically predicted to within 2%.

    As computational power and Lattice QCD techniques have improved over time, so has the accuracy to which various quantities about the proton, such as its component spin contributions, can be computed. By reducing the lattice spacing size, which can be done simply by raising the computational power employed, we can better predict the mass of not only the proton, but of all the baryons and mesons. (LABORATOIRE DE PHYSIQUE DE CLERMONT / ETM COLLABORATION)

    It’s true that the individual quarks, whose masses are determined by their coupling to the Higgs boson, cannot even account for 1% of the mass of the proton. Rather, it’s the strong force, described by the interactions between quarks and the gluons that mediate them, that is responsible for practically all of it.
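
    A quick check with present-day current-quark masses (PDG-scale values, rounded) bears this out:

```latex
2m_u + m_d \;\approx\; 2(2.2\ \mathrm{MeV}) + 4.7\ \mathrm{MeV} \;=\; 9.1\ \mathrm{MeV},
\qquad
\frac{9.1\ \mathrm{MeV}}{938.3\ \mathrm{MeV}} \;\approx\; 1\%.
```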

    The strong nuclear force is the most powerful interaction in the entire known Universe. When you go inside a particle like the proton, it’s so powerful that it — not the mass of the proton’s constituent particles — is primarily responsible for the total energy (and therefore mass) of the normal matter in our Universe. Quarks may be point-like, but the proton is huge by comparison: 8.4 × 10^-16 m in radius. Confining its component particles, which the binding energy of the strong force does, is what’s responsible for 99.8% of the proton’s mass.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    “Starts With A Bang! is a blog/video blog about cosmology, physics, astronomy, and anything else I find interesting enough to write about. I am a firm believer that the highest good in life is learning, and the greatest evil is willful ignorance. The goal of everything on this site is to help inform you about our world, how we came to be here, and to understand how it all works. As I write these pages for you, I hope to not only explain to you what we know, think, and believe, but how we know it, and why we draw the conclusions we do. It is my hope that you find this interesting, informative, and accessible,” says Ethan

     
  • richardmitnick 1:01 pm on May 2, 2019 Permalink | Reply
    Tags: , An unexpected signature, , CERN LHC, It’s not always about what you discover, Nature might be tough with us- but maybe nature is testing us and making us stronger., , , , Taking a closer look, Why the force of gravity is so much weaker than other known forces like electromagnetism. There is only one right answer. We haven’t found it yet.   

    From Symmetry: “The unseen progress of the LHC” 

    Symmetry Mag
    From Symmetry

    05/02/19
    Sarah Charley

    [Image captions: LHC; CERN map; CERN LHC, Maximilien Brice and Julien Marius Ordan; CERN LHC particles]

    It’s not always about what you discover.

    About seven years ago, physicist Stephane Willocq at the University of Massachusetts became enthralled with a set of theories that predicted the existence of curled-up extra dimensions hiding within our classical four dimensions of spacetime.

    “The idea of extra spatial dimensions is appealing because it allows us to look at the fundamental problems in particle physics from a different viewpoint,” Willocq says.

    As an experimental physicist, Willocq can do more than ponder. At the Large Hadron Collider at CERN, he put his pet theories to the test.

    Models based on those theories predicted how curled-up extra dimensions would affect the outcome of proton-proton collisions at the LHC. They predicted the collisions would produce more high-energy particles than expected.

    After several searches, Willocq and his colleagues found nothing out of the ordinary. “It was a great idea and disappointing to see it fade away, bit by bit,” he says, “but that’s how scientific progress works—finding the right idea by process of elimination.”

    The LHC research program is famous for discovering and studying the long-sought Higgs boson. But out of the spotlight, scientists have been using the LHC for an equally important scientific endeavor: testing, constraining and eliminating hundreds of theories that propose solutions to outstanding problems in physics, such as why the force of gravity is so much weaker than other known forces like electromagnetism.

    “There is only one right answer,” Willocq says. “We haven’t found it yet.”

    Now that scientists are at the end of the second run of the LHC, they have covered a huge amount of ground, eliminating the simplest versions of numerous theoretical ideas. They’ve covered four times as much phase space as previous searches for heavy new particles and set strict limits on what is physically possible.

    These studies don’t get the same attention as the Higgs boson, but these null results—results that don’t support a certain hypothesis—have moved physics forward as well.

    An unexpected signature

    Having chased down their most obvious leads, physicists are now adapting their methodology and considering new possibilities in their pursuit of new physics.

    Thus far, physicists have often used a straightforward formula to look for new particles. Massive particles produced in particle collisions will almost instantly decay, transforming into more stable particles. If scientists can measure all of those particles, they can reconstruct the mass and properties of the original particle that produced them.

    This worked wonderfully when scientists discovered the top quark in 1995 and the Higgs boson in 2012. But finding the next new thing might take a different tactic.

    “Finding new physics is more challenging than we expected it to be,” says University of Wisconsin physicist Tulika Bose of the CMS experiment. “Challenging situations make people come up with clever ideas.”

    One idea is that maybe scientists have been so focused on instantly decaying particles that they’ve been missing a whole host of particles that can travel up to several meters before falling apart. This would look like a firework exploding randomly in one of the detector subsystems.

    Scientists are rethinking how they reconstruct the data as a way to cast a bigger net and potentially catch particles with signatures like these. “If we only used our standard analysis methods, we would definitely not be sensitive to anything like this,” Bose says. “We’re no longer just reloading previous analyses but exploring innovative ideas.”

    Taking a closer look

    Since looking for excess particles coming out of collisions has yet to yield evidence of extra spatial dimensions, Willocq has decided to devote some of his efforts to a different method used at LHC experiments: precision measurements.

    Models also make predictions about properties of particles such as how often they decay into one set of particles versus another set. If precise measurements show deviations from predictions by the Standard Model of particle physics, it can mean that something new is at play.
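
    Such deviations are conventionally quantified as a "pull": the gap between the measured and predicted value in units of the combined uncertainty,

```latex
z \;=\; \frac{x_{\mathrm{meas}} - x_{\mathrm{SM}}}{\sigma},
```

    with a pull around 5, the famous five sigma, the usual threshold for claiming a discovery.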

    “Several new physics models predict an enhanced rate of rare subatomic processes,” Bose says. “However, their rates are so low that we have not been able to measure them yet.”

    In the past, precision measurements of well-known particles have overturned seemingly bulletproof paradigms. In the 1940s, for example, the measurement of a property called the “magnetic moment” of the neutron showed that it was not a fundamental particle, as had been previously assumed. This eventually helped lead to the discovery of particles that make up neutrons: quarks.

    Another example is the measurement of the mismatched decays of certain matter and antimatter particles, which led to the prediction of a new group of quarks—later confirmed by the discoveries of the top and bottom quarks.

    The plan for the LHC research program is to collect a huge amount of data, which will give scientists the resolution they need to examine every shadowy corner of the Standard Model.

    “This work naturally pushes our search methods towards making more detailed and higher precision measurements that will help us constrain possible deviations by new physics,” Willocq says.

    Because many of these predictions have never been thoroughly tested, scientists are hoping that they’ll find a few small deviations that could open the door to a new era of physics research. “Nature might be tough with us,” Bose says, “but maybe nature is testing us and making us stronger.”

    See the full article here.



    Please help promote STEM in your local schools.


    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 12:32 pm on April 18, 2019 Permalink | Reply
    Tags: "When Beauty Gets in the Way of Science", , CERN LHC, , , , , , , , ,   

    From Nautilus: “When Beauty Gets in the Way of Science” 

    Nautilus

    From Nautilus

    April 18, 2019
    Sabine Hossenfelder

    Insisting that new ideas must be beautiful blocks progress in particle physics.

    When Beauty Gets in the Way of Science. Nautilus

    The biggest news in particle physics is no news. In March, one of the most important conferences in the field, Rencontres de Moriond, took place. It is an annual meeting at which experimental collaborations present preliminary results. But the recent data from the Large Hadron Collider (LHC), currently the world’s largest particle collider, has not revealed anything new.

    [Image captions: LHC; CERN map; CERN LHC Tunnel; CERN LHC particles]

    Forty years ago, particle physicists thought themselves close to a final theory for the structure of matter. At that time, they formulated the Standard Model of particle physics to describe the elementary constituents of matter and their interactions.

    Standard Model of Particle Physics (LATHAM BOYLE AND MARDUS OF WIKIMEDIA COMMONS)

    After that, they searched for the predicted, but still missing, particles of the Standard Model. In 2012, they confirmed the last missing particle, the Higgs boson.

    [Image captions: CERN CMS Higgs Event; CERN ATLAS Higgs Event]

    The Higgs boson is necessary to make sense of the rest of the Standard Model. Without it, the other particles would not have masses, and probabilities would not properly add up to one. Now, with the Higgs in the bag, the Standard Model is complete; all Pokémon caught.

    HIGGS HANGOVER: After the Large Hadron Collider (above) confirmed the Higgs boson, which validated the Standard Model, it’s produced nothing newsworthy, and is unlikely to, says physicist Sabine Hossenfelder. Shutterstock

    The Standard Model may be physicists’ best shot at the structure of fundamental matter, but it leaves them wanting. Many particle physicists think it is simply too ugly to be nature’s last word. The 25 particles of the Standard Model can be classified by three types of symmetries that correspond to three fundamental forces: The electromagnetic force, and the strong and weak nuclear forces. Physicists, however, would rather there was only one unified force. They would also like to see an entirely new type of symmetry, the so-called “supersymmetry,” because that would be more appealing.

    Supersymmetry builds on the Standard Model, with many new supersymmetric particles, represented here with a tilde (~) on them. (From the movie “Particle Fever,” reproduced by Mark Levinson)

    Oh, and additional dimensions of space would be pretty. And maybe also parallel universes. Their wish list is long.

    It has become common practice among particle physicists to use arguments from beauty to select the theories they deem worthy of further study. These criteria of beauty are subjective and not evidence-based, but they are widely believed to be good guides to theory development. The most often used criteria of beauty in the foundations of physics are presently simplicity and naturalness.

    By “simplicity,” I don’t mean relative simplicity, the idea that the simplest theory is the best (a.k.a. “Occam’s razor”). Relying on relative simplicity is good scientific practice. The desire that a theory be simple in absolute terms, in contrast, is a criterion from beauty: There is no deep reason that the laws of nature should be simple. In the foundations of physics, this desire for absolute simplicity presently shows in physicists’ hope for unification or, if you push it one level further, in the quest for a “Theory of Everything” that would merge the three forces of the Standard Model with gravity.

    The other criterion of beauty, naturalness, requires that pure numbers that appear in a theory (i.e., those without units) should neither be very large nor very small; instead, these numbers should be close to one. Exactly how close these numbers should be to one is debatable, which is already an indicator of the non-scientific nature of this argument. Indeed, the inability of particle physicists to quantify just when a lack of naturalness becomes problematic highlights the fact that an unnatural theory is utterly unproblematic. It is just not beautiful.
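
    The canonical naturalness example is the Higgs mass itself. Quantum corrections push the squared mass toward the square of the theory's cutoff scale, so the observed value requires the bare parameter to cancel those corrections (the usual schematic estimate, quoted for illustration):

```latex
m_H^2 \;=\; m_{H,\mathrm{bare}}^2 \;+\; \delta m_H^2,
\qquad \delta m_H^2 \;\propto\; \Lambda^2.
```

    For a cutoff near the Planck scale, matching the observed 125 GeV Higgs mass means a cancellation of very roughly one part in 10^32: nothing forbids it, but the ratio is spectacularly far from one, which is exactly what "unnatural" means here.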

    Anyone who has a look at the literature of the foundations of physics will see that relying on such arguments from beauty has been a major current in the field for decades. It has been propagated by big players in the field, including Steven Weinberg, Frank Wilczek, Edward Witten, Murray Gell-Mann, and Sheldon Glashow. Countless books popularized the idea that the laws of nature should be beautiful, written, among others, by Brian Greene, Dan Hooper, Gordon Kane, and Anthony Zee. Indeed, this talk about beauty has been going on for so long that at this point it seems likely most people presently in the field were attracted by it in the first place. Little surprise, then, they can’t seem to let go of it.

    Trouble is, relying on beauty as a guide to new laws of nature is not working.

    Since the 1980s, dozens of experiments looked for evidence of unified forces and supersymmetric particles, and other particles invented to beautify the Standard Model. Physicists have conjectured hundreds of hypothetical particles, from “gluinos” and “wimps” to “branons” and “cuscutons,” each of which they invented to remedy a perceived lack of beauty in the existing theories. These particles are supposed to aid beauty, for example, by increasing the amount of symmetries, by unifying forces, or by explaining why certain numbers are small. Unfortunately, not a single one of those particles has ever been seen. Measurements have merely confirmed the Standard Model over and over again. And a theory of everything, if it exists, is as elusive today as it was in the 1970s. The Large Hadron Collider is only the most recent in a long series of searches that failed to confirm those beauty-based predictions.

    These decades of failure show that postulating new laws of nature just because they are beautiful according to human standards is not a good way to put forward scientific hypotheses. It’s not the first time this has happened. Historical precedents are not difficult to find. Relying on beauty did not work for Kepler’s Platonic solids, it did not work for Einstein’s idea of an eternally unchanging universe, and it did not work for the oh-so-pretty idea, popular at the end of the 19th century, that atoms are knots in an invisible ether. All of these theories were once considered beautiful, but are today known to be wrong. Physicists have repeatedly told me about beautiful ideas that didn’t turn out to be beautiful at all. Such hindsight is not evidence that arguments from beauty work, but rather that our perception of beauty changes over time.

    That beauty is subjective is hardly a breakthrough insight, but physicists are slow to learn the lesson—and that has consequences. Experiments that test ill-motivated hypotheses are at high risk to only find null results; i.e., to confirm the existing theories and not see evidence of new effects. This is what has happened in the foundations of physics for 40 years now. And with the new LHC results, it happened once again.

    The data analyzed so far shows no evidence for supersymmetric particles, extra dimensions, or any other physics that would not be compatible with the Standard Model. In the past two years, particle physicists were excited about an anomaly in the interaction rates of different leptons. The Standard Model predicts these rates should be identical, but the data demonstrates a slight difference. This “lepton anomaly” has persisted in the new data, but—against particle physicists’ hopes—it did not increase in significance, and is hence not a sign of new particles. The LHC collaborations succeeded in measuring the violation of symmetry in the decay of composite particles called “D-mesons,” but the measured effect is, once again, consistent with the Standard Model. The data stubbornly repeat: Nothing new to see here.

    Of course it’s possible there is something to find in the data yet to be analyzed. But at this point we already know that all previously made predictions for new physics were wrong, meaning that there is now no reason to expect anything new to appear.

    Yes, null results—like the recent LHC measurements—are also results. They rule out some hypotheses. But null results are not very useful results if you want to develop a new theory. A null-result says: “Let’s not go this way.” A result says: “Let’s go that way.” If there are many ways to go, discarding some of them does not help much.

    To find the way forward in the foundations of physics, we need results, not null-results. When testing new hypotheses takes decades of construction time and billions of dollars, we have to be careful what to invest in. Experiments have become too costly to rely on serendipitous discoveries. Beauty-based methods have historically not worked. They still don’t work. It’s time that physicists take note.

    And it’s not like the lack of beauty is the only problem with the current theories in the foundations of physics. There are good reasons to think physics is not done. The Standard Model cannot be the last word, notably because it does not contain gravity and fails to account for the masses of neutrinos. It also describes neither dark matter nor dark energy, which are necessary to explain galactic structures.

    So, clearly, the foundations of physics have problems that require answers. Physicists should focus on those. And we currently have no reason to think that colliding particles at the next higher energies will help solve any of the existing problems. New effects may not appear until energies are a billion times higher than what even the next larger collider could probe. To make progress, then, physicists must, first and foremost, learn from their failed predictions.

    So far, they have not. In 2016, the particle physicists Howard Baer, Vernon Barger, and Jenny List wrote an essay for Scientific American arguing that we need a larger particle collider to “save physics.” The reason? A theory the authors had proposed themselves, that is natural (beautiful!) in a specific way, predicts such a larger collider should see new particles. This March, Kane, a particle physicist, used similar beauty-based arguments in an essay for Physics Today. And a recent comment in Nature Reviews Physics about a big, new particle collider planned in Japan once again drew on the same motivations from naturalness that have already not worked for the LHC. Even the particle physicists who have admitted their predictions failed do not want to give up beauty-based hypotheses. Instead, they have argued we need more experiments to test just how wrong they are.

    Will this latest round of null-results finally convince particle physicists that they need new methods of theory-development? I certainly hope so.

    As an ex-particle physicist myself, I understand very well the desire to have an all-encompassing theory for the structure of matter. I can also relate to the appeal of theories such as supersymmetry or string theory. And, yes, I quite like the idea that we live in one of infinitely many universes that together make up the “multiverse.” But, as the latest LHC results drive home once again, the laws of nature care heartily little about what humans find beautiful.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Welcome to Nautilus. We are delighted you joined us. We are here to tell you about science and its endless connections to our lives. Each month we choose a single topic. And each Thursday we publish a new chapter on that topic online. Each issue combines the sciences, culture and philosophy into a single story told by the world’s leading thinkers and writers. We follow the story wherever it leads us. Read our essays, investigative reports, and blogs. Fiction, too. Take in our games, videos, and graphic stories. Stop in for a minute, or an hour. Nautilus lets science spill over its usual borders. We are science, connected.

     