Tagged: CERN LHC

  • richardmitnick 12:04 pm on May 14, 2019
    Tags: Model-dependent vs model-independent research, CERN LHC

    From Symmetry: “Casting a wide net” 

    Symmetry Mag
    From Symmetry

    Jim Daley

    Illustration by Sandbox Studio, Chicago

    In their quest to discover physics beyond the Standard Model, physicists weigh the pros and cons of different search strategies.

    On October 30, 1975, theorists John Ellis, Mary K. Gaillard and D.V. Nanopoulos published a paper [Science Direct] titled “A Phenomenological Profile of the Higgs Boson.” They ended their paper with a note to their fellow scientists.

    “We should perhaps finish with an apology and a caution,” it said. “We apologize to experimentalists for having no idea what is the mass of the Higgs boson… and for not being sure of its couplings to other particles, except that they are probably all very small.

    “For these reasons, we do not want to encourage big experimental searches for the Higgs boson, but we do feel that people performing experiments vulnerable to the Higgs boson should know how it may turn up.”

    What the theorists were cautioning against was a model-dependent search, a search for a particle predicted by a certain model—in this case, the Standard Model of particle physics.

    Standard Model of Particle Physics

    It shouldn’t have been too much of a worry. Around then, most particle physicists’ experiments were general searches, not based on predictions from a particular model, says Jonathan Feng, a theoretical particle physicist at the University of California, Irvine.

    Using early accelerators and colliders, physicists smashed together beams of electrons, positrons and protons at high energies and looked to see what came out. Samuel Ting and Burton Richter, for example, who shared the 1976 Nobel Prize in Physics for the discovery of the charm quark, were not looking for the particle with any theoretical prejudice, Feng says.

    That began to change in the 1980s and ’90s. That’s when physicists began exploring elegant new theories such as supersymmetry, which could tie up many of the Standard Model’s theoretical loose ends—and which predict the existence of a whole slew of new particles for scientists to try to find.

    Of course, there was also the Higgs boson. Even though scientists didn’t have a good prediction of its mass, they had good motivations for thinking it was out there waiting to be discovered.

    And it was. Almost 40 years after the theorists’ tongue-in-cheek warning about searching for the Higgs, Ellis found himself sitting in the main auditorium at CERN next to experimentalist Fabiola Gianotti, the spokesperson of the ATLAS experiment at the Large Hadron Collider who, along with CMS spokesperson Joseph Incandela, had just co-announced the discovery of the particle he had once so pessimistically described.

    CERN CMS Higgs Event

    CERN ATLAS Higgs Event

    Model-dependent vs model-independent

    Scientists’ searches for particles predicted by certain models continue, but in recent years, searches for new physics independent of those models have begun to enjoy a resurgence as well.

    “A model-independent search is supposed to distill the essence from a whole bunch of specific models and look for something that’s independent of the details,” Feng says. The goal is to find an interesting common feature of those models, he explains. “And then I’m going to just look for that phenomenon, irrespective of the details.”

    Particle physicist Sara Alderweireldt uses model-independent searches in her work on the ATLAS experiment at the Large Hadron Collider.

    CERN ATLAS Image Claudia Marcelloni CERN/ATLAS

    Alderweireldt says that while many high-energy particle physics experiments are designed to make very precise measurements of a specific aspect of the Standard Model, a model-independent search allows physicists to take a wider view and search more generally for new particles or interactions. “Instead of zooming in, we try to look in as many places as possible in a consistent way.”

    Such a search makes room for the unexpected, she says. “You’re not dependent on the prior interpretation of something you would be looking for.”

    Theorist Patrick Fox and experimentalist Anadi Canepa, both at Fermilab, collaborate on searches for new physics.

    In Canepa’s work on the CMS experiment, the other general-purpose particle detector at the LHC, many of the searches are model-independent.

    While the nature of these searches allows them to “cast a wider net,” Fox says, “they are in some sense shallower, because they don’t manage to strongly constrain any one particular model.”

    At the same time, “by combining the results from many independent searches, we are getting closer to one dedicated search,” Canepa says. “Developing both model-dependent and model-independent searches is the approach adopted by the CMS and ATLAS experiments to fully exploit the unprecedented potential of the LHC.”

    Driven by data and powered by machine learning

    Model-dependent searches focus on a single assumption or look for evidence of a specific final state following an experimental particle collision. Model-independent searches are far broader—and how broad is largely driven by the speed at which data can be processed.

    “We have better particle detectors, and more advanced algorithms and statistical tools that are enabling us to understand searches in broader terms,” Canepa says.

    One reason model-independent searches are gaining prominence is because now there is enough data to support them. Particle detectors are recording vast quantities of information, and modern computers can run simulations faster than ever before, she says. “We are able to do model-independent searches because we are able to better understand much larger amounts of data and extreme regions of parameter and phase space.”

    Machine learning is a key part of this processing power, Canepa says. “That’s really a change of paradigm, because it really made us make a major leap forward in terms of sensitivity [to new signals]. It really allows us to benefit from understanding the correlations that we didn’t capture in a more classical approach.”

    These broader searches are an important part of modern particle physics research, Fox says.

    “At a very basic level, our job is to bequeath to our descendants a better understanding of nature than we got from our ancestors,” he says. “One way to do that is to produce lots of information that will stand the test of time, and one way of doing that is with model-independent searches.”

    Models go in and out of fashion, he adds. “But model-independent searches don’t feel like they will.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.

  • richardmitnick 11:26 am on May 5, 2019
    Tags: 'Where Does A Proton’s Mass Come From?', 99.8% of the proton’s mass comes from gluons, Antiquarks, Asymptotic freedom: the particles that mediate this force are known as gluons., CERN LHC, The production of Higgs bosons is dominated by gluon-gluon collisions at the LHC, The strong interaction is the most powerful interaction in the entire known Universe.

    From Ethan Siegel: “Ask Ethan: ‘Where Does A Proton’s Mass Come From?'” 

    From Ethan Siegel
    May 4, 2019

    The three valence quarks of a proton contribute to its spin, but so do the gluons, sea quarks and antiquarks, and orbital angular momentum as well. The electrostatic repulsion and the attractive strong nuclear force, in tandem, are what give the proton its size, and the properties of quark mixing are required to explain the suite of free and composite particles in our Universe. (APS/ALAN STONEBRAKER)

    The whole should equal the sum of its parts, but doesn’t. Here’s why.

    The whole is equal to the sum of its constituent parts. That’s how everything works, from galaxies to planets to cities to molecules to atoms. If you take all the components of any system and look at them individually, you can clearly see how they all fit together to add up to the entire system, with nothing missing and nothing left over. The total amount you have is equal to the amounts of all the different parts of it added together.

    So why isn’t that the case for the proton? It’s made of three quarks, but if you add up the quark masses, they not only don’t equal the proton’s mass, they don’t come close. This is the puzzle that Barry Duffey wants us to address, asking:

    “What’s happening inside protons? Why does [its] mass so greatly exceed the combined masses of its constituent quarks and gluons?”

    In order to find out, we have to take a deep look inside.

    The composition of the human body, by atomic number and by mass. The whole of our bodies is equal to the sum of its parts, until you get down to an extremely fundamental level. At that point, we can see that we’re actually more than the sum of our constituent components. (ED UTHMAN, M.D., VIA WEB2.AIRMAIL.NET/UTHMAN (L); WIKIMEDIA COMMONS USER ZHAOCAROL (R))

    There’s a hint that comes just from looking at your own body. If you were to divide yourself up into smaller and smaller bits, you’d find — in terms of mass — the whole was equal to the sum of its parts. Your body’s bones, fat, muscles and organs sum up to an entire human being. Breaking those down further, into cells, still allows you to add them up and recover the same mass you have today.

    Cells can be divided into organelles, organelles are composed of individual molecules, molecules are made of atoms; at each stage, the mass of the whole is no different from that of its parts. But when you break atoms into protons, neutrons and electrons, something interesting happens. At that level, there’s a tiny but noticeable discrepancy: the individual protons, neutrons and electrons are off by around 1% from the mass of an entire human. The difference is real.

    From macroscopic scales down to subatomic ones, the sizes of the fundamental particles play only a small role in determining the sizes of composite structures. Whether the building blocks are truly fundamental and/or point-like particles is still not known. (MAGDALENA KOWALSKA / CERN / ISOLDE TEAM)


    Like all known organisms, human beings are carbon-based life forms. Carbon atoms are made up of six protons and six neutrons, but if you look at the mass of a carbon atom, it’s approximately 0.8% lighter than the sum of the individual component particles that make it up. The culprit here is nuclear binding energy; when you have atomic nuclei bound together, their total mass is smaller than the mass of the protons and neutrons that comprise them.

    The way carbon is formed is through the nuclear fusion of hydrogen into helium and then helium into carbon; the energy released is what powers most types of stars in both their normal and red giant phases. That “lost mass” is where the energy powering stars comes from, thanks to Einstein’s E = mc². As stars burn through their fuel, they produce more tightly-bound nuclei, releasing the energy difference as radiation.
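    The bookkeeping behind the roughly 0.8% figure is easy to check with a few lines of arithmetic. The masses below are standard textbook values in atomic mass units, assumed here purely for illustration:

```python
# Mass defect of carbon-12: six protons, six neutrons, six electrons,
# versus the carbon-12 atom, which defines exactly 12 u.
# Masses in unified atomic mass units (textbook values, for illustration).
m_proton, m_neutron, m_electron = 1.007276, 1.008665, 0.000549

parts = 6 * (m_proton + m_neutron + m_electron)
defect = parts - 12.0  # mass "lost" to nuclear binding

print(f"sum of parts: {parts:.5f} u")
print(f"mass defect:  {defect:.5f} u, or {100 * defect / parts:.2f}% of the parts")

# The defect shows up as binding energy via E = mc^2 (1 u ~ 931.494 MeV).
print(f"binding energy: {defect * 931.494:.1f} MeV")
```

    The defect comes out at about 0.8% of the constituent masses, matching the figure quoted above, and the corresponding binding energy is a little over 92 MeV.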

    In between the 2nd and 3rd brightest stars of the constellation Lyra, the blue giant stars Sheliak and Sulafat, the Ring Nebula shines prominently in the night skies. Throughout all phases of a star’s life, including the giant phase, nuclear fusion powers them, with the nuclei becoming more tightly bound and the energy emitted as radiation coming from the transformation of mass into energy via E = mc². (NASA, ESA, DIGITIZED SKY SURVEY 2)

    NASA/ESA Hubble Telescope

    ESO Online Digitized Sky Survey Telescopes

    Caltech Palomar Samuel Oschin 48 inch Telescope, located in San Diego County, California, United States, altitude 1,712 m (5,617 ft)

    Australian Astronomical Observatory, Siding Spring Observatory, near Coonabarabran, New South Wales, Australia, 1.2m UK Schmidt Telescope, Altitude 1,165 m (3,822 ft)

    From http://archive.eso.org/dss/dss

    This is how most types of binding energy work: the reason it’s harder to pull apart multiple things that are bound together is because they released energy when they were joined, and you have to put energy in to free them again. That’s why it’s such a puzzling fact that when you take a look at the particles that make up the proton — the up, up, and down quarks at the heart of them — their combined masses are only 0.2% of the mass of the proton as a whole. But the puzzle has a solution that’s rooted in the nature of the strong force itself.

    The way quarks bind into protons is fundamentally different from all the other forces and interactions we know of. Instead of the force getting stronger when objects get closer, like the gravitational, electric, or magnetic forces, the attractive force goes down to zero when quarks get arbitrarily close. And instead of the force getting weaker when objects get farther away, the force pulling quarks back together gets stronger the farther away they get.

    The internal structure of a proton, with quarks, gluons, and quark spin shown. The nuclear force acts like a spring, with negligible force when unstretched but large, attractive forces when stretched to large distances. (BROOKHAVEN NATIONAL LABORATORY)

    This property of the strong nuclear force is known as asymptotic freedom, and the particles that mediate this force are known as gluons. Somehow, the energy binding the proton together, responsible for the other 99.8% of the proton’s mass, comes from these gluons. The whole of matter, somehow, weighs much, much more than the sum of its parts.

    This might sound like an impossibility at first, as the gluons themselves are massless particles. But you can think of the forces they give rise to as springs: asymptoting to zero when the springs are unstretched, but becoming very large the greater the amount of stretching. In fact, the amount of energy between two quarks whose distance gets too large can become so great that it’s as though additional quark/antiquark pairs exist inside the proton: sea quarks.
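    The spring picture can be made slightly more concrete with a toy potential of the Cornell form, V(r) = −a/r + k·r, a standard caricature of the quark–antiquark interaction. The constants below are illustrative choices, not fitted values:

```python
# Toy "spring-like" quark potential (Cornell form), for illustration only:
# a Coulomb-like piece that dominates at short distances, plus a linear
# "string tension" term that keeps growing as the quarks are pulled apart.
a = 0.4  # short-distance coefficient, GeV * fm (illustrative)
k = 1.0  # string tension, GeV / fm (illustrative)

def potential(r_fm: float) -> float:
    """Energy in GeV of a quark pair separated by r_fm femtometers."""
    return -a / r_fm + k * r_fm

for r in (0.1, 0.5, 1.0, 2.0):
    print(f"r = {r:.1f} fm -> V = {potential(r):+.2f} GeV")
```

    The linear term means the stored energy grows without bound as the separation increases; once it exceeds roughly the rest energy of a quark–antiquark pair, producing new sea quarks becomes energetically favorable, which is the behavior described above.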

    When two protons collide, it isn’t just the quarks making them up that can collide, but the sea quarks, gluons, and beyond that, field interactions. All can provide insights into the spin of the individual components, and allow us to create potentially new particles if high enough energies and luminosities are reached. (CERN / CMS COLLABORATION)

    Those of you familiar with quantum field theory might have the urge to dismiss the gluons and the sea quarks as just being virtual particles: calculational tools used to arrive at the right result. But that’s not true at all, and we’ve demonstrated that with high-energy collisions between either two protons or a proton and another particle, like an electron or photon.

    The collisions performed at the Large Hadron Collider at CERN are perhaps the greatest test of all for the internal structure of the proton. When two protons collide at these ultra-high energies, most of them simply pass by one another, failing to interact. But when two internal, point-like particles collide, we can reconstruct exactly what it was that smashed together by looking at the debris that comes out.

    A Higgs boson event as seen in the Compact Muon Solenoid detector at the Large Hadron Collider. This spectacular collision is 15 orders of magnitude below the Planck energy, but it’s the precision measurements of the detector that allow us to reconstruct what happened back at (and near) the collision point. Theoretically, the Higgs gives mass to the fundamental particles; however, the proton’s mass is not due to the mass of the quarks and gluons that compose it. (CERN / CMS COLLABORATION)

    Under 10% of the collisions occur between two quarks; the overwhelming majority are gluon-gluon collisions, with quark-gluon collisions making up the remainder. Moreover, not every quark-quark collision involves only up or down quarks; sometimes a heavier quark takes part.

    Although it might make us uncomfortable, these experiments teach us an important lesson: the particles that we use to model the internal structure of protons are real. In fact, the discovery of the Higgs boson itself was only possible because of this, as the production of Higgs bosons is dominated by gluon-gluon collisions at the LHC. If all we had were the three valence quarks to rely on, we would have seen different rates of production of the Higgs than we did.

    Before the mass of the Higgs boson was known, we could still calculate the expected production rates of Higgs bosons from proton-proton collisions at the LHC. The top channel is clearly production by gluon-gluon collisions. I (E. Siegel) have added the yellow highlighted region to indicate where the Higgs boson was discovered. (CMS COLLABORATION (DORIGO, TOMMASO FOR THE COLLABORATION) ARXIV:0910.3489)

    As always, though, there’s still plenty more to learn. We presently have a solid model of the average gluon density inside a proton, but if we want to know where the gluons are actually more likely to be located, that requires more experimental data, as well as better models to compare the data against. Recent advances by theorists Björn Schenke and Heikki Mäntysaari may be able to provide those much needed models. As Mäntysaari detailed:

    “It is very accurately known how large the average gluon density is inside a proton. What is not known is exactly where the gluons are located inside the proton. We model the gluons as located around the three [valence] quarks. Then we control the amount of fluctuations represented in the model by setting how large the gluon clouds are, and how far apart they are from each other. […] The more fluctuations we have, the more likely this process [producing a J/ψ meson] is to happen.”

    A schematic of the world’s first electron-ion collider (EIC). Adding an electron ring (red) to the Relativistic Heavy Ion Collider (RHIC) at Brookhaven would create the eRHIC: a proposed deep inelastic scattering experiment that could improve our knowledge of the internal structure of the proton significantly. (BROOKHAVEN NATIONAL LABORATORY-CAD ERHIC GROUP)

    The combination of this new theoretical model and the ever-improving LHC data will better enable scientists to understand the internal, fundamental structure of protons, neutrons and nuclei in general, and hence to understand where the mass of the known objects in the Universe comes from. From an experimental point of view, the greatest boon would be a next-generation electron-ion collider, which would enable us to perform deep inelastic scattering experiments to reveal the internal makeup of these particles as never before.

    But there’s another theoretical approach that can take us even farther into the realm of understanding where the proton’s mass comes from: Lattice QCD.

    A better understanding of the internal structure of a proton, including how the “sea” quarks and gluons are distributed, has been achieved through both experimental improvements and new theoretical developments in tandem. (BROOKHAVEN NATIONAL LABORATORY)

    The difficult part with the quantum field theory that describes the strong force — quantum chromodynamics (QCD) — is that the standard approach we take to doing calculations is no good. Typically, we’d look at the effects of particle couplings: the charged quarks exchange a gluon and that mediates the force. They could exchange gluons in a way that creates a particle-antiparticle pair or an additional gluon, and that should be a correction to a simple one-gluon exchange. They could create additional pairs or gluons, which would be higher-order corrections.

    We call this approach taking a perturbative expansion in quantum field theory, with the idea that calculating higher and higher-order contributions will give us a more accurate result.

    Today, Feynman diagrams are used in calculating every fundamental interaction spanning the strong, weak, and electromagnetic forces, including in high-energy and low-temperature/condensed conditions. But this approach, which relies on a perturbative expansion, is only of limited utility for the strong interactions, as it diverges, rather than converges, when you add more and more loops for QCD. (DE CARVALHO, VANUILDO S. ET AL. NUCL. PHYS. B875 (2013) 738–756)

    Richard Feynman © Open University

    But this approach, which works so well for quantum electrodynamics (QED), fails spectacularly for QCD. The strong force works differently, and so these corrections get very large very quickly. Adding more terms, instead of converging towards the correct answer, diverges and takes you away from it. Fortunately, there is another way to approach the problem: non-perturbatively, using a technique called Lattice QCD.
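    A toy series makes the contrast vivid. Perturbative expansions in quantum field theory are asymptotic, with coefficients that eventually grow roughly factorially; the sketch below is pure illustration, not a QED or QCD calculation, but it shows partial sums settling down when the coupling is tiny and blowing up when the coupling is of order one:

```python
from math import factorial

def partial_sums(alpha, n_terms=10):
    """Partial sums of the toy asymptotic series sum_n n! * alpha**n."""
    total, sums = 0.0, []
    for n in range(n_terms):
        total += factorial(n) * alpha ** n
        sums.append(total)
    return sums

# QED-like: coupling ~ 1/137, successive terms shrink for many orders.
weak = partial_sums(1 / 137)
# QCD-like: coupling ~ 1, every added term makes things worse.
strong = partial_sums(1.0)

print("weak-coupling partial sums: ", [f"{s:.6f}" for s in weak[:5]])
print("strong-coupling partial sums:", [int(s) for s in strong[:5]])
```

    At weak coupling the first several orders refine the answer; at strong coupling the partial sums grow without limit, which is why a non-perturbative method is needed.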

    By treating space and time as a grid (or lattice of points) rather than a continuum, where the lattice is arbitrarily large and the spacing is arbitrarily small, you overcome this problem in a clever way. Whereas in standard, perturbative QCD, the continuous nature of space means that you lose the ability to calculate interaction strengths at small distances, the lattice approach means there’s a cutoff at the size of the lattice spacing. Quarks exist at the intersections of grid lines; gluons exist along the links connecting grid points.

    As your computing power increases, you can make the lattice spacing smaller, which improves your calculational accuracy. Over the past three decades, this technique has led to an explosion of solid predictions, including the masses of light nuclei and the reaction rates of fusion under specific temperature and energy conditions. The mass of the proton, from first principles, can now be theoretically predicted to within 2%.
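    A back-of-the-envelope count, with hypothetical lattice sizes, shows why computing power is the bottleneck. On a four-dimensional hypercubic lattice, quark degrees of freedom live on sites and gluon fields on the links between them, so halving the spacing at fixed physical volume multiplies the bookkeeping by 2⁴ = 16:

```python
def lattice_counts(n_per_dim: int, dims: int = 4) -> tuple:
    """Sites and nearest-neighbor links of a hypercubic lattice with
    periodic boundaries: quarks live on sites, gluon fields on links
    (one forward link per direction per site)."""
    sites = n_per_dim ** dims
    links = dims * sites
    return sites, links

# Halving the lattice spacing at fixed volume doubles n_per_dim,
# multiplying the number of sites by 16 in four dimensions.
for n in (8, 16, 32):
    sites, links = lattice_counts(n)
    print(f"{n:>2} sites/dim -> {sites:>9,} sites, {links:>10,} links")
```

    Real lattice QCD costs also scale steeply with the quark masses and the statistics required, so the true scaling is far worse than this naive count; the sketch only shows why shrinking the spacing is expensive.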

    As computational power and Lattice QCD techniques have improved over time, so has the accuracy to which various quantities about the proton, such as its component spin contributions, can be computed. By reducing the lattice spacing size, which can be done simply by raising the computational power employed, we can better predict the mass of not only the proton, but of all the baryons and mesons. (LABORATOIRE DE PHYSIQUE DE CLERMONT / ETM COLLABORATION)

    It’s true that the individual quarks, whose masses are determined by their coupling to the Higgs boson, cannot account for even 1% of the mass of the proton. Rather, it’s the strong force, described by the interactions between quarks and the gluons that mediate them, that is responsible for practically all of it.
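    For concreteness, here is that sum with rough PDG-style current-quark masses; the values are assumed for illustration and carry sizable uncertainties:

```python
# Valence-quark rest masses versus the proton mass, in MeV/c^2.
# Rough current-quark values, assumed here for illustration.
m_up, m_down, m_proton = 2.2, 4.7, 938.3

valence = 2 * m_up + m_down  # proton = up + up + down
fraction = valence / m_proton
print(f"valence quark masses: {valence:.1f} MeV")
print(f"fraction of proton mass: {100 * fraction:.2f}%")
```

    The total comes out just under 1% of the proton's mass; the rest is the binding energy of the strong force.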

    The strong nuclear force is the most powerful interaction in the entire known Universe. When you go inside a particle like the proton, it’s so powerful that it — not the mass of the proton’s constituent particles — is primarily responsible for the total energy (and therefore mass) of the normal matter in our Universe. Quarks may be point-like, but the proton is huge by comparison: about 8.4 × 10^-16 m in radius. Confining its component particles, which the binding energy of the strong force does, is what’s responsible for 99.8% of the proton’s mass.

    See the full article here.



    “Starts With A Bang! is a blog/video blog about cosmology, physics, astronomy, and anything else I find interesting enough to write about. I am a firm believer that the highest good in life is learning, and the greatest evil is willful ignorance. The goal of everything on this site is to help inform you about our world, how we came to be here, and to understand how it all works. As I write these pages for you, I hope to not only explain to you what we know, think, and believe, but how we know it, and why we draw the conclusions we do. It is my hope that you find this interesting, informative, and accessible,” says Ethan

  • richardmitnick 1:01 pm on May 2, 2019
    Tags: An unexpected signature, CERN LHC, It’s not always about what you discover, Nature might be tough with us, but maybe nature is testing us and making us stronger., Taking a closer look, Why the force of gravity is so much weaker than other known forces like electromagnetism. There is only one right answer. We haven’t found it yet.

    From Symmetry: “The unseen progress of the LHC” 

    Symmetry Mag
    From Symmetry

    Sarah Charley


    CERN map

    CERN LHC Maximilien Brice and Julien Marius Ordan

    CERN LHC particles

    It’s not always about what you discover.

    About seven years ago, physicist Stephane Willocq at the University of Massachusetts became enthralled with a set of theories that predicted the existence of curled-up extra dimensions hiding within our classical four dimensions of spacetime.

    “The idea of extra spatial dimensions is appealing because it allows us to look at the fundamental problems in particle physics from a different viewpoint,” Willocq says.

    As an experimental physicist, Willocq can do more than ponder. At the Large Hadron Collider at CERN, he put his pet theories to the test.

    Models based on those theories predicted how curled-up extra dimensions would affect the outcome of proton-proton collisions at the LHC. They predicted the collisions would produce more high-energy particles than expected.

    After several searches, Willocq and his colleagues found nothing out of the ordinary. “It was a great idea and disappointing to see it fade away, bit by bit,” he says, “but that’s how scientific progress works—finding the right idea by process of elimination.”

    The LHC research program is famous for discovering and studying the long-sought Higgs boson. But out of the spotlight, scientists have been using the LHC for an equally important scientific endeavor: testing, constraining and eliminating hundreds of theories that propose solutions to outstanding problems in physics, such as why the force of gravity is so much weaker than other known forces like electromagnetism.

    “There is only one right answer,” Willocq says. “We haven’t found it yet.”

    Now that scientists are at the end of the second run of the LHC, they have covered a huge amount of ground, eliminating the simplest versions of numerous theoretical ideas. They’ve covered four times as much phase space as previous searches for heavy new particles and set strict limits on what is physically possible.

    These studies don’t get the same attention as the Higgs boson, but these null results—results that don’t support a certain hypothesis—have moved physics forward as well.

    An unexpected signature

    Having chased down their most obvious leads, physicists are now adapting their methodology and considering new possibilities in their pursuit of new physics.

    Thus far, physicists have often used a straightforward formula to look for new particles. Massive particles produced in particle collisions will almost instantly decay, transforming into more stable particles. If scientists can measure all of those particles, they can reconstruct the mass and properties of the original particle that produced them.

    This worked wonderfully when scientists discovered the top quark in 1995 and the Higgs boson in 2012. But finding the next new thing might take a different tactic.

    “Finding new physics is more challenging than we expected it to be,” says University of Wisconsin physicist Tulika Bose of the CMS experiment. “Challenging situations make people come up with clever ideas.”

    One idea is that maybe scientists have been so focused on instantly decaying particles that they’ve been missing a whole host of particles that can travel up to several meters before falling apart. This would look like a firework exploding randomly in one of the detector subsystems.

    Scientists are rethinking how they reconstruct the data as a way to cast a bigger net and potentially catch particles with signatures like these. “If we only used our standard analysis methods, we would definitely not be sensitive to anything like this,” Bose says. “We’re no longer just reloading previous analyses but exploring innovative ideas.”

    Taking a closer look

    Since looking for excess particles coming out of collisions has yet to yield evidence of extra spatial dimensions, Willocq has decided to devote some of his efforts to a different method used at LHC experiments: precision measurements.

    Models also make predictions about properties of particles such as how often they decay into one set of particles versus another set. If precise measurements show deviations from predictions by the Standard Model of particle physics, it can mean that something new is at play.

    “Several new physics models predict an enhanced rate of rare subatomic processes,” Bose says. “However, their rates are so low that we have not been able to measure them yet.”

    In the past, precision measurements of well-known particles have overturned seemingly bulletproof paradigms. In the 1940s, for example, the measurement of a property called the “magnetic moment” of the neutron showed that it was not a fundamental particle, as had been previously assumed. This eventually helped lead to the discovery of particles that make up neutrons: quarks.

    Another example is the measurement of the mismatched decays of certain matter and antimatter particles, which led to the prediction of a new generation of quarks, later confirmed by the discoveries of the bottom and top quarks.

    The plan for the LHC research program is to collect a huge amount of data, which will give scientists the resolution they need to examine every shadowy corner of the Standard Model.

    “This work naturally pushes our search methods towards making more detailed and higher precision measurements that will help us constrain possible deviations by new physics,” Willocq says.

    Because many of these predictions have never been thoroughly tested, scientists are hoping that they’ll find a few small deviations that could open the door to a new era of physics research. “Nature might be tough with us,” Bose says, “but maybe nature is testing us and making us stronger.”

    See the full article here.



  • richardmitnick 12:32 pm on April 18, 2019
    Tags: "When Beauty Gets in the Way of Science", CERN LHC

    From Nautilus: “When Beauty Gets in the Way of Science” 


    From Nautilus

    April 18, 2019
    Sabine Hossenfelder

    Insisting that new ideas must be beautiful blocks progress in particle physics.

    When Beauty Gets in the Way of Science. Nautilus

    The biggest news in particle physics is no news. In March, one of the most important conferences in the field, Rencontres de Moriond, took place. It is an annual meeting at which experimental collaborations present preliminary results. But the recent data from the Large Hadron Collider (LHC), currently the world’s largest particle collider, has not revealed anything new.


    CERN map

    CERN LHC Tunnel

    CERN LHC particles

    Forty years ago, particle physicists thought themselves close to a final theory for the structure of matter. At that time, they formulated the Standard Model of particle physics to describe the elementary constituents of matter and their interactions.

    Standard Model of Particle Physics (LATHAM BOYLE AND MARDUS OF WIKIMEDIA COMMONS)

    After that, they searched for the predicted, but still missing, particles of the Standard Model. In 2012, they confirmed the last missing particle, the Higgs boson.

    CERN CMS Higgs Event

    CERN ATLAS Higgs Event

    The Higgs boson is necessary to make sense of the rest of the Standard Model. Without it, the other particles would not have masses, and probabilities would not properly add up to one. Now, with the Higgs in the bag, the Standard Model is complete; all Pokémon caught.

    HIGGS HANGOVER: After the Large Hadron Collider (above) confirmed the Higgs boson, which validated the Standard Model, it’s produced nothing newsworthy, and is unlikely to, says physicist Sabine Hossenfelder. Shutterstock

    The Standard Model may be physicists’ best shot at the structure of fundamental matter, but it leaves them wanting. Many particle physicists think it is simply too ugly to be nature’s last word. The 25 particles of the Standard Model can be classified by three types of symmetries that correspond to three fundamental forces: the electromagnetic force and the strong and weak nuclear forces. Physicists, however, would rather there were only one unified force. They would also like to see an entirely new type of symmetry, the so-called “supersymmetry,” because that would be more appealing.

    Supersymmetry builds on the Standard Model, with many new supersymmetric particles, represented here with a tilde (~) on them. (From the movie “Particle Fever,” reproduced by Mark Levinson)

    Oh, and additional dimensions of space would be pretty. And maybe also parallel universes. Their wish list is long.

    It has become common practice among particle physicists to use arguments from beauty to select the theories they deem worthy of further study. These criteria of beauty are subjective and not evidence-based, but they are widely believed to be good guides to theory development. The most often used criteria of beauty in the foundations of physics are presently simplicity and naturalness.

    By “simplicity,” I don’t mean relative simplicity, the idea that the simplest theory is the best (a.k.a. “Occam’s razor”). Relying on relative simplicity is good scientific practice. The desire that a theory be simple in absolute terms, in contrast, is a criterion from beauty: There is no deep reason that the laws of nature should be simple. In the foundations of physics, this desire for absolute simplicity presently shows in physicists’ hope for unification or, if you push it one level further, in the quest for a “Theory of Everything” that would merge the three forces of the Standard Model with gravity.

    The other criterion of beauty, naturalness, requires that pure numbers that appear in a theory (i.e., those without units) should be neither very large nor very small; instead, these numbers should be close to one. Exactly how close these numbers should be to one is debatable, which is already an indicator of the non-scientific nature of this argument. Indeed, the inability of particle physicists to quantify just when a lack of naturalness becomes problematic highlights the fact that an unnatural theory is utterly unproblematic. It is just not beautiful.
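    To see why no agreed cutoff exists, here is a toy naturalness check in Python. The tolerance and the physical scales used are illustrative choices for this sketch, not anything taken from the article:

```python
# Toy illustration of the naturalness criterion: dimensionless numbers
# built from known scales should be "close to one" to count as natural.
# Values are approximate, for illustration only.

higgs_mass_gev = 125.0          # Higgs boson mass, ~125 GeV
planck_mass_gev = 1.22e19       # Planck scale, ~1.22e19 GeV

# The (mass)^2 ratio that appears in the hierarchy problem:
ratio = (higgs_mass_gev / planck_mass_gev) ** 2

def is_natural(x, tolerance=1e3):
    """A deliberately arbitrary naturalness check: is the dimensionless
    number within a factor `tolerance` of one? The arbitrariness of
    `tolerance` mirrors the point made in the text -- there is no
    agreed cutoff."""
    return 1.0 / tolerance < x < tolerance

print(f"(m_H / M_Pl)^2 = {ratio:.2e}")   # wildly far from one
print("natural?", is_natural(ratio))
```

The verdict flips simply by widening `tolerance`, which is exactly the ambiguity the article points out.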

    Anyone who has a look at the literature of the foundations of physics will see that relying on such arguments from beauty has been a major current in the field for decades. It has been propagated by big players in the field, including Steven Weinberg, Frank Wilczek, Edward Witten, Murray Gell-Mann, and Sheldon Glashow. Countless books popularized the idea that the laws of nature should be beautiful, written, among others, by Brian Greene, Dan Hooper, Gordon Kane, and Anthony Zee. Indeed, this talk about beauty has been going on for so long that at this point it seems likely most people presently in the field were attracted by it in the first place. Little surprise, then, they can’t seem to let go of it.

    Trouble is, relying on beauty as a guide to new laws of nature is not working.

    Since the 1980s, dozens of experiments looked for evidence of unified forces and supersymmetric particles, and other particles invented to beautify the Standard Model. Physicists have conjectured hundreds of hypothetical particles, from “gluinos” and “wimps” to “branons” and “cuscutons,” each of which they invented to remedy a perceived lack of beauty in the existing theories. These particles are supposed to aid beauty, for example, by increasing the amount of symmetries, by unifying forces, or by explaining why certain numbers are small. Unfortunately, not a single one of those particles has ever been seen. Measurements have merely confirmed the Standard Model over and over again. And a theory of everything, if it exists, is as elusive today as it was in the 1970s. The Large Hadron Collider is only the most recent in a long series of searches that failed to confirm those beauty-based predictions.

    These decades of failure show that postulating new laws of nature just because they are beautiful according to human standards is not a good way to put forward scientific hypotheses. It’s not the first time this has happened. Historical precedents are not difficult to find. Relying on beauty did not work for Kepler’s Platonic solids, it did not work for Einstein’s idea of an eternally unchanging universe, and it did not work for the oh-so-pretty idea, popular at the end of the 19th century, that atoms are knots in an invisible ether. All of these theories were once considered beautiful, but are today known to be wrong. Physicists have repeatedly told me about beautiful ideas that didn’t turn out to be beautiful at all. Such hindsight is not evidence that arguments from beauty work, but rather that our perception of beauty changes over time.

    That beauty is subjective is hardly a breakthrough insight, but physicists are slow to learn the lesson—and that has consequences. Experiments that test ill-motivated hypotheses are at high risk of finding only null results; i.e., of confirming the existing theories and not seeing evidence of new effects. This is what has happened in the foundations of physics for 40 years now. And with the new LHC results, it happened once again.

    The data analyzed so far shows no evidence for supersymmetric particles, extra dimensions, or any other physics that would not be compatible with the Standard Model. In the past two years, particle physicists were excited about an anomaly in the interaction rates of different leptons. The Standard Model predicts these rates should be identical, but the data show a slight difference. This “lepton anomaly” has persisted in the new data, but—against particle physicists’ hopes—it did not increase in significance, and is hence not a sign of new particles. The LHC collaborations succeeded in measuring the violation of symmetry in the decay of composite particles called “D-mesons,” but the measured effect is, once again, consistent with the Standard Model. The data stubbornly repeat: Nothing new to see here.

    Of course it’s possible there is something to find in the data yet to be analyzed. But at this point we already know that all previously made predictions for new physics were wrong, meaning that there is now no reason to expect anything new to appear.

    Yes, null results—like the recent LHC measurements—are also results. They rule out some hypotheses. But null results are not very useful results if you want to develop a new theory. A null-result says: “Let’s not go this way.” A result says: “Let’s go that way.” If there are many ways to go, discarding some of them does not help much.

    To find the way forward in the foundations of physics, we need results, not null-results. When testing new hypotheses takes decades of construction time and billions of dollars, we have to be careful what to invest in. Experiments have become too costly to rely on serendipitous discoveries. Beauty-based methods have historically not worked. They still don’t work. It’s time that physicists take note.

    And it’s not like the lack of beauty is the only problem with the current theories in the foundations of physics. There are good reasons to think physics is not done. The Standard Model cannot be the last word, notably because it does not contain gravity and fails to account for the masses of neutrinos. It also describes neither dark matter nor dark energy, which are necessary to explain galactic structures.

    So, clearly, the foundations of physics have problems that require answers. Physicists should focus on those. And we currently have no reason to think that colliding particles at the next higher energies will help solve any of the existing problems. New effects may not appear until energies are a billion times higher than what even the next larger collider could probe. To make progress, then, physicists must, first and foremost, learn from their failed predictions.

    So far, they have not. In 2016, the particle physicists Howard Baer, Vernon Barger, and Jenny List wrote an essay for Scientific American arguing that we need a larger particle collider to “save physics.” The reason? A theory the authors had proposed themselves, that is natural (beautiful!) in a specific way, predicts such a larger collider should see new particles. This March, Kane, a particle physicist, used similar beauty-based arguments in an essay for Physics Today. And a recent comment in Nature Reviews Physics about a big, new particle collider planned in Japan once again drew on the same motivations from naturalness that have already not worked for the LHC. Even the particle physicists who have admitted their predictions failed do not want to give up beauty-based hypotheses. Instead, they have argued we need more experiments to test just how wrong they are.

    Will this latest round of null-results finally convince particle physicists that they need new methods of theory-development? I certainly hope so.

    As an ex-particle physicist myself, I understand very well the desire to have an all-encompassing theory for the structure of matter. I can also relate to the appeal of theories such as supersymmetry or string theory. And, yes, I quite like the idea that we live in one of infinitely many universes that together make up the “multiverse.” But, as the latest LHC results drive home once again, the laws of nature care heartily little about what humans find beautiful.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Welcome to Nautilus. We are delighted you joined us. We are here to tell you about science and its endless connections to our lives. Each month we choose a single topic. And each Thursday we publish a new chapter on that topic online. Each issue combines the sciences, culture and philosophy into a single story told by the world’s leading thinkers and writers. We follow the story wherever it leads us. Read our essays, investigative reports, and blogs. Fiction, too. Take in our games, videos, and graphic stories. Stop in for a minute, or an hour. Nautilus lets science spill over its usual borders. We are science, connected.

  • richardmitnick 11:37 am on April 16, 2019 Permalink | Reply
    Tags: CERN LHC

    From Symmetry: “A collision of light” 

    Symmetry Mag
    From Symmetry

    Sarah Charley

    Natasha Hartono

    One of the latest discoveries from the LHC takes the properties of photons beyond what your electrodynamics teacher will tell you in class.

    Professor Anne Sickles is currently teaching a laboratory class at the University of Illinois in which her students will measure what happens when two photons meet.

    What they will find is that the overlapping waves of light get brighter when two peaks align and dimmer when a peak meets a trough. She tells her students that this is a process called interference, and that—unlike charged particles, which can merge, bond and interact—light waves can only add or subtract.
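    The classical addition the students measure can be sketched in a few lines of Python; the amplitude and phases here are arbitrary illustrative values:

```python
import math

def superpose_intensity(amplitude, phase_difference):
    """Classical interference of two equal-amplitude light waves:
    the fields add, and the intensity is the square of the summed
    amplitude. Peaks aligned (phase 0) -> brighter; peak meets
    trough (phase pi) -> darker."""
    total_amplitude = amplitude * math.sqrt(2 + 2 * math.cos(phase_difference))
    return total_amplitude ** 2

bright = superpose_intensity(1.0, 0.0)     # constructive: 4x one wave's intensity
dark = superpose_intensity(1.0, math.pi)   # destructive: complete cancellation
print(bright, dark)
```

Nothing in this classical picture lets the two waves scatter off each other, which is what makes the quantum result below it striking.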

    “We teach undergraduates the classical theory,” Sickles says. “But there are situations where effects forbidden in the classical theory are allowed in the quantum theory.”

    Sickles is a collaborator on the ATLAS experiment at CERN and studies what happens when particles of light meet inside the Large Hadron Collider.



    CERN map

    CERN LHC Tunnel

    CERN LHC particles

    For most of the year, the LHC collides protons, but for about a month each fall, the LHC switches things up and collides heavy atomic nuclei, such as lead ions. The main purpose of these lead collisions is to study a hot and dense subatomic fluid called the quark-gluon plasma, which is harder to create in collisions of protons. But these ion runs also enable scientists to turn the LHC into a new type of machine: a photon-photon collider.

    “This result demonstrates that photons can scatter off each other and change each other’s direction,” says Peter Steinberg, an ATLAS scientist at Brookhaven National Laboratory.

    When heavy nuclei are accelerated in the LHC, they are encased within an electromagnetic aura generated by their large positive charges.

    As the nuclei travel faster and faster, their surrounding fields are squished into disks, making them much more concentrated. When two lead ions pass closely enough that their electromagnetic fields swoosh through one another, the high-energy photons which ultimately make up these fields can interact. In rare instances, a photon from one lead ion will merge with a photon from an oncoming lead ion, and they will ricochet in different directions.

    However, according to Steinberg, it’s not as simple as two solid particles bouncing off each other. Light particles are both chargeless and massless, and must go through a quantum mechanical loophole (literally called a quantum loop) to interact with one another.

    “That’s why this process is so rare,” he says. “They have no way to bounce off of each other without help.”

    When the two photons see each other inside the LHC, they sometimes overreact with excitement and split themselves into an electron and positron pair. These electron-positron pairs are not fully formed entities, but rather unstable quantum fluctuations that scientists call virtual particles. The four virtual particles swirl into each other and recombine to form two new photons, which scatter off at weird angles into the detector.

    “It’s like a quantum-mechanical square dance,” Steinberg says.

    When ATLAS first saw hints of this process in 2017, they had only 13 candidate events with the correct characteristics (collisions that resulted in two low-energy photons inside the detector and nothing else).

    After another two years of data taking, they have now collected 59 candidate events, bumping this original observation into the statistical certainty of a full-fledged discovery.
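    The collaboration’s actual claim rests on a full statistical analysis, but a back-of-envelope Poisson estimate shows how a few dozen events can clear the five-standard-deviation discovery bar. The expected background used below is an assumed, illustrative number, not the value from the ATLAS paper:

```python
import math

def poisson_significance(observed, expected_background):
    """Approximate significance (in standard deviations) of observing
    `observed` events when only `expected_background` are expected,
    using the asymptotic log-likelihood-ratio formula. A rough sketch,
    not the collaboration's full analysis."""
    n, b = observed, expected_background
    return math.sqrt(2 * (n * math.log(n / b) - (n - b)))

# 59 candidate events; the background of 12 is an assumed,
# illustrative number -- the real estimate is in the paper.
z = poisson_significance(59, 12.0)
print(f"~{z:.1f} sigma")
```

With the same assumed background, the original 13 candidate events would sit well below one standard deviation, which is why the 2017 result was only a hint.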

    Steinberg sees this discovery as a big win for quantum electrodynamics, a theory about the quantum behavior of light that predicted this interaction. “This amazingly precise theory, which was developed in the first half of the 20th century, made a prediction that we are finally able to confirm many decades later.”

    Sickles says she is looking forward to exploring these kinds of light-by-light interactions and figuring out what else they could teach us about the laws of physics. “It’s one thing to see something,” she says. “It’s another thing to study it.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.

  • richardmitnick 11:26 am on April 12, 2019 Permalink | Reply
    Tags: CERN LHC

    From Fermi National Accelerator Lab: “Quarks, squarks, stops and charm at this year’s Moriond conference” 

    FNAL Art Image
    FNAL Art Image by Angela Gonzales

    From Fermi National Accelerator Lab , an enduring source of strength for the US contribution to scientific research world wide.

    April 11, 2019
    Don Lincoln

    Fermilab RAs Kevin Pedro and Nadja Strobbe presented a variety of CMS and ATLAS research results at the 53rd annual Rencontres de Moriond conference.

    This March, scientists from around the world gathered in La Thuile, Italy, for the 53rd annual Rencontres de Moriond conference, one of the longest-running and most prestigious conferences in particle physics. This conference is broken into two distinct weeks, with the first week usually covering electroweak physics and the second covering processes involving quantum chromodynamics. Fermilab and the LHC Physics Center were well represented at the conference.

    Fermilab research associates Kevin Pedro and Nadja Strobbe from the CMS group both presented talks on LHC physics results. Pedro spoke on searches for new physics with unconventional signatures at both the ATLAS and CMS experiments. The interest in unusual signatures is driven by the fact that many researchers have already searched for more commonly accepted physical processes. Looking for the unconventional opens up the possibility of unanticipated discoveries. Pedro covered long-lived particles emerging from a complex dark matter sector. The signature for this possible physics result is a jet that originates far from the interaction vertex. He also covered long-lived particles that disappear in the detector. This is a signature for a form of supersymmetry.

    Strobbe presented a thorough overview of searches for strong-force-produced signatures of supersymmetry. She covered both ATLAS and CMS results, covering a broad range of signatures, including the associated production of b quarks and Higgs bosons, diphotons, several stop squark analyses, and the associated production of three bottom quarks and missing transverse momentum. In total, she presented 12 distinct analyses. The phenomenology of strong-force-produced supersymmetry is diverse, and it provides a rich source for the possible discovery of new physics. This is Strobbe’s last Moriond presentation as a Fermilab research associate, as she has recently accepted a faculty position at the University of Minnesota, where she will be starting in the fall.

    Strobbe and Pedro were not the only people associated with the LHC Physics Center presenting or involved at Moriond. Fermilab Senior Scientist Boaz Klima has long been a member of the organizing committee. Meng Xiao (Johns Hopkins) and Greg Landsberg (Brown) also presented.

    More broadly, many interesting physics topics were covered at the conference. The LHCb experiment announced the discovery of new pentaquarks containing charm quarks. They also reported that a peak in the data that was previously thought to be a single pentaquark was actually two distinct particles. Studies of mesons containing both bottom and charm quarks were very well-represented, with ATLAS, CMS and LHCb all making presentations. In the first week of the Moriond conference, both ATLAS and LHCb announced studies of the matter-antimatter asymmetry in decays of mesons containing both bottom and strange quarks. And in an example of very quick inter-collaboration cooperation, the experiments presented a combined result in the second week.

    While the LHC is best known for colliding two beams of protons (studies of which were well represented at Moriond), the LHC also collides lead ions to study the behavior of superhot quark matter – what is called quark-gluon plasma. ALICE presented studies of charmed mesons called J/psi, which showed that charm quarks are affected in quark-gluon plasmas, just like lighter quarks. The ALICE experiment also presented data gathered in a special run of proton-proton collisions at an energy unusual for the LHC, including an observation of charmed baryons. These particles occur more often in proton-proton collisions than in electron-positron ones.

    The Moriond conference is a fascinating one. It is small and cozy and allows for conversations and collaboration between researchers, with a storied history of over half a century. In its 53rd year, researchers are showing that its second half century will be just as exciting.

    Don Lincoln is a Fermilab scientist on the CMS experiment.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    FNAL Icon

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics. Fermilab is America’s premier laboratory for particle physics and accelerator research, funded by the U.S. Department of Energy. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.

    FNAL MINERvA front face Photo Reidar Hahn


    FNAL Muon g-2 studio

    FNAL Short-Baseline Near Detector under construction

    FNAL Mu2e solenoid

    Dark Energy Camera [DECam], built at FNAL

    FNAL DUNE Argon tank at SURF


    FNAL Don Lincoln


    FNAL Cryomodule Testing Facility

    FNAL MINOS Far Detector in the Soudan Mine in northern Minnesota

    FNAL LBNF/DUNE from FNAL to SURF, Lead, South Dakota, USA

    FNAL/NOvA experiment map

    FNAL NOvA Near Detector


    FNAL Holometer

  • richardmitnick 11:37 am on April 1, 2019 Permalink | Reply
    Tags: "Highlights from the 2019 Moriond conference (electroweak physics)", CERN LHC

    From CERN: “Highlights from the 2019 Moriond conference (electroweak physics)” 

    Cern New Bloc

    Cern New Particle Event

    From CERN

    29 March, 2019

    The latest experimental data provide more stringent tests of the Standard Model and of rare phenomena of the microworld.

    At this year’s Rencontres de Moriond conference, which is taking place in La Thuile, Italy, physicists working at CERN are presenting their most recent results. Since the start of the conference on 16 March, a wide range of topics, from measurements of the Higgs boson and Standard Model processes to searches for rare and exotic phenomena, has been presented.

    The Standard Model of particle physics is a successful theory that describes how elementary particles and forces govern the properties of the Universe, but it is incomplete as it cannot explain certain phenomena, such as gravity, dark matter and dark energy.

    Standard Model of Particle Physics (LATHAM BOYLE AND MARDUS OF WIKIMEDIA COMMONS)

    For this reason, physicists welcome any measurement that shows discrepancies with the Standard Model, as these give hints of new particles and new forces – of new physics, in other words. At the conference, the ATLAS and CMS collaborations have presented new results based on up to 140 fb⁻¹ of proton-proton collision data collected during Run 2 of the Large Hadron Collider (LHC) from 2015 to 2018. Many of these analyses benefited from novel machine-learning techniques used to separate signals from background processes.
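    As a rough sketch of what “separating signals from background” means, here is a toy likelihood-ratio discriminant in pure Python. The distributions are invented for illustration and stand in for the far more sophisticated machine-learning classifiers the experiments actually use:

```python
import math
import random

random.seed(42)

# Invented toy data: a narrow "signal" peak sitting on a broad
# "background" distribution, both in some reconstructed-mass variable.
signal = [random.gauss(125.0, 2.0) for _ in range(1000)]
background = [random.gauss(110.0, 15.0) for _ in range(1000)]

def gauss_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def classify(x):
    """Label an event 'signal' if the signal hypothesis is more
    likely than the background hypothesis for this value."""
    return gauss_pdf(x, 125.0, 2.0) > gauss_pdf(x, 110.0, 15.0)

efficiency = sum(classify(x) for x in signal) / len(signal)
rejection = sum(not classify(x) for x in background) / len(background)
print(f"signal efficiency ~{efficiency:.2f}, background rejection ~{rejection:.2f}")
```

Real analyses replace the two hand-written densities with classifiers trained on many event features at once, but the goal is the same: keep as much signal as possible while rejecting as much background as possible.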

    Since the discovery of the Higgs boson in 2012, ATLAS and CMS physicists have made significant progress in understanding its properties, how it is formed and how it interacts with other known particles.

    CERN CMS Higgs Event

    CERN ATLAS Higgs Event

    Thanks to the large quantity of Higgs bosons produced in the collisions of Run 2, the collaborations were able to measure most of the Higgs boson’s main production and decay modes with a statistical significance far exceeding five standard deviations. In addition, many searches for new, additional Higgs bosons have been presented. From a combination of all Higgs boson measurements, ATLAS obtained new constraints on the Higgs self-coupling. CMS has presented updated results on the Higgs decay to two Z bosons and has also derived new information on the strength of the interaction between Higgs bosons and top quarks. This interaction is measured in two ways, using top quark pairs and using a rare process in which four top quarks are produced. The probability of four top quarks being produced at the LHC is about a factor of ten less likely than the production of Higgs bosons together with two top quarks, and about a factor of ten thousand less likely than the production of just a top quark pair.
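    These relative rates translate into expected event counts through N = σ × L. The four-top cross-section below is a hypothetical placeholder chosen only to illustrate the arithmetic; just the factors of ten and the Run 2 luminosity come from the text:

```python
# Expected number of produced events: N = sigma x integrated luminosity.
# The four-top cross-section is a hypothetical placeholder; only the
# relative factors of ten are taken from the text.

integrated_luminosity_fb = 140.0   # fb^-1, LHC Run 2 (from the text)
sigma_four_top_fb = 10.0           # hypothetical cross-section, in fb

# The text quotes four-top production as ~10x rarer than ttH and
# ~10,000x rarer than plain top-pair production:
sigma_tth_fb = sigma_four_top_fb * 10
sigma_ttbar_fb = sigma_four_top_fb * 10_000

def expected_events(sigma_fb, luminosity_fb):
    return sigma_fb * luminosity_fb

for name, sigma in [("four tops", sigma_four_top_fb),
                    ("ttH", sigma_tth_fb),
                    ("top pair", sigma_ttbar_fb)]:
    print(f"{name}: ~{expected_events(sigma, integrated_luminosity_fb):.0f} events")
```

The spread of four orders of magnitude is why measuring the four-top process at all is notable.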

    ATLAS event display showing the clean signature of light-by-light scattering (Image: ATLAS/CERN)

    The ATLAS collaboration has also reported first evidence for the simultaneous production of three W or Z bosons, which are the mediator particles of the weak force. Tri-boson production is a rare process predicted by the Standard Model, and is sensitive to possible contributions from yet unknown particles or forces. The very large new dataset has also been used by the ATLAS and CMS collaborations to expand the searches for new particles beyond the Standard Model at the energy available at the LHC. One of the possible theories is supersymmetry, an extension of the Standard Model, which features a symmetry between matter particles and force carriers and introduces many new particles, including possible candidates for dark matter. These hypothetical particles have not been detected in experiments so far, and the collaborations have set stronger lower limits on the possible range of masses that they could have.

    A collision event recorded by CMS, containing a missing-transverse-energy signature, which is one of the characteristics sought in the search for SUSY (Image: CMS/CERN)

    The CMS collaboration has placed new limits on the parameters of new physics theories that describe hypothetical slowly moving heavy particles. These would be detected by measuring how fast particles travel through the detector: while the regular particles propagate at speeds close to that of light, straight from the proton collisions, these heavy particles are expected to move measurably slower before decaying into a shower of other particles, creating a “delayed jet”. CMS has also presented first evidence for another rare process, the production of two W bosons in not one but two simultaneous interactions between the constituents of the colliding protons.
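    The timing idea behind the “delayed jet” signature can be estimated directly. The distance and velocity below are illustrative round numbers, not CMS’s actual detector geometry:

```python
# Back-of-envelope for the "delayed" signature: a heavy particle
# produced with velocity beta < 1 reaches a detector layer later
# than a light-speed particle would. Numbers are illustrative.
C = 299_792_458.0   # speed of light, m/s

def arrival_delay_ns(distance_m, beta):
    """Extra time (in nanoseconds) for a particle moving at beta*c to
    cross `distance_m`, relative to a light-speed particle."""
    return (distance_m / (beta * C) - distance_m / C) * 1e9

# A detector layer ~1.5 m from the beamline, particle at half light speed:
delay = arrival_delay_ns(1.5, 0.5)
print(f"~{delay:.1f} ns delay")
```

Delays of a few nanoseconds are measurable with modern detector timing, which is what makes the slow-particle search possible at all.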

    In addition, ATLAS and CMS have presented new studies on the search for hypothetical Z′ (Z-prime) bosons. The existence of such neutral heavy particles is predicted by certain Grand Unified theories that could provide an elegant extension of the Standard Model. Although no significant signs of Z′ particles have been observed thus far, the results provide constraints on their production rate.

    The LHCb collaboration has presented several new measurements concerning particles containing beauty or charm quarks. Certain properties of these particles can be affected by the existence of new particles beyond the Standard Model. This allows LHCb to search for signs of new physics via a complementary, indirect route. One much anticipated result, shown for the first time at the conference, is a measurement using data taken from 2011 to 2016 of the ratio of two related rare decays of a B+ particle. These decays are predicted in the Standard Model to occur at the same rate to within 1%; the data collected are consistent with this prediction but favour a lower value. This follows a pattern of intriguing hints in other, similar decay processes; while none of these results are significant enough to constitute evidence of new physics on their own, they have captured the interest of physicists and will be investigated further with the full LHCb data set. LHCb also presented the first observation of matter–antimatter asymmetry known as CP violation in charm particle decays, as reported in a dedicated press release last week.

    Finally, using the results of lead-ion collisions taken in 2018, the ATLAS collaboration has been able to clearly observe a very rare phenomenon in which two photons – particles of light – interact, producing another pair of photons, with a significance of over 8 standard deviations. This process was among the earliest predictions of quantum electrodynamics (QED), the quantum theory of electromagnetism, and is forbidden by Maxwell’s classical theory of electrodynamics.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Meet CERN in a variety of places:

    Quantum Diaries

    Cern Courier



    CERN/ALICE Detector

    CERN CMS New

    CERN LHCb New II

    CERN map

    CERN LHC Grand Tunnel

    CERN LHC particles

    CERN ALPHA-g Detector

    CERN CAST Axion Solar Telescope

    CERN NA62

    CERN UA9

    CERN Proto Dune
  • richardmitnick 9:54 am on March 20, 2019 Permalink | Reply
    Tags: "Report reveals full reach of LHC programme", CERN LHC

    From CERN: “Report reveals full reach of LHC programme” 

    Cern New Bloc

    Cern New Particle Event

    From CERN

    19 March, 2019
    Matthew Chalmers

    The excavation of the two new shafts for the HL-LHC at points 1 and 5 of the accelerator has recently been completed. © Antonino Panté, reproduced with permission.

    The High-Luminosity LHC (HL-LHC), scheduled to operate from 2026, will increase the instantaneous luminosity of the LHC by at least a factor of five beyond its initial design luminosity. The analysis of a fraction of the data already delivered by the LHC – a mere 6% of what is expected by the end of HL-LHC in the late-2030s – led to the discovery of the Higgs boson and a diverse set of measurements and searches that have been documented in some 2000 physics papers published by the LHC experiments. “Although the HL-LHC is an approved and funded project, its physics programme evolves with scientific developments and also with the physics programmes planned at future colliders,” says Aleandro Nisati of ATLAS, who is a member of the steering group for a new report quantifying the HL-LHC physics potential.
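    Taking the commonly quoted HL-LHC target of about 3000 fb⁻¹ (an assumption here; the article states only the 6% figure), the arithmetic works out as:

```python
# The article says the data analysed so far is ~6% of the expected
# HL-LHC total. The ~3000 fb^-1 target below is a commonly quoted
# goal assumed for this sketch, not a number from the article.
hl_lhc_target_fb = 3000.0    # fb^-1, assumed HL-LHC integrated-luminosity goal
fraction_delivered = 0.06    # from the article

delivered_fb = hl_lhc_target_fb * fraction_delivered
remaining_factor = hl_lhc_target_fb / delivered_fb
print(f"~{delivered_fb:.0f} fb^-1 analysed so far; ~{remaining_factor:.0f}x more to come")
```

In other words, roughly seventeen times the dataset behind the 2000 existing physics papers is still ahead.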

    The 1000+ page report, published in January, contains input from more than 1000 experts from the experimental and theory communities. It stems from an initial workshop at CERN held in late 2017 (CERN Courier January/February 2018 p44) and also addresses the physics opportunities at a proposed high-energy upgrade (HE-LHC). Working groups have carried out hundreds of projections for physics measurements within the extremely challenging HL-LHC collision environment, taking into account the expected evolution of the theoretical landscape in the years ahead. In addition to their experience with LHC data analysis, the report factors in the improvements expected from the newly upgraded detectors and the likelihood that new analysis techniques will be developed. “A key aspect of this report is the involvement of the whole LHC community, working closely together to ensure optimal scientific progress,” says theorist and steering-group member Michelangelo Mangano.

    Physics streams

    The physics programme has been distilled into five streams: Standard Model (SM), Higgs, beyond the SM, flavour and QCD matter at high density.

    Standard Model of Particle Physics (LATHAM BOYLE AND MARDUS OF WIKIMEDIA COMMONS)

    CERN CMS Higgs Event

    CERN ATLAS Higgs Event

    The LHC results so far have confirmed the validity of the SM up to unprecedented energy scales and with great precision in the strong, electroweak and flavour sectors. Thanks to a 10-fold larger data set, the HL-LHC will probe the SM with even greater precision, give access to previously unseen rare processes, and will extend the experiments’ sensitivity to new physics in direct and indirect searches for processes with low production cross sections and more elusive signatures. The precision of key measurements, such as the coupling of the Higgs boson to SM particles, is expected to reach the percent level, where effects of new physics could be seen. The experimental uncertainty on the top-quark mass will be reduced to a few hundred MeV, and vector-boson scattering – recently observed in LHC data – will be studied with an accuracy of a few percent using various diboson processes.
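    The gain from a 10-fold larger data set follows from the usual statistical scaling: purely statistical uncertainties shrink as one over the square root of the dataset size (systematic uncertainties behave differently and often end up dominating):

```python
import math

def stat_uncertainty_scale(dataset_factor):
    """Factor by which a purely statistical uncertainty shrinks when
    the dataset grows by `dataset_factor` (1/sqrt(N) scaling)."""
    return 1.0 / math.sqrt(dataset_factor)

# A 10x larger HL-LHC dataset shrinks statistical errors to ~32%:
improvement = stat_uncertainty_scale(10)
print(f"statistical uncertainties shrink to ~{improvement:.2f} of their current size")
```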

    The 2012 discovery of the Higgs boson opened up brand-new studies of its properties, of the SM in general, and of possible physics beyond the SM. Outstanding opportunities have emerged for measurements of fundamental importance at the HL-LHC, such as the first direct constraints on the Higgs trilinear self-coupling and the natural width. The experience of LHC Run 2 has led to an improved understanding of the HL-LHC’s ability to probe Higgs pair production, a key measure of its self-interaction, with a projected combined ATLAS and CMS sensitivity of four standard deviations. In addition to significant improvements on the precision of Higgs-boson measurements, the HL-LHC will improve searches for heavier Higgs bosons motivated by theories beyond the SM and will be able to probe very rare exotic decay modes thanks to the huge dataset expected.

    The new report considers a large variety of new-physics models that can be probed at HL-LHC. In addition to searches for new heavy resonances and supersymmetry models, it includes results on dark matter and dark sectors, long-lived particles, leptoquarks, sterile neutrinos, axion-like particles, heavy scalars, vector-like quarks, and more. “Particular attention is placed on the potential opened by the LHC detector upgrades, the assessment of future systematic uncertainties, and new experimental techniques,” says steering-group member Andreas Meyer of CMS. “In addition to extending the present LHC mass and coupling reach by 20–50% for most new-physics scenarios, the HL-LHC will be able to potentially discover, or constrain, new physics that is not in reach of the current LHC dataset.”

    Pushing for precision

    The flavour-physics programme at the HL-LHC comprises many different probes – the weak decays of beauty, charm, strange and top quarks, as well as of the τ lepton and the Higgs boson – in which the experiments can search for signs of new physics. ATLAS and CMS will push the measurement precision of Higgs couplings and search for rare top decays, while the proposed second phase of the LHCb upgrade will greatly enhance the sensitivity with a range of beauty-, charm-, and strange-hadron probes. “It’s really exciting to see the full potential of the HL-LHC as a facility for precision flavour physics,” says steering-group member Mika Vesterinen of LHCb. “The projected experimental advances are also expected to be accompanied by improvements in theory, enhancing the current mass-reach on new physics by a factor as large as four.”

    Finally, the report identifies four major scientific goals for future high-density QCD studies at the LHC, including detailed characterisation of the quark–gluon plasma and its underlying parton dynamics, the development of a unified picture of particle production, and QCD dynamics from small to large systems. To address these goals, high-luminosity lead–lead and proton–lead collision programmes are considered as priorities, while high-luminosity runs with intermediate-mass nuclei such as argon could extend the heavy-ion programme at the LHC into the HL-LHC phase.

    High-energy considerations

    High Energy LHC (HE-LHC)

    One of the proposed options for a future collider at CERN is the HE-LHC, a new proton–proton collider that would occupy the existing LHC tunnel but use advanced high-field dipole magnets to roughly double the collision energy, reaching a centre-of-mass energy of 27 TeV. Such a machine would be expected to deliver an integrated proton–proton luminosity of 15,000 fb–1, increasing the discovery mass-reach beyond anything possible at the HL-LHC. The HE-LHC would provide precision access to rare Higgs boson (H) production modes, with approximately a 2% uncertainty on the ttH coupling, as well as an unambiguous observation of the HH signal and a precision of about 20% on the trilinear coupling. An HE-LHC would enable a heavy new Z´ gauge boson discovered at the HL-LHC to be studied in detail and, in general, double the discovery reach of the HL-LHC to beyond 10 TeV.
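
    Event yields behind projections like these follow the standard relation N = σ · L_int (cross-section times integrated luminosity). A minimal sketch using the report's 15,000 fb–1 figure; the cross-section below is a deliberately made-up placeholder, not an HE-LHC prediction:

```python
# Expected event count for a hypothetical process at the HE-LHC.
# L_INT_FB comes from the text; SIGMA_FB is a placeholder value chosen
# purely for illustration.
L_INT_FB = 15_000   # integrated luminosity, fb^-1 (quoted in the report)
SIGMA_FB = 100.0    # hypothetical process cross-section, fb (made up)

expected_events = SIGMA_FB * L_INT_FB
print(f"expected events: {expected_events:,.0f}")  # 1,500,000
```

With a real cross-section from theory in place of the placeholder, the same one-line product is how experimenters estimate whether a dataset is large enough to see a given process at all.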

    The HL/HE-LHC reports were submitted to the European Strategy for Particle Physics Update in December 2018, and are also intended to bring perspective to the physics potential of future projects beyond the LHC. “We now have a better sense of our potential to characterise the Higgs boson, hunt for new particles and make Standard Model measurements that restrict the opportunities for new physics to hide,” says Mangano. “This report has made it clear that these planned 3000 fb–1 of data from HL-LHC, and much more in the case of a future HE-LHC, will play a central role in particle physics for decades to come.”

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Meet CERN in a variety of places:

    Quantum Diaries

    Cern Courier

  • richardmitnick 1:10 pm on March 10, 2019 Permalink | Reply
    Tags: A quantum computer would greatly speed up analysis of the collisions hopefully finding evidence of supersymmetry much sooner—or at least allowing us to ditch the theory and move on., And they’ve been waiting for decades. Google is in the race as are IBM Microsoft Intel and a clutch of startups academic groups and the Chinese government., , At the moment researchers spend weeks and months sifting through the debris from proton-proton collisions in the LHC trying to find exotic heavy sister-particles to all our known particles of matter., “This is a marathon” says David Reilly who leads Microsoft’s quantum lab at the University of Sydney Australia. “And it's only 10 minutes into the marathon.”, , CERN LHC, CERN-Future Circular Collider, For CERN the quantum promise could for instance help its scientists find evidence of supersymmetry or SUSY which so far has proven elusive., HL-LHC-High-Luminosity LHC, IBM has steadily been boosting the number of qubits on its quantum computers starting with a meagre 5-qubit computer then 16- and 20-qubit machines and just recently showing off its 50-qubit processor, In a bid to make sense of the impending data deluge some at CERN are turning to the emerging field of quantum computing., In a quantum computer each circuit can have one of two values—either one (on) or zero (off) in binary code; the computer turns the voltage in a circuit on or off to make it work., In theory a quantum computer would process all the states a qubit can have at once and with every qubit added to its memory size its computational power should increase exponentially., Last year physicists from the California Institute of Technology in Pasadena and the University of Southern California managed to replicate the discovery of the Higgs boson found at the LHC in 2012, None of the competing teams have come close to reaching even the first milestone., , , , The quest has now lasted decades and a number of physicists are questioning if the theory behind SUSY is really valid., Traditional computers—be it an Apple Watch or the most powerful supercomputer—rely on tiny silicon transistors that work like on-off switches to encode bits of data., Venture capitalists invested some $250 million in various companies researching quantum computing in 2018 alone.,   

    From WIRED: “Inside the High-Stakes Race to Make Quantum Computers Work” 

    Wired logo

    From WIRED

    Katia Moskvitch

    View Pictures/Getty Images

    Deep beneath the Franco-Swiss border, the Large Hadron Collider is sleeping.


    CERN map

    CERN LHC Tunnel

    CERN LHC particles

    But it won’t be quiet for long. Over the coming years, the world’s largest particle accelerator will be supercharged, increasing the number of proton collisions per second by a factor of two and a half.

    Once the work is complete in 2026, researchers hope to unlock some of the most fundamental questions in the universe. But with the increased power will come a deluge of data the likes of which high-energy physics has never seen before. And, right now, humanity has no way of knowing what the collider might find.

    To understand the scale of the problem, consider this: before it shut down in December 2018, the LHC was generating about 300 gigabytes of data every second, adding up to 25 petabytes (PB) annually. For comparison, you’d have to spend 50,000 years listening to music to go through 25 PB of MP3 songs, while the human brain can store memories equivalent to just 2.5 PB of binary data. To make sense of all that information, the LHC data was pumped out to 170 computing centers in 42 countries [http://greybook.cern.ch/]. It was this global collaboration that helped discover the elusive Higgs boson, part of the Higgs field believed to give mass to elementary particles of matter.
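
    A quick back-of-the-envelope calculation shows how these two figures fit together: 300 GB/s is the raw detector output, almost all of which is discarded in real time by trigger systems before anything reaches storage. The run-year length below is a common rule of thumb, not a figure from the article:

```python
# Reconciling the raw LHC data rate with the archived volume.
# Assumption: an effective run-year of ~10 million seconds of data-taking
# (the machine is not colliding beams around the clock).
RAW_RATE_GB_S = 300          # raw detector output, GB per second (from text)
ARCHIVED_PB_YEAR = 25        # data archived per year, PB (from text)
SECONDS_PER_RUN_YEAR = 1e7   # assumed effective seconds of beam time

raw_pb_per_year = RAW_RATE_GB_S * SECONDS_PER_RUN_YEAR / 1e6  # GB -> PB
kept_fraction = ARCHIVED_PB_YEAR / raw_pb_per_year

print(f"raw output per run-year: {raw_pb_per_year:,.0f} PB")
print(f"fraction archived:       {kept_fraction:.2%}")
```

Under these assumptions the detectors produce roughly 3,000 PB per run-year, of which under one percent is kept, which is why trigger and filtering systems are as critical to the experiments as the detectors themselves.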

    CERN CMS Higgs Event

    CERN ATLAS Higgs Event

    To process the looming data torrent, scientists at the European Organization for Nuclear Research, or CERN, will need 50 to 100 times more computing power than they have at their disposal today. A proposed Future Circular Collider, four times the size of the LHC and 10 times as powerful, would create an impossibly large quantity of data, at least twice as much as the LHC.

    CERN FCC Future Circular Collider map

    In a bid to make sense of the impending data deluge, some at CERN are turning to the emerging field of quantum computing. Powered by the very laws of nature the LHC is probing, such a machine could potentially crunch the expected volume of data in no time at all. What’s more, it would speak the same language as the LHC. While numerous labs around the world are trying to harness the power of quantum computing, it is the future work at CERN that makes it particularly exciting research. There’s just one problem: Right now, there are only prototypes; nobody knows whether it’s actually possible to build a reliable quantum device.

    Traditional computers—be it an Apple Watch or the most powerful supercomputer—rely on tiny silicon transistors that work like on-off switches to encode bits of data.

    ORNL IBM AC922 SUMMIT supercomputer, No.1 on the TOP500. Credit: Carlos Jones, Oak Ridge National Laboratory/U.S. Dept. of Energy

    Each circuit can have one of two values—either one (on) or zero (off) in binary code; the computer turns the voltage in a circuit on or off to make it work.

    A quantum computer is not limited to this “either/or” way of thinking. Its memory is made up of quantum bits, or qubits—tiny particles of matter like atoms or electrons. And qubits can do “both/and,” meaning that they can be in a superposition of all possible combinations of zeros and ones; they can be all of those states simultaneously.

    For CERN, the quantum promise could, for instance, help its scientists find evidence of supersymmetry, or SUSY, which so far has proven elusive.

    Standard Model of Supersymmetry via DESY

    At the moment, researchers spend weeks and months sifting through the debris from proton-proton collisions in the LHC, trying to find exotic, heavy sister-particles to all our known particles of matter. The quest has now lasted decades, and a number of physicists are questioning if the theory behind SUSY is really valid. A quantum computer would greatly speed up analysis of the collisions, hopefully finding evidence of supersymmetry much sooner—or at least allowing us to ditch the theory and move on.

    A quantum device might also help scientists understand the evolution of the early universe, the first few minutes after the Big Bang. Physicists are pretty confident that back then, our universe was nothing but a strange soup of subatomic particles called quarks and gluons. To understand how this quark-gluon plasma has evolved into the universe we have today, researchers simulate the conditions of the infant universe and then test their models at the LHC, with multiple collisions. Performing a simulation on a quantum computer, governed by the same laws that govern the very particles that the LHC is smashing together, could lead to a much more accurate model to test.

    Beyond pure science, banks, pharmaceutical companies, and governments are also waiting to get their hands on computing power that could be tens or even hundreds of times greater than that of any traditional computer.

    And they’ve been waiting for decades. Google is in the race, as are IBM, Microsoft, Intel and a clutch of startups, academic groups, and the Chinese government. The stakes are incredibly high. Last October, the European Union pledged to give $1 billion to over 5,000 European quantum technology researchers over the next decade, while venture capitalists invested some $250 million in various companies researching quantum computing in 2018 alone. “This is a marathon,” says David Reilly, who leads Microsoft’s quantum lab at the University of Sydney, Australia. “And it’s only 10 minutes into the marathon.”

    Despite the hype surrounding quantum computing and the media frenzy triggered by every announcement of a new qubit record, none of the competing teams have come close to reaching even the first milestone, fancily called quantum supremacy—the moment when a quantum computer performs at least one specific task better than a standard computer. Any kind of task, even if it is totally artificial and pointless. There are plenty of rumors in the quantum community that Google may be close, although if true, it would give the company bragging rights at best, says Michael Biercuk, a physicist at the University of Sydney and founder of quantum startup Q-CTRL. “It would be a bit of a gimmick—an artificial goal,” says Reilly. “It’s like concocting some mathematical problem that really doesn’t have an obvious impact on the world just to say that a quantum computer can solve it.”

    That’s because the first real checkpoint in this race is much further away. Called quantum advantage, it would see a quantum computer outperform normal computers on a truly useful task. (Some researchers use the terms quantum supremacy and quantum advantage interchangeably.) And then there is the finish line, the creation of a universal quantum computer. The hope is that it would deliver a computational nirvana with the ability to perform a broad range of incredibly complex tasks. At stake is the design of new molecules for life-saving drugs, helping banks to adjust the riskiness of their investment portfolios, a way to break all current cryptography and develop new, stronger systems, and for scientists at CERN, a way to glimpse the universe as it was just moments after the Big Bang.

    Slowly but surely, work is already underway. Federico Carminati, a physicist at CERN, admits that today’s quantum computers wouldn’t give researchers anything more than classical machines, but, undeterred, he’s started tinkering with IBM’s prototype quantum device via the cloud while waiting for the technology to mature. It’s the latest baby step in the quantum marathon. The deal between CERN and IBM was struck in November last year at an industry workshop organized by the research organization.

    Set up to exchange ideas and discuss potential collaborations, the event had CERN’s spacious auditorium packed to the brim with researchers from Google, IBM, Intel, D-Wave, Rigetti, and Microsoft. Google detailed its tests of Bristlecone, a 72-qubit machine. Rigetti was touting its work on a 128-qubit system. Intel showed that it was in close pursuit with 49 qubits. For IBM, physicist Ivano Tavernelli took to the stage to explain the company’s progress.

    IBM has steadily been boosting the number of qubits on its quantum computers, starting with a meagre 5-qubit computer, then 16- and 20-qubit machines, and just recently showing off its 50-qubit processor.

    IBM iconic image of Quantum computer

    Carminati listened to Tavernelli, intrigued, and during a much needed coffee break approached him for a chat. A few minutes later, CERN had added a quantum computer to its impressive technology arsenal. CERN researchers are now starting to develop entirely new algorithms and computing models, aiming to grow together with the device. “A fundamental part of this process is to build a solid relationship with the technology providers,” says Carminati. “These are our first steps in quantum computing, but even if we are coming relatively late into the game, we are bringing unique expertise in many fields. We are experts in quantum mechanics, which is at the base of quantum computing.”

    The attraction of quantum devices is obvious. Take standard computers. The prediction by former Intel CEO Gordon Moore in 1965 that the number of components in an integrated circuit would double roughly every two years has held true for more than half a century. But many believe that Moore’s law is about to hit the limits of physics. Since the 1980s, however, researchers have been pondering an alternative. The idea was popularized by Richard Feynman, an American physicist at Caltech in Pasadena. During a lecture in 1981, he lamented that computers could not really simulate what was happening at a subatomic level, with tricky particles like electrons and photons that behave like waves but also dare to exist in two states at once, a phenomenon known as quantum superposition.

    Feynman proposed to build a machine that could. “I’m not happy with all the analyses that go with just the classical theory, because nature isn’t classical, dammit,” he told the audience back in 1981. “And if you want to make a simulation of nature, you’d better make it quantum mechanical, and by golly it’s a wonderful problem, because it doesn’t look so easy.”

    And so the quantum race began. Qubits can be made in different ways, but whatever the hardware, two qubits together can occupy four combinations of states: both in state A, both in state B, or one in each, in either order. And you won’t know which state a qubit is in until you measure it and the qubit is yanked out of its quantum world of probabilities into our mundane physical reality.

    In theory, a quantum computer would process all the states a qubit can have at once, and with every qubit added to its memory size, its computational power should increase exponentially. So, for three qubits, there are eight states to work with simultaneously; for four, 16; for 10, 1,024; and for 20, a whopping 1,048,576 states. You don’t need a lot of qubits to quickly surpass the memory banks of the world’s most powerful modern supercomputers—meaning that for specific tasks, a quantum computer could find a solution much faster than any regular computer ever would. Add to this another crucial concept of quantum mechanics: entanglement. It means that qubits can be linked into a single quantum system, where operating on one affects the rest of the system. This way, the computer can harness the processing power of all the linked qubits simultaneously, massively increasing its computational ability.
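
    The state counts quoted above are just powers of two, and they also show why classically simulating a quantum machine gets expensive so fast: a simulator must track one complex amplitude per basis state. A minimal sketch (the uniform superposition at the end is purely an illustration, not how any real device is programmed):

```python
import math

# n qubits span 2**n basis states; a classical simulator needs one
# amplitude per state, so memory grows exponentially with qubit count.
def num_states(n_qubits: int) -> int:
    return 2 ** n_qubits

for n in (3, 4, 10, 20):
    print(f"{n:2d} qubits -> {num_states(n):,} states")

# A uniform superposition over 20 qubits, stored as a dense statevector.
n = 20
amp = 1 / math.sqrt(num_states(n))
statevector = [amp] * num_states(n)          # 1,048,576 equal amplitudes
assert abs(sum(a * a for a in statevector) - 1.0) < 1e-9  # normalised
```

Even this toy vector already holds over a million entries; at around 50 qubits the amplitude table alone would exceed the memory of any existing supercomputer.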

    While a number of companies and labs are competing in the quantum marathon, many are running their own races, taking different approaches. One device has even been used by a team of researchers to analyze CERN data, albeit not at CERN. Last year, physicists from the California Institute of Technology in Pasadena and the University of Southern California managed to replicate the discovery of the Higgs boson, found at the LHC in 2012, by sifting through the collider’s troves of data using a quantum computer manufactured by D-Wave, a Canadian firm based in Burnaby, British Columbia. The findings didn’t arrive any quicker than on a traditional computer, but, crucially, the research showed a quantum machine could do the work.

    One of the oldest runners in the quantum race, D-Wave announced back in 2007 that it had built a fully functioning, commercially available 16-qubit quantum computer prototype—a claim that’s controversial to this day. D-Wave focuses on a technology called quantum annealing, based on the natural tendency of real-world quantum systems to find low-energy states (a bit like a spinning top that inevitably will fall over). A D-Wave quantum computer imagines the possible solutions of a problem as a landscape of peaks and valleys; each coordinate represents a possible solution and its elevation represents its energy. Annealing allows you to set up the problem, and then let the system fall into the answer—in about 20 milliseconds. As it does so, it can tunnel through the peaks as it searches for the lowest valleys. It finds the lowest point in the vast landscape of solutions, which corresponds to the best possible outcome—although it does not attempt to fully correct for any errors, inevitable in quantum computation. D-Wave is now working on a prototype of a universal annealing quantum computer, says Alan Baratz, the company’s chief product officer.
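
    The "peaks and valleys" picture maps closely onto classical simulated annealing, which is only a loose analogue of what D-Wave's hardware does: real quantum annealing additionally exploits tunnelling through the peaks, which this sketch cannot capture. A toy run on a made-up one-dimensional energy landscape:

```python
import math
import random

def energy(x: float) -> float:
    # A made-up landscape with two valleys on [-2, 2]: a shallow local
    # minimum near x ~ 1.5 and the global minimum near x ~ -0.5.
    return 0.5 * x * x + 2.0 * math.sin(3.0 * x)

def anneal(steps: int = 20_000, seed: int = 0) -> float:
    rng = random.Random(seed)
    x = rng.uniform(-2, 2)
    for step in range(steps):
        temp = max(1e-3, 2.0 * (1 - step / steps))        # cooling schedule
        candidate = min(2.0, max(-2.0, x + rng.gauss(0, 0.3)))
        delta = energy(candidate) - energy(x)
        # Always accept downhill moves; accept uphill ones with
        # Boltzmann probability, which shrinks as the system cools.
        if delta < 0 or rng.random() < math.exp(-delta / temp):
            x = candidate
    return x

best = anneal()
print(f"settled near x = {best:.2f} with energy {energy(best):.2f}")
```

Early on, the high "temperature" lets the walker climb over peaks; as it cools, it settles into a valley, ideally the deepest one, which mirrors the way an annealer is meant to fall into the best solution.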

    Apart from D-Wave’s quantum annealing, there are three other main approaches to try and bend the quantum world to our whim: integrated circuits, topological qubits and ions trapped with lasers. CERN is placing high hopes on the first method but is closely watching other efforts too.

    IBM, whose computer Carminati has just started using, as well as Google and Intel, all make quantum chips with integrated circuits—quantum gates—that are superconducting, a state in which certain metals conduct electricity with zero resistance. Each quantum gate holds a pair of very fragile qubits. Any noise will disrupt them and introduce errors—and in the quantum world, noise is anything from temperature fluctuations to electromagnetic and sound waves to physical vibrations.

    To isolate the chip from the outside world as much as possible and get the circuits to exhibit quantum mechanical effects, it needs to be supercooled to extremely low temperatures. At the IBM quantum lab in Zurich, the chip is housed in a white tank—a cryostat—suspended from the ceiling. The temperature inside the tank is a steady 10 millikelvin or –273 degrees Celsius, a fraction above absolute zero and colder than outer space. But even this isn’t enough.

    Just working with the quantum chip, when scientists manipulate the qubits, causes noise. “The outside world is continually interacting with our quantum hardware, damaging the information we are trying to process,” says physicist John Preskill at the California Institute of Technology, who in 2012 coined the term quantum supremacy. It’s impossible to get rid of the noise completely, so researchers are trying to suppress it as much as possible, hence the ultracold temperatures to achieve at least some stability and allow more time for quantum computations.

    “My job is to extend the lifetime of qubits, and we’ve got four of them to play with,” says Matthias Mergenthaler, an Oxford University postdoc working at IBM’s Zurich lab. That doesn’t sound like a lot, but, he explains, it’s not so much the number of qubits that counts but their quality, meaning qubits with as low a noise level as possible, to ensure they last as long as possible in superposition and allow the machine to compute. And it’s here, in the fiddly world of noise reduction, that quantum computing hits up against one of its biggest challenges. Right now, the device you’re reading this on probably performs at a level similar to that of a quantum computer with 30 noisy qubits. But if you can reduce the noise, then the quantum computer is many times more powerful.
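
    Why lifetime matters more than raw qubit count can be seen with a simplified exponential-decoherence model: the longer a qubit survives in superposition, the more gates fit inside its useful window. The coherence time and gate duration below are hypothetical placeholders, not figures from IBM's lab:

```python
import math

# Simplified model: a qubit with characteristic coherence time T1 has an
# exponentially shrinking chance of remaining usable as gates accumulate.
# Both constants are illustrative assumptions.
T1 = 100e-6     # assumed coherence time: 100 microseconds
T_GATE = 50e-9  # assumed duration of one gate: 50 nanoseconds

def survival(num_gates: int) -> float:
    """Probability the qubit has not decohered after num_gates gates."""
    return math.exp(-num_gates * T_GATE / T1)

for g in (10, 1_000, 10_000):
    print(f"{g:>6} gates -> {survival(g):.1%} chance of no decoherence")
```

Under these numbers a short circuit is almost always fine, but a ten-thousand-gate circuit nearly always fails, which is why extending qubit lifetimes buys more computational depth than simply adding more noisy qubits.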

    Once the noise is reduced, researchers try to correct any remaining errors with the help of special error-correcting algorithms, run on a classical computer. The problem is, such error correction works qubit by qubit, so the more qubits there are, the more errors the system has to cope with. Say a computer makes an error once every 1,000 computational steps; it doesn’t sound like much, but after 1,000 or so operations, the program will output incorrect results. To be able to achieve meaningful computations and surpass standard computers, a quantum machine needs about 1,000 relatively low-noise qubits whose errors can be kept under control. When you put them all together, these 1,000 physical qubits make up what researchers call a logical qubit. None yet exist—so far, the best that prototype quantum devices have achieved is error correction for up to 10 qubits. That’s why these prototypes are called noisy intermediate-scale quantum computers (NISQ), a term also coined by Preskill in 2017.
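
    The "error every 1,000 steps" figure translates into a simple calculation, assuming errors strike independently (an assumption that, as the skeptics' argument below notes, is itself contested): with per-step error probability p, a run of n steps finishes error-free with probability (1 − p)^n.

```python
# How a small per-step error rate compounds over a long computation.
# Assumes independent errors, which real correlated noise may violate.
p = 0.001  # one error per 1,000 steps, on average

for n in (100, 1_000, 10_000):
    ok = (1 - p) ** n
    print(f"{n:>6} steps: {ok:.1%} chance of an error-free run")
```

At 1,000 steps the error-free probability has already dropped to roughly 37%, matching the article's point that such a machine starts outputting wrong answers after about a thousand operations.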

    For Carminati, it’s clear the technology isn’t ready yet. But that isn’t really an issue. At CERN the challenge is to be ready to unlock the power of quantum computers when and if the hardware becomes available. “One exciting possibility will be to perform very, very accurate simulations of quantum systems with a quantum computer—which in itself is a quantum system,” he says. “Other groundbreaking opportunities will come from the blend of quantum computing and artificial intelligence to analyze big data, a very ambitious proposition at the moment, but central to our needs.”

    But some physicists think NISQ machines will stay just that—noisy—forever. Gil Kalai, a professor at Yale University, says that error correcting and noise suppression will never be good enough to allow any kind of useful quantum computation. And it’s not even due to technology, he says, but to the fundamentals of quantum mechanics. Interacting systems have a tendency for errors to be connected, or correlated, he says, meaning errors will affect many qubits simultaneously. Because of that, it simply won’t be possible to create error-correcting codes that keep noise levels low enough for a quantum computer with the required large number of qubits.

    “My analysis shows that noisy quantum computers with a few dozen qubits deliver such primitive computational power that it will simply not be possible to use them as the building blocks we need to build quantum computers on a wider scale,” he says. Among scientists, such skepticism is hotly debated. The blogs of Kalai and fellow quantum skeptics are forums for lively discussion, as was a recent much-shared article titled “The Case Against Quantum Computing”—followed by its rebuttal, “The Case Against the Case Against Quantum Computing.”

    For now, the quantum critics are in a minority. “Provided the qubits we can already correct keep their form and size as we scale, we should be okay,” says Ray Laflamme, a physicist at the University of Waterloo in Ontario, Canada. The crucial thing to watch out for right now is not whether scientists can reach 50, 72, or 128 qubits, but whether scaling quantum computers to this size significantly increases the overall rate of error.

    The Quantum Nano Centre in Canada is one of numerous big-budget research and development labs focussed on quantum computing. James Brittain/Getty Images

    Others believe that the best way to suppress noise and create logical qubits is by making qubits in a different way. At Microsoft, researchers are developing topological qubits—although its array of quantum labs around the world has yet to create a single one. If it succeeds, these qubits would be much more stable than those made with integrated circuits. Microsoft’s idea is to split a particle—for example an electron—in two, creating Majorana fermion quasi-particles. They were theorized back in 1937, and in 2012 researchers at Delft University of Technology in the Netherlands, working at Microsoft’s condensed matter physics lab, obtained the first experimental evidence of their existence.

    “You will only need one of our qubits for every 1,000 of the other qubits on the market today,” says Chetan Nayak, general manager of quantum hardware at Microsoft. In other words, every single topological qubit would be a logical one from the start. Reilly believes that researching these elusive qubits is worth the effort, despite years with little progress, because if one is created, scaling such a device to thousands of logical qubits would be much easier than with a NISQ machine. “It will be extremely important for us to try out our code and algorithms on different quantum simulators and hardware solutions,” says Carminati. “Sure, no machine is ready for prime time quantum production, but neither are we.”

    Another company Carminati is watching closely is IonQ, a US startup that spun out of the University of Maryland. It uses the third main approach to quantum computing: trapping ions. They are naturally quantum, having superposition effects right from the start and at room temperature, meaning that they don’t have to be supercooled like the integrated circuits of NISQ machines. Each ion is a singular qubit, and researchers trap them with special tiny silicon ion traps and then use lasers to run algorithms by varying the times and intensities at which each tiny laser beam hits the qubits. The beams encode data to the ions and read it out from them by getting each ion to change its electronic states.

    In December, IonQ unveiled its commercial device, capable of hosting 160 ion qubits and performing simple quantum operations on a string of 79 qubits. Still, right now, ion qubits are just as noisy as those made by Google, IBM, and Intel, and neither IonQ nor any other labs around the world experimenting with ions have achieved quantum supremacy.

    As the noise and hype surrounding quantum computers rumbles on, at CERN, the clock is ticking. The collider will wake up in just five years, ever mightier, and all that data will have to be analyzed. A non-noisy, error-corrected quantum computer will then come in quite handy.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

  • richardmitnick 1:29 pm on March 8, 2019 Permalink | Reply
    Tags: , And finally they will be shipped to CERN, “The need to go beyond the already excellent performance of the LHC is at the basis of the scientific method” said Giorgio Apollinari Fermilab scientist and HL-LHC AUP project manager., , , CERN LHC, Each magnet will have four sets of coils making it a quadrupole., Earlier this month the AUP earned approval for both Critical Decisions 2 and 3b from DOE., Fermilab will manufacture 43 coils and Brookhaven National Laboratory in New York will manufacture another 41, , , In its current configuration on average an astonishing 1 billion collisions occur every second at the LHC., It’s also the reason behind the collider’s new name the High-Luminosity LHC., LHC AUP began just over two years ago and on Feb. 11 it received key approvals allowing the project to transition into its next steps., , , , Superconducting niobium-tin magnets have never been used in a high-energy particle accelerator like the LHC., The AUP calls for 84 coils fabricated into 21 magnets., The first upgrade is to the magnets that focus the particles., The magnets will be sent to Brookhaven to be tested before being shipped back to Fermilab., The new technologies developed for the LHC will boost that number by a factor of 10., The second upgrade is a special type of accelerator cavity., The U.S. Large Hadron Collider Accelerator Upgrade Project is the Fermilab-led collaboration of U.S. laboratories in partnership with CERN and a dozen other countries., These new magnets will generate a maximum magnetic field of 12 tesla roughly 50 percent more than the niobium-titanium magnets currently in the LHC., This means that significantly more data will be available to experiments at the LHC., This special cavity called a crab cavity is used to increase the overlap of the two beams so that more protons have a chance of colliding., Those will then be delivered to Lawrence Berkeley National Laboratory to be formed into accelerator magnets, Twenty successful magnets will be inserted into 10 containers which are then tested by Fermilab, U.S. Department of Energy projects undergo a series of key reviews and approvals referred to as “Critical Decisions” that every project must receive., U.S. physicists and engineers helped research and develop two technologies to make this upgrade possible.   

    From Brookhaven National Lab: “Large Hadron Collider Upgrade Project Leaps Forward” 

    From Brookhaven National Lab

    March 4, 2019
    Caitlyn Buongiorno

    Staff members of the Superconducting Magnet Division at Brookhaven National Laboratory next to the “top hat”— the interface between the room temperature components of the magnet test facility and the LHC high-luminosity magnet to be tested. The magnet is attached to the bottom of the top hat and tested in superfluid helium at temperatures close to absolute zero. Left to right: Joseph Muratore, Domenick Milidantri, Sebastian Dimaiuta, Raymond Ceruti, and Piyush Joshi. Credit: Brookhaven National Laboratory

    The U.S. Large Hadron Collider Accelerator Upgrade Project is the Fermilab-led collaboration of U.S. laboratories that, in partnership with CERN and a dozen other countries, is working to upgrade the Large Hadron Collider.

    LHC AUP began just over two years ago and, on Feb. 11, it received key approvals, allowing the project to transition into its next steps.


    CERN map

    CERN LHC Tunnel

    CERN LHC particles

    U.S. Department of Energy projects undergo a series of key reviews and approvals, referred to as “Critical Decisions,” that every project must receive. In February, the AUP earned approval for both Critical Decisions 2 and 3b from DOE. CD-2 approves the performance baseline — the scope, cost and schedule — for the AUP. In order to stay on that schedule, CD-3b allows the project to receive the funds and approval necessary to purchase base materials and produce final design models of two technologies by the end of 2019.

    The LHC, a 17-mile-circumference particle accelerator on the French-Swiss border, smashes together two opposing beams of protons to produce other particles. Researchers use the particle data to understand how the universe operates at the subatomic scale.

    In its current configuration, on average, an astonishing 1 billion collisions occur every second at the LHC. The new technologies developed for the LHC will boost that number by a factor of 10. This increase in luminosity — the number of proton-proton interactions per second — means that significantly more data will be available to experiments at the LHC. It’s also the reason behind the collider’s new name, the High-Luminosity LHC.
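The factor-of-10 jump in luminosity translates directly into event rate, because the collision rate is simply luminosity multiplied by the proton-proton cross section. A back-of-the-envelope sketch in Python (the luminosity and cross-section values here are illustrative round numbers, not official machine parameters):

```python
# Rough estimate of LHC collision rates from luminosity.
# Numbers are illustrative, not official accelerator parameters.

INELASTIC_XSEC_MB = 80.0   # proton-proton inelastic cross section, roughly 80 millibarn
MB_TO_CM2 = 1e-27          # 1 millibarn = 1e-27 cm^2

def collisions_per_second(luminosity_cm2_s):
    """Event rate = luminosity x cross section."""
    return luminosity_cm2_s * INELASTIC_XSEC_MB * MB_TO_CM2

lhc_now = 1e34   # cm^-2 s^-1, roughly the current LHC design luminosity
hl_lhc = 1e35    # the factor-of-10 boost planned for the HL-LHC

print(f"LHC:    {collisions_per_second(lhc_now):.1e} collisions/s")
print(f"HL-LHC: {collisions_per_second(hl_lhc):.1e} collisions/s")
```

With these round numbers the current-LHC estimate comes out near the article’s “1 billion collisions per second,” and the HL-LHC figure is ten times that.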

    This “crab cavity” is designed to maximize the chance of collision between two opposing particle beams. Photo: Paolo Berrutti

    “The need to go beyond the already excellent performance of the LHC is at the basis of the scientific method,” said Giorgio Apollinari, Fermilab scientist and HL-LHC AUP project manager. “The endorsement and support received for this U.S. contribution to the HL-LHC will allow our scientists to remain at the forefront of research at the energy frontier.”

    U.S. physicists and engineers helped research and develop two technologies to make this upgrade possible. The first upgrade is to the magnets that focus the particles. The new magnets rely on niobium-tin conductors and can exert a stronger force on the particles than their predecessors. By increasing the force, the particles in each beam are driven closer together, enabling more proton-proton interactions at the collision points.

    The second upgrade is a special type of accelerator cavity. Cavities are structures inside colliders that impart energy to the particle beam and propel it forward. This special cavity, called a crab cavity, is used to increase the overlap of the two beams so that more protons have a chance of colliding.

    “This approval is a recognition of 15 years of research and development started by a U.S. research program and completed by this project,” said Giorgio Ambrosio, Fermilab scientist and HL-LHC AUP manager for magnets.

    This completed niobium-tin magnet coil will generate a maximum magnetic field of 12 tesla, roughly 50 percent more than the niobium-titanium magnets currently in the LHC. Photo: Alfred Nobrega

    Magnets help the particles go ’round

    Superconducting niobium-tin magnets have never been used in a high-energy particle accelerator like the LHC. These new magnets will generate a maximum magnetic field of 12 tesla, roughly 50 percent more than the niobium-titanium magnets currently in the LHC. For comparison, an MRI’s magnetic field ranges from 0.5 to 3 tesla, and Earth’s magnetic field is only 50 millionths of one tesla.

    There are multiple stages to creating the niobium-tin coils for the magnets, and each brings its challenges.

    Each magnet will have four sets of coils, making it a quadrupole. Together the coils conduct the electric current that produces the magnetic field of the magnet. In order to make niobium-tin capable of producing a strong magnetic field, the coils must be baked in an oven and turned into a superconductor. The major challenge with niobium-tin is that its superconducting phase is brittle: like uncooked spaghetti, it can snap in two under a small amount of pressure if the coils are not well supported. Therefore, the coils must be handled delicately from this point on.
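The quadrupole geometry is what lets these magnets act as lenses for the beam: the field is zero on the axis and grows linearly with transverse distance, so the magnetic force on an off-axis proton is proportional to its displacement. A toy sketch (the gradient value is illustrative, not the AUP specification):

```python
# Minimal sketch of an ideal quadrupole field: B_x = g*y, B_y = g*x.
# The field vanishes on the axis and grows linearly off-axis, which is
# why a quadrupole focuses the beam in one plane (and defocuses it in
# the other, so quadrupoles are used in alternating pairs).
# The gradient value is illustrative only.

g = 132.6   # field gradient in tesla per meter (made-up number)

def quadrupole_field(x, y):
    """Field components (tesla) at transverse position (x, y) in meters."""
    return g * y, g * x

# A proton 1 cm off-axis horizontally sees a purely vertical field,
# and therefore a horizontal restoring (or defocusing) force:
bx, by = quadrupole_field(0.01, 0.0)
print(f"B = ({bx:.2f}, {by:.2f}) T")   # -> B = (0.00, 1.33) T
```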

    The AUP calls for 84 coils, fabricated into 21 magnets. Fermilab will manufacture 43 coils, and Brookhaven National Laboratory in New York will manufacture another 41. Those will then be delivered to Lawrence Berkeley National Laboratory to be formed into accelerator magnets. The magnets will be sent to Brookhaven to be tested before being shipped back to Fermilab. Twenty successful magnets will be inserted into 10 containers, which are then tested by Fermilab, and finally shipped to CERN.

    With CD-2/3b approval, AUP expects to have the first magnet assembled in April and tested by July. If all goes well, this magnet will be eligible for installation at CERN.

    Crab cavities for more collisions

    Cavities accelerate particles inside a collider, boosting them to higher energies. They also form the particles into bunches: As individual protons travel through the cavity, each one is accelerated or decelerated depending on whether it is below or above an expected energy. This process essentially sorts the beam into collections of protons, or particle bunches.
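That feedback loop, where the cavity nudges each proton’s energy toward the reference value and the energy error in turn shifts the proton’s arrival time, is what holds a bunch together. A toy longitudinal-dynamics sketch, with made-up kick and slip coefficients rather than LHC parameters:

```python
# Toy model of RF bunching: on each pass through the cavity, a particle's
# energy error gets a corrective kick based on its arrival phase, and the
# energy error then shifts its arrival phase on the next turn.
# Coefficients are illustrative, not machine parameters.

def cavity_pass(phase, energy_error, kick=0.2, slip=0.3):
    """Advance one particle by one turn through the cavity."""
    energy_error = energy_error - kick * phase   # late/early -> corrective energy kick
    phase = phase + slip * energy_error          # off-energy -> arrival time drifts
    return phase, energy_error

# A particle that starts ahead of the reference oscillates around it
# instead of drifting away; that stable oscillation is the bunch.
phase, err = 1.0, 0.0
for _ in range(5):
    phase, err = cavity_pass(phase, err)
    print(f"phase={phase:+.3f}  energy_error={err:+.3f}")
```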

    HL-LHC puts a spin on the typical cavity with its crab cavities, which get their name from how the particle bunches appear to move after they’ve passed through the cavity. When a bunch exits the cavity, it appears to move sideways, similar to how a crab walks. This sideways movement is actually a result of the crab cavity rotating the particle bunches as they pass through.

    Imagine that a football were actually a particle bunch. Typically, you want to throw a football straight ahead, with the pointed end cutting through the air. The same is true for particle bunches; they normally go through a collider like a football. Now let’s say you wanted to ensure that your football and another football would collide in mid-air. Rather than throwing it straight on, you’d want to throw the football on its side to maximize the size of the target and hence the chance of collision.

    Of course, turning the bunches is harder than turning a football, as each bunch isn’t a single, rigid object.

    To make the rotation possible, the crab cavities are placed right before and after the collision points at two of the particle detectors at the LHC, called ATLAS and CMS. An alternating electric field runs through each cavity and “tilts” the particle bunch on its side. To do this, the front section of the bunch gets a “kick” to one side on the way in and, before it leaves, the rear section gets a “kick” to the opposite side. Now, the particle bunch looks like a football on its side. When the two bunches meet at the collision point, they overlap better, which makes the occurrence of a particle collision more likely.
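The head-and-tail kicks described above amount to a transverse kick proportional to each particle’s longitudinal position within the bunch. A minimal sketch, with an arbitrary strength parameter standing in for the real cavity voltage and frequency:

```python
# Sketch of a linearized crab-cavity "tilt": the head and tail of the
# bunch receive equal-and-opposite transverse kicks proportional to their
# longitudinal offset, so the bunch as a whole rotates onto its side.
# The strength parameter is illustrative only.

def crab_kick(z, strength=0.5):
    """Transverse kick (arbitrary units) for a particle at longitudinal
    offset z from the bunch center: head (z > 0) and tail (z < 0) go
    opposite ways, and the center is untouched."""
    return -strength * z

bunch = [-1.0, -0.5, 0.0, 0.5, 1.0]   # particle positions along the bunch
for z in bunch:
    print(f"z={z:+.1f}  transverse kick={crab_kick(z):+.2f}")
```

The equal-and-opposite kicks at head and tail are what make the bunch exit the cavity tilted like the sideways football of the analogy above.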

    After the collision point, more crab cavities straighten the remaining bunches, so they can travel through the rest of the LHC without causing unwanted interactions.

    With CD-2/3b approval, all raw materials necessary for construction of the cavities can be purchased. Two crab cavity prototypes are expected by the end of 2019. Once the prototypes have been certified, the project will seek further approval for the production of all cavities destined for the LHC tunnel.

    After further testing, the cavities will be sent out to be “dressed”: placed in a cooling vessel. Once the dressed cavities pass all acceptance criteria, Fermilab will ship all 10 dressed cavities to CERN.

    “It’s easy to forget that these technological advances don’t benefit just accelerator programs,” said Leonardo Ristori, Fermilab engineer and an HL-LHC AUP manager for crab cavities. “Accelerator technology existed in the first TV screens and is currently used in medical equipment like MRIs. We might not be able to predict how these technologies will appear in everyday life, but we know that these kinds of endeavors ripple across industries.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    BNL Campus



    BNL RHIC Campus

    BNL/RHIC Star Detector


    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.
