Tagged: HEP

  • richardmitnick 2:11 pm on September 23, 2020
    Tags: "Berkeley Team Plays Key Role in Analysis of Particle Interactions That Produce Matter From Light", CERN’s ATLAS detector produced W bosons from photons which are particles of light., HEP, Photons are particles of light that carry the electromagnetic force which is the fundamental force associated with magnetism and electricity., W bosons carry the weak force which is associated with the fusion that powers the sun and with nuclear fission that takes place in nuclear power plant reactors.

    From Lawrence Berkeley National Lab: “Berkeley Team Plays Key Role in Analysis of Particle Interactions That Produce Matter From Light” 


    September 23, 2020
    Glenn Roberts Jr.
    (510) 520-0843

    This image shows a reconstruction of a particle event at CERN’s ATLAS detector that produced W bosons from photons, which are particles of light. (Credit: ATLAS collaboration.)

    Researchers at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) played a key role in an analysis of data from the world’s largest particle collider that found proof of rare, high-energy particle interactions in which matter was produced from light.

    Simone Pagan Griso, a Berkeley Lab physicist and Divisional Fellow who coordinated the efforts of the Berkeley Lab team, said his team found about 174 particle interactions that are consistent with the creation of pairs of heavy force-carrying particles called W bosons from the collision of two photons.

    Photons are particles of light that carry the electromagnetic force, which is the fundamental force associated with magnetism and electricity. W bosons carry the weak force, which is associated with the fusion that powers the sun, and with a nuclear reaction called nuclear fission that takes place in nuclear power plant reactors.

    From 2015 to 2018, the ATLAS detector at CERN’s Large Hadron Collider (LHC) measured roughly 30 trillion proton interactions per data-taking day, of which only about one would produce a W boson pair from the interaction of two photons, Pagan Griso said.

    CERN ATLAS detector. (Image: Claudia Marcelloni.)

    The LHC is designed to accelerate and collide protons, which are positively charged particles found in atomic nuclei. The acceleration of bunches of these particles to nearly the speed of light produces strong electromagnetic fields that accompany the proton bunches and act like a field of photons. So when these high-energy photon bunches pass each other very closely at the LHC, their electromagnetic fields may interact, causing what’s known as an “ultra-peripheral collision.”

    Unlike the destructive proton-proton particle collisions that are used to generate a swarm of constituent particles and lead to particle interactions that are typically studied at the LHC, these ultra-peripheral collisions are more like rocks skipping across a surface. The interacting fields produce “quasi-real” photons, which are effects that resemble genuine photons in their characteristics but are not actual particles.

    In this latest analysis, the researchers focused on those rare occasions when the quasi-real photons produced pairs of W bosons.

    “It’s around 1,000 times less likely to happen than a quark-initiated W boson pair creation,” which presented a challenge in filtering out these more common types of interactions, Pagan Griso noted.

    “We had to be able to predict how much background we expected relative to a signal, and all of the other interactions that happen nearby,” he said. “This meant a lot of modeling and simulations to understand what different phenomena will look like.” Ultimately, the W boson pairs produced from the photon-photon interactions decayed to an electron and a muon, a particle in the same class as the electron but with a mass about 200 times greater.
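
    The “200 times greater” figure can be checked directly from the published particle masses (a back-of-the-envelope sketch using PDG values, not part of the original analysis):

```python
# Rough check of the muon-to-electron mass ratio quoted above,
# using PDG values for the particle masses (in MeV/c^2).
m_electron = 0.5109989   # electron mass, MeV/c^2
m_muon = 105.6583755     # muon mass, MeV/c^2
ratio = m_muon / m_electron
print(f"m_mu / m_e ~ {ratio:.0f}")  # ~207, i.e. roughly 200 times heavier
```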

    Also participating in Berkeley Lab’s analysis were Aleksandra Dimitrievska, a postdoctoral researcher and Chamberlain Fellow in the Physics Division; William Patrick McCormack, a Ph.D. student at UC Berkeley and a Berkeley Lab researcher; and Maurice Garcia-Sciveres and Juerg Beringer, who are both senior staff scientists at Berkeley Lab.

    Pagan Griso noted that there was some hint for the production of W boson pairs from photon pairs in earlier data-taking at the LHC, though it was far less conclusive than this latest analysis.

    “We wrote this measurement from A to Z,” Pagan Griso said of the Berkeley Lab team involved in the studies. “We literally were involved in the entire spectrum of this analysis.”

    He added, “With even more data expected at the LHC in the future, we can probe this even better,” and learn more about the production rate of W boson pairs from photon-photon interactions, and the strength of the self-interaction among these four bosons, which is a stringent test of the Standard Model of particle physics. The team will also try to improve its analysis techniques, he said.

    Pagan Griso and other members of the Berkeley Lab ATLAS Group started the analysis, together with international collaborators, about a year and a half ago, he said. These recent results are preliminary and the study will soon be submitted to a scientific journal.

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

    LBNL campus

    LBNL Molecular Foundry

    Bringing Science Solutions to the World
    In the world of science, Lawrence Berkeley National Laboratory (Berkeley Lab) is synonymous with “excellence.” Thirteen Nobel prizes are associated with Berkeley Lab. Seventy Lab scientists are members of the National Academy of Sciences (NAS), one of the highest honors for a scientist in the United States. Thirteen of our scientists have won the National Medal of Science, our nation’s highest award for lifetime achievement in fields of scientific research. Eighteen of our engineers have been elected to the National Academy of Engineering, and three of our scientists have been elected into the Institute of Medicine. In addition, Berkeley Lab has trained thousands of university science and engineering students who are advancing technological innovations across the nation and around the world.

    Berkeley Lab is a member of the national laboratory system supported by the U.S. Department of Energy through its Office of Science. It is managed by the University of California (UC) and is charged with conducting unclassified research across a wide range of scientific disciplines. Located on a 202-acre site in the hills above the UC Berkeley campus that offers spectacular views of the San Francisco Bay, Berkeley Lab employs approximately 3,232 scientists, engineers and support staff. The Lab’s total costs for FY 2014 were $785 million. A recent study estimates the Laboratory’s overall economic impact through direct, indirect and induced spending on the nine counties that make up the San Francisco Bay Area to be nearly $700 million annually. The Lab was also responsible for creating 5,600 jobs locally and 12,000 nationally. The overall economic impact on the national economy is estimated at $1.6 billion a year. Technologies developed at Berkeley Lab have generated billions of dollars in revenues, and thousands of jobs. Savings as a result of Berkeley Lab developments in lighting and windows, and other energy-efficient technologies, have also been in the billions of dollars.

    Berkeley Lab was founded in 1931 by Ernest Orlando Lawrence, a UC Berkeley physicist who won the 1939 Nobel Prize in physics for his invention of the cyclotron, a circular particle accelerator that opened the door to high-energy physics. It was Lawrence’s belief that scientific research is best done through teams of individuals with different fields of expertise, working together. His teamwork concept is a Berkeley Lab legacy that continues today.

    A U.S. Department of Energy National Laboratory Operated by the University of California.

    University of California Seal

  • richardmitnick 12:54 pm on September 23, 2020
    Tags: "Unraveling Nature's secrets: vector boson scattering at the LHC", HEP

    From CERN ATLAS: “Unraveling Nature’s secrets: vector boson scattering at the LHC”

    CERN/ATLAS detector

    CERN ATLAS Higgs Event

    CERN ATLAS, another view. (Image: Claudia Marcelloni/ATLAS/CERN.)


    22nd September 2020
    Lucia Di Ciaccio
    Simone Pagan Griso

    Figure 1: “Wandering the immeasurable”, a sculpture designed by Gayle Hermick welcomes the CERN visitors. From the Mesopotamians’ cuneiform script to the mathematical formalism behind the discovery of the Higgs boson, the sculpture narrates the story of how knowledge is passed through the generations and illustrates the aesthetic nature of the mathematics behind physics. (Image: J. Guillaume/CERN.)

    In 2017, the ATLAS and CMS Collaborations announced the detection of a process in high-energy proton–proton collisions that had not been observed before: vector boson scattering. It results in the production of two W particles with the same electric charge as well as two collimated sprays of particles called “jets” (see Figure 2). The observation of vector boson scattering didn’t receive as much attention from the media as the Higgs boson discovery in 2012, even though it was an important event for the particle physics community. Another missing piece of the big puzzle had been found – the puzzle that is the mathematical description of the microscopic world (see Figure 1).

    The W+ and W– bosons are unstable particles, which decay (transform) into a lepton and an antilepton or a quark and an antiquark with a mean lifetime of only a few 10⁻²⁵ seconds. They have integer spin (characteristic of bosons) and are carriers of the weak force. Though the weak force is not directly experienced in everyday life, it is nevertheless important as it is responsible for radioactive β decay, which plays a role in the fusion of hydrogen into helium that powers the Sun’s thermonuclear process.
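
    That lifetime follows from the W boson’s measured total decay width via the uncertainty relation τ = ħ/Γ. A minimal sketch, assuming the PDG value Γ_W ≈ 2.085 GeV:

```python
# Mean lifetime of the W boson from its total decay width: tau = hbar / Gamma.
hbar = 6.582119569e-25  # reduced Planck constant in GeV*s
gamma_w = 2.085         # W boson total decay width in GeV (PDG value, assumed)
tau_w = hbar / gamma_w  # mean lifetime in seconds
print(f"tau_W ~ {tau_w:.2e} s")  # ~3.2e-25 s, i.e. a few 10^-25 seconds
```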

    To appreciate the importance of this discovery, it is instructive to follow the history of how and why the W+ and W– bosons were introduced; it illustrates nicely how the interplay between experimental information, theoretical models and mathematical principles drives progress in physics.

    Figure 2: Simplified view of a proton–proton collision event recorded with the ATLAS detector that was selected as a candidate for vector-boson-scattering production. The insert depicts a schematic view of the candidate physics process. Protons (p) from the LHC beam travel from left to right and from right to left in this view. They collide at approximately the centre of the detector. Within a very short period of time, too short to be resolved, two W bosons are emitted independently by the incoming quarks (q) from each of the LHC proton beams. These W bosons interact and each of the resulting W bosons decays to a muon (μ) and a neutrino (ν), where the neutrinos leave the ATLAS detector undetected. The outgoing quarks undergo a process called hadronization and manifest as a spray of particles called a “jet”. (Image: ATLAS Collaboration/CERN.)

    Enrico Fermi originally formulated a mathematical description of the weak force in 1933 as a “contact interaction” between particles, occurring at a single point without a carrier particle propagating the force. This formulation successfully described the known experimental observations, including the radioactive β decay for which it was developed. However it was soon realised that its predictions at high energy, a regime not yet experimentally accessible at that time, were bound to fail.

    Indeed, Fermi’s theory predicts that the production rate of some processes caused by the weak force – such as the elastic scattering of neutrinos on electrons – increases linearly with the neutrino energy. This continuous growth, however, leads to the violation of a limit derived from the conservation of probability in a scattering process. In other words, predictions become unphysical at a high enough energy. To overcome this problem, physicists modified Fermi’s theory by introducing “by hand” two massive spin-one (“vector”) charged particles propagating the interaction between neutrinos and electrons, dubbed “intermediate vector bosons”. This development came well before the discovery of the W bosons decades later.
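
    A rough dimensional-analysis sketch shows where this breakdown happens (a toy estimate, not the full partial-wave calculation): the contact-interaction cross-section grows like G_F²·s with the squared collision energy s, while the s-wave unitarity bound falls like ~π/s, so the two cross near s = π/G_F.

```python
import math

G_F = 1.1663787e-5  # Fermi constant in GeV^-2 (PDG value)

# Toy estimate: sigma_Fermi ~ G_F^2 * s / pi grows with s, while the
# unitarity bound ~ pi / s falls with s; equate them to find the crossing.
s_cross = math.pi / G_F       # GeV^2
sqrt_s = math.sqrt(s_cross)   # centre-of-mass energy in GeV
print(f"Fermi theory becomes unphysical near sqrt(s) ~ {sqrt_s:.0f} GeV")
```

    The resulting scale of a few hundred GeV is far above the energies accessible in Fermi’s time, which is why the theory worked so well for β decay yet had to be replaced at high energy.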

    So even if the discovery of the long-awaited W± bosons in 1983 – and, five months later, of a neutral companion, the Z boson – didn’t come as a real surprise to physicists, it was certainly an epochal experimental achievement. Fermi’s theory remains an example of an effective theory valid only at low energy (well below the mass of the force carrier boson) – an approximation of a more general, universally valid theory.

    Along this line, the search for a consistent description of the fundamental forces between the ultimate constituents of matter has led to the Standard Model of particle physics: a mathematical construction based on fundamental principles and experimental observations. The Standard Model provides a coherent, unified picture of three of the four fundamental interactions, namely the electromagnetic, weak and strong force. The fourth force, not included in the Standard Model, is gravity. So far, the Standard Model has been successful at describing a myriad of experimental measurements in the microscopic world. Its success is, by all means, mind blowing.

    Standard Model of Particle Physics, Quantum Diaries.

    We do not know why natural phenomena are so well described by mathematical entities and relations but, experimentally, we know that it works. Just as Galileo said four hundred years ago, the big book of Nature is written in a mathematical language[1] – the Standard Model and Einstein’s theory of gravity, for example, are additional chapters of this book.

    Particle physics makes use of a theoretical tool in which all particles are represented mathematically by quantum fields. These entities encode properties like spin and mass of a particle. In the Standard Model, the existence of the electromagnetic, weak and strong force carriers follows from the invariance of the behaviour of quantum fields under a “local gauge transformation”. This is a transformation from one field configuration to another, which can be imagined as a rotation in an abstract mathematical space. The parameters of the transformation may vary from point to point in space-time, and thus the transformation is defined as “local”. Gauge invariance or gauge symmetry is the lack of changes in measurable quantities under gauge transformations, despite the quantum fields (which represent particles) being transformed.

    Gauge invariance holds if spin-one particles are introduced, which interact in a well-defined manner with the elementary constituents of matter such as electrons and quarks (constituents of the proton and neutron, or, more generally, of hadrons). These spin-one particles are interpreted as “the carriers of the interaction” between the matter particles, with the photon the carrier for the electromagnetic force, the W–, W+ and Z bosons for the weak force, and eight gluons for the strong force. These are the (intermediate) vector bosons introduced above.

    In this way, the Standard Model forces (or interactions) emerge in a very elegant manner from one general principle, namely a fundamental symmetry of Nature. Interestingly enough, in the model, the electromagnetic and weak interactions manifest themselves at high energy as different aspects of a single “electroweak” force, while at low energy the weak interaction remains feebler than the electromagnetic interaction. As a consequence, the photon, Z boson and W± bosons are collectively named “electroweak gauge bosons”.

    In the example above, Fermi followed a bottom-up approach: going from an observation to a mathematical description (a contact-interaction theory), which was modified “by hand” with a few additions to obey the general principle of probability conservation (known as “unitarity” in physics). Starting from this premise, the work of many physicists consequently led to a more general theory, one in which the description of the fundamental forces follows the opposite path: predictions are obtained from fundamental principles (such as gauge invariance) in a mathematically and physically coherent framework.[2] The interplay between these two ways of developing knowledge has been common in physics since before Newton’s time, and is still valid today.

    In both cases, a theory is successful not only when it describes the known experimental facts, but also when it has predictive power. The Standard Model possesses both virtues and examples of its predictive power include the discoveries of the Higgs boson and the neutral kind of weak interaction mediated by the Z boson.

    As a matter of fact, the Standard Model tells us (much) more: the quantum fields representing the new spin-one particles will also transform under a local gauge transformation. To ensure that the measurable quantities describing their behaviour do not change (gauge invariance, mentioned above), interactions among the carriers of the weak force must also exist, as well as among the carriers of the strong force. These self-interactions may involve three or four gauge bosons. No self-interaction among photons is possible, except indirectly through virtual processes involving intermediate particles such as electrons, as observed in a dedicated ATLAS measurement.

    The process first observed by the ATLAS and CMS Collaborations in 2017, characterised by the presence of two W bosons with the same electric charge and two jets, is a signature of the occurrence of an electroweak interaction. The dominant part of the process is due to the self-interaction among four weak gauge bosons – another central prediction of the Standard Model finally confirmed by the LHC experiments. This self-interaction manifests as a “vector boson scattering”, where two incoming gauge bosons interact and produce two, potentially different, gauge bosons as final-state particles. The production rate of this electroweak process is very low – lower than that of the Higgs boson – which is why it was observed only recently. And just like the Higgs boson discovery, the observation of this process didn’t come out of the blue.

    At the Large Electron–Positron (LEP) collider, which operated at CERN between 1989 and 2000 in what is today the LHC tunnel, physicists had already observed the self-interaction among three gauge bosons. They measured the production of a pair of gauge bosons of opposite charge, a W+ and a W– boson, in the collisions of beams of electrons and positrons, the antiparticle of the electron. According to the Standard Model, three main processes contribute to this production. They proceed via the exchange of either a photon, neutrino or Z boson between the electron and positron of the initial state and the W pair of the final state (Figure 3).

    CERN Large Electron–Positron Collider.


    Figure 3: Diagrams representing three processes contributing to the e+e- → W+W- production. They illustrate the exchange between the initial e+e- and final W+W- of (from left to right), a neutrino (ν), photon (γ), and a Z boson, respectively. The symbol γ* is often used when a photon mediates the interaction. Following the rules of quantum mechanics, the production rate of a process is computed by the square of the sum of all possible diagrams contributing to it. The diagrams in the sum may have different relative signs, so they may cancel (destructive interference), in the same way that waves can cancel each other if they arrive out of synchronization. In the case discussed here, each diagram is necessary to avoid an unphysical continuous increase of the production rate with the collision energy and to ensure the preservation of the gauge invariance of the theory. (Image: S. Pagan Griso, L. Di Ciaccio/ATLAS Collaboration.)
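
    The caption’s rule – add the diagram amplitudes first, then square – can be illustrated with a toy numerical example (the complex amplitudes below are made up for illustration; they are not real matrix elements):

```python
# Two hypothetical complex amplitudes standing in for two diagrams.
a1 = 1.0 + 0.0j
a2 = -0.8 + 0.3j

incoherent = abs(a1)**2 + abs(a2)**2  # squaring each diagram separately
coherent = abs(a1 + a2)**2            # the quantum rule: sum first, then square

print(f"incoherent sum: {incoherent:.2f}")  # 1.73
print(f"coherent sum:   {coherent:.2f}")    # 0.13 -- destructive interference
```

    Because the amplitudes partly cancel, the coherent rate is far smaller than the naive sum of the individual rates – exactly the kind of cancellation that tames the high-energy behaviour described in the caption.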

    The exchange of a photon or a Z boson occurs via the self-interaction of three weak gauge bosons: WWγ and WWZ, respectively. The main point here is that without considering all three processes, the calculated production rate would grow continuously with energy, leading to the already encountered unphysical behaviour. The observation of this process at LEP, with a production rate consistent with the Standard Model prediction, therefore confirmed the existence of a self-interaction among three bosons.

    It is striking that the theory predicts the structure of each underlying process in just such a way that, even though each process taken alone contributes to the calculated production rate a term that becomes unphysical at high energy (violating unitarity), the unphysical behaviour cancels out when all of the processes are considered together.

    So far, so good – but there’s a catch. The W± and Z bosons observed and identified by experiments as the carriers of the weak interaction are massive, yet gauge invariance is only preserved if the carriers are massless. Should physicists give up the principle of gauge invariance to reconcile the theory with experimental facts?

    A solution to this puzzle was proposed in 1964, postulating the existence of a new spin-zero (“scalar”) field with a slightly more complex mathematical structure. While the basic laws of the forces remain exactly gauge symmetric, in the sense explained above, Nature has randomly chosen (among many possibilities) a particular lowest-energy state of this field, breaking with this choice the gauge symmetry in a limited way, called “spontaneous”.

    The consequences are dramatic. Out of this new field, a new particle emerges – the scalar Higgs boson – and the W± and Z bosons become massive. Physicists now believe that gauge symmetry was not always spontaneously broken. The universe transitioned from an “unbroken phase” with massless gauge bosons to our current “broken phase” during expansion and cool-down, a fraction of a second after the Big Bang.

    The discovery of the Higgs boson in 2012 by the ATLAS and CMS Collaborations is a great success of the Standard Model theory, especially when considering that it was found to have the mass that indirect clues were pointing to.

    CERN CMS Higgs Event May 27, 2012.

    CERN ATLAS Higgs Event, June 12, 2012.

    While the Higgs boson mass is not predicted by theory, the existence of the Higgs boson with a given mass leaves a delicate footprint in natural phenomena such that, if measured very precisely (as was done at LEP and at the Tevatron, the LHC’s smaller predecessor at Fermilab, near Chicago, USA), physicists could derive constraints on its mass. The Higgs boson’s discovery was thus an experimental prowess as well as a consecration of the Standard Model. It emphasized the remarkable role of the precision measurements at LEP, even though the energy of that accelerator was not high enough to directly produce the Higgs boson.

    Obviously, the story doesn’t end here. Solid indications exist that the Standard Model is not complete and that it must be encompassed in a more general theory. This possibility is not surprising. As Fermi’s weak interaction theory exemplifies, history has shown that a theory’s validity is related to the energy range (or, equivalently, size of space) accessible by experiments.

    More generally, classical mechanics is appropriate and predictive for the macroscopic world, when the speed of the objects is small with respect to the speed of light. To describe the microscopic world, however, quantum mechanics must be invoked, and the special theory of relativity must be applied to appropriately describe the behaviour of objects moving close to light speed.

    How can physicists find experimental signs that may help to formulate a more general theory than the Standard Model?

    A valuable approach is to directly search collision events for particles not included in the Standard Model. However, this is inherently limited: only particles with a mass at or below the collision energy can be directly produced, due to the fundamental principle of energy conservation and the equivalence between mass and energy. Alternative avenues, which suffer less from this limitation but are indirect, include performing very precise measurements of fundamental parameters of the Standard Model or measuring rare processes to look for deviations with respect to theoretical predictions. Such measurements are able to explore a higher energy domain, as the LEP Higgs-boson example showed.

    Vector boson scattering is one of these rare processes. It is special because it is closely related to the Higgs mechanism and able to shed light on unexplored corners of Nature at the highest energy available in a laboratory. Similar to the LEP vector-boson study described above, vector boson scattering is expected to proceed via several processes, this time including the self-interaction of four gauge bosons as well as the exchange of a Higgs boson (see Figure 4). Without accounting for all of the processes, the calculated scattering rate grows indefinitely with energy, leading to the above-mentioned unphysical behaviour (violation of unitarity).

    It could be argued that this question is already settled, since we know that the Higgs boson exists. The key issue is that the way in which the Higgs boson interacts with the gauge bosons in the Standard Model is exactly what is required to moderate the growth of the scattering rate at high energy; a minimal deviation of the Higgs mechanism from the Standard Model prediction could result in an apparent breakdown of unitarity.

    Vector boson scattering would then occur at a rate different from what is predicted by the Standard Model, and unitarity would have to be recovered by a yet-unknown mechanism. The study of vector boson scattering thus allows physicists to investigate the Higgs mechanism in the highest energy domain accessible, where there may be signs of new physics.

    Figure 4: Diagrams of some of the processes contributing to the W+W+ → W+W+ process. Analogous diagrams contribute to the W-W- → W-W- process. As explained in Figure 3, to compute the production rate the contributions are first summed and the sum is then squared. The individual contributions may have different relative signs, leading to cancellations. In this case each contribution is necessary to avoid an unphysical continuous increase of the production rate with (the square or fourth power of) the scattering energy. (Image: S. Pagan Griso, L. Di Ciaccio/ATLAS Collaboration.)

    The LHC is the perfect place to look for rare processes like vector boson scattering, as it collides protons with the highest energy and rate ever reached. Furthermore, the ATLAS and CMS experiments are designed to select and record these rare events.

    As weak gauge bosons are extremely short-lived particles, experiments search for the scattering of vector bosons by looking for the production of two jets and two lepton–antilepton pairs in proton-proton collisions. Imagine this as two gauge bosons being emitted by the quarks from each of the incoming LHC proton beams. These gauge bosons subsequently scatter off each other and the bosons emerging from this interaction promptly decay (see Figure 2). The quarks are subsequently deflected and appear in the detector as jets of particles, typically emitted at a relatively small angle with respect to the beam direction. This is called an “electroweak” process as it is mediated by electroweak gauge bosons.

    The experimental signature of vector boson scattering is therefore characterised by the presence of the decay particles of the two bosons, accompanied by two jets with large angular separation. The W and Z bosons predominantly decay into a quark and antiquark pair. Nevertheless, the search for these rare events preferentially exploits the decays into a lepton and an antilepton, because a competing process, multi-jet production, is mediated by the strong interaction and has an overwhelming rate that obscures processes with much smaller rates.

    Still, the search for vector boson scattering is very challenging. This is not only because the rate of the process is low – accounting for only one in hundreds of trillions of proton–proton interactions – but also because, even making use of the leptonic decays, several “background” processes produce the same kinds of particles in the detector, mimicking the process’ signal.

    Due to its high rate, a particularly challenging background process is one in which the jets accompanying the decay products of the gauge bosons arise as a result of the strong-force interaction. The impact of this background with respect to the signal depends on the kind of gauge bosons which scatter. When they are W bosons with the same electric charge, the production rate of the two processes (signal and background) is comparable.

    For this reason, same-charge WW production is considered the golden channel for experimental measurements and was the first target for the ATLAS Collaboration to study vector-boson-scattering processes. ATLAS physicists reported the first strong hints of the process in a 2014 paper [Evidence for Electroweak Production of W±W±jj in pp Collisions at √s = 8 TeV with the ATLAS Detector] – a milestone in the LHC physics programme. However, it took three more years to arrive at an unambiguous observation, passing the five-sigma threshold that particle physicists use to define a discovery, corresponding to a probability of less than one in 3.5 million that the observed signal could be due to a mere upward statistical fluctuation of the number of background events. In the years between the first hint and the discovery, the LHC was upgraded to increase its proton–proton collision energy – from 8 TeV to 13 TeV – as well as its collision rate, yielding about six times more collected data. These improvements made the observation of vector boson scattering possible – the era of its study had at last begun.
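
    The “one in 3.5 million” figure is simply the one-sided tail probability of a Gaussian at five standard deviations, which can be checked directly (a standard statistics computation, not specific to this analysis):

```python
import math

# One-sided Gaussian tail probability at 5 sigma:
# p = P(Z > 5) = erfc(5 / sqrt(2)) / 2
p_value = 0.5 * math.erfc(5 / math.sqrt(2))
odds = 1 / p_value
print(f"p ~ {p_value:.2e}, i.e. about 1 in {odds:,.0f}")  # ~1 in 3.5 million
```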

    However, not all electroweak bosons are equal. While the observation of two same-charge W bosons has allowed physicists to start testing the interaction of four W bosons (WWWW, Figure 2), the quest to test other self-interactions remained. The Standard Model only allows a specific set of combinations of four-gauge-boson self-interactions: WWWW, WWγγ, WWZγ and WWZZ, forbidding interactions among four neutral bosons.

    Not all of these electroweak interactions are predicted to have the same strength and, because of this, probing them requires identifying processes that are less and less frequent. Similarly to the case of two same-charge W bosons, electroweak processes involving two jets and a WZ pair, a Zγ pair, or a ZZ pair are increasingly rare or have significantly larger backgrounds. Hunting for such processes among the billions of proton–proton collisions recorded by ATLAS requires physicists to look for subtle differences in order to distinguish a signal from very similar background processes occurring at much higher rates.

    Table 1. List of processes presented in the text (first column) that are used to study vector boson scattering: WW with same charge, WZ, ZZ or Zγ production in association with two jets (j), and photon-induced production of two W bosons. For each process a check indicates the four bosons involved in the self-interaction. Other measurements performed at the LHC also play a role to test these self-interactions but have been omitted in this table for simplicity. (Image: ATLAS Collaboration/CERN.)

    While such a task was commonly regarded as requiring much more data than had been collected so far, the LHC experiments used artificial-intelligence algorithms to distinguish the sought-after signal from the much larger background. Thanks to such innovations, ATLAS reported the observations of WZ and ZZ electroweak production in 2018 and 2019, and saw a hint of the Zγ process. Suddenly, this brand-new field saw a surge in the number of processes that could be used to probe the self-interactions of gauge bosons.

    The most recent addition is ATLAS’ observation of two W bosons produced by the interaction of two photons, each radiated by one of the LHC protons. This phenomenon occurs when the accelerated protons skim past each other, producing extremely high electromagnetic fields, with photons mediating an electromagnetic interaction between them. Such an interaction is only possible when the quantum-mechanical effects of electromagnetism are taken into account.

    This is a direct and clean probe of the γγWW gauge-boson interaction. A peculiarity of this process is that the protons participate as a whole and can remain intact after the interaction; this is very different from inelastic interactions, in which the quarks – the protons’ constituents – are the main actors (see Figure 2).

    Table 1 [above] summarises the processes that are used to study vector boson scattering at the LHC. It also shows the four bosons involved in the self-interaction. The study of each process provides a different test of the Standard Model, as modifications of the theory can differently alter the strength of the self-interactions.

    Now, ten years after the first high-energy collisions took place in the LHC, the study of vector boson scattering is a very active field – though still in its adolescence, from both the experimental and theoretical points of view. Experimentally, the size of the available signal sample is limited. The upcoming data-taking period (from 2022 to 2024) and the high-luminosity phase of the LHC (starting in 2027) will increase the amount of collected data by more than a factor of two and by an additional factor of ten, respectively. An extensive upgrade of the LHC experiments is also ongoing, which will further improve the detection capabilities for vector-boson-scattering processes.

    In parallel, physicists will continue to improve their analysis methods, relying on more and more advanced artificial-intelligence algorithms to disentangle the rare signal processes from the abundant backgrounds. Physicists are also employing advanced calculation techniques to improve the precision of Standard Model predictions to match the increased measurement precision.

    Furthermore, a bottom-up approach is being introduced which follows in the footsteps of Enrico Fermi. Physicists have developed a theoretical framework that allows new mathematical terms, respecting basic conservation rules and symmetries, to be added “by hand” to the Standard Model, without relying on a specific new physics model. These terms change the predictions in the high-energy regime where new physics could be expected (Figure 5). The simplest form of this approach is called Standard Model Effective Field Theory.

    Figure 5. Distribution of the photon energy in the search for events resulting from the electroweak production of two vector bosons (a Z and a γ) associated with two jets (Zγjj-EW). The black polymarkers represent the data, the full histograms with different colours represent the Standard Model predicted contributions for the signal (in brown) and the many background processes (in different colours). All expected contributions are stacked. The dotted blue line in the upper panel indicates the calculated signal distribution when a new term is added to the Standard Model theory. (Image: ATLAS Collaboration/CERN.)

    Even though we know that an effective theory cannot work at an arbitrarily high energy scale, history has shown that, supplemented by measurements, it can provide useful guidance at lower energy. Different production-rate measurements – including those of the Higgs boson, boson self-interactions and the top quark – can be, separately or simultaneously, compared to predictions in the same effective theoretical framework.

    It would be a sensation if more precise measurements indicated that such new terms are necessary to describe the data. It would be a sign of physics beyond the Standard Model and, depending on which kinds of terms are needed, an indication of the direction to take in developing a more complete theory. The interplay between experimental observations and models in the quest for a complete theory would continue.

    Ultimately, all ongoing experimental collider and non-collider studies in particle physics will contribute to building knowledge – be they direct searches for new particles, precision measurements exploiting the power of quantum fluctuations or studies of rare processes. This experimental work is complemented by ever more precise theoretical calculations. In this task, the next generation of powerful particle accelerators now being planned will be indispensable tools to find new phenomena that would help us understand the remaining mysteries of the microscopic world.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition


  • richardmitnick 3:31 pm on September 17, 2020 Permalink | Reply
    Tags: "Parking the LHC proton train", , , HEP, If something goes wrong we have to be able to get rid of this beam immediately from the accelerator and send it somewhere safe where it can’t do any damage., It’s an exciting multidisciplinary activity in which the boundary of engineering physics is put to the extreme., , , , Sometimes that’s because the circulating particles have lost too much energy to produce good collisions., , The beam dump is a solid graphite cylinder 8 meters long and under a meter in diameter. It’s wrapped in a stainless-steel case filled with nitrogen gas and surrounded by iron and concrete shielding., The beam dumps provide an excellent mixture of physics and engineering challenges within a challenging radiation environment., The proton train must be parked outside the LHC about three times a day.   

    From Symmetry: “Parking the LHC proton train” 

    Symmetry Mag
    From Symmetry

    Zack Savitsky

    Particle accelerators like the LHC require intricate beam dump systems to safely dispose of high-energy particles after each run.


    The Large Hadron Collider at CERN is the world’s most powerful particle accelerator. It hurtles hundreds of trillions of protons around a circular track at just under the speed of light.

    While each individual proton has the kinetic energy of a flying mosquito, the whole proton beam—a collection of 2500 bunches of particles—has as much energy as a 10-carriage railway train traveling 200 mph.
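The mosquito and train comparisons can be sanity-checked from the beam parameters. A rough Python estimate, assuming a typical bunch intensity of about 1.2 × 10¹¹ protons, the Run-2 beam energy of 6.5 TeV per proton, and a carriage mass of about 40 tonnes (none of these figures appear in the article):

```python
EV_TO_J = 1.602176634e-19        # joules per electronvolt

protons_per_bunch = 1.2e11       # typical LHC bunch intensity (assumed)
n_bunches = 2500                 # from the article
proton_energy_ev = 6.5e12        # Run-2 energy per proton (assumed)

per_proton_j = proton_energy_ev * EV_TO_J                      # ~1 microjoule: a flying mosquito
beam_energy_j = per_proton_j * protons_per_bunch * n_bunches   # ~3e8 J for the whole beam

# Rough comparison: a 10-carriage train (~40 t per carriage, assumed) at 200 mph
train_ke_j = 0.5 * (10 * 40e3) * (200 * 0.44704) ** 2

print(f"per proton: {per_proton_j:.2e} J, whole beam: {beam_energy_j:.2e} J")
print(f"train kinetic energy: {train_ke_j:.2e} J")
```

With these assumptions the beam carries roughly 300 megajoules – about a microjoule per proton, and within an order of magnitude of the train’s kinetic energy.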

    Like an underground high-speed train, the energetic protons ride along the LHC’s 17-mile track about 100 meters below the surface of the Earth. At the end of each run, or when there are issues on the track, the proton train needs to be able to stop quickly and carefully.

    “If something goes wrong, we have to be able to get rid of this beam immediately from the accelerator and send it somewhere safe where it can’t do any damage,” says Brennan Goddard, leader of the beam transfer group at CERN.

    The proton train must be parked outside the LHC about three times a day. Sometimes that’s because the circulating particles have lost too much energy to produce good collisions. Other times it’s due to an electrical malfunction in the machine. In either case, scientists and engineers have designed a system that immediately diverts the beam to its own train station: the beam dump.

    “The LHC is often referred to as the most complex machine ever built,” says Alex Krainer, a doctoral student at CERN currently designing beam dumps for future accelerators. “The beam dump needs to be the most reliable system in the whole collider. We couldn’t dream of running the machine without it.”

    But how can scientists divert and park a train that is many miles long, the width of a dime, and contains enough stored energy to melt a ton of copper? Like any modern locomotive, it starts with a signal from a complex warning system.

    The LHC is outfitted with tens of thousands of sensors that are continually monitoring the size, spacing and trajectory of the proton beam. If the beam misbehaves, these sensors send an automated signal that triggers a set of magnets to kick the proton train onto a different track. Once the signal is received, the beam switches paths in under 90 microseconds—within one rotation around the LHC.
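The 90-microsecond figure is no accident: it is almost exactly the time for one trip around the ring. A quick check, assuming the LHC’s 26.659 km circumference (the metric value behind the article’s “17-mile track”):

```python
C = 299_792_458.0    # speed of light, m/s
RING_M = 26_659.0    # LHC circumference in metres (assumed)

# Protons travel at essentially the speed of light, so one lap takes:
t_rev_us = RING_M / C * 1e6
print(f"one LHC revolution takes about {t_rev_us:.1f} microseconds")  # ~88.9 μs
```

So extracting the beam “in under 90 microseconds” means the kicker magnets act within a single revolution.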

    On this new track, the proton train is stripped into its constituent carriages, or bunches, which spread out as they enter the beam dump—diluting the energy density that could otherwise damage it.

    The beam dump is a solid graphite cylinder 8 meters long and under a meter in diameter. It’s wrapped in a stainless-steel case, filled with nitrogen gas, and surrounded by iron and concrete shielding.

    It’s made mostly of low-density graphite “with a sandwich of higher-density materials at the end,” says Marco Calviani, leader of the targets, collimators and dumps group at CERN. “If we used only graphite, you’d still have a lot of uncollided protons passing through. And if you put the higher density material toward the front, the block would melt.”

    While dumping the beam, particle collisions cook the cylinder to over 1000 degrees Celsius and produce some new, harmless particles that pass through the block and quickly decay. Most of the proton bunches slow down while traveling through the layers of graphite and safely park in their own spots, distributing the energy of the proton train across the beam dump.

    This solves the problem of overburdening the beam dump. But a different problem arises when the rapid heating and cooling from beam collisions cause the dump to physically move.

    “In recent examination, we found that the dump has actually jumped several centimeters from the regular thermal expansion and contraction,” says Simone Gilardoni, group leader of the Sources, Targets and Interactions group at CERN.

    If the dump gets pushed too far one way, it’ll pull on the pipes connected to it. If it shuffles too far the other, it’ll hit an iron wall. There’s also the issue of wear and tear—the present block is 10 years old.

    The beam dump repair team must attend to melting, moving and bruising concerns creatively, since constant high-energy collisions create radioactive elements. Scientists at the lab are using remote-controlled robots to swap out the main absorber with an upgraded spare and implement a detached cradle for the dump, which allows it to swing back and forth to dampen the harsh movement.

    Such care is necessary to keep the experiment safe and functional. As the LHC crew prepares for its high-luminosity upgrade, which will more than double the intensity of the beam, scientists are working to reinforce the already intricate system. They plan to add more magnetic kickers to handle the beam before it hits the dump. US involvement in the HL-LHC upgrade is supported by the US Department of Energy’s Office of Science and the National Science Foundation.

    “The beam dumps provide an excellent mixture of physics and engineering challenges within a challenging radiation environment,” says Calviani. “It’s an exciting multidisciplinary activity in which the boundary of engineering physics is put to the extreme.”

    See the full article here.



    Symmetry is a joint Fermilab/SLAC publication.

  • richardmitnick 10:53 am on September 11, 2020 Permalink | Reply
    Tags: "Future machines to explore new frontiers in particle physics", , CERN FCC Future Circular Collider 100km-diameter successor to LHC., CERN-European Organization for Nuclear Research, FNAL Long-Baseline Neutrino Facility, FNAL new superconducting accelerator Proton Improvement Plan II (PIP-II), HEP, , , , , ,   

    From U.S. Department of Energy Office of Science: “Future machines to explore new frontiers in particle physics” 

    From U.S. Department of Energy Office of Science

    September 10, 2020

    Jim Siegrist
    Associate Director for High Energy Physics Office
    U.S. Department of Energy
    Email: news@science.doe.gov

    Particle physics is global. Addressing the full breadth of the field’s most urgent scientific questions requires expertise from around the world. The timeline for developing a world-class international facility to explore new frontiers in the subatomic world may take decades, but it is built from a multitude of milestones marking scientific and technical advances. The U.S. Department of Energy’s (DOE’s) Office of Science is working with partners around the globe to realise the next generation of particle physics facilities and enable future discoveries.

    Studying the science of neutrinos

    Today, the foundational groundwork is underway in the U.S. to host an international facility to study the science of neutrinos. These ghostly particles rarely interact with other forms of matter and change their flavour between three known types as they travel. To enable precision study of this puzzling behaviour, the Long-Baseline Neutrino Facility (LBNF) will produce the world’s most intense beam of neutrinos at DOE’s Fermi National Accelerator Laboratory (Fermilab), in Illinois, and send them 1,300 km through the earth to the Sanford Underground Research Facility in South Dakota.

    SURF-Sanford Underground Research Facility, Lead, South Dakota, USA.

    FNAL LBNF/DUNE from FNAL to SURF, Lead, South Dakota, USA.

    A new superconducting particle accelerator at Fermilab, the Proton Improvement Plan II (PIP-II), will provide the high-intensity proton beam needed to create the neutrinos.

    FNAL new superconducting accelerator Proton Improvement Plan II (PIP-II).

    About 1,500 m below the surface of the Earth in South Dakota, the Deep Underground Neutrino Experiment (DUNE) will measure neutrinos as they arrive from Illinois as well as from natural sources, such as supernovas from our region of the Milky Way. An international collaboration of over 1,000 scientists from 33 countries is now working to develop and build the large-scale DUNE detector, using results from prototypes at the CERN Neutrino Platform to refine their design and affirm the technology.

    International partnerships will play a crucial role in the successful realisation of this new international neutrino facility. The DOE Office of Science is working to strengthen existing collaborative partnerships in High Energy Physics and build new ones with global partners in order to bring together the necessary scientific talent and technical expertise. Formal agreements are currently in place with the European Organization for Nuclear Research (CERN) as well as the governments of India, Italy, and the United Kingdom, to contribute to different areas of this mega-scale neutrino endeavour.

    Discussions to expand the partnerships are now underway with several other countries across Europe, Asia, and South America. In fact, through such cooperative partnerships, PIP-II will become the first accelerator project hosted in the U.S. to be built with significant contributions from global partners.

    Developing particle accelerator technology

    The DOE Office of Science is also developing particle accelerator technology that will help enable future particle physics facilities around the world. DOE is supporting the development of a future “Higgs factory,” an electron-positron collider with international participation that could produce many Higgs bosons to enable precision studies that complement those at the Large Hadron Collider (LHC) at CERN.

    To realise this vision, DOE supports the R&D of accelerator and detector technologies to enable Japan to move forward with the International Linear Collider (ILC).

    ILC schematic, being planned for the Kitakami highland, in the Iwate prefecture of northern Japan.

    Over the past year, DOE has also worked with the U.S. Department of State, the White House Office of Science & Technology Policy, and the National Security Council in a concerted effort to support a Japanese initiative to move forward with the proposed ILC “Pre-Laboratory” phase of the project.

    Our scientists are developing improvements to the superconducting technology that will increase accelerator cavity efficiency and reduce the cost of construction and subsequent operations.

    FNAL A superconducting radiofrequency cavity responsible for accelerating particles at the new PIP-II accelerator.

    In June, the CERN Council unanimously adopted the resolution updating the 2020 European Strategy for Particle Physics. As recently pointed out by the CERN Director-General, the strategy is visionary and ambitious while remaining realistic and prudent, emphasising many exciting future initiatives in particle physics that can be achieved in collaboration with global partners, including the DOE. As one of its high priorities, the European strategy reaffirms the successful completion of the high-luminosity upgrades of the LHC accelerator and the LHC experimental ATLAS and CMS detectors. To enable this next era of the LHC program, the DOE Office of Science is contributing key magnets and cavity components to the accelerator upgrade, including high-field niobium-tin-based superconducting magnets developed in the United States, as well as state-of-the-art detector elements for the ATLAS and CMS detector upgrades.

    The future: New frontiers in particle physics

    Looking further ahead, to the next facility after the LHC, studies are underway for a Future Circular Collider (FCC), a next-generation complex that could reach particle collision energies over seven times that of the LHC. The development of such a facility is one of the key focal points of the 2020 update of the European strategy.

    CERN FCC Future Circular Collider: details of the proposed 100 km-circumference successor to the LHC.

    Earlier this year, the DOE Office of Science partnered with CERN and national laboratories across Europe on an FCC Innovation Study, part of a European Commission Horizon 2020 Design Study initiative, to investigate the technical design for a 100 km-circumference collider straddling the French–Swiss border – one that could also leverage the existing infrastructure at CERN. The study would enable scientists and engineers to optimise the particle collider design and plan investigations into a suitable civil engineering project, while also allowing all global partners to integrate into the study’s network and user community.

    Moreover, DOE and CERN have recently begun discussions to expand DOE’s cooperation into CERN’s proposed future collider, and DOE looks forward to working with CERN and other global partners to envision the technology that could achieve an FCC. Overall, facilities such as the LHC, FCC and LBNF/DUNE/PIP-II, spanning the frontiers of science and technology, promise to enable our quest to explore and achieve groundbreaking discoveries.

    See the full article here.



    The mission of the Energy Department is to ensure America’s security and prosperity by addressing its energy, environmental and nuclear challenges through transformative science and technology solutions.

    Science Programs Organization

    The Office of Science manages its research portfolio through six program offices:

    Advanced Scientific Computing Research
    Basic Energy Sciences
    Biological and Environmental Research
    Fusion Energy Sciences
    High Energy Physics
    Nuclear Physics

    The Science Programs organization also includes the following offices:

    The Department of Energy’s Small Business Innovation Research and Small Business Technology Transfer Programs, which the Office of Science manages for the Department;
    The Workforce Development for Teachers and Students program sponsors programs helping develop the next generation of scientists and engineers to support the DOE mission, administer programs, and conduct research; and
    The Office of Project Assessment provides independent advice to the SC leadership regarding those activities essential to constructing and operating major research facilities.

  • richardmitnick 12:06 pm on September 3, 2020 Permalink | Reply
    Tags: "The ALICE TPC is upgraded ", , ALICE at CERN, ALICE’s Time Projection Chamber (TPC) the large tracking device of the LHC’s heavy-ion specialist., , HEP, , ,   

    From ALICE at CERN: “The ALICE TPC is upgraded” 

    From ALICE at CERN

    1 September, 2020
    Chilo Garabatos

    The refurbished detector was lowered into the ALICE cavern and installed in the experiment in August.

    The ALICE TPC being tested in its clean room in May 2020. Credit: CERN

    The TPC being lowered down the shaft to the experimental cavern (Images: CERN)

    Credit: CERN

    “One more centimetre,” said the chief technician while operating the hydraulic jack system on 14 August. The 5-m-diameter, 5-m-long cylindrical detector gently slid into its parking position, 56 metres underground in the ALICE cavern at LHC Point 2, where it will stand for some time. This operation marks the culmination of the years-long upgrade of ALICE’s Time Projection Chamber (TPC), the large tracking device of the LHC’s heavy-ion specialist.

    The ALICE TPC is a big, gas-filled cylinder with a hole in the centre – to accommodate the silicon tracker as well as the beam pipe – where the charge produced by ionising radiation is projected onto detectors arranged in the two endplates. These detectors used to be multi-wire proportional chambers, 72 in total, which have now been replaced by detectors based on Gas Electron Multipliers (GEM), a micro-pattern structure developed at CERN. These new devices, together with new readout electronics that feature a continuous readout mode, will allow ALICE to record the information of all tracks produced in lead–lead collisions at rates of 50 kHz, producing data at a staggering rate of 3.5 TB/s. The average load on the chambers under these conditions is expected to be as high as 10 nA/cm², and the GEM detectors are able to cope with this. But will these new devices perform as nicely as their predecessors?
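The two rates quoted above imply an average event size, a back-of-the-envelope division worth making explicit:

```python
collision_rate_hz = 50_000   # 50 kHz lead–lead collision rate (from the article)
data_rate_bps = 3.5e12       # 3.5 TB/s readout rate (from the article)

# Average raw data volume per recorded collision
bytes_per_event = data_rate_bps / collision_rate_hz
print(f"average event size: {bytes_per_event / 1e6:.0f} MB")  # 70 MB per collision
```

At roughly 70 MB per collision, it is easy to see why continuous readout and aggressive online data compression are central to the upgrade.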

    In order to answer this question, several years of intensive R&D were necessary, since the large number of positive ions produced at the detectors would lead to excessive track distortions. This, combined with the necessity of keeping excellent energy-loss (dE/dx) resolution for particle identification, and the imperative robustness against discharges, posed an exciting challenge that led to a novel configuration of GEM-based detectors.

    As the fabrication of over 800 GEM foils was taking place at the CERN PCB workshop, the new chambers and electronics were being constructed and thoroughly tested around the world – quite a logistical exercise. The ALICE team proceeded with the final steps of the upgrade during the ongoing second long shutdown of CERN’s accelerator complex (LS2). First, the TPC was extracted from the underground cavern and brought, inside its blue frame, to a large clean room at the surface. Cranes, jacks and a huge truck were used for careful transportation. The chamber replacement, electronics installation and tests with a laser system, cosmic rays and X-rays took over a year. In July 2020, the TPC was declared ready to be re-installed in the cavern. Cranes, truck and jacks once again.

    ALICE achieved a major milestone with the completion of the TPC upgrade, after many years of intense R&D, construction and assembly. At the end of 2020, all the services will be connected and the full, upgraded TPC will be operated and commissioned together with all other detectors in the experiment. The real excitement will be when the first post-LS2 collisions from the LHC are delivered.

    See the full article here.



  • richardmitnick 7:53 am on August 28, 2020 Permalink | Reply
    Tags: "CMS experiment at CERN releases fifth batch of open data", , , , , HEP, , ,   

    From CERN CMS: “CMS experiment at CERN releases fifth batch of open data” 


    From CERN CMS

    27 August, 2020
    Achintya Rao

    All research-quality data recorded by CMS during the first two years of LHC operation are now publicly available.

    An artistic representation of the CMS detector made of pulsating lines. (Image: Achintya Rao/CERN.)

    The CMS collaboration at CERN has released into the open 18 new datasets, comprising proton–proton collision data recorded by CMS at the Large Hadron Collider (LHC) during the second half of 2011. LHC data are unique and are of interest to the scientific community as well as to those in education. Preserving the data and the knowledge to analyse them is critical. CMS has therefore committed to releasing its research data openly, with up to 100% being made available 10 years after recording them; the embargo gives the scientists working on CMS adequate time to analyse the data themselves.

    The total data volume of this latest release is 96 terabytes. Not only does this batch complement the data from the first half of 2011, released back in 2016, it also provides additional tools, workflows and examples as well as improved documentation for analysing the data using cloud technologies. The data and related materials are available on the CERN Open Data portal, an open repository built using CERN’s home-grown and open source software, Invenio.

    Previous releases from CMS included the full recorded data volume from 2010 and half the volumes from 2011 and 2012 (the first “run” of the LHC). Special “derived datasets”, some for education and others for data science, have allowed people around the world to “rediscover” the Higgs boson in CMS open data. Novel papers have also been published using CMS data, by scientists unaffiliated with the collaboration.

    In the past, those interested in analysing CMS open data needed to install the CMS software onto virtual machines to re-create the appropriate analysis environment. This made it challenging to scale up a full analysis for research use, a task that requires considerable computing resources. With this batch, CMS has updated the documentation for using software containers with all the software pre-installed and added workflows running on them, allowing the data to be easily analysed in the cloud, either at universities or using commercial providers. Some of the new workflows are also integrated with REANA, the CERN platform for reusable analyses.

    CMS and the CERN Open Data team have been working closely with current and potential users of the open data – in schools, in higher education and in research – to improve the services offered. The search functionality of the portal has been updated with feedback from teachers who participated in dedicated workshops at CERN in previous years, the documentation has been enhanced based on conversations with research users and a new online forum has been established to provide support. In September, CMS is organising a virtual workshop for theoretical physicists interested in using the open data.

    “We are thrilled to be able to release these new data and tools from CMS into the public domain,” says Kati Lassila-Perini, who has co-led the CMS project for open data and data preservation since its inception. “We look forward to seeing how the steps we have taken to improve the usability of our public data are received by the community of users, be it in education or in research.”

    You can read more about the latest open-data release from CMS on the CERN Open Data portal: opendata.cern.ch/docs/cms-completes-2010-2011-pp-data

    See the full article here.



  • richardmitnick 9:06 am on August 26, 2020 Permalink | Reply
    Tags: "Z bosons zoom through quark–gluon plasma as jets quench", , ATLAS physicists have measured jet-quenching phenomena in the quark–gluon plasma with help of Z bosons., , , HEP, , , , When beams of lead ions collide head-on in the LHC the matter comprising the nuclei melts away and forms a high-temperature quark–gluon plasma (QGP).   

    From CERN ATLAS: “Z bosons zoom through quark–gluon plasma as jets quench” 



    17th August 2020 [Just now in social media.]
    ATLAS Collaboration

    Figure 1: ATLAS event display showing the Z+jet process occurring in a lead–lead collision. In this event, the Z boson is identified by its decay to two muons (red lines). The jet can be seen as a small, collimated set of blue towers, surrounded by the transparent green cone in the lower image. (Image: ATLAS Collaboration/CERN.)

    With new data from the LHC, ATLAS physicists have measured jet-quenching phenomena in the quark–gluon plasma with help of Z bosons.

    When beams of lead ions collide head-on in the LHC, the matter comprising the nuclei melts away and forms a high-temperature quark–gluon plasma (QGP) – an extended region of interacting quarks and gluons. As high-momentum jets of particles attempt to traverse this region, their energy and radiative properties change through interactions with the QGP medium. This phenomenon is known as jet quenching. Its study can help physicists understand the properties of the QGP and give new insight into the theory of the strong nuclear force (quantum chromodynamics).

    The ATLAS Collaboration has performed extensive studies of how jets are quenched in the QGP. The emerging picture is that the quarks and gluons lose energy in the medium, fragmenting into fewer high-momentum particles. The remaining energy is redistributed by the QGP and appears as low-momentum, “thermal” particles over a broad area around the jet. For example, studies have shown that the total momentum in a jet is depleted relative to expectations from proton–proton collisions and that the distribution of particles inside the jet is modified.

    However, one challenge in interpreting these measurements is that they are made after the quenching process – so it is impossible to tell whether a given jet has fully traversed the medium or merely grazed its edge. This can be overcome by studying events where the jet is produced as a partner to a high-momentum photon, with the two moving in opposite directions in the detector. Since photons do not interact via the strong nuclear force, they pass through the QGP medium unaffected. Thus, photons serve as a “tag”, or control experiment, for the jet’s momentum before quenching. ATLAS physicists previously used this key feature to measure jet quenching and jet-structure modification in photon–jet events using lead–lead data recorded in 2015.

    In new results released today based on the large lead–lead collision dataset accumulated in 2018, ATLAS researchers applied this same strategy to measure jet quenching tagged with another particle: the Z boson. Similar to photon-tagged events, a jet can be produced alongside a Z boson which decays into particles (two electrons or two muons) that do not interact with the QGP medium. An example of such a collision event can be seen in Figure 1, where a Z boson decays into two muons (red lines).

    Figure 2: Ratio of the yield of charged particles opposite in angle to a Z boson between lead–lead collisions and proton–proton collisions. The charged particles are the result of jet fragmentation. As a result of the jet-quenching process, the ratio is below one at high transverse momentum (pTch) and above one at low transverse momentum. The data (red) are compared to theoretical predictions (purple, blue, green, yellow). (Image: ATLAS Collaboration/CERN.)

    Unlike photons, Z bosons can be measured in a low momentum range, where experiments have difficulty triggering on photons and distinguishing them from various background particles. At these low scales, the jet momentum matches the QGP temperature scale more closely and quenching effects are expected to be large. This region is thus particularly interesting to explore, despite being experimentally difficult to access.

    The new ATLAS result measures the production of charged particles opposite a Z boson. The measurement compares lead–lead collisions and proton–proton collisions, using the leptonic Z boson “tag” to select a similar population of jets arising predominantly from high-momentum quarks. Physicists looked at the charged particle yield for each tagged Z-boson event, and measured the ratio of this quantity between head-on (“central”) lead–lead and proton–proton events (as shown in Figure 2). At large transverse momentum (> 3 GeV), there are significantly fewer charged particles in lead–lead collisions, consistent with the picture of energy loss and softer fragmentation in the QGP. At small transverse momentum (< 3 GeV), there are significantly more charged particles, reflecting the thermalization of the lost energy by the medium.
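    The quantity plotted in Figure 2 can be mimicked with a toy calculation: count charged particles per Z-tagged event in bins of transverse momentum, then take the bin-by-bin ratio between the two collision systems. All counts below are hypothetical, chosen only to reproduce the qualitative pattern described above.

```python
# Illustrative sketch (not the ATLAS analysis code) of a per-tag yield
# ratio: charged particles counted per Z boson, in bins of pT, compared
# between Pb+Pb and pp collisions. Counts are invented for illustration.

def per_tag_yield(n_particles_in_bin, n_z_events):
    """Charged-particle yield per Z-tagged event in one pT bin."""
    return n_particles_in_bin / n_z_events

def pbpb_over_pp(pbpb_counts, n_z_pbpb, pp_counts, n_z_pp):
    """Bin-by-bin ratio of per-tag yields, Pb+Pb over pp."""
    return [per_tag_yield(a, n_z_pbpb) / per_tag_yield(b, n_z_pp)
            for a, b in zip(pbpb_counts, pp_counts)]

# Hypothetical counts in three pT bins (low, mid, high): the ratio comes
# out above one at low pT (thermalized energy) and below one at high pT
# (quenched fragmentation), as in Figure 2.
print(pbpb_over_pp([300, 80, 10], 100, [200, 100, 20], 100))
```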

    Researchers then compared the results to a variety of state-of-the-art theoretical calculations, which describe the jet-quenching process according to different models. These calculations all indicate a suppression of high-momentum particles, with a corresponding enhancement of low-momentum particles, but each prediction differs quantitatively from the rest. These comparisons highlight the value of new experimental data to constrain theory in this particular area. The upcoming Run 3 of the LHC should bring many more Z boson events in lead–lead collisions – opening further avenues for these discriminating measurements.


    Medium-induced modification of Z-tagged charged particle yields in Pb+Pb collisions at 5.02 TeV with the ATLAS detector (submitted to Phys. Rev. Lett., see figures)
    Measurement of the nuclear modification factor for inclusive jets in lead–lead collisions at 5.02 TeV with the ATLAS detector (Phys. Lett. B 790 (2019) 108, see figures)
    Measurement of jet fragmentation in lead–lead and proton–proton collisions at 5.02 TeV with the ATLAS detector (Phys. Rev. C 98 (2018) 024908, see figures)
    Comparison of Fragmentation Functions for Jets Dominated by Light Quarks and Gluons from proton–proton and lead–lead Collisions in ATLAS (Phys. Rev. Lett. 123 (2019) 042001, see figures)
    Measurement of photon-jet transverse momentum correlations in 5.02 TeV lead–lead and proton–proton collisions with ATLAS (Phys. Lett. B 789 (2019) 167, see figures)
    Photon-tagged jet quenching in the quark-gluon plasma, Physics Briefing, October 2017
    See also the full lists of ATLAS Conference Notes and ATLAS Physics Papers.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    CERN Courier

    Quantum Diaries

    CERN map

    CERN LHC Grand Tunnel
    CERN LHC particles

  • richardmitnick 3:50 pm on August 24, 2020 Permalink | Reply
    Tags: "LHC creates matter from light", , , E = mc², Forces that seem separate in our everyday lives—electromagnetism and the weak force—are united., From massless to massive, HEP, Last year the ATLAS experiment at the LHC observed two photons- particles of light- ricocheting off one another and producing two new photons., , , , Scientists on an experiment at the Large Hadron Collider see massive W particles emerging from collisions with electromagnetic fields., , The LHC is the only place where scientists have seen two energetic photons merging and transforming into massive W bosons., The reason photons can collide and produce W bosons in the LHC is that at the highest energies those forces combine to make the electroweak force.   

    From Symmetry: “LHC creates matter from light” 

    Symmetry Mag
    From Symmetry

    Sarah Charley

    Scientists on an experiment at the Large Hadron Collider see massive W particles emerging from collisions with electromagnetic fields. How can this happen?

    Illustration by Sandbox Studio, Chicago

    The Large Hadron Collider plays with Albert Einstein’s famous equation, E = mc², to transform matter into energy and then back into different forms of matter. But on rare occasions, it can skip the first step and collide pure energy—in the form of electromagnetic waves.

    CERN LHC Map

    Last year, the ATLAS experiment at the LHC observed two photons, particles of light, ricocheting off one another and producing two new photons.


    This year, they’ve taken that research a step further and discovered photons merging and transforming into something even more interesting: W bosons, particles that carry the weak force, which governs nuclear decay.

    This research doesn’t just illustrate the central concept governing processes inside the LHC: that energy and matter are two sides of the same coin. It also confirms that at high enough energies, forces that seem separate in our everyday lives—electromagnetism and the weak force—are united.

    From massless to massive

    If you try to replicate this photon-colliding experiment at home by crossing the beams of two laser pointers, you won’t be able to create new, massive particles. Instead, you’ll see the two beams combine to form an even brighter beam of light.

    “If you go back and look at Maxwell’s equations for classical electromagnetism, you’ll see that two colliding waves sum up to a bigger wave,” says Simone Pagan Griso, a researcher at the US Department of Energy’s Lawrence Berkeley National Laboratory. “We only see these two phenomena recently observed by ATLAS when we put together Maxwell’s equations with special relativity and quantum mechanics in the so-called theory of quantum electrodynamics.”

    Inside CERN’s accelerator complex, protons are accelerated close to the speed of light. Their normally rounded forms squish along the direction of motion as special relativity supersedes the classical laws of motion for processes taking place at the LHC. The two incoming protons see each other as compressed pancakes accompanied by an equally squeezed electromagnetic field (protons are charged, and all charged particles have an electromagnetic field). The energy of the LHC combined with the length contraction boosts the strength of the protons’ electromagnetic fields by a factor of 7500.

    When two protons graze each other, their squished electromagnetic fields intersect. These fields skip the classical “amplify” etiquette that applies at low energies and instead follow the rules outlined by quantum electrodynamics. Through these new laws, the two fields can merge and become the “E” in E=mc².

    “If you read the equation E=mc² from right to left, you’ll see that a small amount of mass produces a huge amount of energy because of the c² constant, which is the speed of light squared,” says Alessandro Tricoli, a researcher at Brookhaven National Laboratory—the US headquarters for the ATLAS experiment, which receives funding from DOE’s Office of Science. “But if you look at the formula the other way around, you’ll see that you need to start with a huge amount of energy to produce even a tiny amount of mass.”
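    Tricoli's point about reading the equation "the other way around" can be made concrete. In natural units (c = 1), masses in GeV/c² convert directly to energies in GeV, so producing a pair of W bosons requires the two photons to pool at least twice the W mass in energy.

```python
# Minimum centre-of-mass energy for two photons to produce a W boson
# pair, in natural units (c = 1). The W mass is the standard value;
# the helper function is just for illustration.

W_MASS_GEV = 80.4  # W boson mass, GeV/c^2

def pair_production_threshold(mass_gev):
    """Minimum collision energy to create a particle-antiparticle pair."""
    return 2.0 * mass_gev

print(pair_production_threshold(W_MASS_GEV))  # 160.8 GeV
```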

    The LHC is one of the few places on Earth that can produce and collide energetic photons, and it’s the only place where scientists have seen two energetic photons merging and transforming into massive W bosons.

    A unification of forces

    The generation of W bosons from high-energy photons exemplifies the discovery that won Sheldon Glashow, Abdus Salam and Steven Weinberg the 1979 Nobel Prize in Physics: At high energies, electromagnetism and the weak force are one and the same.

    Electricity and magnetism often feel like separate forces. One normally does not worry about getting shocked while handling a refrigerator magnet. And light bulbs, even while lit up with electricity, don’t stick to the refrigerator door. So why do electrical stations sport signs warning about their high magnetic fields?

    “A magnet is one manifestation of electromagnetism, and electricity is another,” Tricoli says. “But it’s all electromagnetic waves, and we see this unification in our everyday technologies, such as cell phones that communicate through electromagnetic waves.”

    At extremely high energies, electromagnetism combines with yet another fundamental force: the weak force. The weak force governs nuclear reactions, including the fusion of hydrogen into helium that powers the sun and the decay of radioactive atoms.

    Just as photons carry the electromagnetic force, the W and Z bosons carry the weak force. The reason photons can collide and produce W bosons in the LHC is that at the highest energies, those forces combine to make the electroweak force.

    “Both photons and W bosons are force carriers, and they both carry the electroweak force,” Pagan Griso says. “This phenomenon is really happening because nature is quantum mechanical.”

    See the full article here.



    Symmetry is a joint Fermilab/SLAC publication.

  • richardmitnick 12:59 pm on August 21, 2020 Permalink | Reply
    Tags: "Researchers discover first 'open-charm' tetraquark", , , , HEP, , , ,   

    From CERN LHCb via phys.org: “Researchers discover first ‘open-charm’ tetraquark” 

    Cern New Bloc

    Cern New Particle Event

    From CERN LHCb



    August 21, 2020

    The band associated with the new tetraquark transforming into a D− and a K+ at a mass of 2.9 GeV/c². Credit: LHCb Collaboration/CERN

    The LHCb experiment at CERN has developed a penchant for finding exotic combinations of quarks, the elementary particles that come together to give us composite particles such as the more familiar proton and neutron. In particular, LHCb has observed several tetraquarks, which, as the name suggests, are made of four quarks (or rather two quarks and two antiquarks). Observing these unusual particles helps scientists advance our knowledge of the strong force, one of the four known fundamental forces in the universe. At a CERN seminar held virtually on 12 August, LHCb announced the first signs of an entirely new kind of tetraquark with a mass of 2.9 GeV/c²: the first such particle with only one charm quark.

    Quarks were first predicted to exist in 1964, and scientists have since observed six kinds (and their antiquark counterparts) in the laboratory: up, down, charm, strange, top and bottom. Since quarks cannot exist freely, they group to form composite particles: three quarks or three antiquarks form “baryons” like the proton, while a quark and an antiquark form “mesons.”

    The LHCb detector at the Large Hadron Collider (LHC) is devoted to the study of B mesons, which contain either a bottom quark or an antibottom quark. Shortly after being produced in proton–proton collisions at the LHC, these heavy mesons transform—or “decay”—into a variety of lighter particles, which may undergo further transformations themselves. LHCb scientists observed signs of the new tetraquark in one such decay, in which the positively charged B meson transforms into a positive D meson, a negative D meson and a positive kaon: B+→D+D−K+. In total, they studied around 1300 candidates for this particular transformation in all the data the LHCb detector has recorded so far.

    The well-established quark model predicts that some of the D+D− pairs in this transformation could be the result of intermediate particles—such as the ψ(3770) meson—that only manifest momentarily: B+→ψ(3770)K+→D+D−K+. However, theory does not predict any meson-like intermediary that would produce a D−K+ pair. LHCb scientists were therefore surprised to see a clear band in their data corresponding to an intermediate state transforming into a D−K+ pair at a mass of around 2.9 GeV/c², roughly three times the mass of a proton.
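    The "band" appears when physicists plot the invariant mass of each D−K+ pair, m² = (E₁ + E₂)² − |p₁ + p₂|² in natural units: an intermediate particle shows up as a pile-up of events at its mass. A minimal sketch of the calculation, with invented four-momenta chosen to land near 2.9 GeV/c² (these are not LHCb data):

```python
import math

# Invariant mass of a two-particle system from its four-momenta,
# m^2 = (E1 + E2)^2 - |p1 + p2|^2, in natural units (c = 1).

def invariant_mass(p1, p2):
    """p = (E, px, py, pz) in GeV; returns the pair mass in GeV/c^2."""
    e = p1[0] + p2[0]
    px, py, pz = (p1[i] + p2[i] for i in (1, 2, 3))
    return math.sqrt(e * e - (px * px + py * py + pz * pz))

# Hypothetical back-to-back D- and K+ from a 2.9 GeV/c^2 resonance at rest:
d_minus = (2.011, 0.0, 0.0, +0.740)
k_plus  = (0.889, 0.0, 0.0, -0.740)
print(round(invariant_mass(d_minus, k_plus), 2))  # 2.9
```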

    The data have been interpreted as the first sign of a new exotic state of four quarks: an anticharm, an up, a down and an antistrange (c̄uds̄). All previous tetraquark-like states observed at LHCb contained a charm–anticharm pair, giving them net-zero “charm flavor.” The newly observed state is the first tetraquark seen with a single charm quark, and it has been dubbed an “open-charm” tetraquark.

    “When we first saw the excess in our data, we thought there was a mistake,” says Dan Johnson, who led the LHCb analysis. “After years of analyzing the data, we accepted that there really is something surprising!”

    Why is this important? It so happens that the jury is still out on what a tetraquark really is. Some theoretical models favor the notion that tetraquarks are pairs of distinct mesons bound together temporarily as a “molecule,” while other models prefer to think of them as a single cohesive unit of four particles. Identifying new kinds of tetraquarks and measuring their properties—such as their quantum spin (their intrinsic angular momentum) and their parity (how they behave under a mirror-like transformation)—will help paint a clearer picture of these exotic inhabitants of the subatomic domain. Johnson adds: “This discovery will also allow us to stress-test our theories in an entirely new domain.”

    While LHCb’s observation is an important first step, more data will be needed to verify the nature of the structure observed in the B+ decay. The LHCb collaboration also awaits independent verification of its discovery from other dedicated B-physics experiments such as Belle II. Meanwhile, the LHC continues to provide new and exciting results for experimentalists and theorists alike to dig into.

    Science paper: 20-08Aug-11_DanJohnson.pdf

    See the full article here.



    CERN LHCb New II


  • richardmitnick 10:09 am on August 18, 2020 Permalink | Reply
    Tags: "Long-lived particles get their moment", , ATLAS and CMS experiments at CERN LHC, , , HEP, , , ,   

    From Symmetry: “Long-lived particles get their moment” 

    Symmetry Mag
    From Symmetry

    Sarah Charley

    Scientists on experiments at the LHC are redesigning their methods and building supplemental detectors to look for new particles that might be evading them.

    Illustration by Sandbox Studio, Chicago with Ariel Davis.

    Duke University postdoc Katherine Pachal has spent the last ten years—from undergraduate on—searching for new particles with the ATLAS experiment at the Large Hadron Collider. “I’ve always been a search person,” she says.

    CERN ATLAS Image Claudia Marcelloni

    Physicists discovered the Higgs boson in 2012, but since then the list of known fundamental particles has remained static.

    CERN CMS Higgs Event May 27, 2012

    CERN ATLAS Higgs Event June 12, 2012

    This hasn’t dampened Pachal’s enthusiasm for the search for new particles. Rather, she sees it as an indication that physicists need to look for them in an innovative new way.

    When scientists designed detectors for the LHC, they wagered that new forces, fields and physics would come in the form of extremely short-lived particles that decay almost precisely at their points of origin. Scientists catch the particles that behave this way—such as the aforementioned Higgs bosons—in detectors surrounding the collision points.

    “The primary goal of ATLAS [above] and CMS was to find the Higgs, and we built darn good experiments to do that,” Pachal says.

    CERN/CMS Detector

    With many unanswered questions still looming in the field, LHC physicists are revisiting their original assumptions and reinventing their tools and techniques to reach for long-lived particles—ones that could travel long distances before becoming detectable.

    Illustration by Sandbox Studio, Chicago with Ariel Davis.

    Long-lived particles

    “We already have long-lived particles in the Standard Model,” says Jingyu Luo, a graduate student at Princeton University.

    Standard Model of Particle Physics, Quantum Diaries

    Muons, for instance, can travel several kilometers before decaying (which is the main reason the particle detectors at CERN are so enormous). Protons and electrons may not decay at all.
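    The muon's reach follows directly from its lifetime and relativistic time dilation: the mean decay length is L = γβcτ. A quick check with standard constants (the 10 GeV energy below is an arbitrary example, not a value from the article) shows how "several kilometers" comes about.

```python
import math

# Mean decay length of a relativistic muon, L = gamma * beta * c * tau.
# c*tau and the muon mass are standard values; the energy is arbitrary.

C_TAU_MUON_M = 658.6      # c * tau for the muon, metres
MUON_MASS_GEV = 0.10566   # muon mass, GeV/c^2

def decay_length_m(energy_gev):
    gamma = energy_gev / MUON_MASS_GEV
    beta = math.sqrt(1.0 - 1.0 / gamma**2)
    return gamma * beta * C_TAU_MUON_M

# A 10 GeV muon travels tens of kilometres on average before decaying:
print(round(decay_length_m(10.0) / 1000.0, 1))  # ~62 km
```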

    According to theorist Jonathan Feng at the University of California, Irvine, physicists were originally hesitant to search for additional long-lived particles because there seemed to be no real need for them in the theory.

    “If you want to come up with a theory with long-lived particles, it’s extremely easy,” he says. “You could add an arbitrarily long-lived particle to any theory and put it in by hand, but there was no rhyme or reason to it.”

    Feng’s feelings changed in 2003 when he was building upon a popular set of theories called supersymmetry, and a long-lived particle popped out of his equations. “There was no way around it, we needed long-lived particles,” he says. “This was different than putting it in by hand. It was coming out of a very well-structured theory.”

    But these theoretical particles seemed out of the grasp of the experiments running at the LHC.

    The detectors for the ATLAS and CMS experiments—funded by CERN member states and other contributing countries including the United States, via the US Department of Energy’s Office of Science and the National Science Foundation—generate about 50 terabytes of data a second. Most of this data comes from already well-understood subatomic processes, and a series of increasingly selective trigger systems evaluate the onslaught of hits and only pass along events that they pre-approve as “high quality and potentially interesting.” But a new type of long-lived particle wouldn’t necessarily have any of these pre-defined ‘interesting’ characteristics.

    “Our trigger systems are lacking a lot of the information that is core to many of our long-lived particle searches,” Pachal says.

    These systems make snap judgments based on factors such as the amount of energy a collision leaves in the detector (a good indicator of the presence of a rare, massive particle). Scientists have already developed software that helps their trigger systems scan parts of the detector for signs of long-lived particles. But for a truly comprehensive search, they need to consider detailed particle tracks.
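    A cartoon of such a snap judgment can be written in a few lines. This is not the real ATLAS or CMS trigger logic – the field names and threshold are invented – but it shows why an event failing every predefined "interesting" condition is simply discarded.

```python
# Toy single-event trigger decision (illustrative only): keep an event
# if it passes at least one predefined condition. A soft, displaced
# decay survives only if a dedicated long-lived-particle condition
# exists; otherwise it is thrown away forever.

def pass_trigger(event, energy_threshold_gev=100.0):
    """event: dict with total deposited energy and a displaced-vertex flag."""
    if event.get("energy_gev", 0.0) >= energy_threshold_gev:
        return True
    if event.get("has_displaced_vertex", False):
        return True
    return False

print(pass_trigger({"energy_gev": 40.0, "has_displaced_vertex": True}))  # True
print(pass_trigger({"energy_gev": 40.0}))                                # False
```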

    “In the past, we were restricted by how time-consuming it is to reconstruct all the tracks,” Pachal says. In the next run of the LHC, “we’re improving our software so that we can use more of the detector to look for particle tracks in the trigger, and this will help us make these more subtle decisions.”

    Illustration by Sandbox Studio, Chicago with Ariel Davis.


    Even if long-lived particles are out there waiting to be found, there is still the question of whether scientists can find enough of them to claim a discovery.

    Traditional techniques to pick out possible sightings of new particles involve a series of strict cuts, removing giant chunks of the dataset at a time. “For instance, if I had a room full of people and wanted to find fans of the Italian composer Ennio Morricone, I could make a series of judgements such as, ‘people between 50 and 70 are good candidates to like this kind of music’ and focus my attention on them,” Luo says. “But in reality, it’s so much more complicated than that.”

    To separate long-lived particle candidates from an ocean of look-alikes, Luo is incorporating machine learning.

    Traditional techniques rely on a series of pre-programmed “yes” or “no” check boxes to determine which events to keep. Machine-learning algorithms, by contrast, examine thousands of collision events to build a deep understanding of how different variables interplay with one another to create the kind of particle signatures physicists are looking for.

    By the time physicists look at the data deemed “interesting,” their machine-learning framework is already a collision connoisseur. Like an expert judge scoring rhythmic gymnastics at the Olympics, it has built up enough specialized knowledge to rate each contender.
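    The contrast between strict cuts and a learned score can be sketched minimally. Everything below is invented for illustration – the box boundaries and weights are made up, and a real analysis would train a boosted decision tree or neural network on simulated events rather than hand-pick numbers.

```python
import math

def cut_based(x, y):
    """Keep candidates inside a fixed box: crude but transparent."""
    return 1.0 < x < 3.0 and 1.0 < y < 3.0

def learned_score(x, y, w=(1.2, 0.8), b=-3.0):
    """Toy linear 'classifier' score squashed to [0, 1]. The weights here
    are made up; a real algorithm fits them to thousands of events."""
    return 1.0 / (1.0 + math.exp(-(w[0] * x + w[1] * y + b)))

# The rigid box rejects a borderline candidate that a score-based
# selection, weighing both variables together, would keep:
print(cut_based(3.1, 2.9))            # False
print(learned_score(3.1, 2.9) > 0.5)  # True
```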

    The avoidance of strict cuts gives physicists increased flexibility to conduct these kinds of blue-sky searches.

    “There’s what we know and what we don’t know,” Luo says. “What we know is that there is a group of models that predict the existence of long-lived particles. But what we don’t know is which model is right.”

    Luo and his colleagues are working on model-independent searches at the LHC. Their goal is to stay sensitive to many types of potential long-lived particles, with a wide range of characteristics. “Leave no stone unturned,” he says.

    Particle escape artists

    While CMS and ATLAS search for long-lived particles inside their detectors, other teams of scientists are considering how to capture long-lived particles that could travel beyond them.

    “Upgrading existing experiments is one method,” Feng says. “The other method involves building supplemental detectors.”

    In fall of 2017, Feng and colleagues proposed building one such detector, which they named FASER.

    CERN FASER experiment schematic

    To catch long-lived particles that might escape the ATLAS experiment, FASER will sit in an unused tunnel that just happens to be right along the path they expect particles to follow, 480 meters from the ATLAS detector.

    Construction for FASER started in 2019. It is scheduled to start operation when collisions resume at the LHC, foreseen for late 2021 or early 2022.
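    For an exponentially decaying particle, the chance of decaying inside a detector of length L that starts a distance d downstream is exp(−d/λ) − exp(−(d+L)/λ), where λ is the mean decay length. The numbers below are illustrative only, not FASER design values; the forward location's real advantage is the clean, low-background environment on the beam axis.

```python
import math

# Probability that a long-lived particle decays inside a detector of
# length L starting a distance d downstream, given mean decay length
# lambda: exp(-d/lambda) - exp(-(d+L)/lambda). Illustrative numbers only.

def decay_probability(d_m, detector_length_m, mean_decay_length_m):
    lam = mean_decay_length_m
    return math.exp(-d_m / lam) - math.exp(-(d_m + detector_length_m) / lam)

# For a hypothetical ~500 m mean decay length, a 10 m region at the
# collision point vs. a 10 m detector 480 m downstream:
print(round(decay_probability(0.0, 10.0, 500.0), 3))
print(round(decay_probability(480.0, 10.0, 500.0), 3))
```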

    Teams of scientists are designing other, larger detectors—with names such as CODEX-b and MATHUSLA—to be built near other LHC collision points.

    With the help of these improved tools and techniques, the LHC physics community will be poised to jump on new physics. “There’s a moment for everything, and the moment for long-lived particles is starting,” Pachal says.

    See the full article here.


