Tagged: Particle Accelerators

  • richardmitnick 1:57 pm on July 22, 2019
    Tags: CERN NA64, Particle Accelerators

    From CERN: “NA64 casts light on dark photons” 

    Cern New Bloc

    Cern New Particle Event


    From CERN

    22 July, 2019
    Ana Lopes

    The NA64 collaboration has placed new limits on the interaction between a photon and its hypothetical dark-matter counterpart.

    The NA64 experiment (Image: CERN)

    Without dark matter, most galaxies in the universe would not hold together. Scientists are pretty sure about this. However, they have not been able to observe dark matter and the particles that comprise it directly. They have only been able to infer its presence through the gravitational pull it exerts on visible matter.

    Fritz Zwicky discovered Dark Matter when observing the movement of the Coma Cluster. Vera Rubin, a woman in STEM denied the Nobel Prize, did much of the later work on Dark Matter.

    Fritz Zwicky, from http://palomarskies.blogspot.com

    Coma cluster via NASA/ESA Hubble

    Astronomer Vera Rubin at the Lowell Observatory in 1965, worked on Dark Matter (The Carnegie Institution for Science)


    Vera Rubin measuring spectra, worked on Dark Matter (Emilio Segre Visual Archives AIP SPL)


    Vera Rubin, with Department of Terrestrial Magnetism (DTM) image tube spectrograph attached to the Kitt Peak 84-inch telescope, 1970. https://home.dtm.ciw.edu

    One hypothesis is that dark matter consists of particles that interact with each other and with visible matter through a new force carried by a particle called the dark photon. In a recent study, the collaboration behind the NA64 experiment at CERN describes how it has tried to hunt down such dark photons.

    NA64 is a fixed-target experiment. A beam of particles is fired onto a fixed target to look for particles and phenomena produced by collisions between the beam particles and atomic nuclei in the target. Specifically, the experiment uses an electron beam of 100 GeV energy from the Super Proton Synchrotron accelerator.

    The Super Proton Synchrotron (SPS), CERN’s second-largest accelerator. (Image: Julien Ordan/CERN)

    In the new study, the NA64 team looked for dark photons using the missing-energy technique: although dark photons would escape through the NA64 detector unnoticed, they would carry away energy that can be identified by analysing the energy budget of the collisions.
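
    To make the idea concrete, here is a minimal toy sketch (not NA64 code) of a missing-energy selection: the detected energy is compared with the known 100 GeV beam energy, and events with a large deficit are flagged as candidates. The smearing, threshold and event counts below are invented purely for illustration.

```python
import numpy as np

E_BEAM = 100.0        # energy of the incoming SPS electrons, in GeV
MISSING_E_CUT = 50.0  # illustrative threshold, not the experiment's actual selection

rng = np.random.default_rng(42)

# Toy "detected" energies: ordinary events deposit close to the full beam energy in the
# calorimeters, while a dark-photon event would deposit much less because the A' escapes.
detected_energy = rng.normal(loc=E_BEAM, scale=2.0, size=100_000).clip(0.0, E_BEAM)

missing_energy = E_BEAM - detected_energy
candidates = missing_energy > MISSING_E_CUT

# In this pure-background toy essentially nothing passes; a real signal would
# show up as an excess of events in the large missing-energy tail.
print(f"events passing the missing-energy cut: {candidates.sum()} / {candidates.size}")
```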

    The team analysed data collected in 2016, 2017 and 2018, which together corresponded to a whopping hundred billion electrons hitting the target. They found no evidence of dark photons in the data but their analysis resulted in the most stringent bounds yet on the strength of the interaction between a photon and a dark photon for dark-photon masses between 1 MeV and 0.2 GeV.

    These bounds imply that a 1-MeV dark photon would interact with an electron with a force that is at least one hundred thousand times weaker than the electromagnetic force carried by a photon, whereas a 0.2-GeV dark photon would interact with an electron with a force that is at least one thousand times weaker. The collaboration anticipates obtaining even stronger limits with the upgraded detector, which is expected to be completed in 2021.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Meet CERN in a variety of places:

    Quantum Diaries
    QuantumDiaries

    Cern Courier

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS

    CERN ATLAS Image Claudia Marcelloni CERN/ATLAS


    ALICE

    CERN/ALICE Detector


    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    LHC

    CERN map

    CERN LHC Tunnel

    CERN LHC particles

     
  • richardmitnick 12:49 pm on July 21, 2019
    Tags: "A golden era of exploration: ATLAS highlights from EPS-HEP 2019", Particle Accelerators

    From CERN ATLAS: “A golden era of exploration: ATLAS highlights from EPS-HEP 2019” 

    CERN/ATLAS detector

    CERN ATLAS Higgs Event

    CERN ATLAS another view Image Claudia Marcelloni ATLAS CERN


    CERN ATLAS New II Credit CERN SCIENCE PHOTO LIBRARY


    From CERN ATLAS

    20th July 2019
    Katarina Anthony

    Event display of a Higgs boson candidate decaying in the four-lepton channel. (Image: ATLAS Collaboration/CERN)

    Eight years of operation. Over 10,000 trillion high-energy proton collisions. One critical new particle discovery. Countless new insights into our universe. The Large Hadron Collider (LHC) has been breaking records since data-taking began in 2010 – and yet, for ATLAS and its fellow LHC experiments, a golden era of exploration is only just beginning.

    Figure 1: New ATLAS measurement of the Higgs boson decaying in the four-lepton channel, using the full LHC Run-2 dataset. The distribution of the invariant mass of the four leptons (m4l) is shown. The Higgs boson corresponds to the excess of events (blue) over the non-resonant ZZ* background (red) at 125 GeV. (Image: ATLAS Collaboration/CERN)

    This week, the ATLAS Collaboration presented 25 new results at the European Physical Society’s High-Energy Physics conference (EPS-HEP) in Ghent, Belgium. The new analyses examine the largest-ever proton–proton collision dataset from the LHC, recorded during Run 2 of the accelerator (2015–2018) at the 13 TeV energy frontier.

    The new data have been fertile ground for ATLAS. New precision measurements of the Higgs boson, observations of key electroweak processes and high-precision tests of the Standard Model are among the highlights described below; find the full list of ATLAS public results using the full Run-2 dataset here.

    Studying the Higgs discovery channels

    Just over seven years ago, the Higgs boson was an elusive particle, out of reach from physicists for nearly five decades. Today, not only is the Higgs boson frequently observed, it is studied with such precision as to become a powerful tool for exploration.

    Key to these accomplishments are the so-called “Higgs discovery channels”: H→γγ, where the Higgs boson decays into two photons, and H→ZZ*→4l, where it decays via two Z bosons into four leptons. Though rare, these decays are easily identified in the ATLAS detector, making them essential to both the particle’s discovery and study.
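
    As an illustration of why the four-lepton channel is so clean, the invariant mass plotted in Figure 1 can be built directly from the measured lepton momenta. The sketch below uses toy numbers, not ATLAS data or software, to reconstruct a four-lepton mass from each lepton's transverse momentum, pseudorapidity and azimuth.

```python
import numpy as np

def four_vector(pt, eta, phi, mass=0.0):
    """Build (E, px, py, pz) from transverse momentum, pseudorapidity and azimuth (GeV)."""
    px, py = pt * np.cos(phi), pt * np.sin(phi)
    pz = pt * np.sinh(eta)
    e = np.sqrt(px**2 + py**2 + pz**2 + mass**2)
    return np.array([e, px, py, pz])

def invariant_mass(vectors):
    e, px, py, pz = np.sum(vectors, axis=0)
    return np.sqrt(max(e**2 - px**2 - py**2 - pz**2, 0.0))

# Four hypothetical leptons from a toy H -> ZZ* -> 4l candidate (values invented for illustration).
leptons = [
    four_vector(45.0,  0.3,  0.1),
    four_vector(30.0, -0.5,  2.8),
    four_vector(25.0,  1.1, -1.9),
    four_vector(15.0,  0.8,  1.0),
]
print(f"m4l = {invariant_mass(leptons):.1f} GeV")
```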

    ATLAS presented new explorations of the Higgs boson in these channels (Figures 1 and 2), yielding greater insight into its behaviour. The new results benefit from the large full Run-2 dataset, as well as a number of new improvements to the analysis techniques. For example, ATLAS physicists now utilise Deep-Learning Neural Networks to assign the Higgs-boson events to specific production modes.

    All four Higgs-boson production modes can now be clearly identified in a single decay channel. ATLAS’ studies of the Higgs boson have advanced so quickly, in fact, that rare processes – such as its production in association with a top-quark pair, observed only just last year – can now be seen in just a single decay channel. The new sensitivity allowed physicists to measure kinematic properties of the Higgs boson with unprecedented precision (Figure 3). These are sensitive to new physics processes, making their exploration of particular interest to the collaboration.


    Figure 2: Distribution of the invariant mass of the two photons in the ATLAS measurement of H→γγ using the full Run-2 dataset. The Higgs boson corresponds to the excess of events observed at 125 GeV with respect to the non-resonant background (dashed line). (Image: ATLAS Collaboration/CERN)

    Figure 3: Differential cross section for the transverse momentum (pT,H) of the Higgs boson from the two individual channels (H→ZZ*→4ℓ, H→γγ) and their combination. (Image: ATLAS Collaboration/CERN)

    Searching unseen properties of the Higgs boson

    Having accomplished the observation of Higgs boson interactions with third-generation quarks and leptons, ATLAS physicists are turning their focus to the lighter, second-generation of fermions: muons, charm quarks and strange quarks. While their interactions with the Higgs boson are described by the Standard Model, they have – so far – remained relegated to theory. Results from the ATLAS Collaboration are backing up these theories with real data.

    At EPS-HEP, ATLAS presented a new search for the Higgs boson decaying into muon pairs. This already-rare process is made all the more difficult to detect by background Standard Model processes, which produce muon pairs in abundance.

    Figure 4: ATLAS search for the Higgs boson decaying to two muons. The plot shows the weighted muon pair invariant mass spectrum (muu) summed over all categories. (Image: ATLAS Collaboration/CERN)

    The new result utilised novel machine learning techniques to provide ATLAS’ most sensitive result yet, with a moderate excess of 1.5 standard deviations expected for the predicted signal. In agreement with this prediction, only a small excess of 0.8 standard deviations is present around the Higgs-boson mass in the data (Figure 4).
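
    For readers curious how an “expected excess of 1.5 standard deviations” can arise from event counts, a common approximation for the median discovery significance of s signal events on top of b background events is Z = sqrt(2[(s + b) ln(1 + s/b) − s]). The toy counts below are chosen only to illustrate the formula (they happen to give about 1.5 sigma); the real analysis fits the dimuon mass spectrum in many categories simultaneously.

```python
import math

def asimov_significance(s, b):
    """Median expected discovery significance for s signal events on top of b background."""
    return math.sqrt(2.0 * ((s + b) * math.log(1.0 + s / b) - s))

# Toy counts chosen only to illustrate the formula (they give roughly 1.5 sigma);
# they are not the numbers used by ATLAS.
signal, background = 150.0, 10_000.0
print(f"expected significance ~ {asimov_significance(signal, background):.2f} sigma")
```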

    “This result shows that we are now close to the sensitivity required to test the Standard Model’s predictions for this very rare decay of the Higgs boson,” said ATLAS spokesperson Karl Jakobs from the University of Freiburg, Germany. “However, a definitive statement on the second generation will require the larger datasets that will be provided by the LHC in Run 3 and by the High-Luminosity LHC.”

    ATLAS’ growing sensitivity was also clearly on display in the collaboration’s new “di-Higgs” search, where two Higgs bosons are formed via the fusion of two vector bosons. Though one of the rarest Standard Model processes explored by ATLAS, its study gives unique insight into the previously-untested relationship between vector boson and Higgs-boson pairs. A small variation of this coupling relative to the Standard Model value would result in a dramatic rise in the measured cross section. The new search, despite being negative, successfully sets the first constraints on this relationship.

    Entering the Higgs sector

    The Higgs mechanism, giving mass to all elementary particles, is directly connected with profound questions about our universe, including the stability and energy of the vacuum, the “naturalness” of a world described by the Standard Model, and more. As such, the exploration of the Higgs sector is not limited to direct measurements of the Higgs boson – it instead requires a broad experimental programme that will extend over decades.

    A perfect example of this came in ATLAS’ new observation of the electroweak production of two jets in association with a pair of Z bosons. The Z and W bosons are the force carriers of weak interactions and, as they both have a spin of 1, are known as “vector bosons”. The Higgs boson is a vital mediator in “vector-boson scattering”, an electroweak process that contributes to the pair production of vector bosons (WW, WZ and ZZ) with jets. Measurements of these production processes are key for the study of electroweak symmetry breaking via the Higgs mechanism.

    The new ATLAS result – with a statistical significance of 5.5 standard deviations (Figure 5) – completes the experiment’s observation of vector-boson scattering in these critical processes, and sparks new ways to test the Standard Model.

    Figure 5: Observed and predicted distributions (BDT) in the signal regions of Z-boson pairs decaying to four leptons. The electroweak production of the Z-boson pair is shown in red; the error bars on the data points (black) show the statistical uncertainty on data. (Image: ATLAS Collaboration/CERN)

    Figure 6: Summary of the mass limits on supersymmetry models set by the ATLAS searches for Supersymmetry. Results are quoted for the nominal cross section in both a region of near-maximal mass reach and a demonstrative alternative scenario, in order to display the range in model space of search sensitivity. (Image: ATLAS Collaboration/CERN)

    Probing new physics

    As the community enters the tenth year of supersymmetry searches at the LHC, the ATLAS Collaboration continues to take a broad approach to the hunt. ATLAS is committed to providing theory-independent and signature-based searches, in addition to the highly targeted, model-dependent ones.

    Along with new, updated limits on various supersymmetry searches using the full Run-2 dataset (Figure 6), ATLAS once again highlighted new searches (first presented at the LHCP2019 conference) for superpartners produced through the electroweak interaction. Generated at extremely low rates at the LHC and decaying into Standard Model particles that are themselves difficult to reconstruct, such supersymmetry searches can only be described by the iconic quote: “not because it is easy, but because it is hard”.

    Overall, the results place strong constraints on important supersymmetric scenarios, which will inform theory developments and future ATLAS searches. Further, they provide examples of how advanced reconstruction techniques can help improve the sensitivity of ATLAS’ searches for new physics.

    Asymmetric top-quark production

    The Standard Model continued to show its strength in ATLAS’ new precision measurement of charge asymmetry in top-quark pairs (Figure 7). This intriguing imbalance – where top and antitop quarks are not produced equally at all angles with respect to the proton beam direction – is among the most subtle, difficult and yet vital properties to measure in the study of top quarks.

    The effect of this asymmetry is predicted to be extremely small, however new physics processes interfering with the known production modes can lead to larger (or even smaller) values. ATLAS found evidence of this imbalance, with a significance of four standard deviations, with a value compatible with the Standard Model. The result marks an important milestone for the field, following decades of measurements which began at the Tevatron proton–antiproton collider, the predecessor of the LHC in the USA.

    FNAL/Tevatron


    FNAL/Tevatron map

    Figure 7: Measured values of the charge asymmetry (Ac) as a function of the invariant mass of the top quark pair system (mtt) in data. (Image: ATLAS Collaboration/CERN)

    Following the data

    As EPS-HEP 2019 drew to a close, it was clear that exploration of the high-energy frontier remains far from complete. With the LHC – and its upcoming HL-LHC upgrade – set to continue apace, the future of high-energy physics will be guided by the results of ATLAS and its fellow experiments at the energy frontier.

    “Our community is living through data-driven times,” said ATLAS Deputy Spokesperson Andreas Hoecker from CERN. “Experimental results must guide the high-energy physics community to the next stage of exploration. This requires a broad and diverse particle physics research programme. The ATLAS Collaboration is up to taking this challenge!”

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    CERN Courier

    Quantum Diaries
    QuantumDiaries

    CERN map


    CERN LHC Grand Tunnel
    CERN LHC particles

     
  • richardmitnick 12:35 pm on July 18, 2019
    Tags: "CMS releases open data for Machine Learning", Particle Accelerators

    From CERN CMS: “CMS releases open data for Machine Learning” 

    Cern New Bloc

    Cern New Particle Event

    CERN New Masthead

    From CERN CMS

    17 July, 2019

    CMS has now provided open access to 100% of its research data recorded in proton–proton collisions in 2010.

    (Image: Fermilab/CERN)

    The CMS collaboration at CERN has released its fourth batch of open data to the public. With this release, which brings the volume of its open data to more than 2 PB (or two million GB), CMS has now provided open access to 100% of its research data recorded in proton–proton collisions in 2010, in line with the collaboration’s data-release policy. The release also includes several new data and simulation samples. The new release builds upon and expands the scope of the successful use of CMS open data in research and in education.

    In this release, CMS open data address the ever-growing application of machine learning (ML) to challenges in high-energy physics. According to a recent paper, collaboration with the data-science and ML community is considered a high priority, to help advance the application of state-of-the-art algorithms in particle physics. CMS has therefore also made available samples that can help foster such collaboration.

    “Modern machine learning is having a transformative impact on collider physics, from event reconstruction and detector simulation to searches for new physics,” remarks Jesse Thaler, an Associate Professor at MIT, who is working on ML using CMS open data with two doctoral students, Patrick Komiske and Eric Metodiev. “The performance of machine-learning techniques, however, is directly tied to the quality of the underlying training data. With the extra information provided in the latest data release from CMS, outside users can now investigate novel strategies on fully realistic samples, which will likely lead to exciting advances in collider data analysis.”

    The ML datasets, derived from millions of CMS simulation events for previous and future runs of the Large Hadron Collider, focus on solving a number of representative challenges for particle identification, tracking and distinguishing between multiple collisions that occur in each crossing of proton bunches. All the datasets come with extensive documentation on what they contain, how to use them and how to reproduce them with modified content.

    In its policy on data preservation and open access, CMS commits to releasing 100% of its analysable data within ten years of collecting them. Around half of the proton–proton collision data collected at a centre-of-mass energy of 7 TeV in 2010 were released in the first CMS open-data release in 2014, and the remaining data are included in this new release. In addition, a small sample of unprocessed raw data from the LHC’s Run 1 (2010 to 2012) is also released. These samples will help test the chain for processing CMS data using the legacy software environment.

    Reconstructed data and simulations from the CASTOR calorimeter, which was used by CMS in 2010, are also available and represent the first release of data from the very-forward region of CMS. Finally, CMS has released instructions and examples on how to generate simulated events and how to analyse data in isolated “containers”, within which one has access to the CMS software environment required for specific datasets. It is also easier to search through the simulated data and to discover the provenance of datasets.

    As before, the data are released into the public domain under the Creative Commons CC0 waiver via the CERN Open Data portal. The portal is openly developed by the CERN Information Technology department, in cooperation with the experimental collaborations who release open data on it.
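
    For readers who want to try the data, records can be browsed directly on the portal, and the Invenio-based site also serves record metadata as JSON. The sketch below is a hypothetical example: the record id is made up, and the /api/records endpoint and JSON layout are assumptions that should be checked against the portal's own documentation.

```python
import requests

# Hypothetical record id; browse https://opendata.cern.ch to find real CMS dataset records.
RECORD_ID = 12341

# Assumes the Invenio-based portal serves record metadata as JSON at /api/records/<id>;
# check the portal's "Getting started" documentation for the supported interfaces.
url = f"https://opendata.cern.ch/api/records/{RECORD_ID}"
response = requests.get(url, timeout=30)
response.raise_for_status()
record = response.json()

# The exact JSON layout is an assumption; print whatever title and file list are present.
meta = record.get("metadata", {})
print(meta.get("title"))
for f in meta.get("files", []):
    print(f.get("key"), f.get("size"))
```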

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Meet CERN in a variety of places:

    Quantum Diaries
    QuantumDiaries

    Cern Courier

    CMS
    CERN CMS New

     
  • richardmitnick 7:59 am on July 17, 2019
    Tags: "Bottomonium particles don’t go with the flow", Particle Accelerators

    From CERN: “Bottomonium particles don’t go with the flow” 

    Cern New Bloc

    Cern New Particle Event


    From CERN

    16 July, 2019
    Ana Lopes

    The first measurement, by the ALICE [below] collaboration, of an elliptic-shaped flow for bottomonium particles could help shed light on the early universe.

    A few millionths of a second after the Big Bang, the universe was so dense and hot that the quarks and gluons that make up protons, neutrons and other hadrons existed freely in what is known as the quark–gluon plasma. The ALICE experiment at the Large Hadron Collider (LHC) can recreate this plasma in high-energy collisions of beams of heavy ions of lead. However, ALICE, as well as any other collision experiments that can recreate the plasma, cannot observe this state of matter directly. The presence and properties of the plasma can only be deduced from the signatures it leaves on the particles that are produced in the collisions.

    In a new article, presented at the ongoing European Physical Society conference on High-Energy Physics, the ALICE collaboration reports the first measurement of one such signature – the elliptic flow – for upsilon particles produced in lead–lead LHC collisions.

    The upsilon is a bottomonium particle, consisting of a bottom (often also called beauty) quark and its antiquark. Bottomonia and their charm-quark counterparts, charmonium particles, are excellent probes of the quark–gluon plasma. They are created in the initial stages of a heavy-ion collision and therefore experience the entire evolution of the plasma, from the moment it is produced to the moment it cools down and gives way to a state in which hadrons can form.

    One indication that the quark–gluon plasma forms is the collective motion, or flow, of the produced particles. This flow is generated by the expansion of the hot plasma after the collision, and its magnitude depends on several factors, including: the particle type and mass; how central, or “head on”, the collision is; and the momenta of the particles at right angles to the collision line. One type of flow, called elliptic flow, results from the initial elliptic shape of non-central collisions.
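
    Quantitatively, elliptic flow is usually expressed as the second Fourier coefficient v2 of the azimuthal distribution of the produced particles relative to the reaction plane, dN/dφ ∝ 1 + 2 v2 cos 2(φ − Ψ). The toy sketch below generates particles with a known v2 and recovers it as the average of cos 2(φ − Ψ); the actual ALICE measurement uses much more sophisticated estimators and detector corrections.

```python
import numpy as np

rng = np.random.default_rng(0)
TRUE_V2, PSI = 0.08, 0.0   # toy elliptic-flow coefficient and reaction-plane angle

# Sample toy azimuthal angles from dN/dphi proportional to 1 + 2*v2*cos(2*(phi - Psi)),
# using simple accept-reject sampling.
phi = rng.uniform(-np.pi, np.pi, 2_000_000)
weights = 1.0 + 2.0 * TRUE_V2 * np.cos(2.0 * (phi - PSI))
keep = rng.uniform(0.0, weights.max(), phi.size) < weights
phi = phi[keep]

# With a known reaction plane, v2 is just the average of cos(2*(phi - Psi)).
v2_estimate = np.mean(np.cos(2.0 * (phi - PSI)))
print(f"estimated v2 = {v2_estimate:.3f} (true value {TRUE_V2})")
```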

    In their new study, the ALICE team determined the elliptic flow of the upsilons by observing the pairs of muons (heavier cousins of the electron) into which they transform, or “decay”. They found that the magnitude of the upsilon elliptic flow for a range of momenta and collision centralities is small, making the upsilons the first hadrons that don’t seem to exhibit a significant elliptic flow.

    The results are consistent with the prediction that the upsilons are largely split up into their constituent quarks in the early stages of their interaction with the plasma, and they pave the way to higher-precision measurements using data from ALICE’s upgraded detector, which will be able to record ten times more upsilons. Such data should also cast light on the curious case of the J/psi flow. This lighter charmonium particle has a larger flow and is believed to re-form after being split up by the plasma.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Meet CERN in a variety of places:

    Quantum Diaries
    QuantumDiaries

    Cern Courier

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS

    CERN ATLAS Image Claudia Marcelloni CERN/ATLAS


    ALICE

    CERN/ALICE Detector


    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    LHC

    CERN map

    CERN LHC Grand Tunnel

    CERN LHC particles

     
  • richardmitnick 12:10 pm on July 15, 2019
    Tags: Particle Accelerators

    From CERN: “Exploring the Higgs boson ‘discovery channels’” 

    Cern New Bloc

    Cern New Particle Event


    From CERN

    12th July 2019
    ATLAS Collaboration

    Event display of a two-electron two-muon ZH candidate. The Higgs candidate can be seen on the left with the two leading electrons represented by green tracks and green EM calorimeter deposits (pT = 22 and 120 GeV), and two subleading muons indicated by two red tracks (pT = 34 and 43 GeV). Recoiling against the four lepton candidate in the left hemisphere is a dimuon pair in the right hemisphere indicated by two red tracks (pT = 139 and 42 GeV) and an invariant mass of 91.5 GeV, which agrees well with the mass of the Z boson. (Image: ATLAS Collaboration/CERN)

    At the 2019 European Physical Society’s High-Energy Physics conference (EPS-HEP) taking place in Ghent, Belgium, the ATLAS and CMS collaborations presented a suite of new results. These include several analyses using the full dataset from the second run of CERN’s Large Hadron Collider (LHC), recorded at a collision energy of 13 TeV between 2015 and 2018. Among the highlights are the latest precision measurements involving the Higgs boson. In only seven years since its discovery, scientists have carefully studied several of the properties of this unique particle, which is increasingly becoming a powerful tool in the search for new physics.

    The results include new searches for transformations (or “decays”) of the Higgs boson into pairs of muons and into pairs of charm quarks. Both ATLAS and CMS also measured previously unexplored properties of decays of the Higgs boson that involve electroweak bosons (the W, the Z and the photon) and compared these with the predictions of the Standard Model (SM) of particle physics. ATLAS and CMS will continue these studies over the course of the LHC’s Run 3 (2021 to 2023) and in the era of the High-Luminosity LHC (from 2026 onwards).

    The Higgs boson is the quantum manifestation of the all-pervading Higgs field, which gives mass to elementary particles it interacts with, via the Brout-Englert-Higgs mechanism. Scientists look for such interactions between the Higgs boson and elementary particles, either by studying specific decays of the Higgs boson or by searching for instances where the Higgs boson is produced along with other particles. The Higgs boson decays almost instantly after being produced in the LHC and it is by looking through its decay products that scientists can probe its behaviour.

    In the LHC’s Run 1 (2010 to 2012), decays of the Higgs boson involving pairs of electroweak bosons were observed. Now, the complete Run 2 dataset – around 140 inverse femtobarns each, the equivalent of over 10 000 trillion collisions – provides a much larger sample of Higgs bosons to study, allowing measurements of the particle’s properties to be made with unprecedented precision. ATLAS and CMS have measured the so-called “differential cross-sections” of the bosonic decay processes, which look at not just the production rate of Higgs bosons but also the distribution and orientation of the decay products relative to the colliding proton beams. These measurements provide insight into the underlying mechanism that produces the Higgs bosons. Both collaborations determined that the observed rates and distributions are compatible with those predicted by the Standard Model, at the current rate of statistical uncertainty.
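
    The “over 10 000 trillion collisions” figure follows from the integrated luminosity: the number of collisions is the luminosity multiplied by the cross-section. A back-of-envelope check, assuming an inelastic proton–proton cross-section of roughly 80 millibarn at 13 TeV:

```python
# Back-of-envelope check of "over 10 000 trillion collisions" from ~140 fb^-1 per experiment.
INTEGRATED_LUMINOSITY_FB = 140.0   # inverse femtobarns recorded in Run 2
SIGMA_INELASTIC_MB = 80.0          # assumed inelastic pp cross-section at 13 TeV, in millibarn

# 1 mb = 10^12 fb, so 1 fb^-1 corresponds to 10^12 mb^-1.
collisions = INTEGRATED_LUMINOSITY_FB * 1e12 * SIGMA_INELASTIC_MB
print(f"~{collisions:.1e} inelastic collisions")   # about 1.1e16, i.e. over 10 000 trillion
```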

    Since the strength of the Higgs boson’s interaction is proportional to the mass of elementary particles, it interacts most strongly with the heaviest generation of fermions, the third. Previously, ATLAS and CMS had each observed these interactions. However, interactions with the lighter second-generation fermions – muons, charm quarks and strange quarks – are considerably rarer. At EPS-HEP, both collaborations reported on their searches for the elusive second-generation interactions.

    ATLAS presented their first result from searches for Higgs bosons decaying to pairs of muons (H→μμ) with the full Run 2 dataset. This search is complicated by the large background of more typical SM processes that produce pairs of muons. “This result shows that we are now close to the sensitivity required to test the Standard Model’s predictions for this very rare decay of the Higgs boson,” says Karl Jakobs, the ATLAS spokesperson. “However, a definitive statement on the second generation will require the larger datasets that will be provided by the LHC in Run 3 and by the High-Luminosity LHC.”

    CMS presented their first result on searches for decays of Higgs bosons to pairs of charm quarks (H→cc). When a Higgs boson decays into quarks, these elementary particles immediately produce jets of particles. “Identifying jets formed by charm quarks and isolating them from other types of jets is a huge challenge,” says Roberto Carlin, spokesperson for CMS. “We’re very happy to have shown that we can tackle this difficult decay channel. We have developed novel machine-learning techniques to help with this task.”

    An event recorded by CMS showing a candidate for a Higgs boson produced in association with two top quarks. The Higgs boson and top quarks decay leading to a final state with seven jets (orange cones), an electron (green line), a muon (red line) and missing transverse energy (pink line) (Image: CMS/CERN)

    The Higgs boson also acts as a mediator of physics processes in which electroweak bosons scatter or bounce off each other. Studies of these processes with very high statistics serve as powerful tests of the Standard Model. ATLAS presented the first-ever measurement of the scattering of two Z bosons. Observing this scattering completes the picture for the W and Z bosons, as ATLAS has previously observed the WZ scattering process and both collaborations have observed the WW process. CMS presented the first observation of electroweak-boson scattering that results in the production of a Z boson and a photon.

    “The experiments are making big strides in the monumental task of understanding the Higgs boson,” says Eckhard Elsen, CERN’s Director of Research and Computing. “After observation of its coupling to the third-generation fermions, the experiments have now shown that they have the tools at hand to address the even more challenging second generation. The LHC’s precision physics programme is in full swing.”

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Meet CERN in a variety of places:

    Quantum Diaries
    QuantumDiaries

    Cern Courier

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS

    CERN ATLAS Image Claudia Marcelloni CERN/ATLAS


    ALICE

    CERN/ALICE Detector


    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    LHC

    CERN map

    CERN LHC Grand Tunnel

    CERN LHC particles

     
  • richardmitnick 9:00 am on July 12, 2019
    Tags: "ATLAS searches for rare Higgs boson decays into muon pairs", Particle Accelerators

    From CERN ATLAS: “ATLAS searches for rare Higgs boson decays into muon pairs” 

    CERN/ATLAS detector

    CERN ATLAS Higgs Event

    CERN ATLAS another view Image Claudia Marcelloni ATLAS CERN


    CERN ATLAS New II Credit CERN SCIENCE PHOTO LIBRARY


    From CERN ATLAS

    11th July 2019
    ATLAS Collaboration

    Figure 1: A Run 2 Higgs boson candidate event containing two muons (red) and two jets (yellow cones). (Image: ATLAS Collaboration/CERN)

    Could the Higgs boson still surprise us? Since its discovery in 2012, the ATLAS and CMS collaborations at CERN have been actively studying the properties of this latest and most mysterious addition to the Standard Model of particle physics.

    In the Standard Model, the Brout-Englert-Higgs mechanism predicts the Higgs boson will interact with matter particles (quarks and leptons, known as fermions) with a strength proportional to the particle’s mass. It also predicts the Higgs boson will interact with the force carrier particles (W and Z bosons) with a strength proportional to the square of the particle’s mass. Therefore, by measuring the Higgs boson decay and production rates, which depend on the interaction strength to these other particles, ATLAS physicists can perform a fundamental test of the Standard Model.

    Today, at the European Physical Society Conference on High-Energy Physics (EPS-HEP) in Ghent, Belgium, the ATLAS Collaboration released a new preliminary result searching for Higgs boson decays to a muon and antimuon pair (H → μμ). The new, more sensitive result uses the full Run 2 dataset, analysing almost twice as many Higgs boson events as the previous ATLAS result (released in 2018, for the ICHEP conference).

    Both the ATLAS and CMS Collaborations have already observed the Higgs boson decaying to tau leptons – the muon’s heavier cousins, belonging to the third “generation” of fermions. Since muons are much lighter than tau leptons, the Higgs boson decay to a muon pair is expected to occur about 300 times less often than that to a tau-lepton pair. Despite this scarceness, the H → μμ decay offers the best opportunity to measure the Higgs interaction with second-generation fermions at the LHC, providing new insights into the origin of mass for different fermion generations.
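
    The “about 300 times less often” follows directly from the mass-proportional coupling: the partial decay rate scales with the square of the fermion mass, so a rough check (ignoring small phase-space corrections) looks like this:

```python
# Rough check of the "about 300 times less often" statement: the Higgs coupling to a
# fermion is proportional to its mass, and the decay rate to the square of the coupling
# (small phase-space corrections are ignored here).
M_MU  = 0.10566   # muon mass in GeV
M_TAU = 1.77686   # tau mass in GeV

suppression = (M_TAU / M_MU) ** 2
print(f"Gamma(H -> tautau) / Gamma(H -> mumu) ~ {suppression:.0f}")   # roughly 280
```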

    Figure 2: The muon pair invariant mass spectrum summed over all categories. Each event is weighted by log(1+S/B), where S and B are the number of signal and background events between 120 and 130 GeV of a given category determined by the simultaneous fit. The weighting visualizes the effect of the categorization on the analysis. The curves show the results of the fit. (Image: ATLAS Collaboration/CERN)

    Experimentally, ATLAS is well-equipped to identify and reconstruct muon pairs. By combining measurements from the ATLAS inner detector and muon spectrometer, physicists can achieve a good muon momentum resolution. However, they must also account for muons being created by a common background: the abundant “Drell-Yan process”, where a muon pair is produced via the exchange of a virtual Z boson or a photon. To help differentiate the H → μμ signal from this background, ATLAS teams use multivariate discriminants (boosted decision trees), which exploit the different production and decay properties of each event. For example, H → μμ signal events are characterised by a more central muon pair system and a larger momentum in the plane transverse to the colliding protons.
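
    As a rough illustration of the idea behind such a multivariate discriminant (not the ATLAS implementation), one can train a gradient-boosted classifier on a couple of toy kinematic features in which signal and background differ, for example a "centrality" variable and a transverse momentum:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 20_000

# Toy features loosely inspired by the text: signal-like events are more central
# (smaller |rapidity| of the muon pair) and carry larger transverse momentum.
signal = np.column_stack([rng.normal(0.0, 0.8, n), rng.exponential(60.0, n)])
background = np.column_stack([rng.normal(0.0, 1.6, n), rng.exponential(25.0, n)])

X = np.vstack([signal, background])
y = np.concatenate([np.ones(n), np.zeros(n)])
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

bdt = GradientBoostingClassifier(n_estimators=200, max_depth=3)
bdt.fit(X_train, y_train)
print(f"test accuracy on the toy sample: {bdt.score(X_test, y_test):.2f}")
```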

    To further enhance the sensitivity of the search, physicists separate the potential H → μμ events into multiple categories, each with different expected signal-to-background ratios. They examine each category separately, studying the distribution of the mass of the muon pair of the selected events. The signal and background abundances could then be determined simultaneously by a fit to the mass spectrum, exploiting the different shapes of the signal and background processes. Figure 2 shows the resulting muon pair mass distribution combined over all the categories.
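
    The sketch below shows the spirit of such a fit on a toy spectrum: a smoothly falling background plus a narrow peak at 125 GeV, fitted with an exponential-plus-Gaussian model. The real ATLAS fit is a simultaneous fit over all categories with carefully validated signal and background shapes; everything here is invented for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(2)

# Toy dimuon mass spectrum: falling background plus a small Gaussian bump at 125 GeV.
bkg = rng.exponential(30.0, 200_000) + 110.0
sig = rng.normal(125.0, 3.0, 600)
masses = np.concatenate([bkg, sig])
masses = masses[(masses > 110.0) & (masses < 160.0)]

counts, edges = np.histogram(masses, bins=100, range=(110.0, 160.0))
centres = 0.5 * (edges[:-1] + edges[1:])

def model(m, n_bkg, slope, n_sig, mean, width):
    background = n_bkg * np.exp(-(m - 110.0) / slope)
    signal = n_sig * np.exp(-0.5 * ((m - mean) / width) ** 2)
    return background + signal

p0 = [counts[0], 30.0, 50.0, 125.0, 3.0]
popt, _ = curve_fit(model, centres, counts, p0=p0)
print(f"fitted signal peak: mean = {popt[3]:.1f} GeV, width = {popt[4]:.1f} GeV")
```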

    In the new ATLAS result, no significant excess of events above the measured background was observed in the signal region around the Higgs boson mass of 125 GeV. The observed signal significance is 0.8 standard deviations for 1.5 standard deviations expected from the Standard Model. An upper limit on the Higgs boson production cross section times branching fraction to muons was set at 1.7 times the Standard Model prediction at 95% confidence level. This new result represents an improvement of about 50% with respect to previous ATLAS results.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    CERN Courier

    Quantum Diaries
    QuantumDiaries

    CERN map


    CERN LHC Grand Tunnel
    CERN LHC particles

     
  • richardmitnick 8:42 am on July 12, 2019
    Tags: "ATLAS finds evidence of charge asymmetry in top quark pairs", Particle Accelerators

    From CERN ATLAS: “ATLAS finds evidence of charge asymmetry in top quark pairs” 

    CERN/ATLAS detector

    CERN ATLAS Higgs Event

    CERN ATLAS another view Image Claudia Marcelloni ATLAS CERN


    CERN ATLAS New II Credit CERN SCIENCE PHOTO LIBRARY


    From CERN ATLAS

    11th July 2019
    ATLAS Collaboration

    Figure 1: Measured values of the charge asymmetry (Ac) as a function of the invariant mass of the top quark pair system (mtt) in data. The green hatched regions show new state-of-the-art Standard Model predictions, while red hatched regions show the asymmetry as implemented in simulated ‘Monte Carlo’ events. Vertical bars correspond to the total uncertainties. (Image: ATLAS Collaboration/CERN)

    Among the most intriguing particles studied by the ATLAS collaboration is the top quark. As the heaviest known fundamental particle, it plays a unique role in the Standard Model of particle physics and – perhaps – in yet unseen physics beyond the Standard Model.

    During Run 2 of the Large Hadron Collider (LHC), proton beams were collided with high luminosity at a centre-of-mass energy of 13 TeV. This allowed ATLAS to detect and measure an unprecedented number of events involving top-antitop quark pairs, providing ATLAS physicists with a unique opportunity to gain insight into the top quark’s properties.

    Due to sneaky interference between particles involved in the production, top and antitop quarks are not produced equally with respect to the proton beam direction in the ATLAS detector. Instead, top quarks are produced preferentially in the centre of the LHC’s collisions, while antitop quarks are produced preferentially at larger angles. This is known as a ‘charge asymmetry’.
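
    At the LHC the charge asymmetry is usually quantified through the difference of absolute rapidities, Δ|y| = |y_t| − |y_t̄|, counting how often it is positive versus negative. The sketch below applies that definition to toy rapidity distributions; the small width difference is invented so the asymmetry comes out at the per-mille level, roughly the size of the measured effect.

```python
import numpy as np

def charge_asymmetry(y_top, y_antitop):
    """Ac = (N(d|y|>0) - N(d|y|<0)) / (N(d|y|>0) + N(d|y|<0)), with d|y| = |y_t| - |y_tbar|."""
    delta = np.abs(y_top) - np.abs(y_antitop)
    n_pos = np.count_nonzero(delta > 0)
    n_neg = np.count_nonzero(delta < 0)
    return (n_pos - n_neg) / (n_pos + n_neg)

rng = np.random.default_rng(3)
n = 5_000_000
# Toy rapidities with a deliberately tiny width difference, chosen only so that the
# asymmetry comes out at the per-mille level; these are not real distributions.
y_top = rng.normal(0.0, 1.002, n)
y_antitop = rng.normal(0.0, 1.000, n)

print(f"Ac = {charge_asymmetry(y_top, y_antitop):.4f}")
```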

    Charge asymmetry is similar to a phenomenon measured at the Tevatron collider at Fermilab, known as a ‘forward-backward’ asymmetry. At the Tevatron, colliding beams were made of protons and antiprotons, respectively, which led to top and antitop quarks each being produced at non-central angles, but in opposite directions. A forward-backward asymmetry, compatible with improved Standard Model predictions, was observed.

    FNAL/Tevatron map

    FNAL/Tevatron

    Figure 2: Confidence limits on the linear combination C−/Λ² of Wilson coefficients of dimension-six EFT operators. The bounds are derived from a comparison of the charge-asymmetry measurements presented in this paper with the state-of-the-art Standard Model predictions. Also shown are bounds derived from the forward-backward asymmetry measurements at the Tevatron, using collisions at a centre-of-mass energy of 1.96 TeV, and from Run 1 LHC charge-asymmetry measurements in proton-proton collisions at a centre-of-mass energy of 8 TeV. (Image: ATLAS Collaboration/CERN)

    The effect of charge asymmetry at the LHC is predicted to be extremely small (< 1%), as the dominant production mode of top-quark pairs via the scattering of gluons (the carriers of the strong force) emerging from the protons does not exhibit a charge asymmetry. A residual asymmetry can only be generated by more complicated scattering processes involving also quarks. However, new physics processes interfering with the known production modes can lead to much larger (or even smaller) values. Therefore, a precision measurement of the charge asymmetry is a stringent test of the Standard Model. It is among the most subtle, difficult, and yet important properties to measure in the study of top quarks.

    A new ATLAS result, presented today at the European Physical Society Conference on High-Energy Physics (EPS-HEP) in Ghent, Belgium, examines the full Run 2 dataset to measure top-antitop production in a channel where one top quark decays to one charged lepton, a neutrino and one hadronic "jet" (a spray of hadrons); and the other decays to three hadronic jets. The analysis fully includes events where the hadronic jets are merged together (so-called "boosted topology").

    ATLAS finds evidence of charge asymmetry in top-quark pair events, with a significance of four standard deviations. The measured charge asymmetry of 0.0060 ± 0.0015 (stat+syst.) is compatible with the latest Standard Model prediction, and the measurement confidently states that the observed asymmetry is non-zero. It is the first ATLAS top physics measurement to utilise the full Run 2 dataset.

    The new ATLAS result marks a very important milestone following decades of measurements. Figure 1 shows that the dataset allows ATLAS to measure the charge asymmetry as a function of the mass of the top-antitop system. Figure 2 shows the resulting bounds on anomalous effective field theory (EFT) couplings that parametrise effects from new physics which would be beyond the reach of being directly produced at the LHC.

    This new result is yet another demonstration of ATLAS’ ability to study subtle Standard Model effects with great precision. The observed agreement with Standard Model predictions provides one more piece to the puzzle in our understanding of particle physics at the energy frontier.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    CERN Courier

    Quantum Diaries
    QuantumDiaries

    CERN map


    CERN LHC Grand Tunnel
    CERN LHC particles

     
  • richardmitnick 8:38 am on June 12, 2019
    Tags: Particle Accelerators

    From Science Magazine: “Exotic particles called pentaquarks may be less weird than previously thought” 

    AAAS
    From Science Magazine

    Jun. 5, 2019
    Adrian Cho

    The Large Hadron Collider beauty experiment has discovered three new pentaquarks. Peter Ginter/CERN

    Four years ago, when experimenters spotted pentaquarks—exotic, short-lived particles made of five quarks—some physicists thought they had glimpsed the strong nuclear force, which binds the atomic nucleus, engaging in a bizarre new trick. New observations have now expanded the zoo of pentaquarks, but suggest a tamer explanation for their structure. The findings, from the Large Hadron Collider beauty experiment (LHCb), a particle detector fed by the LHC at CERN, the European particle physics laboratory near Geneva, Switzerland, suggest pentaquarks are not bags of five quarks binding in a new way, but are more like conventional atomic nuclei.

    “I’m really excited that the new data send such a clear message,” says Tomasz Skwarnicki, an LHCb physicist at Syracuse University in New York who led the study. But, he notes, “It may not be the message some people had hoped for.”

    Pentaquarks are heavier cousins of protons and neutrons, which are also made of quarks. In ordinary matter, quarks come in two types, up and down. Atom smashers can blast four heavier types of quarks into brief existence: charm, strange, top, and bottom. Quarks cling to one another through the strong force so mightily they cannot be isolated. Instead, they are almost always found in groups of three in particles known as baryons—including the proton and neutron—or in pairs called mesons, which consist of a quark and an antimatter quark.

    But for decades, some theorists have hypothesized the existence of larger bundles of quarks. In recent years, experimenters have found evidence for four-quark particles, or tetraquarks. Then, in 2015, LHCb reported signs of two pentaquarks.

    Some theorists argue that the new particles are bags of four and five quarks, bound together through the exchange of quantum particles called gluons, adding a new wrinkle to the often intractable theory of the strong force. Others argue they’re more like an atomic nucleus. In this “molecular” picture a pentaquark is a three-quark baryon stuck to a two-quark meson the same way that protons and neutrons bind in a nucleus—by exchanging short-lived pi mesons.

    LHCb’s new pentaquarks, reported today in Physical Review Letters (PRL), bolster the molecular picture. In 2015, LHCb researchers reported a pentaquark with a mass of 4450 megaelectron volts (MeV), 4.74 times the mass of the proton. With nine times more data, they now find in that mass range two nearly overlapping but separate pentaquarks with masses of 4440 MeV and 4457 MeV. They also find a lighter pentaquark at 4312 MeV. Each contains the same set of quarks: charm, anticharm, two ups, and a down. (Previous hints of a pentaquark at 4380 MeV have faded.)

    Pentaquark depiction

    New Large Hadron Collider data reveal that exotic quark quintets, discovered in 2015, are composites of quark-antiquark mesons and three-quark baryons.

    The lightest pentaquark has a mass just below the sum of a particular baryon and meson that together contain the correct quark ingredients. The heavier pentaquarks have masses just below the sum of the same baryon and a related meson with extra internal energy. That suggests each pentaquark is just a baryon bound to a meson, with a tiny bit of mass taken up in binding energy. “This is a no-brainer explanation,” says Marek Karliner, a theorist at Tel Aviv University in Israel.
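
    The “just below the sum” statement can be checked with rough arithmetic. Using approximate PDG masses for one commonly discussed assignment (a Σc baryon plus an anticharm meson, or its heavier excited partner), the gaps below threshold come out at the few-to-tens-of-MeV level:

```python
# Rough arithmetic behind "just below the sum", using approximate PDG masses in MeV
# for one commonly discussed baryon-meson assignment (a Sigma_c baryon plus an anticharm meson).
SIGMA_C = 2453.0   # Sigma_c(2455) baryon
D_BAR   = 1865.0   # D-bar meson
D_STAR  = 2007.0   # D*-bar meson (the "related meson with extra internal energy")

for name, mass, threshold in [
    ("Pc(4312)", 4312.0, SIGMA_C + D_BAR),
    ("Pc(4440)", 4440.0, SIGMA_C + D_STAR),
    ("Pc(4457)", 4457.0, SIGMA_C + D_STAR),
]:
    print(f"{name}: {threshold - mass:5.1f} MeV below the baryon+meson sum of {threshold:.0f} MeV")
```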

    The molecular picture also helps explain why the pentaquarks, although fleeting, appear to be more stable than expected, Karliner says. That’s because packaging the charm quark in the baryon and anticharm quark in the meson separates them, keeping them from annihilating each other.

    Other theorists rushed to a similar conclusion when LHCb researchers discussed their results at a conference in La Thuile, Italy, in March. For example, within a day, Li-Sheng Geng, a theorist at Beihang University in Beijing, and colleagues posted a paper, in press at PRL, that uses the molecular picture to predict the existence of four more pentaquarks that should be within LHCb’s reach.

    But the bag-of-quarks picture is not dead. Pentaquarks should occasionally form when protons are bombarded with gamma ray photons, as physicists at Thomas Jefferson National Accelerator Facility in Newport News, Virginia, are trying to do. But they have yet to spot any pentaquarks. That undermines the molecular picture because it predicts higher rates for such photoproduction than the bag-of-quarks model does, says Ahmed Ali, a theorist at DESY, the German accelerator laboratory in Hamburg. “They are already almost excluding the molecular interpretation,” he says. Others say it’s too early to draw such conclusions.

    The structure of pentaquarks isn’t necessarily an either/or proposition, notes Feng-Kun Guo, a theorist at the Chinese Academy of Sciences in Beijing. Quantum mechanics allows a tiny object to be both a particle and a wave, or to be in two places at once. Similarly, a pentaquark could have both structures simultaneously. “It’s just a question of which one is dominant,” Guo says.

    Regardless of the binding mechanism, the new pentaquarks are exciting because they suggest the existence of a whole new family of such particles, Karliner says. “It’s like a whole new periodic table.”

    See the full article here .



    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 2:11 pm on June 11, 2019
    Tags: Future Circular Collider (FCC), If we don’t push the frontiers of physics we’ll never learn what lies beyond our current understanding, Lepton collider, New accelerators explored, Particle Accelerators, Proton collider

    From Ethan Siegel: “Does Particle Physics Have A Future On Earth?” 

    From Ethan Siegel
    Jun 11, 2019

    The inside of the LHC, where protons pass each other at 299,792,455 m/s, just 3 m/s shy of the speed of light. As powerful as the LHC is, the cancelled SSC could have been three times as powerful, and may have revealed secrets of nature that are inaccessible at the LHC. (CERN)

    If we don’t push the frontiers of physics, we’ll never learn what lies beyond our current understanding.

    At a fundamental level, what is our Universe made of? This question has driven physics forward for centuries. Even with all the advances we’ve made, we still don’t know it all. While the Large Hadron Collider discovered the Higgs boson and completed the Standard Model earlier this decade, the full suite of the particles we know of only make up 5% of the total energy in the Universe.

    CERN CMS Higgs Event


    CERN ATLAS Higgs Event

    Standard Model of Particle Physics

    We don’t know what dark matter is, but the indirect evidence for it is overwhelming.

    Fritz Zwicky discovered Dark Matter when observing the movement of the Coma Cluster. Vera Rubin, a woman in STEM denied the Nobel Prize, did much of the later work on Dark Matter.

    Fritz Zwicky, from http://palomarskies.blogspot.com

    Coma cluster via NASA/ESA Hubble

    Astronomer Vera Rubin at the Lowell Observatory in 1965, worked on Dark Matter (The Carnegie Institution for Science)


    Vera Rubin measuring spectra, worked on Dark Matter (Emilio Segre Visual Archives AIP SPL)


    Vera Rubin, with Department of Terrestrial Magnetism (DTM) image tube spectrograph attached to the Kitt Peak 84-inch telescope, 1970. https://home.dtm.ciw.edu

    Same deal with dark energy.

    Dark Energy Survey


    Dark Energy Camera [DECam], built at FNAL


    NOAO/CTIO Victor M. Blanco 4-metre Telescope at Cerro Tololo, Chile, which houses DECam at an altitude of 7200 feet

    Timeline of the Inflationary Universe WMAP

    The Dark Energy Survey (DES) is an international, collaborative effort to map hundreds of millions of galaxies, detect thousands of supernovae, and find patterns of cosmic structure that will reveal the nature of the mysterious dark energy that is accelerating the expansion of our Universe. DES began searching the Southern skies on August 31, 2013.

    According to Einstein’s theory of General Relativity, gravity should lead to a slowing of the cosmic expansion. Yet, in 1998, two teams of astronomers studying distant supernovae made the remarkable discovery that the expansion of the universe is speeding up. To explain cosmic acceleration, cosmologists are faced with two possibilities: either 70% of the universe exists in an exotic form, now called dark energy, that exhibits a gravitational force opposite to the attractive gravity of ordinary matter, or General Relativity must be replaced by a new theory of gravity on cosmic scales.

    DES is designed to probe the origin of the accelerating universe and help uncover the nature of dark energy by measuring the 14-billion-year history of cosmic expansion with high precision. More than 400 scientists from over 25 institutions in the United States, Spain, the United Kingdom, Brazil, Germany, Switzerland, and Australia are working on the project. The collaboration built and is using an extremely sensitive 570-Megapixel digital camera, DECam, mounted on the Blanco 4-meter telescope at Cerro Tololo Inter-American Observatory, high in the Chilean Andes, to carry out the project.

    Over six years (2013-2019), the DES collaboration used 758 nights of observation to carry out a deep, wide-area survey to record information from 300 million galaxies that are billions of light-years from Earth. The survey imaged 5000 square degrees of the southern sky in five optical filters to obtain detailed information about each galaxy. A fraction of the survey time is used to observe smaller patches of sky roughly once a week to discover and study thousands of supernovae and other astrophysical transients.

    Or questions like why the fundamental particles have the masses they do, or why neutrinos aren’t massless, or why our Universe is made of matter and not antimatter. Our current tools and searches have not answered these great existential puzzles of modern physics. Particle physics now faces an incredible dilemma: try harder, or give up.

    The Standard Model of particle physics accounts for three of the four forces (excepting gravity), the full suite of discovered particles, and all of their interactions. Whether there are additional particles and/or interactions that are discoverable with colliders we can build on Earth is a debatable subject, but one we’ll only know the answer to if we explore past the known energy frontier. (CONTEMPORARY PHYSICS EDUCATION PROJECT / DOE / NSF / LBNL)

    The particles and interactions that we know of are all governed by the Standard Model of particle physics, plus gravity, dark matter, and dark energy. In particle physics experiments, however, it’s the Standard Model alone that matters. The six quarks, charged leptons and neutrinos, gluons, photon, gauge bosons and Higgs boson are all that it predicts, and each particle has been not only discovered, but their properties have been measured.

    As a result, the Standard Model is perhaps a victim of its own success. The masses, spins, lifetimes, interaction strengths, and decay ratios of every particle and antiparticle have all been measured, and they agree with the Standard Model’s predictions at every turn. There are enormous puzzles about our Universe, and particle physics has given us no experimental indications of where or how they might be solved.

    The particles and antiparticles of the Standard Model have now all been directly detected, with the last holdout, the Higgs Boson, falling at the LHC earlier this decade. All of these particles can be created at LHC energies, and the masses of the particles lead to fundamental constants that are absolutely necessary to describe them fully. These particles can be well-described by the physics of the quantum field theories underlying the Standard Model, but they do not describe everything, like dark matter. (E. SIEGEL / BEYOND THE GALAXY)

    It might be tempting, therefore, to presume that building a superior particle collider would be a fruitless endeavor. Indeed, this could be the case. The Standard Model of particle physics has explicit predictions for the couplings that occur between particles. While there are a number of parameters that remain poorly determined at present, it’s conceivable that there are no new particles that a next-generation collider could reveal.

    The heaviest Standard Model particle is the top quark, which takes roughly 180 GeV of energy to create. While the Large Hadron Collider can reach energies of 14 TeV (about 80 times the energy needed to create a top quark), there might not be any new particles present to find unless we reach energies in excess of 1,000,000 times as great. This is the great fear of many: the possible existence of a so-called “energy desert” extending for many orders of magnitude.

    There is certainly new physics beyond the Standard Model, but it might not show up until energies far, far greater than what a terrestrial collider could ever reach. Still, whether this scenario is true or not, the only way we’ll know is to look. In the meantime, properties of the known particles can be better explored with a future collider than any other tool. The LHC has failed to reveal, thus far, anything beyond the known particles of the Standard Model. (UNIVERSE-REVIEW.CA)

    But it’s also possible that there is new physics present at a modest scale beyond where we’ve presently probed. There are many theoretical extensions to the Standard Model that are quite generic, where deviations from the Standard Model’s predictions can be detected by a next-generation collider.

    If we want to know what the truth about our Universe is, we have to look, and that means pushing the present frontiers of particle physics into uncharted territory. Right now, the community is debating between multiple approaches, with each one having its pros and cons. The nightmare scenario, however, isn’t that we’ll look and won’t find anything. It’s that infighting and a lack of unity will doom experimental physics forever, and that we won’t get a next-generation collider at all.

    A hypothetical new accelerator, either a long linear one or one inhabiting a large tunnel beneath the Earth, could dwarf the sensitivity to new particles that prior and current colliders can achieve. Even at that, there’s no guarantee we’ll find anything new, but we’re certain to find nothing new if we fail to try. (ILC COLLABORATION)

    When it comes to deciding what collider to build next, there are two generic approaches: a lepton collider (where electrons and positrons are accelerated and collided), and a proton collider (where protons are accelerated and collided). Lepton colliders have several advantages:

    leptons are point particles, rather than composite particles,
    100% of the energy from an electron colliding with a positron can be converted into energy for new particles,
    the signal is clean and much easier to extract,
    and the energy is controllable, meaning we can tune the energy to a specific value and maximize the chance of creating a specific particle.

    Lepton colliders, in general, are great for precision studies, and we haven’t had a cutting-edge one since LEP was operational nearly 20 years ago.

    CERN LEP Collider

    5
    Different Higgs production mechanisms become accessible at different center-of-mass energies in electron/positron (lepton) colliders. While a circular collider can achieve much greater collision rates and production rates of W, Z, H, and t particles, a long-enough linear collider can conceivably reach higher energies, enabling us to probe Higgs production mechanisms that a circular collider cannot reach. This is the main advantage that linear lepton colliders possess; if they are low-energy only (like the proposed ILC), there is no reason not to go circular. (H. ABRAMOWICZ ET AL., EUR. PHYS. J. C 77, 475 (2017))

    It’s very unlikely, unless nature is extremely kind, that a lepton collider will directly discover a new particle, but it may be the best bet for indirectly discovering evidence of particles beyond the Standard Model. We’ve already discovered particles like the W and Z bosons, the Higgs boson, and the top quark, but a lepton collider could produce them both in great abundance and through a variety of channels.

    The more events of interest we create, the more deeply we can probe the Standard Model. The Large Hadron Collider, for example, will be able to tell whether the Higgs behaves consistently with the Standard Model down to about the 1% level. In a wide series of extensions to the Standard Model, ~0.1% deviations are expected, and the right future lepton collider will get you the best physics constraints possible.

    6
    The observed Higgs decay channels vs. the Standard Model agreement, with the latest data from ATLAS and CMS included. The agreement is astounding, and yet frustrating at the same time. By the 2030s, the LHC will have approximately 50 times as much data, but the precisions on many decay channels will still only be known to a few percent. A future collider could increase that precision by multiple orders of magnitude, revealing the existence of potential new particles. (ANDRÉ DAVID, VIA TWITTER)

    These precision studies could be incredibly sensitive to the presence of particles or interactions we haven’t yet discovered. When we create a particle, it has a certain set of branching ratios, or probabilities that it will decay in a variety of ways. The Standard Model makes explicit predictions for those ratios, so if we create a million, or a billion, or a trillion such particles, we can probe those branching ratios to unprecedented precisions.
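    To make the statistics argument concrete, here is a minimal sketch of how the purely statistical uncertainty on a branching ratio shrinks with the number of particles produced; the 5% branching ratio is an arbitrary illustrative value, and real measurements also fold in detector efficiency, backgrounds, and systematic errors.

```python
from math import sqrt

def branching_ratio_stat_uncertainty(br: float, n_produced: float) -> float:
    """Binomial (statistics-only) uncertainty on a measured branching ratio.
    Ignores detector efficiency, backgrounds, and systematic errors."""
    return sqrt(br * (1.0 - br) / n_produced)

# Illustrative decay mode with a 5% branching ratio, for increasing sample sizes.
br = 0.05
for n_produced in (1e6, 1e9, 1e12):
    sigma = branching_ratio_stat_uncertainty(br, n_produced)
    print(f"N = {n_produced:.0e}: relative precision ~ {100 * sigma / br:.4f}%")
```

    With a trillion produced particles, the statistics-only precision falls well below the ~0.1% deviations that many Standard Model extensions predict, which is exactly why production rate matters so much.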

    If you want better physics constraints, you need more data and better data. It isn’t just the technical considerations that should determine which collider comes next, but also where and how you can get the best personnel, the best infrastructure and support, and where you can build a strong experimental and theoretical physics community (or take advantage of one that already exists).

    7
    The idea of a linear lepton collider has been bandied about in the particle physics community as the ideal machine to explore post-LHC physics for many decades, but that was under the assumption that the LHC would find a new particle other than the Higgs. If we want to do precision testing of Standard Model particles to indirectly search for new physics, a linear collider may be an inferior option to a circular lepton collider. (REY HORI/KEK)

    There are two general classes of proposals for a lepton collider: a circular collider and a linear collider. Linear colliders are simple: accelerate your particles in a straight line and collide them together in the center. With ideal accelerator technology, a linear collider 11 km long could reach energies of 380 GeV: enough to produce the W, Z, Higgs, or top in great abundance. With a 29 km linear collider, you could reach energies of 1.5 TeV, and with a 50 km collider, 3 TeV, although costs rise tremendously to accompany longer lengths.
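    Dividing the quoted energies by the quoted lengths gives a feel for the average accelerating gradient each option implies. This is only a rough sketch based on the numbers above; real designs split the energy between two opposing linacs and include sections that do not accelerate at all.

```python
# Average accelerating gradient implied by the linear-collider options quoted above
# (centre-of-mass energy divided by total machine length).
options = [
    ("380 GeV over 11 km", 380.0, 11.0),
    ("1.5 TeV over 29 km", 1500.0, 29.0),
    ("3 TeV over 50 km", 3000.0, 50.0),
]

for label, energy_gev, length_km in options:
    gradient_gev_per_km = energy_gev / length_km   # numerically equal to MeV per metre
    print(f"{label}: ~{gradient_gev_per_km:.0f} GeV/km")
```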

    Linear colliders are slightly less expensive than circular colliders of comparable energy, because a shorter tunnel suffices, and they don’t suffer energy losses due to synchrotron radiation, which lets them reach potentially higher energies. However, circular colliders offer an enormous advantage: they can produce much greater numbers of particles and collisions.

    Future Circular Collider (FCC), a larger LHC


    The Future Circular Collider is a proposal to build, for the 2030s, a successor to the LHC with a circumference of up to 100 km: nearly four times the size of the present underground tunnels. This will enable, with current magnet technology, the creation of a lepton collider that can produce ~10⁴ times the number of W, Z, H, and t particles that have been produced by prior and current colliders. (CERN / FCC STUDY)

    While a linear collider might be able to produce 10 to 100 times as many collisions as a prior-generation lepton collider like LEP (depending on the energy), a circular version can surpass that easily: producing 10,000 times as many collisions at the energies required to create the Z boson.

    Circular colliders also have substantially higher event rates than linear colliders at the energies needed to produce Higgs particles. They begin to lose that advantage at the energies required to produce top quarks, however, and cannot reach much beyond that, which is where linear colliders become dominant.

    Because the sensitivity to the decay and production processes of these heavy particles scales as either the number of collisions or the square root of the number of collisions, a circular collider has the potential to probe physics with many times the sensitivity of a linear collider.
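    Using only the relative collision counts quoted above, here is a rough sketch of that sensitivity argument for statistics-limited measurements, where precision improves roughly as the square root of the number of collisions.

```python
from math import sqrt

# Relative collision counts quoted in the text, normalised to a LEP-like baseline.
circular_gain = 10_000        # circular collider at Z-pole energies: ~10,000x LEP
linear_gains = (10, 100)      # linear collider: 10-100x LEP

# For measurements limited by statistics, precision improves roughly as sqrt(N).
for linear_gain in linear_gains:
    advantage = sqrt(circular_gain / linear_gain)
    print(f"Circular vs. linear ({linear_gain}x LEP): "
          f"~{advantage:.0f}x better statistical precision")
```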

    7
    A number of the various lepton colliders, with their luminosity (a measure of the collision rate and the number of detections one can make) as a function of center-of-mass collision energy. Note that the red line, which is a circular collider option, offers many more collisions than the linear version, but its advantage shrinks as energy increases. Circular colliders cannot reach beyond about 380 GeV, where a linear collider like CLIC becomes the far superior option. (GRANADA STRATEGY MEETING SUMMARY SLIDES / LUCIE LINSSEN (PRIVATE COMMUNICATION))

    The proposed FCC-ee, the lepton stage of the Future Circular Collider, could realistically discover indirect evidence for any new particles that couple to the W, Z, Higgs, or top quark with masses up to 70 TeV: five times the maximum energy of the Large Hadron Collider.

    The flipside to a lepton collider is a proton collider, which — at these high energies — is essentially a gluon-gluon collider. This cannot be linear; it must be circular.

    8
    The scale of the proposed Future Circular Collider (FCC), compared with the LHC presently at CERN and the Tevatron, formerly operational at Fermilab. The Future Circular Collider is perhaps the most ambitious proposal for a next-generation collider to date, including both lepton and proton options as various phases of its proposed scientific programme. (PCHARITO / WIKIMEDIA COMMONS)

    There is really only one suitable site for this: CERN, since the machine needs not only a new, enormous tunnel but also all the infrastructure of the prior stages, which exists only at CERN. (It could be built elsewhere, but the cost would be far higher than at a site where infrastructure like the LHC and earlier accelerators such as the SPS already exists.)

    The Super Proton Synchrotron (SPS), CERN’s second-largest accelerator.

    Just as the LHC presently occupies the tunnel previously used by LEP, a circular lepton collider could be superseded by a next-generation circular proton collider, such as the proposed FCC-pp. However, you cannot run both an exploratory proton collider and a precision lepton collider simultaneously; you must decommission one to build the other.

    9
    The CMS detector at CERN, one of the two most powerful particle detectors ever assembled. Every 25 nanoseconds, on average, a new particle bunch collides at the center-point of this detector. A next-generation detector, whether for a lepton or proton collider, may be able to record even more data, faster, and with higher precision than the CMS or ATLAS detectors can at present. (CERN)

    It’s very important to make the right decision, as we do not know what secrets nature holds beyond the already-explored frontiers. Going to higher energies unlocks the potential for new direct discoveries, while going to higher precisions and greater statistics could provide even stronger indirect evidence for the existence of new physics.

    The first-stage linear colliders are expected to cost between 5 and 7 billion dollars, including the tunnel, while a proton collider with four times the LHC’s radius, magnets twice as strong, 10 times the collision rate, and next-generation computing and cryogenics might cost up to $22 billion in total, offering as big a leap over the LHC as the LHC was over the Tevatron. Some money could be saved by building the circular lepton and proton colliders one after the other in the same tunnel, which would essentially provide a future for experimental particle physics after the LHC finishes running at the end of the 2030s.
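    Those proton-collider parameters also hint at its energy reach: in a circular machine, the attainable collision energy grows roughly in proportion to the bending-magnet field strength times the ring radius. A rough scaling sketch, using only the factors quoted above:

```python
# Rough scaling for a circular proton collider: collision energy grows
# approximately with (magnetic field strength) x (ring radius), i.e. E ∝ B * r.
lhc_energy_tev = 14.0   # present LHC centre-of-mass energy
radius_factor = 4.0     # "four times the LHC's radius" (from the text)
field_factor = 2.0      # "magnets twice as strong" (from the text)

projected_energy_tev = lhc_energy_tev * radius_factor * field_factor
print(f"Projected reach: ~{projected_energy_tev:.0f} TeV")   # ~112 TeV, of order 100 TeV
```

    That simple scaling lands at roughly 100 TeV, which is the ballpark usually quoted for a next-generation circular proton collider.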

    10
    The Standard Model particles and their supersymmetric counterparts. Slightly under 50% of these particles have been discovered, and just over 50% have never shown a trace of their existence. Supersymmetry is an idea that hopes to improve on the Standard Model, but it has yet to make a successful prediction about the Universe in its attempt to supplant the prevailing theory. However, new colliders are not being proposed to find supersymmetry or dark matter, but to perform generic searches. Regardless of what they find, we’ll learn something new about the Universe itself. (CLAIRE DAVID / CERN)

    The most important thing to remember in all of this is that we aren’t simply continuing to look for supersymmetry, dark matter, or any particular extension of the Standard Model. We have a slew of problems and puzzles that indicate that there must be new physics beyond what we currently understand, and our scientific curiosity compels us to look. In choosing what machine to build, it’s vital to choose the most capable machine: the one with the highest number of collisions at the energies we’re interested in probing.

    Regardless of which specific projects the community chooses, there will be trade-offs. A linear lepton collider can always reach higher energies than a circular one, while a circular one can always create more collisions and reach higher precisions. A circular machine can gather just as much data in a tenth of the time and probe for more subtle effects, at the cost of a lower energy reach.

    Will it be successful? Regardless of what we find, that answer is unequivocally yes. In experimental physics, success does not equate to finding something, as some might erroneously believe. Instead, success means knowing something, post-experiment, that you did not know before you did the experiment. To push beyond the presently known frontiers, we’d ideally want both a lepton and a proton collider, at the highest energies and collision rates we can achieve.

    There is no doubt that new technologies and spinoffs will come from whichever collider or colliders come next, but that’s not why we do it. We are after the deepest secrets of nature, the ones that will remain elusive even after the Large Hadron Collider finishes. We have the technical capabilities, the personnel, and the expertise to build it right at our fingertips. All we need is the political and financial will, as a civilization, to seek the ultimate truths about nature.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    “Starts With A Bang! is a blog/video blog about cosmology, physics, astronomy, and anything else I find interesting enough to write about. I am a firm believer that the highest good in life is learning, and the greatest evil is willful ignorance. The goal of everything on this site is to help inform you about our world, how we came to be here, and to understand how it all works. As I write these pages for you, I hope to not only explain to you what we know, think, and believe, but how we know it, and why we draw the conclusions we do. It is my hope that you find this interesting, informative, and accessible,” says Ethan

     
  • richardmitnick 9:02 am on May 31, 2019 Permalink | Reply
    Tags: , , , Particle Accelerators, , ,   

    From Brookhaven National Lab: “Ten Years and Nearly a Billion Dollars: How Project Management Made a Massive X-Ray Light Source Possible”

    From Brookhaven National Lab

    May 29, 2019
    Shannon Brescher Shea, D.O.E.

    1
    Aerial view of the construction site of the National Synchrotron Light Source II, taken in 2009, four years after the project started.

    Replacing a beloved tool is never easy. Erik Johnson had worked with the National Synchrotron Light Source (NSLS) for nearly 15 years when he and his colleagues began thinking about its replacement. But this switch wasn’t a matter of walking down to the hardware store.

    The NSLS, a Department of Energy (DOE) Office of Science (SC) user facility at Brookhaven National Laboratory (BNL), opened in 1982.

    BNL NSLS

    Over 30 years, scientists — three of whom won Nobel prizes for their work — used its intense beams of light over the course of more than 55,000 visits to study atomic structures and chemical processes. Johnson came to the NSLS in 1985 as a post-doctoral student. By 2000, Johnson and other leaders in the field realized the NSLS would soon be past its glory days.

    They began dreaming up its successor: the NSLS-II.

    BNL NSLS II

    After five years of planning and research, SC approved the project to move forward.
    “There was elation in the hallways,” said Johnson.

    This was just the beginning. Ahead of Johnson and his co-workers stood a project that would take a decade and almost a billion dollars to build.

    SC’s scientists and managers are familiar with this type of challenge. For decades, DOE and its predecessors have been developing federal user facilities for scientists to probe the building blocks of the universe and of life.

    Constructing these user facilities requires immense planning and coordination. Good project management keeps projects on-time and on-budget. The $912-million NSLS-II project put SC’s project management skills to the test.

    A Big Machine Comes with Big Challenges

    With the NSLS reaching its technical limits, scientists needed the next big tool to study incredibly small materials under real-world conditions. To look closer than ever at subjects ranging from batteries’ chemical reactions to viruses’ structures, researchers required an X-ray beam 10,000 times brighter than the original NSLS.

    To make the investment worthwhile, the team needed to design NSLS-II to be so advanced that it could stay at the forefront of science for more than three decades. It took more than 20 different scientific workshops to hammer out the requirements. Participants included both scientists who would use the facility and engineers who would design it.

    “I had gone from 15, 20 years where I knew everybody by first name and what they did to being surrounded by all new people,” said Johnson. “Which was pretty neat.”

    To satisfy as many stakeholders as possible, the team designed the facility to immediately run at full capacity. They then scaled the plans down to what they could build initially. If they finished early or had funding left over, they could add pieces back. To make it possible to adapt the most advanced technologies, the team planned to add more research equipment and capabilities in phases after they finished construction of the main facility.

    “[The process] was 90 percent preplanning and 10 percent execution,” said Robert Caradonna, the DOE Brookhaven Site Office deputy federal project director for the NSLS-II project.

    All of that planning clarified the challenges that awaited them.

    Creating powerful X-rays requires massive machines that accelerate electrons to high energies. These machines use specialized magnets to control the electrons that produce the X-rays. Equipment at experimental stations then harnesses these X-rays. Most facilities have rings that store the electrons; NSLS-II’s ring needed to be half a mile around.

    The building needed to meet several other specific requirements. Because temperature changes can influence the magnets’ size and position, the inside of the storage ring could never waver from a balmy 78 F. To ensure the beams would never tremble, the team needed to minimize the effect of vibrations from trucks on the nearby Long Island Expressway and waves from the Atlantic Ocean.

    The project’s sheer complexity was perhaps the biggest challenge of all. The original schedule projected 5,000 separate activities. At the height of the project, the list expanded to 11,000. At some points, the team was managing a million dollars of work each day.

    “We spent so much time with our heads down pushing on this thing that it didn’t really get to be overwhelming,” said Johnson. “We were too damn busy for it to get overwhelming.”

    Prepared for Disaster

    In project management, being a pessimist can pay off. A huge part of ensuring the project ran smoothly was anticipating and managing risks.

    The Office of Science purposely plans for the worst case scenarios. Over the course of the project, the team ran more than 400 separate risk assessments. They also built numerous computer models that pinpointed exactly where every single piece of equipment needed to go.

    One of the biggest areas of uncertainty was manufacturing the massive magnets. Buying 900 magnets anywhere is difficult. Buying 900 extremely high-powered, cutting-edge magnets from one supplier wasn’t going to happen.

    “It would be nice if we could give it to one guy and he could produce all the magnets,” said Caradonna, but, “we knew that was going to be impossible.”

    There are only a handful of places in the world that could produce the magnets and other specialized pieces of equipment. The team developed seven contracts with five different suppliers, several of which were in other countries. The different languages, management cultures, geographic locations, and even measurement units caused conflicts. Some vendors were unresponsive.

    To solve these problems, members of the NSLS-II team flew to the suppliers and consulted with them on site. The hands-on assistance improved communications and quality control. It even allowed NSLS-II staff members to get experience with the components before they arrived at BNL. Despite some early delays, the magnet manufacturing didn’t hold up the process as a whole.

    The team was able to compensate for these delays because of savvy scheduling. In many projects like this, a delay in one step can cascade down to others, creating multiple delays and scheduling conflicts. In contrast, the NSLS-II team designed the project so they could change the order of the steps. For example, they built five identical pieces of the accelerator ring that fit together like LEGOs®. If the magnets weren’t finished for one piece of the ring, they could still finish the others in time.
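    As a loose, hypothetical illustration of why interchangeable work packages make a schedule robust (a toy model, not BNL’s actual planning tooling): if one sector’s magnets are late, assembly simply continues on whichever sectors are ready.

```python
# Toy illustration of order-flexible scheduling with interchangeable work packages.
# Hypothetical data only; this is not BNL's actual schedule or planning software.
sectors = {f"sector_{i}": {"magnets_delivered": True} for i in range(1, 6)}
sectors["sector_3"]["magnets_delivered"] = False   # one delayed magnet shipment

# Because the five sectors are interchangeable, assembly proceeds with whichever
# sectors are ready instead of stalling on the delayed one.
ready = [name for name, status in sectors.items() if status["magnets_delivered"]]
delayed = [name for name, status in sectors.items() if not status["magnets_delivered"]]

print("Assemble now:  ", ", ".join(ready))
print("Assemble later:", ", ".join(delayed))
```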

    This approach also came in handy when the project received $150 million in funding earlier than planned through a special bill to help the country recover from the economic crash in 2008. With this funding and a favorable construction market, they negotiated a lower price with the construction company and finished the laboratory buildings nearly a year ahead of schedule.

    “You never know when fortune is going to smile on you, and you never know when you’re going to do something sooner rather than later,” said Johnson. “It’s all about being ready.”

    Here a Review, There a Review

    Preparation will only get you so far. SC’s regular and strict project reviews by independent experts got the team the rest of the way there. Every step of the process had a review, from the scope and scientific goals to the construction: 54 in total.

    Over the course of the project, the review teams gave more than 1,300 recommendations.

    As the NSLS-II lessons learned document states: “Project reviews are the most important management tool to ensure the project is staying on track. If you are not required to have them, you should inflict them on yourself.”

    NSLS-II is not the only project to benefit from rigorous reviews. The U.S. Government Accountability Office (GAO) has cited SC’s reviews as a major reason why the majority of SC’s projects are completed on time and on budget.

    The project teams aren’t the only ones who learn from the experience. The independent experts are often in charge of similar projects at their own agencies, national laboratories, or universities.

    10 Years Later

    Ten years after DOE approved the idea for NSLS-II, it was finished. SC declared it complete in March 2015, three months before its target date. It opened to users that July.

    Owing to sound project management practices, there was enough funding available for the NSLS-II to include $68 million in optional features beyond the basic construction plan. These included an additional beamline to provide X-rays to another experimental station, a larger building to make future expansion easier, and extra components to increase reliability. In 2016, the team won both the Project Management Institute’s Project of the Year award and the DOE Secretary’s Award of Excellence, the highest honor that DOE awards to a project.

    “I was immensely proud,” said Johnson, “but fully cognizant of all of the work that needs to be done still to fully realize the potential of this instrument.”

    Since it opened, the team has launched 28 experimental stations, or beamlines, out of a total of 60 stations it can support.

    “Everything from the dreaming to the final delivery is going on at the same time,” said Johnson.

    Despite all of the new machinery, one day the NSLS-II will become obsolete just like its predecessor. One Friday night, Johnson went home and noted that the old NSLS sign was still up. The next Monday, it was gone, replaced by a sign for the lab’s new Computational Science Initiative. To Johnson, that change reinforced the fact that his mission is about more than equipment.

    “Those people who used it, they still have those experiences,” he said. “It’s not the stuff that you build; it’s that what you build enables other people to do what’s important.”

    See the full article here.
    Original publication by D.O.E.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    BNL Campus


    BNL Center for Functional Nanomaterials

    BNL NSLS-II


    BNL NSLS II

    BNL RHIC Campus

    BNL/RHIC Star Detector

    BNL RHIC PHENIX

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

     