Tagged: CERN (CH) ATLAS

  • richardmitnick 3:28 pm on November 20, 2020
    Tags: "Refining the picture of the Higgs boson", , CERN (CH) ATLAS, Constraints on Higgs boson properties using WW∗(→eνμν)jj production in 36.1fb−1 of s√=13TeV pp collisions with the ATLAS detector, ,   

    From CERN (CH) ATLAS via phys.org: “Refining the picture of the Higgs boson” 

    November 20, 2020

    Figure 1: The weighted distribution of the azimuthal angle between two jets in the signal region used in the CP measurement. The signal and background yields are determined from the fit. Data-to-simulation ratios are shown at the bottom of the plot. The blue histogram represents the measured signal; the shaded areas depict the total uncertainty. Credit: ATLAS Collaboration/CERN.

    To explain the masses of the electroweak bosons, the W and Z bosons, theorists in the 1960s postulated a mechanism of spontaneous symmetry breaking. While this mathematical formalism is relatively simple, its cornerstone, the Higgs boson, remained undetected for almost 50 years.

    Since its discovery in 2012, researchers at the ATLAS and CMS experiments at CERN’s Large Hadron Collider (LHC) have tirelessly investigated the properties of the Higgs boson. They’ve measured its mass to be around 125 GeV, about 130 times the mass of the proton at rest, and found that it has zero electric charge and zero spin.

    The mirror image

    Researchers set out to determine the Higgs boson’s parity properties by measuring its decays to pairs of W bosons (H → WW*), Z bosons (H → ZZ*) and photons (H → γγ). Through these measurements, they confirmed that the Higgs boson has even charge-parity (CP). This means that, as predicted by the Standard Model, the Higgs boson’s interactions with other particles do not change when “looking” in the CP mirror.

    Any distortion in this CP mirror (“CP violation in Higgs interactions”), such as a CP-odd admixture, would indicate the presence of as-yet undiscovered phenomena, so physicists at the LHC are scrutinizing the strengths of Higgs-boson couplings very carefully. A new result from the ATLAS Collaboration, released for the Higgs 2020 conference, aims to enrich the Higgs picture by studying its WW* decays.

    One new ATLAS study examines the CP nature of the effective coupling between the Higgs boson and gluons (the mediator particles of the strong force). Until now, the gluon-fusion-induced production of a Higgs boson, in association with two particle jets, had not been studied in a dedicated analysis. The study of this production mechanism is an excellent way to search for signs of CP violation, as it affects the Higgs-boson kinematics, leaving a trace in the azimuthal angle between the jets measured by ATLAS.
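
    The observable at the heart of this measurement is easy to state in code. Below is a minimal Python sketch, not ATLAS analysis code, of the signed azimuthal separation between two jets; the rapidity-ordering convention and the toy jet format are assumptions made for this illustration.

    import math

    def delta_phi_jj(jet1, jet2):
        """Signed azimuthal angle between two jets (toy sketch).

        Each jet is a dict with 'phi' (azimuthal angle, radians) and
        'rap' (rapidity). Ordering the jets in rapidity gives the angle
        a sign; a CP-odd admixture would alter the modulation of its
        distribution.
        """
        if jet1['rap'] < jet2['rap']:   # ordering convention (assumed here)
            jet1, jet2 = jet2, jet1
        dphi = jet1['phi'] - jet2['phi']
        while dphi > math.pi:           # wrap into (-pi, pi]
            dphi -= 2.0 * math.pi
        while dphi <= -math.pi:
            dphi += 2.0 * math.pi
        return dphi

    # Two roughly back-to-back jets in opposite hemispheres:
    print(delta_phi_jj({'phi': 0.3, 'rap': 2.1}, {'phi': -2.6, 'rap': -1.8}))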

    Figure 2: The weighted distribution of the azimuthal angle between two jets in the signal region used in the polarisation measurement. The signal and background yields are determined from the fit. Data-to-simulation ratios are shown at the bottom of the plot. The red histogram represents the measured signal; the shaded areas depict the total uncertainty. Credit: ATLAS Collaboration/CERN.

    Polarization filter

    At high energies, the weak and electromagnetic forces merge into a single electroweak force. Yet at low energies, electromagnetic waves (such as light) can travel an infinite distance, while weak interactions have a finite range. This is because unlike photons (the carriers of the electromagnetic force), W and Z bosons are massive. Their masses originate from interactions with the Higgs field.

    Another difference is that electromagnetic waves are transverse: oscillations in the electromagnetic field occur only in the plane perpendicular to the direction of propagation. W and Z bosons, on the other hand, have both longitudinal and transverse polarisations due to their interactions with the Higgs field. There is a subtle interplay between these longitudinal polarisations and the boson masses that ensures that Standard Model predictions remain finite.

    Should the Higgs boson not be a fundamental scalar particle but instead an entity arising from new dynamics, a different (more complicated) mechanism would have to give mass to the W and Z bosons. In that case, the measured Higgs-boson couplings to electroweak bosons could deviate from the values predicted by the Standard Model.

    The ATLAS Collaboration has released its first study of individual polarization-dependent Higgs-boson couplings to massive electroweak bosons. Specifically, physicists examined the production of Higgs bosons through vector-boson fusion in association with two jets. Just as a polarizing filter helps you take a sharper picture at the seaside by selectively absorbing polarized light, this new ATLAS study investigated individual Higgs-boson couplings to longitudinally and transversely polarized electroweak bosons. Further, as in the study of the Higgs-boson coupling to gluons, the presence of a new mechanism would affect the kinematics of the jets measured by ATLAS.

    Follow those jets!

    The main challenge of these analyses is the rarity of the Higgs-boson events being studied. For the signal selections studied in the new ATLAS result, only about 60 Higgs bosons are observed via gluon fusion and only 30 via vector-boson fusion, while background events are almost a hundred times more abundant. To tackle this challenge, both analyses not only counted events but also examined the shape of the distribution of the azimuthal angle (the angle in the plane transverse to the proton beams) between the two jets. The correlation between these jets has helped resolve properties of Higgs-boson production.

    Researchers used the technique of parameter morphing to interpolate and extrapolate the distribution of this angle from a small set of coupling benchmarks to a large variety of coupling scenarios. The fitted distributions of the azimuthal angle between the jets are shown in Figures 1 and 2.
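
    To illustrate the morphing idea, suppose (as in many effective-coupling analyses) that each bin of the distribution depends quadratically on two couplings, so that three simulated benchmark samples determine it completely. The Python sketch below, with purely invented numbers, then predicts the histogram at any other coupling point as a linear combination of the benchmarks; the actual ATLAS implementation is more elaborate.

    import numpy as np

    def features(c1, c2):
        # Assumed quadratic structure: each bin follows
        # n(c1, c2) = c1^2 * A + c1*c2 * B + c2^2 * C.
        return np.array([c1 ** 2, c1 * c2, c2 ** 2])

    # Benchmark coupling points and their simulated histograms
    # (all numbers invented for illustration).
    benchmarks = [(1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
    hists = np.array([[120.0,  80.0,  40.0],    # histogram at (1, 0)
                      [ 30.0,  50.0,  90.0],    # histogram at (0, 1)
                      [160.0, 150.0, 140.0]])   # histogram at (1, 1)

    M = np.array([features(c1, c2) for c1, c2 in benchmarks])

    def morphed_hist(c1, c2):
        # Morphing weights: express the target coupling point in the
        # benchmark basis, then combine the benchmark histograms.
        w = np.linalg.solve(M.T, features(c1, c2))
        return w @ hists

    print(morphed_hist(0.5, 0.5))   # prediction at an un-simulated point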

    So far, both distributions show no sign of new physics. Once more LHC data is analyzed (these studies only include data collected in 2015 and 2016), the shaded areas in the plots that represent the measurement’s uncertainty should decrease. This will provide an even sharper picture of the Higgs boson.

    Constraints on Higgs boson properties using WW*(→eνμν)jj production in 36.1 fb⁻¹ of √s = 13 TeV proton-proton collisions with the ATLAS detector (ATLAS-CONF-2020-055)

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition


     
  • richardmitnick 1:12 pm on November 5, 2020
    Tags: "Know When to Unfold ’Em- Study Applies Error-Reducing Methods from Particle Physics to Quantum Computing", , , , CERN (CH) ATLAS, , , , , ,   

    From DOE’s Lawrence Berkeley National Laboratory: “Know When to Unfold ’Em: Study Applies Error-Reducing Methods from Particle Physics to Quantum Computing”



    Glenn Roberts Jr.
    geroberts@lbl.gov
    (510) 520-0843

    A wheel-shaped muon detector is part of an ATLAS particle detector upgrade at CERN. A new study applies “unfolding,” or error-correction techniques used for particle detectors, to problems with noise in quantum computing. Credit: Julien Marius Ordan/CERN (CH).

    Borrowing a page from high-energy physics and astronomy textbooks, a team of physicists and computer scientists at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) has successfully adapted and applied a common error-reduction technique to the field of quantum computing.

    In the world of subatomic particles and giant particle detectors, and distant galaxies and giant telescopes, scientists have learned to live, and to work, with uncertainty. They are often trying to tease out ultra-rare particle interactions from a massive tangle of other particle interactions and background “noise” that can complicate their hunt, or trying to filter out the effects of atmospheric distortions and interstellar dust to improve the resolution of astronomical imaging.

    Inherent problems with detectors, such as limits on their ability to record all particle interactions or to measure particles’ energies exactly, can also result in data being misread by the electronics they are connected to. Scientists therefore need to design complex filters, in the form of computer algorithms, to reduce the margin of error and return the most accurate results.

    The problems of noise and physical defects, and the need for error-correction and error-mitigation algorithms that reduce the frequency and severity of errors, are also common in the fledgling field of quantum computing. A study published in the journal npj Quantum Information found that there appear to be some common solutions, too.

    Ben Nachman, a Berkeley Lab physicist who is involved with particle physics experiments at CERN as a member of Berkeley Lab’s ATLAS group, saw the quantum-computing connection while working on a particle physics calculation with Christian Bauer, a Berkeley Lab theoretical physicist who is a co-author of the study. ATLAS is one of the four giant particle detectors at CERN’s Large Hadron Collider, the largest and most powerful particle collider in the world.


    “At ATLAS, we often have to ‘unfold,’ or correct for detector effects,” said Nachman, the study’s lead author. “People have been developing this technique for years.”

    In experiments at the LHC, particles called protons collide at a rate of about 1 billion times per second. To cope with this incredibly busy, “noisy” environment and intrinsic problems related to the energy resolution and other factors associated with detectors, physicists use error-correcting “unfolding” techniques and other filters to winnow down this particle jumble to the most useful, accurate data.

    “We realized that current quantum computers are very noisy, too,” Nachman said, so finding a way to reduce this noise and minimize errors – error mitigation – is a key to advancing quantum computing. “One kind of error is related to the actual operations you do, and one relates to reading out the state of the quantum computer,” he noted – that first kind is known as a gate error, and the latter is called a readout error.

    These charts show the connection between sorted high-energy physics measurements related to particle scattering, called differential cross-section measurements (left), and repeated measurements of outputs from quantum computers (right). These similarities provide an opportunity to apply similar error-mitigation techniques to data from both fields. (Credit: Berkeley Lab; npj Quantum Inf 6, 84 (2020), DOI: 10.1038/s41534-020-00309-7)

    The latest study focuses on a technique to reduce readout errors, called “iterative Bayesian unfolding” (IBU), which is familiar to the high-energy physics community. The study compares the effectiveness of this approach to other error-correction and mitigation techniques. The IBU method is based on Bayes’ theorem, which provides a mathematical way to find the probability of an event occurring when there are other conditions related to this event that are already known.
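
    In outline, IBU repeatedly folds the current estimate of the true spectrum through the response matrix, compares the result with the measured counts, and redistributes the difference using Bayes’ theorem. The Python sketch below, with invented error rates, shows the update rule; the study’s actual implementation and calibration are more involved.

    import numpy as np

    def ibu(measured, R, iterations=10):
        """Iterative Bayesian unfolding (toy sketch).

        R[j, i] is the response matrix: the probability of measuring
        outcome j given true state i. For a quantum computer it would
        be calibrated by preparing each basis state and recording the
        observed counts.
        """
        n_true = R.shape[1]
        t = np.full(n_true, measured.sum() / n_true)   # flat prior
        for _ in range(iterations):
            folded = R @ t                             # expected measurements
            t = t * (R.T @ (measured / folded))        # Bayesian reweighting
        return t

    # Toy 2-state readout: 5% chance of reading 0 as 1, 10% of reading 1 as 0.
    R = np.array([[0.95, 0.10],
                  [0.05, 0.90]])
    true = np.array([700.0, 300.0])
    measured = R @ true          # what the noisy readout would report
    print(ibu(measured, R))      # converges back toward [700, 300]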

    Nachman noted that this technique can be applied to the quantum analog of classical computers, known as universal gate-based quantum computers.

    In quantum computing, which relies on quantum bits, or qubits, to carry information, the fragile state known as quantum superposition is difficult to maintain and can decay over time, causing a qubit to read out a zero instead of a one; this is a common example of a readout error.

    Superposition means that a qubit can represent a zero, a one, or both values at the same time. This enables unique computing capabilities not possible in conventional computing, which relies on bits representing either a one or a zero, but not both at once. Another source of readout error in quantum computers is simply a faulty measurement of a qubit’s state due to the architecture of the computer.
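
    Readout errors like these are often summarized in a response matrix. Assuming, as a simplification, that the errors are uncorrelated between qubits, the full matrix is a tensor product of per-qubit error matrices; the short sketch below uses invented error rates.

    import numpy as np

    def single_qubit_response(p01, p10):
        # p01 = P(read 1 | prepared 0), p10 = P(read 0 | prepared 1).
        return np.array([[1.0 - p01, p10],
                         [p01, 1.0 - p10]])

    # Three qubits with slightly different (invented) error rates:
    R = np.eye(1)
    for p01, p10 in [(0.02, 0.05), (0.03, 0.07), (0.01, 0.04)]:
        R = np.kron(R, single_qubit_response(p01, p10))

    print(R.shape)   # (8, 8): maps 3-qubit true states to measured bitstrings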

    In the study, researchers simulated a quantum computer to compare the performance of three different error-correction (or error-mitigation or unfolding) techniques. They found that the IBU method is more robust in a very noisy, error-prone environment, and slightly outperformed the other two in the presence of more common noise patterns. Its performance was compared to an error-correction method called Ignis that is part of a collection of open-source quantum-computing software development tools developed for IBM’s quantum computers, and a very basic form of unfolding known as the matrix inversion method.

    The researchers used the simulated quantum-computing environment to produce more than 1,000 pseudo-experiments, and they found that the results for the IBU method were the closest to predictions. The noise models used for this analysis were measured on a 20-qubit quantum computer called IBM Q Johannesburg.

    IBM Q Johannesburg at the University of the Witwatersrand (SA)

    “We took a very common technique from high-energy physics, and applied it to quantum computing, and it worked really well – as it should,” Nachman said. There was a steep learning curve. “I had to learn all sorts of things about quantum computing to be sure I knew how to translate this and to implement it on a quantum computer.”

    He said he was also very fortunate to find collaborators for the study with expertise in quantum computing at Berkeley Lab, including Bert de Jong, who leads a DOE Office of Advanced Scientific Computing Research Quantum Algorithms Team and an Accelerated Research for Quantum Computing project in Berkeley Lab’s Computational Research Division.

    “It’s exciting to see how the plethora of knowledge the high-energy physics community has developed to get the most out of noisy experiments can be used to get more out of noisy quantum computers,” de Jong said.

    The simulated and real quantum computers used in the study varied from five qubits to 20 qubits, and the technique should be scalable to larger systems, Nachman said. But the error-correction and error-mitigation techniques that the researchers tested will require more computing resources as the size of quantum computers increases, so Nachman said the team is focused on how to make the methods more manageable for quantum computers with larger qubit arrays.

    Nachman, Bauer, and de Jong also participated in an earlier study [Physical Review A] that proposes a way to reduce gate errors, which is the other major source of quantum-computing errors. They believe that error correction and error mitigation in quantum computing may ultimately require a mix-and-match approach – using a combination of several techniques.

    “It’s an exciting time,” Nachman said, as the field of quantum computing is still young and there is plenty of room for innovation. “People have at least gotten the message about these types of approaches, and there is still room for progress.” He noted that quantum computing provided a “push to think about problems in a new way,” adding, “It has opened up new science potential.”

    The Oak Ridge Leadership Computing Facility, a DOE Office of Science user facility at Oak Ridge National Laboratory, provided the researchers with access to quantum-computing resources at IBM, including the IBM Quantum Experience and Q Hub Network.

    Miroslav Urbanek in Berkeley Lab’s Computational Research Division also participated in the study, which was supported by the U.S. DOE’s Office of Science and the Aspen Center for Physics.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Bringing Science Solutions to the World
    In the world of science, Lawrence Berkeley National Laboratory (Berkeley Lab) is synonymous with “excellence.” Thirteen Nobel prizes are associated with Berkeley Lab. Seventy Lab scientists are members of the National Academy of Sciences (NAS), one of the highest honors for a scientist in the United States. Thirteen of our scientists have won the National Medal of Science, our nation’s highest award for lifetime achievement in fields of scientific research. Eighteen of our engineers have been elected to the National Academy of Engineering, and three of our scientists have been elected into the Institute of Medicine. In addition, Berkeley Lab has trained thousands of university science and engineering students who are advancing technological innovations across the nation and around the world.

    Berkeley Lab is a member of the national laboratory system supported by the U.S. Department of Energy through its Office of Science. It is managed by the University of California (UC) and is charged with conducting unclassified research across a wide range of scientific disciplines. Located on a 202-acre site in the hills above the UC Berkeley campus that offers spectacular views of the San Francisco Bay, Berkeley Lab employs approximately 3,232 scientists, engineers and support staff. The Lab’s total costs for FY 2014 were $785 million. A recent study estimates the Laboratory’s overall economic impact through direct, indirect and induced spending on the nine counties that make up the San Francisco Bay Area to be nearly $700 million annually. The Lab was also responsible for creating 5,600 jobs locally and 12,000 nationally. The overall economic impact on the national economy is estimated at $1.6 billion a year. Technologies developed at Berkeley Lab have generated billions of dollars in revenues, and thousands of jobs. Savings as a result of Berkeley Lab developments in lighting and windows, and other energy-efficient technologies, have also been in the billions of dollars.

    Berkeley Lab was founded in 1931 by Ernest Orlando Lawrence, a UC Berkeley physicist who won the 1939 Nobel Prize in physics for his invention of the cyclotron, a circular particle accelerator that opened the door to high-energy physics. It was Lawrence’s belief that scientific research is best done through teams of individuals with different fields of expertise, working together. His teamwork concept is a Berkeley Lab legacy that continues today.

    A U.S. Department of Energy National Laboratory Operated by the University of California.


     
  • richardmitnick 1:22 pm on October 15, 2020
    Tags: "ATLAS Experiment releases new search for long-lived particles", , , CERN (CH) ATLAS, , , ,   

    From CERN (CH) ATLAS: “ATLAS Experiment releases new search for long-lived particles” 

    October 15, 2020

    The efficiency of reconstructing a lepton from the decay of a long-lived particle, measured in simulated events, shown as a function of the distance between the lepton track and the collision point (d0). The solid blue circles show the efficiency using standard ATLAS reconstruction techniques. The solid purple squares indicate the efficiency using additional tracking for displaced particles and special identification criteria developed for this search. Credit: ATLAS Collaboration/CERN (CH).

    Despite decades of predictive success, the Standard Model of particle physics leaves important phenomena unexplained. Additional theories must exist that can fully describe the universe, even though definitive signatures of particles beyond the Standard Model have yet to turn up.

    Researchers at the ATLAS experiment at CERN are broadening their extensive search program to look for more unusual signatures of unknown physics, such as long-lived particles. These new particles would have lifetimes of 0.01 to 10 ns; for comparison, the Higgs boson has a lifetime of about 10⁻¹³ ns. A theory that naturally motivates long-lived particles is supersymmetry (SUSY). SUSY predicts “superpartner” particles that correspond to the particles of the Standard Model but have different spin properties.
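
    To get a feel for what such lifetimes mean inside a detector: a particle with proper lifetime tau travels on average L = beta * gamma * c * tau in the lab frame before decaying, with individual decay distances exponentially distributed around that mean. A toy Python sketch with illustrative numbers:

    import random

    C_MM_PER_NS = 299.792458   # speed of light in mm per nanosecond

    def decay_lengths(tau_ns, beta_gamma, n=5):
        """Sample lab-frame decay distances (in mm) for a particle with
        proper lifetime tau_ns boosted by beta*gamma (toy sketch)."""
        mean_length = beta_gamma * C_MM_PER_NS * tau_ns
        return [random.expovariate(1.0 / mean_length) for _ in range(n)]

    # A slepton with tau = 0.1 ns and beta*gamma ~ 1 flies ~30 mm on
    # average: still inside the inner detector, but visibly displaced.
    print(decay_lengths(0.1, 1.0))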

    A new search from the ATLAS Collaboration looks for the superpartners of the electron, muon and tau lepton, called “sleptons” (“selectron”, “smuon” and “stau”, respectively). The search considers scenarios where sleptons are produced in pairs and couple only weakly to their decay products, and so become long-lived. In this model, each long-lived slepton would travel some distance (depending on its average lifetime) through the detector before decaying to a Standard Model lepton and a light undetectable particle. Physicists would thus observe two leptons that appear to originate somewhere other than the point where the proton–proton collision occurred.
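
    This displacement is typically quantified by the transverse impact parameter d0 shown in the figure above: the closest distance, in the plane transverse to the beams, between the extrapolated lepton track and the beam line. A simplified Python sketch that neglects track curvature:

    import math

    def d0(x0, y0, phi):
        """Transverse impact parameter of a straight-line track produced
        at (x0, y0), in mm, travelling at azimuthal angle phi: the
        distance of closest approach to the beam line (the origin)."""
        return abs(x0 * math.sin(phi) - y0 * math.cos(phi))

    print(d0(0.0, 0.0, 0.7))     # prompt lepton: d0 = 0
    print(d0(30.0, 10.0, 0.7))   # displaced decay: d0 of about 12 mm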

    Upper limits set by the analysis on the lifetime of possible sleptons as a function of the slepton mass. The solid lines indicate the observed limit, the dotted lines show the limit expected in the case of no statistical fluctuations, and the coloured regions are excluded by the analysis result. The excluded area is smaller for staus than for selectrons and smuons because it depends on the produced Standard Model taus decaying to electrons or muons. The dependence of the limits on the slepton mass stems mostly from the slepton-pair production cross section, which strongly decreases with mass. Credit: ATLAS Collaboration/CERN (CH).

    This unique signature presented a challenge for physicists. Although many theories predict particles that could travel in the ATLAS detector for some time before decaying, typical data reconstruction and analysis is oriented towards new particles that would decay instantaneously, the way heavy Standard Model particles do. ATLAS physicists thus had to develop new methods of identifying particles in order to increase the likelihood of reconstructing these “displaced” leptons. Only displaced electrons and muons were studied in this analysis, but the results could be applied to taus as well, since taus decay promptly into an electron or a muon in around one third of cases.

    Because the particles created by the decay of a long-lived particle would appear away from the collision, unusual background sources can arise: photons mis-identified as electrons, muons that are mis-measured, and poorly measured cosmic-ray muons. Cosmic-ray muons come from high-energy particles colliding with our atmosphere and can traverse the ATLAS detector. Since they do not necessarily pass through the detector near the collision point, they can appear as if originating from a long-lived particle decay. ATLAS physicists have developed techniques not only for reducing these sources’ contributions but also for estimating how much each contributes to the search.

    The analysis did not find any collision events with displaced leptons that passed the selection requirements, a result that is consistent with the low expected background abundance. Using these results, physicists set limits on the slepton mass and lifetime. For the slepton lifetime to which this search is most sensitive (around 0.1 nanoseconds), ATLAS was able to exclude selectrons and smuons up to a mass of around 700 GeV, and staus up to around 350 GeV. The previous best limits on these long-lived particles were around 90 GeV and came from the experiments at the Large Electron–Positron Collider (LEP), CERN’s predecessor to the LHC. This new result is the first to make a statement on this model using LHC data.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition


     