Tagged: HEP

  • richardmitnick 3:16 pm on July 18, 2018 Permalink | Reply
    Tags: HEP, Meenakshi Narain

    From Brown University: Women in STEM – “Brown physicist elected to represent U.S. in Large Hadron Collider experiment” Meenakshi Narain 

    Brown University
    From Brown University

    July 18, 2018
    Kevin Stacey
    kevin_stacey@brown.edu
    401-863-3766

    Meenakshi Narain

    Meenakshi Narain will lead the collaboration board for U.S. institutions participating in the CMS experiment at the Large Hadron Collider, an experiment pushing the frontiers of modern particle physics.

    Brown University physics professor Meenakshi Narain has been tapped to chair the collaboration board of U.S. institutions in the Compact Muon Solenoid (CMS) experiment, one of two large-scale experiments happening at the Large Hadron Collider particle accelerator headquartered in Geneva.

    CERN CMS Higgs Event


    CERN/CMS Detector

    The CMS experiment is an international collaboration of 4,000 particle physicists, engineers, computer scientists, technicians and students from approximately 200 institutes and universities around the world. With more than 1,200 participants, the U.S. CMS collaboration is the largest national group in the global experiment. As collaboration board chair, Narain will represent U.S. institutions within the broader collaboration, as well as with U.S. funding agencies. The board also plays a key role in shaping the vision and direction of the U.S. collaboration.

    “I’m honored that my colleagues from the 50 U.S. institutions that collaborate with the CMS Experiment have chosen me to represent them,” Narain said. “I see this position as an opportunity to help U.S. CMS to become a more inclusive community and to enable all young scientists to contribute to their full potential to CMS and find rewarding career opportunities in academia and industry.”

    Narain and other Brown physicists working with the CMS experiment played key roles in the discovery in 2012 of the Higgs Boson, which at the time was the final missing piece in the Standard Model of particle physics. After the Higgs, the CMS experiment has been searching for particles beyond the Standard Model, including a potential candidate particle for dark matter, the mysterious stuff thought to account for a majority of matter in the universe.

    Narain says part of her job is to maintain the research synergy created by the numerous U.S. scientists and institutions involved in the collaboration as they analyze data from the collider’s latest run. At the same time, the experiment must also prepare for the next stage of the Large Hadron Collider program, slated to start around 2026. The next stage involves beam intensities five times higher than the current level and 10 times more data than has been acquired to date. That will require parts of the CMS detector to be rebuilt.

    “We need the resources to maintain the detector during the current run as well as to start building the upgrades,” Narain said. “I will work with funding agencies to communicate what we’ll need to both maintain our involvement in the data analysis and play a leading role in the upgrade of the detector.”

    Narain says that as the first woman to chair the collaboration board, she plans to work toward cultivating more diversity in what is currently the largest physics collaboration in the U.S.

    “With this comes the opportunity to promote women and other underrepresented minorities to have the opportunity to develop their careers to their fullest potential,” she said. “I hope that I will be able to improve our community in the U.S. and in CMS in general to be more inclusive during my two-year term.”

    Narain joined the Brown faculty in 2007 and has worked at the Large Hadron Collider together with the Brown team that includes professors David Cutts, Ulrich Heintz and Greg Landsberg. She was also a member of the DZero experiment at the Fermi National Accelerator Laboratory, where she played a prominent role in the discoveries of the top quark and the anti-top quark, two fundamental constituents of matter. She is a fellow of the American Physical Society and the author of more than 500 journal articles.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Welcome to Brown

    Brown U Robinson Hall
    Located in historic Providence, Rhode Island and founded in 1764, Brown University is the seventh-oldest college in the United States. Brown is an independent, coeducational Ivy League institution comprising undergraduate and graduate programs, plus the Alpert Medical School, School of Public Health, School of Engineering, and the School of Professional Studies.

    With its talented and motivated student body and accomplished faculty, Brown is a leading research university that maintains a particular commitment to exceptional undergraduate instruction.

    Brown’s vibrant, diverse community consists of 6,000 undergraduates, 2,000 graduate students, 400 medical school students, more than 5,000 summer, visiting and online students, and nearly 700 faculty members. Brown students come from all 50 states and more than 100 countries.

    Undergraduates pursue bachelor’s degrees in more than 70 concentrations, ranging from Egyptology to cognitive neuroscience. Anything’s possible at Brown—the university’s commitment to undergraduate freedom means students must take responsibility as architects of their courses of study.

     
  • richardmitnick 12:18 pm on July 17, 2018 Permalink | Reply
    Tags: HEP

    From Symmetry: “Rise of the machines” 

    Symmetry Mag
    From Symmetry

    07/17/18
    Sarah Charley

    Machine learning will become an even more important tool when scientists upgrade to the High-Luminosity Large Hadron Collider.

    Artwork by Sandbox Studio, Chicago

    When do a few scattered dots become a line? And when does that line become a particle track? For decades, physicists have been asking these kinds of questions. Today, so are their machines.

    Machine learning is the process by which the task of pattern recognition is outsourced to a computer algorithm. Humans are naturally very good at finding and processing patterns. That’s why you can instantly recognize a song from your favorite band, even if you’ve never heard it before.

    Machine learning takes this very human process and puts computing power behind it. Whereas a human might be able to recognize a band based on a variety of attributes such as the vocal tenor of the lead singer, a computer can process other subtle features a human might miss. The music-streaming platform Pandora categorizes every piece of music in terms of 450 different auditory qualities.

    “Machines can handle a lot more information than our brains can,” says Eduardo Rodrigues, a physicist at the University of Cincinnati. “It’s why they can find patterns that are sometimes invisible to us.”

    Machine learning started to become commonplace in computing during the 1980s, and LHC physicists have been using it routinely to help to manage and process raw data since 2012. Now, with upgrades to what is already the world’s most powerful particle accelerator looming on the horizon, physicists are implementing new applications of machine learning to help them with the imminent data deluge.

    “The high-luminosity upgrade to the LHC is going to increase our amount of data by a factor of 100 relative to that used to discover the Higgs,” says Peter Elmer, a physicist at Princeton University. “This will help us search for rare particles and new physics, but if we’re not prepared, we risk being completely swamped with data.”

    Only a small fraction of the LHC’s collisions are interesting to scientists. For instance, Higgs bosons are born in only about one out of every 2 billion proton-proton collisions. Machine learning is helping scientists to sort through the noise and isolate what’s truly important.

    “It’s like mining for rare gems,” Rodrigues says. “Keeping all the sand and pebbles would be ridiculous, so we use algorithms to help us single out the things that look interesting. With machine learning, we can purify the sample even further and more efficiently.”

    LHC physicists use a kind of machine learning called supervised learning. The principle behind supervised learning is nothing new; in fact, it’s how most of us learn how to read and write. Physicists start by training their machine-learning algorithms with data from collisions that are already well-understood. They tell them, “This is what a Higgs looks like. This is what a particle with a bottom quark looks like.”

    After giving an algorithm all of the information they already know about hundreds of examples, physicists then pull back and task the computer with identifying the particles in collisions without labels. They monitor how well the algorithm performs and give corrections along the way. Eventually, the computer needs only minimal guidance and can become even better than humans at analyzing the data.
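
    In code, the train-then-classify pattern described above can be sketched in a few lines. The following Python example (using scikit-learn, with invented feature arrays standing in for real collision data) is only an illustration, not any experiment’s actual pipeline.

    # Minimal sketch of supervised learning: train on labelled "well-understood"
    # events, then let the algorithm label events it has never seen.
    # All features and labels here are invented stand-ins, not real LHC data.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Each row is one event, each column a reconstructed quantity
    # (e.g. an invariant mass or a jet energy); y says what kind of event it was.
    n_events, n_features = 10_000, 5
    X = rng.normal(size=(n_events, n_features))
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n_events) > 0).astype(int)

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

    # "Training": show the algorithm labelled examples.
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X_train, y_train)

    # "Pulling back": let it classify unseen events and monitor how well it does.
    print(f"accuracy on unseen events: {clf.score(X_test, y_test):.3f}")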

    “This is saving the LHCb experiment a huge amount of time,” Rodrigues says. “In the past, we needed months to make sense of our raw detector data. With machine learning, we can now process and label events within the first few hours after we record them.”

    Not only is machine learning helping physicists understand their real data, but it will soon help them create simulations to test their predictions from theory as well.

    Using conventional algorithms rather than machine learning, scientists have created virtual versions of their detectors with all the known laws of physics pre-programmed.

    “The virtual experiment follows the known laws of physics to a T,” Elmer says. “We simulate proton-proton collisions and then predict how the byproducts will interact with every part of our detector.”

    If scientists find a consistent discrepancy between the virtual data generated by their simulations and the real data recorded by their detectors, it could mean that the particles in the real world are playing by a different set of rules than the ones physicists already know.

    A weakness of scientists’ current simulations is that they’re too slow. They use a series of algorithms to precisely calculate how a particle will interact with every detector part it bumps into while moving through the many layers of a particle detector.

    Even though it takes only a few minutes to simulate a collision this way, scientists need to simulate trillions of collisions to cover the possible outcomes of the 600 million collisions per second they will record with the HL-LHC.

    “We don’t have the time or resources for that,” Elmer says.

    With machine learning, on the other hand, they can generalize. Instead of calculating every single particle interaction with matter along the way, they can estimate its overall behavior based on its typical paths through the detector.

    “It’s a matter of balancing quality with quantity,” Elmer says. “We’ll still use the very precise calculations for some studies. But for others, we don’t need such high-resolution simulations for the physics we want to do.”
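
    As a toy illustration of that trade-off, the Python sketch below trains a fast surrogate on the output of a slow, layer-by-layer calculation and then estimates the overall response in one shot. The “precise” simulation, the features and all numbers are invented for illustration.

    # Toy contrast between a step-by-step "precise" simulation and a learned
    # surrogate that estimates the overall detector response (purely illustrative).
    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor

    rng = np.random.default_rng(1)

    def precise_simulation(energy, n_layers=50):
        """Stand-in for a slow, layer-by-layer calculation."""
        deposited = 0.0
        remaining = energy
        for _ in range(n_layers):
            frac = rng.uniform(0.02, 0.08)     # pretend interaction per layer
            deposited += remaining * frac
            remaining -= remaining * frac
        return deposited

    # Generate training data with the slow simulation once...
    energies = rng.uniform(10, 1000, size=5000)
    responses = np.array([precise_simulation(e) for e in energies])

    # ...then train a fast surrogate that generalises the overall behaviour.
    surrogate = GradientBoostingRegressor().fit(energies.reshape(-1, 1), responses)

    # The surrogate answers directly what the loop computes step by step.
    print(surrogate.predict(np.array([[250.0]])))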

    Machine learning is helping scientists process more data faster. With the planned upgrades to the LHC, it could play an even larger role in the future. But it is not a silver bullet, Elmer says.

    “We still want to understand why and how all of our analyses work so that we can be completely confident in the results they produce,” he says. “We’ll always need a balance between shiny new technologies and our more traditional analysis techniques.”

    See the full article here .



    Please help promote STEM in your local schools.

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 3:49 pm on July 16, 2018 Permalink | Reply
    Tags: Computing for CERN, HEP

    From Symmetry: “The LHC’s computing revolution” 

    Symmetry Mag
    From Symmetry

    Artwork by Sandbox Studio, Chicago

    07/16/18
    Sarah Charley

    Making new discoveries may require writing new software first.

    Scientists conceived of the Large Hadron Collider and its experiments in 1992. Back then, Apple was starting to figure out the laptop computer, and a CERN fellow named Tim Berners-Lee had just released the code for a pet project called the World Wide Web.

    LHC

    CERN map


    CERN LHC Tunnel

    CERN LHC particles

    Apple’s original iBook

    Tim Berners-Lee. CERN

    “In the days when we started, there was no Google. There was no Facebook. There was no Twitter. All of these companies that tackle big data did not exist,” says Graeme Stewart, a CERN software specialist working on the ATLAS experiment. “Big data didn’t exist.”

    The LHC experiments grew up with computing and have been remarkable in their ability to adapt to the evolving technology. Over the last 15 years, researchers have written more than 20 million lines of code that govern everything from data acquisition to final analysis. But physicists are anxious that the continually accumulating code has begun to pose a problem.

    “This software is not sustainable,” says Peter Elmer, a physicist at Princeton. “Many of the original authors have left physics. Given the complex future demands on the software, it will be very difficult to evolve.”

    Back when Stewart and his computer engineering colleagues were designing the computing structure for the LHC research program, they were focused on making their machines perform a single task faster and faster.

    “And then in the mid-2000s, hardware manufacturers hit a wall, and it was impossible to get a computer to do one single thing any more quickly,” Stewart says, “so instead they started to do something which we call concurrency: the ability of a computer to do more than one thing at the same time. And that was sort of unfortunate timing for us. If it had happened five or 10 years earlier, we would have built that concurrent paradigm into the software framework for the LHC startup, but it came a little bit too late.”

    Thanks to concurrency, today’s personal laptops can perform roughly four tasks at the same time, and the processors in CERN’s computing clusters can perform around 30 tasks at once. But graphics cards—such as the GPUs used in gaming—are now able to process up to 500 tasks at once.
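
    The concurrency idea itself is easy to demonstrate outside of physics software. The Python sketch below (the event-processing function and the event counts are made up) runs the same stand-in workload serially and then in parallel across all available cores with a process pool.

    # Small illustration of concurrency: process many "events" at once instead of
    # one after another. The work function and numbers are made up.
    from concurrent.futures import ProcessPoolExecutor
    import math

    def process_event(seed: int) -> float:
        """Stand-in for reconstructing one collision event."""
        return sum(math.sqrt(i) for i in range(seed % 1000 + 1000))

    if __name__ == "__main__":
        events = range(10_000)

        # Serial: one task at a time (the pre-mid-2000s model described above).
        serial = [process_event(e) for e in events]

        # Concurrent: spread the same tasks over all available cores.
        with ProcessPoolExecutor() as pool:
            parallel = list(pool.map(process_event, events, chunksize=100))

        assert serial == parallel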

    “It’s critical that we take advantage of these new architectures to get the most out of the LHC research program,” Stewart says. “At the same time, adapting to that kind of hardware is a tremendous challenge.”

    The experiments will need these hardware advancements. In eight years, a turbocharged version of the LHC will turn on with a proton beam four times more intense than it is today. This transformation will provide scientists with the huge volume of data they need to search for new physics and study rare processes. But according to Stewart, today’s software won’t be able to handle it.

    “The volume of data anticipated jumps by an order of magnitude, and the complexity goes up by an order of magnitude,” he says. “Those are tremendous computing challenges, and the best way of succeeding is if we work in common.”

    Stewart and Elmer are part of a huge community initiative that is planning how they will meet the enormous computing challenges of the four big LHC experiments and prepare the program for another two decades of intensive data collection.

    According to a white paper recently published by the High Energy Physics Software Foundation, the software and computing power will be the biggest limiting factor to the amount of data the LHC experiments can collect and process, and so “the physics reach during HL-LHC will be limited by how efficiently these resources can be used.”

    So the HEP Software Foundation has set out to adapt the LHC software to modern computing hardware so that the entire system can run more effectively and efficiently. “It’s like engineering a car,” Stewart says. “You might design something with really great tires, but if it doesn’t fit the axle, then the final result will not work very well.”

    Instead of building custom solutions for each experiment—which would be time-consuming and costly—the community is coming together to identify where their computing needs overlap.

    “Ninety percent of what we do is the same, so if we can develop a common system which all the experiments can use, that saves us a lot on time and computing resources,” Stewart says. “We’re creating tool kits and libraries that protect the average physicist from the complexity of the hardware and give them good signposts and guidelines as to how they actually write their code and integrate it into the larger system.”

    These incremental changes will gradually modernize LHC computing and help maintain continuity with all the earlier work. They will also enable the system to remain flexible and adaptable to future advancements in computing.

    “The discovery of the Higgs is behind us,” says Elmer. “The game is changing, and we need to be prepared.”

    See the full article here .



    Please help promote STEM in your local schools.

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 11:47 am on July 11, 2018 Permalink | Reply
    Tags: HEP, MICE experiment

    From CERN Courier: “Muons cooled for action” 


    From CERN Courier

    9 July 2018
    Manuela Boscolo, INFN-LNF
    Patrick Huber, Virginia Tech
    Kenneth Long, Imperial College London and STFC.

    The recent demonstration of muon ionisation-cooling by the MICE collaboration opens a path to a neutrino factory and muon collider.

    Rutherford Appleton Lab Muon Ionization Cooling Experiment (or MICE) is a high energy physics experiment

    Fundamental insights into the constituents of matter have been gained by observing what happens when beams of high-energy particles collide. Electron–positron, proton–proton, proton–antiproton and electron–proton colliders have all contributed to the development of today’s understanding, embodied in the Standard Model of particle physics (SM). The Large Hadron Collider (LHC) brings 6.5 TeV proton beams into collision, allowing the Higgs boson and other SM particles to be studied and searches for new physics to be carried out. To reach physics beyond the LHC will require hadronic colliders at higher energies and/or lepton colliders that can deliver substantially increased precision.

    A variety of options are being explored to achieve these goals. For example, the Future Circular Collider study at CERN is investigating a 100 km-circumference proton–proton collider with beam energies of around 50 TeV, the tunnel for which could also host an electron–positron collider (CERN Courier June 2018 p15).

    CERN FCC Future Circular Collider map

    Electron–positron annihilation has the advantage that all of the beam energy is available in the collision, rather than being shared between the constituent quarks and gluons as it is in hadronic collisions. But to reach very high energies requires either a state-of-the-art linear accelerator, such as the proposed Compact Linear Collider or the International Linear Collider, or a circular accelerator with an extremely large bending radius.

    Cern Compact Linear Collider


    CLIC Collider annotated

    ILC schematic, being planned for the Kitakami highland, in the Iwate prefecture of northern Japan

    Muons to the fore

    A colliding-beam facility based on muons has a number of advantages. First, since the muon is a lepton, all of the beam energy is available in the collision. Second, since the muon is roughly 200 times heavier than the electron and thus emits around 10⁹ times less synchrotron radiation than an electron beam of the same energy, it is possible to produce multi-TeV collisions in an LHC-sized circular collider. The large muon mass also enhances the direct “s-channel” Higgs-production rate by a factor of around 40,000 compared to that in electron–positron colliders, making it possible to scan the centre-of-mass energy to measure the Higgs-boson line shape directly and to search for closely spaced states.
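
    Both numbers quoted above follow from standard scaling arguments – synchrotron power at fixed energy and ring radius falls as the fourth power of the particle mass, while the s-channel Higgs coupling grows as the square of the lepton mass – and can be checked on the back of an envelope (approximate masses; a rough estimate only):

    # Back-of-the-envelope check of the figures quoted above, using standard
    # scalings: synchrotron power ~ 1/m^4 (fixed energy and ring), s-channel
    # Higgs coupling ~ m^2. Masses are approximate values in MeV.
    m_e, m_mu = 0.511, 105.66
    ratio = m_mu / m_e                                      # ~207

    print(f"synchrotron suppression ~ {ratio**4:.2e}")      # ~1.8e9  ("around 10^9")
    print(f"s-channel Higgs enhancement ~ {ratio**2:.0f}")  # ~43,000 ("around 40,000")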


    Stored muon beams could also serve the long-term needs of neutrino physicists (see box 1). In a neutrino factory, beams of electron and muon neutrinos are produced from the decay of muons circulating in a storage ring. It is straightforward to tune the neutrino-beam energy because the neutrinos carry away a substantial fraction of the muon’s energy. This, combined with the excellent knowledge of the beam composition and energy spectrum resulting from the very well-known characteristics of muon decays, makes the neutrino factory the ideal place to make precision measurements of neutrino properties and to look for oscillation phenomena that are outside the standard, three-neutrino-mixing paradigm.

    Given the many benefits of a muon collider or neutrino factory, it is reasonable to ask why one has yet to be built. The answer is that muons are unstable, decaying with a mean lifetime at rest of 2.2 microseconds. This presents two main challenges: first, a high-intensity primary beam must be used to create the muons that will form the beam; and, second, once captured, the muon beam must be accelerated rapidly to high energy so that the effective lifetime of the muon can be extended by the relativistic effect of time dilation.
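
    The time-dilation point can be made concrete with one line of arithmetic: the lab-frame lifetime is γ times the 2.2 μs rest-frame lifetime. The beam energy in this sketch is an arbitrary example, not a design value.

    # Relativistic time dilation of the muon lifetime: tau_lab = gamma * tau_rest.
    # The example beam energy is arbitrary.
    tau_rest = 2.2e-6          # s, muon lifetime at rest
    m_mu = 0.10566             # GeV, muon mass
    E_beam = 50.0              # GeV, an arbitrary example energy

    gamma = E_beam / m_mu
    print(f"gamma = {gamma:.0f}, lab-frame lifetime = {gamma * tau_rest * 1e6:.0f} microseconds")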

    One way to produce beams for a muon collider or neutrino factory is to harness the muons produced from the decay of pions when a high-power (few-MW), multi-GeV proton beam strikes a target such as carbon or mercury. For this approach, new proton accelerators with the required performance are being developed at CERN, Fermilab, J-PARC and at the European Spallation Source.

    ESS European Spallation Source, currently under construction in Lund, Sweden.

    The principle of the mercury target was proved by the MERIT experiment that operated on the Proton Synchrotron at CERN. However, at the point of production, the tertiary muon beam emerging from such schemes occupies a large volume in phase space. To maximise the muon yield, the beam has to be “cooled” – i.e. its phase-space volume reduced – in a short period of time before it is accelerated.


    The proposed solution is called ionisation cooling, which involves passing the beam through a material in which it loses energy via ionisation and then re-accelerating it in the longitudinal direction to replace the lost energy. Proving the principle of this technique is the goal of the Muon Ionization Cooling Experiment (MICE) collaboration, which, following a long period of development, has now reported its first observation of ionisation cooling.

    An alternative path to a muon collider called the Low Emittance Muon Accelerator (LEMMA), recently proposed by accelerator physicists at INFN in Italy and the ESRF in France, provides a naturally cooled muon beam with a long lifetime in the laboratory by capturing muon–antimuon pairs created in electron–positron annihilation.

    Cool beginnings

    The benefits of a collider based on stored muon beams were first recognised by Budker and Tikhonin at the end of the 1960s. In 1974, when CERN’s Super Proton Synchrotron (SPS) was being brought into operation, Koshkarev and Globenko showed how muons confined within a racetrack-shaped storage ring could be used to provide intense neutrino beams. The following year, the SPS proton beam was identified as a potential muon source and the basic parameters of the muon beam, storage ring and neutrino beam were defined.

    The Super Proton Synchrotron (SPS), CERN’s second-largest accelerator. (Image: Julien Ordan/CERN)

    It was quickly recognised that the performance of this facility – the first neutrino factory to be proposed – could be enhanced if the muon beam was cooled. In 1978, Budker and Skrinsky identified ionisation cooling as a technique that could produce sufficient cooling in a timeframe short compared to the muon lifetime and, the following year, Neuffer proposed a muon collider that exploited ionisation cooling to increase the luminosity.

    The study of intense, low-emittance muon beams as the basis of a muon collider and/or neutrino factory was re-initiated in the 1990s, first in the US and then in Europe and Japan. Initial studies of muon production and capture, phase-space manipulation, cooling and acceleration were carried out and neutrino- and energy-frontier physics opportunities evaluated. The reduction of the tertiary muon-beam phase space was recognised as a key technological challenge and at the 2001 NuFact workshop the international MICE collaboration was created, comprising 136 physicists and engineers from 40 institutes in Asia, Europe and the US.


    The MICE cooling cell, in common with the cooling channels studied since the seminal work of the 1990s, is designed to operate at a beam momentum of around 200 MeV/c. This choice is a compromise between the size of the ionisation-cooling effect and its dependence on the muon energy, the loss rate of muon-beam intensity through decay, and the ease of acceleration following the cooling channel. The ideal absorber has, at the same time, a large ionisation energy loss per unit length (to maximise ionisation cooling) and a large radiation length (to minimise heating through multiple Coulomb scattering). Liquid hydrogen meets these requirements and is an excellent absorber material; a close runner-up, with the practical advantage of being solid, is lithium hydride. MICE was designed to study the properties of both. The critical challenges faced by the collaboration therefore included: the integration of high-field superconducting magnets operating in a magnetically coupled lattice; high-gradient accelerating cavities capable of operation in a strong magnetic field; and the safe implementation of liquid-hydrogen absorber modules – all solved through more than a decade of R&D.

    In 2003 the MICE collaboration submitted a proposal to mount the experiment (figure 1) on a new beamline at the ISIS proton and muon source at the Science and Technology Facilities Council’s (STFC) Rutherford Appleton Laboratory in the UK. Construction began in 2005 and first beam was delivered on 29 March 2008. The detailed design of the spectrometer solenoids was also carried out at this time and the procurement process was started. During the period from 2008 to 2012, the collaboration carried out detailed studies of the properties of the beam delivered to the experiment and, in parallel, designed and fabricated the focus-coil magnets and a first coupling coil.


    Delays were incurred in addressing issues that arose in the manufacture of the spectrometer solenoids. This, combined with the challenges of integrating the four-cavity linac module with the coupling coil, led, in November 2014, to a reconfiguration of the MICE cooling cell. The simplified experiment required two single-cavity modules, and beam transport was provided by the focus-coil modules. An intense period of construction followed, culminating with the installation of the spectrometer solenoids and the focus-coil module in the summer of 2015. Magnet commissioning progressed well until, a couple of months later, a coil in the downstream solenoid failed during a training quench. The modular design of the apparatus meant the collaboration was able to devise new settings rapidly, but it proved not to be possible to restore the downstream spectrometer magnet to full functionality. This, combined with the additional delays incurred in the recovery of the magnet, eventually led to the cancellation of the installation of the RF cavities in favour of the extended operation of a configuration of the experiment without the cavities.

    It is interesting to reflect, as was done in a recent lessons-learnt exercise convened by the STFC, whether a robust evaluation of alternative options for the cooling-demonstration lattice at the outset of MICE might have identified the simplified lattice as a “less-risky” option and allowed some of the delays in implementing the experiment to be avoided.


    The bulk of the data-taking for MICE was carried out between November 2015 and December 2017, using lithium-hydride and liquid-hydrogen absorbers. The campaign was successful: more than 5 × 10⁸ triggers were collected over a range of initial beam momentum and emittance for a variety of configurations of the magnetic channel for each absorber material. The key parameter to measure when demonstrating ionisation cooling is the “amplitude” of each muon – the distance from the beam centre in transverse phase space, reconstructed from its position and momentum. The muon’s amplitude is measured before it enters the absorber and again as it leaves, and the distributions of amplitudes are then examined for evidence of cooling: a net migration of muons from high to low amplitudes. As can be seen (figure 2), the particle density in the core of the MICE beam is increased as a result of the beam’s passage through the absorber, leading to a lower transverse emittance and thereby providing a higher neutrino flux or a larger luminosity.
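
    As a rough illustration of the amplitude analysis described above, the Python sketch below treats the single-particle amplitude as a Mahalanobis-like distance of each muon’s transverse phase-space coordinates (x, px, y, py) from the beam centroid, scaled by the measured emittance. This is a simplification of the collaboration’s actual definition, and the beams here are invented Gaussian toys; cooling then shows up as a larger fraction of muons below a fixed amplitude cut downstream of the absorber.

    # Hedged sketch of the "amplitude" idea: a Mahalanobis-like distance of each
    # muon's (x, px, y, py) from the beam centroid, scaled by the emittance.
    # This simplifies the collaboration's actual definition; data are invented.
    import numpy as np

    def amplitudes(u):
        """u: (n_muons, 4) array of (x, px, y, py). Returns one amplitude per muon."""
        centred = u - u.mean(axis=0)
        cov = np.cov(centred, rowvar=False)
        emittance = np.linalg.det(cov) ** 0.25   # 4D transverse emittance (up to constants)
        chi2 = np.einsum("ij,jk,ik->i", centred, np.linalg.inv(cov), centred)
        return emittance * chi2

    rng = np.random.default_rng(2)
    upstream = rng.normal(scale=1.0, size=(100_000, 4))    # beam before the absorber (toy)
    downstream = rng.normal(scale=0.9, size=(100_000, 4))  # slightly "cooled" beam (toy)

    # Cooling shows up as migration of muons from high to low amplitude:
    cut = 1.0
    print((amplitudes(upstream) < cut).mean(), (amplitudes(downstream) < cut).mean())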

    The MICE observation of the ionisation-cooling of muon beams is an important breakthrough, achieved through the creativity and tenacity of the collaboration and the continuous support of the funding agencies and host laboratory. The results match expectations, and the next step would be to design an experiment to demonstrate cooling in all six phase-space dimensions.

    Completing the MICE programme

    Having completed its experimental programme, MICE will now focus on the detailed analysis of the factors that determine ionisation-cooling performance over a range of momentum, initial emittance and lattice configurations for both liquid-hydrogen and lithium-hydride absorbers. MICE was operated such that data were recorded one particle at a time. This single-particle technique will allow the collaboration to study the impact of transverse-emittance growth in rapidly varying magnetic fields and to devise mechanisms to mitigate such effects. Furthermore, MICE has taken data to explore a scheme in which a wedge-shaped absorber is used to decrease the beam’s longitudinal emittance while allowing a controlled growth in its transverse emittance. This is required for a proton-based muon collider to reach the highest luminosities.

    With the MICE observation of ionisation cooling, the last of the proof-of-principle demonstrations of the novel technologies that underpin a proton-based neutrino factory or muon collider has now been delivered. The drive to produce lepton–antilepton collisions at centre-of-mass energies in the multi-TeV range can now include consideration of the muon collider, for which two routes are offered: one, for which the R&D is well advanced, that exploits muons produced using a high-power proton beam and which requires ionisation cooling; and one that exploits positron annihilation with electrons at rest to create a high-energy cold muon source. The high muon flux that can be achieved using the proton-based technique has the potential to serve a neutrino-physics programme of unprecedented sensitivity, and the MICE collaboration’s timely results will inform the coming update of the European Strategy for Particle Physics.

    See the full article here .


    Please help promote STEM in your local schools.


    Stem Education Coalition

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS
    CERN ATLAS New

    ALICE
    CERN ALICE New

    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    LHC

    CERN map

    CERN LHC Grand Tunnel

    CERN LHC particles

     
  • richardmitnick 9:23 am on July 10, 2018 Permalink | Reply
    Tags: HEP, Higgs boson observed decaying to b quarks – at last!

    From CERN ATLAS: “Higgs boson observed decaying to b quarks – at last!” 

    CERN ATLAS Higgs Event

    CERN/ATLAS
    From CERN ATLAS

    9th July 2018

    The Brout-Englert-Higgs mechanism solves the apparent theoretical impossibility of the weak vector bosons (W and Z) having mass. The discovery of the Higgs boson in 2012 via its decays into photon, Z and W pairs was a triumph of the Standard Model built upon this mechanism. The Higgs field can also be used in an elegant way to provide mass to charged fermions (quarks and leptons) through interactions involving “Yukawa couplings”, with strength proportional to the particle mass. The observation of the Higgs boson decaying into pairs of τ leptons provided the first direct evidence of this type of interaction.

    Six years after its discovery, ATLAS has observed about 30% of the Higgs boson decays predicted in the Standard Model. However, the favoured decay of the Higgs boson into a pair of b quarks (H→bb), which is expected to account for almost 60% of all possible decays, had remained elusive up to now. Observing this decay mode and measuring its rate is a mandatory step to confirm (or not…) the mass generation for fermions via Yukawa interactions, as predicted in the Standard Model.

    Today, at the 2018 International Conference on High Energy Physics (ICHEP) in Seoul, the ATLAS experiment reported a preliminary result establishing the observation of the Higgs boson decaying into pairs of b quarks, furthermore at a rate consistent with the Standard Model prediction. In the community of particle physics (and beyond), for the detection of a process to be qualified as an “observation”, it is necessary to exclude at a level of one in three million the probability that it arises from a fluctuation of the background that could mimic the process in question. When such a probability is at the level of only one in a thousand, the detection is qualified as “evidence”. Evidence of the H→bb decay was first provided at the Tevatron in 2012, and a year ago by the ATLAS and CMS Collaborations, independently.
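
    For readers wondering where “one in three million” and “one in a thousand” come from: they are the tail probabilities of a one-sided Gaussian at the conventional 5σ (observation) and roughly 3σ (evidence) thresholds, which a quick scipy check reproduces:

    # One-sided Gaussian tail probabilities behind the observation/evidence thresholds.
    from scipy.stats import norm

    print(norm.sf(5))          # ~2.9e-7: roughly one in three million
    print(norm.sf(3))          # ~1.3e-3: roughly one in a thousand
    print(norm.isf(1 / 3e6))   # ~5.0 sigma
    print(norm.isf(1e-3))      # ~3.1 sigma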

    FNAL/Tevatron map

    FNAL/Tevatron

    CERN/CMS Detector


    CERN CMS Higgs Event

    Combing through the haystack of b quarks

    Given the abundance of the H→bb decay, and given that much rarer decay modes such as H→γγ had already been observed at the time of discovery, why did it take so long to achieve this observation?

    The main reason: the most copious production process for the Higgs boson in proton-proton interactions leads to just a pair of particle jets originating from the fragmentation of b quarks (b-jets). These are almost impossible to distinguish from the overwhelming background of b-quark pairs produced via the strong interaction (quantum chromodynamics or QCD). To overcome this challenge, it was necessary to consider production processes that are less copious, but exhibit features not present in QCD. The most effective of these is the associated production of the Higgs boson with a vector boson, W or Z. The leptonic decays, W→ℓν, Z→ℓℓ and Z→νν (where ℓ stands for an electron or a muon) provide signatures that allow for efficient triggering and powerful QCD background reduction.

    However, the Higgs boson signal remains orders of magnitude smaller than the remaining backgrounds arising from top quark or vector boson production, which lead to similar signatures. For instance, a top quark pair can decay as tt→[(W→ℓν)b][(W→qq)b] with a final state containing an electron or a muon and two b quarks, exactly as the (W→ℓν)(H→bb) signal.

    The main handle to discriminate the signal from such backgrounds is the invariant mass, mbb, of pairs of b-jets identified by sophisticated “b-tagging” algorithms. An example of such a mass distribution is shown in Figure 1, where the sum of the signal and background components is compared with the data.

    Figure 1: Distribution of mbb in the (W→ℓν)(H→bb) search channel. The signal is shown in red, the different backgrounds in various colours. The data are shown as points with error bars. (Image: ATLAS Collaboration/CERN)
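
    For readers unfamiliar with the quantity, an invariant mass such as mbb is built from the summed four-momenta of the two b-jets. The Python sketch below shows the computation on two invented jet four-momenta; the numbers are purely illustrative.

    # How an invariant mass like m_bb is formed from two jets' four-momenta
    # (E, px, py, pz). The jet values here are invented for illustration.
    import math

    def invariant_mass(j1, j2):
        E = j1[0] + j2[0]
        px, py, pz = (j1[i] + j2[i] for i in (1, 2, 3))
        return math.sqrt(max(E**2 - px**2 - py**2 - pz**2, 0.0))

    b_jet_1 = (85.0,  60.0, 30.0, 50.0)   # GeV, made-up four-momentum
    b_jet_2 = (70.0, -40.0, 20.0, 45.0)   # GeV, made-up four-momentum
    print(f"m_bb = {invariant_mass(b_jet_1, b_jet_2):.1f} GeV")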

    When all WH and ZH channels are combined and the backgrounds (apart from WZ and ZZ production) subtracted from the data, the distribution shown in Figure 2 exhibits a clear peak arising from Z boson decays to b-quark pairs, which validates the analysis procedure. The shoulder on the upper side is consistent in shape and rate with the expectation from Higgs boson production.

    Figure 2: Distribution of mbb from all search channels combined after subtraction of all backgrounds except for WZ and ZZ production. The data (points with error bars) are compared to the expectations from the production of WZ and ZZ (in grey) and of WH and ZH (in red). (Image: ATLAS Collaboration/CERN)

    This is, however, not sufficient to reach the level of detection that can be qualified as observation. To this end, the mass of the b-jet pair is combined with other kinematic variables that show distinct differences between the signal and the various backgrounds, for instance the angular separation between the two b-jets, or the transverse momentum of the associated vector boson. This combination of multiple variables is performed using the technique of boosted decision trees (BDTs). A combination of the BDT outputs from all channels, reordered in terms of signal-to-background ratio, is shown in Figure 3. It can be seen that the signal closely follows the distribution expected from the Standard Model. The BDT outputs are subjected to a sophisticated statistical analysis to extract the “significance” of the signal. This is another way to measure the probability of a fake observation in terms of standard deviations, σ, of a Gaussian distribution. The magic number corresponding to the observation of a signal is 5σ.

    Figure 3: Distribution showing the combination of all BDT outputs reordered in terms of log(S/B), where S and B are the signal and background yields, respectively. The signal is shown in red, and the different backgrounds in various colours. The data are shown as points with error bars. The lower panel shows the “pull”, i.e. the ratio of data minus background to the statistical uncertainty of the background. (Image: ATLAS Collaboration/CERN)
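
    The multivariate step can be sketched in a few lines of Python. Below, scikit-learn’s GradientBoostingClassifier stands in for the experiment’s BDT, and the three discriminating variables and all event samples are invented placeholders, not the ATLAS analysis inputs.

    # Sketch of the multivariate step: a boosted decision tree turns several
    # discriminating variables (e.g. m_bb, the angular separation of the b-jets,
    # the vector-boson transverse momentum) into one score per event.
    # Data and variable choices are placeholders, not the ATLAS analysis.
    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier

    rng = np.random.default_rng(3)
    n_sig, n_bkg = 5_000, 50_000

    # Columns: m_bb [GeV], delta_R(b, b), pT of the vector boson [GeV] (all invented).
    signal = np.column_stack([rng.normal(125, 15, n_sig),
                              rng.normal(1.5, 0.4, n_sig),
                              rng.normal(180, 60, n_sig)])
    background = np.column_stack([rng.exponential(80, n_bkg) + 20,
                                  rng.normal(2.5, 0.8, n_bkg),
                                  rng.exponential(90, n_bkg)])

    X = np.vstack([signal, background])
    y = np.concatenate([np.ones(n_sig), np.zeros(n_bkg)])

    bdt = GradientBoostingClassifier(n_estimators=200, max_depth=3).fit(X, y)

    # Events are then ordered by this score; the signal piles up at high values.
    scores = bdt.predict_proba(X)[:, 1]
    print(f"mean score, signal: {scores[:n_sig].mean():.2f}  background: {scores[n_sig:].mean():.2f}")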

    Observation achieved!

    The analysis of 13 TeV data collected by ATLAS during Run 2 of the LHC in 2015, 2016 and 2017 leads to a significance of 4.9σ – alone almost sufficient to claim observation. This result was combined with those from a similar analysis of Run 1 data and from other searches by ATLAS for the H→bb decay mode, namely where the Higgs boson is produced in association with a top quark pair or via a process known as vector boson fusion (VBF). The significance achieved by this combination is 5.4σ.

    Furthermore, combining the present analysis with others that target Higgs boson decays to pairs of photons and Z bosons measured at 13 TeV provides the observation at 5.3σ of associated VH (V = Z or W) production, in agreement with the Standard Model prediction. All four primary Higgs boson production modes at hadron colliders have now been observed, of which two only this year. In order of discovery: (1) fusion of gluons to a Higgs boson, (2) fusion of weak bosons to a Higgs boson, (3) associated production of a Higgs boson with two top quarks, and (4) associated production of a Higgs boson with a weak boson.

    With these observations, a new era of detailed measurements in the Higgs sector opens up, through which the Standard Model will be further challenged.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    CERN map


    CERN LHC Grand Tunnel
    CERN LHC particles
    LHC at CERN


    CERN Courier

    QuantumDiaries


    Quantum Diaries

     
  • richardmitnick 8:42 am on July 10, 2018 Permalink | Reply
    Tags: Combined measurements of Higgs boson couplings reach new level of precision, HEP

    From CERN ATLAS: “Combined measurements of Higgs boson couplings reach new level of precision” 

    CERN ATLAS Higgs Event

    CERN/ATLAS
    From CERN ATLAS

    9th July 2018

    Figure 1: Measured cross-sections of main Higgs boson production modes at the LHC, namely gluon-gluon fusion (ggF), weak boson fusion (VBF), associated production with a weak vector boson W or Z (WH and ZH), and associated production with top quarks (ttH and tH), normalized to Standard Model predictions. The uncertainty of each measurement (indicated by the error bar) is broken down into statistical (yellow box) and systematic (blue box) parts. The theory uncertainty (grey box) on the Standard Model prediction (vertical red line at unity) is also shown. (Image: ATLAS Collaboration/CERN)

    The Higgs boson, discovered at the LHC in 2012, has a singular role in the Standard Model of particle physics.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    Standard Model of Particle Physics from Symmetry Magazine

    Most notable is the Higgs boson’s affinity to mass, which can be likened to the electric charge for an electric field: the larger the mass of a fundamental particle, the larger the strength of its interaction, or “coupling”, with the Higgs boson. Deviations from these predictions could be a hallmark of new physics in this as-yet little-explored part of the Standard Model.

    Higgs boson couplings manifest themselves in the rate of production of the Higgs boson at the LHC, and its decay branching ratios into various final states. These rates have been precisely measured by the ATLAS experiment, using up to 80 fb⁻¹ of data collected at a proton-proton collision energy of 13 TeV from 2015 to 2017. Measurements were performed in all of the main decay channels of the Higgs boson: to pairs of photons, W and Z bosons, bottom quarks, taus, and muons. The overall production rate of the Higgs boson was measured to be in agreement with Standard Model predictions, with an uncertainty of 8%. The uncertainty is reduced from 11% in the previous combined measurements released last year.

    The measurements are broken down into production modes (assuming Standard Model decay branching ratios), as shown in Figure 1. All four main production modes have now been observed at ATLAS with a significance of more than 5 standard deviations: the long-established gluon-gluon fusion mode, the recently observed associated production with a top-quark pair, and the last-remaining weak boson fusion mode, presented today by ATLAS. Together with the observation of production in association with a weak boson and of the H→bb decay in a separate measurement, these results paint a complete picture of Higgs boson production and decay.

    Physicists can use these new results to study the couplings of the Higgs boson to other fundamental particles. As shown in Figure 2, these couplings are in excellent agreement with the Standard Model prediction over a range covering 3 orders of magnitude in mass, from the top quark (the heaviest particle in the Standard Model and thus with the strongest interaction with the Higgs boson) to the much lighter muons (for which only an upper limit of the coupling with the Higgs boson has been obtained so far).

    Figure 2: Higgs boson coupling strength to each particle (error bars) as a function of particle mass compared with Standard Model prediction (blue dotted line). (Image: ATLAS Collaboration/CERN)

    The measurements also probe the coupling of the Higgs boson to gluons in the gluon-gluon fusion production process, which proceeds through a loop diagram and is thus particularly sensitive to new physics. In the Standard Model, the loop is mediated mainly by top quarks. Therefore, possible new physics contributions can be tested by comparing the gluon coupling with the direct measurement of the top quark coupling in Higgs boson production in association with top quarks, as shown in Figure 3.

    Figure 3: Ratios of coupling strengths to each particle. By taking ratios, model assumptions (such as on the total width of the Higgs boson) can be significantly reduced. Among all the interesting tests performed, the one comparing the gluon-gluon fusion and Higgs boson production in association with top quarks is represented by λtg in the plot. (Image: ATLAS Collaboration/CERN)

    The excellent agreement with the Standard Model, which is observed throughout, can be used to set stringent limits on new physics models. These are based on possible modifications to Higgs couplings and complement direct searches performed at the LHC.

    Links:

    Combined measurements of Higgs boson production and decay using up to 80 fb⁻¹ of proton-proton collision data at 13 TeV collected with the ATLAS experiment (ATLAS-CONF-2018-031)
    ICHEP2018 presentation by Nicolas Morange: Measurements of Higgs boson properties using a combination of different Higgs decay channels
    ICHEP2018 presentation by Tancredi Carli: Highlights from the ATLAS and ALICE Experiments
    ICHEP2018 presentation by Giacinto Piacquadio (coming Tuesday 9 July)
    See also the full lists of ATLAS Conference Notes and ATLAS Physics Papers.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    CERN map

    CERN LHC Grand Tunnel

    CERN LHC particles
    LHC at CERN


    CERN Courier

    QuantumDiaries


    Quantum Diaries

     
  • richardmitnick 3:27 pm on July 5, 2018 Permalink | Reply
    Tags: HEP, Quarks observed to interact via minuscule 'weak lightsabers'

    From CERN ATLAS: “Quarks observed to interact via minuscule ‘weak lightsabers'” 

    CERN ATLAS Higgs Event

    CERN/ATLAS
    From CERN ATLAS

    5th July 2018
    Left: Especially at invariant jet-jet masses mjj > 1000 GeV, the yellow signal of W±W± → W±W± scattering can be clearly seen above the background from other processes. Right: The orange signal of W±Z → W±Z scattering is evident as the white contribution at large values of the score of a multivariate boosted decision tree (BDT). (Image: ATLAS Collaboration/CERN)

    Two of the rarest processes probed so far at the LHC, the scattering between W and Z bosons emitted by quarks in proton-proton collisions, have been established by the ATLAS experiment at CERN.

    W and Z bosons play the same mediating role for the weak nuclear interaction as photons do for electromagnetism. Since light beams from torches or lasers pass through each other unaffected, electromagnetic “lightsabers” will forever stay science fiction. However, beams of W and Z bosons – or “weak light rays” – can scatter off one another.

    One of the key motivations for building the Large Hadron Collider (LHC) at CERN was to study exactly this process, called weak “vector boson scattering” (VBS). One quark in each of two colliding protons has to radiate a W or a Z boson. These extremely short-lived particles are only able to fly a distance of 0.1×10⁻¹⁵ m before transforming into other particles, and their interaction with other particles is limited to a range of 0.002×10⁻¹⁵ m. In other words, these extremely short “weak lightsabers” extend only about 1/10th of a proton’s radius and have to approach each other by 1/500th of a proton’s radius! Such an extremely improbable coincidence happens only about once in 20,000 billion proton-proton interactions, recorded typically in one day of LHC operation.

    Using 2016 data, ATLAS has now unambiguously observed WZ and WW electroweak production, with the dominant part of it being weak vector boson scattering: W±W± → W±W± and W±Z → W±Z. This continues the experiment’s long journey to scrutinize the VBS process: using 8 TeV data from 2012, ATLAS had obtained the first evidence for the W±W± → W±W± process with 18 candidate events. Such a yield would occur with a probability of less than 1:3000 as a pure statistical fluctuation. Now, at a higher centre-of-mass energy of 13 TeV, ATLAS has identified 60 W±W± → W±W± events, which would happen less than once in 200 billion cases as a fluctuation from pure background processes. This corresponds to a statistical significance of 6.9 standard deviations (σ) above background. Besides the decay products of the scattered W or Z bosons, the signature of the process is two high-energy particle jets originating from the two quarks that initially radiated the W or Z.

    ATLAS has also combined 2015 and 2016 data to establish the scattering of W±Z → W±Z with a statistical significance of 5.6 σ above background. In this channel, the lower-energy data of 2012 had revealed a significance of only 1.9σ, not sufficient to claim any evidence for the process. This time, thanks to a multivariate “BDT” analysis technique implemented in 2016, ATLAS was able to isolate 44 signal candidate events, of which about half reveal “BDT score” values above 0.4, where only little background is present.

    For this scattering process of vector bosons, three basic Standard Model “vertices” contribute: the interaction via the well-known “triple-boson-coupling” (green) is drastically reduced by the contributions of “quartic-boson-couplings” (red) and the “boson-Higgs-couplings” (orange). Only the latter ensures that the rate of this scattering for large centre-of-mass energies obeys the basic “unitarity” law, that a probability cannot be bigger than 100%. With the discovery of VBS, a new chapter of Standard Model tests has started, allowing ATLAS to scrutinize the so far experimentally inaccessible quartic-boson-couplings and properties of the Higgs boson.

    Related journal articles
    _________________________________________________
    See the full article for further references with links.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    CERN LHC Map
    CERN LHC Grand Tunnel
    CERN LHC particles
    LHC at CERN


    CERN Courier

    QuantumDiaries


    Quantum Diaries

     
  • richardmitnick 1:35 pm on July 4, 2018 Permalink | Reply
    Tags: HEP

    From CERN: “We need to talk about the Higgs” 

    Cern New Bloc

    Cern New Particle Event

    CERN New Masthead

    From CERN

    4 Jul 2018
    Anais Schaeffer

    François Englert (left) and Peter Higgs at CERN on 4 July 2012, on the occasion of the announcement of the discovery of a Higgs boson (Image: Maximilien Brice/CERN)

    It is six years since the discovery of the Higgs boson was announced, to great fanfare in the world’s media, as a crowning success of CERN’s Large Hadron Collider (LHC).

    CERN/CMS Detector


    CERN CMS Higgs Event


    CERN/ATLAS detector


    CERN ATLAS Higgs Event

    The excitement of those days now seems a distant memory, replaced by a growing sense of disappointment at the lack of any major discovery thereafter.

    While there are valid reasons to feel less than delighted by the null results of searches for physics beyond the Standard Model (SM), this does not justify a mood of despondency. A particular concern is that, in today’s hyper-connected world, apparently harmless academic discussions risk evolving into a negative outlook for the field in broader society. For example, a recent news article in Nature led on the LHC’s “failure to detect new particles beyond the Higgs”, while The Economist reported that “Fundamental physics is frustrating physicists”. Equally worryingly, the situation in particle physics is sometimes negatively contrasted with that for gravitational waves: while the latter is, quite rightly, heralded as the start of a new era of exploration, the discovery of the Higgs is often described as the end of a long effort to complete the SM.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    Let’s look at things more positively. The Higgs boson is a totally new type of fundamental particle that allows unprecedented tests of electroweak symmetry breaking. It thus provides us with a novel microscope with which to probe the universe at the smallest scales, in analogy with the prospects for new gravitational-wave telescopes that will study the largest scales. There is a clear need to measure its couplings to other particles – especially its coupling with itself – and to explore potential connections between the Higgs and hidden or dark sectors. These arguments alone provide ample motivation for the next generation of colliders including and beyond the high-luminosity LHC upgrade.

    So far the Higgs boson indeed looks SM-like, but some perspective is necessary. It took more than 40 years from the discovery of the neutrino to the realisation that it is not massless and therefore not SM-like; addressing this mystery is now a key component of the global particle-physics programme. Turning to my own main research area, the beauty quark – which reached its 40th birthday last year – is another example of a long-established particle that is now providing exciting hints of new phenomena (see Beauty quarks test lepton universality). One thrilling scenario, if these deviations from the SM are confirmed, is that the new physics landscape can be explored through both the b and Higgs microscopes. Let’s call it “multi-messenger particle physics”.

    How the results of our research are communicated to the public has never been more important. We must be honest about the lack of new physics that we all hoped would be found in early LHC data, yet to characterise this as a “failure” is absurd. If anything, the LHC has been more successful than expected, leaving its experiments struggling to keep up with the astonishing rates of delivered data. Particle physics is, after all, about exploring the unknown; the analysis of LHC data has led to thousands of publications and a wealth of new knowledge, and there is every possibility that there are big discoveries waiting to be made with further data and more innovative analyses. We also should not overlook the returns to society that the LHC has brought, from technology developments with associated spin-offs to the training of thousands of highly skilled young researchers.

    The level of expectation that has been heaped on the LHC seems unprecedented in the history of physics. Has any other facility been considered to have produced disappointing results because only one Nobel-prize winning discovery was made in its first few years of operation? Perhaps this reflects that the LHC is simply the right machine at the right time, but that time is not over: our new microscope is set to run for the next two decades and bring physics at the TeV scale into clear focus. The more we talk about that, the better our long-term chances of success.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Meet CERN in a variety of places:

    Quantum Diaries
    QuantumDiaries

    Cern Courier

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS
    CERN ATLAS New

    ALICE
    CERN ALICE New

    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

     
  • richardmitnick 1:02 pm on July 4, 2018 Permalink | Reply
    Tags: , , , HEP, , , ,   

    From CERN ATLAS: “The Higgs boson: the hunt, the discovery, the study and some future perspectives” 

    CERN ATLAS Higgs Event

    CERN/ATLAS
    From CERN ATLAS

    1
    Figure 1: A candidate Higgs to ZZ to 4-lepton event as seen in the ATLAS detector. The four reconstructed muons are visualised as red lines. The green and blue boxes show where the muons passed through the muon detectors. (Image: ATLAS Collaboration/CERN)


    The origins of the Higgs boson

    Many questions in particle physics are related to the existence of particle mass. The “Higgs mechanism,” which consists of the Higgs field and its corresponding Higgs boson, is said to give mass to elementary particles. By “mass” we mean the inertial mass, which resists when we try to accelerate an object, rather than the gravitational mass, which is sensitive to gravity. In Einstein’s celebrated formula E = mc², the “m” is the inertial mass of the particle. In a sense, this mass is the essential quantity: it is what tells us that there is a particle at a given place rather than nothing.

    In the early 1960s, physicists had a powerful theory of electromagnetic interactions and a descriptive model of the weak nuclear interaction – the force that is at play in many radioactive decays and in the reactions that make the Sun shine. They had identified deep similarities between the structure of these two interactions, but a unified theory at a deeper level seemed to require that particles be massless, even though real particles in nature have mass.

    In 1964, theorists proposed a solution to this puzzle. Independent efforts by Robert Brout and François Englert in Brussels, Peter Higgs at the University of Edinburgh, and others led to a concrete model known as the Brout-Englert-Higgs (BEH) mechanism. The peculiarity of this mechanism is that it can give mass to elementary particles while retaining the nice structure of their original interactions. Importantly, this structure ensures that the theory remains predictive at very high energy. Particles that carry the weak interaction would acquire masses through their interaction with the Higgs field, as would all matter particles. The photon, which carries the electromagnetic interaction, would remain massless.

    In the history of the universe, particles interacted with the Higgs field just 10⁻¹² seconds after the Big Bang. Before this phase transition, all particles were massless and travelled at the speed of light. After the universe expanded and cooled, particles interacted with the Higgs field and this interaction gave them mass. The BEH mechanism implies that the values of the elementary particle masses are linked to how strongly each particle couples to the Higgs field. These values are not predicted by current theories. However, once the mass of a particle is measured, its interaction with the Higgs boson can be determined.
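
    To make that last statement concrete, here is the tree-level textbook relation for fermions (a sketch of the Standard Model expectation, not an ATLAS measurement; v is the Higgs-field vacuum expectation value of about 246 GeV):

```latex
% Tree-level Standard Model relation between a fermion's mass m_f and its
% Yukawa coupling y_f to the Higgs field, with vacuum expectation value v ≈ 246 GeV.
m_f = \frac{y_f\, v}{\sqrt{2}}
\qquad\Longleftrightarrow\qquad
y_f = \frac{\sqrt{2}\, m_f}{v}

% Worked example: the top quark, with m_t ≈ 173 GeV, has
y_t \approx \frac{\sqrt{2}\times 173\ \mathrm{GeV}}{246\ \mathrm{GeV}} \approx 1.0
```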

    The BEH mechanism had several implications: first, that the weak interaction was mediated by heavy particles, namely the W and Z bosons, which were discovered at CERN in 1983. Second, the new field itself would materialize in another particle. The mass of this particle was unknown, but researchers knew it should be lower than 1 TeV – a value well beyond the then conceivable limits of accelerators. This particle was later called the Higgs boson and would become the most sought-after particle in all of particle physics.

    The accelerator, the experiments and the Higgs

    The Large Electron-Positron collider (LEP), which operated at CERN from 1989 to 2000, was the first accelerator to have significant reach into the potential mass range of the Higgs boson.

    CERN LEP Collider

    Though LEP did not find the Higgs boson, it made significant headway in the search, determining that the mass should be larger than 114 GeV.

    In 1984, a few physicists and engineers at CERN were exploring the possibility of installing a proton-proton accelerator with a very high collision energy of 10-20 TeV in the same tunnel as LEP. This accelerator would probe the full possible mass range for the Higgs, provided that the luminosity[1] was very high. However, this high luminosity would mean that each interesting collision would be accompanied by tens of background collisions. Given the state of detector technology of the time, this seemed a formidable challenge. CERN wisely launched a strong R&D programme, which enabled fast progress on the detectors. This seeded the early collaborations, which would later become ATLAS, CMS and the other LHC experiments.

    On the theory side, the 1990s saw much progress: physicists studied the production of the Higgs boson in proton-proton collisions and all its different decay modes. As each of these decay modes depends strongly on the unknown Higgs boson mass, future detectors would need to measure all possible kinds of particles to cover the wide mass range. Each decay mode was studied using intensive simulations and the important Higgs decay modes were amongst the benchmarks used to design the detector.

    Meanwhile, at the Fermi National Accelerator Laboratory (Fermilab) outside of Chicago, Illinois, the Tevatron collider was beginning to have some discovery potential for a Higgs boson with mass around 160 GeV. Tevatron, the scientific predecessor of the LHC, collided protons with antiprotons from 1986 to 2011.

    Tevatron Accelerator


    FNAL/Tevatron CDF detector


    FNAL/Tevatron DZero detector

    In 2008, after a long and intense period of construction, the LHC and its detectors were ready for the first beams. On 10 September 2008, the first injection of beams into the LHC was a big event at CERN, with the international press and authorities invited. The machine worked beautifully and we had very high hopes. Alas, ten days later, a problem in the superconducting magnets significantly damaged the LHC. A full year was necessary for repairs and to install a better protection system. The incident revealed a weakness in the magnets, which limited the collision energy to 7 TeV.

    When restarting, we faced a difficult decision: should we take another year to repair the weaknesses all around the ring, enabling operation at 13 TeV? Or should we immediately start and operate the LHC at 7 TeV, even though a factor of three fewer Higgs bosons would be produced? Detailed simulations showed that there was a chance of discovering the Higgs boson at the reduced energy, in particular in the range where the competition of the Tevatron was the most pressing, so we decided that starting immediately at 7 TeV was worth the chance.

    The LHC restarted in 2010 at 7 TeV with a modest luminosity – a luminosity that would increase in 2011. The ATLAS Collaboration had made good use of the forced stop of 2009 to better understand the detector and prepare the analyses. In 2010, Higgs experts from the experiments and from theory created the LHC Higgs Cross-Section[2] Working Group (LHCHXSWG), which proved invaluable as a forum for compiling the best available calculations and for discussing the difficult aspects of Higgs production and decay. These results have since been regularly documented in the “LHCHXSWG Yellow Reports,” famous in the community.

    2
    Figure 2: The invariant mass from pairs of photons selected in the Higgs to γγ analysis, as shown at the seminar at CERN on 4 July 2012. The excess of events over the background prediction around 125 GeV is consistent with predictions for the Standard Model Higgs boson. (Image: ATLAS Collaboration/CERN)

    The discovery of the Higgs boson

    As Higgs bosons are extremely rare, sophisticated analysis techniques are required to spot the signal events within the large backgrounds from other processes. After signal-like events have been identified, powerful statistical methods are used to quantify how significant the signal is. As statistical fluctuations in the background can also look like signals, stringent statistical requirements are made before a new signal is claimed to have been discovered. The significance is typically quoted as σ, or a number of standard deviations of the normal distribution. In particle physics, a significance of 3σ is referred to as evidence, while 5σ is referred to as an observation, corresponding to the probability of a statistical fluctuation from the background of less than 1 in a million.
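
    As a concrete illustration of those thresholds, the conversion from a significance in σ to a one-sided p-value can be sketched in a few lines of Python (the numbers are the standard Gaussian tail probabilities, not values taken from the ATLAS analysis):

```python
# Convert a significance in standard deviations into a one-sided p-value:
# the probability that a background-only fluctuation is at least this signal-like.
from scipy.stats import norm

for n_sigma in (3, 5):
    p = norm.sf(n_sigma)  # upper-tail probability of the standard normal distribution
    print(f"{n_sigma} sigma corresponds to a p-value of about {p:.1e}")

# 3 sigma -> ~1.3e-3 (roughly 1 in 740): "evidence"
# 5 sigma -> ~2.9e-7 (roughly 1 in 3.5 million): "observation"
```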

    Eager physicists analysed the data as soon as it arrived. In the summer of 2011, there was a small excess in the Higgs decay to two W bosons for a mass around 140 GeV. Things got more interesting as an excess at a similar mass was also seen in the diphoton channel. However, as the dataset increased, the size of this excess first increased and then decreased.

    By the end of 2011, ATLAS had collected and analysed 5 fb-1 of data at a centre-of-mass energy of 7 TeV. After combining all the channels, it was found that the Standard Model Higgs boson could be excluded for all masses except for a small window around 125 GeV, where an excess with a significance of around 3σ was observed, largely driven by the diphoton and four lepton decay channels. The results were shown at a special seminar at CERN on 13 December 2011. Although neither experiment had strong enough results to claim observation, what was particularly telling was the fact that both ATLAS and CMS had excesses at the same mass.

    In 2012, the energy of the LHC was increased from 7 to 8 TeV, which increased the cross-sections for Higgs boson production. The data arrived quickly: by the summer of 2012, ATLAS had collected 5 fb-1 at 8 TeV, doubling the dataset. As quickly as the data arrived it was analysed and, sure enough, the significance of that small bump around 125 GeV increased further. Rumours were flying around CERN when a joint seminar between ATLAS and CMS was announced for 4 July 2012. Seats at the seminar were so highly sought after that only the people who queued all night were able to get into the room. The presence of François Englert and Peter Higgs at the seminar increased the excitement even further.

    At the famous seminar, the spokespeople of the ATLAS and CMS Collaborations showed their results consecutively, each finding an excess around 5σ at a mass of 125 GeV. To conclude the session, CERN Director-General Rolf Heuer declared, “I think we have it.”

    The ATLAS Collaboration celebrated the discovery with champagne and by giving each member of the collaboration a t-shirt with the famous plots. Incidentally, only once they were printed was it discovered that there was a typo in the plot. No matter, these t-shirts would go on to become collector’s items.

    ATLAS and CMS each published their results in Physics Letters B a few weeks later; the ATLAS paper was titled “Observation of a New Particle in the Search for the Standard Model Higgs Boson with the ATLAS Detector at the LHC.” The Nobel Prize in Physics was awarded to Peter Higgs and François Englert in 2013.

    What we have learned since discovery

    After discovery, we began to study the properties of the newly-discovered particle to understand if it was the Standard Model Higgs boson or something else. In fact, we initially called it a Higgs-like boson as we did not want to claim it was the Higgs boson until we were certain. The mass, the final unknown parameter in the Standard Model, was one of the first parameters measured and found to be approximately 125 GeV (roughly 130 times larger than the mass of the proton). It turned out that we were very lucky – with this mass, the largest number of decay modes are possible.

    Standard Model of Particle Physics from Symmetry Magazine

    In the Standard Model, the Higgs boson is unique: it has zero spin, no electric charge and no strong force interaction. The spin and parity were measured through angular correlations between the particles it decayed to. Sure enough, these properties were found to be as predicted. At this point, we began to call it “the Higgs boson.” Of course, it still remains to be seen if it is the only Higgs boson or one of many, such as those predicted by supersymmetry.

    The discovery of the Higgs boson relied on measurements of its decay to vector bosons. In the Standard Model, different couplings determine its interactions with fermions and bosons, so new physics might impact them differently.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    Therefore, it is important to measure both. The first direct probe of the fermionic couplings was the decay to tau leptons, observed in the combination of ATLAS and CMS results performed at the end of Run 1. During Run 2, the increase in the centre-of-mass energy to 13 TeV and the larger dataset allowed further channels to be probed. Over the past year, evidence has been obtained for the Higgs boson decay to bottom quarks, and the production of the Higgs boson together with top quarks has been observed. This means that the interaction of the Higgs boson with fermions has been clearly established.

    Perhaps one of the neatest ways to summarise what we currently know about the interaction of the Higgs boson with other Standard Model particles is to compare the interaction strength to the mass of each particle, as shown in Figure 4. This clearly shows that the interaction strength depends on the particle mass: the heavier the particle, the stronger its interaction with the Higgs field. This is one of the main predictions of the BEH mechanism in the Standard Model.
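
    Here is a small, purely illustrative Python sketch of that trend for a few fermions, using approximate particle masses and the tree-level relation quoted earlier rather than the ATLAS measurements shown in Figure 4:

```python
# Illustrative sketch: the heavier the fermion, the stronger its coupling to the Higgs field.
# Tree-level Standard Model expectation y_f = sqrt(2) * m_f / v, with v ~ 246 GeV.
# Masses are approximate values in GeV; this is NOT ATLAS data.
import numpy as np
import matplotlib.pyplot as plt

v = 246.0  # Higgs-field vacuum expectation value in GeV
fermions = {"muon": 0.106, "tau": 1.78, "b quark": 4.18, "top quark": 173.0}

masses = np.array(list(fermions.values()))
couplings = np.sqrt(2) * masses / v

plt.loglog(masses, couplings, "o")
for name, m, y in zip(fermions, masses, couplings):
    plt.annotate(name, (m, y), textcoords="offset points", xytext=(5, 5))
plt.xlabel("particle mass [GeV]")
plt.ylabel("coupling to the Higgs field (tree-level expectation)")
plt.title("Heavier particles couple more strongly to the Higgs field")
plt.tight_layout()
plt.show()
```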

    We don’t only do tests to verify that the properties of the Higgs boson agree with those predicted by the Standard Model – we specifically look for properties that would provide evidence for new physics. For example, constraining the rate that the Higgs boson decays to invisible or unobserved particles provides stringent limits on the existence of new particles with masses below that of the Higgs boson. We also look for decays to combinations of particles forbidden in the Standard Model. So far, none of these searches have found anything unexpected, but that doesn’t mean that we’re going to stop looking anytime soon!

    Outlook

    2018 is the last year that ATLAS will take data as part of the LHC’s Run 2. During this run, 13 TeV proton-proton collisions have been producing approximately 30 times more Higgs bosons than those used in the 2012 Higgs boson discovery. As a result, more and more results have been obtained to study the Higgs boson in greater detail.

    Over the next few years, analysis of the large Run 2 dataset will not only be an opportunity to reach a new level of precision in previous measurements, but also to investigate new methods to probe Standard Model predictions and to test for the presence of new physics in as model-independent a way as possible. This new level of precision will rely on obtaining a deeper level of understanding of the performance of the detector, as well as the simulations and algorithms used to identify particles passing through it. It also poses new challenges for theorists to keep up with the improving experimental precision.

    In the longer term, another big step in performance will be brought by the High-Luminosity LHC (HL-LHC), planned to begin operation in 2024. The HL-LHC will increase the number of collisions by another factor of 10. Among other measurements, this will open the possibility to investigate a very peculiar property of the Higgs boson: that it couples to itself. Events produced via this coupling feature two Higgs bosons in the final state, but they are exceedingly rare. Thus, they can only be studied within a very large number of collisions and using sophisticated analysis techniques. To match the increased performance of the LHC, the ATLAS and CMS detectors will undergo comprehensive upgrades during the years before HL-LHC.

    Looking more generally, the discovery of the Higgs boson with a mass of 125 GeV sets a new foundation for particle physics to build on. Many questions remain in the field, most of which have some relation to the Higgs sector. For example:

    A popular theory beyond the Standard Model is “supersymmetry”, which presents attractive features for solving current issues, such as the nature of dark matter. The minimal version of supersymmetry predicts that the Higgs boson mass should be less than 120-130 GeV, depending on some other parameters. Is it a coincidence that the observed value sits exactly at this critical value, hence still marginally allowing for this supersymmetric model?
    Several models have recently been proposed in which the only link between dark matter and regular matter would be through the Higgs boson.
    Stability of the universe: the value of 125 GeV is almost at the critical boundary between a stable universe and a meta-stable universe. A meta-stable system possesses another baseline state, into which it can decay anytime due to quantum tunnelling.[3] Is this also a coincidence?
    The phase transition: the details of this transition may play a role in the process which led our universe to be entirely matter and not contain any anti-matter. Present calculations with the Standard Model Higgs boson alone are inconsistent with the observed matter-antimatter asymmetry. Is this a call for new physics or only incomplete calculations?
    Are fermion masses all related to the Higgs boson field? If yes, why is there such a huge hierarchy between the fermion masses spanning from fractions of electron-volts for the mysterious neutrinos up to the very heavy top quark, with a mass on the order of hundreds of billions of electron-volts?

    From what we’ve learned about it so far, the Higgs boson seems to play a very special role in nature… Can it show us the way to answer further questions?

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    CERN LHC Map
    CERN LHC Grand Tunnel
    CERN LHC particles
    LHC at CERN


    CERN Courier

    QuantumDiaries


    Quantum Diaries

     
  • richardmitnick 7:44 pm on July 3, 2018 Permalink | Reply
    Tags: , , HEP, , ,   

    From Fermilab: “Fermilab computing experts bolster NOvA evidence, 1 million cores consumed” 

    FNAL II photo

    FNAL Art Image
    FNAL Art Image by Angela Gonzales

    From Fermilab, an enduring source of strength for the US contribution to scientific research worldwide.

    July 3, 2018
    No writer credit found

    How do you arrive at the physical laws of the universe when you’re given experimental data on a renegade particle that interacts so rarely with matter, it can cruise through light-years of lead? You call on the power of advanced computing.

    The NOvA neutrino experiment, in collaboration with the Department of Energy’s Scientific Discovery through Advanced Computing (SciDAC-4) program and the HEPCloud program at DOE’s Fermi National Accelerator Laboratory, was able to perform the largest-scale analysis ever to support the recent evidence of antineutrino oscillation, a phenomenon that may hold clues to how our universe evolved.

    FNAL/NOvA experiment map


    FNAL NOvA detector in northern Minnesota


    NOvA Far detector 15 metric-kiloton far detector in Minnesota just south of the U.S.-Canada border schematic


    NOvA Far Detector Block


    FNAL Near Detector

    Using Cori, the newest supercomputer at the National Energy Research Scientific Computing Center (NERSC), located at Lawrence Berkeley National Laboratory, NOvA used over 1 million computing cores, or CPUs, between May 14 and 15, and again over a short period one week later.

    1
    The Cori supercomputer at NERSC was used to perform a complex computational analysis for NOvA. NOvA used over 1 million computing cores, the largest amount ever used concurrently in a 54-hour period. Photo: Roy Kaltschmidt, Lawrence Berkeley National Laboratory
    NERSC CRAY Cori II supercomputer at NERSC at LBNL, named after Gerty Cori, the first American woman to win a Nobel Prize in science

    This is the largest number of CPUs ever used concurrently over this duration — about 54 hours — for a single high-energy physics experiment. This unprecedented amount of computing enabled scientists to carry out some of the most complicated techniques used in neutrino physics, allowing them to dig deeper into the seldom seen interactions of neutrinos. This Cori allocation was more than 400 times the amount of Fermilab computing allocated to the NOvA experiment and 50 times the total computing capacity at Fermilab allocated for all of its rare-physics experiments. A continuation of the analysis was performed on NERSC’s Cori and Edison supercomputers one week later.

    LBL NERSC Cray XC30 Edison supercomputer

    In total, nearly 35 million core-hours were consumed by NOvA in the 54-hour period. Executing the same analysis on a single desktop computer would take 4,000 years.
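
    A quick back-of-the-envelope check of those figures (assuming the “single desktop computer” has roughly one core running around the clock; purely illustrative arithmetic):

```python
# Back-of-the-envelope check of the core-hour figures quoted above.
core_hours = 35e6          # total core-hours consumed by NOvA
wall_clock_hours = 54      # duration of the run at NERSC
hours_per_year = 24 * 365.25

avg_cores = core_hours / wall_clock_hours          # ~650,000 cores in use on average
years_on_one_core = core_hours / hours_per_year    # ~4,000 years on a single core

print(f"Average cores in use: ~{avg_cores:,.0f}")
print(f"Single-core equivalent: ~{years_on_one_core:,.0f} years")
```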

    “The special thing about NERSC is that it enabled NOvA to do the science at a new level of precision, a much finer resolution with greater statistical accuracy within a finite amount of time,” said Andrew Norman, NOvA physicist at Fermilab. “It facilitated doing analysis of real data coming off the detector at a rate 50 times faster than that achieved in the past. The first round of analysis was done within 16 hours. Experimenters were able to see what was coming out of the data, and in less than six hours everyone was looking at it. Without these types of resources, we, as a collaboration, could not have turned around results as quickly and understood what we were seeing.”

    The experiment presented the latest finding from the recently collected data at the Neutrino 2018 conference in Germany on June 4.

    “The speed with which NERSC allowed our analysis team to run sophisticated and intense calculations needed to produce our final results has been a game-changer,” said Fermilab scientist Peter Shanahan, NOvA co-spokesperson. “It accelerated our time-to-results on the last step in our analysis from weeks to days, and that has already had a huge impact on what we were able to show at Neutrino 2018.”

    In addition to the state-of-the-art NERSC facility, NOvA relied on work done within the SciDAC HEP Data Analytics on HPC (high-performance computers) project and the Fermilab HEPCloud facility. Both efforts are led by Fermilab scientific computing staff, and both worked together with researchers at NERSC to be able to support NOvA’s antineutrino oscillation evidence.

    The current standard practice for Fermilab experimenters is to perform similar analyses using less complex calculations through a combination of both traditional high-throughput computing and the distributed computing provided by Open Science Grid, a national partnership between laboratories and universities for data-intensive research. These are substantial resources, but they use a different model: Both use a large amount of computing resources over a long period of time. For example, some resources are offered only at a low priority, so their use may be preempted by higher-priority demands. But for complex, time-sensitive analyses such as NOvA’s, researchers need the faster processing enabled by modern, high-performance computing techniques.

    SciDAC-4 is a DOE Office of Science program that funds collaboration between experts in mathematics, physics and computer science to solve difficult problems. The HEP on HPC project was funded specifically to explore computational analysis techniques for doing large-scale data analysis on DOE-owned supercomputers. Running the NOvA analysis at NERSC, the mission supercomputing facility for the DOE Office of Science, was a task perfectly suited for this project. Fermilab’s Jim Kowalkowski is the principal investigator for HEP on HPC, which also has collaborators from DOE’s Argonne National Laboratory, Berkeley Lab, University of Cincinnati and Colorado State University.

    “This analysis forms a kind of baseline. We’re just ramping up, just starting to exploit the other capabilities of NERSC at an unprecedented scale,” Kowalkowski said.

    The project’s goal for its first year is to take compute-heavy analysis jobs like NOvA’s and enable them on supercomputers. That means not just running the analysis, but also changing how calculations are done and learning how to revamp the tools that manipulate the data, all in an effort to improve techniques used for doing these analyses and to leverage the full computational power and unique capabilities of modern high-performance computing facilities. In addition, the project seeks to consume all computing cores at once to shorten that timeline.

    The Fermilab HEPCloud facility provides cost-effective access to compute resources by optimizing usage across all available types and elastically expanding the resource pool on short notice by, for example, renting temporary resources on commercial clouds or using high-performance computers. HEPCloud enables NOvA and physicists from other experiments to use these compute resources in a transparent way.

    For this analysis, “NOvA experimenters didn’t have to change much in terms of business as usual,” said Burt Holzman, HEPCloud principal investigator. “With HEPCloud, we simply expanded our local on-site-at-Fermilab facilities to include Cori and Edison at NERSC.”
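
    The idea behind that elastic expansion can be sketched as follows. This is a hypothetical illustration only: the function, its names and its thresholds are invented for this sketch and are not the actual HEPCloud interface.

```python
# Purely illustrative sketch of "elastic" provisioning: keep using the local pool,
# and add extra slots (an HPC allocation or rented commercial cloud) only when
# demand exceeds local capacity. All names and numbers are hypothetical.

def plan_provisioning(queued_jobs: int, local_free_slots: int,
                      hpc_quota: int, cloud_budget_slots: int) -> dict:
    """Decide how many job slots to take from each resource type."""
    plan = {"local": min(queued_jobs, local_free_slots), "hpc": 0, "cloud": 0}
    remaining = queued_jobs - plan["local"]

    if remaining > 0:                     # burst onto the HPC allocation first
        plan["hpc"] = min(remaining, hpc_quota)
        remaining -= plan["hpc"]

    if remaining > 0:                     # last resort: rent temporary cloud capacity
        plan["cloud"] = min(remaining, cloud_budget_slots)

    return plan

# Example: a large, time-sensitive analysis overwhelms the local pool.
print(plan_provisioning(queued_jobs=1_000_000, local_free_slots=2_500,
                        hpc_quota=650_000, cloud_budget_slots=100_000))
# -> {'local': 2500, 'hpc': 650000, 'cloud': 100000}
```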

    3
    At the Neutrino 2018 conference, Fermilab’s NOvA neutrino experiment announced that it had seen strong evidence of muon antineutrinos oscillating into electron antineutrinos over long distances. NOvA collaborated with the Department of Energy’s Scientific Discovery through Advanced Computing program and Fermilab’s HEPCloud program to perform the largest-scale analysis ever to support the recent evidence. Photo: Reidar Hahn

    Building on work the Fermilab HEPCloud team has been doing with researchers at NERSC to optimize high-throughput computing in general, the HEPCloud team was able to leverage the facility to achieve the million-core milestone. Thus, it holds the record for the most resources ever provisioned concurrently at a single facility to run experimental HEP workflows.

    “This is the culmination of more than a decade of R&D we have done at Fermilab under SciDAC and the first taste of things to come, using these capabilities and HEPCloud,” said Panagiotis Spentzouris, head of the Fermilab Scientific Computing Division and HEPCloud sponsor.

    “NOvA is an experimental facility located more than 2,000 miles away from Berkeley Lab, where NERSC is located. The fact that we can make our resources available to the experimental researchers near real-time to enable their time-sensitive science that could not be completed otherwise is very exciting,” said Wahid Bhimji, a NERSC data architect at Berkeley Lab who worked with the NOvA team. “Led by colleague Lisa Gerhardt, we’ve been working closely with the HEPCloud team over the last couple of years, also to support physics experiments at the Large Hadron Collider. The recent NOvA results are a great example of how the infrastructure and capabilities that we’ve built can benefit a wide range of high energy experiments.”

    Going forward, Kowalkowski, Holzman and their associated teams will continue building on this achievement.

    “We’re going to keep iterating,” Kowalkowski said. “The new facilities and procedures were enthusiastically received by the NOvA collaboration. We will accelerate other key analyses.”

    NERSC is a DOE Office of Science user facility.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    FNAL Icon

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics. Fermilab is America’s premier laboratory for particle physics and accelerator research, funded by the U.S. Department of Energy. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.


    FNAL/MINERvA

    FNAL DAMIC

    FNAL Muon g-2 studio

    FNAL Short-Baseline Near Detector under construction

    FNAL Mu2e solenoid

    Dark Energy Camera [DECam], built at FNAL

    FNAL DUNE Argon tank at SURF

    FNAL/MicrobooNE

    FNAL Don Lincoln

    FNAL/MINOS

    FNAL Cryomodule Testing Facility

    FNAL Minos Far Detector

    FNAL LBNF/DUNE from FNAL to SURF, Lead, South Dakota, USA

    FNAL/NOvA experiment map

    FNAL NOvA Near Detector

    FNAL ICARUS

    FNAL Holometer

     