Tagged: CERN CMS

  • richardmitnick 9:36 am on August 29, 2019 Permalink | Reply
    Tags: "From capturing collisions to avoiding them", , , CERN CMS, , , ,   

    From CERN: “From capturing collisions to avoiding them” 



    From CERN

    29 August, 2019
    Kate Kahle

    Around 100 simultaneous proton–proton collisions in an event recorded by the CMS experiment (Image: Thomas McCauley/CMS/CERN)

    With about one billion proton–proton collisions per second at the Large Hadron Collider (LHC), the LHC experiments need to sift quickly through the wealth of data to choose which collisions to analyse. To cope with an even higher number of collisions per second in the future, scientists are investigating computing methods such as machine-learning techniques. A new collaboration is now looking at how these techniques deployed on chips known as field-programmable gate arrays (FPGAs) could apply to autonomous driving, so that the fast decision-making used for particle collisions could help prevent collisions on the road.

    FPGAs have been used at CERN for many years and for many applications. Unlike the central processing unit of a laptop, these chips follow simple instructions and process many parallel tasks at once. With up to 100 high-speed serial links, they are able to support high-bandwidth inputs and outputs. Their parallel processing and re-programmability make them suitable for machine-learning applications.

    An FPGA-based readout card for the CMS tracker (Image: John Coughlan/CMS/CERN)

    The challenge, however, has been to fit complex deep-learning algorithms – a particular class of machine-learning algorithms – into chips of limited capacity. This required software developed for the CERN-based experiments, called “hls4ml”, which reduces the algorithms and produces FPGA-ready code without loss of accuracy or performance, allowing the chips to execute decision-making algorithms in microseconds.
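    As a rough illustration of the workflow hls4ml enables, the sketch below converts a small neural network into FPGA-ready HLS code. This is a minimal sketch assuming the Keras front end; the network architecture, FPGA part number and output directory are illustrative placeholders, not details from the article.

        # Minimal hls4ml conversion sketch (illustrative; the model, FPGA
        # part and output directory are assumptions, not from the article).
        import hls4ml
        from tensorflow import keras

        # A small dense classifier standing in for a real trigger algorithm.
        model = keras.Sequential([
            keras.layers.Dense(32, activation="relu", input_shape=(16,)),
            keras.layers.Dense(5, activation="softmax"),
        ])

        # Generate a per-model configuration (numeric precision, parallelism).
        config = hls4ml.utils.config_from_keras_model(model, granularity="model")

        # Translate the network into HLS code targeting a specific FPGA.
        hls_model = hls4ml.converters.convert_from_keras_model(
            model,
            hls_config=config,
            output_dir="hls4ml_prj",        # hypothetical project directory
            part="xcku115-flvb2104-2-i",    # example Xilinx part number
        )

        hls_model.compile()  # builds a C++ software emulation of the firmware

    The compiled model can be checked against the original network in software before any firmware is actually synthesised.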

    A new collaboration between CERN and Zenuity, the autonomous driving software company headquartered in Sweden, plans to use the techniques and software developed for the experiments at CERN to research the deployment of deep learning – a particular class of machine-learning algorithms – on FPGAs for autonomous driving. Instead of particle-physics data, the FPGAs will be used to interpret the huge quantities of data generated by normal driving conditions, using readouts from car sensors to identify pedestrians and vehicles. The technology should enable automated-driving cars to make faster and better decisions and predictions, thus avoiding traffic collisions.

    To find out more about CERN technologies and their potential applications, visit kt.cern/technologies.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Meet CERN in a variety of places:

    Quantum Diaries

    Cern Courier

     
  • richardmitnick 2:42 pm on August 26, 2019 Permalink | Reply
    Tags: CERN CMS

    From Fermi National Accelerator Lab: “USCMS completes phase 1 upgrade program for CMS detector at CERN” 

    FNAL Art Image by Angela Gonzales

    From Fermi National Accelerator Lab, an enduring source of strength for the US contribution to scientific research worldwide.

    August 26, 2019
    James Wetzel

    The CMS experiment at CERN’s Large Hadron Collider has achieved yet another significant milestone in its already storied history as a leader in the field of high-energy experimental particle physics.

    The U.S. contingent of the CMS collaboration, known as USCMS and managed by Fermilab, has been granted the Department of Energy’s final Critical Decision 4 (CD-4) approval for its multiyear Phase 1 Detector Upgrade program, formally signifying the completion of the project after having met every stated goal — on time and under budget.

    “Getting CD-4 approval is a tremendous vote of confidence for the many people involved in CMS,” said Fermilab scientist Steve Nahn, U.S. project manager for the CMS detector upgrade. “The LHC is the best tool we have for further explication of the particle nature of the universe, and there are still mysteries to solve, so we have to have the best apparatus we can to continue the exploration.”

    The CMS experiment is a generation-spanning effort to build, operate and upgrade a particle-detecting behemoth that observes its protean prey in a large but cramped cavern 300 feet beneath the French countryside. CMS is one of four large experiments situated along the LHC accelerator complex, operated by CERN in Geneva, Switzerland. The LHC is a 17-mile ring of magnets that accelerates two beams of protons in opposite directions, each to 99.999999999% of the speed of light, and forces them to collide at the centers of CMS and the LHC’s other experiments: ALICE, LHCb and ATLAS.
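    Taking the quoted speed at face value, a two-line calculation shows the scale of the relativistic effects involved. The comparison with the nominal 6.5 TeV Run 2 beam energy is added here for context and is not from the article.

        # Back-of-envelope: the Lorentz factor implied by the quoted speed.
        import math

        beta = 0.99999999999  # the article's figure, as a fraction of c
        gamma = 1.0 / math.sqrt((1.0 - beta) * (1.0 + beta))
        print(f"gamma at the quoted speed: {gamma:,.0f}")  # ~223,600

        # For comparison, a 6.5 TeV Run 2 proton (rest energy 0.938 GeV) has
        # gamma = 6500 / 0.938 ~ 6,930; the quoted percentage is a
        # popular-science rounding of the actual beam speed.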

    Fermilab scientists Nadja Strobbe and Jim Hirschauer test chips for the CMS detector upgrades. Photo: Reidar Hahn

    The main goal of CMS (and the other LHC experiments) is to keep track of which particles emerge from the rapture of pure energy created from the collisions in order to search for new particles and phenomena. In catching sight of such new phenomena, scientists aim to answer some of the most fundamental questions we have about how the universe works.

    The global CMS collaboration comprises more than 5,000 professionals — including roughly 1,000 students — from over 200 institutes and universities across more than 50 countries. This international team collaborates to design, build, commission and operate the CMS detector, whose data is then distributed to dedicated centers in 40 nations for analysis. And analysis is their raison d’etre. By sussing out patterns in the data, CMS scientists search for previously unseen or unconfirmed phenomena and measure the properties of elementary particles that make up the universe with greater precision. To date, CMS has published over 900 papers.

    The USCMS collaboration is the single largest national group in CMS, involving 51 American universities and institutions in 24 states and Puerto Rico, over 400 Ph.D. physicists, and more than 200 graduate students and other professionals. USCMS has played a primary role in much of the CMS experiment’s original design and construction, including a wide network of eight CMS computing centers located across the United States, and in the experiment’s data analysis. USCMS is supported by the U.S. Department of Energy and the National Science Foundation and has played an integral role in the success of the CMS collaboration as a whole from its founding.

    The CMS experiment, the LHC and the other LHC experiments became operational in 2009 (17 years after the CMS letter of intent), beginning a 10-year data-taking period referred to as Phase 1.

    Phase 1 was divided into four major epochs, alternating two periods of data-taking with two periods of maintenance and upgrade operations. The two data-taking periods are referred to as Run 1 (2009-2013) and Run 2 (2015-2018). It was during Run 1 (in 2012) that the CMS and ATLAS collaborations jointly announced they each had observed the long-predicted Higgs boson, resulting in a Nobel Prize awarded a year later to scientists Peter Higgs and François Englert, and a further testament to the strength of the Standard Model of particle physics, the theory within which the Higgs boson was first hypothesized in 1964.


    “That prize was a historic triumph of every individual, institution and nation involved with the LHC project, not only validating the Higgs conjecture, a cornerstone of the Standard Model, but also giving science a new particle to use as a tool for further exploration,” Nahn said. “This discovery and every milestone CMS has achieved since then is encouragement to continue working toward further discovery. That goes for our latest approval milestone.”


    Fermilab scientists Maral Alyari and Stephanie Timpone conduct CMS pixel detector work. Photo: Reidar Hahn

    During the entirety of Phase 1, the wizard-like LHC particle accelerator experts were continually ramping up the collision energy and intensity, or in particle physics parlance, the luminosity of the LHC beam. The CMS technical team was charged with fulfilling the Phase 1 Upgrade plan, a series of hardware upgrades to the detector that allowed it to fully profit from the gains the LHC team was providing.

    While the LHC accelerator folks were prepping to push 20 times as many particles through the experiments per second, the experiments were busy upgrading their systems to handle this major influx of particles and the resulting data. This meant updating many of the readout electronics with faster and more capable brains to manage and process the data produced by CMS.

    With support from the Department of Energy’s Office of Science and the National Science Foundation, USCMS implemented $40 million worth of these strategic upgrades on time and under budget.

    With these upgrades complete, the CMS detector is now ready for LHC Run 3, which will run from 2021 to 2023, and the collaboration is starting this stage of data-taking on a solid foundation.

    Still, USCMS isn’t taking a break: The collaboration is already gearing up for its next, even more ambitious set of upgrades, planned for installation after Run 3. This USCMS upgrade phase will prepare the detector for an even higher luminosity, resulting in a data set 10 times greater than what the LHC provides currently.

    Every advance in the CMS detector ensures that it will support the experiment through 2038, when the LHC is planned to complete its final run.

    “For the last decade, we’ve worked to improve and enhance the CMS detector to squeeze everything we can out of the LHC’s collisions,” Nahn said. “We’re prepared to do the same for the next two decades to come.”

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition


    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics. Fermilab is America’s premier laboratory for particle physics and accelerator research, funded by the U.S. Department of Energy. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.

     
  • richardmitnick 12:35 pm on August 10, 2019 Permalink | Reply
    Tags: "Physicists Working to Discover New Particles, , CERN CMS, , , , , Texas Tech, The LDMX Experiment   

    From Texas Tech via FNAL: “Physicists Working to Discover New Particles, Dark Matter” 


    From TEXAS TECH UNIVERSITY

    via

    FNAL Art Image by Angela Gonzales

    Fermi National Accelerator Lab, an enduring source of strength for the US contribution to scientific research worldwide.

    August 5, 2019
    Glenys Young, Texas Tech

    Faculty recently presented their work at the European Physical Society’s 2019 Conference on High Energy Physics.

    Texas Tech University is well known for its research on topics that hit close to home for us here on the South Plains, like agriculture, water use and climate. But Texas Tech also is making its name known among those who study the farthest reaches of space and the mysteries of matter.

    Faculty from the Texas Tech Department of Physics & Astronomy recently presented at the European Physical Society’s 2019 Conference on High Energy Physics on the search for dark matter and other new particles that could help unlock the history and nature of the universe.

    New ways to approach the most classical search for new particles.

    Texas Tech, led by professor and department chair Sung-Won Lee, has been playing a leading role in the new-particle hunt for more than a decade. As part of the Compact Muon Solenoid (CMS) experiment, which investigates a wide range of physics, including the search for extra dimensions and particles that could make up dark matter, Lee has led the new-particle search at the European Organization for Nuclear Research (CERN).


    “Basically, we’re looking for any experimental evidence of new particles that could open the door to whole new realms of physics that researchers believe could be there,” Lee said. “Researchers at Texas Tech are continuing to look for elusive new particles in the CMS experiment at CERN’s Large Hadron Collider (LHC), and if found, we could answer some of the most profound questions about the structure of matter and the evolution of the early universe.”

    The LHC essentially bounces around tiny particles at incredibly high speeds to see what happens when the particles collide. Lee’s search focuses on identifying possible hints of new physics that could add more subatomic particles to the Standard Model of particle physics.

    “The Standard Model has been enormously successful, but it leaves many important questions unanswered,” Lee said.


    “It is also widely acknowledged that, from the theoretical standpoint, the Standard Model must be part of a larger theory, ‘Beyond the Standard Model’ (BSM), which is yet to be experimentally confirmed.”

    Some BSM theories suggest that the production and decay of new particles could be observed in the LHC by the resulting highly energetic jets that shoot out in opposite directions (dijets) and the resonances they leave. Thus the search for new particles depends on the search for these resonances. In some ways, it’s like trying to trace air movements to find a fan you can’t see, hear or touch.

    In 2018-19, in collaboration with the CMS group, Texas Tech’s team performed a search for narrow dijet resonances using a newly available dataset at the LHC. The data were consistent with the Standard Model predictions, and no significant deviations from the pure background hypothesis were observed. But one spectacular collision was recorded in which the masses of the two jets were the same. This evidence allows for the possibility that the jets originated from BSM-hypothesized particle decay.
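    The quantity at the heart of such a search is the invariant mass of the two-jet system: a new particle decaying to a pair of jets would appear as a narrow peak in the dijet mass spectrum on top of a smoothly falling background. A minimal sketch of the kinematics, with made-up jet four-momenta:

        # Sketch: invariant mass of a dijet system from jet four-momenta.
        # The two example jets are invented for illustration.
        import math

        def dijet_mass(j1, j2):
            """Invariant mass of two jets given as (E, px, py, pz) in GeV."""
            E = j1[0] + j2[0]
            px = j1[1] + j2[1]
            py = j1[2] + j2[2]
            pz = j1[3] + j2[3]
            return math.sqrt(max(E**2 - px**2 - py**2 - pz**2, 0.0))

        # Two nearly back-to-back, highly energetic jets (illustrative values).
        jet1 = (1985.0,  1500.0, 0.0,  1300.0)
        jet2 = (2087.0, -1500.0, 0.0, -1450.0)
        print(f"dijet invariant mass ~ {dijet_mass(jet1, jet2):.0f} GeV")  # ~4069 GeV

    A resonance search then asks whether any narrow window of this mass spectrum holds significantly more events than the smooth background prediction.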

    “Since the LHC is the highest energy collider currently in operation, it is crucial to pay special attention to the highest-dijet-mass events where first hints of new physics at higher energies could start to appear,” Lee said. “This unusual high-mass event could likely be a collision created by the Standard Model background or possibly the first hint of new physics, but with only one event in hand, it is not possible to say which.”

    For now, Lee, postdoctoral research fellow Federico De Guio and doctoral student Zhixing (Tyler) Wang are working to update the dijet resonance search using the full LHC dataset and extend the scope of the analysis.

    “This extension of the search could help prove space-time-matter theory, which requires the existence of several extra spatial dimensions to the universe,” Lee said. “I believe that, with our extensive research experience, Texas Tech’s High Energy Physics group can contribute to making such discoveries.”

    Enhancing the missing momentum microscope

    Included in the ongoing new-particle search using the LHC is the pursuit of dark matter, an elusive, invisible form of matter that dominates the matter content of the universe.

    “Currently, the LHC is producing the highest-energy collisions from an accelerator in the world, and my primary research interest is in understanding whether or not new states of matter are being produced in these collisions,” said Andrew Whitbeck, an assistant professor in the Department of Physics & Astronomy.


    “Specifically, we are looking for dark matter produced in association with quarks, the constituents of the proton and neutron. These signatures are important for both understanding the nature of dark matter, but also the nature of the Higgs boson, a cornerstone of our theory for how elementary particles interact.”

    The discovery of the Higgs boson at the LHC in 2012 was a widely celebrated accomplishment of the LHC and the detector collaborations involved.


    However, the mere existence of the Higgs boson has provoked a lot of questions about whether there are new particles that could help us better understand the Higgs boson and other questions, like why gravity is so weak compared to other forces.

    As an offshoot of that finding, Whitbeck has been working to better understand a type of particle called neutrinos.

    “Neutrinos are a unique particle in the catalog of known particles in that they are the lightest matter particles, and they only can interact with particles via the Weak force, which, as its name suggests, only produces a feeble force between neutrinos and other matter,” Whitbeck said. “Neutrinos are so weakly interacting at the energies produced by the LHC that it is very likely a neutrino travels through the entire earth without deviating from its initial trajectory.

    “Dark matter is expected to behave similarly given that, despite being all around us, we don’t directly see it. This means that in looking for dark matter produced in proton-proton collisions, we often find lots of neutrinos. Understanding how many events with neutrinos there are is an important first step to understanding if there are events with dark matter.”

    Since the discovery of the Higgs boson, many of the most obvious signatures have come up empty for any signs of dark matter, and the latest results are some of the most sensitive measurements done to date. However, Whitbeck and his fellow scientists will continue to look for many more subtle signatures as well as a very powerful signature in which dark matter hypothetically is produced almost by itself, with only one lonely proton fragment visible in the event. The strategy provides powerful constraints for the most difficult-to-see models of dark matter.

    “With all of the traditional ways of searching for dark matter in proton-proton collisions turning up empty, I have also been working to design a new experiment, the Light Dark Matter eXperiment (LDMX), that will employ detector technology and techniques similar to what is used at CMS to look for dark matter,” Whitbeck said.

    The LDMX Experiment schematic (Texas Tech)

    “One significant difference is that LDMX will look at electrons bombarding a target. If the mass of dark matter is somewhere between the mass of the electron and the mass of the proton, this experiment will likely be able to see it.”

    Texas Tech also is working to upgrade the CMS detector so it can handle much higher rates of collisions after the LHC undergoes some upgrades of its own. The hope is that with higher rates, they’ll be able to see not only new massive particles but also the rarest of processes, such as the production of two Higgs bosons. This detector construction is ramping up now at Texas Tech’s new Advanced Physics Detector Laboratory at Reese Technology Center.

    Besides being a background for dark matter searches, neutrinos also are a growing focus of research in particle physics. Even now, the Fermi National Accelerator Laboratory is able to produce intense beams of neutrinos that can be used to study their idiosyncrasies, but there are plans to upgrade the facility to produce the most intense beams of neutrinos ever and to place the most sensitive neutrino detectors nearby, making the U.S. the center of neutrino physics.


    Measurements done with these neutrinos could unlock whether these particles play a big role in the creation of a matter-dominated universe.

    Texas Tech’s High Energy Physics group hopes that, in the near future, it can help tackle some of the challenges this endeavor presents.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition


    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics. Fermilab is America’s premier laboratory for particle physics and accelerator research, funded by the U.S. Department of Energy. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.

     
  • richardmitnick 1:18 pm on August 5, 2019 Permalink | Reply
    Tags: "Fermilab’s HEPCloud goes live", , CERN CMS, , , , ,   

    From Fermi National Accelerator Lab: “Fermilab’s HEPCloud goes live” 

    FNAL Art Image by Angela Gonzales

    From Fermi National Accelerator Lab, an enduring source of strength for the US contribution to scientific research worldwide.

    August 5, 2019
    Marcia Teckenbrock

    To meet the evolving needs of high-energy physics experiments, the underlying computing infrastructure must also evolve. Say hi to HEPCloud, the new, flexible way of meeting the peak computing demands of high-energy physics experiments using supercomputers, commercial services and other resources.

    Five years ago, Fermilab scientific computing experts began addressing the computing resource requirements for research occurring today and in the next decade. Back then, in 2014, some of Fermilab’s neutrino programs were just starting up. Looking further into the future, plans were under way for two big projects. One was Fermilab’s participation in the future High-Luminosity Large Hadron Collider at the European laboratory CERN.

    The other was the expansion of the Fermilab-hosted neutrino program, including the international Deep Underground Neutrino Experiment. All of these programs would be accompanied by unprecedented data demands.

    To meet these demands, the experts had to change the way they did business.

    HEPCloud, the flagship project pioneered by Fermilab, changes the computing landscape because it employs an elastic computing model. Tested successfully over the last couple of years, it officially went into production as a service for Fermilab researchers this spring.

    Scientists on Fermilab’s NOvA experiment were able to execute around 2 million hardware threads at the Office of Science’s National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory, on the Cray Cori II supercomputer, named after Gerty Cori, the first American woman to win a Nobel Prize in science. Scientists on the CMS experiment have been running workflows using HEPCloud at NERSC as a pilot project. Photo: Roy Kaltschmidt, Lawrence Berkeley National Laboratory

    Experiments currently have some fixed computing capacity that meets, but doesn’t overshoot, their everyday needs. For times of peak demand, HEPCloud enables elasticity, allowing experiments to rent computing resources from other sources, such as supercomputers and commercial clouds, and manages them to satisfy peak demand. The prior method was to purchase local resources that, on a day-to-day basis, overshot everyday needs. In this way, HEPCloud reduces the costs of providing computing capacity.

    “Traditionally, we would buy enough computers for peak capacity and put them in our local data center to cover our needs,” said Fermilab scientist Panagiotis Spentzouris, former HEPCloud project sponsor and a driving force behind HEPCloud. “However, the needs of experiments are not steady. They have peaks and valleys, so you want an elastic facility.”
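    The elastic model can be caricatured as a simple provisioning rule: keep a fixed local pool sized for everyday demand and rent outside capacity only for the spikes. The toy function below illustrates the accounting idea only; it is not HEPCloud’s actual scheduling logic.

        # Toy sketch of the elastic-provisioning idea behind HEPCloud.
        # Not the real scheduler; just the accounting concept.

        LOCAL_CORES = 10_000  # fixed on-site capacity, sized for everyday demand

        def provision(demand_cores: int) -> dict:
            """Split a compute request between local and rented resources."""
            local = min(demand_cores, LOCAL_CORES)
            rented = max(demand_cores - LOCAL_CORES, 0)  # cloud/HPC burst
            return {"local": local, "rented": rented}

        # Everyday load fits on site; a peak-demand spike bursts out.
        print(provision(8_000))    # {'local': 8000, 'rented': 0}
        print(provision(160_000))  # {'local': 10000, 'rented': 150000}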

    In addition, HEPCloud optimizes resource usage across all types, whether these resources are on site at Fermilab, on a grid such as Open Science Grid, in a cloud such as Amazon or Google, or at supercomputing centers like those run by the DOE Office of Science Advanced Scientific Computing Research program (ASCR). And it provides a uniform interface for scientists to easily access these resources without needing expert knowledge about where and how best to run their jobs.

    The idea to create a virtual facility to extend Fermilab’s computing resources began in 2014, when Spentzouris and Fermilab scientist Lothar Bauerdick began exploring ways to best provide resources for experiments at CERN’s Large Hadron Collider. The idea was to provide those resources based on the overall experiment needs rather than a certain amount of horsepower. After many planning sessions with computing experts from the CMS experiment at the LHC and beyond, and after a long period of hammering out the idea, a scientific facility called “One Facility” was born. DOE Associate Director of Science for High Energy Physics Jim Siegrist coined the name “HEPCloud” — a computing cloud for high-energy physics — during a general discussion about a solution for LHC computing demands. But interest beyond high-energy physics was also significant. DOE Associate Director of Science for Advanced Scientific Computing Research Barbara Helland was interested in HEPCloud for its relevancy to other Office of Science computing needs.

    The CMS detector at CERN collects data from particle collisions at the Large Hadron Collider. Now that HEPCloud is in production, CMS scientists will be able to run all of their physics workflows on the expanded resources made available through HEPCloud. Photo: CERN

    The project was a collaborative one. In addition to many individuals at Fermilab, Miron Livny at the University of Wisconsin-Madison contributed to the design, enabling HEPCloud to use the workload management system known as Condor (now HTCondor), which is used for all of the lab’s current grid activities.

    Since its inception, HEPCloud has achieved several milestones as it moved through the several development phases leading up to production. The project team first demonstrated the use of cloud computing on a significant scale in February 2016, when the CMS experiment used HEPCloud to achieve about 60,000 cores on the Amazon cloud, AWS. In November 2016, CMS again used HEPCloud to run 160,000 cores using Google Cloud Services, doubling the total size of the LHC’s computing worldwide. Most recently, in May 2018, NOvA scientists were able to execute around 2 million hardware threads at a supercomputer at the Office of Science’s National Energy Research Scientific Computing Center (NERSC), increasing both the scale and the amount of resources provided. During these activities, the experiments were executing and benefiting from real physics workflows. NOvA was even able to report significant scientific results at the Neutrino 2018 conference in Germany, one of the most attended conferences in neutrino physics.

    CMS has been running workflows using HEPCloud at NERSC as a pilot project. Now that HEPCloud is in production, CMS scientists will be able to run all of their physics workflows on the expanded resources made available through HEPCloud.

    Next, HEPCloud project members will work to expand the reach of HEPCloud even further, enabling experiments to use the leadership-class supercomputing facilities run by ASCR at Argonne National Laboratory and Oak Ridge National Laboratory.

    Fermilab experts are working to see that, eventually, all Fermilab experiments are configured to use these extended computing resources.

    This work is supported by the DOE Office of Science.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition


    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics. Fermilab is America’s premier laboratory for particle physics and accelerator research, funded by the U.S. Department of Energy. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.

     
  • richardmitnick 12:35 pm on July 18, 2019 Permalink | Reply
    Tags: "CMS releases open data for Machine Learning", , CERN CMS, , , ,   

    From CERN CMS: “CMS releases open data for Machine Learning” 


    From CERN CMS

    17 July, 2019

    CMS has also provided open access to 100% of its research data recorded in proton–proton collisions in 2010.

    (Image: Fermilab/CERN)

    The CMS collaboration at CERN has released its fourth batch of open data to the public. With this release, which brings the volume of its open data to more than 2 PB (or two million GB), CMS has now provided open access to 100% of its research data recorded in proton–proton collisions in 2010, in line with the collaboration’s data-release policy. The release also includes several new data and simulation samples. The new release builds upon and expands the scope of the successful use of CMS open data in research and in education.

    In this release, CMS open data address the ever-growing application of machine learning (ML) to challenges in high-energy physics. According to a recent paper, collaboration with the data-science and ML community is considered a high priority to help advance the application of state-of-the-art algorithms in particle physics. CMS has therefore also made available samples that can help foster such collaboration.

    “Modern machine learning is having a transformative impact on collider physics, from event reconstruction and detector simulation to searches for new physics,” remarks Jesse Thaler, an Associate Professor at MIT, who is working on ML using CMS open data with two doctoral students, Patrick Komiske and Eric Metodiev. “The performance of machine-learning techniques, however, is directly tied to the quality of the underlying training data. With the extra information provided in the latest data release from CMS, outside users can now investigate novel strategies on fully realistic samples, which will likely lead to exciting advances in collider data analysis.”

    The ML datasets, derived from millions of CMS simulation events for previous and future runs of the Large Hadron Collider, focus on solving a number of representative challenges for particle identification, tracking and distinguishing between multiple collisions that occur in each crossing of proton bunches. All the datasets come with extensive documentation on what they contain, how to use them and how to reproduce them with modified content.
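    For readers who want to explore the released samples, CMS open data are distributed as ROOT files, which can be inspected from Python with a library such as uproot. This is a hedged sketch: the file name, tree name and branch names below are placeholders, not identifiers of an actual dataset on the portal.

        # Sketch: inspecting a ROOT file from the CERN Open Data portal
        # with uproot. File, tree and branch names are placeholders.
        import uproot

        with uproot.open("cms_opendata_sample.root") as f:
            tree = f["Events"]       # assumed name of the event tree
            print(tree.num_entries)  # number of stored events
            # Read a couple of (assumed) jet branches into NumPy arrays.
            arrays = tree.arrays(["Jet_pt", "Jet_eta"], library="np")
            print(arrays["Jet_pt"][:5])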

    In its policy on data preservation and open access, CMS commits to releasing 100% of its analysable data within ten years of collecting them. Around half of the proton–proton collision data collected at a centre-of-mass energy of 7 TeV in 2010 were released in the first CMS release in 2014, and the remaining data are included in this new release. In addition, a small sample of unprocessed raw data from the LHC’s Run 1 (2010 to 2012) is also released. These samples will help test the chain for processing CMS data using the legacy software environment.

    Reconstructed data and simulations from the CASTOR calorimeter, which was used by CMS in 2010, are also available and represent the first release of data from the very-forward region of CMS. Finally, CMS has released instructions and examples on how to generate simulated events and how to analyse data in isolated “containers”, within which one has access to the CMS software environment required for specific datasets. It is also easier to search through the simulated data and to discover the provenance of datasets.

    As before, the data are released into the public domain under the Creative Commons CC0 waiver via the CERN Open Data portal. The portal is openly developed by the CERN Information Technology department, in cooperation with the experimental collaborations who release open data on it.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Meet CERN in a variety of places:

    Quantum Diaries

    Cern Courier

     
  • richardmitnick 12:10 pm on July 15, 2019 Permalink | Reply
    Tags: CERN CMS

    From CERN: “Exploring the Higgs boson ‘discovery channels’” 



    From CERN

    12th July 2019
    ATLAS Collaboration

    Event display of a two-electron two-muon ZH candidate. The Higgs candidate can be seen on the left with the two leading electrons represented by green tracks and green EM calorimeter deposits (pT = 22 and 120 GeV), and two subleading muons indicated by two red tracks (pT = 34 and 43 GeV). Recoiling against the four lepton candidate in the left hemisphere is a dimuon pair in the right hemisphere indicated by two red tracks (pT = 139 and 42 GeV) and an invariant mass of 91.5 GeV, which agrees well with the mass of the Z boson. (Image: ATLAS Collaboration/CERN)

    At the 2019 European Physical Society’s High-Energy Physics conference (EPS-HEP) taking place in Ghent, Belgium, the ATLAS and CMS collaborations presented a suite of new results. These include several analyses using the full dataset from the second run of CERN’s Large Hadron Collider (LHC), recorded at a collision energy of 13 TeV between 2015 and 2018. Among the highlights are the latest precision measurements involving the Higgs boson. In only seven years since its discovery, scientists have carefully studied several of the properties of this unique particle, which is increasingly becoming a powerful tool in the search for new physics.

    The results include new searches for transformations (or “decays”) of the Higgs boson into pairs of muons and into pairs of charm quarks. Both ATLAS and CMS also measured previously unexplored properties of decays of the Higgs boson that involve electroweak bosons (the W, the Z and the photon) and compared these with the predictions of the Standard Model (SM) of particle physics. ATLAS and CMS will continue these studies over the course of the LHC’s Run 3 (2021 to 2023) and in the era of the High-Luminosity LHC (from 2026 onwards).

    The Higgs boson is the quantum manifestation of the all-pervading Higgs field, which gives mass to elementary particles it interacts with, via the Brout-Englert-Higgs mechanism. Scientists look for such interactions between the Higgs boson and elementary particles, either by studying specific decays of the Higgs boson or by searching for instances where the Higgs boson is produced along with other particles. The Higgs boson decays almost instantly after being produced in the LHC and it is by looking through its decay products that scientists can probe its behaviour.

    In the LHC’s Run 1 (2010 to 2012), decays of the Higgs boson involving pairs of electroweak bosons were observed. Now, the complete Run 2 dataset – around 140 inverse femtobarns each, the equivalent of over 10 000 trillion collisions – provides a much larger sample of Higgs bosons to study, allowing measurements of the particle’s properties to be made with unprecedented precision. ATLAS and CMS have measured the so-called “differential cross-sections” of the bosonic decay processes, which look at not just the production rate of Higgs bosons but also the distribution and orientation of the decay products relative to the colliding proton beams. These measurements provide insight into the underlying mechanism that produces the Higgs bosons. Both collaborations determined that the observed rates and distributions are compatible with those predicted by the Standard Model, within the current statistical uncertainties.

    Since the strength of the Higgs boson’s interaction is proportional to the mass of elementary particles, it interacts most strongly with the heaviest generation of fermions, the third. Previously, ATLAS and CMS had each observed these interactions. However, interactions with the lighter second-generation fermions – muons, charm quarks and strange quarks – are considerably rarer. At EPS-HEP, both collaborations reported on their searches for the elusive second-generation interactions.
    ATLAS presented their first result from searches for Higgs bosons decaying to pairs of muons (H→μμ) with the full Run 2 dataset. This search is complicated by the large background of more typical SM processes that produce pairs of muons. “This result shows that we are now close to the sensitivity required to test the Standard Model’s predictions for this very rare decay of the Higgs boson,” says Karl Jakobs, the ATLAS spokesperson. “However, a definitive statement on the second generation will require the larger datasets that will be provided by the LHC in Run 3 and by the High-Luminosity LHC.”

    CMS presented their first result on searches for decays of Higgs bosons to pairs of charm quarks (H→cc). When a Higgs boson decays into quarks, these elementary particles immediately produce jets of particles. “Identifying jets formed by charm quarks and isolating them from other types of jets is a huge challenge,” says Roberto Carlin, spokesperson for CMS. “We’re very happy to have shown that we can tackle this difficult decay channel. We have developed novel machine-learning techniques to help with this task.”

    An event recorded by CMS showing a candidate for a Higgs boson produced in association with two top quarks. The Higgs boson and top quarks decay leading to a final state with seven jets (orange cones), an electron (green line), a muon (red line) and missing transverse energy (pink line) (Image: CMS/CERN)

    The Higgs boson also acts as a mediator of physics processes in which electroweak bosons scatter or bounce off each other. Studies of these processes with very high statistics serve as powerful tests of the Standard Model. ATLAS presented the first-ever measurement of the scattering of two Z bosons. Observing this scattering completes the picture for the W and Z bosons, as ATLAS has previously observed the WZ scattering process and both collaborations the WW process. CMS presented the first observation of electroweak-boson scattering that results in the production of a Z boson and a photon.

    “The experiments are making big strides in the monumental task of understanding the Higgs boson,” says Eckhard Elsen, CERN’s Director of Research and Computing. “After observation of its coupling to the third-generation fermions, the experiments have now shown that they have the tools at hand to address the even more challenging second generation. The LHC’s precision physics programme is in full swing.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Meet CERN in a variety of places:

    Quantum Diaries

    Cern Courier

     
  • richardmitnick 12:38 pm on May 25, 2019 Permalink | Reply
    Tags: "CMS hunts for dark photons coming from the Higgs boson", , CERN CMS, , , One idea is that dark matter comprises dark particles that interact with each other through a mediator particle called the dark photon, , ,   

    From CERN CMS: “CMS hunts for dark photons coming from the Higgs boson” 


    From CERN CMS

    24 May, 2019
    Ana Lopes

    A proton–proton collision event featuring a muon–antimuon pair (red), a photon (green), and large missing transverse momentum. (Image: CERN)

    They know it’s there but they don’t know what it’s made of. That pretty much sums up scientists’ knowledge of dark matter.

    Fritz Zwicky discovered dark matter when observing the movement of the Coma Cluster. Vera Rubin, a woman in STEM denied the Nobel Prize, did most of the work on dark matter.

    This knowledge comes from observations of the universe, which indicate that the invisible form of matter is about five to six times more abundant than visible matter.

    One idea is that dark matter comprises dark particles that interact with each other through a mediator particle called the dark photon, named in analogy with the ordinary photon that acts as a mediator between electrically charged particles. A dark photon would also interact weakly with the known particles described by the Standard Model of particle physics, including the Higgs boson.


    At the Large Hadron Collider Physics (LHCP) conference, happening this week in Puebla, Mexico, the CMS collaboration reported the results of its latest search for dark photons.

    The collaboration used a large proton–proton collision dataset, collected during the Large Hadron Collider’s second run, to search for instances in which the Higgs boson might transform, or “decay”, into a photon and a massless dark photon. They focused on cases in which the boson is produced together with a Z boson that itself decays into electrons or their heavier cousins known as muons.

    Such instances are expected to be extremely rare, and finding them requires deducing the presence of the potential dark photon, which particle detectors won’t see. For this, researchers add up the momenta of the detected particles in the transverse direction – that is, at right angles to the colliding beams of protons – and identify any missing momentum needed to reach a total value of zero. Such missing transverse momentum indicates an undetected particle.

    But there’s another step to distinguish between a possible dark photon and known particles. This entails estimating the mass of the particle that decays into the detected photon and the undetected particle. If the missing transverse momentum is carried by a dark photon produced in the decay of the Higgs boson, that mass should correspond to the Higgs-boson mass.
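    The two steps just described can be sketched in a few lines: the missing transverse momentum is the negative vector sum of the visible particles, and a transverse-mass variable serves as a standard proxy for the mass estimate mentioned above. The event contents are invented for illustration.

        # Sketch: (1) missing transverse momentum as the negative vector sum
        # of visible particles; (2) transverse mass of the photon plus the
        # undetected particle. Toy numbers, not real CMS event content.
        import math

        def missing_pt(visible):
            """visible: list of (px, py) in GeV for all detected particles."""
            sum_px = sum(px for px, _ in visible)
            sum_py = sum(py for _, py in visible)
            return (-sum_px, -sum_py)  # what is needed to balance the event

        def transverse_mass(pt1, phi1, pt2, phi2):
            """m_T of two effectively massless objects, in GeV."""
            return math.sqrt(2.0 * pt1 * pt2 * (1.0 - math.cos(phi1 - phi2)))

        # Toy event: two muons from the Z boson plus a photon (px, py in GeV).
        muon1, muon2, photon = (40.0, 10.0), (-25.0, 30.0), (-55.0, -90.0)
        mex, mey = missing_pt([muon1, muon2, photon])
        met = math.hypot(mex, mey)

        mt = transverse_mass(math.hypot(*photon), math.atan2(photon[1], photon[0]),
                             met, math.atan2(mey, mex))
        print(f"missing pT ~ {met:.0f} GeV, m_T(photon, invisible) ~ {mt:.0f} GeV")
        # A dark-photon signal would cluster near the Higgs mass (~125 GeV).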

    The CMS collaboration followed this approach but found no signal of dark photons. However, the collaboration was able to place upper bounds on how often the Higgs boson could decay in this way.

    Another null result? Yes, but results such as these and the ATLAS results on supersymmetry also presented this week in Puebla, while not finding new particles or ruling out their existence, are much needed to guide future work, both experimental and theoretical.

    For more details about this result, see the CMS website.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Meet CERN in a variety of places:

    Quantum Diaries

    Cern Courier

     
  • richardmitnick 12:04 pm on May 14, 2019 Permalink | Reply
    Tags: Model-dependent vs model-independent research, CERN CMS

    From Symmetry: “Casting a wide net” 

    From Symmetry

    05/14/19
    Jim Daley

    Illustration by Sandbox Studio, Chicago

    In their quest to discover physics beyond the Standard Model, physicists weigh the pros and cons of different search strategies.

    On October 30, 1975, theorists John Ellis, Mary K. Gaillard and D.V. Nanopoulos published a paper [Science Direct] titled “A Phenomenological Profile of the Higgs Boson.” They ended their paper with a note to their fellow scientists.

    “We should perhaps finish with an apology and a caution,” it said. “We apologize to experimentalists for having no idea what is the mass of the Higgs boson… and for not being sure of its couplings to other particles, except that they are probably all very small.

    “For these reasons, we do not want to encourage big experimental searches for the Higgs boson, but we do feel that people performing experiments vulnerable to the Higgs boson should know how it may turn up.”

    What the theorists were cautioning against was a model-dependent search, a search for a particle predicted by a certain model—in this case, the Standard Model of particle physics.


    It shouldn’t have been too much of a worry. Around then, most particle physicists’ experiments were general searches, not based on predictions from a particular model, says Jonathan Feng, a theoretical particle physicist at the University of California, Irvine.

    Using early particle colliders, physicists smashed electrons and protons together at high energies and looked to see what came out. Samuel Ting and Burton Richter, who shared the 1976 Nobel Prize in physics for the discovery of the charm quark, for example, were not looking for the particle with any theoretical prejudice, Feng says.

    That began to change in the 1980s and ’90s. That’s when physicists began exploring elegant new theories such as supersymmetry, which could tie up many of the Standard Model’s theoretical loose ends—and which predict the existence of a whole slew of new particles for scientists to try to find.

    Of course, there was also the Higgs boson. Even though scientists didn’t have a good prediction of its mass, they had good motivations for thinking it was out there waiting to be discovered.

    And it was. Almost 40 years after the theorists’ tongue-in-cheek warning about searching for the Higgs, Ellis found himself sitting in the main auditorium at CERN next to experimentalist Fabiola Gianotti, the spokesperson of the ATLAS experiment at the Large Hadron Collider who, along with CMS spokesperson Joseph Incandela, had just co-announced the discovery of the particle he had once so pessimistically described.


    Model-dependent vs model-independent

    Scientists’ searches for particles predicted by certain models continue, but in recent years, searches for new physics independent of those models have begun to enjoy a resurgence as well.

    “A model-independent search is supposed to distill the essence from a whole bunch of specific models and look for something that’s independent of the details,” Feng says. The goal is to find an interesting common feature of those models, he explains. “And then I’m going to just look for that phenomenon, irrespective of the details.”

    Particle physicist Sara Alderweireldt uses model-independent searches in her work on the ATLAS experiment at the Large Hadron Collider.


    Alderweireldt says that while many high-energy particle physics experiments are designed to make very precise measurements of a specific aspect of the Standard Model, a model-independent search allows physicists to take a wider view and search more generally for new particles or interactions. “Instead of zooming in, we try to look in as many places as possible in a consistent way.”

    Such a search makes room for the unexpected, she says. “You’re not dependent on the prior interpretation of something you would be looking for.”

    Theorist Patrick Fox and experimentalist Anadi Canepa, both at Fermilab, collaborate on searches for new physics.


    In Canepa’s work on the CMS experiment, the other general-purpose particle detector at the LHC, many of the searches are model-independent.

    While the nature of these searches allows them to “cast a wider net,” Fox says, “they are in some sense shallower, because they don’t manage to strongly constrain any one particular model.”

    At the same time, “by combining the results from many independent searches, we are getting closer to one dedicated search,” Canepa says. “Developing both model-dependent and model-independent searches is the approach adopted by the CMS and ATLAS experiments to fully exploit the unprecedented potential of the LHC.”

    Driven by data and powered by machine learning

    Model-dependent searches focus on a single assumption or look for evidence of a specific final state following an experimental particle collision. Model-independent searches are far broader—and how broad is largely driven by the speed at which data can be processed.

    “We have better particle detectors, and more advanced algorithms and statistical tools that are enabling us to understand searches in broader terms,” Canepa says.

    One reason model-independent searches are gaining prominence is because now there is enough data to support them. Particle detectors are recording vast quantities of information, and modern computers can run simulations faster than ever before, she says. “We are able to do model-independent searches because we are able to better understand much larger amounts of data and extreme regions of parameter and phase space.”

    Machine learning is a key part of this processing power, Canepa says. “That’s really a change of paradigm, because it really made us make a major leap forward in terms of sensitivity [to new signals]. It really allows us to benefit from understanding the correlations that we didn’t capture in a more classical approach.”

    These broader searches are an important part of modern particle physics research, Fox says.

    “At a very basic level, our job is to bequeath to our descendants a better understanding of nature than we got from our ancestors,” he says. “One way to do that is to produce lots of information that will stand the test of time, and one way of doing that is with model-independent searches.”

    Models go in and out of fashion, he adds. “But model-independent searches don’t feel like they will.”

    See the full article here.



    Please help promote STEM in your local schools.


    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 2:12 pm on March 22, 2019 Permalink | Reply
    Tags: CERN CMS, Muoscope: a new small-scale portable muon telescope

    From CERN CMS: “A ‘muoscope’ with CMS technology” 


    From CERN CMS

    22 March, 2019
    Cristina Agrigoroae

    The resistive plate chambers (RPC) at CMS are fast gaseous detectors that provide a muon trigger system (Image: CERN)

    Particle physicists are experts at seeing invisible things, and their detection techniques have already found many applications in medical imaging and the analysis of artworks. Researchers from the CMS experiment at the Large Hadron Collider are developing a new application based on one of the experiment’s particle detectors: a new, small-scale, portable muon telescope, which will allow imaging of visually inaccessible spaces.

    The muoscope: a new, small-scale, portable muon telescope developed by CMS collaborators from Ghent University and the University of Louvain in Belgium

    Earth’s atmosphere is constantly bombarded by particles arriving from outer space. Interacting with atmospheric matter, they produce cascades of new particles, generating a flux of muons, heavier cousins of electrons. These cosmic-ray muons continue their journey towards the Earth’s surface, travelling through almost all material objects.

    This “superpower” of muons makes them the perfect partners for seeing through thick walls or other visually challenging subjects. Volcanic eruptions, enigmatic ancient pyramids, underground caves and tunnels: these can all be scanned and explored from the inside using muography, an imaging method using naturally occurring background radiation in the form of cosmic-ray muons.
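    The principle can be sketched as transmission imaging: compare the muon count observed through the object, direction by direction, with the count expected from the open sky, and convert any deficit into an effective opacity. The absorption model and counts below are invented for illustration and are far cruder than a real muography analysis.

        # Toy sketch of the transmission idea behind muography.
        import math

        def opacity(n_observed, n_open_sky):
            """Effective attenuation along one viewing direction."""
            # Crude exponential-absorption model (illustrative only; real
            # analyses track muon energy loss through matter in detail).
            return -math.log(n_observed / n_open_sky)

        # Muon counts per direction after equal exposure times (invented).
        open_sky = 1000
        directions = {"through a cavity": 950, "through solid rock": 400}

        for name, counts in directions.items():
            print(f"{name}: opacity ~ {opacity(counts, open_sky):.2f}")
        # A hidden void shows up as a direction with anomalously low opacity.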

    Large-area muon telescopes have been developed in recent years for many different applications, some of which use technology developed for the LHC detectors. The muon telescope conceived by CMS researchers from two Belgian universities, Ghent University and the Catholic University of Louvain, is compact and light and therefore easy to transport. It is nonetheless able to perform muography at high resolution. It will be the first spin-off for muography using the CMS Resistive Plate Chambers (RPC) technology. A first prototype of the telescope, also baptised a “muoscope”, has been built with four RPC planes with an active area of 16×16 cm. The same prototype was used in the “UCL to Mars” project; it was tested for its robustness in a simulation of Mars-like conditions in the Utah Desert, where it operated for one month and later came back fully functional.

    Other CMS technologies have been used in muon tomography for security and environmental protection, as well as for homeland security.

    Learn more about the muon telescope here.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Meet CERN in a variety of places:

    Quantum Diaries

    Cern Courier

     
  • richardmitnick 3:26 pm on February 26, 2019 Permalink | Reply
    Tags: "What’s in store for the CMS detector over the next two years?", , CERN CMS, , , , ,   

    From CERN CMS: “What’s in store for the CMS detector over the next two years?” 


    From CERN CMS

    26 February, 2019
    Letizia Diamante


    A jewel of particle physics, the CMS experiment is a 14 000-tonne detector that aims to answer a wide range of questions about the mysteries of the Higgs boson and dark matter.


    Now that the Large Hadron Collider (LHC) beam has been switched off for a two-year technical stop, Long Shutdown 2 (LS2), CMS is preparing for significant maintenance work and upgrades.

    This diagram of the CMS detector shows some of the maintenance and upgrades in store over the next two years.

    All the LHC experiments at CERN want to exploit the full benefits of the accelerator’s upgrade, the High-Luminosity LHC (HL-LHC), scheduled to start in 2026.

    The HL-LHC will produce between five and ten times more collisions than the LHC, allowing more precise measurements of rare phenomena predicted by the Standard Model, and maybe even the detection of new particles that have never been seen before. To take advantage of this, some of CMS’s components need to be replaced.


    In the heart of CMS

    Hidden inside several layers of subdetectors, the pixel detector surrounding the beam pipe is the core of the experiment, as it is the closest to the particle-collision point. During LS2, the innermost layer of the present pixel detector will be replaced, using more high-luminosity-tolerant and radiation-tolerant components. The beam pipe will also be replaced in LS2, with one that will allow the extremities of the future pixel detectors to get even closer to the interaction point. This third-generation pixel detector will be installed during the third long shutdown (LS3) in 2024–2026.

    CMS core removal during Long Shutdown 2 (LS2) (Image: Maximilien Brice/Julien Ordan/CERN)

    Without missing a thing

    Beyond the core, the CMS collaboration is also planning to work on the outermost part of the detector, which detects and measures muons – particles similar to electrons, but much heavier. They are preparing to install 40 large Gas Electron Multiplier (GEM) chambers to measure muons that scatter at an angle of around 10° – one of the most challenging angles for the detector to deal with. Invented in 1997 by Fabio Sauli, GEM chambers are already used in other CERN experiments, including COMPASS, TOTEM and LHCb, but the scale of CMS is far greater than that of the other detectors. The GEM chambers consist of a thin, metal-clad polymer foil, chemically pierced with millions of holes, typically 50 to 100 per millimetre, submerged in a gas. As muons pass through, electrons released by the gas drift into the holes, multiply in a very strong electric field and transfer to a collection region.
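    As a back-of-envelope illustration of the multiplication step, a simple Townsend-style model, gain = exp(alpha × d), gives gains of order a few tens per foil, which is why GEM-based detectors often stack several foils. The coefficient below is an assumed effective value, not a measured CMS number.

        # Back-of-envelope sketch of avalanche multiplication in a GEM hole,
        # using a simple Townsend model: gain = exp(alpha * d).
        import math

        alpha = 0.06          # effective ionizations per micron (assumed value)
        hole_depth_um = 50.0  # roughly the thickness of a standard GEM foil

        gain_single = math.exp(alpha * hole_depth_um)
        print(f"single-foil gain ~ {gain_single:.0f}")    # ~20

        # Stacked foils multiply their gains, so a triple-GEM stack reaches
        # overall gains of order 10^4 in this crude model.
        print(f"triple-GEM gain ~ {gain_single**3:.1e}")  # ~8.1e+03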

    Fast-forward to the future

    Some of the existing detectors would not perform well enough during the HL-LHC phase, as the number of proton–proton collisions produced in the HL-LHC will be ten times higher than that originally planned for the CMS experiment. Therefore, the high-granularity calorimeter (HGCAL) will replace the existing endcap electromagnetic and hadronic calorimeters during LS3, between 2024 and 2026. The new detector will comprise over 1000 m² of hexagonal silicon sensors and plastic scintillator tiles, distributed over 100 layers (50 in each endcap), providing unprecedented information about electrons, photons and hadrons. Exploiting this detector is a major challenge for software and analysis, and physicists and computer science experts are already working on advanced techniques, such as machine learning.

    Ongoing tests on the modules of the high-granularity calorimeter (HGCAL). Intense R&D is planned for LS2 to ensure that the new detector will be ready for installation during LS3. (Image: Maximilien Brice/CERN)

    Building, building, building

    CMS has also been involved with the HL-LHC civil-engineering work, which kick-started in June 2018 and is ongoing. The project includes five new buildings on the surface at Cessy, France, as well as modifications to the underground cavern and galleries.

    CMS’s ambitious plan for the near and longer-term future is preparing the detector for more exciting undertakings. Stay tuned for more.

    Read more in “CMS has high luminosity in sight” in the latest CERN Courier, as well as LS2 highlights from ALICE, ATLAS and LHCb.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Meet CERN in a variety of places:

    Quantum Diaries

    Cern Courier

     