Tagged: CERN LHC

  • richardmitnick 3:29 pm on January 14, 2020 Permalink | Reply
    Tags: CERN LHC, Dilepton channel, Drell–Yan process, Searching for new physics in the TeV regime by looking for the decays of new particles., The dark photon (Zd)?

    From CERN Courier: “CMS goes scouting for dark photons” 


    From CERN Courier

    6 December 2019
    A report from the CMS experiment

    One of the best strategies for searching for new physics in the TeV regime is to look for the decays of new particles. The CMS collaboration has searched in the dilepton channel for particles with masses above a few hundred GeV since the start of LHC data taking. Thanks to newly developed triggers, the searches are now being extended to the more difficult lower range of masses. A promising possible addition to the Standard Model (SM) that could exist in this mass range is the dark photon (Zd). Its coupling with SM particles and production rate depend on the value of a kinetic mixing coefficient ε, and the resulting strength of the interaction of the Zd with ordinary matter may be several orders of magnitude weaker than the electroweak interaction.
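    For readers who want the formula behind that statement: in the usual parametrisation (a standard convention, not something spelled out in the Courier article itself), the dark photon enters the Lagrangian through a kinetic-mixing term with the Standard Model hypercharge field,

    ```latex
    \mathcal{L} \;\supset\; \frac{\varepsilon}{2\cos\theta_W}\, B_{\mu\nu}\, Z_d^{\mu\nu}
    ```

    where B_{μν} and Z_d^{μν} are the field-strength tensors of the hypercharge and dark-photon fields and θ_W is the weak mixing angle. Every coupling of the Zd to ordinary matter picks up one factor of ε, so production cross sections, and hence the limits quoted below, scale as ε².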

    The CMS collaboration has recently presented results of a search for a narrow resonance decaying to a pair of muons in the mass range from 11.5 to 200 GeV. This search looks for a strikingly sharp peak on top of a smooth dimuon mass spectrum that arises mainly from the Drell–Yan process. At masses below approximately 40 GeV, conventional triggers are the main limitation for this analysis as the thresholds on the muon transverse momenta (pT), which are applied online to reduce the rate of events saved for offline analysis, introduce a significant kinematic acceptance loss, as evident from the red curve in figure 1.

    Fig. 1. Dimuon invariant-mass distributions obtained from data collected by the standard dimuon triggers (red) and the dimuon scouting triggers (green).

    A dedicated set of high-rate dimuon “scouting” triggers, with some additional kinematic constraints on the dimuon system and significantly lower muon pT thresholds, was deployed during Run 2 to overcome this limitation. Only a minimal amount of high-level information from the online reconstruction is stored for the selected events. The reduced event size allows significantly higher trigger rates, up to two orders of magnitude higher than the standard muon triggers. The green curve in figure 1 shows the dimuon invariant mass distribution obtained from data collected with the scouting triggers. The increase in kinematic acceptance for low masses can be well appreciated.

    The full data sets collected with the muon scouting and standard dimuon triggers during Run 2 are used to probe masses below 45 GeV, and between 45 and 200 GeV, respectively, excluding the mass range from 75 to 110 GeV where Z-boson production dominates. No significant resonant peaks are observed, and limits are set on ε² at 90% confidence as a function of the Zd mass (figure 2). These are among the world’s most stringent constraints on dark photons in this mass range.

    Fig. 2. Upper limits on ε² as a function of the Zd mass. Results obtained with data collected by the dimuon scouting triggers are to the left of the dashed line. Constraints from measurement of the electroweak observables are shown in light blue.

    See the full article here .


    Please help promote STEM in your local schools.


    Stem Education Coalition

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS

    CERN/ATLAS detector

    ALICE

    CERN/ALICE Detector


    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    LHC

    CERN map

    CERN LHC Grand Tunnel

    CERN LHC particles

     
  • richardmitnick 12:32 pm on December 9, 2019 Permalink | Reply
    Tags: "Part of a disCERNing crowd", , Australian astrophysicist Martin White discusses life with and around the Large Hadron Collider., CERN LHC, , , ,   

    From COSMOS Magazine: “Part of a disCERNing crowd” 

    Cosmos Magazine bloc

    From COSMOS Magazine

    09 December 2019

    Australian astrophysicist Martin White discusses life with and around the Large Hadron Collider.

    An aerial view of the CERN site, enlivened by Martin White’s hand-written annotations. Credit: Atlas experiment / CERN

    It’s lunchtime, and I am standing with a colleague under the main site of the CERN laboratory, trying to work out whether to go right or left.

    With the rainy Geneva winter in full swing, he informs me that he’s found a hidden entrance to a network of tunnels under the foyer of CERN’s main building and has worked out how to get to the fabled Restaurant 2 without getting wet.

    All we have to do is follow his secret route through the tunnels, which it transpires is so secret that he himself has forgotten it. After half an hour squeezing past hanging cables and scary radiation warnings, we emerge starving exactly where we started out.

    This is life at CERN in a nutshell – an endless search for the unknown conducted in a spirit of frivolity by permanently hungry practitioners. Established in 1954, the European Organisation for Nuclear Research (CERN) hosts the largest particle accelerator ever built by humankind, named, rather appropriately, the Large Hadron Collider (LHC).

    It also has an ambitious and wide-ranging program of other experiments, which test various aspects of particle and nuclear physics, and develop practical spin-off applications of the cutting-edge technology required to push our understanding of the universe to deeper and deeper levels.

    Having lived there on and off for many years, the question I get asked more than any other is: “What does a person at CERN actually do all day?”

    Martin White – proudly part of “an endless search for the unknown”. Credit: GLENN HUNT

    I never had a typical day at CERN, since my work brought me into contact with computer scientists, civil and electrical engineers, medical physicists, theoretical physicists, accelerator experts, and detector physicists.

    The only common thread was attendance at a large number of meetings which, when located at opposite ends of the main site, led to frantic daily runs of a few kilometres that contributed to a significant weight loss – until I discovered the CERN cake selection.

    The preferred language is English, but it’s not easy to recognise it as such, due to a heavy reliance on jargon and acronyms.

    Moreover, I met physicists who could answer me in English, before translating for an Italian colleague, and mocking my question in German to a bystander.

    Nevertheless, I am always surprised at how quickly the exotic becomes normalised at CERN, whether that means getting acclimatised to constantly being surrounded by extraordinarily smart people or becoming used to dinner party statements like “I have a terrible day tomorrow – I have to reassemble the positron accumulator!”

    My work at CERN has involved the ATLAS experiment, one of the seven experiments of the LHC whose job is to filter and record the results of proton-proton collisions that occur more than one billion times a second.

    The middle of this detector is effectively a giant digital camera, consisting of 6.3 million strips of silicon, and my first job at CERN was to write the software that monitored each of these strips individually to confirm that the system was operating smoothly.

    I am one of CERN’s 12,000 users, and like most of them I have worked for various universities and research institutes scattered around the world, with frequent travel to the CERN laboratory as an external participant.

    The intense lure of CERN is that it remains the best international facility for discovering the new particles and laws of nature that would explain both how the Universe works on its smallest scales, and how it operated 0.0000000001 seconds after the Big Bang.

    The Standard Model of particle physics that I learnt as an undergraduate, and now pass on to my students, remains incapable of explaining most of the matter in the Universe, and it is widely believed that the LHC will finally shift us to a higher plane of understanding.

    LHC

    CERN map


    CERN LHC Maximilien Brice and Julien Marius Ordan


    CERN LHC particles

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS

    CERN ATLAS Image Claudia Marcelloni CERN/ATLAS

    ALICE

    CERN/ALICE Detector


    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 8:15 am on October 2, 2019 Permalink | Reply
    Tags: "How AI could change science", , , , , CERN LHC, , , Kavli Institute for Cosmological Physics,   

    From University of Chicago: “How AI could change science” 

    U Chicago bloc

    From University of Chicago

    Oct 1, 2019
    Louise Lerner
    Rob Mitchum

    At the University of Chicago, researchers are using artificial intelligence’s ability to analyze massive amounts of data in applications from scanning for supernovae to finding new drugs. shutterstock.com

    Researchers at the University of Chicago seek to shape an emerging field.

    AI technology is increasingly used to open up new horizons for scientists and researchers. At the University of Chicago, researchers are using it for everything from scanning the skies for supernovae to finding new drugs from millions of potential combinations and developing a deeper understanding of the complex phenomena underlying the Earth’s climate.

    Today’s AI commonly works by starting from massive data sets, from which it figures out its own strategies to solve a problem or make a prediction—rather than relying on humans to explicitly program how it should reach a conclusion. The results are an array of innovative applications.

    “Academia has a vital role to play in the development of AI and its applications. While the tech industry is often focused on short-term returns, realizing the full potential of AI to improve our world requires long-term vision,” said Rebecca Willett, professor of statistics and computer science at the University of Chicago and a leading expert on AI foundations and applications in science. “Basic research at universities and national laboratories can establish the fundamentals of artificial intelligence and machine learning approaches, explore how to apply these technologies to solve societal challenges, and use AI to boost scientific discovery across fields.”

    Prof. Rebecca Willett gives an introduction to her research on AI and data science foundations. Photo by Clay Kerr

    Willett is one of the featured speakers at the InnovationXLab Artificial Intelligence Summit hosted by UChicago-affiliated Argonne National Laboratory, which will soon be home to the most powerful computer in the world—and it’s being designed with an eye toward AI-style computing. The Oct. 2-3 summit showcases the U.S. Department of Energy lab, bringing together industry, universities, and investors with lab innovators and experts.

    Depiction of ANL ALCF Cray Intel SC18 Shasta Aurora exascale supercomputer

    The workshop comes as researchers around UChicago and the labs are leading new explorations into AI.

    For example, say that Andrew Ferguson, an associate professor at the Pritzker School of Molecular Engineering, wants to look for a new vaccine or flexible electronic materials. New materials essentially are just different combinations of chemicals and molecules, but there are literally billions of such combinations. How do scientists pick which ones to make and test in the labs? AI could quickly narrow down the list.

    “There are many areas where the Edisonian approach—that is, having an army of assistants make and test hundreds of different options for the lightbulb—just isn’t practical,” Ferguson said.

    Then there’s the question of what happens if AI takes a turn at being the scientist. Some are wondering whether AI models could propose new experiments that might never have occurred to their human counterparts.

    “For example, when someone programmed the rules for the game of Go into an AI, it invented strategies never seen in thousands of years of humans playing the game,” said Brian Nord, an associate scientist in the Kavli Institute for Cosmological Physics and UChicago-affiliated Fermi National Accelerator Laboratory.

    “Maybe sometimes it will have more interesting ideas than we have.”

    Ferguson agreed: “If we write down the laws of physics and input those, what can AI tell us about the universe?”

    Scenes from the 2016 games of Go, an ancient Chinese game far more complex than chess, between Google’s AI “AlphaGo” and world champion Go player Lee Sedol. The match ended with the AI up 4-1. Image courtesy of Bob van den Hoek.

    But ensuring those applications are accurate, equitable, and effective requires more basic computer science research into the fundamentals of AI. UChicago scientists are exploring ways to reduce bias in model predictions, use advanced tools even when data is scarce, and develop “explainable AI” systems that will produce more actionable insights and raise trust among users of those models.

    “Most AIs right now just spit out an answer without any context. But a doctor, for example, is not going to accept a cancer diagnosis unless they can see why and how the AI got there,” Ferguson said.

    With the right calibration, however, researchers see a world of uses for AI. To name just a few: Willett, in collaboration with scientists from Argonne and the Department of Geophysical Sciences, is using machine learning to study clouds and their effect on weather and climate. Chicago Booth economist Sendhil Mullainathan is studying ways in which machine learning technology could change the way we approach social problems, such as policies to alleviate poverty; while neurobiologist David Freedman, a professor in the University’s Division of Biological Sciences, is using machine learning to understand how brains interpret sights and sounds and make decisions.

    Below are looks into three projects at the University showcasing the breadth of AI applications happening now.

    The depths of the universe to the structures of atoms

    We’re getting better and better at building telescopes to scan the sky and accelerators to smash particles at ever-higher energies. What comes along with that, however, is more and more data. For example, the Large Hadron Collider in Europe generates one petabyte of data per second; for perspective, in less than five minutes, that would fill up the world’s most powerful supercomputer.

    LHC

    CERN map


    CERN LHC Maximilien Brice and Julien Marius Ordan


    CERN LHC particles

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS

    CERN ATLAS Image Claudia Marcelloni CERN/ATLAS

    ALICE

    CERN/ALICE Detector


    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    That’s way too much data to store. “You need to quickly pick out the interesting events to keep, and dump the rest,” Nord said.

    But see “From UC Santa Barbara: “Breaking Data out of the Silos

    Similarly, each night hundreds of telescopes scan the sky. Existing computer programs are pretty good at picking interesting things out of them, but there’s room to improve. (After LIGO detected the gravitational waves from two neutron stars crashing together in 2017, telescopes around the world had rooms full of people frantically looking through sky photos to find the point of light it created.)

    MIT /Caltech Advanced aLigo


    VIRGO Gravitational Wave interferometer, near Pisa, Italy

    Years ago, Nord was sitting and scanning telescope images to look for gravitational lensing, an effect in which large objects distort light as it passes.

    Gravitational Lensing NASA/ESA

    “We were spending all this time doing this by hand, and I thought, surely there has to be a better way,” he said. In fact, the capabilities of AI were just turning a corner; Nord began writing programs to search for lensing with neural networks. Others had the same idea; the technique is now emerging as a standard approach to find gravitational lensing.
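    For a sense of what “writing programs to search for lensing with neural networks” looks like in practice, here is a minimal sketch of a lens-versus-non-lens image classifier. It is illustrative only: it assumes labelled survey cutouts are already in hand, and it is not Nord’s actual pipeline.

    ```python
    # Minimal convolutional classifier for "lens" vs "non-lens" image cutouts.
    # Illustrative sketch only; assumes 64x64-pixel, single-band images with
    # labels 1 (lens) or 0 (non-lens).
    import torch
    import torch.nn as nn

    class LensFinder(nn.Module):
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            )
            self.classifier = nn.Sequential(
                nn.Flatten(),
                nn.Linear(32 * 16 * 16, 64), nn.ReLU(),
                nn.Linear(64, 1),   # one logit; a sigmoid turns it into P(lens)
            )

        def forward(self, x):
            return self.classifier(self.features(x))

    model = LensFinder()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.BCEWithLogitsLoss()

    def train_one_epoch(loader):
        # `loader` yields batches of (cutouts, labels); cutouts shaped (N, 1, 64, 64)
        for cutouts, labels in loader:
            logits = model(cutouts).squeeze(1)
            loss = loss_fn(logits, labels.float())
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
    ```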

    This year Nord is partnering with computer scientist Yuxin Chen to explore what they call a “self-driving telescope”: a framework that could optimize when and where to point telescopes to gather the most interesting data.

    “I view this collaboration between AI and science, in general, to be in a very early phase of development,” Chen said. “The outcome of the research project will not only have transformative effects in advancing the basic science, but it will also allow us to use the science involved in the physical processes to inform AI development.”

    Disentangling style and content for art and science

    In recent years, popular apps have sprung up that can transform photographs into different artistic forms—from generic modes such as charcoal sketches or watercolors to the specific styles of Dali, Monet and other masters. These “style transfer” apps use tools from the cutting edge of computer vision—primarily the neural networks that prove adept at image classification for applications such as image search and facial recognition.

    But beyond the novelty of turning your selfie into a Picasso, these tools kick-start a deeper conversation around the nature of human perception. From a young age, humans are capable of separating the content of an image from its style; that is, recognizing that photos of an actual bear, a stuffed teddy bear, or a bear made out of LEGOs all depict the same animal. What’s simple for humans can stump today’s computer vision systems, but Assoc. Profs. Jason Salavon and Greg Shakhnarovich think the “magic trick” of style transfer could help them catch up.

    This triptych of images demonstrates how neural networks can transform images with different artistic forms. [Sorry, I do not see the point here.]

    “The fact that we can look at pictures that artists create and still understand what’s in them, even though they sometimes look very different from reality, seems to be closely related to the holy grail of machine perception: what makes the content of the image understandable to people,” said Shakhnarovich, an associate professor at the Toyota Technological Institute at Chicago.

    Salavon and Shakhnarovich are collaborating on new style transfer approaches that separate, capture and manipulate content and style, unlocking new potential for art and science. These new models could transform a headshot into a much more distorted style, such as the distinctive caricatures of The Simpsons, or teach self-driving cars to better understand road signs in different weather conditions.

    “We’re in a global arms race for making cool things happen with these technologies. From what would be called practical space to cultural space, there’s a lot of action,” said Salavon, an associate professor in the Department of Visual Arts at the University of Chicago and an artist who makes “semi-autonomous art”. “But ultimately, the idea is to get to some computational understanding of the ‘essence’ of images. That’s the rich philosophical question.”

    Researchers hope to use AI to decode nature’s rules for protein design, in order to create synthetic proteins with a range of applications. Image courtesy of Emw / CC BY-SA 3.0

    Learning nature’s rules for protein design

    Nature is an unparalleled engineer. Millions of years of evolution have created molecular machines capable of countless functions and survival in challenging environments, like deep sea vents. Scientists have long sought to harness these design skills and decode nature’s blueprints to build custom proteins of their own for applications in medicine, energy production, environmental clean-up and more. But only recently have the computational and biochemical technologies needed to create that pipeline become possible.

    Ferguson and Prof. Rama Ranganathan are bringing these pieces together in an ambitious project funded by a Center for Data and Computing seed grant. Combining recent advancements in machine learning and synthetic biology, they will build an iterative pipeline to learn nature’s rules for protein design, then remix them to create synthetic proteins with elevated or even new functions and properties.

    “It’s not just rebuilding what nature built, we can push it beyond what nature has ever shown us before,” said Ranganathan. “This proposal is basically the starting point for building a whole framework of data-driven molecular engineering.”

    “The way we think of this project is we’re trying to mimic millions of years of evolution in the lab, using computation and experiments instead of natural selection,” Ferguson said.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    U Chicago Campus

    An intellectual destination

    One of the world’s premier academic and research institutions, the University of Chicago has driven new ways of thinking since our 1890 founding. Today, UChicago is an intellectual destination that draws inspired scholars to our Hyde Park and international campuses, keeping UChicago at the nexus of ideas that challenge and change the world.

    The University of Chicago is an urban research university that has driven new ways of thinking since 1890. Our commitment to free and open inquiry draws inspired scholars to our global campuses, where ideas are born that challenge and change the world.

    We empower individuals to challenge conventional thinking in pursuit of original ideas. Students in the College develop critical, analytic, and writing skills in our rigorous, interdisciplinary core curriculum. Through graduate programs, students test their ideas with UChicago scholars, and become the next generation of leaders in academia, industry, nonprofits, and government.

    UChicago research has led to such breakthroughs as discovering the link between cancer and genetics, establishing revolutionary theories of economics, and developing tools to produce reliably excellent urban schooling. We generate new insights for the benefit of present and future generations with our national and affiliated laboratories: Argonne National Laboratory, Fermi National Accelerator Laboratory, and the Marine Biological Laboratory in Woods Hole, Massachusetts.

    The University of Chicago is enriched by the city we call home. In partnership with our neighbors, we invest in Chicago’s mid-South Side across such areas as health, education, economic growth, and the arts. Together with our medical center, we are the largest private employer on the South Side.

    In all we do, we are driven to dig deeper, push further, and ask bigger questions—and to leverage our knowledge to enrich all human life. Our diverse and creative students and alumni drive innovation, lead international conversations, and make masterpieces. Alumni and faculty, lecturers and postdocs go on to become Nobel laureates, CEOs, university presidents, attorneys general, literary giants, and astronauts.

     
  • richardmitnick 8:35 pm on August 29, 2019 Permalink | Reply
    Tags: "Forget About Electrons And Protons; The Unstable Muon Could Be The Future Of Particle Physics", , CERN LHC, , , , MICE collaboration — which stands for Muon Ionization Cooling Experiment — continues to push this technology to new heights and may make a muon collider a real possibility for the future.,   

    From Ethan Siegel: “Forget About Electrons And Protons; The Unstable Muon Could Be The Future Of Particle Physics” 

    From Ethan Siegel
    Aug 29, 2019

    The particle tracks emanating from a high energy collision at the LHC in 2014 show the creation of many new particles. It’s only because of the high-energy nature of this collision that new masses can be created. (WIKIMEDIA COMMONS USER PCHARITO)

    Electron-positron or proton-proton colliders are all the rage. But the unstable muon might be the key to unlocking the next frontier.

    If you want to probe the frontiers of fundamental physics, you have to collide particles at very high energies: with enough energy that you can create the unstable particles and states that don’t exist in our everyday, low-energy Universe. So long as you obey the Universe’s conservation laws and have enough free energy at your disposal, you can create any massive particle (and/or its antiparticle) from that energy via Einstein’s E = mc².

    Traditionally, there have been two strategies to do this.

    Collide electrons moving in one direction with positrons moving in the opposite direction, tuning your beams to whatever energy corresponds to the mass of particles you wish to produce.
    Collide protons in one direction with either other protons or anti-protons in the other, reaching higher energies but creating a much messier, less controllable signal to extract.

    One Nobel Laureate, Carlo Rubbia, has called for physicists to build something entirely novel: a muon collider.

    Carlo Rubbia at the 62nd Lindau Nobel Laureate Meeting on July 4, 2012. Markus Pössel (user name: Mapos)

    It’s ambitious and presently impractical, but it just might be the future of particle physics.

    The particles and antiparticles of the Standard Model have now all been directly detected, with the last holdout, the Higgs boson, falling at the LHC earlier this decade.

    Standard Model of Particle Physics

    CERN CMS Higgs Event


    CERN ATLAS Higgs Event

    All of these particles can be created at LHC energies, and the masses of the particles lead to fundamental constants that are absolutely necessary to describe them fully. These particles can be well-described by the physics of the quantum field theories underlying the Standard Model, but they do not describe everything, like dark matter. (E. SIEGEL / BEYOND THE GALAXY)

    Above, you can see the particles and antiparticles of the Standard Model, which have now all been discovered. The Large Hadron Collider (LHC) at CERN discovered the Higgs boson, the long-sought-after last holdout, earlier this decade.

    While there’s still much science left to be done at the LHC — it’s only taken 2% of all the data it will acquire by the end of the 2030s — particle physicists are already looking ahead to the next generation of future colliders.

    A hypothetical new accelerator, either a long linear one or one inhabiting a large tunnel beneath the Earth, could dwarf the sensitivity to new particles that prior and current colliders can achieve. Even at that, there’s no guarantee we’ll find anything new, but we’re certain to find nothing new if we fail to try. ILC collaboration

    All of the plans put forth involve scaled-up versions of existing technologies that have been used in past and/or current accelerators. We know how to accelerate electrons, positrons, and protons in a straight line. We know how to bend them into a circle, and maximize both the energy of the collisions and the number of particles colliding per second. Larger, more energetic versions of existing technologies are the simplest approach.

    FNAL/Tevatron map

    CERN map

    Future Circular Collider (FCC) Larger LHC

    CERN FCC Future Circular Collider map

    CERN Future Circular Collider

    The scale of the proposed Future Circular Collider (FCC), compared with the LHC presently at CERN and the Tevatron, formerly operational at Fermilab. The Future Circular Collider is perhaps the most ambitious proposal for a next-generation collider to date, including both lepton and proton options as various phases of its proposed scientific programme. (PCHARITO / WIKIMEDIA COMMONS)

    Of course, there are both benefits and drawbacks to each method we could use. You can build a linear collider, but the energy you can reach is going to be limited by how powerfully you can impart energy to these particles per unit distance, as well as by how long your accelerator is. The drawback is that, without a continuous injection of circulating particles, linear colliders have lower collision rates and take longer amounts of time to collect the same amount of data.

    The other main style of collider is the style currently used at CERN: circular colliders. Instead of only getting one continuous shot to accelerate your particles before giving them the opportunity to collide, you speed them up while bending them in a circle, adding more and more particles to each clockwise and counterclockwise beam with every revolution. You set up your detectors at designated collision points, and measure what comes out.

    A candidate Higgs event in the ATLAS detector. Note how even with the clear signatures and transverse tracks, there is a shower of other particles; this is due to the fact that protons are composite particles. This is only the case because the Higgs gives mass to the fundamental constituents that compose these particles. At high enough energies, the currently most-fundamental particles known may yet split apart themselves. (THE ATLAS COLLABORATION / CERN)

    CERN ATLAS Image Claudia Marcelloni

    This is the preferred method, so long as your tunnel is long enough and your magnets are strong enough, for both electron/positron and proton/proton colliders. Compared to linear colliders, with a circular collider, you get

    greater numbers of particles inside the beam at any one time,
    second and third and thousandth chances for particles that missed one another on the prior pass through,
    and much greater collision rates overall, particularly for lower-energy heavy particles like the Z-boson.

    In general, electron/positron colliders are better for precision studies of known particles, while proton/proton colliders are better for probing the energy frontier.

    A four-muon candidate event in the ATLAS detector at the Large Hadron Collider. The muon/anti-muon tracks are highlighted in red, as the long-lived muons travel farther than any other unstable particle. The energies achieved by the LHC are sufficient for creating Higgs bosons; previous electron-positron colliders could not achieve the necessary energies. (ATLAS COLLABORATION/CERN)

    In fact, if you compare the LHC — which collides protons with protons — with the previous collider in the same tunnel (LEP, which collided electrons with positrons), you’d find something that surprises most people: the particles inside LEP went much, much faster than the ones inside the LHC!

    CERN LEP Collider


    CERN LEP Collider

    Everything in this Universe is limited by the speed of light in a vacuum: 299,792,458 m/s. It’s impossible to accelerate any massive particle to that speed, much less past it. At the LHC, particles get accelerated up to extremely high energies of 7 TeV per particle. Considering that a proton’s rest energy is only 938 MeV (or 0.000938 TeV), it’s easy to see how it reaches a speed of 299,792,455 m/s.

    But the electrons and positrons at LEP went even faster: 299,792,457.9964 m/s. Yet despite these enormous speeds, they only reached energies of ~110 GeV, or 1.6% the energies achieved at the LHC.
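    Those speeds follow directly from special relativity: the Lorentz factor is γ = E/(mc²), and the speed is v = c·√(1 − 1/γ²). Here is a quick back-of-the-envelope check of the numbers above (a sketch; the electron line uses LEP 2’s roughly 105 GeV beam energy, which is what reproduces the 299,792,457.9964 m/s figure):

    ```python
    # Check the speeds quoted above: v = c * sqrt(1 - 1/gamma^2),
    # with gamma = (total energy) / (rest energy).
    import math

    c = 299_792_458.0  # speed of light, m/s

    def speed(total_energy_gev, rest_energy_gev):
        gamma = total_energy_gev / rest_energy_gev
        return c * math.sqrt(1.0 - 1.0 / gamma**2)

    print(speed(7000.0, 0.938272))  # LHC proton at 7 TeV    -> ~299,792,455 m/s
    print(speed(104.5, 0.000511))   # LEP electron, ~105 GeV -> ~299,792,457.996 m/s
    ```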

    Let’s understand how colliding particles create new ones. First, the energy available for creating new particles — the “E” in E = mc² — comes from the center-of-mass energy of the two colliding particles. In a proton-proton collision, it’s the internal structures that collide: quarks and gluons. The energy of each proton is divided up among many constituent particles, and these particles zip around inside the proton as well. When two of them collide, the energy available for creating new particles might still be large (up to 2 or 3 TeV), but isn’t the full-on 14 TeV.

    But the electron-positron idea is a lot cleaner: they’re not composite particles, and they don’t have internal structure or energy divided among constituents. Accelerate an electron and positron to the same speed in opposite directions, and 100% of that energy goes into creating new particles. But it won’t be anywhere near 14 TeV.

    A number of the various lepton colliders, with their luminosity (a measure of the collision rate and the number of detections one can make) as a function of center-of-mass collision energy. Note that the red line, which is a circular collider option, offers many more collisions than the linear version, but gets less superior as energy increases. Beyond about 380 GeV, circular colliders cannot achieve those energies, and a linear collider like CLIC is the far superior option. (GRANADA STRATEGY MEETING SUMMARY SLIDES / LUCIE LINSSEN (PRIVATE COMMUNICATION))

    Even though electrons and positrons go much faster than protons do, the total amount of energy a particle possesses is determined by its speed and also its original mass. Even though the electrons and positrons are much closer to the speed of light, it takes nearly 2,000 of them to make up as much rest mass as a proton. They have a greater speed but a much lower rest mass, and hence, a lower energy overall.

    There’s a good physics reason why, even with the same radius ring and the same strong magnetic fields to bend them into a circle, electrons won’t reach the same energy as protons: synchrotron radiation. When you accelerate a charged particle with a magnetic field, it gives off radiation, which means it carries energy away.

    Relativistic electrons and positrons can be accelerated to very high speeds, but will emit synchrotron radiation (blue) at high enough energies, preventing them from moving faster. This synchrotron radiation is the relativistic analog of the radiation predicted by Rutherford so many years ago, and has a gravitational analogy if you replace the electromagnetic fields and charges with gravitational ones. (CHUNG-LI DONG, JINGHUA GUO, YANG-YUAN CHEN, AND CHANG CHING-LIN, ‘SOFT-X-RAY SPECTROSCOPY PROBES NANOMATERIAL-BASED DEVICES’)

    The amount of energy radiated away is dependent on the field strength (squared), the energy of the particle (squared), but also on the inherent charge-to-mass ratio of the particle (to the fourth power). Since electrons and positrons have the same charge as the proton, but just 1/1836th of a proton’s mass, that synchrotron radiation is the limiting factor for electron-positron systems in a circular collider. You’d need a circular collider 100 km around just to be able to create a pair of top-antitop quarks in a next-generation particle accelerator using electrons and positrons.
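    Written out (a standard accelerator-physics result, not a formula given in the article), the synchrotron power radiated by a particle of charge q and mass m bent by a field B while carrying energy E scales as

    ```latex
    P_{\mathrm{sync}} \;\propto\; \left(\frac{q}{m}\right)^{4} B^{2}\, E^{2}
    ```

    so at the same energy and in the same ring, an electron (same charge as a proton, about 1/1836 of its mass) radiates roughly 1836⁴ ≈ 10¹³ times more power. That is why LEP stalled near 100 GeV per beam while the LHC reaches 6.5–7 TeV in the very same tunnel.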

    This is where the big idea of using muons comes in. Muons (and anti-muons) are the cousins of electrons (and positrons), being:

    fundamental (and not composite) particles,
    being 206 times as massive as an electron (with a much smaller charge-to-mass ratio and much less synchrotron radiation),
    and also, unlike electrons or positrons, being fundamentally unstable.

    That last difference is the present dealbreaker: muons have a mean lifetime of just 2.2 microseconds before decaying away.

    An earlier design plan (now defunct) for a full-scale muon-antimuon collider at Fermilab, the source of the world’s second-most powerful particle accelerator behind the LHC at CERN. (FERMILAB)

    In the future, however, we might be able to work around that anyway. You see, Einstein’s special relativity tells us that as particles move closer and closer to the speed of light, time dilates for that particle in the observer’s reference frame. In other words, if we make this muon move fast enough, we can dramatically increase the time it lives before decaying; this is the same physics behind why cosmic ray muons pass through us all the time!

    If we could accelerate a muon up to the same 6.5 TeV in energy that LHC protons achieved during their prior data-taking run, that muon would live for 135,000 microseconds instead of 2.2 microseconds: enough time to circle the LHC some 1,500 times before decaying away. If you could collide a muon/anti-muon pair at those speeds, you’d have 100% of that energy — all 13 TeV of it — available for particle creation.
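    The arithmetic behind those numbers is just the Lorentz factor again. A quick sketch using the values quoted above:

    ```python
    # Time dilation for a 6.5 TeV muon: the lab-frame lifetime is gamma * tau,
    # and the distance covered is essentially c * (dilated lifetime).
    MUON_REST_ENERGY_GEV = 0.1057   # muon rest energy
    TAU_MUON_US = 2.2               # proper lifetime, microseconds
    C_KM_PER_US = 0.299792458       # speed of light, km per microsecond
    LHC_CIRCUMFERENCE_KM = 26.7

    gamma = 6500.0 / MUON_REST_ENERGY_GEV        # ~61,500
    lab_lifetime_us = gamma * TAU_MUON_US        # ~135,000 microseconds
    laps = lab_lifetime_us * C_KM_PER_US / LHC_CIRCUMFERENCE_KM

    print(round(lab_lifetime_us), round(laps))   # ~135,000 us, ~1,500 laps
    ```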

    The prototype MICE 201-megahertz RF module, with the copper cavity mounted, is shown during assembly at Fermilab. This apparatus could focus and collimate a muon beam, enabling the muons to be accelerated and survive for much longer than 2.2 microseconds. (Y. TORUN / IIT / FERMILAB TODAY)

    Humanity can always choose to build a bigger ring or invest in producing stronger-field magnets; those are easy ways to go to higher energies in particle physics. But there’s no cure for synchrotron radiation with electrons and positrons; you’d have to use heavier particles instead. There’s no cure for energy being distributed among multiple constituent particles inside a proton; you’d have to use fundamental particles instead.

    The muon is the one particle that could solve both of these issues. The only drawback is that they’re unstable, and difficult to keep alive for a long time. However, they’re easy to make: smash a proton beam into a piece of acrylic and you’ll produce pions, which will decay into both muons and anti-muons. Accelerate those muons to high energy and collimate them into beams, and you can put them in a circular collider.

    While many unstable particles, both fundamental and composite, can be produced in particle physics, only protons, neutrons (bound in nuclei) and the electron are stable, along with their antimatter counterparts and the photon. Everything else is short-lived, but if muons can be kept at high enough speeds, they might live long enough to forge a next-generation particle collider out of. (CONTEMPORARY PHYSICS EDUCATION PROJECT (CPEP), U.S. DEPARTMENT OF ENERGY / NSF / LBNL)

    The MICE collaboration — which stands for Muon Ionization Cooling Experiment — continues to push this technology to new heights, and may make a muon collider a real possibility for the future. The goal is to reveal whatever secrets nature might have waiting in store for us, and these are secrets we cannot predict. As Carlo Rubbia himself said,

    “…these fundamental choices are coming from nature, not from individuals. Theorists can do what they like, but nature is the one deciding in the end….”

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    “Starts With A Bang! is a blog/video blog about cosmology, physics, astronomy, and anything else I find interesting enough to write about. I am a firm believer that the highest good in life is learning, and the greatest evil is willful ignorance. The goal of everything on this site is to help inform you about our world, how we came to be here, and to understand how it all works. As I write these pages for you, I hope to not only explain to you what we know, think, and believe, but how we know it, and why we draw the conclusions we do. It is my hope that you find this interesting, informative, and accessible,” says Ethan

     
  • richardmitnick 11:31 am on August 20, 2019 Permalink | Reply
    Tags: "With open data scientists share their work", , , CERN LHC, Gran Sasso, ,   

    From Symmetry: “With open data, scientists share their work” 

    Symmetry Mag
    From Symmetry

    08/20/19
    Meredith Fore

    Illustration by Sandbox Studio, Chicago

    There are barriers to making scientific data open, but doing so has already contributed to scientific progress.

    It could be said that astronomy, one of the oldest sciences, was one of the first fields to have open data. The open records of Chinese astronomers from 1054 A.D. allowed astronomer Carlo Otto Lampland to identify the Crab Nebula as the remnant of a supernova in 1921.

    Supernova remnant Crab nebula. NASA/ESA Hubble

    In 1705 Edmond Halley used the previous observations of Johannes Kepler and Petrus Apianus—who did their work before Halley was old enough to use a telescope—to deduce the orbit of his eponymous comet.

    Comet 1P/Halley as taken March 8, 1986 by W. Liller, Easter Island, part of the International Halley Watch (IHW) Large Scale Phenomena Network.
    NASA/W. Liller

    In science, making data open means making available, free of charge, the observations or other information collected in a scientific study for the purpose of allowing other researchers to examine it for themselves, either to verify it or to conduct new analyses.

    Scientists continue to use open data to make new discoveries today. In 2010, a team of scientists led by Professor Doug Finkbeiner at Harvard University found vast gamma-ray bubbles above and below the Milky Way. The accomplishment was compared to the discovery of a new continent on Earth. The scientists didn’t find the bubbles by making their own observations; they did it by analyzing publicly available data from the Fermi Gamma Ray Telescope.

    NASA/Fermi LAT


    NASA/Fermi Gamma Ray Space Telescope

    “Open data often can be used to answer other kinds of questions that the people who collected the data either weren’t interested in asking, or they just never thought to ask,” says Kyle Cranmer, a professor at New York University. By making scientific data available, “you’re enabling a lot of new science by the community to go forward in a more efficient and powerful way.”

    Cranmer is a member of ATLAS, one of the two general-purpose experiments that, among other things, co-discovered the Higgs boson at the Large Hadron Collider at CERN.

    CERN ATLAS Image Claudia Marcelloni

    CERN ATLAS Higgs Event

    He and other CERN researchers recently published a letter in Nature Physics titled “Open is not enough,” which shares lessons learned about providing open data in high-energy physics. The CERN Open Data Portal, which facilitates public access of datasets from CERN experiments, now contains more than two petabytes of information.

    Computing at CERN

    The fields of both particle physics and astrophysics have seen rapid developments in the use and spread of open data, says Ulisses Barres, an astrophysicist at the Brazilian Center for Research in Physics. “Astronomy is going to, in the next decade, increase the amount of data that it produces by a factor of hundreds,” he says. “As the amount of data grows, there is more pressure for increasing our capacity to convert information into knowledge.”

    The Square Kilometer Array Telescope—built in Australia and South Africa and set to turn on in the 2020s—is expected to produce about 600 petabytes of data per year.

    SKA Square Kilometer Array


    SKA South Africa

    Raw data from studies conducted during the site selection process are already available on the SKA website, with a warning that “these files are very large indeed, and before you download them you should check whether your local file system will be able to handle them.”

    Barres sees the growth in open data as an opportunity for developing nations to participate in the global science community in new ways. He and a group of fellow astrophysicists helped develop something called the Open Universe Initiative “with the objective of stimulating a dramatic increase in the availability and usability of space science data, extending the potential of scientific discovery to new participants in all parts of the world and empowering global educational services.”

    The initiative, proposed by the government of Italy, is currently in the “implementation” phase within the United Nations Office for Outer Space Affairs.

    “I think that data is this proper entry point for science development in places that don’t have much science developed yet,” Barres says. “Because it’s there, it’s available, there is much more data than we can properly analyze.”

    There are barriers to implementing open data. One is the concept of ownership—a lab might not want to release data that they could use for another project or might worry about proper credit and attribution. Another is the natural human fear of being accused of being wrong or having your data used irresponsibly.

    But one of the biggest barriers, according to physics professor Jesse Thaler of MIT, is making the data understandable. “From the user perspective, every single aspect of using public data is challenging,” Thaler says.

    Think of a high school student’s chemistry lab notebook. A student might mark certain measurements in her data table with a star, to remind herself that she used a different instrument to take those measurements. Or she may use acronyms to name different samples. Unless she writes these schemes down, another student wouldn’t know the star’s significance and wouldn’t be able to know what the samples were.
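    The practical fix is to ship a machine-readable description along with the numbers. Here is a toy example of the kind of record that turns a bare table into something an outsider can interpret (entirely hypothetical, and not the CERN Open Data Portal’s actual schema):

    ```python
    # A made-up metadata record for the lab-notebook example above.
    # Nothing here reflects a real CERN Open Data schema; it only shows the
    # kind of documentation outside users need in order to reuse a dataset.
    dataset_description = {
        "name": "titration_run_07",
        "columns": {
            "volume_ml": {"units": "millilitres", "instrument": "burette_A"},
            "ph":        {"units": "pH",          "instrument": "meter_B"},
        },
        "markers": {"*": "measurement repeated with a different instrument"},
        "sample_codes": {"S1": "tap water", "S2": "buffer solution"},
    }
    ```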

    This has been a challenge for the CERN Open Data Portal, Cranmer says. “It’s very well curated, but it’s hard to use, because the data has got a lot of structure to it. It’s very complicated. You have to put additional effort to make it more usable.”

    And for a lot of scientists already working to manage gigantic projects, doing extra work to make their data useable to outside groups—well, “that’s just not mission critical,” he says. But Thaler adds that the CMS experiment has been very responsive to the needs of outside users.


    CERN CMS Higgs Event

    “Figuring out how to release data is challenging because you want to provide as much relevant information to outside users as possible,” Thaler says. “But it’s often not obvious, until outside users actually get their hands on the data, what information is relevant.”

    Still, there are many examples of open data benefiting astrophysics and particle physics. Members of the wider scientific community have discovered exoplanets through public data from the Kepler Space Telescope. When the Gaia spacecraft mapped the positions of 1.7 billion stars and released them as open data, scientists flocked to hackathons hosted by the Flatiron Institute to interpret it and produced about 20 papers’ worth of research.

    Open data policies have allowed for more accountability. The physics community was able to thoroughly check data from the first black hole collisions detected by LIGO and question a proposed dark-matter signal from the DAMA/LIBRA experiment.

    DAMA-LIBRA at Gran Sasso


    Gran Sasso LABORATORI NAZIONALI del GRAN SASSO, located in the Abruzzo region of central Italy

    Open data has also allowed for new collaborations and has nourished existing ones. Thaler, who is a theorist, says the dialogue between experimentalists and theorists has always been strong, but “open data is an opportunity to accelerate that conversation,” he says.

    For Cari Cesarotti, a graduate student who uses CMS Open Data for research in particle physics theory at Harvard, one of the most important benefits of open data is how it maximizes the scientific value of data experimentalists have to work very hard to obtain.

    “Colliders are really expensive and quite laborious to build and test,” she says. “So the more that we can squeeze out utility using the tools that we already have—to me, that’s the right thing to do, to try to get as much mileage as we possibly can out of the data set.”

    See the full article here .



    Please help promote STEM in your local schools.


    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 11:37 am on August 15, 2019 Permalink | Reply
    Tags: Azure ML, CERN LHC, Every proton collision at the Large Hadron Collider is different but only a few are special. The special collisions generate particles in unusual patterns — possible manifestations of new rule-break, Fermilab is the lead U.S. laboratory for the CMS experiment., The challenge: more data more computing power

    From Fermi National Accelerator Lab: “A glimpse into the future: accelerated computing for accelerated particles”

    FNAL Art Image
    FNAL Art Image by Angela Gonzales

    From Fermi National Accelerator Lab, an enduring source of strength for the US contribution to scientific research worldwide.

    August 15, 2019
    Leah Hesla

    Every proton collision at the Large Hadron Collider is different, but only a few are special. The special collisions generate particles in unusual patterns — possible manifestations of new, rule-breaking physics — or help fill in our incomplete picture of the universe.

    Finding these collisions is harder than the proverbial search for the needle in the haystack. But game-changing help is on the way. Fermilab scientists and other collaborators successfully tested a prototype machine-learning technology that speeds up processing by 30 to 175 times compared to traditional methods.

    Confronting 40 million collisions every second, scientists at the LHC use powerful, nimble computers to pluck the gems — whether it’s a Higgs particle or hints of dark matter — from the vast static of ordinary collisions.

    Rifling through simulated LHC collision data, the machine learning technology successfully learned to identify a particular postcollision pattern — a particular spray of particles flying through a detector — as it flipped through an astonishing 600 images per second. Traditional methods process less than one image per second.

    The technology could even be offered as a service on external computers. Using this offloading model would allow researchers to analyze more data more quickly and leave more LHC computing space available to do other work.

    It is a promising glimpse into how machine learning services are supporting a field in which already enormous amounts of data are only going to get bigger.

    Particles emerging from proton collisions at CERN’s Large Hadron Collider travel through this stories-high, many-layered instrument, the CMS detector. In 2026, the LHC will produce 20 times the data it does currently, and CMS is undergoing upgrades to read and process the data deluge. Photo: Maximilien Brice, CERN

    The challenge: more data, more computing power

    Researchers are currently upgrading the LHC to smash protons at five times its current rate.

    By 2026, the 17-mile circular underground machine at the European laboratory CERN will produce 20 times more data than it does now.

    LHC

    CERN map


    CERN LHC Maximilien Brice and Julien Marius Ordan


    CERN LHC particles

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS

    CERN ATLAS Image Claudia Marcelloni CERN/ATLAS

    ALICE

    CERN/ALICE Detector

    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    CMS is one of the particle detectors at the Large Hadron Collider, and CMS collaborators are in the midst of some upgrades of their own, enabling the intricate, stories-high instrument to take more sophisticated pictures of the LHC’s particle collisions. Fermilab is the lead U.S. laboratory for the CMS experiment.

    If LHC scientists wanted to save all the raw collision data they’d collect in a year from the High-Luminosity LHC, they’d have to find a way to store about 1 exabyte (roughly a million 1-terabyte external hard drives), of which only a sliver may unveil new phenomena. LHC computers are programmed to select this tiny fraction, making split-second decisions about which data is valuable enough to be sent downstream for further study.

    Currently, the LHC’s computing system keeps roughly one in every 100,000 particle events. But current storage protocols won’t be able to keep up with the future data flood, which will accumulate over decades of data taking. And the higher-resolution pictures captured by the upgraded CMS detector won’t make the job any easier. It all translates into a need for more than 10 times the computing resources the LHC has now.

    The recent prototype test shows that, with advances in machine learning and computing hardware, researchers expect to be able to winnow the data emerging from the upcoming High-Luminosity LHC when it comes online.

    “The hope here is that you can do very sophisticated things with machine learning and also do them faster,” said Nhan Tran, a Fermilab scientist on the CMS experiment and one of the leads on the recent test. “This is important, since our data will get more and more complex with upgraded detectors and busier collision environments.”

    Particle physicists are exploring the use of computers with machine learning capabilities for processing images of particle collisions at CMS, teaching them to rapidly identify various collision patterns. Image: Eamonn Maguire/Antarctic Design

    Machine learning to the rescue: the inference difference

    Machine learning in particle physics isn’t new. Physicists use machine learning for every stage of data processing in a collider experiment.

    But with machine learning technology that can chew through LHC data up to 175 times faster than traditional methods, particle physicists are ascending a game-changing step on the collision-computation course.

    The rapid rates are thanks to cleverly engineered hardware in the platform, Microsoft’s Azure ML, which speeds up a process called inference.

    To understand inference, consider an algorithm that’s been trained to recognize the image of a motorcycle: The object has two wheels and two handles that are attached to a larger metal body. The algorithm is smart enough to know that a wheelbarrow, which has similar attributes, is not a motorcycle. As the system scans new images of other two-wheeled, two-handled objects, it predicts — or infers — which are motorcycles. And as the algorithm’s prediction errors are corrected, it becomes pretty deft at identifying them. A billion scans later, it’s on its inference game.
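    In code, inference is nothing more than pushing new inputs through an already-trained model: the weights are frozen and no learning happens. A minimal sketch (illustrative only; the real CMS workflow and the Azure ML service are far more involved):

    ```python
    # "Inference" in miniature: a trained model only makes predictions here.
    # Illustrative stand-in, not the CMS/Azure ML code.
    import torch

    model = torch.nn.Sequential(        # stand-in for an already-trained classifier
        torch.nn.Flatten(),
        torch.nn.Linear(64 * 64, 2),    # two classes: "motorcycle" / "not motorcycle"
    )
    model.eval()                        # inference mode: no dropout, no weight updates

    new_images = torch.rand(600, 1, 64, 64)   # a batch of previously unseen images

    with torch.no_grad():                     # gradients are unnecessary for inference
        scores = model(new_images)            # (600, 2) class scores
        predictions = scores.argmax(dim=1)    # 0 or 1 for each image
    ```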

    Most machine learning platforms are built to understand how to classify images, but not physics-specific images. Physicists have to teach them the physics part, such as recognizing tracks created by the Higgs boson or searching for hints of dark matter.

    Researchers at Fermilab, CERN, MIT, the University of Washington and other collaborators trained Azure ML to identify pictures of top quarks — a short-lived elementary particle that is about 180 times heavier than a proton — from simulated CMS data. Specifically, Azure was to look for images of top quark jets, clouds of particles pulled out of the vacuum by a single top quark zinging away from the collision.

    “We sent it the images, training it on physics data,” said Fermilab scientist Burt Holzman, a lead on the project. “And it exhibited state-of-the-art performance. It was very fast. That means we can pipeline a large number of these things. In general, these techniques are pretty good.”

    One of the techniques behind inference acceleration is to combine traditional with specialized processors, a marriage known as heterogeneous computing architecture.

    Different platforms use different architectures. The traditional processors are CPUs (central processing units). The best known specialized processors are GPUs (graphics processing units) and FPGAs (field programmable gate arrays). Azure ML combines CPUs and FPGAs.

    “The reason that these processes need to be accelerated is that these are big computations. You’re talking about 25 billion operations,” Tran said. “Fitting that onto an FPGA, mapping that on, and doing it in a reasonable amount of time is a real achievement.”

    And it’s starting to be offered as a service, too. The test was the first time anyone has demonstrated how this kind of heterogeneous, as-a-service architecture can be used for fundamental physics.

    Data from particle physics experiments are stored on computing farms like this one, the Grid Computing Center at Fermilab. Outside organizations offer their computing farms as a service to particle physics experiments, making more space available on the experiments’ servers. Photo: Reidar Hahn

    At your service

    In the computing world, using something “as a service” has a specific meaning. An outside organization provides resources — machine learning or hardware — as a service, and users — scientists — draw on those resources when needed. It’s similar to how your video streaming company provides hours of binge-watching TV as a service. You don’t need to own your own DVDs and DVD player. You use their library and interface instead.

    Data from the Large Hadron Collider are typically stored and processed on computer servers at CERN and partner institutions such as Fermilab. With machine learning offered up as easily as any other web service might be, intensive computations can be carried out anywhere the service is offered — including off site. This bolsters the labs’ capabilities with additional computing power and resources while sparing them from having to furnish their own servers.

    “The idea of doing accelerated computing has been around decades, but the traditional model was to buy a computer cluster with GPUs and install it locally at the lab,” Holzman said. “The idea of offloading the work to a farm off site with specialized hardware, providing machine learning as a service — that worked as advertised.”

    The Azure ML farm is in Virginia. It takes only 100 milliseconds for computers at Fermilab near Chicago, Illinois, to send an image of a particle event to the Azure cloud, have it processed, and get the result back. That’s a 2,500-kilometer, data-dense trip in the blink of an eye.

    “The plumbing that goes with all of that is another achievement,” Tran said. “The concept of abstracting that data as a thing you just send somewhere else, and it just comes back, was the most pleasantly surprising thing about this project. We don’t have to replace everything in our own computing center with a whole bunch of new stuff. We keep all of it, send the hard computations off and get it to come back later.”
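    From the experiment's side, that plumbing boils down to a client that serializes an event, posts it to a remote endpoint and waits for the answer. The sketch below is a generic, hypothetical client: the URL, payload format and response fields are invented, and the real Azure ML service has its own API and authentication that the article does not describe:

        # Hypothetical inference-as-a-service client; endpoint and payload are placeholders.
        import json
        import time
        import requests

        ENDPOINT = "https://example-inference-service.invalid/score"   # placeholder URL

        event = {"jet_image": [[0.0] * 32 for _ in range(32)]}          # fake 32x32 jet image

        start = time.perf_counter()
        response = requests.post(ENDPOINT, data=json.dumps(event),
                                 headers={"Content-Type": "application/json"}, timeout=5)
        elapsed_ms = (time.perf_counter() - start) * 1000

        print(f"round trip: {elapsed_ms:.0f} ms, prediction: {response.json()}")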

    Scientists look forward to scaling the technology to tackle other big-data challenges at the LHC. They also plan to test other platforms, such as Amazon AWS, Google Cloud and IBM Cloud, as they explore what else can be accomplished through machine learning, which has seen rapid evolution over the past few years.

    “The models that were state-of-the-art for 2015 are standard today,” Tran said.

    As a tool, machine learning continues to give particle physics new ways of glimpsing the universe. It’s also impressive in its own right.

    “That we can take something that’s trained to discriminate between pictures of animals and people, do some modest amount of computation, and have it tell me the difference between a top quark jet and background?” Holzman said. “That’s something that blows my mind.”

    This work is supported by the DOE.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    FNAL Icon

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics. Fermilab is America’s premier laboratory for particle physics and accelerator research, funded by the U.S. Department of Energy. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.

     
  • richardmitnick 12:35 pm on August 10, 2019 Permalink | Reply
    Tags: "Physicists Working to Discover New Particles, , , CERN LHC, , , , Texas Tech, The LDMX Experiment   

    From Texas Tech via FNAL: “Physicists Working to Discover New Particles, Dark Matter” 

    1

    From TEXAS TECH UNIVERSITY

    via

    FNAL Art Image
    FNAL Art Image by Angela Gonzales

    Fermi National Accelerator Lab, an enduring source of strength for the US contribution to scientific research worldwide.

    August 5, 2019
    Glenys Young, Texas Tech

    Faculty recently presented their work at the European Physical Society’s 2019 Conference on High Energy Physics.

    Texas Tech University is well known for its research on topics that hit close to home for us here on the South Plains, like agriculture, water use and climate. But Texas Tech also is making its name known among those who study the farthest reaches of space and the mysteries of matter.

    Faculty from the Texas Tech Department of Physics & Astronomy recently presented at the European Physical Society’s 2019 Conference on High Energy Physics on the search for dark matter and other new particles that could help unlock the history and nature of the universe.

    New ways to approach the most classical search for new particles.

    Texas Tech, led by professor and department chair Sung-Won Lee, has been playing a leading role in the new-particle hunt for more than a decade. As part of the Compact Muon Solenoid (CMS) experiment, which investigates a wide range of physics, including the search for extra dimensions and particles that could make up dark matter, Lee has led the new-particle search at the European Organization for Nuclear Research (CERN).

    1
    Lee

    “Basically, we’re looking for any experimental evidence of new particles that could open the door to whole new realms of physics that researchers believe could be there,” Lee said. “Researchers at Texas Tech are continuing to look for elusive new particles in the CMS experiment at CERN’s Large Hadron Collider (LHC), and if found, we could answer some of the most profound questions about the structure of matter and the evolution of the early universe.”

    The LHC essentially accelerates tiny particles to incredibly high speeds and smashes them together to see what happens when they collide. Lee’s search focuses on identifying possible hints of new physics that could add more subatomic particles to the Standard Model of particle physics.

    LHC

    CERN map


    CERN LHC Maximilien Brice and Julien Marius Ordan


    CERN LHC particles

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS

    CERN ATLAS Image Claudia Marcelloni CERN/ATLAS

    ALICE

    CERN/ALICE Detector


    CMS

    CERN CMS New

    LHCb
    CERN LHCb New II

    “The Standard Model has been enormously successful, but it leaves many important questions unanswered,” Lee said.

    Standard Model of Particle Physics

    “It is also widely acknowledged that, from the theoretical standpoint, the Standard Model must be part of a larger theory, ‘Beyond the Standard Model’ (BSM), which is yet to be experimentally confirmed.”

    Some BSM theories suggest that the production and decay of new particles could be observed in the LHC by the resulting highly energetic jets that shoot out in opposite directions (dijets) and the resonances they leave. Thus the search for new particles depends on the search for these resonances. In some ways, it’s like trying to trace air movements to find a fan you can’t see, hear or touch.
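    The observable at the centre of such a search is the invariant mass of the two-jet system: in natural units, m^2 = (E1 + E2)^2 - |p1 + p2|^2, where E and p are the jets' energies and momentum vectors. A new particle decaying to two jets would appear as a narrow bump in this distribution. Here is a minimal sketch of the calculation with made-up numbers, not CMS data:

        # Dijet invariant mass from two jet four-momenta (E, px, py, pz) in GeV; toy numbers.
        import math

        def invariant_mass(jet1, jet2):
            e  = jet1[0] + jet2[0]
            px = jet1[1] + jet2[1]
            py = jet1[2] + jet2[2]
            pz = jet1[3] + jet2[3]
            return math.sqrt(max(e**2 - (px**2 + py**2 + pz**2), 0.0))

        jet_a = (900.0,  850.0, 100.0, 250.0)     # hypothetical jet four-momentum
        jet_b = (950.0, -870.0, -90.0, 300.0)
        print(f"dijet mass: {invariant_mass(jet_a, jet_b):.0f} GeV")   # about 1766 GeV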

    In 2018-19, in collaboration with the CMS group, Texas Tech’s team performed a search for narrow dijet resonances using a newly available dataset at the LHC. The data were consistent with the Standard Model predictions, and no significant deviations from the pure background hypothesis were observed. But one spectacular collision was recorded in which the masses of the two jets were the same. This evidence allows for the possibility that the jets originated from BSM-hypothesized particle decay.

    “Since the LHC is the highest energy collider currently in operation, it is crucial to pay special attention to the highest-dijet-mass events where first hints of new physics at higher energies could start to appear,” Lee said. “This unusual high-mass event could likely be a collision created by the Standard Model background or possibly the first hint of new physics, but with only one event in hand, it is not possible to say which.”

    For now, Lee, postdoctoral research fellow Federico De Guio and doctoral student Zhixing (Tyler) Wang are working to update the dijet resonance search using the full LHC dataset and extend the scope of the analysis.

    “This extension of the search could help prove space-time-matter theory, which requires the existence of several extra spatial dimensions to the universe,” Lee said. “I believe that, with our extensive research experience, Texas Tech’s High Energy Physics group can contribute to making such discoveries.”

    Enhancing the missing momentum microscope

    Included in the ongoing new-particle search using the LHC is the pursuit of dark matter, an elusive, invisible form of matter that dominates the matter content of the universe.

    “Currently, the LHC is producing the highest-energy collisions from an accelerator in the world, and my primary research interest is in understanding whether or not new states of matter are being produced in these collisions,” said Andrew Whitbeck, an assistant professor in the Department of Physics & Astronomy.

    4
    Whitbeck

    “Specifically, we are looking for dark matter produced in association with quarks, the constituents of the proton and neutron. These signatures are important both for understanding the nature of dark matter and for understanding the nature of the Higgs boson, a cornerstone of our theory for how elementary particles interact.”

    The discovery of the Higgs boson at the LHC in 2012 was a widely celebrated accomplishment of the LHC and the detector collaborations involved.

    Peter Higgs


    CERN CMS Higgs Event


    CERN ATLAS Higgs Event

    However, the mere existence of the Higgs boson has provoked a lot of questions about whether there are new particles that could help us better understand the Higgs boson and other questions, like why gravity is so weak compared to other forces.

    As an offshoot of that finding, Whitbeck has been working to better understand a type of particle called neutrinos.

    “Neutrinos are a unique particle in the catalog of known particles in that they are the lightest matter particles, and they only can interact with particles via the Weak force, which, as its name suggests, only produces a feeble force between neutrinos and other matter,” Whitbeck said. “Neutrinos are so weakly interacting at the energies produced by the LHC that it is very likely a neutrino travels through the entire earth without deviating from its initial trajectory.

    “Dark matter is expected to behave similarly given that, despite being all around us, we don’t directly see it. This means that in looking for dark matter produced in proton-proton collisions, we often find lots of neutrinos. Understanding how many events with neutrinos there are is an important first step to understanding if there are events with dark matter.”

    Since the discovery of the Higgs boson, many of the most obvious signatures have come up empty for any signs of dark matter, and the latest results are some of the most sensitive measurements done to date. However, Whitbeck and his fellow scientists will continue to look for many more subtle signatures as well as a very powerful signature in which dark matter hypothetically is produced almost by itself, with only one lonely proton fragment visible in the event. The strategy provides powerful constraints for the most difficult-to-see models of dark matter.

    “With all of the traditional ways of searching for dark matter in proton-proton collisions turning up empty, I have also been working to design a new experiment, the Light Dark Matter eXperiment (LDMX), that will employ detector technology and techniques similar to what is used at CMS to look for dark matter,” Whitbeck said.

    6
    Texas Tech The LDMX Experiment schematic

    “One significant difference is that LDMX will look at electrons bombarding a target. If the mass of dark matter is somewhere between the mass of the electron and the mass of the proton, this experiment will likely be able to see it.”

    Texas Tech also is working to upgrade the CMS detector so it can handle much higher rates of collisions after the LHC undergoes some upgrades of its own. The hope is that with higher rates, they’ll be able to see not only new massive particles but also the rarest of processes, such as the production of two Higgs bosons. This detector construction is ramping up now at Texas Tech’s new Advanced Physics Detector Laboratory at Reese Technology Center.

    Besides being a background for dark matter searches, neutrinos also are a growing focus of research in particle physics. Even now, the Fermi National Accelerator Laboratory is able to produce intense beams of neutrinos that can be used to study their idiosyncrasies, but there are plans to upgrade the facility to produce the most intense beams of neutrinos ever and to place the most sensitive neutrino detectors nearby, making the U.S. the center of neutrino physics.

    FNAL/NOvA experiment map

    FNAL LBNF/DUNE from FNAL to SURF, Lead, South Dakota, USA

    Measurements done with these neutrinos could unlock whether these particles play a big role in the creation of a matter-dominated universe.

    Texas Tech’s High Energy Physics group hopes that, in the near future, it can help tackle some of the challenges this endeavor presents.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    FNAL Icon

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics. Fermilab is America’s premier laboratory for particle physics and accelerator research, funded by the U.S. Department of Energy. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.

     
  • richardmitnick 12:10 pm on July 15, 2019 Permalink | Reply
    Tags: , , , , CERN LHC, , , , ,   

    From CERN: “Exploring the Higgs boson “discovery channels” 

    Cern New Bloc

    Cern New Particle Event


    From CERN

    12th July 2019
    ATLAS Collaboration

    1
    Event display of a two-electron two-muon ZH candidate. The Higgs candidate can be seen on the left with the two leading electrons represented by green tracks and green EM calorimeter deposits (pT = 22 and 120 GeV), and two subleading muons indicated by two red tracks (pT = 34 and 43 GeV). Recoiling against the four lepton candidate in the left hemisphere is a dimuon pair in the right hemisphere indicated by two red tracks (pT = 139 and 42 GeV) and an invariant mass of 91.5 GeV, which agrees well with the mass of the Z boson. (Image: ATLAS Collaboration/CERN)

    At the 2019 European Physical Society’s High-Energy Physics conference (EPS-HEP) taking place in Ghent, Belgium, the ATLAS and CMS collaborations presented a suite of new results. These include several analyses using the full dataset from the second run of CERN’s Large Hadron Collider (LHC), recorded at a collision energy of 13 TeV between 2015 and 2018. Among the highlights are the latest precision measurements involving the Higgs boson. In only seven years since its discovery, scientists have carefully studied several of the properties of this unique particle, which is increasingly becoming a powerful tool in the search for new physics.

    The results include new searches for transformations (or “decays”) of the Higgs boson into pairs of muons and into pairs of charm quarks. Both ATLAS and CMS also measured previously unexplored properties of decays of the Higgs boson that involve electroweak bosons (the W, the Z and the photon) and compared these with the predictions of the Standard Model (SM) of particle physics. ATLAS and CMS will continue these studies over the course of the LHC’s Run 3 (2021 to 2023) and in the era of the High-Luminosity LHC (from 2026 onwards).

    The Higgs boson is the quantum manifestation of the all-pervading Higgs field, which gives mass to elementary particles it interacts with, via the Brout-Englert-Higgs mechanism. Scientists look for such interactions between the Higgs boson and elementary particles, either by studying specific decays of the Higgs boson or by searching for instances where the Higgs boson is produced along with other particles. The Higgs boson decays almost instantly after being produced in the LHC and it is by looking through its decay products that scientists can probe its behaviour.

    In the LHC’s Run 1 (2010 to 2012), decays of the Higgs boson involving pairs of electroweak bosons were observed. Now, the complete Run 2 dataset – around 140 inverse femtobarns each, the equivalent of over 10 000 trillion collisions – provides a much larger sample of Higgs bosons to study, allowing measurements of the particle’s properties to be made with unprecedented precision. ATLAS and CMS have measured the so-called “differential cross-sections” of the bosonic decay processes, which look at not just the production rate of Higgs bosons but also the distribution and orientation of the decay products relative to the colliding proton beams. These measurements provide insight into the underlying mechanism that produces the Higgs bosons. Both collaborations determined that the observed rates and distributions are compatible with those predicted by the Standard Model, within the current statistical uncertainties.

    Since the strength of the Higgs boson’s interaction is proportional to the mass of elementary particles, it interacts most strongly with the heaviest generation of fermions, the third. Previously, ATLAS and CMS had each observed these interactions. However, interactions with the lighter second-generation fermions – muons, charm quarks and strange quarks – are considerably rarer. At EPS-HEP, both collaborations reported on their searches for the elusive second-generation interactions.
    ATLAS presented their first result from searches for Higgs bosons decaying to pairs of muons (H→μμ) with the full Run 2 dataset. This search is complicated by the large background of more typical SM processes that produce pairs of muons. “This result shows that we are now close to the sensitivity required to test the Standard Model’s predictions for this very rare decay of the Higgs boson,” says Karl Jakobs, the ATLAS spokesperson. “However, a definitive statement on the second generation will require the larger datasets that will be provided by the LHC in Run 3 and by the High-Luminosity LHC.”

    CMS presented their first result on searches for decays of Higgs bosons to pairs of charm quarks (H→cc). When a Higgs boson decays into quarks, these elementary particles immediately produce jets of particles. “Identifying jets formed by charm quarks and isolating them from other types of jets is a huge challenge,” says Roberto Carlin, spokesperson for CMS. “We’re very happy to have shown that we can tackle this difficult decay channel. We have developed novel machine-learning techniques to help with this task.”

    3
    An event recorded by CMS showing a candidate for a Higgs boson produced in association with two top quarks. The Higgs boson and top quarks decay leading to a final state with seven jets (orange cones), an electron (green line), a muon (red line) and missing transverse energy (pink line) (Image: CMS/CERN)

    The Higgs boson also acts as a mediator of physics processes in which electroweak bosons scatter or bounce off each other. Studies of these processes with very high statistics serve as powerful tests of the Standard Model. ATLAS presented the first-ever measurement of the scattering of two Z bosons. Observing this scattering completes the picture for the W and Z bosons as ATLAS has previously observed the WZ scattering process and both collaborations the WW processes. CMS presented the first observation of electroweak-boson scattering that results in the production of a Z boson and a photon.

    “The experiments are making big strides in the monumental task of understanding the Higgs boson,” says Eckhard Elsen, CERN’s Director of Research and Computing. “After observation of its coupling to the third-generation fermions, the experiments have now shown that they have the tools at hand to address the even more challenging second generation. The LHC’s precision physics programme is in full swing.”

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Meet CERN in a variety of places:

    Quantum Diaries
    QuantumDiaries

    Cern Courier

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS

    CERN ATLAS Image Claudia Marcelloni CERN/ATLAS


    ALICE

    CERN/ALICE Detector


    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    LHC

    CERN map

    CERN LHC Grand Tunnel

    CERN LHC particles

     
  • richardmitnick 7:29 am on May 23, 2019 Permalink | Reply
    Tags: "Atom smasher could be making new particles that are hiding in plain sight", , , , CERN LHC, Compact Detector for Exotics at LHCb, ,   

    From Science Magazine: “Atom smasher could be making new particles that are hiding in plain sight” 

    AAAS
    From Science Magazine

    May. 22, 2019
    Adrian Cho

    1
    In a simulated event, the track of a decay particle called a muon (red), displaced slightly from the center of particle collisions, could be a sign of new physics.
    ATLAS EXPERIMENT © 2019 CERN

    Are new particles materializing right under physicists’ noses and going unnoticed? The world’s great atom smasher, the Large Hadron Collider (LHC), could be making long-lived particles that slip through its detectors, some researchers say.

    LHC

    CERN map


    CERN LHC Maximilien Brice and Julien Marius Ordan


    CERN LHC particles

    Next week, they will gather at the LHC’s home, CERN, the European particle physics laboratory near Geneva, Switzerland, to discuss how to capture them.


    They argue the LHC’s next run should emphasize such searches, and some are calling for new detectors that could sniff out the fugitive particles.

    It’s a push born of anxiety. In 2012, experimenters at the $5 billion LHC discovered the Higgs boson, the last particle predicted by the standard model of particles and forces, and the key to explaining how fundamental particles get their masses.

    CERN CMS Higgs Event


    CERN ATLAS Higgs Event

    But the LHC has yet to blast out anything beyond the standard model.

    Standard Model of Particle Physics (LATHAM BOYLE AND MARDUS OF WIKIMEDIA COMMONS)

    “We haven’t found any new physics with the assumptions we started with, so maybe we need to change the assumptions,” says Juliette Alimena, a physicist at Ohio State University (OSU) in Columbus who works with the Compact Muon Solenoid (CMS), one of the two main particle detectors fed by the LHC.

    CERN/CMS Detector


    For decades, physicists have relied on a simple strategy to look for new particles: Smash together protons or electrons at ever-higher energies to produce heavy new particles and watch them decay instantly into lighter, familiar particles within the huge, barrel-shaped detectors. That’s how CMS and its rival detector, A Toroidal LHC Apparatus (ATLAS), spotted the Higgs, which in a trillionth of a nanosecond can decay into, among other things, a pair of photons or two “jets” of lighter particles.

    CERN ATLAS Credit CERN SCIENCE PHOTO LIBRARY

    Long-lived particles, however, would zip through part or all of the detector before decaying. That idea is more than a shot in the dark, says Giovanna Cottin, a theorist at National Taiwan University in Taipei. “Almost all the frameworks for beyond-the-standard-model physics predict the existence of long-lived particles,” she says. For example, a scheme called supersymmetry posits that every standard model particle has a heavier superpartner, some of which could be long-lived. Long-lived particles also emerge in “dark sector” theories that envision undetectable particles that interact with ordinary matter only through “porthole” particles, such as a dark photon that every so often would replace an ordinary photon in a particle interaction.

    CMS and ATLAS, however, were designed to detect particles that decay instantaneously. Like an onion, each detector contains layers of subsystems—trackers that trace charged particles, calorimeters that measure particle energies, and chambers that detect penetrating and particularly handy particles called muons—all arrayed around a central point where the accelerator’s proton beams collide. Particles that fly even a few millimeters before decaying would leave unusual signatures: kinked or offset tracks, or jets that emerge gradually instead of all at once.
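    As a concrete toy example of what an "offset" signature means here, the selection below flags decay vertices that sit more than a few millimetres from the beamline. The vertices and the 3 mm threshold are invented for illustration and are not an ATLAS or CMS cut:

        # Toy displaced-vertex selection; coordinates in millimetres, numbers made up.
        import math

        def transverse_displacement(vertex):
            x, y, _ = vertex                      # distance from the beam axis ignores z
            return math.hypot(x, y)

        vertices_mm = [(0.02, -0.01, 5.0), (4.8, 2.1, -12.0), (0.1, 0.05, 30.0)]
        displaced = [v for v in vertices_mm if transverse_displacement(v) > 3.0]
        print(f"{len(displaced)} candidate long-lived decay(s)")      # prints 1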

    Standard data analysis often assumes such oddities are mistakes and junk, notes Tova Holmes, an ATLAS member from the University of Chicago in Illinois who is searching for the displaced tracks of decays from long-lived supersymmetric particles. “It’s a bit of a challenge because the way we’ve designed things, and the software people have written, basically rejects these things,” she says. So Holmes and colleagues had to rewrite some of that software.

    More important is ensuring that the detectors record the odd events in the first place. The LHC smashes bunches of protons together 400 million times a second. To avoid data overload, trigger systems on CMS and ATLAS sift interesting collisions from dull ones and immediately discard the data from about 1,999 of every 2,000 collisions. The culling can inadvertently toss out long-lived particles. Alimena and colleagues wanted to look for particles that live long enough to get stuck in CMS’s calorimeter and decay only later. So they had to put in a special trigger that occasionally reads out the entire detector between the proton collisions.
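    In code terms, a trigger is a fast decision function applied to every collision, often combined with a "prescale" that keeps only a small random fraction of ordinary events. The sketch below is a heavily simplified caricature: the event content, the 1-in-2,000 prescale applied to everything and the out-of-time-energy condition standing in for the special calorimeter trigger are all invented for illustration; real trigger menus are far richer:

        # Caricature of trigger culling with a prescale plus one special condition; all numbers invented.
        import random

        PRESCALE = 2000                            # keep roughly 1 in 2,000 ordinary events

        def is_unusual(event):
            # stand-in for a dedicated long-lived-particle condition, e.g. late energy in the calorimeter
            return event["out_of_time_energy_gev"] > 50.0

        def keep(event):
            return is_unusual(event) or random.randrange(PRESCALE) == 0

        events = [{"out_of_time_energy_gev": random.expovariate(1 / 5.0)} for _ in range(100_000)]
        kept = [e for e in events if keep(e)]
        print(f"kept {len(kept)} of {len(events)} events")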

    Long-lived particle searches had been fringe efforts, says James Beacham, an ATLAS experimenter from OSU. “It’s always been one guy working on this thing,” he says. “Your support group was you in your office.” Now, researchers are joining forces. In March, 182 of them released a 301-page white paper on how to optimize their searches.

    Some want ATLAS and CMS to dedicate more triggers to long-lived particle searches in the next LHC run, from 2021 through 2023. In fact, the next run “is probably our last chance to look for unusual rare events,” says Livia Soffi, a CMS member from the Sapienza University of Rome. Afterward, an upgrade will increase the intensity of the LHC’s beams, requiring tighter triggers.

    Others have proposed a half-dozen new detectors to search for particles so long-lived that they escape the LHC’s existing detectors altogether. Jonathan Feng, a theorist at the University of California, Irvine, and colleagues have won CERN approval for the Forward Search Experiment (FASER), a small tracker to be placed in a service tunnel 480 meters down the beamline from ATLAS.

    CERN FASER experiment schematic

    Supported by $2 million from private foundations and built of borrowed parts, FASER will look for low-mass particles such as dark photons, which could spew from ATLAS, zip through the intervening rock, and decay into electron-positron pairs.

    Another proposal calls for a tracking chamber in an empty hall next to the LHCb, a smaller detector fed by the LHC.

    CERN/LHCb detector

    The Compact Detector for Exotics at LHCb would look for long-lived particles, especially those born in Higgs decays, says Vladimir Gligorov, an LHCb member from the Laboratory for Nuclear Physics and High Energies in Paris.

    3
    The Compact Detector for Exotics at LHCb. https://indico.cern.ch/event/755856/contributions/3263683/attachments/1779990/2897218/PBC2019_CERN_CodexB_report.pdf

    Even more ambitious would be a detector called MATHUSLA, essentially a large, empty building on the surface above the subterranean CMS detector.

    5
    MATHUSLA. http://cds.cern.ch/record/2653848

    Tracking chambers in the ceiling would detect jets spraying up from the decays of long-lived particles created 70 meters below, says David Curtin, a theorist at the University of Toronto in Canada and project co-leader. Curtin is “optimistic” MATHUSLA would cost less than €100 million. “Given that it has sensitivity to this broad range of signatures—and that we haven’t seen anything else—I’d say it’s a no-brainer.”

    Physicists have a duty to look for the odd particles, Beacham says. “The nightmare scenario is that in 20 years, Jill Theorist says, ‘The reason you didn’t see anything is you didn’t keep the right events and do the right search.’”

    See the full article here .



    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 12:04 pm on May 14, 2019 Permalink | Reply
    Tags: >Model-dependent vs model-independent research, , , , CERN LHC, , , , , , ,   

    From Symmetry: “Casting a wide net” 

    Symmetry Mag
    From Symmetry

    05/14/19
    Jim Daley

    1
    Illustration by Sandbox Studio, Chicago

    In their quest to discover physics beyond the Standard Model, physicists weigh the pros and cons of different search strategies.

    On October 30, 1975, theorists John Ellis, Mary K. Gaillard and D.V. Nanopoulos published a paper [Science Direct] titled “A Phenomenological Profile of the Higgs Boson.” They ended their paper with a note to their fellow scientists.

    “We should perhaps finish with an apology and a caution,” it said. “We apologize to experimentalists for having no idea what is the mass of the Higgs boson… and for not being sure of its couplings to other particles, except that they are probably all very small.

    “For these reasons, we do not want to encourage big experimental searches for the Higgs boson, but we do feel that people performing experiments vulnerable to the Higgs boson should know how it may turn up.”

    What the theorists were cautioning against was a model-dependent search, a search for a particle predicted by a certain model—in this case, the Standard Model of particle physics.

    Standard Model of Particle Physics

    It shouldn’t have been too much of a worry. Around then, most particle physicists’ experiments were general searches, not based on predictions from a particular model, says Jonathan Feng, a theoretical particle physicist at the University of California, Irvine.

    Using early particle colliders, physicists smashed electrons and protons together at high energies and looked to see what came out. Samuel Ting and Burton Richter, who shared the 1976 Nobel Prize in physics for the discovery of the charm quark, for example, were not looking for the particle with any theoretical prejudice, Feng says.

    That began to change in the 1980s and ’90s. That’s when physicists began exploring elegant new theories such as supersymmetry, which could tie up many of the Standard Model’s theoretical loose ends—and which predict the existence of a whole slew of new particles for scientists to try to find.

    Of course, there was also the Higgs boson. Even though scientists didn’t have a good prediction of its mass, they had good motivations for thinking it was out there waiting to be discovered.

    And it was. Almost 40 years after the theorists’ tongue-in-cheek warning about searching for the Higgs, Ellis found himself sitting in the main auditorium at CERN next to experimentalist Fabiola Gianotti, the spokesperson of the ATLAS experiment at the Large Hadron Collider who, along with CMS spokesperson Joseph Incandela, had just co-announced the discovery of the particle he had once so pessimistically described.

    CERN CMS Higgs Event


    CERN ATLAS Higgs Event

    Model-dependent vs model-independent

    Scientists’ searches for particles predicted by certain models continue, but in recent years, searches for new physics independent of those models have begun to enjoy a resurgence as well.

    “A model-independent search is supposed to distill the essence from a whole bunch of specific models and look for something that’s independent of the details,” Feng says. The goal is to find an interesting common feature of those models, he explains. “And then I’m going to just look for that phenomenon, irrespective of the details.”

    Particle physicist Sara Alderweireldt uses model-independent searches in her work on the ATLAS experiment at the Large Hadron Collider.

    CERN ATLAS Image Claudia Marcelloni CERN/ATLAS

    Alderweireldt says that while many high-energy particle physics experiments are designed to make very precise measurements of a specific aspect of the Standard Model, a model-independent search allows physicists to take a wider view and search more generally for new particles or interactions. “Instead of zooming in, we try to look in as many places as possible in a consistent way.”

    Such a search makes room for the unexpected, she says. “You’re not dependent on the prior interpretation of something you would be looking for.”

    Theorist Patrick Fox and experimentalist Anadi Canepa, both at Fermilab, collaborate on searches for new physics.


    In Canepa’s work on the CMS experiment, the other general-purpose particle detector at the LHC, many of the searches are model-independent.

    While the nature of these searches allows them to “cast a wider net,” Fox says, “they are in some sense shallower, because they don’t manage to strongly constrain any one particular model.”

    At the same time, “by combining the results from many independent searches, we are getting closer to one dedicated search,” Canepa says. “Developing both model-dependent and model-independent searches is the approach adopted by the CMS and ATLAS experiments to fully exploit the unprecedented potential of the LHC.”

    Driven by data and powered by machine learning

    Model-dependent searches focus on a single assumption or look for evidence of a specific final state following an experimental particle collision. Model-independent searches are far broader—and how broad is largely driven by the speed at which data can be processed.

    “We have better particle detectors, and more advanced algorithms and statistical tools that are enabling us to understand searches in broader terms,” Canepa says.

    One reason model-independent searches are gaining prominence is because now there is enough data to support them. Particle detectors are recording vast quantities of information, and modern computers can run simulations faster than ever before, she says. “We are able to do model-independent searches because we are able to better understand much larger amounts of data and extreme regions of parameter and phase space.”

    Machine-learning is a key part of this processing power, Canepa says. “That’s really a change of paradigm, because it really made us make a major leap forward in terms of sensitivity [to new signals]. It really allows us to benefit from understanding the correlations that we didn’t capture in a more classical approach.”

    These broader searches are an important part of modern particle physics research, Fox says.

    “At a very basic level, our job is to bequeath to our descendants a better understanding of nature than we got from our ancestors,” he says. “One way to do that is to produce lots of information that will stand the test of time, and one way of doing that is with model-independent searches.”

    Models go in and out of fashion, he adds. “But model-independent searches don’t feel like they will.”

    See the full article here .



    Please help promote STEM in your local schools.


    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     