Tagged: CERN LHC

  • richardmitnick 2:42 pm on April 10, 2018 Permalink | Reply
    Tags: CERN LHC, Now the question is what if there is a whole sector of undiscovered particles that cannot communicate with our standard particles but can interact with the Higgs boson?, Theorists predict that about 90 percent of Higgs bosons are created through gluon fusion

    From Symmetry: “How to make a Higgs boson” 

    Symmetry Mag
    Symmetry

    04/10/18
    Sarah Charley

    CERN CMS Higgs Event


    CERN ATLAS Higgs Event

    It doesn’t seem like collisions of particles with no mass should be able to produce the “mass-giving” boson, the Higgs. But every other second at the LHC, they do.

    Einstein’s most famous theory, often written as E=mc², tells us that energy and matter are two sides of the same coin.

    The Large Hadron Collider uses this principle to convert the energy contained within ordinary particles into new particles that are difficult to find in nature—particles like the Higgs boson, which is so massive that it almost immediately decays into pairs of lighter, more stable particles.

    LHC

    CERN/LHC Map

    CERN LHC Tunnel

    CERN LHC particles

    But not just any collision can create a Higgs boson.

    “The Higgs is not just created from a ‘poof’ of energy,” says Laura Dodd, a researcher at the University of Wisconsin, Madison. “Particles follow a strict set of laws that dictate how they can form, decay and interact.”

    One of these laws states that Higgs bosons can be produced only by particles that interact with the Higgs field—in other words, particles with mass.

    The Higgs field is like an invisible spider’s web that permeates all of space. As particles travel through it, some get tangled in the sticky tendrils, a process that makes them gain mass and slow down. But for other particles—such as photons and gluons—this web is completely transparent, and they glide through unhindered.

    Given enough energy, the particles wrapped in the Higgs field can transfer their energy into it and kick out a Higgs boson. Because massless particles do not interact with the Higgs field, it would make sense to say that they can’t create a Higgs. But scientists at the LHC would beg to differ.

    The LHC accelerates protons around its 17-mile circumference to just under the speed of light and then brings them into head-on collisions at four intersections along its ring. Protons are not fundamental particles, that is, particles that cannot be broken down into any smaller constituent pieces. Rather, they are made up of quarks and gluons.

    As two pepped-up protons pass through each other, it’s usually pairs of massless gluons that infuse invisible fields with their combined energy and excite other particles into existence—and that includes Higgs bosons.

    __________________________________________________________

    We know that particles follow strict rules about who can talk to whom.
    __________________________________________________________

    How? Gluons have found a way to cheat.

    “It would be impossible to generate Higgs bosons with gluons if the collisions in the LHC were a simple, one-step process,” says Richard Ruiz, a theorist at Durham University’s Institute for Particle Physics Phenomenology.

    Luckily, they aren’t.

    Gluons can momentarily “launder” their energy to a virtual particle, which converts the gluon’s energy into mass. If two gluons produce a pair of virtual top quarks, the tops can recombine and annihilate into a Higgs boson.

    To be clear, virtual particles are not stable particles at all, but rather irregular disturbances in quantum mechanical fields that exist in a half-baked state for an incredibly short period of time. If a real particle were a thriving business, then a virtual particle would be a shell company.

    Theorists predict that about 90 percent of Higgs bosons are created through gluon fusion. The probability of two gluons colliding, creating a top quark-antitop pair and propitiously producing a Higgs is roughly one in 2 billion. However, because the LHC generates about 1 billion proton collisions every second, the odds are in scientists’ favor and the production rate for Higgs bosons is roughly one every two seconds.
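A quick back-of-envelope check reproduces the one-Higgs-every-two-seconds figure. This is a sketch assuming roughly a billion collisions per second, the rate quoted later in this post for the LHC detectors:

```python
# Sanity check of the quoted Higgs production rate.
# Assumptions: ~1e9 proton collisions per second (the figure given
# elsewhere in this post) and a ~1-in-2-billion chance per collision
# of producing a Higgs boson via gluon fusion.
collisions_per_second = 1e9
higgs_probability_per_collision = 1 / 2e9

higgs_per_second = collisions_per_second * higgs_probability_per_collision
seconds_per_higgs = 1 / higgs_per_second
print(seconds_per_higgs)  # 2.0 -> roughly one Higgs every two seconds
```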

    Shortly after the Higgs discovery, scientists were mostly focused on what happens to Higgs bosons after they decay, according to Dodd.

    “But now that we have more data and a better understanding of the Higgs, we’re starting to look closer at the collision byproducts to better understand how frequently the Higgs is produced through the different mechanisms,” she says.

    The Standard Model of particle physics predicts that almost all Higgs bosons are produced through one of four possible processes.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.


    Standard Model of Particle Physics from Symmetry Magazine

    What scientists would love to see are Higgs bosons being created in a way that the Standard Model of particle physics does not predict, such as in the decay of a new particle. Breaking the known rules would show that there is more going on than physicists previously understood.

    “We know that particles follow strict rules about who can talk to whom because we’ve seen this time and time again during our experiments,” Ruiz says. “So now the question is, what if there is a whole sector of undiscovered particles that cannot communicate with our standard particles but can interact with the Higgs boson?”

    Scientists are keeping an eye out for anything unexpected, such as an excess of certain particles radiating from a collision or decay paths that occur more or less frequently than scientists predicted. These indicators could point to undiscovered heavy particles morphing into Higgs bosons.

    At the same time, to find hints of unexpected ingredients in the chain reactions that sometimes make Higgs bosons, scientists must know very precisely what they should expect.

    “We have fantastic mathematical models that predict all this, and we know what both sides of the equations are,” Ruiz says. “Now we need to experimentally test these predictions to see if everything adds up, and if not, figure out what those extra missing variables might be.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 2:44 pm on April 5, 2018 Permalink | Reply
    Tags: CERN LHC, Particle Physicists begin to invent reasons to build next larger Particle Collider

    From BBC via Back Reaction: “Particle Physicists begin to invent reasons to build next larger Particle Collider” 

    BBC

    Back Reaction

    April 04, 2018

    Sabine Hossenfelder

    Nigel Lockyer, the director of Fermilab [FNAL], recently spoke to BBC about the benefits of building a next larger particle collider, one that reaches energies higher than the Large Hadron Collider (LHC).

    Nigel Lockyer


    Such a new collider could measure more precisely the properties of the Higgs-boson. But that’s not all, at least according to Lockyer. He claims he knows there is something new to discover too:

    “Everybody believes there’s something there, but what we’re now starting to question is the scale of the new physics. At what energy does this new physics show up,” said Dr Lockyer. “From a simple calculation of the Higgs’ mass, there has to be new science. We just can’t give up on everything we know as an excuse for where we are now.”

    First, let me note that “everybody believes” is an argument ad populum. It isn’t only non-scientific, it is also wrong because I don’t believe it, qed. But more importantly, the argument for why there has to be new science is wrong.

    To begin with, we can’t calculate the Higgs mass; it’s a free parameter that is determined by measurement. The same goes for the masses of all the other elementary particles. But that’s a matter of imprecise phrasing, and I only bring it up because I’m an ass.

    The argument Lockyer is referring to is based on calculations of quantum corrections to the Higgs mass. That is, he is making the good old argument from naturalness.

    If that argument were right, we should have seen supersymmetric particles already. We didn’t. That’s why Giudice, head of the CERN theory division, has recently rung in the post-naturalness era. Even New Scientist took note of that. But maybe the news hasn’t yet arrived in the USA.

    Naturalness arguments never had a solid mathematical basis. But so far you could have gotten away saying they are handy guides for theory development. Now, however, seeing that these guides were bad guides in that their predictions turned out incorrect, using arguments from naturalness is no longer scientifically justified. If it ever was. This means we have no reason to expect new science, not in the not-yet analyzed LHC data and not at a next larger collider.

    Of course there could be something new. I am all in favor of building a larger collider and just see what happens. But please let’s stick to the facts: There is no reason to think a new discovery is around the corner.

    I don’t think Lockyer deliberately lied to BBC. He’s an experimentalist and probably actually believes what the theorists tell him. He has all reasons for wanting to believe it. But really he should know better.

    Much more worrisome than Lockyer’s false claim is that literally no one from the community tried to correct it. Heck, it’s like the head of NASA just told BBC we know there’s life on Mars! If that happened, astrophysicists would collectively vomit on social media. But particle physicists? They all keep their mouth shut if one of theirs spreads falsehoods. And you wonder why I say you can’t trust them?

    Meanwhile Gordon Kane, a US particle physicist known for his unswerving support of supersymmetry, has made an interesting move: he has discarded naturalness arguments altogether.

    You find this in a paper which appeared on the arXiv today. It seems to be a promotional piece that Kane wrote together with Stephen Hawking some months ago to advocate the Chinese Super Proton Proton Collider (SPPC) [So far, the Chinese physics community thinks this is a waste of money.].

    Kane has claimed for 15 years or so that the LHC would have to see supersymmetric particles because of naturalness. Now that this didn’t work out, he has come up with a new reason for why a next larger collider should see something:

    “Some people have said that the absence of superpartners or other phenomena at LHC so far makes discovery of superpartners unlikely. But history suggests otherwise. Once the [bottom] quark was found, in 1979, people argued that “naturally” the top quark would only be a few times heavier. In fact the top quark did exist, but was forty-one times heavier than the [bottom] quark, and was only found nearly twenty years later. If superpartners were forty-one times heavier than Z-bosons they would be too heavy to detect at LHC and its upgrades, but could be detected at SPPC.”

    Indeed, nothing forbids superpartners to be forty-one times heavier than Z-bosons. Neither is there anything that forbids them to be four-thousand times heavier, or four billion times heavier. Indeed, they don’t even have to be there at all. Isn’t it beautiful?

    Leaving aside that just because we can’t calculate the masses doesn’t mean they have to be near the discovery-threshold, the historical analogy doesn’t work for several reasons.

    Most importantly, quarks come in pairs that are SU(2) doublets. This means once you have the bottom quark, you know it needs to have a partner. If there were no top quark, you’d have to give up the symmetry of the standard model, which was established with the lighter quarks.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.


    Standard Model of Particle Physics from Symmetry Magazine

    Supersymmetry, in contrast, has no evidence among the already known particles speaking in its favor.

    Standard model of Supersymmetry DESY

    Physicists have also known since the early 1970s that the weak nuclear force violates CP invariance, which requires (at least) three generations of quarks. Because of this, the existence of both the bottom and top quarks was already predicted in 1973.
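The claim that CP violation requires at least three generations rests on the Kobayashi-Maskawa parameter count. The counting below is the standard textbook argument, not spelled out in the post:

```latex
% Parameter count for an $n \times n$ unitary quark-mixing (CKM) matrix:
% unitarity leaves $n^2$ real parameters, of which $2n-1$ phases can be
% absorbed into redefinitions of the quark fields, leaving
\[
\underbrace{\tfrac{1}{2}\,n(n-1)}_{\text{mixing angles}}
\;+\;
\underbrace{\tfrac{1}{2}\,(n-1)(n-2)}_{\text{CP-violating phases}} .
\]
% For $n=2$ there is no physical phase, hence no CP violation;
% $n=3$ gives exactly one phase, which is why observed CP violation
% demands (at least) a third generation.
```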

    Finally, for anomaly cancellation to work you need as many leptons as quarks, and the tau and the tau neutrino (the third generation of leptons) had already been measured in 1975 and 1977, respectively. (We also know the top quark mass can’t be too far away from the bottom quark mass, and the Higgs mass has to be close to the top quark mass, but this calculation wasn’t available in the 1970s.)

    In brief this means if the top quark had not been found, the whole standard model wouldn’t have worked. The standard model, however, works just fine without supersymmetric particles.

    Of course Gordon Kane knows all this. But desperate times call for desperate measures I guess.

    In the Kane-Hawking pamphlet we also read:

    “In addition, a supersymmetric theory has the remarkable property that it can relate physics at our scale, where colliders take data, with the Planck scale, the natural scale for a fundamental physics theory, which may help in the efforts to find a deeper underlying theory.”

    I don’t disagree with this. But it’s a funny statement because for 30 years or so we have been told that supersymmetry has the virtue of removing the sensitivity to Planck scale effects. So, actually the absence of naturalness holds much more promise to make that connection to higher energy. In other words, I say, the way out is through.

    I wish I could say I’m surprised to see such wrong claims boldly being made in public. But then I only just wrote two weeks ago that the lobbying campaign is likely to start soon. And, lo and behold, here we go.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 10:45 am on April 5, 2018 Permalink | Reply
    Tags: A Second 'Big Bang' Could End Our Universe in an Instant, CERN LHC, Thanks to The Higgs Boson

    From Harvard via Science Alert: “A Second ‘Big Bang’ Could End Our Universe in an Instant, Thanks to The Higgs Boson” 

    Harvard University

    ScienceAlert

    Science Alert

    Well, that’s just great.

    A Black Hole Artist Concept. (NASA/JPL-Caltech)

    5 APR 2018
    JEREMY BERKE, BUSINESS INSIDER

    Our universe may end the same way it was created: with a big, sudden bang. That’s according to new research from a group of Harvard physicists, who found that the destabilization of the Higgs boson – a tiny quantum particle that gives other particles mass – could lead to an explosion of energy that would consume everything in the known universe and upend the laws of physics and chemistry.

    As part of their study, published last month in the journal Physical Review D, the researchers calculated when our universe could end.

    It’s nothing to worry about just yet. They settled on a date 10^139 years from now, or 10 million trillion trillion trillion trillion trillion trillion trillion trillion trillion trillion trillion years in the future. And they’re at least 95 percent sure – a statistical measure of certainty – that the universe will last at least another 10^58 years.

    The Higgs boson, discovered in 2012 by researchers smashing protons together at the Large Hadron Collider, has a specific mass.

    LHC

    CERN/LHC Map

    CERN LHC Tunnel

    CERN LHC particles

    CERN CMS Higgs Event

    CERN ATLAS Higgs Event

    If the researchers are correct, that mass could change, turning physics on its head and tearing apart the elements that make life possible, according to the New York Post.

    And rather than burning slowly over trillions of years, an unstable Higgs boson could create an instantaneous bang, like the Big Bang that created our universe.

    The researchers say a collapse could be driven by the curvature of space-time around a black hole, somewhere deep in the universe. When space-time curves around super-dense objects, like a black hole, it throws the laws of physics out of whack and causes particles to interact in all sorts of strange ways.

    The researchers say the collapse may have already begun – but we have no way of knowing, as the Higgs boson particle may be far away from where we can analyse it, within our seemingly infinite universe. “It turns out we’re right on the edge between a stable universe and an unstable universe,” Joseph Lykken, a physicist from the Fermi National Accelerator Laboratory who was not involved in the study, told the Post.

    He added: “We’re sort of right on the edge where the universe can last for a long time, but eventually, it should go ‘boom.'”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Harvard University campus
    Harvard is the oldest institution of higher education in the United States, established in 1636 by vote of the Great and General Court of the Massachusetts Bay Colony. It was named after the College’s first benefactor, the young minister John Harvard of Charlestown, who upon his death in 1638 left his library and half his estate to the institution. A statue of John Harvard stands today in front of University Hall in Harvard Yard, and is perhaps the University’s best known landmark.

    Harvard University has 12 degree-granting Schools in addition to the Radcliffe Institute for Advanced Study. The University has grown from nine students with a single master to an enrollment of more than 20,000 degree candidates including undergraduate, graduate, and professional students. There are more than 360,000 living alumni in the U.S. and over 190 other countries.

     
  • richardmitnick 6:50 am on March 29, 2018 Permalink | Reply
    Tags: CERN LHC

    From Science Node: “CERN pushes back the frontiers of physics” 

    Science Node bloc
    Science Node

    27 Mar, 2018
    Maria Girone
    CERN openlab Chief Technology Officer

    “Researchers at the European Organization for Nuclear Research (CERN) are probing the fundamental structure of the universe. They use the world’s largest and most complex scientific machines to study the basic constituents of matter — the fundamental particles.

    These particles are made to collide at close to the speed of light. This process gives physicists clues about how the particles interact, and provides insights into the laws of nature.

    CERN is home to the Large Hadron Collider (LHC), the world’s most powerful particle accelerator.

    LHC

    CERN/LHC Map

    CERN LHC Tunnel

    CERN LHC particles

    It consists of a 27km ring of superconducting magnets, combined with accelerating structures to boost the energy of the particles prior to the collisions. Special detectors — similar to large, 3D digital cameras built in cathedral-sized caverns —observe and record the results of these collisions.

    One billion collisions per second

    Up to about 1 billion particle collisions can take place every second inside the LHC experiments’ detectors. It is not possible to examine all of these events. Hardware and software filtering systems are used to select potentially interesting events for further analysis.

    Even after filtering, the CERN data center processes hundreds of petabytes (PB) of data every year. Around 150 PB are stored on disk at the site in Switzerland, with over 200 PB on tape — the equivalent of about 2,000 years of HD video.

    Physicists must sift through the 30-50 PB of data produced annually by the LHC experiments to determine if the collisions have revealed any interesting physics. The Worldwide LHC Computing Grid (WLCG), a distributed computing infrastructure arranged in tiers, gives a community of thousands of physicists near-real-time access to LHC data.

    Power up. The planned upgrades to the Large Hadron Collider. Image courtesy CERN.

    With 170 computing centers in 42 countries, the WLCG is the most sophisticated data-taking and analysis system ever built for science. It runs more than two million jobs per day.

    The LHC has been designed to follow a carefully planned program of upgrades. The LHC typically produces particle collisions for a period of around three years (known as a ‘run’), followed by a period of about two years for upgrade and maintenance work (known as a ‘long shutdown’).

    The High-Luminosity Large Hadron Collider (HL-LHC), scheduled to come online around 2026, will crank up the performance of the LHC and increase the potential for discoveries. The higher the luminosity, the more collisions, and the more data the experiments can gather.

    An increased rate of collision events means that digital reconstruction becomes significantly more complex. At the same time, the LHC experiments plan to employ new, more flexible filtering systems that will collect a greater number of events.

    This will drive a huge increase in computing needs. Using current software, hardware, and analysis techniques, the estimated computing capacity required would be around 50-100 times higher than today. Data storage needs are expected to be in the order of exabytes by this time.

    Technology advances over the next seven to ten years will likely yield an improvement of approximately a factor of ten in both the amount of processing and storage available at the same cost, but will still leave a significant resource gap. Innovation is therefore vital; we are exploring new technologies and methodologies together with the world’s leading information and communications technology (ICT) companies.
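A rough sketch of the resource gap these numbers imply, taking the 50-100x capacity need and the expected tenfold technology gain above at face value:

```python
# Estimate of the HL-LHC computing shortfall implied by the figures above:
# 50-100x more capacity needed, with ~10x expected from technology trends
# at constant cost.
needed_factor_low, needed_factor_high = 50, 100
technology_gain = 10

shortfall_low = needed_factor_low / technology_gain
shortfall_high = needed_factor_high / technology_gain
print(shortfall_low, shortfall_high)  # 5.0 10.0 -> a 5-10x gap to close
```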

    Tackling tomorrow’s challenges today

    CERN openlab works to develop and test the new ICT techniques that help to make groundbreaking physics discoveries possible. Established in 2001, the unique public-private partnership provides a framework through which CERN collaborates with leading companies to accelerate the development of cutting-edge technologies.

    My colleagues and I have been busy working to identify the key challenges that will face the LHC research community in the coming years. Last year, we carried out an in-depth consultation process, involving workshops and discussions with representatives of the LHC experiments, the CERN IT department, our collaborators from industry, and other ‘big science’ projects.

    Based on our findings, we published the CERN openlab white paper on future ICT challenges in scientific research. We identified 16 ICT challenge areas, grouped into major R&D topics that are ripe for tackling together with industry collaborators.

    In data-center technologies, we need to ensure that data-center architectures are flexible and cost effective and that cloud computing resources can be used in a scalable, hybrid manner. New technologies for solving storage capacity issues must be thoroughly investigated, and long-term data-storage systems should be reliable and economically viable.

    We also need modernized code to ensure that maximum performance can be achieved on the new hardware platforms. Successfully translating the huge potential of machine learning into concrete solutions will play a role in monitoring the accelerator chain, optimizing the use of IT resources, and even hunting for new physics.

    Several IT challenges are common across research disciplines. With ever more research fields adopting methodologies driven by big data, it’s vital that we collaborate with research communities such as astrophysics, biomedicine, and Earth sciences.

    As well as sharing tools and learning from one another’s experience, working together to address common challenges can increase our ability to ensure that leading ICT companies are producing solutions that meet our common needs.

    These challenges must be tackled over the coming years in order to ensure that physicists across the globe can exploit CERN’s world-leading experimental infrastructure to its maximum potential. We believe that working together with industry leaders through CERN openlab can play a key role in overcoming these challenges, for the benefit of both the high-energy physics community and wider society.”

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

    Science Node is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, Science Node is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read Science Node via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

     
  • richardmitnick 2:22 pm on March 27, 2018 Permalink | Reply
    Tags: CERN LHC

    From Symmetry: “Keeping the LHC Cold” 

    Symmetry Mag
    Symmetry

    Artwork by Sandbox Studio, Chicago with Ana Kova

    03/27/18
    Sarah Charley

    The LHC is one of the coldest places on the planet.

    LHC

    CERN/LHC Map

    CERN LHC Tunnel

    CERN LHC particles

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS
    CERN ATLAS New

    ALICE
    CERN ALICE New

    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    OTHER PROJECTS AT CERN

    CERN AEGIS

    CERN ALPHA

    CERN AMS

    CERN ASACUSA

    CERN ATRAP

    CERN AWAKE

    CERN CAST Axion Solar Telescope

    CERN CLOUD

    CERN COMPASS

    CERN DIRAC

    CERN ISOLDE

    CERN LHCf

    CERN NA62

    CERN NTOF

    CERN TOTEM

    CERN UA9

    Liquid helium is constantly pulsing through sophisticated plumbing that runs both inside and outside of the Large Hadron Collider. Thanks to this cryogenic cooling system, the LHC is colder than interstellar space.

    But why does it need to be kept at these intensely frigid temperatures?

    “Because if not, the magnets would not work,” says Serge Claudet, the deputy head of CERN’s cryogenics group.

    The cable that is coiled to make the LHC’s powerful electromagnets carries 11,800 amperes of current—roughly as much as a small bolt of lightning. The average toaster, for reference, uses only 9 amperes.

    For a cable the width of a finger to carry this much current and not burn up, it must be a superconductor. A superconductor is a type of material that carries an electrical current with zero electrical resistance. You see evidence of electrical resistance every time you turn on a light. If a lightbulb filament were made from a superconducting wire, it would give off no heat and no light—the electricity would pass straight through.
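The difference zero resistance makes can be put in numbers with the Joule-heating formula P = I²R. The resistance value below is an illustrative guess for an ordinary copper cable, not a measured CERN figure:

```python
# Joule heating P = I^2 * R at the magnet current quoted above.
# The per-meter resistance is an illustrative assumption for a
# finger-width copper cable at room temperature, not a CERN figure.
current_amps = 11_800
copper_resistance_per_meter = 1e-4  # ohms/m, assumed for illustration

normal_power_per_meter = current_amps**2 * copper_resistance_per_meter
superconducting_power = current_amps**2 * 0.0  # zero resistance, zero heat
print(normal_power_per_meter)  # 13924.0 -> ~14 kW of heat per meter
```

A normal-conducting cable of that size would vaporize almost instantly at this current, which is why superconductivity is not optional here.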

    Most industrial superconductors gain the magical property of superconductivity only at extremely low temperatures—a few degrees above absolute zero.

    So perhaps surprisingly, the LHC lives in a pleasantly warm tunnel, about 80 degrees Fahrenheit. To insulate the superconducting magnets from this temperate climate, engineers nestled layers of insulation inside one another like a matryoshka doll, each colder than the last, protecting the magnetic core.

    On the very outside is a vacuum chamber, which acts like the walls of a thermos. On the very inside, the magnets are submerged in a static bath of 1.9-Kelvin superfluid liquid helium, which seeps into every nook and cranny of the LHC’s magnetic coils and supports.

    If engineers had to worry only about protecting the LHC from the warmth of the tunnel, two feet of protection swollen with liquid helium might be enough. But their most formidable foe lies within.

    “Most heating is internal,” says Gareth Jones, a CERN cryogenic operator. “It comes from the proton beam and the magnets.”

    Heat is a measurement of how much particles jostle, and the 3.5 quintillion protons that stream through the heart of the LHC certainly create a stir. Every time a proton rounds a corner, it releases quick bursts of light, which are absorbed by the surrounding material and awaken sleeping molecules.

    Meanwhile, the loosely bound electrons of the copper-coated beampipe flow through the metal in pursuit of the positively charged proton beam, generating an electrical current. Some electrons will even leave their atomic confines and leap into the vacuum, only to crash and liberate even more electrons. These electrons move like water down a river gorge, bouncing off obstacles and swirling in eddies. All of this generates more and more heat, which threatens the sensitive conditions required to keep the magnets superconducting.

    “If the magnets get above 2.17 Kelvin, they start to lose their superconducting properties,” says Guy Crockford, an LHC operator. “When this happens, what was originally just a little bit of internal heating quickly escalates into a lot of heat.”

    To keep these magnets cool, engineers designed a complex cryogenic system that takes advantage of a very simple principle: When a liquid transforms into a gas, it absorbs heat. This is why we feel cold after a shower; it’s not because the water is cold, but because it carries away our heat as droplets evaporate off our skin.

    A long and thin pipe pierces the magnet support structure and delivers a stream of pressurized, ultra-cold liquid helium. As the liquid helium absorbs the excess heat, it evaporates and is quickly pumped out.

    Another cooling pipe runs through the inside of the beampipe and sops up energy right at the source. These internal capillaries are fed by a highway of five pipes running alongside the LHC: Two transport cold helium for injection; two carry warm helium back for re-cooling; and one is the main artery that helps maintain the pressure and temperature of the entire circuit. The LHC cycles about 16 liters of liquid helium every second to keep the entire system operational.
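The evaporative-cooling principle translates into a rough power budget. The helium density and latent heat below are standard textbook values, not from the article, and the result is an upper bound since not all of the circulating helium boils off:

```python
# Upper bound on the heat the helium flow could carry away by evaporation.
# Assumptions (textbook values, not from the article): liquid helium
# density ~0.125 kg/L and latent heat of vaporization ~20.7 kJ/kg.
flow_liters_per_second = 16          # figure quoted in the article
density_kg_per_liter = 0.125
latent_heat_j_per_kg = 20_700

mass_flow_kg_per_s = flow_liters_per_second * density_kg_per_liter
cooling_power_watts = mass_flow_kg_per_s * latent_heat_j_per_kg
print(cooling_power_watts)  # 41400.0 -> ~41 kW if it all evaporated
```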

    Despite all of these efforts, LHC magnets do sometimes heat up enough to lose their superconductivity in an event called a magnet quench.

    “It’s normally just one concentrated point that warms up, and it happens so fast,” Crockford says.

    Sensors detect the change in voltage and trigger a system that fires quench heater strips, which distribute the heat throughout the entire magnet and divert the electrical current away from the magnet. At the same time, the LHC beam is automatically rerouted into a concrete block called a beam dump, and the entire accelerator takes a pause for a few hours while the magnet recovers back to its super-cooled state.
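The protection sequence described above can be sketched as an ordered procedure. This is an illustrative sketch, not CERN's actual control logic; the 2.17-Kelvin threshold is the figure quoted earlier:

```python
# Illustrative sketch of the quench-protection sequence described above.
# Not CERN's real control system; threshold and steps follow the article.
SUPERFLUID_LIMIT_K = 2.17

def quench_response(magnet_temperature_k):
    """Return the ordered protection actions for a given magnet temperature."""
    if magnet_temperature_k <= SUPERFLUID_LIMIT_K:
        return []  # still superconducting: no action needed
    return [
        "fire quench heater strips to spread the heat over the magnet",
        "divert the electrical current away from the magnet",
        "dump the beam into the concrete beam dump",
        "pause the accelerator while the magnet re-cools",
    ]

print(quench_response(2.3))
```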

    “This has happened only about once every two years,” Crockford says. “We want to protect our magnets at all costs, and cryogenics is always on our mind.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 5:34 pm on March 22, 2018 Permalink | Reply
    Tags: , , CERN LHC, , , , , , Rutgers Physics, ,   

    From Rutgers: “Physicists at Crossroads in Trying to Understand Universe” 

    Rutgers smaller
    Our once and future Great Seal.

    Rutgers University

    March 21, 2018
    Todd Bates
    848-932-0550
    todd.bates@rutgers.edu

    This image shows the evolution of the universe from its Big Bang birth (on the left) to the present (on the right), a timespan of nearly 14 billion years. By producing the world’s highest energy collisions, CERN’s Large Hadron Collider in Switzerland acts as a time machine that takes Rutgers physics professors Scott Thomas and Sunil Somalwar all the way back to the first trillionth of a second after the Big Bang.
    Image: NASA/WMAP Science Team

    Scientists at Rutgers University–New Brunswick and elsewhere are at a crossroads in their 50-year quest to go beyond the Standard Model in physics.

    Rutgers Today asked professors Sunil Somalwar and Scott Thomas in the Department of Physics and Astronomy at the School of Arts and Sciences to discuss mysteries of the universe. Somalwar’s research focuses on experimental elementary particle physics, or high energy physics, which involves smashing particles together at large particle accelerators such as the one at CERN in Switzerland. Thomas’s research focuses on theoretical particle physics.

    The duo, who collaborate on experiments, and other Rutgers physicists – including Yuri Gershtein – contributed to the historic 2012 discovery of the Higgs boson, a subatomic particle responsible for the structure of all matter and a key component of the Standard Model.

    CERN CMS Higgs Event


    CERN ATLAS Higgs Event


    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.


    Standard Model of Particle Physics from Symmetry Magazine

    Rutgers Today: What is the Standard Model?

    Thomas: It is a theory started about 50 years ago. It should be called “the most fantastically successful theory of everything ever” because it’s a triumph of human intellect. It explains, in a theoretical structure and in great quantitative detail, every single experiment ever done in the laboratory. And no experiment so far conflicts with this theory. The capstone to the Standard Model experimentally was the discovery of the Higgs boson. It predicted the existence and interactions of lots of different particles, all of which were found. The problem is that as theorists, we are victims of our own success. The Standard Model is so successful that the theory does not point to answers to some of the questions we still have. The Higgs boson answered many questions, but we don’t get clues directly from this theoretical structure how the remaining questions might be answered, so we’re at a crossroads in this 50-year quest. We need some hints from experiments and then, hopefully, the hints will be enough to tell us the next theoretical structure that underlies the Standard Model.

    Rutgers Today: What questions remain?

    Somalwar: The Standard Model says that matter and antimatter should be nearly equal. But after the Big Bang about 13.8 billion years ago, matter amounted to one part in 10 billion and antimatter dropped to virtually zero. A big mystery is what happened to all the antimatter. And why are neutrinos (also subatomic particles) so light? Is the Higgs boson particle by itself or is there a Higgs zoo? There are good reasons that the Higgs boson could not possibly be alone. There’s got to be more to the picture.

    Rutgers Today: What are you focusing on?

    Somalwar: I am looking for evidence of heavy particles that might have existed a picosecond after the Big Bang. These particles don’t exist anymore because they degenerate. They’re very unstable. They could explain why neutrinos are so light and why virtually all antimatter disappeared but not all matter disappeared. What we do is called frontier science – it’s at the forefront of physics: the smallest distances and highest energies. Once you get to the frontier, you occupy much of the area and start prospecting. But at some point, things are mined out and you need a new frontier. We’ve just begun prospecting here. We don’t have enough mined areas and we may have some gems lying there and more will come in the next year or two. So, it’s a very exciting time right now because it’s like we’ve gotten to the gold rush.

    Thomas: I am trying to understand the physics underlying the Higgs sector of the Standard Model theory, which must include at least one particle – the Higgs boson. This sector is very important because it determines the size of atoms and the mass of elementary particles. The physics underlying the Higgs sector is a roadblock to understanding physics at a more fundamental scale. Are there other species of Higgs particles? What are their interactions and what properties do they have? That would start to give us clues and then maybe we could reconstruct a theory of what underlies the Standard Model. The real motivation is to understand the way the universe works at its most fundamental level. That’s what drives us all.

    See the full article here.


    rutgers-campus

    Rutgers, The State University of New Jersey, is a leading national research university and the state’s preeminent, comprehensive public institution of higher education. Rutgers is dedicated to teaching that meets the highest standards of excellence; to conducting research that breaks new ground; and to providing services, solutions, and clinical care that help individuals and the local, national, and global communities where they live.

    Founded in 1766, Rutgers teaches across the full educational spectrum: preschool to precollege; undergraduate to graduate; postdoctoral fellowships to residencies; and continuing education for professional and personal advancement.

    As a ’67 graduate of University College, second in my class, I am proud to be a member of Alpha Sigma Lambda, the National Honor Society for non-traditional students.

     
  • richardmitnick 10:12 am on March 1, 2018 Permalink | Reply
    Tags: , , , CERN LHC, , , , MIT physicists observe electroweak production of same-sign W boson pairs, , ,   

    From MIT: “MIT physicists observe electroweak production of same-sign W boson pairs” 

    MIT News


    February 27, 2018
    Scott Morley | Laboratory for Nuclear Science

    Vector-boson scattering processes are characterized by two high-energy jets in the forward regions of the detector. The figure shows a significant excess of events in the distribution of the mass of the two tagging jets in yellow, labelled as EW WW. Image: Markus Klute

    In research conducted by a group led by MIT Laboratory for Nuclear Science researcher and associate professor of physics Markus Klute, the electroweak production of same-sign W boson pairs was observed, the first such observation of its kind and a milestone toward precision testing of vector boson scattering (of W and Z bosons) at the Large Hadron Collider (LHC).

    LHC

    CERN/LHC Map

    CERN LHC Tunnel

    CERN LHC particles

    The LHC at CERN in Geneva, Switzerland, was proposed in the 1980s as a machine to either find the Higgs boson or discover yet unknown particles or interactions.

    CERN CMS Higgs Event


    CERN ATLAS Higgs Event

    This idea, that the LHC would be able to make a discovery, whatever that might be, is what theorists call the "no-lose theorem," and it is connected to probing the scattering of W boson pairs at energies above 1 teraelectronvolt (TeV). In 2012, only two years after the first high-energy collisions at the LHC, this proposal paid huge dividends when the Higgs boson was discovered by the ATLAS and Compact Muon Solenoid (CMS) collaborations.

    According to CERN, the CMS detector at the LHC utilizes a massive solenoid magnet to study everything from the Higgs boson to dark matter to the Standard Model.

    CERN/CMS Detector

    The Standard Model of elementary particles , with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    CMS is capable of generating a magnetic field that is approximately 100,000 times that of Earth. It resides in an underground cavern near Cessy, France, which is northwest of Geneva.

    The main goal of a recent CMS measurement was to identify W boson pairs with the same sign (W+W+ or W-W-) produced purely via the electroweak interaction, and thereby to probe the scattering of W bosons. The result does not unveil physics beyond the Standard Model, but this first observation of the process marks a starting point for a field of study that will independently test whether the discovered Higgs boson is the particle predicted by Robert Brout, François Englert, and Peter Higgs. It is anticipated that the rapidly growing data sets available at the LHC will further knowledge along these lines. Studies show that the high-luminosity LHC will likely allow the direct study of longitudinal W boson scattering.

    “The measurement of vector-boson scattering processes, like the one studied in this paper, is an important test bench of the nature of the Higgs boson, as small deviations from the Standard Model expectation can have a large impact on event rates,” Klute says. “While challenging new physics models, these processes also allow a unique model-independent measurement of Higgs boson couplings to the W and Z boson at the LHC.”

    “The observation of this vector-boson scattering process is an important milestone toward future precision measurements,” Klute says. “These measurements are very challenging experimentally and require theoretical predictions with high precision. Both areas are pushed forward by the published results.”

    The work, while within CMS, was performed by MIT and included Klute, his students Andrew Levin and Xinmei Nui, and research scientist Guillelmo Gomez-Ceballos, along with University of Antwerp colleague Xavier Janssen and his student Jasper Lauwers.

    The work has been published in Physical Review Letters.

    This research was funded with support from the U.S. Department of Energy.

    See the full article here.


    MIT Seal

    The mission of MIT is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the MIT community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

    MIT Campus

     
  • richardmitnick 3:42 pm on February 12, 2018 Permalink | Reply
    Tags: , Aleksandra Dimitrievska, , , CERN LHC, , , , ,   

    From LBNL- “From Belgrade to Berkeley: A Postdoctoral Researcher’s Path in Particle Physics” 

    Berkeley Logo

    Berkeley Lab

    February 12, 2018

    Berkeley Lab’s Aleksandra Dimitrievska is working on a next-gen particle detector for CERN’s Large Hadron Collider

    Aleksandra Dimitrievska works on prototype chips for a planned upgrade at CERN’s Large Hadron Collider. (Credit: Marilyn Chung/Berkeley Lab)

    After completing her Ph.D. thesis on calculating the mass of the W boson – an elementary particle that mediates one of the universe’s fundamental forces – physics researcher Aleksandra Dimitrievska is now testing components for a scheduled upgrade of the world’s largest particle detectors.

    Dimitrievska left the University of Belgrade in Serbia late last year to join the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) as the recipient of an Owen Chamberlain Postdoctoral Fellowship in Experimental Particle Physics & Cosmology in the Lab’s Physics Division. The fellowship will extend up to five years.

    “Before, I was working behind a computer on coding. Now, I am in a clean room making wire bonds on computer chips, so it’s a much different experience,” Dimitrievska said. “I completely feel like a physicist now.”

    The Chamberlain Fellowship was created in 2002 to honor the late Owen Chamberlain, a Berkeley Lab physicist and UC Berkeley professor who received the Nobel Prize in Physics in 1959 for his work on the team that discovered the anti-proton using the Lab’s Bevatron accelerator. He also worked on the development of the time projection chamber, a type of detector that has been widely used in particle physics experiments.

    Dimitrievska’s path toward a career in particle physics led her to CERN’s Large Hadron Collider (LHC), a particle collider with an underground tunnel measuring 17 miles in circumference that is used to accelerate protons up to nearly the speed of light and collide them in detectors to measure the ensuing subatomic fireworks.

    “I started as a summer student at CERN in 2012. After that I went back to Belgrade – my Ph.D. advisor was involved in work on the W boson mass measurement,” she said. He connected her with a CERN team led by French physicist Maarten Boonekamp.

    The W boson and Z boson, which were both discovered in CERN experiments in 1983, are carriers of the “weak force” that is responsible for the particle process triggering fusion in the sun and other stars, the presence of radiation across the universe, and the breakdown of radioactive elements via a process known as beta decay. The W boson can have a positive or negative charge while the Z boson has a neutral charge, and each of these particles has a mass that is heavier than an iron atom.

    But despite such large masses, it has been difficult to pinpoint the W boson’s mass because of the typical noisy mess of other particle processes associated with its creation in collider experiments.

    “This is a really difficult measurement,” Dimitrievska said. The W boson’s mass must be calculated based on indirect measurements – a careful dissection of the data from related particle processes including recoil, in which particles are ejected from other particles in high-energy collisions at the LHC.

    “We started from scratch, one step at a time,” she said, to find the best way to calibrate the W boson measurements. “We tried different approaches and different ideas. The most important things are the uncertainties,” she said, and in finding ways to reduce the uncertainties in the analyses of data from experiments. “It takes a lot of time to really calibrate each source.”

    The team conducting the analysis found that a useful way to measure the W boson is to use measurements of the Z boson for calibration. “You are calibrating the recoil on the Z boson events, and then you extrapolate (measurements) for the W boson,” she said, based in part on the uncertainties in the Z boson measurements.
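The calibration-transfer idea described here, deriving a correction from well-measured Z-boson events and applying it to W-boson events, can be sketched very roughly. All numbers and the simple averaging scheme below are invented for illustration; the real analysis is far more elaborate:

```python
# Illustrative version of the calibration transfer described above: derive a
# recoil energy-scale correction from Z events (fully reconstructable from
# their decay leptons) and apply it to W events. Numbers are invented.

def recoil_scale_from_z(measured_recoils, true_recoils):
    """Average ratio of true to measured recoil across Z events."""
    ratios = [t / m for m, t in zip(measured_recoils, true_recoils)]
    return sum(ratios) / len(ratios)

z_measured = [28.0, 31.0, 25.0]   # GeV, detector-level recoil in Z events
z_true     = [30.0, 33.0, 27.0]   # GeV, inferred from the Z decay leptons
scale = recoil_scale_from_z(z_measured, z_true)

w_measured_recoil = 26.0          # GeV, measured recoil in a W event
w_corrected = scale * w_measured_recoil
print(scale, w_corrected)
```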

    The team worked with data from millions of particle collisions that produced candidate W bosons in the 2011 run of the LHC. Ongoing studies will apply the same techniques developed for the 2011 analysis for larger sets of data accumulated at the LHC in 2012, 2015, and 2016. The latest sets of LHC data, because they can involve larger numbers of colliding protons, are even more challenging to pick through in isolating individual particle properties.

    Such painstaking analyses can ultimately test whether the standard model of particle physics, developed through decades of experiments and theories, holds up to increasingly precise measurements.

    In this case, Dimitrievska’s team found good agreement in their measurements with the standard model. “There is no hint of physics beyond the standard model, but this result is important because we have something new to put in front of the theoretical ideas and see where there is place for improvement in the measurements,” she said.

    She added, “The calibration and methods we used will also be used for other measurements at higher energies.”

    The latest measurement, published Feb. 6 in the European Physical Journal C, determined the mass of the W boson to be about 80,370 megaelectronvolts (MeV), with a statistical uncertainty of plus or minus 7 MeV. This is consistent with the average of previous measurements, about 80,385 MeV with an uncertainty of plus or minus 15 MeV. An electronvolt is a unit of energy that is a common measure of mass for subatomic particles.
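As a rough illustration of how such measurements are compared, the two quoted values can be combined by inverse-variance weighting. This is only a sketch: it treats the quoted uncertainties as independent and ignores the systematic correlations a real combination must account for:

```python
# Illustrative inverse-variance combination of the two W-mass values quoted
# above (80,370 +/- 7 MeV and the earlier average 80,385 +/- 15 MeV).
# Real combinations must treat systematic correlations; this sketch does not.

def combine(values_and_errors):
    """Inverse-variance weighted mean and its uncertainty."""
    weights = [1.0 / err**2 for _, err in values_and_errors]
    mean = sum(w * v for w, (v, _) in zip(weights, values_and_errors)) / sum(weights)
    error = (1.0 / sum(weights)) ** 0.5
    return mean, error

mean, err = combine([(80370.0, 7.0), (80385.0, 15.0)])
print(f"{mean:.0f} +/- {err:.0f} MeV")
```

The more precise measurement dominates the combination, as expected: the result sits much closer to 80,370 than to 80,385 MeV.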

    Dimitrievska successfully defended her Ph.D. thesis on the W boson mass measurement at the University of Belgrade in December.

    Her current work at Berkeley Lab is focused on testing 2-centimeter-by-1-centimeter prototype computer chips for the planned High-Luminosity LHC at CERN that will produce a higher volume of particle collisions and data.

    “Because we will have more data, the readout system has to be faster,” she said. “Basically, we have to improve everything.”

    Aleksandra Dimitrievska holds a prototype chip for planned detector upgrades at CERN. (Credit: Marilyn Chung/Berkeley Lab)

    The final version of the chips that she is testing will be installed in the inner part of the ATLAS and CMS detectors at CERN and must be radiation-hardened to withstand the constant drumming of high-energy particles. She has used 3-D printers at UC Berkeley to fabricate prototype components related to the chip assemblies she works with.

    “For now, I am just testing if the chips work – how they are collecting data,” she said. A next step for her research group is to set up a particle beam to monitor how the chips perform under simulated experimental conditions.

    As an active member of Berkeley Lab’s ATLAS collaboration team, Dimitrievska also participates remotely in several meetings per week hosted at CERN, and she said she looks forward to the opportunity to work on the LHC upgrade project as it moves forward from its R&D stages to actual fabrication, assembly, and installation.

    “I think this is the really nice part about this work,” she said. “You can see the development of something that you can actually use later. You can participate first in the development of the detector, and then do the analysis and see how it really works.”

    See the full article here.


    A U.S. Department of Energy National Laboratory Operated by the University of California

    University of California Seal

    DOE Seal

     
  • richardmitnick 12:54 pm on January 30, 2018 Permalink | Reply
    Tags: , , CERN LHC, , , , , , ,   

    From LBNL: “Applying Machine Learning to the Universe’s Mysteries” 

    Berkeley Logo

    Berkeley Lab

    Glenn Roberts Jr.
    geroberts@lbl.gov
    (510) 486-5582

    The colored lines represent calculated particle tracks from particle collisions occurring within Brookhaven National Laboratory’s STAR detector at the Relativistic Heavy Ion Collider, and an illustration of a digital brain. The yellow-red glow at center shows a hydrodynamic simulation of quark-gluon plasma created in particle collisions. (Credit: Berkeley Lab)

    BNL/RHIC Star Detector

    Computers can beat chess champions, simulate star explosions, and forecast global climate. We are even teaching them to be infallible problem-solvers and fast learners.

    And now, physicists at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) and their collaborators have demonstrated that computers are ready to tackle the universe’s greatest mysteries. The team fed thousands of images from simulated high-energy particle collisions to train computer networks to identify important features.

    The researchers programmed powerful arrays known as neural networks to serve as a sort of hivelike digital brain in analyzing and interpreting the images of the simulated particle debris left over from the collisions. During this test run the researchers found that the neural networks had up to a 95 percent success rate in recognizing important features in a sampling of about 18,000 images.

    The study was published Jan. 15 in the journal Nature Communications.


    The next step will be to apply the same machine learning process to actual experimental data.

    Powerful machine learning algorithms allow these networks to improve in their analysis as they process more images. The underlying technology is used in facial recognition and other types of image-based object recognition applications.
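The core operation behind such image-recognition networks, convolution, can be sketched in a few lines. This is a toy illustration of the building block, not the study's trained network, and the "image" and filter values are invented:

```python
import numpy as np

# Minimal sketch of the core operation in a convolutional neural network:
# sliding a small filter over an image to produce a feature map. In a real
# network the filter values are learned; here they are hand-picked toys.

def conv2d(image: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Valid 2-D cross-correlation, the building block of a CNN layer."""
    kh, kw = kernel.shape
    out_h = image.shape[0] - kh + 1
    out_w = image.shape[1] - kw + 1
    out = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

# A vertical-edge filter responds strongly where a track-like feature sits:
image = np.array([[0, 0, 1, 0],
                  [0, 0, 1, 0],
                  [0, 0, 1, 0]], dtype=float)
edge_filter = np.array([[-1, 1], [-1, 1]], dtype=float)
feature_map = conv2d(image, edge_filter)
print(feature_map)
```

Stacking many such filters, with learned values and nonlinearities between layers, is what lets the networks described above pick out relevant structure in collision images.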

    The images used in this study – relevant to particle-collider nuclear physics experiments at Brookhaven National Laboratory’s Relativistic Heavy Ion Collider and CERN’s Large Hadron Collider – recreate the conditions of a subatomic particle “soup,” which is a superhot fluid state known as the quark-gluon plasma believed to exist just millionths of a second after the birth of the universe. Berkeley Lab physicists participate in experiments at both of these sites.

    BNL RHIC Campus

    BNL/RHIC

    LHC

    CERN/LHC Map

    CERN LHC Tunnel

    CERN LHC particles

    “We are trying to learn about the most important properties of the quark-gluon plasma,” said Xin-Nian Wang, a nuclear physicist in the Nuclear Science Division at Berkeley Lab who is a member of the team. Some of these properties are so short-lived and occur at such tiny scales that they remain shrouded in mystery.

    In experiments, nuclear physicists use particle colliders to smash together heavy nuclei, like gold or lead atoms that are stripped of electrons. These collisions are believed to liberate particles inside the atoms’ nuclei, forming a fleeting, subatomic-scale fireball that breaks down even protons and neutrons into a free-floating form of their typically bound-up building blocks: quarks and gluons.

    The diagram at left, which maps out particle distribution in a simulated high-energy heavy-ion collision, includes details on particle momentum and angles. Thousands of these images were used to train and test a neural network to identify important features in the images. At right, a neural network used the collection of images to create this “importance map” – the lighter colors represent areas that are considered more relevant to identifying the equation of state for the quark-gluon matter created in particle collisions. (Credit: Berkeley Lab)

    Researchers hope that by learning the precise conditions under which this quark-gluon plasma forms, such as how much energy is packed in, and its temperature and pressure as it transitions into a fluid state, they will gain new insights about its component particles of matter and their properties, and about the universe’s formative stages.

    But exacting measurements of these properties – the so-called “equation of state” involved as matter changes from one phase to another in these collisions – have proven challenging. The initial conditions in the experiments can influence the outcome, so it’s challenging to extract equation-of-state measurements that are independent of these conditions.

    “In the nuclear physics community, the holy grail is to see phase transitions in these high-energy interactions, and then determine the equation of state from the experimental data,” Wang said. “This is the most important property of the quark-gluon plasma we have yet to learn from experiments.”

    Researchers also seek insight about the fundamental forces that govern the interactions between quarks and gluons, what physicists refer to as quantum chromodynamics.

    Long-Gang Pang, the lead author of the latest study and a Berkeley Lab-affiliated postdoctoral researcher at UC Berkeley, said that in 2016, while he was a postdoctoral fellow at the Frankfurt Institute for Advanced Studies, he became interested in the potential for artificial intelligence (AI) to help solve challenging science problems.

    He saw that one form of AI, known as a deep convolutional neural network – with architecture inspired by the image-handling processes in animal brains – appeared to be a good fit for analyzing science-related images.

    “These networks can recognize patterns and evaluate board positions and selected movements in the game of Go,” Pang said. “We thought, ‘If we have some visual scientific data, maybe we can get an abstract concept or valuable physical information from this.’”

    Wang added, “With this type of machine learning, we are trying to identify a certain pattern or correlation of patterns that is a unique signature of the equation of state.” So after training, the network can pinpoint on its own the portions of and correlations in an image, if any exist, that are most relevant to the problem scientists are trying to solve.

    Accumulation of data needed for the analysis can be very computationally intensive, Pang said, and in some cases it took about a full day of computing time to create just one image. When researchers employed an array of GPUs that work in parallel – GPUs are graphics processing units that were first created to enhance video game effects and have since exploded into a variety of uses – they cut that time down to about 20 minutes per image.

    They used computing resources at Berkeley Lab’s National Energy Research Scientific Computing Center (NERSC) in their study, with most of the computing work focused at GPU clusters at GSI in Germany and Central China Normal University in China.

    NERSC Cray XC40 Cori II supercomputer

    LBL NERSC Cray XC30 Edison supercomputer



    NERSC PDSF


    PDSF is a networked distributed computing cluster designed primarily to meet the detector simulation and data analysis requirements of physics, astrophysics and nuclear science collaborations.

    A benefit of using sophisticated neural networks, the researchers noted, is that they can identify features that weren’t even sought in the initial experiment, like finding a needle in a haystack when you weren’t even looking for it. And they can extract useful details even from fuzzy images.

    “Even if you have low resolution, you can still get some important information,” Pang said.

    Discussions are already underway to apply the machine learning tools to data from actual heavy-ion collision experiments, and the simulated results should be helpful in training neural networks to interpret the real data.

    “There will be many applications for this in high-energy particle physics,” Wang said, beyond particle-collider experiments.

    Also participating in the study were Kai Zhou, Nan Su, Hannah Petersen, and Horst Stocker from the following institutions: Frankfurt Institute for Advanced Studies, Goethe University, GSI Helmholtzzentrum für Schwerionenforschung (GSI), and Central China Normal University. The work was supported by the U.S. Department of Energy’s Office of Science, the National Science Foundation, the Helmholtz Association, GSI, SAMSON AG, Goethe University, the National Natural Science Foundation of China, the Major State Basic Research Development Program in China, and the Helmholtz International Center for the Facility for Antiproton and Ion Research.

    NERSC is a DOE Office of Science user facility.

    See the full article here.


    A U.S. Department of Energy National Laboratory Operated by the University of California

    University of California Seal

    DOE Seal

     
  • richardmitnick 12:47 pm on January 18, 2018 Permalink | Reply
    Tags: , CERN LHC, , Long-lived physics, MATHUSLA- Massive Timing Hodoscope for Ultra Stable Neutral Particles, ,   

    From CERN: “Long-lived physics” 

    Cern New Bloc

    Cern New Particle Event

    CERN New Masthead

    CERN

    18 Jan 2018
    Iva Raynova

    The CMS experiment is looking for exotic long-lived particles that could get trapped in its detector layers (Image: Michael Hoch, Maximilien Brice/CERN)

    New particles produced in the LHC’s high-energy proton-proton collisions don’t hang around for long. A Higgs boson exists for less than a thousandth of a billionth of a billionth of a second before decaying into lighter particles, which can then be tracked or stopped in our detectors. Nothing rules out the existence of much longer-lived particles though, and certain theoretical scenarios predict that such extraordinary objects could get trapped in the LHC detectors, sitting there quietly for days.

    The CMS collaboration has reported new results [JHEP] in its search for heavy long-lived particles (LLPs), which could lose their kinetic energy and come to a standstill in the LHC detectors. Provided that the particles live for longer than a few tens of nanoseconds, their decay would be visible during periods when no LHC collisions are taking place, producing a stream of ordinary matter seemingly out of nowhere.

    The CMS team looked for these types of non-collision events, using LHC collision data from 2015 and 2016, in the densest detector materials of the experiment, where the long-lived particles are most likely to be stopped. Despite scouring data covering a period of more than 700 hours, nothing strange was spotted. The results set the tightest cross-section and mass limits to date for hadronically decaying long-lived particles that stop in the detector, and the first limits on stopped long-lived particles produced in proton-proton collisions at an energy of 13 TeV.
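The selection idea, keeping only detector events time-stamped outside any beam-collision window, where a decaying stopped particle would appear "out of nowhere", can be sketched as follows. The event format and all numbers are invented for illustration:

```python
# Hedged sketch of the search strategy described above: reject any event whose
# timestamp falls inside a beam-collision window, keeping only out-of-collision
# candidates. Event tuples are (timestamp, energy); all values are invented.

def out_of_collision_events(events, collision_windows):
    """Keep events whose timestamps fall outside every collision window."""
    def in_collisions(t):
        return any(t0 <= t <= t1 for t0, t1 in collision_windows)
    return [e for e in events if not in_collisions(e[0])]

events = [(1.0, 50.0), (5.5, 120.0), (9.0, 30.0)]
windows = [(0.0, 2.0), (8.0, 10.0)]     # times when beams were colliding
candidates = out_of_collision_events(events, windows)
print(candidates)  # only the event at t=5.5 survives
```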

    The Standard Model, the theoretical framework that describes all the elementary particles, was vindicated in 2012 with the discovery of the Higgs boson.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.


    Standard Model of Particle Physics from Symmetry Magazine

    CERN CMS Higgs Event

    CERN ATLAS Higgs Event

    But some of the universe’s biggest mysteries remain unexplained, such as why matter prevailed over antimatter in the early universe or what exactly dark matter is. Long-lived particles are among numerous exotic species that would help address these mysteries and their discovery would constitute a clear sign of physics beyond the Standard Model. In particular, the decays searched for in CMS concerned long-lived gluinos arising in a model called “split” supersymmetry (SUSY) and exotic particles called “MCHAMPs”.

    While the search for long-lived particles at the LHC is making rapid progress at both CMS and ATLAS, the construction of a dedicated LLP detector has been proposed for the high-luminosity era of the LHC. MATHUSLA (Massive Timing Hodoscope for Ultra Stable Neutral Particles) is planned to be a surface detector placed 100 metres above either ATLAS or CMS.


    It would be an enormous (200 × 200 × 20 m) box, mostly empty except for the very sensitive equipment used to detect LLPs produced in LHC collisions.

    Since LLPs interact weakly with ordinary matter, they would have no trouble travelling through the rock between the underground experiment and MATHUSLA. This process is similar to how weakly interacting cosmic rays travel through the atmosphere and pass through the Earth to reach our underground detectors, only in reverse. If constructed, the experiment would explore many more scenarios and bring us closer to discovering new physics.

    See the full article here.


    Meet CERN in a variety of places:

    Quantum Diaries
    QuantumDiaries

    Cern Courier

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS
    CERN ATLAS New

    ALICE
    CERN ALICE New

    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    LHC

    CERN LHC Map
    CERN LHC Grand Tunnel

    CERN LHC particles

    Quantum Diaries

     