Tagged: CERN LHC

  • richardmitnick 1:01 pm on May 2, 2019 Permalink | Reply
    Tags: CERN LHC, An unexpected signature, Taking a closer look

    From Symmetry: “The unseen progress of the LHC” 


    05/02/19
    Sarah Charley

    CERN LHC. Image: Maximilien Brice and Julien Marius Ordan

    It’s not always about what you discover.

    About seven years ago, physicist Stephane Willocq at the University of Massachusetts became enthralled with a set of theories that predicted the existence of curled-up extra dimensions hiding within our classical four dimensions of spacetime.

    “The idea of extra spatial dimensions is appealing because it allows us to look at the fundamental problems in particle physics from a different viewpoint,” Willocq says.

    As an experimental physicist, Willocq can do more than ponder. At the Large Hadron Collider at CERN, he put his pet theories to the test.

    Models based on those theories predicted how curled-up extra dimensions would affect the outcome of proton-proton collisions at the LHC. They predicted the collisions would produce more high-energy particles than expected.

    After several searches, Willocq and his colleagues found nothing out of the ordinary. “It was a great idea and disappointing to see it fade away, bit by bit,” he says, “but that’s how scientific progress works—finding the right idea by process of elimination.”

    The LHC research program is famous for discovering and studying the long-sought Higgs boson. But out of the spotlight, scientists have been using the LHC for an equally important scientific endeavor: testing, constraining and eliminating hundreds of theories that propose solutions to outstanding problems in physics, such as why the force of gravity is so much weaker than other known forces like electromagnetism.

    “There is only one right answer,” Willocq says. “We haven’t found it yet.”

    Now that scientists are at the end of the second run of the LHC, they have covered a huge amount of ground, eliminating the simplest versions of numerous theoretical ideas. They’ve covered four times as much phase space as previous searches for heavy new particles and set strict limits on what is physically possible.

    These studies don’t get the same attention as the Higgs boson, but these null results—results that don’t support a certain hypothesis—have moved physics forward as well.

    An unexpected signature

    Having chased down their most obvious leads, physicists are now adapting their methodology and considering new possibilities in their pursuit of new physics.

    Thus far, physicists have often used a straightforward formula to look for new particles. Massive particles produced in particle collisions will almost instantly decay, transforming into more stable particles. If scientists can measure all of those particles, they can reconstruct the mass and properties of the original particle that produced them.
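    That reconstruction step can be sketched in a few lines: the parent's mass follows from the summed energies and momenta of its decay products via the relativistic invariant m² = E² − |p|² (with c = 1). This is an illustrative sketch only; the photon four-vectors below are made-up numbers, not detector data.

```python
import math

def invariant_mass(particles):
    """Invariant mass (GeV) of a set of decay products, each given as a
    four-momentum (E, px, py, pz) in GeV, with c = 1. The parent's mass
    is recovered from the summed four-vector: m^2 = E^2 - |p|^2."""
    E  = sum(p[0] for p in particles)
    px = sum(p[1] for p in particles)
    py = sum(p[2] for p in particles)
    pz = sum(p[3] for p in particles)
    return math.sqrt(max(E**2 - px**2 - py**2 - pz**2, 0.0))

# Two made-up back-to-back photons of 62.5 GeV each (massless, so E = |p|):
# together they reconstruct a ~125 GeV parent, as in a Higgs-to-two-photon decay.
photons = [(62.5, 0.0, 0.0, 62.5), (62.5, 0.0, 0.0, -62.5)]
print(invariant_mass(photons))  # -> 125.0
```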

    This worked wonderfully when scientists discovered the top quark in 1995 and the Higgs boson in 2012. But finding the next new thing might take a different tactic.

    “Finding new physics is more challenging than we expected it to be,” says University of Wisconsin physicist Tulika Bose of the CMS experiment. “Challenging situations make people come up with clever ideas.”

    One idea is that maybe scientists have been so focused on instantly decaying particles that they’ve been missing a whole host of particles that can travel up to several meters before falling apart. This would look like a firework exploding randomly in one of the detector subsystems.

    Scientists are rethinking how they reconstruct the data as a way to cast a bigger net and potentially catch particles with signatures like these. “If we only used our standard analysis methods, we would definitely not be sensitive to anything like this,” Bose says. “We’re no longer just reloading previous analyses but exploring innovative ideas.”
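    A toy version of such a displaced-vertex selection, with invented vertex positions and purely illustrative cut values (not real analysis cuts), might look like this:

```python
import math

def is_displaced(vertex, min_lxy=0.05, max_lxy=3.0):
    """Flag a reconstructed decay vertex (x, y, z in metres, relative to
    the collision point) as 'displaced' if its transverse distance from
    the beamline falls in a window: far enough out to exclude promptly
    decaying particles, close enough in to sit inside the tracking
    detector. Both cut values here are illustrative only."""
    x, y, _ = vertex
    lxy = math.hypot(x, y)
    return min_lxy < lxy < max_lxy

vertices = [(0.0001, 0.0002, 0.01),  # prompt decay, right at the beamline
            (0.4, 0.3, 1.2),         # displaced: long-lived-particle candidate
            (5.0, 4.0, 2.0)]         # too far out: beyond the tracker
print([is_displaced(v) for v in vertices])  # -> [False, True, False]
```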

    Taking a closer look

    Since looking for excess particles coming out of collisions has yet to yield evidence of extra spatial dimensions, Willocq has decided to devote some of his efforts to a different method used at LHC experiments: precision measurements.

    Models also make predictions about properties of particles such as how often they decay into one set of particles versus another set. If precise measurements show deviations from predictions by the Standard Model of particle physics, it can mean that something new is at play.
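    At its core, such a comparison is the gap between measurement and prediction expressed in units of the combined uncertainty, often called a "pull". The numbers below are invented for illustration:

```python
def pull(measured, predicted, sigma):
    """Deviation of a measurement from the Standard Model prediction, in
    units of the combined uncertainty. A pull near zero means agreement;
    a pull beyond ~5 would count as discovery-level tension."""
    return (measured - predicted) / sigma

# Invented numbers for a hypothetical rare-decay branching ratio:
# the measurement sits about two standard deviations above the prediction.
print(pull(measured=3.4e-9, predicted=3.0e-9, sigma=2.0e-10))  # about 2.0
```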

    “Several new physics models predict an enhanced rate of rare subatomic processes,” Bose says. “However, their rates are so low that we have not been able to measure them yet.”

    In the past, precision measurements of well-known particles have overturned seemingly bulletproof paradigms. In the 1940s, for example, the measurement of a property called the “magnetic moment” of the neutron showed that it was not a fundamental particle, as had been previously assumed. This eventually helped lead to the discovery of particles that make up neutrons: quarks.

    Another example is the measurement of the mismatched decays of certain matter and antimatter particles, which led to the prediction of a new group of quarks—later confirmed by the discoveries of the top and bottom quarks.

    The plan for the LHC research program is to collect a huge amount of data, which will give scientists the resolution they need to examine every shadowy corner of the Standard Model.

    “This work naturally pushes our search methods towards making more detailed and higher precision measurements that will help us constrain possible deviations by new physics,” Willocq says.

    Because many of these predictions have never been thoroughly tested, scientists are hoping that they’ll find a few small deviations that could open the door to a new era of physics research. “Nature might be tough with us,” Bose says, “but maybe nature is testing us and making us stronger.”

    See the full article here.



    Please help promote STEM in your local schools.


    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 12:32 pm on April 18, 2019 Permalink | Reply
    Tags: "When Beauty Gets in the Way of Science", CERN LHC

    From Nautilus: “When Beauty Gets in the Way of Science” 


    April 18, 2019
    Sabine Hossenfelder

    Insisting that new ideas must be beautiful blocks progress in particle physics.


    The biggest news in particle physics is no news. In March, one of the most important conferences in the field, Rencontres de Moriond, took place. It is an annual meeting at which experimental collaborations present preliminary results. But the recent data from the Large Hadron Collider (LHC), currently the world’s largest particle collider, has not revealed anything new.


    Forty years ago, particle physicists thought themselves close to a final theory for the structure of matter. At that time, they formulated the Standard Model of particle physics to describe the elementary constituents of matter and their interactions.

    Standard Model of Particle Physics (LATHAM BOYLE AND MARDUS OF WIKIMEDIA COMMONS)

    After that, they searched for the predicted, but still missing, particles of the Standard Model. In 2012, they confirmed the last missing particle, the Higgs boson.


    The Higgs boson is necessary to make sense of the rest of the Standard Model. Without it, the other particles would not have masses, and probabilities would not properly add up to one. Now, with the Higgs in the bag, the Standard Model is complete; all Pokémon caught.

    HIGGS HANGOVER: After the Large Hadron Collider (above) confirmed the Higgs boson, which validated the Standard Model, it’s produced nothing newsworthy, and is unlikely to, says physicist Sabine Hossenfelder. Image: Shutterstock

    The Standard Model may be physicists’ best shot at the structure of fundamental matter, but it leaves them wanting. Many particle physicists think it is simply too ugly to be nature’s last word. The 25 particles of the Standard Model can be classified by three types of symmetries that correspond to three fundamental forces: The electromagnetic force, and the strong and weak nuclear forces. Physicists, however, would rather there was only one unified force. They would also like to see an entirely new type of symmetry, the so-called “supersymmetry,” because that would be more appealing.

    Supersymmetry builds on the Standard Model, with many new supersymmetric particles, represented here with a tilde (~) on them. (From the movie “Particle Fever”, reproduced by Mark Levinson)

    Oh, and additional dimensions of space would be pretty. And maybe also parallel universes. Their wish list is long.

    It has become common practice among particle physicists to use arguments from beauty to select the theories they deem worthy of further study. These criteria of beauty are subjective and not evidence-based, but they are widely believed to be good guides to theory development. The most often used criteria of beauty in the foundations of physics are presently simplicity and naturalness.

    By “simplicity,” I don’t mean relative simplicity, the idea that the simplest theory is the best (a.k.a. “Occam’s razor”). Relying on relative simplicity is good scientific practice. The desire that a theory be simple in absolute terms, in contrast, is a criterion from beauty: There is no deep reason that the laws of nature should be simple. In the foundations of physics, this desire for absolute simplicity presently shows in physicists’ hope for unification or, if you push it one level further, in the quest for a “Theory of Everything” that would merge the three forces of the Standard Model with gravity.

    The other criterion of beauty, naturalness, requires that pure numbers that appear in a theory (i.e., those without units) should be neither very large nor very small; instead, these numbers should be close to one. Exactly how close these numbers should be to one is debatable, which is already an indicator of the non-scientific nature of this argument. Indeed, the inability of particle physicists to quantify just when a lack of naturalness becomes problematic highlights the fact that an unnatural theory is not actually problematic. It is just not beautiful.
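    That arbitrariness is easy to make concrete. A deliberately naive naturalness check, with a tolerance pulled out of thin air and an approximate value for the Planck mass, might look like this:

```python
def is_natural(x, tolerance=1e3):
    """Crude 'naturalness' test: a dimensionless number counts as natural
    if it lies within a few orders of magnitude of one. The tolerance is
    arbitrary -- which is precisely the weakness of the criterion."""
    return 1.0 / tolerance < abs(x) < tolerance

# The strong coupling constant (~0.118 at the Z mass) passes easily...
print(is_natural(0.118))                   # -> True
# ...while the squared ratio of the Higgs mass (~125 GeV) to the Planck
# mass (~1.22e19 GeV) -- the hierarchy problem -- fails spectacularly.
print(is_natural((125.0 / 1.22e19) ** 2))  # -> False
```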

    Anyone who has a look at the literature of the foundations of physics will see that relying on such arguments from beauty has been a major current in the field for decades. It has been propagated by big players in the field, including Steven Weinberg, Frank Wilczek, Edward Witten, Murray Gell-Mann, and Sheldon Glashow. Countless books popularized the idea that the laws of nature should be beautiful, written, among others, by Brian Greene, Dan Hooper, Gordon Kane, and Anthony Zee. Indeed, this talk about beauty has been going on for so long that at this point it seems likely most people presently in the field were attracted by it in the first place. Little surprise, then, they can’t seem to let go of it.

    Trouble is, relying on beauty as a guide to new laws of nature is not working.

    Since the 1980s, dozens of experiments have looked for evidence of unified forces, supersymmetric particles, and other particles invented to beautify the Standard Model. Physicists have conjectured hundreds of hypothetical particles, from “gluinos” and “wimps” to “branons” and “cuscutons,” each of which they invented to remedy a perceived lack of beauty in the existing theories. These particles are supposed to aid beauty, for example, by increasing the amount of symmetries, by unifying forces, or by explaining why certain numbers are small. Unfortunately, not a single one of those particles has ever been seen. Measurements have merely confirmed the Standard Model over and over again. And a theory of everything, if it exists, is as elusive today as it was in the 1970s. The Large Hadron Collider is only the most recent in a long series of searches that have failed to confirm those beauty-based predictions.

    These decades of failure show that postulating new laws of nature just because they are beautiful according to human standards is not a good way to put forward scientific hypotheses. It’s not the first time this has happened. Historical precedents are not difficult to find. Relying on beauty did not work for Kepler’s Platonic solids, it did not work for Einstein’s idea of an eternally unchanging universe, and it did not work for the oh-so-pretty idea, popular at the end of the 19th century, that atoms are knots in an invisible ether. All of these theories were once considered beautiful, but are today known to be wrong. Physicists have repeatedly told me about beautiful ideas that didn’t turn out to be beautiful at all. Such hindsight is not evidence that arguments from beauty work, but rather that our perception of beauty changes over time.

    That beauty is subjective is hardly a breakthrough insight, but physicists are slow to learn the lesson—and that has consequences. Experiments that test ill-motivated hypotheses are at high risk to only find null results; i.e., to confirm the existing theories and not see evidence of new effects. This is what has happened in the foundations of physics for 40 years now. And with the new LHC results, it happened once again.

    The data analyzed so far show no evidence for supersymmetric particles, extra dimensions, or any other physics that would not be compatible with the Standard Model. In the past two years, particle physicists were excited about an anomaly in the interaction rates of different leptons. The Standard Model predicts these rates should be identical, but the data show a slight difference. This “lepton anomaly” has persisted in the new data, but—against particle physicists’ hopes—it did not increase in significance, and is hence not a sign of new particles. The LHC collaborations succeeded in measuring the violation of symmetry in the decay of composite particles called “D-mesons,” but the measured effect is, once again, consistent with the Standard Model. The data stubbornly repeat: Nothing new to see here.

    Of course it’s possible there is something to find in the data yet to be analyzed. But at this point we already know that all previously made predictions for new physics were wrong, meaning that there is now no reason to expect anything new to appear.

    Yes, null results—like the recent LHC measurements—are also results. They rule out some hypotheses. But null results are not very useful results if you want to develop a new theory. A null-result says: “Let’s not go this way.” A result says: “Let’s go that way.” If there are many ways to go, discarding some of them does not help much.

    To find the way forward in the foundations of physics, we need results, not null-results. When testing new hypotheses takes decades of construction time and billions of dollars, we have to be careful what to invest in. Experiments have become too costly to rely on serendipitous discoveries. Beauty-based methods have historically not worked. They still don’t work. It’s time that physicists take note.

    And it’s not like the lack of beauty is the only problem with the current theories in the foundations of physics. There are good reasons to think physics is not done. The Standard Model cannot be the last word, notably because it does not contain gravity and fails to account for the masses of neutrinos. It also describes neither dark matter nor dark energy, which are necessary to explain galactic structures.

    So, clearly, the foundations of physics have problems that require answers. Physicists should focus on those. And we currently have no reason to think that colliding particles at the next higher energies will help solve any of the existing problems. New effects may not appear until energies are a billion times higher than what even the next larger collider could probe. To make progress, then, physicists must, first and foremost, learn from their failed predictions.

    So far, they have not. In 2016, the particle physicists Howard Baer, Vernon Barger, and Jenny List wrote an essay for Scientific American arguing that we need a larger particle collider to “save physics.” The reason? A theory the authors had proposed themselves, that is natural (beautiful!) in a specific way, predicts such a larger collider should see new particles. This March, Kane, a particle physicist, used similar beauty-based arguments in an essay for Physics Today. And a recent comment in Nature Reviews Physics about a big, new particle collider planned in Japan once again drew on the same motivations from naturalness that have already not worked for the LHC. Even the particle physicists who have admitted their predictions failed do not want to give up beauty-based hypotheses. Instead, they have argued we need more experiments to test just how wrong they are.

    Will this latest round of null-results finally convince particle physicists that they need new methods of theory-development? I certainly hope so.

    As an ex-particle physicist myself, I understand very well the desire to have an all-encompassing theory for the structure of matter. I can also relate to the appeal of theories such as supersymmetry or string theory. And, yes, I quite like the idea that we live in one of infinitely many universes that together make up the “multiverse.” But, as the latest LHC results drive home once again, the laws of nature care heartily little about what humans find beautiful.

    See the full article here.


    Welcome to Nautilus. We are delighted you joined us. We are here to tell you about science and its endless connections to our lives. Each month we choose a single topic. And each Thursday we publish a new chapter on that topic online. Each issue combines the sciences, culture and philosophy into a single story told by the world’s leading thinkers and writers. We follow the story wherever it leads us. Read our essays, investigative reports, and blogs. Fiction, too. Take in our games, videos, and graphic stories. Stop in for a minute, or an hour. Nautilus lets science spill over its usual borders. We are science, connected.

     
  • richardmitnick 11:37 am on April 16, 2019 Permalink | Reply
    Tags: CERN LHC

    From Symmetry: “A collision of light” 


    04/16/19
    Sarah Charley

    Illustration: Natasha Hartono

    One of the latest discoveries from the LHC takes the properties of photons beyond what your electrodynamics teacher will tell you in class.

    Professor Anne Sickles is currently teaching a laboratory class at the University of Illinois in which her students will measure what happens when two photons meet.

    What they will find is that the overlapping waves of light get brighter when two peaks align and dimmer when a peak meets a trough. She tells her students that this process is called interference, and that—unlike charged particles, which can merge, bond and interact—light waves can only add or subtract.
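    The classical rule the students verify can be written out directly; amplitudes here are in arbitrary units, and the two printed cases are fully constructive and fully destructive interference:

```python
import math

def superposed_intensity(a1, a2, phase_difference):
    """Intensity of two overlapping classical light waves of amplitudes
    a1 and a2 (arbitrary units). Peaks aligned (phase 0) add; a peak
    meeting a trough (phase pi) subtracts. Intensity is the square of
    the resultant amplitude."""
    amplitude = math.sqrt(a1**2 + a2**2
                          + 2 * a1 * a2 * math.cos(phase_difference))
    return amplitude**2

print(superposed_intensity(1.0, 1.0, 0.0))      # -> 4.0 (constructive: brighter)
print(superposed_intensity(1.0, 1.0, math.pi))  # -> 0.0 (destructive: dark)
```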

    “We teach undergraduates the classical theory,” Sickles says. “But there are situations where effects forbidden in the classical theory are allowed in the quantum theory.”

    Sickles is a collaborator on the ATLAS experiment at CERN and studies what happens when particles of light meet inside the Large Hadron Collider.

    CERN ATLAS. Credit: CERN Science Photo Library

    For most of the year, the LHC collides protons, but for about a month each fall, the LHC switches things up and collides heavy atomic nuclei, such as lead ions. The main purpose of these lead collisions is to study a hot and dense subatomic fluid called the quark-gluon plasma, which is harder to create in collisions of protons. But these ion runs also enable scientists to turn the LHC into a new type of machine: a photon-photon collider.

    “This result demonstrates that photons can scatter off each other and change each other’s direction,” says Peter Steinberg, an ATLAS scientist at Brookhaven National Laboratory.

    When heavy nuclei are accelerated in the LHC, they are encased within an electromagnetic aura generated by their large positive charges.

    As the nuclei travel faster and faster, their surrounding fields are squished into disks, making them much more concentrated. When two lead ions pass closely enough that their electromagnetic fields swoosh through one another, the high-energy photons which ultimately make up these fields can interact. In rare instances, a photon from one lead ion will merge with a photon from an oncoming lead ion, and they will ricochet in different directions.

    However, according to Steinberg, it’s not as simple as two solid particles bouncing off each other. Light particles are both chargeless and massless, and must go through a quantum mechanical loophole (literally called a quantum loop) to interact with one another.

    “That’s why this process is so rare,” he says. “They have no way to bounce off of each other without help.”

    When the two photons see each other inside the LHC, they sometimes overreact with excitement and split themselves into an electron and positron pair. These electron-positron pairs are not fully formed entities, but rather unstable quantum fluctuations that scientists call virtual particles. The four virtual particles swirl into each other and recombine to form two new photons, which scatter off at weird angles into the detector.

    “It’s like a quantum-mechanical square dance,” Steinberg says.

    When ATLAS first saw hints of this process in 2017, they had only 13 candidate events with the correct characteristics (collisions that resulted in two low-energy photons inside the detector and nothing else).

    After another two years of data taking, they have now collected 59 candidate events, bumping this original observation into the statistical certainty of a full-fledged discovery.
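    The jump from a hint to a discovery is a statistical statement about how unlikely the observed count is as a pure background fluctuation. A simplified counting-experiment sketch, using illustrative event and background counts rather than the published ATLAS yields, shows how more data sharpens the claim:

```python
import math
from statistics import NormalDist

def poisson_significance(observed, background):
    """Gaussian-equivalent significance of counting `observed` events when
    only `background` are expected, from the one-sided Poisson tail
    P(X >= observed). Summing the tail directly (rather than computing
    1 - CDF) avoids losing precision when the p-value is tiny."""
    term = math.exp(-background) * background**observed / math.factorial(observed)
    p, k = 0.0, observed
    for _ in range(200):  # the tail converges long before 200 terms
        p += term
        k += 1
        term *= background / k
    return -NormalDist().inv_cdf(p)

# Illustrative counts only, not the published ATLAS numbers:
evidence  = poisson_significance(13, 2.6)   # roughly 4.5 sigma: "evidence"
discovery = poisson_significance(59, 12.0)  # well past the 5-sigma bar
print(evidence < 5.0 < discovery)  # -> True
```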

    Steinberg sees this discovery as a big win for quantum electrodynamics, a theory about the quantum behavior of light that predicted this interaction. “This amazingly precise theory, which was developed in the first half of the 20th century, made a prediction that we are finally able to confirm many decades later.”

    Sickles says she is looking forward to exploring these kinds of light-by-light interactions and figuring out what else they could teach us about the laws of physics. “It’s one thing to see something,” she says. “It’s another thing to study it.”

    See the full article here.




     
  • richardmitnick 11:26 am on April 12, 2019 Permalink | Reply
    Tags: CERN LHC

    From Fermi National Accelerator Lab: “Quarks, squarks, stops and charm at this year’s Moriond conference” 

    FNAL art image by Angela Gonzales

    From Fermi National Accelerator Lab, an enduring source of strength for the US contribution to scientific research worldwide.

    April 11, 2019
    Don Lincoln

    Fermilab RAs Kevin Pedro and Nadja Strobbe presented a variety of CMS and ATLAS research results at the 53rd annual Rencontres de Moriond conference.

    This March, scientists from around the world gathered in La Thuile, Italy, for the 53rd annual Rencontres de Moriond conference, one of the longest-running and most prestigious conferences in particle physics. This conference is broken into two distinct weeks, with the first week usually covering electroweak physics and the second covering processes involving quantum chromodynamics. Fermilab and the LHC Physics Center were well represented at the conference.

    Fermilab research associates Kevin Pedro and Nadja Strobbe from the CMS group both presented talks on LHC physics results. Pedro spoke on searches for new physics with unconventional signatures at both the ATLAS and CMS experiments. The interest in unusual signatures is driven by the fact that many researchers have already searched for more commonly accepted physical processes. Looking for the unconventional opens up the possibility of unanticipated discoveries. Pedro covered long-lived particles emerging from a complex dark matter sector. The signature for this possible physics result is a jet that originates far from the interaction vertex. He also covered long-lived particles that disappear in the detector, a signature for a form of supersymmetry.

    Strobbe presented a thorough overview of searches for strong-force-produced signatures of supersymmetry. She covered both ATLAS and CMS results, covering a broad range of signatures, including the associated production of b quarks and Higgs bosons, diphotons, several stop squark analyses, and the associated production of three bottom quarks and missing transverse momentum. In total, she presented 12 distinct analyses. The phenomenology of strong-force-produced supersymmetry is diverse, and it provides a rich source for the possible discovery of new physics. This is Strobbe’s last Moriond presentation as a Fermilab research associate, as she has recently accepted a faculty position at the University of Minnesota, where she will be starting in the fall.

    Strobbe and Pedro were not the only people associated with the LHC Physics Center presenting or involved at Moriond. Fermilab Senior Scientist Boaz Klima has long been a member of the organizing committee. Meng Xiao (Johns Hopkins) and Greg Landsberg (Brown) also presented.

    More broadly, many interesting physics topics were covered at the conference. The LHCb experiment announced the discovery of new pentaquarks containing charm quarks. They also reported that a peak in the data that was previously thought to be a single pentaquark was actually two distinct particles. Studies of mesons containing both bottom and charm quarks were very well represented, with ATLAS, CMS and LHCb all making presentations. In the first week of the Moriond conference, both ATLAS and LHCb announced studies of the matter-antimatter asymmetry in decays of mesons containing both bottom and strange quarks. And in an example of very quick inter-collaboration cooperation, the experiments presented a combined result in the second week.

    While the LHC is best known for colliding two beams of protons (studies of which were well represented at Moriond), the LHC also collides lead ions to study the behavior of superhot quark matter – what is called quark-gluon plasma. ALICE presented studies of J/psi mesons, which showed that charm quarks are affected by the quark-gluon plasma, just like lighter quarks. The ALICE experiment also presented an observation of charmed baryons, based on data gathered in a special run of proton-proton collisions at an energy unusual for the LHC. These baryons occur more often in proton-proton collisions than in electron-positron ones.

    The Moriond conference is a fascinating one. It is small and cozy and allows for conversations and collaboration between researchers, with a storied history of over half a century. In its 53rd year, researchers are showing that its second half century will be just as exciting.

    Don Lincoln is a Fermilab scientist on the CMS experiment.

    See the full article here.



    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics. Fermilab is America’s premier laboratory for particle physics and accelerator research, funded by the U.S. Department of Energy. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.


     
  • richardmitnick 11:37 am on April 1, 2019 Permalink | Reply
    Tags: "Highlights from the 2019 Moriond conference (electroweak physics)", CERN LHC

    From CERN: “Highlights from the 2019 Moriond conference (electroweak physics)” 


    29 March, 2019

    The latest experimental data provide more stringent tests of the Standard Model and of rare phenomena of the microworld.

    At this year’s Rencontres de Moriond conference, which is taking place in La Thuile, Italy, physicists working at CERN are presenting their most recent results. Since the start of the conference on 16 March, a wide range of topics, from measurements of the Higgs boson and Standard Model processes to searches for rare and exotic phenomena, has been presented.

    The Standard Model of particle physics is a successful theory that describes how elementary particles and forces govern the properties of the Universe, but it is incomplete as it cannot explain certain phenomena, such as gravity, dark matter and dark energy.


    For this reason, physicists welcome any measurement that shows discrepancies with the Standard Model, as these give hints of new particles and new forces – of new physics, in other words. At the conference, the ATLAS and CMS collaborations have presented new results based on up to 140 fb⁻¹ of proton-proton collision data collected during Run 2 of the Large Hadron Collider (LHC) from 2015 to 2018. Many of these analyses benefited from novel machine-learning techniques used to separate signals from background processes.
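    A dataset of that size translates directly into event counts: the expected number of produced events is the cross section times the integrated luminosity, N = σ·L. A quick sketch, using an approximate round-number cross section rather than an official figure:

```python
def expected_events(cross_section_fb, integrated_luminosity_fb_inv):
    """Expected number of produced events: N = sigma * L, with the cross
    section in femtobarns and the integrated luminosity in inverse
    femtobarns, so the units cancel."""
    return cross_section_fb * integrated_luminosity_fb_inv

# With ~140 fb^-1 from Run 2 and a total Higgs production cross section
# of very roughly 5.5e4 fb at 13 TeV (an approximate round number, not an
# official figure), each experiment's dataset contains millions of
# produced Higgs bosons:
print(expected_events(5.5e4, 140))  # -> 7700000.0, i.e. about 7.7 million
```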

    Since the discovery of the Higgs boson in 2012, ATLAS and CMS physicists have made significant progress in understanding its properties, how it is formed and how it interacts with other known particles.


    Thanks to the large quantity of Higgs bosons produced in the collisions of Run 2, the collaborations were able to measure most of the Higgs boson’s main production and decay modes with a statistical significance far exceeding five standard deviations. In addition, many searches for new, additional Higgs bosons have been presented. From a combination of all Higgs boson measurements, ATLAS obtained new constraints on the Higgs self-coupling. CMS has presented updated results on the Higgs decay to two Z bosons and has also derived new information on the strength of the interaction between Higgs bosons and top quarks. This interaction is measured in two ways, using top quark pairs and using a rare process in which four top quarks are produced. The production of four top quarks at the LHC is about a factor of ten less likely than the production of a Higgs boson together with two top quarks, and about a factor of ten thousand less likely than the production of just a top quark pair.

    ATLAS event display showing the clean signature of light-by-light scattering (Image: ATLAS/CERN)

    The ATLAS collaboration has also reported first evidence for the simultaneous production of three W or Z bosons, the mediator particles of the weak force. Tri-boson production is a rare process predicted by the Standard Model and is sensitive to possible contributions from as-yet-unknown particles or forces. The very large new dataset has also been used by the ATLAS and CMS collaborations to expand the searches for new particles beyond the Standard Model at the energies available at the LHC. One of the possible theories is supersymmetry, an extension of the Standard Model that posits a symmetry between matter particles and force carriers and introduces many new particles, including possible candidates for dark matter. These hypothetical particles have not been detected in experiments so far, and the collaborations have set stronger lower limits on the possible range of masses that they could have.

    A collision event recorded by CMS, containing a missing-transverse-energy signature, which is one of the characteristics sought in the search for SUSY (Image: CMS/CERN)

    The CMS collaboration has placed new limits on the parameters of new-physics theories that predict hypothetical slowly moving heavy particles. Such particles would be detected by measuring how fast they travel through the detector: while the ordinary particles produced in the proton collisions propagate at speeds close to that of light, these heavy particles are expected to move measurably more slowly before decaying into a shower of other particles, creating a “delayed jet”. CMS has also presented first evidence for another rare process: the production of two W bosons in not one but two simultaneous interactions between the constituents of the colliding protons.
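    The delayed-jet idea rests on simple time-of-flight arithmetic: over a known path length, a particle moving at a fraction β of the speed of light arrives measurably later than light would. A minimal sketch (the path length and β below are invented for illustration, not CMS detector parameters):

```python
C = 299_792_458.0  # speed of light in m/s

def arrival_delay_ns(path_m: float, beta: float) -> float:
    """Extra arrival time in nanoseconds relative to a light-speed particle over the same path."""
    return (path_m / (beta * C) - path_m / C) * 1e9

# Invented numbers: over a 3 m flight path, a particle at beta = 0.5
# lags a light-speed particle by about 10 ns.
print(f"{arrival_delay_ns(3.0, 0.5):.1f} ns")
```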

    In addition, ATLAS and CMS have presented new studies on the search for hypothetical Z′ (Z-prime) bosons. The existence of such neutral heavy particles is predicted by certain Grand Unified theories that could provide an elegant extension of the Standard Model. Although no significant signs of Z′ particles have been observed thus far, the results provide constraints on their production rate.

    The LHCb collaboration has presented several new measurements concerning particles containing beauty or charm quarks. Certain properties of these particles can be affected by the existence of new particles beyond the Standard Model. This allows LHCb to search for signs of new physics via a complementary, indirect route. One much anticipated result, shown for the first time at the conference, is a measurement using data taken from 2011 to 2016 of the ratio of two related rare decays of a B+ particle. These decays are predicted in the Standard Model to occur at the same rate to within 1%; the data collected are consistent with this prediction but favour a lower value. This follows a pattern of intriguing hints in other, similar decay processes; while none of these results are significant enough to constitute evidence of new physics on their own, they have captured the interest of physicists and will be investigated further with the full LHCb data set. LHCb also presented the first observation of matter–antimatter asymmetry known as CP violation in charm particle decays, as reported in a dedicated press release last week.
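    The kind of ratio measurement described above can be pictured as a counting exercise: with N1 and N2 observed decays of each kind, the ratio R = N1/N2 carries a statistical uncertainty propagated from the Poisson fluctuations of both counts. A minimal sketch with invented counts (not LHCb’s numbers):

```python
import math

def ratio_with_error(n1: int, n2: int) -> tuple:
    """Ratio of two Poisson counts with its propagated statistical uncertainty."""
    r = n1 / n2
    sigma = r * math.sqrt(1.0 / n1 + 1.0 / n2)
    return r, sigma

# Invented counts, purely illustrative.
r, sigma = ratio_with_error(900, 1000)
print(f"R = {r:.3f} +/- {sigma:.3f}")
```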

    Finally, using data from lead-ion collisions recorded in 2018, the ATLAS collaboration has been able to clearly observe a very rare phenomenon in which two photons – particles of light – interact, producing another pair of photons, with a significance of over 8 standard deviations. This light-by-light scattering process was among the earliest predictions of quantum electrodynamics (QED), the quantum theory of electromagnetism, and is forbidden by Maxwell’s classical theory of electrodynamics.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Meet CERN in a variety of places:

    Quantum Diaries

    Cern Courier

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS

    ALICE

    CMS

    LHCb

    OTHER PROJECTS AT CERN

    CERN AEGIS
    CERN ALPHA
    CERN ALPHA-g
    CERN AMS
    CERN ASACUSA
    CERN ATRAP
    CERN AWAKE
    CERN CAST Axion Solar Telescope
    CERN CLOUD
    CERN COMPASS
    CERN DIRAC
    CERN GBAR
    CERN ISOLDE
    CERN LHCf
    CERN NA62
    CERN nTOF
    CERN TOTEM
    CERN UA9
    CERN ProtoDUNE

     
  • richardmitnick 9:54 am on March 20, 2019 Permalink | Reply
    Tags: "Report reveals full reach of LHC programme", , , CERN LHC, , , ,   

    From CERN: “Report reveals full reach of LHC programme” 

    Cern New Bloc

    Cern New Particle Event


    From CERN

    19 March, 2019
    Matthew Chalmers

    The excavation of the two new shafts for the HL-LHC at points 1 and 5 of the accelerator has recently been completed. © Antonino Panté, Reproduced with permission.

    The High-Luminosity LHC (HL-LHC), scheduled to operate from 2026, will increase the instantaneous luminosity of the LHC by at least a factor of five beyond its initial design luminosity. The analysis of a fraction of the data already delivered by the LHC – a mere 6% of what is expected by the end of HL-LHC in the late-2030s – led to the discovery of the Higgs boson and a diverse set of measurements and searches that have been documented in some 2000 physics papers published by the LHC experiments. “Although the HL-LHC is an approved and funded project, its physics programme evolves with scientific developments and also with the physics programmes planned at future colliders,” says Aleandro Nisati of ATLAS, who is a member of the steering group for a new report quantifying the HL-LHC physics potential.

    The 1000+ page report, published in January, contains input from more than 1000 experts from the experimental and theory communities. It stems from an initial workshop at CERN held in late 2017 (CERN Courier January/February 2018 p44) and also addresses the physics opportunities at a proposed high-energy upgrade (HE-LHC). Working groups have carried out hundreds of projections for physics measurements within the extremely challenging HL-LHC collision environment, taking into account the expected evolution of the theoretical landscape in the years ahead. In addition to their experience with LHC data analysis, the report factors in the improvements expected from the newly upgraded detectors and the likelihood that new analysis techniques will be developed. “A key aspect of this report is the involvement of the whole LHC community, working closely together to ensure optimal scientific progress,” says theorist and steering-group member Michelangelo Mangano.

    Physics streams

    The physics programme has been distilled into five streams: Standard Model (SM), Higgs, beyond the SM, flavour and QCD matter at high density.

    Standard Model of Particle Physics (LATHAM BOYLE AND MARDUS OF WIKIMEDIA COMMONS)

    CERN CMS Higgs Event


    CERN ATLAS Higgs Event

    The LHC results so far have confirmed the validity of the SM up to unprecedented energy scales and with great precision in the strong, electroweak and flavour sectors. Thanks to a 10-fold larger data set, the HL-LHC will probe the SM with even greater precision, give access to previously unseen rare processes, and extend the experiments’ sensitivity to new physics in direct and indirect searches for processes with low production cross-sections and more elusive signatures. The precision of key measurements, such as the coupling of the Higgs boson to SM particles, is expected to reach the percent level, where effects of new physics could be seen. The experimental uncertainty on the top-quark mass will be reduced to a few hundred MeV, and vector-boson scattering – recently observed in LHC data – will be studied with an accuracy of a few percent using various diboson processes.

    The 2012 discovery of the Higgs boson opens brand-new studies of its properties, the SM in general, and of possible physics beyond the SM. Outstanding opportunities have emerged for measurements of fundamental importance at the HL-LHC, such as the first direct constraints on the Higgs trilinear self-coupling and the natural width. The experience of LHC Run 2 has led to an improved understanding of the HL-LHC’s ability to probe Higgs pair production, a key measure of its self-interaction, with a projected combined ATLAS and CMS sensitivity of four standard deviations. In addition to significant improvements on the precision of Higgs-boson measurements, the HL-LHC will improve searches for heavier Higgs bosons motivated by theories beyond the SM and will be able to probe very rare exotic decay modes thanks to the huge dataset expected.

    The new report considers a large variety of new-physics models that can be probed at HL-LHC. In addition to searches for new heavy resonances and supersymmetry models, it includes results on dark matter and dark sectors, long-lived particles, leptoquarks, sterile neutrinos, axion-like particles, heavy scalars, vector-like quarks, and more. “Particular attention is placed on the potential opened by the LHC detector upgrades, the assessment of future systematic uncertainties, and new experimental techniques,” says steering-group member Andreas Meyer of CMS. “In addition to extending the present LHC mass and coupling reach by 20–50% for most new-physics scenarios, the HL-LHC will be able to potentially discover, or constrain, new physics that is not in reach of the current LHC dataset.”

    Pushing for precision

    The flavour-physics programme at the HL-LHC comprises many different probes – the weak decays of beauty, charm, strange and top quarks, as well as of the τ lepton and the Higgs boson – in which the experiments can search for signs of new physics. ATLAS and CMS will push the measurement precision of Higgs couplings and search for rare top decays, while the proposed second phase of the LHCb upgrade will greatly enhance the sensitivity with a range of beauty-, charm-, and strange-hadron probes. “It’s really exciting to see the full potential of the HL-LHC as a facility for precision flavour physics,” says steering-group member Mika Vesterinen of LHCb. “The projected experimental advances are also expected to be accompanied by improvements in theory, enhancing the current mass-reach on new physics by a factor as large as four.”

    Finally, the report identifies four major scientific goals for future high-density QCD studies at the LHC, including detailed characterisation of the quark–gluon plasma and its underlying parton dynamics, the development of a unified picture of particle production, and QCD dynamics from small to large systems. To address these goals, high-luminosity lead–lead and proton–lead collision programmes are considered as priorities, while high-luminosity runs with intermediate-mass nuclei such as argon could extend the heavy-ion programme at the LHC into the HL-LHC phase.

    High-energy considerations

    High Energy LHC (HE-LHC)

    One of the proposed options for a future collider at CERN is the HE-LHC, a proton–proton collider that would occupy the existing LHC tunnel but be built from advanced high-field dipole magnets that could support roughly double the LHC’s collision energy. Such a machine would be expected to deliver an integrated proton–proton luminosity of 15,000 fb–1 at a centre-of-mass energy of 27 TeV, increasing the discovery mass-reach beyond anything possible at the HL-LHC. The HE-LHC would provide precision access to rare Higgs-boson (H) production modes, with approximately a 2% uncertainty on the ttH coupling, as well as an unambiguous observation of the HH signal and a precision of about 20% on the trilinear coupling. An HE-LHC would enable a heavy new Z′ gauge boson discovered at the HL-LHC to be studied in detail, and in general would double the discovery reach of the HL-LHC to beyond 10 TeV.
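    The physics reach of such luminosity figures follows from the basic counting relation N = σ × L: the expected number of events is the production cross-section times the integrated luminosity. A minimal sketch, with an invented cross-section purely for illustration:

```python
def expected_events(cross_section_fb: float, integrated_luminosity_invfb: float) -> float:
    """Expected event count: cross-section (fb) times integrated luminosity (fb^-1)."""
    return cross_section_fb * integrated_luminosity_invfb

# Invented cross-section, purely illustrative: a 10 fb process would yield
# about 150,000 events in an HE-LHC-scale dataset of 15,000 fb^-1.
print(expected_events(10.0, 15_000.0))
```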

    The HL/HE-LHC reports were submitted to the European Strategy for Particle Physics Update in December 2018, and are also intended to bring perspective to the physics potential of future projects beyond the LHC. “We now have a better sense of our potential to characterise the Higgs boson, hunt for new particles and make Standard Model measurements that restrict the opportunities for new physics to hide,” says Mangano. “This report has made it clear that these planned 3000 fb–1 of data from HL-LHC, and much more in the case of a future HE-LHC, will play a central role in particle physics for decades to come.”

    See the full article here.



     
  • richardmitnick 1:10 pm on March 10, 2019 Permalink | Reply
    Tags: A quantum computer would greatly speed up analysis of the collisions hopefully finding evidence of supersymmetry much sooner—or at least allowing us to ditch the theory and move on., And they’ve been waiting for decades. Google is in the race as are IBM Microsoft Intel and a clutch of startups academic groups and the Chinese government., , At the moment researchers spend weeks and months sifting through the debris from proton-proton collisions in the LHC trying to find exotic heavy sister-particles to all our known particles of matter., “This is a marathon” says David Reilly who leads Microsoft’s quantum lab at the University of Sydney Australia. “And it's only 10 minutes into the marathon.”, , CERN LHC, , For CERN the quantum promise could for instance help its scientists find evidence of supersymmetry or SUSY which so far has proven elusive., , IBM has steadily been boosting the number of qubits on its quantum computers starting with a meagre 5-qubit computer then 16- and 20-qubit machines and just recently showing off its 50-qubit processor, In a bid to make sense of the impending data deluge some at CERN are turning to the emerging field of quantum computing., In a quantum computer each circuit can have one of two values—either one (on) or zero (off) in binary code; the computer turns the voltage in a circuit on or off to make it work., In theory a quantum computer would process all the states a qubit can have at once and with every qubit added to its memory size its computational power should increase exponentially., Last year physicists from the California Institute of Technology in Pasadena and the University of Southern California managed to replicate the discovery of the Higgs boson found at the LHC in 2012, None of the competing teams have come close to reaching even the first milestone., , , , The quest has now lasted decades and a number of physicists are questioning if the theory behind SUSY is really valid., Traditional computers—be it an Apple Watch or the most powerful supercomputer—rely on tiny silicon transistors that work like on-off switches to encode bits of data., Venture capitalists invested some $250 million in various companies researching quantum computing in 2018 alone.,   

    From WIRED: “Inside the High-Stakes Race to Make Quantum Computers Work” 

    Wired logo

    From WIRED

    03.08.19
    Katia Moskvitch

    View Pictures/Getty Images

    Deep beneath the Franco-Swiss border, the Large Hadron Collider is sleeping.

    LHC

    CERN map


    CERN LHC Tunnel

    CERN LHC particles

    But it won’t be quiet for long. Over the coming years, the world’s largest particle accelerator will be supercharged, increasing the number of proton collisions per second by a factor of two and a half.

    Once the work is complete in 2026, researchers hope to unlock some of the most fundamental questions in the universe. But with the increased power will come a deluge of data the likes of which high-energy physics has never seen before. And, right now, humanity has no way of knowing what the collider might find.

    To understand the scale of the problem, consider this: Before it shut down in December 2018, the LHC generated about 300 gigabytes of data every second; after its trigger systems discarded the vast majority of collision events, around 25 petabytes (PB) were stored annually. For comparison, you’d have to spend 50,000 years listening to music to go through 25 PB of MP3 songs, while the human brain can store memories equivalent to just 2.5 PB of binary data. To make sense of all that information, the LHC data was pumped out to 170 computing centers in 42 countries [http://greybook.cern.ch/]. It was this global collaboration that helped discover the elusive Higgs boson, part of the Higgs field believed to give mass to elementary particles of matter.
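    The listening-time comparison can be verified with back-of-the-envelope arithmetic, assuming a typical MP3 rate of roughly 1 MB per minute (an assumption for the estimate, not a figure from the article):

```python
PB = 1e15  # bytes in a petabyte (decimal convention)

dataset_bytes = 25 * PB
mp3_bytes_per_minute = 1e6  # ~128 kbps MP3, an assumed typical rate

minutes = dataset_bytes / mp3_bytes_per_minute
years = minutes / (60 * 24 * 365)
print(f"~{years:,.0f} years of continuous listening")  # close to the article's 50,000 years
```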

    CERN CMS Higgs Event


    CERN ATLAS Higgs Event

    To process the looming data torrent, scientists at the European Organization for Nuclear Research, or CERN, will need 50 to 100 times more computing power than they have at their disposal today. A proposed Future Circular Collider, four times the size of the LHC and 10 times as powerful, would create an impossibly large quantity of data, at least twice as much as the LHC.

    CERN FCC Future Circular Collider map

    In a bid to make sense of the impending data deluge, some at CERN are turning to the emerging field of quantum computing. Powered by the very laws of nature the LHC is probing, such a machine could potentially crunch the expected volume of data in no time at all. What’s more, it would speak the same language as the LHC. While numerous labs around the world are trying to harness the power of quantum computing, it is the future work at CERN that makes it particularly exciting research. There’s just one problem: Right now, there are only prototypes; nobody knows whether it’s actually possible to build a reliable quantum device.

    Traditional computers—be it an Apple Watch or the most powerful supercomputer—rely on tiny silicon transistors that work like on-off switches to encode bits of data.

    ORNL IBM AC922 SUMMIT supercomputer, No.1 on the TOP500. Credit: Carlos Jones, Oak Ridge National Laboratory/U.S. Dept. of Energy

    Each circuit can have one of two values—either one (on) or zero (off) in binary code; the computer turns the voltage in a circuit on or off to make it work.

    A quantum computer is not limited to this “either/or” way of thinking. Its memory is made up of quantum bits, or qubits—tiny particles of matter like atoms or electrons. And qubits can do “both/and,” meaning that they can be in a superposition of all possible combinations of zeros and ones; they can be all of those states simultaneously.

    For CERN, the quantum promise could, for instance, help its scientists find evidence of supersymmetry, or SUSY, which so far has proven elusive.

    Standard Model of Supersymmetry via DESY

    At the moment, researchers spend weeks and months sifting through the debris from proton-proton collisions in the LHC, trying to find exotic, heavy sister-particles to all our known particles of matter. The quest has now lasted decades, and a number of physicists are questioning if the theory behind SUSY is really valid. A quantum computer would greatly speed up analysis of the collisions, hopefully finding evidence of supersymmetry much sooner—or at least allowing us to ditch the theory and move on.

    A quantum device might also help scientists understand the evolution of the early universe, the first few minutes after the Big Bang. Physicists are pretty confident that back then, our universe was nothing but a strange soup of subatomic particles called quarks and gluons. To understand how this quark-gluon plasma has evolved into the universe we have today, researchers simulate the conditions of the infant universe and then test their models at the LHC, with multiple collisions. Performing a simulation on a quantum computer, governed by the same laws that govern the very particles that the LHC is smashing together, could lead to a much more accurate model to test.

    Beyond pure science, banks, pharmaceutical companies, and governments are also waiting to get their hands on computing power that could be tens or even hundreds of times greater than that of any traditional computer.

    And they’ve been waiting for decades. Google is in the race, as are IBM, Microsoft, Intel and a clutch of startups, academic groups, and the Chinese government. The stakes are incredibly high. Last October, the European Union pledged to give $1 billion to over 5,000 European quantum technology researchers over the next decade, while venture capitalists invested some $250 million in various companies researching quantum computing in 2018 alone. “This is a marathon,” says David Reilly, who leads Microsoft’s quantum lab at the University of Sydney, Australia. “And it’s only 10 minutes into the marathon.”

    Despite the hype surrounding quantum computing and the media frenzy triggered by every announcement of a new qubit record, none of the competing teams has come close to reaching even the first milestone, fancily called quantum supremacy—the moment when a quantum computer performs at least one specific task better than a standard computer. Any kind of task, even if it is totally artificial and pointless. There are plenty of rumors in the quantum community that Google may be close, although if true, it would give the company bragging rights at best, says Michael Biercuk, a physicist at the University of Sydney and founder of quantum startup Q-CTRL. “It would be a bit of a gimmick—an artificial goal,” says Reilly. “It’s like concocting some mathematical problem that really doesn’t have an obvious impact on the world just to say that a quantum computer can solve it.”

    That’s because the first real checkpoint in this race is much further away. Called quantum advantage, it would see a quantum computer outperform normal computers on a truly useful task. (Some researchers use the terms quantum supremacy and quantum advantage interchangeably.) And then there is the finish line, the creation of a universal quantum computer. The hope is that it would deliver a computational nirvana with the ability to perform a broad range of incredibly complex tasks. At stake is the design of new molecules for life-saving drugs, helping banks to adjust the riskiness of their investment portfolios, a way to break all current cryptography and develop new, stronger systems, and for scientists at CERN, a way to glimpse the universe as it was just moments after the Big Bang.

    Slowly but surely, work is already underway. Federico Carminati, a physicist at CERN, admits that today’s quantum computers wouldn’t give researchers anything more than classical machines, but, undeterred, he’s started tinkering with IBM’s prototype quantum device via the cloud while waiting for the technology to mature. It’s the latest baby step in the quantum marathon. The deal between CERN and IBM was struck in November last year at an industry workshop organized by the research organization.

    Set up to exchange ideas and discuss potential collab­orations, the event had CERN’s spacious auditorium packed to the brim with researchers from Google, IBM, Intel, D-Wave, Rigetti, and Microsoft. Google detailed its tests of Bristlecone, a 72-qubit machine. Rigetti was touting its work on a 128-qubit system. Intel showed that it was in close pursuit with 49 qubits. For IBM, physicist Ivano Tavernelli took to the stage to explain the company’s progress.

    IBM has steadily been boosting the number of qubits on its quantum computers, starting with a meagre 5-qubit computer, then 16- and 20-qubit machines, and just recently showing off its 50-qubit processor.

    IBM iconic image of Quantum computer

    Carminati listened to Tavernelli, intrigued, and during a much needed coffee break approached him for a chat. A few minutes later, CERN had added a quantum computer to its impressive technology arsenal. CERN researchers are now starting to develop entirely new algorithms and computing models, aiming to grow together with the device. “A fundamental part of this process is to build a solid relationship with the technology providers,” says Carminati. “These are our first steps in quantum computing, but even if we are coming relatively late into the game, we are bringing unique expertise in many fields. We are experts in quantum mechanics, which is at the base of quantum computing.”

    The attraction of quantum devices is obvious. Take standard computers. The prediction by former Intel CEO Gordon Moore in 1965 that the number of components in an integrated circuit would double roughly every two years has held true for more than half a century. But many believe that Moore’s law is about to hit the limits of physics. Since the 1980s, however, researchers have been pondering an alternative. The idea was popularized by Richard Feynman, an American physicist at Caltech in Pasadena. During a lecture in 1981, he lamented that computers could not really simulate what was happening at a subatomic level, with tricky particles like electrons and photons that behave like waves but also dare to exist in two states at once, a phenomenon known as quantum superposition.

    Feynman proposed to build a machine that could. “I’m not happy with all the analyses that go with just the classical theory, because nature isn’t classical, dammit,” he told the audience back in 1981. “And if you want to make a simulation of nature, you’d better make it quantum mechanical, and by golly it’s a wonderful problem, because it doesn’t look so easy.”

    And so the quantum race began. Qubits can be made in different ways, but the rule is the same: two qubits together can be both in state A, both in state B, or in either of the two mixed combinations, making four possible states in total. And you won’t know which state a qubit is in until you measure it, at which point it is yanked out of its quantum world of probabilities into our mundane physical reality.

    In theory, a quantum computer would process all the states a qubit can have at once, and with every qubit added to its memory size, its computational power should increase exponentially. So, for three qubits, there are eight states to work with simultaneously, for four, 16; for 10, 1,024; and for 20, a whopping 1,048,576 states. You don’t need a lot of qubits to quickly surpass the memory banks of the world’s most powerful modern supercomputers—meaning that for specific tasks, a quantum computer could find a solution much faster than any regular computer ever would. Add to this another crucial concept of quantum mechanics: entanglement. It means that qubits can be linked into a single quantum system, where operating on one affects the rest of the system. This way, the computer can harness the processing power of both simultaneously, massively increasing its computational ability.
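    The exponential growth described above is just the count of basis states: an n-qubit register spans 2^n of them. A quick check of the article’s numbers:

```python
def num_states(n_qubits: int) -> int:
    """Number of basis states an n-qubit register can hold in superposition: 2**n."""
    return 2 ** n_qubits

for n in (3, 4, 10, 20):
    print(f"{n} qubits -> {num_states(n):,} states")
# 3 -> 8, 4 -> 16, 10 -> 1,024, 20 -> 1,048,576
```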

    While a number of companies and labs are competing in the quantum marathon, many are running their own races, taking different approaches. One device has even been used by a team of researchers to analyze CERN data, albeit not at CERN. Last year, physicists from the California Institute of Technology in Pasadena and the University of Southern California managed to replicate the discovery of the Higgs boson, found at the LHC in 2012, by sifting through the collider’s troves of data using a quantum computer manufactured by D-Wave, a Canadian firm based in Burnaby, British Columbia. The findings didn’t arrive any quicker than on a traditional computer, but, crucially, the research showed a quantum machine could do the work.

    One of the oldest runners in the quantum race, D-Wave announced back in 2007 that it had built a fully functioning, commercially available 16-qubit quantum computer prototype—a claim that’s controversial to this day. D-Wave focuses on a technology called quantum annealing, based on the natural tendency of real-world quantum systems to find low-energy states (a bit like a spinning top that inevitably will fall over). A D-Wave quantum computer imagines the possible solutions of a problem as a landscape of peaks and valleys; each coordinate represents a possible solution and its elevation represents its energy. Annealing allows you to set up the problem, and then let the system fall into the answer—in about 20 milliseconds. As it does so, it can tunnel through the peaks as it searches for the lowest valleys. It finds the lowest point in the vast landscape of solutions, which corresponds to the best possible outcome—although it does not attempt to fully correct for any errors, inevitable in quantum computation. D-Wave is now working on a prototype of a universal annealing quantum computer, says Alan Baratz, the company’s chief product officer.
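    The peaks-and-valleys picture corresponds closely to classical simulated annealing, which can serve as a software analogue of what the hardware does physically (D-Wave’s machine exploits quantum tunnelling rather than thermal hops). A toy sketch with an invented one-dimensional landscape and made-up parameters:

```python
import math
import random

def energy(x: float) -> float:
    """Toy 1-D landscape with two valleys near x = +/-2; the +x tilt makes x ~ -2 the global minimum."""
    return (x ** 2 - 4) ** 2 + x

def anneal(steps: int = 5000, seed: int = 0) -> float:
    """Classical simulated annealing: a random walk that sometimes accepts uphill moves, cooling over time."""
    rng = random.Random(seed)
    x = rng.uniform(-3.0, 3.0)
    temp = 2.0
    for _ in range(steps):
        candidate = x + rng.gauss(0.0, 0.2)
        delta = energy(candidate) - energy(x)
        # Always accept downhill moves; accept uphill ones with Boltzmann probability.
        if delta < 0 or rng.random() < math.exp(-delta / temp):
            x = candidate
        temp = max(1e-3, temp * 0.999)  # cool slowly so the walker settles into a valley
    return x

best = anneal()
print(f"settled near x = {best:.2f}")
```

A real annealer encodes the problem in hardware as an energy landscape over qubits; this classical analogue only illustrates the search-by-relaxation idea.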

    Apart from D-Wave’s quantum annealing, there are three other main approaches to try and bend the quantum world to our whim: integrated circuits, topological qubits and ions trapped with lasers. CERN is placing high hopes on the first method but is closely watching other efforts too.

    IBM, whose computer Carminati has just started using, as well as Google and Intel, all make quantum chips with integrated circuits—quantum gates—that are superconducting, a state when certain metals conduct electricity with zero resistance. Each quantum gate holds a pair of very fragile qubits. Any noise will disrupt them and introduce errors—and in the quantum world, noise is anything from temperature fluctuations to electromagnetic and sound waves to physical vibrations.

    To isolate the chip from the outside world as much as possible and get the circuits to exhibit quantum mechanical effects, it needs to be supercooled to extremely low temperatures. At the IBM quantum lab in Zurich, the chip is housed in a white tank—a cryostat—suspended from the ceiling. The temperature inside the tank is a steady 10 millikelvin or –273 degrees Celsius, a fraction above absolute zero and colder than outer space. But even this isn’t enough.

    Just working with the quantum chip, when scientists manipulate the qubits, causes noise. “The outside world is continually interacting with our quantum hardware, damaging the information we are trying to process,” says physicist John Preskill at the California Institute of Technology, who in 2012 coined the term quantum supremacy. It’s impossible to get rid of the noise completely, so researchers are trying to suppress it as much as possible, hence the ultracold temperatures to achieve at least some stability and allow more time for quantum computations.

    “My job is to extend the lifetime of qubits, and we’ve got four of them to play with,” says Matthias Mergenthaler, an Oxford University postdoc working at IBM’s Zurich lab. That doesn’t sound like a lot, but, he explains, it’s not so much the number of qubits that counts but their quality, meaning qubits with as low a noise level as possible, to ensure they last as long as possible in superposition and allow the machine to compute. And it’s here, in the fiddly world of noise reduction, that quantum computing hits up against one of its biggest challenges. Right now, the device you’re reading this on probably performs at a level similar to that of a quantum computer with 30 noisy qubits. But if you can reduce the noise, then the quantum computer is many times more powerful.

    Once the noise is reduced, researchers try to correct any remaining errors with the help of special error-correcting algorithms, run on a classical computer. The problem is, such error correction works qubit by qubit, so the more qubits there are, the more errors the system has to cope with. Say a computer makes an error once every 1,000 computational steps; it doesn’t sound like much, but after 1,000 or so operations, the program will likely output incorrect results. To achieve meaningful computations and surpass standard computers, a quantum machine needs about 1,000 relatively low-noise qubits with their errors corrected as far as possible. When you put them all together, these 1,000 qubits make up what researchers call a logical qubit. None yet exist—so far, the best that prototype quantum devices have achieved is error correction for up to 10 qubits. That’s why these prototypes are called noisy intermediate-scale quantum (NISQ) computers, a term also coined by Preskill, in 2017.
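The one-error-per-1,000-steps figure can be checked directly. Assuming (as a simplification of real hardware) that each step fails independently at a fixed rate, the chance of a completely clean N-step run is (1 − 1/1000)^N:

```python
# Error accumulation for the rate quoted above: probability that a run
# of n_steps finishes with no error at all, assuming independent errors
# at a fixed per-step rate (a simplification of real quantum hardware).

def p_error_free(p_step: float, n_steps: int) -> float:
    return (1.0 - p_step) ** n_steps

p = 1.0 / 1000.0
for n in (100, 1_000, 10_000):
    print(f"{n:>6} steps -> P(no error) = {p_error_free(p, n):.3f}")
```

After about 1,000 steps the clean-run probability has already fallen to roughly 37 percent, matching the article's point that results soon become unreliable without error correction.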

    For Carminati, it’s clear the technology isn’t ready yet. But that isn’t really an issue. At CERN the challenge is to be ready to unlock the power of quantum computers when and if the hardware becomes available. “One exciting possibility will be to perform very, very accurate simulations of quantum systems with a quantum computer—which in itself is a quantum system,” he says. “Other groundbreaking opportunities will come from the blend of quantum computing and artificial intelligence to analyze big data, a very ambitious proposition at the moment, but central to our needs.”

    But some physicists think NISQ machines will stay just that—noisy—forever. Gil Kalai, a professor at Yale University, says that error correcting and noise suppression will never be good enough to allow any kind of useful quantum computation. And it’s not even due to technology, he says, but to the fundamentals of quantum mechanics. Interacting systems have a tendency for errors to be connected, or correlated, he says, meaning errors will affect many qubits simultaneously. Because of that, it simply won’t be possible to create error-correcting codes that keep noise levels low enough for a quantum computer with the required large number of qubits.

    “My analysis shows that noisy quantum computers with a few dozen qubits deliver such primitive computational power that it will simply not be possible to use them as the building blocks we need to build quantum computers on a wider scale,” he says. Among scientists, such skepticism is hotly debated. The blogs of Kalai and fellow quantum skeptics are forums for lively discussion, as was a recent much-shared article titled “The Case Against Quantum Computing”—followed by its rebuttal, “The Case Against the Case Against Quantum Computing.”

    For now, the quantum critics are in a minority. “Provided the qubits we can already correct keep their form and size as we scale, we should be okay,” says Ray Laflamme, a physicist at the University of Waterloo in Ontario, Canada. The crucial thing to watch out for right now is not whether scientists can reach 50, 72, or 128 qubits, but whether scaling quantum computers to this size significantly increases the overall rate of error.

    The Quantum Nano Centre in Canada is one of numerous big-budget research and development labs focussed on quantum computing. James Brittain/Getty Images

    Others believe that the best way to suppress noise and create logical qubits is by making qubits in a different way. At Microsoft, researchers are developing topological qubits—although its array of quantum labs around the world has yet to create a single one. If it succeeds, these qubits would be much more stable than those made with integrated circuits. Microsoft’s idea is to split a particle—for example an electron—in two, creating Majorana fermion quasi-particles. These were theorized by the Italian physicist Ettore Majorana back in 1937, and in 2012 researchers at Delft University of Technology in the Netherlands, working at Microsoft’s condensed matter physics lab, obtained the first experimental evidence of their existence.

    “You will only need one of our qubits for every 1,000 of the other qubits on the market today,” says Chetan Nayak, general manager of quantum hardware at Microsoft. In other words, every single topological qubit would be a logical one from the start. David Reilly, who heads Microsoft’s quantum lab in Sydney, believes that researching these elusive qubits is worth the effort, despite years with little progress, because if one is created, scaling such a device to thousands of logical qubits would be much easier than with a NISQ machine. “It will be extremely important for us to try out our code and algorithms on different quantum simulators and hardware solutions,” says Carminati. “Sure, no machine is ready for prime time quantum production, but neither are we.”

    Another company Carminati is watching closely is IonQ, a US startup that spun out of the University of Maryland. It uses the third main approach to quantum computing: trapped ions. These are naturally quantum, exhibiting superposition right from the start and at room temperature, meaning they don’t have to be supercooled like the integrated circuits of NISQ machines. Each ion is a single qubit; researchers hold them in tiny silicon ion traps and then use lasers to run algorithms, varying the times and intensities at which each tiny laser beam hits the qubits. The beams encode data to the ions and read it out from them by getting each ion to change its electronic state.

    In December, IonQ unveiled its commercial device, capable of hosting 160 ion qubits and performing simple quantum operations on a string of 79 qubits. Still, right now, ion qubits are just as noisy as those made by Google, IBM, and Intel, and neither IonQ nor any other lab around the world experimenting with ions has achieved quantum supremacy.

    As the noise and hype surrounding quantum computers rumbles on, at CERN, the clock is ticking. The collider will wake up in just five years, ever mightier, and all that data will have to be analyzed. A non-noisy, error-corrected quantum computer will then come in quite handy.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 1:29 pm on March 8, 2019 Permalink | Reply
    Tags: , And finally they will be shipped to CERN, “The need to go beyond the already excellent performance of the LHC is at the basis of the scientific method” said Giorgio Apollinari Fermilab scientist and HL-LHC AUP project manager., , , CERN LHC, Each magnet will have four sets of coils making it a quadrupole., Earlier this month the AUP earned approval for both Critical Decisions 2 and 3b from DOE., Fermilab will manufacture 43 coils and Brookhaven National Laboratory in New York will manufacture another 41, , , In its current configuration on average an astonishing 1 billion collisions occur every second at the LHC., It’s also the reason behind the collider’s new name the High-Luminosity LHC., LHC AUP began just over two years ago and on Feb. 11 it received key approvals allowing the project to transition into its next steps., , , , Superconducting niobium-tin magnets have never been used in a high-energy particle accelerator like the LHC., The AUP calls for 84 coils fabricated into 21 magnets., The first upgrade is to the magnets that focus the particles., The magnets will be sent to Brookhaven to be tested before being shipped back to Fermilab., The new technologies developed for the LHC will boost that number by a factor of 10., The second upgrade is a special type of accelerator cavity., The U.S. Large Hadron Collider Accelerator Upgrade Project is the Fermilab-led collaboration of U.S. laboratories in partnership with CERN and a dozen other countries., These new magnets will generate a maximum magnetic field of 12 tesla roughly 50 percent more than the niobium-titanium magnets currently in the LHC., This means that significantly more data will be available to experiments at the LHC., This special cavity called a crab cavity is used to increase the overlap of the two beams so that more protons have a chance of colliding., Those will then be delivered to Lawrence Berkeley National Laboratory to be formed into accelerator magnets, Twenty successful magnets will be inserted into 10 containers which are then tested by Fermilab, U.S. Department of Energy projects undergo a series of key reviews and approvals referred to as “Critical Decisions” that every project must receive., U.S. physicists and engineers helped research and develop two technologies to make this upgrade possible.

    From Brookhaven National Lab: “Large Hadron Collider Upgrade Project Leaps Forward” 

    From Brookhaven National Lab

    March 4, 2019
    Caitlyn Buongiorno

    Staff members of the Superconducting Magnet Division at Brookhaven National Laboratory next to the “top hat”— the interface between the room temperature components of the magnet test facility and the LHC high-luminosity magnet to be tested. The magnet is attached to the bottom of the top hat and tested in superfluid helium at temperatures close to absolute zero. Left to right: Joseph Muratore, Domenick Milidantri, Sebastian Dimaiuta, Raymond Ceruti, and Piyush Joshi. Credit: Brookhaven National Laboratory

    The U.S. Large Hadron Collider Accelerator Upgrade Project is the Fermilab-led collaboration of U.S. laboratories that, in partnership with CERN and a dozen other countries, is working to upgrade the Large Hadron Collider.

    LHC AUP began just over two years ago and, on Feb. 11, it received key approvals, allowing the project to transition into its next steps.


    U.S. Department of Energy projects undergo a series of key reviews and approvals, referred to as “Critical Decisions,” that every project must receive. Earlier this month, the AUP earned approval for both Critical Decisions 2 and 3b from DOE. CD-2 approves the performance baseline — the scope, cost and schedule — for the AUP. To stay on that schedule, CD-3b allows the project to receive the funds and approval necessary to purchase base materials and produce final design models of two technologies by the end of 2019.

    The LHC, a 17-mile-circumference particle accelerator on the French-Swiss border, smashes together two opposing beams of protons to produce other particles. Researchers use the particle data to understand how the universe operates at the subatomic scale.

    In its current configuration, on average, an astonishing 1 billion collisions occur every second at the LHC. The new technologies developed for the LHC will boost that number by a factor of 10. This increase in luminosity — the number of proton-proton interactions per second — means that significantly more data will be available to experiments at the LHC. It’s also the reason behind the collider’s new name, the High-Luminosity LHC.

    This “crab cavity” is designed to maximize the chance of collision between two opposing particle beams. Photo: Paolo Berrutti

    “The need to go beyond the already excellent performance of the LHC is at the basis of the scientific method,” said Giorgio Apollinari, Fermilab scientist and HL-LHC AUP project manager. “The endorsement and support received for this U.S. contribution to the HL-LHC will allow our scientists to remain at the forefront of research at the energy frontier.”

    U.S. physicists and engineers helped research and develop two technologies to make this upgrade possible. The first upgrade is to the magnets that focus the particles. The new magnets rely on niobium-tin conductors and can exert a stronger force on the particles than their predecessors. By increasing the force, the particles in each beam are driven closer together, enabling more proton-proton interactions at the collision points.

    The second upgrade is a special type of accelerator cavity. Cavities are structures inside colliders that impart energy to the particle beam and propel them forward. This special cavity, called a crab cavity, is used to increase the overlap of the two beams so that more protons have a chance of colliding.

    “This approval is a recognition of 15 years of research and development started by a U.S. research program and completed by this project,” said Giorgio Ambrosio, Fermilab scientist and HL-LHC AUP manager for magnets.

    This completed niobium-tin magnet coil will generate a maximum magnetic field of 12 tesla, roughly 50 percent more than the niobium-titanium magnets currently in the LHC. Photo: Alfred Nobrega

    Magnets help the particles go ’round

    Superconducting niobium-tin magnets have never been used in a high-energy particle accelerator like the LHC. These new magnets will generate a maximum magnetic field of 12 tesla, roughly 50 percent more than the niobium-titanium magnets currently in the LHC. For comparison, an MRI’s magnetic field ranges from 0.5 to 3 tesla, and Earth’s magnetic field is only 50 millionths of one tesla.
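For scale, the field strengths quoted above can be put side by side. Note one inferred value: the current-LHC figure is back-calculated from "12 tesla is roughly 50 percent more," not stated directly in the text.

```python
# Field strengths from the paragraph above, in tesla. The current-LHC
# value (8 T) is inferred from "12 T is roughly 50 percent more";
# the MRI figure uses the upper end of the quoted 0.5-3 T range.
fields_tesla = {
    "HL-LHC niobium-tin magnet": 12.0,
    "current LHC niobium-titanium magnet (inferred)": 8.0,
    "MRI scanner (upper end of quoted range)": 3.0,
    "Earth's magnetic field": 50e-6,
}
earth = fields_tesla["Earth's magnetic field"]
for name, b in fields_tesla.items():
    print(f"{name}: {b} T ({b / earth:,.0f}x Earth's field)")
```

The new magnets are some 240,000 times stronger than the field a compass needle feels.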

    There are multiple stages to creating the niobium-tin coils for the magnets, and each brings its challenges.

    Each magnet will have four sets of coils, making it a quadrupole. Together the coils conduct the electric current that produces the magnet’s field. To make niobium-tin capable of producing a strong magnetic field, the coils must be baked in an oven and turned into a superconductor. The major challenge with niobium-tin is that the superconducting phase is brittle: like uncooked spaghetti, it can snap under a small amount of pressure if the coils are not well supported. The coils must therefore be handled delicately from this point on.

    The AUP calls for 84 coils, fabricated into 21 magnets. Fermilab will manufacture 43 coils, and Brookhaven National Laboratory in New York will manufacture another 41. Those will then be delivered to Lawrence Berkeley National Laboratory to be formed into accelerator magnets. The magnets will be sent to Brookhaven to be tested before being shipped back to Fermilab. Twenty successful magnets will be inserted into 10 containers, which are then tested by Fermilab, and finally shipped to CERN.

    With CD-2/3b approval, AUP expects to have the first magnet assembled in April and tested by July. If all goes well, this magnet will be eligible for installation at CERN.

    Crab cavities for more collisions

    Cavities accelerate particles inside a collider, boosting them to higher energies. They also form the particles into bunches: as individual protons travel through the cavity, each one is accelerated or decelerated depending on whether it is below or above an expected energy. This process essentially sorts the beam into collections of protons, or particle bunches.
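The sorting mechanism can be sketched as a toy simulation. This is a deliberate simplification: real longitudinal beam dynamics is an oscillation in energy-time phase space rather than the plain damping used here, and all numbers are invented.

```python
# Toy model of RF bunching (illustrative only: real dynamics oscillates
# in energy-time phase space; the damping factor here is invented).
# Each cavity pass nudges particles above the reference energy down and
# particles below it up, pulling the beam toward a tight bunch.
import random

random.seed(1)
offsets = [random.uniform(-1.0, 1.0) for _ in range(8)]  # energy offsets
for _ in range(50):                                      # 50 cavity passes
    offsets = [e - 0.2 * e for e in offsets]             # restoring kick

spread = max(abs(e) for e in offsets)
print(f"energy spread after 50 passes: {spread:.2e}")
```

After 50 passes the initial spread has collapsed by several orders of magnitude, which is the "sorting into bunches" intuition in miniature.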

    HL-LHC puts a spin on the typical cavity with its crab cavities, which get their name from how the particle bunches appear to move after they’ve passed through the cavity. When a bunch exits the cavity, it appears to move sideways, similar to how a crab walks. This sideways movement is actually a result of the crab cavity rotating the particle bunches as they pass through.

    Imagine that a football were actually a particle bunch. Typically, you want to throw a football straight ahead, with the pointed end cutting through the air. The same is true for particle bunches; they normally go through a collider like a well-thrown football. Now say you wanted to ensure that your football and another football collided in mid-air. Rather than throwing it straight on, you’d throw the football on its side to maximize the size of the target and hence the chance of collision.

    Of course, turning the bunches is harder than turning a football, as each bunch isn’t a single, rigid object.

    To make the rotation possible, the crab cavities are placed right before and after the collision points at two of the particle detectors at the LHC, called ATLAS and CMS. An alternating electric field runs through each cavity and “tilts” the particle bunch on its side. To do this, the front section of the bunch gets a “kick” to one side on the way in and, before it leaves, the rear section gets a “kick” to the opposite side. Now, the particle bunch looks like a football on its side. When the two bunches meet at the collision point, they overlap better, which makes the occurrence of a particle collision more likely.
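The kicks described above amount to a sideways deflection proportional to position along the bunch, which is what rotates it. A few lines of arithmetic make this concrete; the units, kick strength, and drift length are invented for illustration, not real HL-LHC optics.

```python
# Sketch of the crab-cavity tilt: model a bunch as points along its
# length z, give each point a sideways kick proportional to z (head
# one way, tail the other), and let the bunch drift. All magnitudes
# are invented for illustration, not real HL-LHC beam optics.
z_positions = [i / 5.0 for i in range(-5, 6)]   # -1.0 (tail) .. +1.0 (head)
kick_strength = 0.5
drift_length = 1.0

# Sideways offset picked up after the cavity:
x_offsets = [kick_strength * z * drift_length for z in z_positions]
print(f"tail x = {x_offsets[0]:+.2f}, head x = {x_offsets[-1]:+.2f}")
```

The head and tail end up with equal and opposite sideways offsets while the centre stays put: the bunch is rotated onto its side, the "football thrown sideways" of the analogy.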

    After the collision point, more crab cavities straighten the remaining bunches, so they can travel through the rest of the LHC without causing unwanted interactions.

    With CD-2/3b approval, all raw materials necessary for construction of the cavities can be purchased. Two crab cavity prototypes are expected by the end of 2019. Once the prototypes have been certified, the project will seek further approval for the production of all cavities destined for the LHC tunnel.

    After further testing, the cavities will be sent out to be “dressed”: placed in a cooling vessel. Once the dressed cavities pass all acceptance criteria, Fermilab will ship all 10 dressed cavities to CERN.

    “It’s easy to forget that these technological advances don’t benefit just accelerator programs,” said Leonardo Ristori, Fermilab engineer and an HL-LHC AUP manager for crab cavities. “Accelerator technology existed in the first TV screens and is currently used in medical equipment like MRIs. We might not be able to predict how these technologies will appear in everyday life, but we know that these kinds of endeavors ripple across industries.”

    See the full article here.



    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

     
  • richardmitnick 1:20 pm on February 28, 2019 Permalink | Reply
    Tags: , , CERN LHC, Croatia to become an Associate Member of CERN, , , ,   

    From CERN: “Croatia to become an Associate Member of CERN” 

    From CERN

    28 February, 2019

    Fabiola Gianotti, CERN Director-General, and Blaženka Divjak, Minister of Science and Education of the Republic of Croatia, signed an Agreement admitting Croatia as an Associate Member of CERN.

    Zagreb. Today, the Director-General of CERN[1], Fabiola Gianotti, and the Minister of Science and Education of the Republic of Croatia, Blaženka Divjak, in the presence of Croatian Prime Minister Andrej Plenković, signed an Agreement admitting Croatia as an Associate Member of CERN. The status will come into effect on the date the Director-General receives Croatia’s notification that it has completed its internal approval procedures in respect of the Agreement.

    “It is a great pleasure to welcome Croatia into the CERN family as an Associate Member. Croatian scientists have made important contributions to a large variety of experiments at CERN for almost four decades and as an Associate Member, new opportunities open up for Croatia in scientific collaboration, technological development, education and training,” said Fabiola Gianotti.

    “Croatian participation in CERN as an Associate Member is also a way to retain young and capable people in the country because they can participate in important competitive international projects, working and studying in the Croatian educational and scientific institutions that collaborate with CERN,” said Blaženka Divjak.

    Croatian scientists have been engaged in scientific work at CERN for close to 40 years. Already in the late 1970s, researchers from Croatian institutes worked on the SPS heavy-ion programme. In 1994, research groups from Split officially joined the CMS collaboration and one year later a research group from Zagreb joined the ALICE collaboration, working with Croatian industry partners to contribute to the construction of the experiments’ detectors. Scientists from Croatia have also been involved in other CERN experiments such as CAST, NA61, ISOLDE, nTOF and OPERA.

    CERN and Croatia signed a Cooperation Agreement in 2001, setting priorities for scientific and technical cooperation. This resulted in an increased number of scientists and students from Croatia participating in CERN’s programmes, including the CERN Summer Student Programme. In May 2014, Croatia applied for Associate Membership.

    As an Associate Member, Croatia will be entitled to participate in the CERN Council, Finance Committee and Scientific Policy Committee. Nationals of Croatia will be eligible for staff positions and Croatia’s industry will be able to bid for CERN contracts, opening up opportunities for industrial collaboration in advanced technologies.

    Footnote(s)

    1. CERN, the European Organization for Nuclear Research, is one of the world’s leading laboratories for particle physics. The Organization is located on the French-Swiss border, with its headquarters in Geneva. Its Member States are: Austria, Belgium, Bulgaria, Czech Republic, Denmark, Finland, France, Germany, Greece, Hungary, Israel, Italy, Netherlands, Norway, Poland, Portugal, Romania, Slovakia, Spain, Sweden, Switzerland and United Kingdom. Cyprus, Serbia and Slovenia are Associate Member States in the pre-stage to Membership. India, Lithuania, Pakistan, Turkey and Ukraine are Associate Member States. The European Union, Japan, JINR, the Russian Federation, UNESCO and the United States of America currently have Observer status.

    See the full article here.



     
  • richardmitnick 12:00 pm on February 28, 2019 Permalink | Reply
    Tags: "First ATLAS result with full Run 2 dataset: a search for new heavy particles", , , CERN LHC, , , ,   

    From CERN ATLAS: “First ATLAS result with full Run 2 dataset: a search for new heavy particles” 


    27th February 2019
    ATLAS Collaboration

    Figure 1: Measured dielectron mass distribution for the data (black points), together with the total background fit result (red continuous line) and various possible Z’ signal distributions overlaid (dashed red lines). The sub-panel shows the significance of the deviation between the observed data and the background prediction in each bin of the distribution. (Image: ATLAS Collaboration/CERN)

    Could a Grand Unified Theory resolve the remaining mysteries of the Standard Model?


    If verified, it would provide an elegant description of the unification of Standard Model forces at very high energies, and might even explain the existence of dark matter and neutrino masses. ATLAS physicists are searching for evidence of new heavy particles predicted by such theories, including a neutral Z’ gauge boson.

    The ATLAS collaboration has today released its very first result utilising its entire LHC Run 2 dataset, collected between 2015 and 2018. This analysis searches for new heavy particles decaying into dilepton final states, where the leptons are either two electrons or two muons. This is one of the most sensitive decays to search for new physics, thanks to the ATLAS detector’s excellent energy and momentum resolution for leptons and the strong signal-to-background differentiation as a result of the simple two-lepton signature.

    The new ATLAS result also employs a novel data-driven approach for estimating the Standard Model background. While the previous analysis predominantly used simulations for the background prediction and was carried out with a fraction of the data, this new analysis takes advantage of the vast Run 2 dataset by fitting the observed data with a functional form motivated by and validated with our understanding of the Standard Model processes contributing to these events. If present, the new particles would appear as bumps on top of a smoothly falling background shape, making them straightforward to identify (see Figure 1). This is similar to one of the ways that the Higgs boson was discovered in 2012, through its decay to two photons.
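The bump-hunt idea can be sketched with toy numbers. Everything below is invented for illustration: a Gaussian fluctuation model stands in for Poisson counts, and the background is taken as perfectly known rather than fitted, unlike the real data-driven analysis.

```python
# Toy bump hunt: a smoothly falling "background" spectrum plus an
# injected Gaussian "signal" bump, with the per-bin excess measured
# against the (here, perfectly known) background expectation.
# All numbers are invented for illustration, not ATLAS data.
import math
import random

random.seed(0)
masses = [200.0 + 20.0 * i for i in range(90)]             # GeV bins (assumed)
background = [1e5 * math.exp(-m / 300.0) for m in masses]  # smooth falling shape
signal = [500.0 * math.exp(-0.5 * ((m - 1000.0) / 25.0) ** 2) for m in masses]
observed = [random.gauss(b + s, math.sqrt(b + s))          # pseudo-data counts
            for b, s in zip(background, signal)]

# Per-bin significance of the excess over the background expectation:
excess = [(o - b) / math.sqrt(b) for o, b in zip(observed, background)]
peak = masses[max(range(len(excess)), key=lambda i: excess[i])]
print(f"largest excess near {peak:.0f} GeV")
```

The injected bump stands out clearly above the smooth shape, whereas background-only bins fluctuate by only a few standard deviations; real searches then quantify whether any such excess is statistically significant.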

    In addition to probing unexplored territory in the search for new physics, a great deal of work in this analysis has gone into understanding the ATLAS detector and collaborating with the various detector performance groups to improve the identification of very high-energy electrons and muons. This included accounting for the multiplicity of tracks in the inner part of the detector, as it continuously increased due to the rising average number of proton-proton collisions per bunch crossing during Run 2.

    No significant sign of new physics has been observed thus far. The result sets stringent constraints on the production rate of various types of hypothetical Z’ particles. As well as setting exclusion limits on specific theoretical models, the result has also been provided in a generic format that allows physicists to re-interpret the data under different theoretical assumptions. This study has deepened the exploration of physics at the energy frontier; ATLAS physicists are excited about further analysing the large Run 2 dataset.

    See the full article here.




     