Tagged: Accelerator Science

  • richardmitnick 11:05 am on April 26, 2017 Permalink | Reply
Tags: Accelerator Science, Charged-particle reconstruction at the energy frontier

    From ATLAS at CERN: “Charged-particle reconstruction at the energy frontier” 


    26th April 2017
    ATLAS Collaboration

Figure 1: Illustration of isolated measurements (left) in the ATLAS pixel detector and merged measurements (right) due to very collimated tracks. Merged measurements are more common in more energetic jets and are harder to distinguish. The ATLAS event reconstruction software was optimized for Run 2 and is now better able to resolve merged measurements. Different colors represent energy deposits from different charged particles traversing the sensor, and the particles’ trajectories are shown as arrows. (Image: ATLAS Collaboration/CERN)

    A new age of exploration dawned at the start of Run 2 of the Large Hadron Collider, as protons began colliding at the unprecedented centre-of-mass energy of 13 TeV. The ATLAS experiment now frequently observes highly collimated bundles of particles (known as jets) with energies of up to multiple TeV, as well as tau-leptons and b-hadrons that pass through the innermost detector layers before decaying. These energetic collisions are prime hunting grounds for signs of new physics, including massive, hypothetical new particles that would decay to much lighter – and therefore highly boosted – bosons.

In these very energetic jets, the average separation of charged particles is comparable to the size of individual inner detector elements. This easily confuses the algorithms responsible for reconstructing charged-particle trajectories (tracks) and, without careful consideration, can limit the track reconstruction efficiency in these dense environments. The result would be poor identification of long-lived b-hadrons and hadronic tau decays, and difficulties in calibrating the energy and mass of jets.

Figure 2: Efficiency to reconstruct the track of a charged particle from decays of a tau-lepton, rho-meson and B0-hadron as a function of these particles’ initial transverse momentum. At higher transverse momentum, merged measurements are more abundant and therefore the efficiency drops. This effect is exacerbated by a higher charged-particle multiplicity in the decay, as is clearly visible for the tau-lepton’s decay into five charged particles (green circles). (Image: ATLAS Collaboration/CERN)

In preparation for Run 2, the ATLAS event reconstruction software was optimized to better resolve these close-by particles, much like increasing the magnification of a microscope. As a result, in simulated dijet events the reconstruction efficiency for a charged-particle track is still around 80% for jets with a transverse momentum of 1400 to 1600 GeV, even at angular separations between the jet and the charged particle below 0.02. This has maximised the potential for discovery, allowing for more detailed measurements of the newly opened kinematic regime.
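
To make the angular-separation number above concrete: the separation between a jet and a track is conventionally quoted as ΔR = sqrt(Δη² + Δφ²). The small sketch below only illustrates that definition; the jet and track values are invented for the example, not ATLAS data.

import math

def delta_r(eta1, phi1, eta2, phi2):
    # Angular separation Delta R = sqrt(d_eta^2 + d_phi^2), with d_phi wrapped into [-pi, pi]
    d_eta = eta1 - eta2
    d_phi = (phi1 - phi2 + math.pi) % (2.0 * math.pi) - math.pi
    return math.hypot(d_eta, d_phi)

# Hypothetical jet axis and nearby track (illustrative numbers only)
jet_eta, jet_phi = 0.52, 1.100
trk_eta, trk_phi = 0.53, 1.085
print(delta_r(jet_eta, jet_phi, trk_eta, trk_phi))  # ~0.018, i.e. inside the dense jet core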

    Recently published results give a general overview of the new track reconstruction algorithm, highlighting the ATLAS detector’s excellent performance in reconstructing charged particles in dense environments. The results also present, for the first time, a novel method for determining in situ (i.e. from data) the efficiency of reconstructing tracks in such an environment. The study uses the ionization energy loss (dE/dx), measured with the ATLAS pixel detector, to deduce the probability of failing to reconstruct a track. The obtained results confirm the excellent performance expected from studies on simulated data.

    Figure 3: The ionization energy loss (dE/dx) of charged particles in the ATLAS pixel detector. Three distinct distributions were created to extract the track reconstruction performance in the core of jets: isolated measurements (blue); merged measurements (green); and the data (black circles) which, due to specific selections, should resemble isolated measurements. A possible inefficiency of the track reconstruction is determined by fitting the green and blue distributions to the data (the result is shown as a red line). The fitted contribution of the green distribution to the data corresponds to an inefficiency of the track reconstruction. (Image: ATLAS Collaboration/CERN)
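
The fit described in Figure 3 is, in essence, a two-template fraction fit: the data are modelled as a mixture of the "isolated" and "merged" dE/dx shapes, and the fitted merged fraction translates into a track-reconstruction inefficiency. Below is a minimal toy version of such a fit; the templates and the "data" are randomly generated stand-ins, so only the fitting logic is meant to be illustrative, not the ATLAS analysis itself.

import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(42)
bins = np.linspace(0.5, 3.0, 40)

# Toy dE/dx shapes: "isolated" ~ one particle's energy loss, "merged" ~ roughly twice as much
isolated = np.histogram(rng.normal(1.2, 0.15, 100_000), bins=bins, density=True)[0]
merged   = np.histogram(rng.normal(2.2, 0.30, 100_000), bins=bins, density=True)[0]

# Toy "data": mostly isolated-like, with a small merged admixture (true fraction = 0.04)
data = np.histogram(np.concatenate([rng.normal(1.2, 0.15, 9_600),
                                    rng.normal(2.2, 0.30, 400)]),
                    bins=bins, density=True)[0]

def chi2(f_merged):
    # Mixture model: (1 - f) * isolated shape + f * merged shape
    model = (1.0 - f_merged) * isolated + f_merged * merged
    return np.sum((data - model) ** 2)

fit = minimize_scalar(chi2, bounds=(0.0, 1.0), method="bounded")
print(f"fitted merged fraction: {fit.x:.3f}")  # should recover roughly 0.04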

    Links:

    Performance of the ATLAS Track Reconstruction Algorithms in Dense Environments in LHC Run 2: arXiv link to come.
    See also the full lists of ATLAS Conference Notes and ATLAS Physics Papers.

    See the full article here .


    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 1:25 pm on April 24, 2017 Permalink | Reply
Tags: Accelerator Science, New ALICE results show novel phenomena in proton collisions, Strange quark

    From ALICE at CERN: “New ALICE results show novel phenomena in proton collisions” 


    24 Apr 2017.
    Harriet Kim Jarlett

As the number of particles produced in proton collisions increases (the blue lines), more of these so-called strange hadrons are measured (shown by the orange to red squares in the graph). (Image: ALICE/CERN)

In a paper published today in Nature Physics, the ALICE collaboration reports that proton collisions sometimes present similar patterns to those observed in the collisions of heavy nuclei. This behaviour was spotted through the observation of so-called strange hadrons in certain proton collisions in which a large number of particles are created. Strange hadrons are well-known particles with names such as Kaon, Lambda, Xi and Omega, all containing at least one so-called strange quark. The observed ‘enhanced production of strange particles’ is a familiar feature of quark-gluon plasma, a very hot and dense state of matter that existed just a few millionths of a second after the Big Bang and is commonly created in collisions of heavy nuclei. But this is the first time such a phenomenon has been unambiguously observed in the rare proton collisions in which many particles are created. This result is likely to challenge existing theoretical models that do not predict an increase of strange particles in these events.

    “We are very excited about this discovery,” said Federico Antinori, Spokesperson of the ALICE collaboration. “We are again learning a lot about this primordial state of matter. Being able to isolate the quark-gluon-plasma-like phenomena in a smaller and simpler system, such as the collision between two protons, opens up an entirely new dimension for the study of the properties of the fundamental state that our universe emerged from.”

The study of the quark-gluon plasma provides a way to investigate the properties of the strong interaction, one of the four known fundamental forces, while enhanced strangeness production is a manifestation of this state of matter. The quark-gluon plasma is produced at sufficiently high temperature and energy density, when ordinary matter undergoes a transition to a phase in which quarks and gluons become ‘free’ and are thus no longer confined within hadrons. These conditions can be obtained at the Large Hadron Collider by colliding heavy nuclei at high energy. Strange quarks are heavier than the quarks composing normal matter, and typically harder to produce. But this changes in the presence of the high energy density of the quark-gluon plasma, which rebalances the creation of strange quarks relative to non-strange ones. This phenomenon may now have been observed within proton collisions as well.

In particular, the new results show that the production rate of these strange hadrons increases with the ‘multiplicity’ – the number of particles produced in a given collision – faster than that of other particles generated in the same collision. While the valence structure of the proton does not include strange quarks, the data also show that the higher the number of strange quarks contained in the produced hadron, the stronger the increase in its production rate. No dependence on the collision energy or the mass of the generated particles is observed, demonstrating that the observed phenomenon is related to the strange-quark content of the particles produced. Strangeness production is in practice determined by counting the number of strange particles produced in a given collision, and calculating the ratio of strange to non-strange particles.
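
In code, the counting described above boils down to grouping events by their charged-particle multiplicity and taking yield ratios such as strange hadrons per pion. The fragment below is a purely illustrative sketch with made-up event records, not ALICE data or the ALICE analysis chain.

from collections import Counter

# Hypothetical event records: (charged multiplicity, list of identified hadrons)
events = [
    (8,  ["pi", "pi", "K0s", "pi"]),
    (35, ["pi", "pi", "pi", "K0s", "Lambda", "pi", "Xi"]),
    (60, ["pi", "pi", "K0s", "Lambda", "Xi", "Omega", "pi", "pi"]),
]

STRANGE = {"K0s", "Lambda", "Xi", "Omega"}

for multiplicity, hadrons in events:
    counts = Counter(hadrons)
    n_strange = sum(counts[h] for h in STRANGE)
    n_pi = counts["pi"]
    # Ratio of strange-hadron yield to pion yield in this multiplicity class
    print(multiplicity, n_strange / n_pi)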

Enhanced strangeness production was suggested as a possible consequence of quark-gluon plasma formation as early as the 1980s, and was discovered in collisions of nuclei in the 1990s by experiments at CERN’s Super Proton Synchrotron.


Another possible consequence of quark-gluon plasma formation is a spatial correlation of the final-state particles, which causes a distinctive preferential alignment in the shape of a ridge. Following its detection in heavy-nuclei collisions, the ridge has also been seen in high-multiplicity proton collisions at the Large Hadron Collider, giving the first indication that proton collisions could present heavy-nuclei-like properties. Studying these processes more precisely will be key to better understanding the microscopic mechanisms of the quark-gluon plasma and the collective behaviour of particles in small systems.

    The ALICE experiment has been designed to study collisions of heavy nuclei. It also studies proton-proton collisions, which primarily provide reference data for the heavy-nuclei collisions. The reported measurements have been performed with 7 TeV proton collision data from LHC run 1.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition


     
  • richardmitnick 2:16 pm on April 22, 2017 Permalink | Reply
Tags: Accelerator Science, Videos

    From CMS at CERN: Fantastic Videos 


These incredible videos are presented in no particular order.


    An introduction to the CMS Experiment at CERN


    Welcome to LHC season 2: new frontiers in physics at #13TeV


    LHC animation: The path of the protons


    The Large Hadron Collider Returns in the Hunt for New Physics


    Physics Run 2016


    Back to the Big Bang: Inside the Large Hadron Collider – From the World Science Festival


    Higgs boson: what’s next? #13TeV

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition


     
  • richardmitnick 11:56 am on April 20, 2017 Permalink | Reply
Tags: Accelerator Science, New LHC Results Hint At New Physics... But Are We Crying Wolf?

    From Ethan Siegel: “New LHC Results Hint At New Physics… But Are We Crying Wolf?” 

    Ethan Siegel
    Apr 20, 2017

The LHCb collaboration is far less famous than CMS or ATLAS, but the bottom-quark-containing particles it produces hold new-physics hints that the other detectors cannot probe. CERN / LHCb Collaboration

Over at the Large Hadron Collider at CERN, particles are accelerated to the greatest energies they’ve ever reached in history. In the CMS and ATLAS detectors, new fundamental particles are continuously being searched for, although only the Higgs boson has come through. But in a much lesser-known detector — LHCb — particles containing bottom quarks are produced in tremendous numbers. One class of these particles, quark-antiquark pairs where one is a bottom quark, has recently been observed to decay in a way that runs counter to the Standard Model’s predictions. Even though the evidence isn’t very good, it’s the biggest hint for new physics we’ve had from accelerators in years.

    A decaying B-meson, as shown here, may decay more frequently to one type of lepton pair than the other, contradicting Standard Model expectations. KEK / BELLE collaboration

    KEK Belle detector, at the High Energy Accelerator Research Organisation (KEK) in Tsukuba, Ibaraki Prefecture, Japan

    There are two ways, throughout history, that we’ve made extraordinary advances in fundamental physics. One is when an unexplained, robust phenomenon pops up, and we’re compelled to rethink our conception of the Universe. The other is when multiple, competing, but heretofore indistinguishable explanations of the same set of observations are subject to a critical test, where only one explanation emerges as a valid one. Particle physics is at a crossroads right now, because even though there are fundamentally unsolved questions, the energy scales that we can probe with experiments all give results that are perfectly in line with the Standard Model.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    The discovery of the Higgs Boson in the di-photon (γγ) channel at CMS. That ‘bump’ in the data is an unambiguous new particle: the Higgs.


The Higgs boson, discovered earlier this decade, was created over and over at the LHC, with its decays measured in excruciating detail. If there were any hints of departures from the Standard Model — if it decayed into one type of particle more or less frequently than predicted — it could be an extraordinary hint of new physics. Similarly, physicists searched exhaustively for new “bumps” where there shouldn’t be any in the data: a signal of a potential new particle. Although they showed up periodically, with some mild significance, they always went away entirely with more and better data.

    The observed Higgs decay channels vs. the Standard Model agreement, with the latest data from ATLAS and CMS included. The agreement is astounding, but there are outliers (which is expected) when the error-bars are larger.


Statistically, this is about what you’d expect. If you had a fair coin and tossed it 10 times, you might expect to get 5 heads and 5 tails. Although that’s reasonable, sometimes you’ll get 6 and 4, sometimes you’ll get 8 and 2, and sometimes you’ll even get 10 and 0. If you got 10 heads and 0 tails, you might begin to suspect that the coin isn’t fair, but the odds aren’t that bad: about 0.2% of the time, you’ll have all ten flips give the same result. And if you have 1000 people each flipping a coin ten times, it’s very likely (86%) that at least one of them will get the same result all ten times.
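
The two probabilities quoted above are easy to verify yourself; a quick check, assuming a fair coin and independent flips:

p_all_same = 2 * (0.5) ** 10                     # all heads or all tails in 10 flips
p_at_least_one = 1 - (1 - p_all_same) ** 1000    # at least one such run among 1000 people
print(f"{p_all_same:.4%}")      # ~0.20%
print(f"{p_at_least_one:.1%}")  # ~86%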

The Standard Model makes predictions for lots of different quantities — particle production rates, scattering amplitudes, decay probabilities, branching ratios, etc. — for every single particle (both fundamental and composite) that can be created. There are literally hundreds of such composite particles that have been created in large numbers, and thousands of such quantities that we can measure. Since we look at all of them, we demand an extremely high level of statistical significance before we’re willing to claim a discovery. In particle physics, the odds of a fluke need to be less than one-in-three-million to get there.

The Standard Model’s calculated predictions (the four colored points) and the LHCb results (black, with error bars) for the electron/positron to muon/antimuon ratios at two different energies. LHCb Collaboration / Tommaso Dorigo

Earlier this week, the LHCb collaboration announced their greatest departure yet observed from the Standard Model: a difference in the rate of decay of bottom-quark-containing mesons into strange-quark-containing mesons with either a muon-antimuon pair or an electron-positron pair. In the Standard Model, the ratios should be 1.0 (once mass differences of muons and electrons are taken into account), but they observed a ratio of 0.6. That sure sounds like a big deal, and like it might be a hint of physics beyond the Standard Model!

    The known particles and antiparticles of the Standard Model all have been discovered. All told, they make explicit predictions. Any violation of those predictions would be a sign of new physics, which we’re desperately seeking. E. Siegel

    The case gets even stronger when you consider that the BELLE collaboration, last decade, discovered these decays and began to notice a slight discrepancy themselves. But a closer inspection of the latest data shows that the statistical significance is only about 2.4 and 2.5 sigma, respectively, at the two energies measured. This is about a 1.5% chance of a fluke individually, or about 3.7-sigma significance (0.023% chance of a fluke) combined. Now, 3.7-sigma is a lot more exciting than 2.5-sigma, but it’s still not exciting enough. Given that there were thousands of things these experiments looked at, these results barely even register as “suggestive” of new physics, much less as compelling evidence.
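
The conversions between "sigma" and "chance of a fluke" quoted in this paragraph follow from the tails of a Gaussian. A quick check using the two-sided convention, which reproduces the numbers in the text (whether LHCb quotes one- or two-sided values is not stated here, so treat the convention as an assumption):

from scipy.stats import norm

for sigma in (2.4, 2.5, 3.7):
    p_two_sided = 2 * norm.sf(sigma)   # probability of a fluctuation at least this large
    print(f"{sigma} sigma -> {p_two_sided:.3%}")
# 2.4 sigma -> ~1.6%, 2.5 sigma -> ~1.2%, 3.7 sigma -> ~0.022%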

    The ATLAS and CMS diphoton bumps from 2015, displayed together, clearly correlating at ~750 GeV. This suggestive result was significant at more than 3-sigma, but went away entirely with more data. CERN, CMS/ATLAS collaborations; Matt Strassler

    Yet already, just on Wednesday, there were six new papers out (with more surely coming) attempting to use beyond-the-Standard-Model physics to explain this not-even-promising result.

    Why?

    Because, quite frankly, we don’t have any good ideas in place. Supersymmetry, grand unification, string theory, technicolor, and extra dimensions, among others, were the leading extensions to the Standard Model, and colliders like the LHC have yielded absolutely no evidence for any of them. Signals from direct experiments for physics beyond the Standard Model have all yielded results completely consistent with the Standard Model alone. What we’re seeing now is rightly called ambulance-chasing, but it’s even worse than that.

    The Standard Model particles and their supersymmetric counterparts. Non-white-male-American scientists have been instrumental in the development of the Standard Model and its extensions. Claire David

    We know that results like this have a history of not holding up at all; we expect there to be fluctuations like this in the data, and this one isn’t even as significant as the others that have gone away with more and better data. You expect a 2-sigma discrepancy in one out of every 20 measurements you make, and these two are little better than that. Even combined, they’re hardly impressive, and the other things you’d seek to measure about this decay line up with the Standard Model perfectly. In short, the Standard Model is much more likely than not to hold up once more and better data arrives.

    The string landscape might be a fascinating idea that’s full of theoretical potential, but it doesn’t predict anything that we can observe in our Universe. University of Cambridge

What we’re seeing right now is the response from the community that we’d expect to an alarm crying “Wolf!” There might be something fantastic and impressive out there, and so, of course, we have to look. But we know that, more than 99% of the time, an alarm like this is merely the result of which way the wind blew. Physicists are so bored and so out of good, testable ideas to extend the Standard Model — which is to say, the Standard Model is so maddeningly successful — that even a paltry result like this is enough to shift the theoretical direction of the field.

    A few weeks ago, famed physicist (and supersymmetry-advocate) John Ellis asked the question, Where is Particle Physics going? Unless experiments can generate new, unexpected results, the answer is likely to be “nowhere new; nowhere good” for the indefinite future.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    “Starts With A Bang! is a blog/video blog about cosmology, physics, astronomy, and anything else I find interesting enough to write about. I am a firm believer that the highest good in life is learning, and the greatest evil is willful ignorance. The goal of everything on this site is to help inform you about our world, how we came to be here, and to understand how it all works. As I write these pages for you, I hope to not only explain to you what we know, think, and believe, but how we know it, and why we draw the conclusions we do. It is my hope that you find this interesting, informative, and accessible,” says Ethan

     
  • richardmitnick 12:29 pm on April 19, 2017 Permalink | Reply
Tags: A new search to watch from LHCb, Accelerator Science

    From Symmetry: “A new search to watch from LHCb” 


    04/18/17
    Sarah Charley

    A new result from the LHCb experiment could be an early indicator of an inconsistency in the Standard Model.


    The subatomic universe is an intricate mosaic of particles and forces. The Standard Model of particle physics is a time-tested instruction manual that precisely predicts how particles and forces behave.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    But it’s incomplete, ignoring phenomena such as gravity and dark matter.

Today the LHCb experiment at CERN, the European research center, released a result that could be an early indication of new, undiscovered physics beyond the Standard Model.

    However, more data is needed before LHCb scientists can definitively claim they’ve found a crack in the world’s most robust roadmap to the subatomic universe.

    “In particle physics, you can’t just snap your fingers and claim a discovery,” says Marie-Hélène Schune, a researcher on the LHCb experiment from Le Centre National de la Recherche Scientifique in Orsay, France. “It’s not magic. It’s long, hard work and you must be obstinate when facing problems. We always question everything and never take anything for granted.”

    The LHCb experiment records and analyzes the decay patterns of rare hadrons—particles made of quarks—that are produced in the Large Hadron Collider’s energetic proton-proton collisions.


    By comparing the experimental results to the Standard Model’s predictions, scientists can search for discrepancies. Significant deviations between the theory and experimental results could be an early indication of an undiscovered particle or force at play.

    This new result looks at hadrons containing a bottom quark as they transform into hadrons containing a strange quark. This rare decay pattern can generate either two electrons or two muons as byproducts. Electrons and muons are different types or “flavors” of particles called leptons. The Standard Model predicts that the production of electrons and muons should be equally favorable—essentially a subatomic coin toss every time this transformation occurs.

    “As far as the Standard Model is concerned, electrons, muons and tau leptons are completely interchangeable,” Schune says. “It’s completely blind to lepton flavors; only the large mass difference of the tau lepton plays a role in certain processes. This 50-50 prediction for muons and electrons is very precise.”

    But instead of finding a 50-50 ratio between muons and electrons, the latest results from the LHCb experiment show that it’s more like 40 muons generated for every 60 electrons.

    “If this initial result becomes stronger with more data, it could mean that there are other, invisible particles involved in this process that see flavor,” Schune says. “We’ll leave it up to the theorists’ imaginations to figure out what’s going on.”

However, just like any coin toss, it’s difficult to know whether this discrepancy is the result of an unknown favoritism or merely the consequence of chance. To distinguish between these two possibilities, scientists wait until they hit a certain statistical threshold before claiming a discovery, often 5 sigma.

    “Five sigma is a measurement of statistical deviation and means there is only a 1-in-3.5-million chance that the Standard Model is correct and our result is just an unlucky statistical fluke,” Schune says. “That’s a pretty good indication that it’s not chance, but rather the first sightings of a new subatomic process.”

    Currently, this new result is at approximately 2.5 standard deviations, which means there is about a 1-in-125 possibility that there’s no new physics at play and the experimenters are just the unfortunate victims of statistical fluctuation.

    This isn’t the first time that the LHCb experiment has seen unexpected behavior in related processes. Hassan Jawahery from the University of Maryland also works on the LHCb experiment and is studying another particle decay involving bottom quarks transforming into charm quarks. He and his colleagues are measuring the ratio of muons to tau leptons generated during this decay.

    “Correcting for the large mass differences between muons and tau leptons, we’d expect to see about 25 taus produced for every 100 muons,” Jawahery says. “We measured a ratio of 34 taus for every 100 muons.”

    On its own, this measurement is below the line of statistical significance needed to raise an eyebrow. However, two other experiments—the BaBar experiment at SLAC and the Belle experiment in Japan—also measured this process and saw something similar.

    “We might be seeing the first hints of a new particle or force throwing its weight around during two independent subatomic processes,” Jawahery says. “It’s tantalizing, but as experimentalists we are still waiting for all these individual results to grow in significance before we get too excited.”

    More data and improved experimental techniques will help the LHCb experiment and its counterparts narrow in on these processes and confirm if there really is something funny happening behind the scenes in the subatomic universe.

    “Conceptually, these measurements are very simple,” Schune says. “But practically, they are very challenging to perform. These first results are all from data collected between 2011 and 2012 during Run 1 of the LHC. It will be intriguing to see if data from Run 2 shows the same thing.”

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 1:56 pm on April 18, 2017 Permalink | Reply
Tags: Accelerator Science, LHCb Finds New Hints of Possible Deviations from the Standard Model

    From Astro Watch: “LHCb Finds New Hints of Possible Deviations from the Standard Model” 


    April 18, 2017
    CERN


    The LHCb experiment finds intriguing anomalies in the way some particles decay. If confirmed, these would be a sign of new physics phenomena not predicted by the Standard Model of particle physics. The observed signal is still of limited statistical significance, but strengthens similar indications from earlier studies. Forthcoming data and follow-up analyses will establish whether these hints are indeed cracks in the Standard Model or a statistical fluctuation.

    Today, in a seminar at CERN, the LHCb collaboration presented new long-awaited results on a particular decay of B0 mesons produced in collisions at the Large Hadron Collider. The Standard Model of particle physics predicts the probability of the many possible decay modes of B0 mesons, and possible discrepancies with the data would signal new physics.

In this study, the LHCb collaboration looked at the decays of B0 mesons to an excited kaon and a pair of electrons or muons. The muon is about 200 times heavier than the electron, but in the Standard Model its interactions are otherwise identical to those of the electron, a property known as lepton universality. Lepton universality predicts that, up to a small and calculable effect due to the mass difference, electrons and muons should be produced with the same probability in this specific B0 decay. LHCb finds instead that the decays involving muons occur less often.

    While potentially exciting, the discrepancy with the Standard Model occurs at the level of 2.2 to 2.5 sigma, which is not yet sufficient to draw a firm conclusion. However, the result is intriguing because a recent measurement by LHCb involving a related decay exhibited similar behavior.

While of great interest, these hints are not enough to come to a conclusive statement. Many previous measurements, although of a different nature, support the symmetry between electrons and muons. More data and more observations of similar decays are needed in order to clarify whether these hints are just a statistical fluctuation or the first signs of new particles that would extend and complete the Standard Model of particle physics. The measurements discussed were obtained using the entire data sample of the first period of exploitation of the Large Hadron Collider (Run 1). If the new measurements indeed point to physics beyond the Standard Model, the larger data sample collected in Run 2 will be sufficient to confirm these effects.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 1:45 pm on April 18, 2017 Permalink | Reply
Tags: Accelerator Science, Gabriella Carini, How do you catch femtosecond light?

    From SLAC: “How do you catch femtosecond light?” 



    Gabriella Carini
    Staff Scientist
    Joined SLAC: 2011
    Specialty: Developing detectors that capture light from X-ray sources
    Interviewed by: Amanda Solliday

    Gabriella Carini enjoys those little moments—after hours and hours of testing in clean rooms, labs and at X-ray beamlines—when she first sees an instrument work.

    She earned her PhD in electronic engineering at the University of Palermo in Italy and now heads the detectors department at the Linac Coherent Light Source (LCLS), the X-ray free-electron laser at SLAC.


    Scientists from around the world use the laser to probe natural processes that occur in tiny slivers of time. To see on this timescale, they need a way to collect the light and convert it into data that can be examined and interpreted.

    It’s Carini’s job to make sure LCLS has the right detector equipment at hand to catch the “precious”, very intense laser pulses, which may last only a few femtoseconds.

    When the research heads in new directions, as it constantly does, this requires her to look for fresh technology and turn these ideas into reality.

    When did you begin working with detectors?

    I moved to the United States as a doctoral student. My professor at the time suggested I join a collaboration at Brookhaven National Laboratory, where I started developing gamma ray detectors to catch radioactive materials.

    Radioactive materials give off gamma rays as they decay, and gamma rays are the most energetic photons, or particles of light. The detectors I worked on were made from cadmium zinc telluride, which has very good stopping power for highly energetic photons. These detectors can identify radioactive isotopes for security—such as the movement of nuclear materials—and contamination control, but also gamma rays for medical and astrophysical observations.

    We had some medical projects going on at the time, too, with detectors that scan for radioactive tracers used to map tissues and organs with positron emission tomography.

    From gamma ray detectors, I then moved to X-rays, and I began working on the earliest detectors for LCLS.

    How do you explain your job to someone outside the X-ray science community?

    I say, “There are three ingredients for an experiment—the source, the sample and the detector.”

    You need a source of light that illuminates your sample, which is the problem you want to solve or investigate. To understand what is happening, you have to be able to see the signal produced by the light as it interacts with the sample. That’s where the detector comes in. For us, the detector is like the “eyes” of the experimental set-up.

    What do you like most about your work?


    There’s always a way we can help researchers optimize their experiments, tweak some settings, do more analysis and correction.

    This is important because scientists are going to encounter a lot of different types of detectors if they work at various X-ray facilities.

    I like to have input from people who are running the experiments. Because I did experiments myself as a graduate student, I’m very sensitive to whether a system is user-friendly. If you don’t make something that researchers can take the best advantage of, then you didn’t do your job fully.

    And detectors are never perfect, no matter which one you buy or build.

    There are a lot of people who have to come together to make a detector system. It’s not one person’s work. It’s many, many people with lots of different expertise. You need to have lots of good interpersonal skills.

    What are some of the challenges of creating detectors for femtosecond science?

    In more traditional X-ray sources the photons arrive distributed over time, one after the other, but when you work with ultrafast laser pulses like the ones from LCLS, all your information about a sample arrives in a few femtoseconds. Your detector has to digest this entire signal at once, process the information and send it out before another pulse comes. This requires deep understanding of the detector physics and needs careful engineering. You need to optimize the whole signal chain from the sensor to the readout electronics to the data transmission.

    We also have mechanical challenges because we have to operate in very unusual conditions: intense optical lasers, injectors with gas and liquids, etc. In many cases we need to use special filters to protect the detectors from these sources of contamination.

And often, you work in vacuum. “Soft” or low-energy X-rays are absorbed very quickly in air, so your entire system has to be vacuum-compatible. With many of our substantial electronics, this requires some care.

    So there are lots of things to take into account. Those are just a few examples. It’s very complicated and can vary quite a bit from experiment to experiment.

    Is there a new project you are really excited about?

    All of LCLS-II—this fills my life! We’re coming up with new ideas and new technologies for SLAC’s next X-ray laser, which will have a higher firing rate—up to a million pulses per second. For me, this is a multidimensional puzzle. Every science case and every instrument has its own needs and we have to find a route through the many options and often-competing parameters to achieve our goals.

    X-ray free-electron lasers are a big driver for detector development. Ten years ago, no one would have talked about X-ray cameras delivering 10,000 pictures per second. The new X-ray lasers are really a game-changer in developing detectors for photon science, because they require detectors that are just not readily available.
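
To get a feeling for why a million pulses per second is such a jump for detectors, a back-of-envelope data-rate estimate helps. All the numbers below (pixel count, bit depth, frame rate) are assumptions chosen only to illustrate the scaling, not LCLS-II specifications.

# Hypothetical detector parameters, for illustration only
n_pixels       = 1_000_000      # a 1-megapixel sensor
bytes_per_pix  = 2              # 16-bit samples
frames_per_sec = 1_000_000      # one frame per X-ray pulse at 1 MHz

raw_rate = n_pixels * bytes_per_pix * frames_per_sec   # bytes per second
print(f"raw data rate: {raw_rate / 1e12:.1f} TB/s")    # ~2 TB/s before any reduction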

    LCLS-II will be challenging, but it’s exciting. For me, it’s thinking about what we can do now for the very first day of operation. And while doing that, we need to keep pushing the limits of what we have to do next to take full advantage of our new machine.


    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

SLAC is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. Located in Menlo Park, California, SLAC is operated by Stanford University for the DOE’s Office of Science.

     
  • richardmitnick 1:13 pm on April 16, 2017 Permalink | Reply
Tags: Accelerator Science

    From Nature: “Muons’ big moment could fuel new physics” 


    11 April 2017
    Elizabeth Gibney

    The Muon g-2 experiment will look for deviations from the standard model by measuring how muons wobble in a magnetic field. Credit: FNAL

    In the search for new physics, experiments based on high-energy collisions inside massive atom smashers are coming up empty-handed. So physicists are putting their faith in more-precise methods: less crash-and-grab and more watching-ways-of-wobbling. Next month, researchers in the United States will turn on one such experiment. It will make a super-accurate measurement of the way that muons, heavy cousins of electrons, behave in a magnetic field. And it could provide evidence of the existence of entirely new particles.

The particles hunted by the new experiment, at the Fermi National Accelerator Laboratory (Fermilab) in Batavia, Illinois, comprise part of the virtual soup that surrounds and interacts with all forms of matter. Quantum theory says that short-lived virtual particles constantly ‘blip’ in and out of existence. Physicists already account for the effects of known virtual particles, such as photons and quarks. But the virtual soup might have mysterious, and as yet unidentified, ingredients. And muons could be particularly sensitive to them.

    The new Muon g−2 experiment will measure this sensitivity with unparalleled precision. And in doing so, it will reanalyse a muon anomaly that has puzzled physicists for more than a decade. If the experiment confirms that the anomaly is real, then the most likely explanation is that it is caused by virtual particles that do not appear in the existing physics playbook — the standard model.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    Adapted from go.nature.com/2naoxaw

    “It would be the first direct evidence of not only physics beyond the standard model, but of entirely new particles,” says Dominik Stöckinger, a theorist at the Technical University of Dresden, Germany, and a member of the Muon g−2 collaboration.

    Physicists are crying out for a successor to the standard model — a theory that has been fantastically successful yet is known to be incomplete because it fails to account for many phenomena, such as the existence of dark matter. Experiments at the Large Hadron Collider (LHC) at CERN, Europe’s particle-physics lab near Geneva, Switzerland, have not revealed a specific chink, despite performing above expectation and carrying out hundreds of searches for physics beyond the standard model. The muon anomaly is one of only a handful of leads that physicists have.

    Measurements of the muon’s magnetic moment — a fundamental property that relates to the particle’s inherent magnetism — could hold the key, because it is tweaked by interactions with virtual particles. When last measured 15 years ago at the Brookhaven National Laboratory in New York, the muon’s magnetic moment was larger than theory predicts.


    FNAL G-2 magnet from Brookhaven Lab finds a new home in the FNAL Muon G-2 experiment

    Physicists think that interaction with unknown particles, perhaps those envisaged by a theory called supersymmetry, might have caused this anomaly.

Other possible explanations are a statistical fluke, or a flaw in the theorists’ standard-model calculation, which combines the complex effects of known particles. But that is becoming less likely, says Stöckinger, who notes that new calculation methods and experimental cross-checks make the theoretical side much more robust than it was 15 years ago.

    “With this tantalizing result from Brookhaven, you really have to do a better experiment,” says Lee Roberts, a physicist at Boston University in Massachusetts, who is joint leader of the Muon g−2 experiment. The Fermilab set-up will use 20 times the number of muons used in the Brookhaven experiment to shrink uncertainty by a factor of 4. “If we agree, but with much smaller error, that will show definitively that there’s some particle that hasn’t been observed anywhere else,” he says.
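
The "factor of 4" follows from simple counting statistics: if the uncertainty is dominated by the number of recorded muons N, it shrinks roughly like 1/sqrt(N). A one-line sanity check (ignoring systematic uncertainties, which the real experiment must also control):

n_ratio = 20           # Fermilab plans roughly 20x the Brookhaven muon sample
print(n_ratio ** 0.5)  # ~4.5, i.e. roughly the quoted factor-of-4 improvement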

    To probe the muons, Fermilab physicists will inject the particles into a magnetic field contained in a ring some 14 metres across. Each particle has a magnetic property called spin, which is analogous to Earth spinning on its axis. As the muons travel around the ring at close to the speed of light, their axes of rotation wobble in the field, like off-kilter spinning tops. Combining this precession rate with a measurement of the magnetic field gives the particles’ magnetic moment.
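
In equations, the measurement rests on the difference between the spin-precession and cyclotron frequencies. Neglecting the electric-field and beam-pitch corrections applied in the real experiment, the anomalous precession frequency is

\omega_a = \omega_s - \omega_c = \frac{g-2}{2}\,\frac{eB}{m_\mu} = a_\mu \frac{eB}{m_\mu},

so measuring \omega_a together with the magnetic field B gives the anomaly a_\mu = (g-2)/2 directly.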

    Since the Brookhaven result, some popular explanations for the anomaly — including effects of hypothetical dark photons — seem to have been ruled out by other experiments, says Stöckinger. “But if you look at the whole range of scenarios for physics beyond the standard model, there are many possibilities.”

    Fermilab is the home of the Muon g−2 experiment.

    Although a positive result would give little indication of exactly what the new particles are, it would provide clues to how other experiments might pin them down. If the relatively large Brookhaven discrepancy is maintained, it can only come from relatively light particles, which should be within reach of the LHC, says Stöckinger, even if they interact so rarely that it takes years for them to emerge.

    Indeed, the desire to build on previous findings is so strong that to avoid possible bias, Fermilab experimenters will process their incoming results ‘blind’ and apply a different offset to each of two measurements that combine to give the magnetic moment. Only once the offsets are revealed will anyone know whether they have proof of new particles hiding in the quantum soup. “Until then nobody knows what the answer is,” says Roberts. “It will be an exciting moment.”

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Nature is a weekly international journal publishing the finest peer-reviewed research in all fields of science and technology on the basis of its originality, importance, interdisciplinary interest, timeliness, accessibility, elegance and surprising conclusions. Nature also provides rapid, authoritative, insightful and arresting news and interpretation of topical and coming trends affecting science, scientists and the wider public.

     
  • richardmitnick 1:25 pm on April 14, 2017 Permalink | Reply
Tags: Accelerator Science

    From Ethan Siegel: “Can muons — which live for microseconds — save experimental particle physics?” 

    Ethan Siegel

    Apr 14, 2017

    You lose whether you use protons or electrons in your collider, for different reasons. Could the unstable muon solve both problems?

    A four-muon candidate event in the ATLAS detector at the Large Hadron Collider. The muon/anti-muon tracks are highlighted in red, as the long-lived muons travel farther than any other unstable particle. Image credit: ATLAS Collaboration / CERN.

    “It does not matter how slowly you go as long as you do not stop.” -Confucius

    High-energy physics is facing its greatest crisis ever. The Standard Model is complete, as all the particles our most successful physics theories have predicted have been discovered.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    The Large Hadron Collider at CERN, the most energetic particle collider ever developed (with more than six times the energies of any prior collider), discovered the long-sought-after Higgs boson, but nothing else.


    Traditionally, the way to discover new particles has been to go to higher energies with one of two strategies:

    Collide electrons and positrons, getting a “clean” signal where 100% of the collider energy goes into producing new particles.
    Collide protons and either anti-protons or other protons, getting a messy signal but reaching higher energies due to the heavier mass of the proton.

    Both methods have their limitations, but one unstable particle might give us a third option to make the elusive breakthrough we desperately need: the muon.

    The known particles in the Standard Model. These are all the fundamental particles that have been directly discovered. Image credit: E. Siegel.

    The Standard Model is made up of all the fundamental particles and antiparticles we’ve ever discovered. They include six quarks and antiquarks, each in three colors, three charged leptons and three types of neutrino, along with their antiparticle counterparts, and the bosons: the photon, the weak bosons (W+, W-, Z0), the eight gluons (with color/anticolor combinations attached), and the Higgs boson. While countless different combinations of these particles exist in nature, only a precious few are stable. The electron, photon, proton (made of two up and one down quark), and, if they’re bound together in nuclei, the neutron (with two down and one up quark) are stable, along with their antimatter counterparts. That’s why all the normal matter we see in the Universe is made up of protons, neutrons, and electrons; nothing else with any significant interactions is stable.

    While many unstable particles, both fundamental and composite, can be produced in particle physics, only protons, neutrons (bound in nuclei) and the electron are stable, along with their antimatter counterparts and the photon. Everything else is short-lived. Image credit: Contemporary Physics Education Project (CPEP), U.S. Department of Energy / NSF / LBNL.

The way you create these unstable particles is by colliding the stable ones together at high enough energies. Because of a fundamental principle of nature — mass/energy equivalence, given by Einstein’s E = mc² — you can turn pure energy into mass if you have enough of it. (So long as you obey all the other conservation laws.) This is exactly the way we’ve created almost all the other particles of the Standard Model: by colliding particles into one another at enough energy that the energy you get out (E) is high enough to create the new particles (of mass m) you’re attempting to discover.

    The particle tracks emanating from a high energy collision at the LHC in 2014 show the creation of many new particles. It’s only because of the high-energy nature of this collision that new masses can be created.

    We know there are almost certainly more particles beyond the ones we’ve discovered; we expect there to be particle explanations for mysteries like the baryon asymmetry (why there’s more matter than antimatter), the missing mass problem in the Universe (what we suspect will be solved by dark matter), the neutrino mass problem (why they’re so incredibly light), the quantum nature of gravity (i.e., there should be a force-carrying particle for the gravitational interaction, like the graviton), and the strong-CP problem (why certain decays don’t happen), among others. But our colliders haven’t reached the energies necessary to uncover those new particles, if they even exist. What’s even worse: both of the current methods have severe drawbacks that may prohibit us from building colliders that go to significantly higher energies.

    The Large Hadron Collider is the current record-holder, accelerating protons up to energies of 6.5 TeV apiece before smashing them together. The energy you can reach is directly proportional to two things only: the radius of your accelerator (R) and the strength of the magnetic field used to bend the protons into a circle (B). Collide those two protons together, and they hit with an energy of 13 TeV. But you’ll never make a 13 TeV particle colliding two protons at the LHC; only a fraction of that energy is available to create new particles via E = mc². The reason? A proton is made of multiple, composite particles — quarks, gluons, and even quark/antiquark pairs inside — meaning that only a tiny fraction of that energy goes into making new, massive particles.
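
The "radius times magnetic field" statement can be made quantitative with the standard bending relation p[GeV/c] ≈ 0.3 · B[T] · R[m] for a singly charged particle. The sketch below plugs in publicly quoted approximate LHC numbers (bending radius ≈ 2.8 km, dipole field ≈ 7.7 T during the 6.5 TeV runs); treat them as round figures rather than machine specifications.

def max_momentum_gev(b_tesla, bending_radius_m):
    # p [GeV/c] ~ 0.2998 * B [T] * R [m] for a particle of unit charge
    return 0.2998 * b_tesla * bending_radius_m

print(max_momentum_gev(7.74, 2804))   # ~6500 GeV, i.e. 6.5 TeV per proton beam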

    A candidate Higgs event in the ATLAS detector. Note how even with the clear signatures and transverse tracks, there is a shower of other particles; this is due to the fact that protons are composite particles. Image credit: The ATLAS collaboration / CERN.


    You might think to use fundamental particles instead, then, like electrons and positrons. If you were to put them in the same ring (with the same R) and subject them to the same magnetic field (the same B), you might think you could reach the same energies, only this time, 100% of the energy could make new particles. And that would be true, if it weren’t for one factor: synchrotron radiation. You see, when you accelerate a charged particle in a magnetic field, it gives off radiation. Because a proton is so massive compared to its electric charge, that radiation is negligible, and you can take protons up to the highest energies we’ve ever reached without worrying about it. But electrons and positrons are only 1/1836th of a proton’s mass, and synchrotron radiation would limit them to only about 0.114 TeV of energy under the same conditions.
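
The 1/1836 mass ratio enters the synchrotron-radiation budget to the fourth power: the energy radiated per turn scales as E⁴/(m⁴ρ). Here is a rough sketch using the textbook loss-per-turn formula; the bending radius is again taken as roughly the LHC's ~2.8 km, and the outputs are order-of-magnitude illustrations, not machine numbers.

def loss_per_turn_gev(energy_gev, mass_gev, bending_radius_m):
    # Synchrotron energy loss per turn; 8.85e-5 is the standard electron-value
    # prefactor, rescaled by (m_e / m)^4 for other particle masses.
    m_e = 0.000511
    return 8.85e-5 * energy_gev**4 / bending_radius_m * (m_e / mass_gev) ** 4

rho = 2804.0   # assumed bending radius in metres (roughly LHC-like)
print(loss_per_turn_gev(114.0, 0.000511, rho))   # electrons at ~0.114 TeV: ~5 GeV radiated per turn
print(loss_per_turn_gev(6500.0, 0.938, rho))     # protons at 6.5 TeV: ~5e-6 GeV, i.e. only a few keV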

    Relativistic electrons and positrons can be accelerated to very high speeds, but will emit synchrotron radiation (blue) at high enough energies, preventing them from moving faster. Image credit: Chung-Li Dong, Jinghua Guo, Yang-Yuan Chen, and Chang Ching-Lin, ‘Soft-x-ray spectroscopy probes nanomaterial-based devices’.

    But there’s a third option that’s never been put into practice: use muons and anti-muons. A muon is just like an electron in the sense that it’s a fundamental particle, it’s charged, it’s a lepton, but it’s 206 times heavier than the electron. This is massive enough that synchrotron radiation doesn’t matter for muons or anti-muons, which is great! The only downside? The muon is unstable, with a mean lifetime of only 2.2 microseconds before decaying away.

    The prototype MICE 201-megahertz RF module, with the copper cavity mounted, is shown during assembly at Fermilab. This apparatus could focus and collimate a muon beam, enabling the muons to be accelerated and survive for much longer than 2.2 microseconds. Image credit: Y. Torun / IIT / Fermilab Today.

    That might be okay, though, because special relativity can rescue us! When you bring an unstable particle close to the speed of light, the amount of time that it lives increases dramatically, thanks to the relativistic phenomenon of time dilation. If you brought a muon all the way up to 6.5 TeV of energy, it would live for 135,000 microseconds: enough time to circle the Large Hadron Collider 1,500 times before decaying away. And this time, your hopes would be absolutely true: 100% of that energy, 6.5 TeV + 6.5 TeV = 13 TeV, would be available for particle creation.
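
The 135,000-microsecond figure is straightforward special relativity, and it is a nice sanity check to reproduce it. The muon mass and lifetime below are standard values; the ring circumference is assumed to be roughly the LHC's 26.7 km.

muon_mass_gev   = 0.1057        # muon rest mass
muon_lifetime_s = 2.2e-6        # mean lifetime at rest
energy_gev      = 6500.0        # 6.5 TeV beam energy
ring_length_m   = 26_700.0      # assumed circumference, roughly LHC-sized

gamma = energy_gev / muon_mass_gev              # time-dilation factor, ~61,000
lab_lifetime_s = gamma * muon_lifetime_s        # ~0.135 s = ~135,000 microseconds
turns = lab_lifetime_s * 3.0e8 / ring_length_m  # turns completed before decay, ~1,500
print(gamma, lab_lifetime_s, turns)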

    A design plan for a full-scale muon-antimuon collider at Fermilab, the source of the world’s second-most powerful particle accelerator. Image credit: Fermilab.

    We can always build a bigger ring or invent stronger magnets, and we may well do exactly that. But there’s no cure for synchrotron radiation except to use heavier particles, and there’s no cure for energy spreading out among the components of composite particles other than not to use them at all. Muons are unstable and difficult to keep alive for a long time, but as we get to higher and higher energies, that task gets progressively easier. Muon colliders have long been touted as a mere pipe dream, but recent progress by the MICE collaboration — for Muon Ionization Cooling Experiment — has demonstrated that this may be possible after all. A circular muon/anti-muon collider may be the particle accelerator that takes us beyond the LHC’s reach, and, if we’re lucky, into the realm of the new physics we’re so desperately seeking.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    “Starts With A Bang! is a blog/video blog about cosmology, physics, astronomy, and anything else I find interesting enough to write about. I am a firm believer that the highest good in life is learning, and the greatest evil is willful ignorance. The goal of everything on this site is to help inform you about our world, how we came to be here, and to understand how it all works. As I write these pages for you, I hope to not only explain to you what we know, think, and believe, but how we know it, and why we draw the conclusions we do. It is my hope that you find this interesting, informative, and accessible,” says Ethan

     
  • richardmitnick 11:32 am on April 6, 2017 Permalink | Reply
Tags: Accelerator Science, Improving our understanding of photon pairs

    From CERN ATLAS: “Improving our understanding of photon pairs” 


    5th April 2017
    ATLAS Collaboration

    Figure 1: The measured differential cross section as a function of the invariant mass of the photon pair is compared to predictions from four theoretical computations. The invariant mass is often the most scrutinized distribution when searching for new physics. (Image: ATLAS Collaboration/CERN)

    High-energy photon pairs at the LHC are famous for two things. First, as a clean decay channel of the Higgs boson. Second, for triggering some lively discussions in the scientific community in late 2015, when a modest excess above Standard Model predictions was observed by the ATLAS and CMS collaborations. When the much larger 2016 dataset was analysed, however, no excess was observed.

    Yet most photon pairs produced at the LHC do not originate from the decay of a Higgs boson (or a new, undiscovered particle). Instead, more than 99% are from rather simple interactions between the proton constituents, such as quark-antiquark annihilation. ATLAS physicists have put significant effort into improving our understanding of these Standard Model processes.

    ATLAS has released a new measurement of the inclusive di-photon cross section based on the full 2012 proton-proton collision dataset recorded at a centre-of-mass energy of 8 TeV. The precision is increased by a factor of two compared to the previous ATLAS measurement (based on the smaller 2011 data sample recorded at 7 TeV), such that the total experimental uncertainty is now typically 5%.

According to the theory of strong interactions, the production rate of such Standard Model processes is sensitive to both high-order perturbative terms (more complex particle interactions involving quantum fluctuations) and the dynamics of additional low-energy particles emitted during the scattering process. Theoretical predictions are thus currently precise only at the 10% level. Calculations based on a fixed number of perturbative terms in the series expansion (next-to-leading order and next-to-next-to-leading order in the strong coupling strength) underestimate the data beyond the projected theoretical uncertainties.

    Figure 2: The measured differential cross section as a function of the φ* variable is compared to predictions from four theoretical computations. The low φ* region is most sensitive to the dynamics of additional low-energy particles emitted during the scattering process. (Image: ATLAS Collaboration/CERN)
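
For readers who want to connect the two observables in Figures 1 and 2 to photon four-momenta, both can be computed from the transverse momenta, pseudorapidities and azimuthal angles of the two photons. The sketch below uses the standard definitions; the photon kinematics are invented for illustration, and the φ* definition shown is the one used in Drell-Yan-style analyses, which this measurement adopts — treat that attribution as my reading rather than a quote from the paper.

import math

def diphoton_mass_and_phistar(pt1, eta1, phi1, pt2, eta2, phi2):
    # Invariant mass of two massless photons: m^2 = 2 pT1 pT2 (cosh(deta) - cos(dphi))
    deta = eta1 - eta2
    dphi = (phi1 - phi2 + math.pi) % (2 * math.pi) - math.pi
    mass = math.sqrt(2.0 * pt1 * pt2 * (math.cosh(deta) - math.cos(dphi)))
    # phi*_eta = tan(phi_acop / 2) * sin(theta*_eta), with phi_acop = pi - |dphi|
    # and cos(theta*_eta) = tanh(deta / 2)
    phi_acop = math.pi - abs(dphi)
    sin_theta_star = math.sqrt(1.0 - math.tanh(deta / 2.0) ** 2)
    phi_star = math.tan(phi_acop / 2.0) * sin_theta_star
    return mass, phi_star

# Hypothetical photon kinematics (GeV, pseudorapidity, radians), for illustration only
print(diphoton_mass_and_phistar(62.0, 0.40, 0.10, 58.0, -0.30, 3.05))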

    In the new ATLAS result, the distortion in the photon pair production rate originating from the emission of low-energy particles has been probed very precisely thanks to the study of two new observables. By accurately modelling the additional emission, the predictions are found to agree with the data in the sensitive regions.

    These results provide crucial information for both experimentalists and theorists on the dynamics of the strong interaction at the LHC, and should lead to improved Standard Model predictions of di-photon processes.

    Links:

Measurements of integrated and differential cross sections for isolated photon pair production in pp collisions at 8 TeV with the ATLAS detector.

    See the full article here .


    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     