Tagged: Accelerator Science

  • richardmitnick 4:25 pm on December 4, 2019 Permalink | Reply
    Tags: "Discovering the Top Quark", Accelerator Science, FNAL Tevatron CDF

    From particlebites: “Discovering the Top Quark” 

    From particlebites

    December 3, 2019
    Adam Green

    This post is about the discovery of the most massive quark in the Standard Model, the Top quark. Below is a “discovery plot” [1] from the Collider Detector at Fermilab collaboration (CDF). Here is the original paper [Physical Review Letters].

    FNAL/Tevatron CDF detector

    FNAL/Tevatron tunnel

    FNAL/Tevatron map

    This plot confirms the existence of the Top quark. Let’s understand how.

    For each proton collision event that passes certain selection conditions, the horizontal axis shows the best estimate of the Top quark mass. These selection conditions encode the particle “fingerprint” of the Top quark: out of all possible proton collision events, we only want to look at ones that could have come from Top quark decays. Each event in this subgroup yields a best guess at the Top quark mass, and this is what is plotted on the x axis.

    On the vertical axis is the number of these events.

    The dashed distribution is the number of these events originating from the Top quark if the Top quark exists and decays this way. This could very well not be the case.

    The dotted distribution is the background for these events, events that did not come from Top quark decays.

    The solid distribution is the measured data.

    To claim a discovery, the background (dotted) plus the signal (dashed) should match the measured data (solid). We can run simulations for different Top quark masses to generate signal distributions until we find one that matches the data. The inset at the top right shows that a Top quark mass of 175 GeV best reproduces the measured data.
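This matching procedure can be sketched as a toy template fit: build a predicted signal shape for each trial mass, add the background, and keep the trial mass whose prediction best matches the data. Everything below (shapes, yields, binning) is purely illustrative, not the CDF analysis.

```python
import numpy as np

rng = np.random.default_rng(0)
bins = np.linspace(80, 280, 26)            # reconstructed top-mass bins, GeV
centers = 0.5 * (bins[:-1] + bins[1:])

def template(mass, n_sig=3000, n_bkg=4000, width=25.0):
    """Expected counts per bin: a Gaussian signal peak at `mass`
    plus a smoothly falling background (all shapes illustrative)."""
    sig = np.exp(-0.5 * ((centers - mass) / width) ** 2)
    bkg = np.exp(-centers / 100.0)
    return n_sig * sig / sig.sum() + n_bkg * bkg / bkg.sum()

# Pseudo-data drawn from a 175 GeV "true" top mass
data = rng.poisson(template(175.0))

# Scan trial masses and keep the template that best matches the data
trial_masses = np.arange(150.0, 201.0, 5.0)
chi2 = [np.sum((data - template(m)) ** 2 / template(m)) for m in trial_masses]
best = trial_masses[int(np.argmin(chi2))]
print(f"best-fit top mass: {best:.0f} GeV")
```

With sufficient statistics the scan lands on the template closest to the true mass, which is the same logic behind the 175 GeV inset in the discovery plot.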

    Taking a step back from the technicalities, the Top quark is special because it is the heaviest of all the fundamental particles. In the Standard Model, particles acquire their mass by interacting with the Higgs. Particles with more mass interact more strongly with the Higgs. The Top quark being so heavy is an indicator that any new physics involving the Higgs may be linked to the Top quark.

    References / Further Reading

    [1] – Observation of Top Quark Production in pp Collisions with the Collider Detector at Fermilab – This is the “discovery paper” announcing experimental evidence of the Top.

    [2] – Observation of tt̄H Production [Physical Review Letters] – Who is to say that the Top and the Higgs even have significant interactions at lowest order? The CMS collaboration finds evidence that they do in fact interact at “tree level.”

    [3] – The Perfect Couple: Higgs and top quark spotted together – This article further describes the interconnection between the Higgs and the Top.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    What is ParticleBites?

    ParticleBites is an online particle physics journal club written by graduate students and postdocs. Each post presents an interesting paper in a brief format that is accessible to undergraduate students in the physical sciences who are interested in active research.

    The papers are accessible on the arXiv preprint server. Most of our posts are based on papers from hep-ph (high energy phenomenology) and hep-ex (high energy experiment).

    Why read ParticleBites?

    Reading a technical paper from an unfamiliar subfield is intimidating. It may not be obvious how the techniques used by the researchers really work or what role the new research plays in answering the bigger questions motivating that field, not to mention the obscure jargon! For most people, it takes years for scientific papers to become meaningful.

    Our goal is to solve this problem, one paper at a time. With each brief ParticleBite, you should not only learn about one interesting piece of current work, but also get a peek at the broader picture of research in particle physics.

    Who writes ParticleBites?

    ParticleBites is written and edited by graduate students and postdocs working in high energy physics. Feel free to contact us if you’re interested in applying to write for ParticleBites.

    ParticleBites was founded in 2013 by Flip Tanedo following the Communicating Science (ComSciCon) 2013 workshop.

    Flip Tanedo, UCI Chancellor’s ADVANCE postdoctoral scholar in theoretical physics and, as of July 2016, an assistant professor of physics at the University of California, Riverside.

    It is now organized and directed by Flip and Julia Gonski, with ongoing guidance from Nathan Sanders.

     
  • richardmitnick 2:04 pm on December 1, 2019 Permalink | Reply
    Tags: "NA61/SHINE gives neutrino experiments a helping hand", Accelerator Science

    From CERN: “NA61/SHINE gives neutrino experiments a helping hand” 


    From CERN

    How particle measurements made by the NA61/SHINE experiment at CERN are helping neutrino experiments in the US and Japan

    Inside the NA61/SHINE experiment at CERN (Image: CERN)

    Neutrinos are the lightest of all the known particles that have mass. Yet their behaviour as they travel could help answer one of the greatest puzzles in physics: why the present-day universe is made mostly of matter when the Big Bang should have produced equal amounts of matter and antimatter. In two recent papers, the NA61/SHINE collaboration reports particle measurements that are crucial for accelerator-based experiments studying such neutrino behaviour.

    Neutrinos come in three types, or “flavours”, and neutrino experiments are measuring with ever increasing detail how they and their antimatter counterparts, antineutrinos, “oscillate” from one flavour to another while they travel. If it turns out that neutrinos and antineutrinos oscillate in a different way from one another, this may partially account for the present-day matter–antimatter imbalance.

    Accelerator-based neutrino experiments look for neutrino oscillations by producing a beam of neutrinos of one flavour and measuring the beam after it has travelled a long distance. The neutrino beams are typically produced by firing a beam of high-energy protons into long, thin carbon or beryllium targets. These proton–target interactions produce hadrons, such as pions and kaons, which are focused using magnetic aluminium horns and directed into long tunnels, in which they transform into neutrinos and other particles.
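The last step, hadrons transforming into neutrinos in the decay tunnel, is simple two-body kinematics. A minimal sketch for the dominant pion decay, using PDG mass values and the standard relativistic-limit formula (the 10 GeV input is just an example):

```python
M_PI = 0.13957   # charged-pion mass, GeV
M_MU = 0.10566   # muon mass, GeV

def forward_nu_energy(e_pi):
    """Maximum (forward) neutrino energy from pi+ -> mu+ nu decay in flight.

    In the pion rest frame E*_nu = (m_pi^2 - m_mu^2) / (2 m_pi); boosting
    along the flight direction gives E_nu = E_pi * (1 - m_mu^2 / m_pi^2)
    in the relativistic limit.
    """
    return e_pi * (1.0 - (M_MU / M_PI) ** 2)

print(forward_nu_energy(10.0))  # -> ~4.27 GeV from a 10 GeV pion
```

This is why the neutrino flux inherits its energy spectrum directly from the hadron spectrum: each forward-decaying pion hands roughly 43% of its energy to the neutrino.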

    To get a reliable measurement of the neutrino oscillations, the researchers working on these experiments need to estimate the number of neutrinos in the beam before oscillation and how this number varies with the energy of the particles. Estimating this “neutrino flux” is hard, because neutrinos interact very weakly with other particles and cannot be measured easily. To get around this, researchers estimate instead the number of hadrons. But measuring the number of hadrons is also challenging, because there are too many of them to measure precisely.

    This is where experiments such as NA61/SHINE at CERN’s Super Proton Synchrotron come in. NA61/SHINE can reproduce the proton–target interactions that generate the hadrons that transform into neutrinos. It can also reproduce subsequent interactions that protons and hadrons undergo in the targets and focusing horns. These subsequent interactions can produce additional neutrino-yielding hadrons.

    The NA61/SHINE collaboration has previously measured hadrons produced by protons at a beam momentum of 31 GeV/c (where c is the speed of light) to help predict the neutrino flux in the Tokai-to-Kamioka (T2K) neutrino-oscillation experiment in Japan. The collaboration has also been gathering data at beam momenta of 60 and 120 GeV/c to benefit the MINERνA, NOνA and DUNE experiments at Fermilab in the US. The analysis of these datasets is progressing well and has most recently led to two papers: one describing measurements of interactions of protons with carbon, beryllium and aluminium, and another reporting measurements of interactions of pions with carbon and beryllium.

    “These results are crucial for Fermilab’s neutrino experiments,” says Laura Fields, an NA61/SHINE collaboration member and co-spokesperson for MINERνA. “To predict the neutrino fluxes for these experiments, researchers need an extremely detailed simulation of the entire beamline and all of the interactions that happen within it. For that simulation we need to know the probability that each type of interaction will happen, the particles that will be produced, and their properties. So interaction measurements such as the latest ones will be vital to make these simulations much more accurate,” she explains.

    “Looking into the future, NA61/SHINE will focus on measurements for the next generation of neutrino-oscillation experiments, including DUNE and T2HK in Japan, to enable these experiments to produce high-precision results in neutrino physics,” Fields concludes.

    See also this Experimental Physics newsletter article.

    See the full article here.



    Meet CERN in a variety of places:

    Quantum Diaries

    Cern Courier

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS

    CERN ATLAS Image Claudia Marcelloni CERN/ATLAS


    ALICE

    CERN/ALICE Detector


    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    LHC

    CERN map

    CERN LHC Tunnel

    CERN LHC particles

     
  • richardmitnick 12:27 pm on November 27, 2019 Permalink | Reply
    Tags: "The plot thickens for a hypothetical “X17” particle", Accelerator Science, Additional evidence of an unknown particle from a Hungarian lab

    From CERN: “The plot thickens for a hypothetical “X17” particle” 


    From CERN

    27 November, 2019
    Ana Lopes

    Additional evidence of an unknown particle from a Hungarian lab gives a new impetus to NA64 searches.

    The NA64 experiment at CERN (Image: CERN)

    Fresh evidence of an unknown particle that could carry a fifth force of nature gives the NA64 collaboration at CERN a new incentive to continue searches.

    In 2015, a team of scientists spotted [Physical Review Letters] an unexpected glitch, or “anomaly”, in a nuclear transition that could be explained by the production of an unknown particle. About a year later, theorists suggested [Physical Review Letters] that the new particle could be evidence of a new fundamental force of nature, in addition to electromagnetism, gravity and the strong and weak forces. The findings caught worldwide attention and prompted, among other studies, a direct search [Physical Review Letters] for the particle by the NA64 collaboration at CERN.

    A new paper from the same team, led by Attila Krasznahorkay at the Atomki institute in Hungary, now reports another anomaly, in a similar nuclear transition, that could also be explained by the same hypothetical particle.

    The first anomaly spotted by Krasznahorkay’s team was seen in a transition of beryllium-8 nuclei. This transition emits a high-energy virtual photon that transforms into an electron and its antimatter counterpart, a positron. Examining the number of electron–positron pairs at different angles of separation, the researchers found an unexpected surplus of pairs at a separation angle of about 140°. In contrast, theory predicts that the number of pairs decreases with increasing separation angle, with no excess at any particular angle. Krasznahorkay and colleagues reasoned that the excess could be explained by the production of a new particle with a mass of about 17 million electronvolts (MeV), the “X17” particle, which would transform into an electron–positron pair.
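The 17 MeV figure follows from the pair kinematics. Neglecting the electron mass, the invariant mass of the pair satisfies m² = 2·E₁·E₂·(1 − cos θ). A small illustrative check, assuming the roughly 18 MeV released in the beryllium-8 transition is shared symmetrically between the electron and positron:

```python
import math

def pair_mass(e1, e2, theta_deg):
    """Invariant mass of an e+e- pair (MeV), neglecting the electron mass:
    m^2 = 2 * E1 * E2 * (1 - cos(theta))."""
    theta = math.radians(theta_deg)
    return math.sqrt(2.0 * e1 * e2 * (1.0 - math.cos(theta)))

# Symmetric ~9 MeV + ~9 MeV pair opening at the anomalous ~140 degrees
print(round(pair_mass(9.0, 9.0, 140.0), 1))  # -> 16.9 (MeV)
```

An excess piling up at one separation angle is therefore exactly what a new particle of fixed mass decaying to e⁺e⁻ would produce.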

    The latest anomaly reported by Krasznahorkay’s team, in a paper that has yet to be peer-reviewed, is also in the form of an excess of electron–positron pairs, but this time the excess is from a transition of helium-4 nuclei. “In this case, the excess occurs at an angle of about 115° but it can also be interpreted by the production of a particle with a mass of about 17 MeV,” explains Krasznahorkay. “The result lends support to our previous result and the possible existence of a new elementary particle,” he adds.

    Sergei Gninenko, spokesperson for the NA64 collaboration at CERN, which has not found signs of X17 in its direct search, says: “The Atomki anomalies could be due to an experimental effect, a nuclear physics effect or something completely new such as a new particle. To test the hypothesis that they are caused by a new particle, both a detailed theoretical analysis of the compatibility between the beryllium-8 and the helium-4 results as well as independent experimental confirmation is crucial.”

    The NA64 collaboration searches for X17 by firing a beam of tens of billions of electrons from the Super Proton Synchrotron accelerator onto a fixed target.

    The Super Proton Synchrotron (SPS), CERN’s second-largest accelerator

    If X17 did exist, the interactions between the electrons and nuclei in the target would sometimes produce this particle, which would then transform into an electron–positron pair. The collaboration has so far found no indication that such events took place, but its datasets allowed it to exclude part of the range of possible values for the strength of the interaction between X17 and an electron. The team is now upgrading its detector for the next round of searches, which are expected to be more challenging but at the same time more exciting, says Gninenko.

    Among other experiments that could also hunt for X17 in direct searches are the LHCb experiment and the recently approved FASER experiment, both at CERN.

    CERN/LHCb detector

    CERN FASER experiment schematic

    Jesse Thaler, a theoretical physicist from the Massachusetts Institute of Technology, says: “By 2023, the LHCb experiment should be able to make a definitive measurement to confirm or refute the interpretation of the Atomki anomalies as arising from a new fundamental force. In the meantime, experiments such as NA64 can continue to chip away at the possible values for the hypothetical particle’s properties, and every new analysis brings with it the possibility (however remote) of discovery.”

    See the full article here.




     
  • richardmitnick 4:10 pm on November 14, 2019 Permalink | Reply
    Tags: Accelerator Science

    From Fermi National Accelerator Lab: “Discovery of a new type of particle beam instability” 

    FNAL Art Image by Angela Gonzales

    From Fermi National Accelerator Lab, an enduring source of strength for the US contribution to scientific research worldwide.

    November 14, 2019
    Alexey Burov

    Accelerated, charged particle beams do what light does for microscopes: illuminate matter. The more intense the beams, the more easily scientists can examine the object they are looking at. But intensity comes with a cost: the more intense the beams, the more they become prone to instabilities.

    One type of instability occurs when the average energy of accelerated particles traveling through a circular machine reaches its transition value. The transition point occurs when the particles revolve around the ring at the same rate, even though they do not all carry the same energy — in fact, they exhibit a range of energies. The specific motion of the particles near the transition energy makes them extremely prone to collective instabilities.
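The energy dependence of the revolution rate is conventionally captured by the phase-slip factor, which changes sign exactly at transition. A minimal sketch; the transition gamma used here is only roughly that of the Fermilab Booster, and is an assumption of this example:

```python
def slip_factor(gamma, gamma_t):
    """Phase-slip factor eta = 1/gamma_t^2 - 1/gamma^2.

    Below transition (gamma < gamma_t) eta < 0: higher-energy particles
    complete a turn faster; above transition eta > 0: they complete it
    slower. At gamma == gamma_t the revolution rate no longer depends
    on energy, which is what makes the beam fragile there.
    """
    return 1.0 / gamma_t**2 - 1.0 / gamma**2

GAMMA_T = 5.48  # illustrative transition gamma, roughly the Fermilab Booster's
for g in (3.0, GAMMA_T, 9.0):
    print(f"gamma = {g:5.2f}  eta = {slip_factor(g, GAMMA_T):+.4f}")
```

Because eta vanishes at transition, the usual energy-dependent spread in revolution frequency that damps collective motion disappears, and the beam becomes extremely prone to instabilities.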

    These particular instabilities had been observed for decades, but they were not sufficiently understood; in fact, they were misinterpreted. In a paper published this year, I suggest a new theory about these instabilities. The application of this theory to the Fermilab Booster accelerator predicted the main features of the instability there at the transition crossing, suggesting better ways to suppress the instability. Recent measurements confirmed the predictions, and more detailed experimental beam studies are planned in the near future.

    Recent measurements at the Fermilab Booster accelerator confirmed the existence of a certain kind of particle beam instability. More measurements are planned for the near future to examine new methods proposed to mitigate it.

    Accelerating high-intensity beams is a crucial part of the Fermilab scientific program. A solid theoretical understanding of particle beam behavior equips experimentalists to better manipulate the accelerator parameters to suppress instability. This leads to the high-intensity beams needed for Fermilab’s experiments in fundamental physics. It is also useful for any experiment or institution operating circular accelerators.

    Beam protons talk to each other via electromagnetic fields, which come in two kinds. One is the Coulomb field. These fields are local and, by themselves, cannot drive instabilities. The second kind is the wake field. Wake fields are radiated by the particles and trail behind them, sometimes far behind.

    When a particle strays from the beam path, the wake field translates this departure backward — in the wake left by the particle. Even a small departure from the path may not escape being carried backward by these electromagnetic fields. If the beams are intense enough, their wakes can destabilize them.
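A classic way to see how a wake destabilizes a beam is the two-macroparticle picture: the leading particle oscillates freely, and its wake drives the trailing particle exactly on resonance, so the trailing amplitude grows without bound. The sketch below is illustrative only (all parameters are arbitrary, and this is not the model from the paper):

```python
# Two-macroparticle wake model: the leading particle performs a free
# betatron oscillation; the trailing particle feels a wake force
# proportional to the leader's offset. Driving an oscillator at its
# own frequency makes the trailing amplitude grow roughly linearly.
omega = 1.0      # betatron frequency (arbitrary units)
wake = 0.05      # wake coupling strength (illustrative)
dt = 0.001
steps = 60_000   # integrate to t = 60

x1, v1 = 1.0, 0.0   # leading particle: unit initial offset
x2, v2 = 0.0, 0.0   # trailing particle: starts on axis

for _ in range(steps):
    a1 = -omega**2 * x1
    a2 = -omega**2 * x2 + wake * x1   # wake drives the trailing particle
    v1 += a1 * dt; x1 += v1 * dt      # semi-implicit Euler (stable)
    v2 += a2 * dt; x2 += v2 * dt

amp2 = (x2**2 + (v2 / omega)**2) ** 0.5
print(f"trailing-particle amplitude at t = 60: {amp2:.2f}")
```

The leading particle's amplitude stays near 1 while the trailing particle's amplitude keeps growing; in a real bunch this resonant hand-off down the bunch is what lets an intense beam destabilize itself.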

    In the new theory, I suggested a compact mathematical model that effectively takes both sorts of fields into account, realizing that both of them are important when they are strong enough, as they typically are near transition energy.

    This kind of huge amplification happens at CERN’s Proton Synchrotron, for example, as I showed in a more recent paper submitted to Physical Review Accelerators and Beams. If not suppressed one way or another, this amplification may grow until the beam touches the vacuum chamber wall and is lost. Recent measurements at the Fermilab Booster confirmed the existence of a similar instability there; more measurements are planned for the near future to examine new methods proposed to mitigate it.

    These phenomena are called transverse convective instabilities, and the discovery of how they arise opens new doors to theoretical, numerical and experimental ways to better understand and deal with intense proton beams.

    This work is supported by the DOE Office of Science.

    Science paper:
    Convective instabilities of bunched beams with space charge
    Physical Review Accelerators and Beams

    See the full article here.



    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics and America’s premier laboratory for particle physics and accelerator research. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.

     
  • richardmitnick 2:23 pm on November 6, 2019 Permalink | Reply
    Tags: Accelerator Science, CERN Council selected Fabiola Gianotti as the Organization’s next Director-General for her second term of office

    From CERN: “CERN Council appoints Fabiola Gianotti for second term of office as CERN Director General” 


    From CERN

    6 November, 2019

    President of the CERN Council Ursula Bassler and Director-General of CERN Fabiola Gianotti (Image: CERN)

    At its 195th Session today, the CERN Council selected Fabiola Gianotti as the Organization’s next Director-General for her second term of office. The appointment will be formalised at the December Session of the Council, and Gianotti’s new five-year term of office will begin on 1 January 2021. This is the first time in CERN’s history that a Director-General has been appointed for a full second term.

    “I congratulate Fabiola Gianotti very warmly for her reappointment as Director-General for another five-year term of office. With her at the helm, CERN will continue to benefit from her strong leadership and experience, especially for important upcoming projects such as the High-Luminosity LHC, implementation of the European Strategy for Particle Physics, and the construction of the Science Gateway,” said President of the CERN Council, Ursula Bassler. “During her first term, she excelled in leading our diverse and international scientific organisation, becoming a role model, especially for women in science”.

    “I am deeply grateful to the CERN Council for their renewed trust. It is a great privilege and a huge responsibility,” said CERN Director-General, Fabiola Gianotti. “The following years will be crucial for laying the foundations of CERN’s future projects and I am honoured to have the opportunity to work with the CERN Member States, Associate Member States, other international partners and the worldwide particle physics community.”

    Gianotti has been CERN’s Director-General since 1 January 2016. She received her Ph.D. in experimental particle physics from the University of Milan in 1989 and has been a research physicist at CERN since 1994. She was the leader of the ATLAS collaboration from March 2009 to February 2013, including the period in which the LHC experiments ATLAS and CMS announced the discovery of the Higgs boson. The discovery was recognised in 2013 with the Nobel Prize in Physics being awarded to theorists François Englert and Peter Higgs. Gianotti is a member of many international committees and has received numerous prestigious awards. She was the first woman to become Director-General of CERN.

    See the full article here.




     
  • richardmitnick 10:17 am on October 9, 2019 Permalink | Reply
    Tags: Accelerator Science, CERN FCC Future Circular Collider, China Circular Electron Positron Collider (CEPC)

    From CERN Courier: “European strategy [for HEP] enters next phase” 


    From CERN Courier

    2 October 2019

    Matthew Chalmers, editor

    Physicists in Europe have published a 250-page “briefing book” to help map out the next major paths in fundamental exploration. Compiled by an expert physics-preparatory group set up by the CERN Council, the document is the result of an intense effort to capture the status and prospects for experiment, theory, accelerators, computing and other vital machinery of high-energy physics.

    Last year, the European Strategy Group (ESG) — which includes scientific delegates from CERN’s member and associate-member states, directors and representatives of major European laboratories and organisations and invitees from outside Europe — was tasked with formulating the next update of the European strategy for particle physics. Following a call for input in September 2018, which attracted 160 submissions, an open symposium was held in Granada, Spain, on 13-16 May at which more than 600 delegates discussed the potential merits and challenges of the proposed research programmes. The ESG briefing book distills input from the working groups and the Granada symposium to provide an objective scientific summary.

    “This document is the result of months of work by hundreds of people, and every effort has been made to objectively analyse the submitted inputs,” says ESG chair Halina Abramowicz of Tel Aviv University. “It does not take a position on the strategy process itself, or on individual projects, but rather is intended to represent the forward thinking of the community and be the main input to the drafting session in Germany in January.”

    Collider considerations

    An important element of the European strategy update is to consider which major collider should follow the LHC. The Granada symposium revealed there is clear support for an electron–positron collider to study the Higgs boson in greater detail, but four possible options at different stages of maturity exist: an International Linear Collider (ILC) in Japan, a Compact Linear Collider (CLIC) or Future Circular Collider (FCC-ee) at CERN, and a Circular Electron Positron Collider (CEPC) in China. The briefing book states that, in a global context, CLIC and FCC-ee are competing with the ILC and with CEPC. As Higgs factories, however, the report finds all four to have similar reach, albeit with different time schedules and with differing potentials for the study of physics topics at other energies.


    ILC schematic, being planned for the Kitakami highland, in the Iwate prefecture of northern Japan



    CLIC Collider annotated

    CERN FCC Future Circular Collider: details of the proposed 100 km-circumference successor to the LHC

    China Circular Electron Positron Collider (CEPC) map

    Also considered in depth are design studies in Europe for colliders that push the energy frontier, including a 3 TeV CLIC and a 100 TeV circular hadron collider (FCC-hh). The briefing book details the estimated timescales to develop some of these technologies, observing that the development of 16 T dipole magnets for FCC-hh will take a comparable time (about 20 years) to that projected for novel acceleration technologies such as plasma-wakefield techniques to reach conceptual designs.

    “The Granada symposium and the briefing book mention the urgent need for intensifying accelerator R&D, including that for muon colliders,” says Lenny Rivkin of Paul Scherrer Institut, who was co-convener of the chapter on accelerator science and technology. “Another important aspect of the strategy update is to recognize the potential impact of the development of accelerator and associated technology on the progress in other branches of science, such as astroparticle physics, cosmology and nuclear physics.”

    The bulk of the briefing book details the current physics landscape and prospects for progress, with chapters devoted to electroweak physics, strong interactions, flavour physics, neutrinos, cosmic messengers, physics beyond the Standard Model, and dark-sector exploration. A preceding chapter about theory emphasises the importance of keeping theoretical research in fundamental physics “free and diverse” and “not only limited to the goals of ongoing experimental projects”. It points to historical success stories such as Peter Higgs’ celebrated 1964 paper, which had the purely theoretical aim to show that Goldstone’s theorem is invalid for gauge theories at a time when applications to electroweak interactions were well beyond the horizon.

    “While an amazing amount of progress has been made in the past seven years since the Higgs boson discovery, our knowledge of the couplings of the Higgs boson to the W and Z and to third-generation charged fermions is quite imprecise, and the couplings of the Higgs boson to the other charged fermions and to itself are unmeasured,” says Beate Heinemann of DESY, who co-convened the report’s electroweak chapter. “The imperative to study this unique particle further derives from its special properties and the special role it might play in resolving some of the current puzzles of the universe, for example dark matter, the matter-antimatter asymmetry or the hierarchy problem.”

    Readers are reminded that the discovery of neutrino oscillations constitutes a “laboratory” proof of physics beyond the Standard Model. The briefing book also notes the significant role played by Europe, via CERN, in neutrino-experiment R&D since the last strategy update concluded in 2013. Flavour physics too should remain at the forefront of the European strategy, it argues, noting that the search for flavour and CP violation in the quark and lepton sectors at different energy frontiers “has a great potential to lead to new physics at moderate cost”. An independent determination of the proton structure is needed if present and future hadron colliders are to be turned into precision machines, reports the chapter on strong interactions, and a diverse global programme based on fixed-target experiments as well as dedicated electron-proton colliders is in place.

    Europe also has the opportunity to play a leading role in the searches for dark matter “by fully exploiting the opportunities offered by the CERN facilities, such as the SPS, the potential Beam Dump Facility, and the LHC itself, and by supporting the programme of searches for axions to be hosted at other European institutions”. The briefing book notes the strong complementarity between accelerator and astrophysical searches for dark matter, and the demand for deeper technology sharing between particle and astroparticle physics.

    Scientific diversity

    The diversity of the experimental physics programme is a strong feature of the strategy update. The briefing book lists outstanding puzzles that did not change in the post-Run 2 LHC era – such as the origin of electroweak symmetry breaking, the nature of the Higgs boson, the pattern of quark and lepton masses and the neutrino’s nature – that can also be investigated by smaller scale experiments at lower energies, as explored by CERN’s dedicated Physics Beyond Colliders initiative.

    Finally, in addressing the vital roles of detector & accelerator development, computing and instrumentation, the report acknowledges both the growing importance of energy efficiency and the risks posed by “the limited amount of success in attracting, developing and retaining instrumentation and computing experts”, urging that such activities be recognized correctly as fundamental research activities. The strong support in computing and infrastructure is also key to the success of the high-luminosity LHC which, the report states, will see “a very dynamic programme occupying a large fraction of the community” during the next two decades – including a determination of the couplings between the Higgs boson and Standard Model particles “at the percent level”.

    Following a drafting session to take place in Bad Honnef, Germany, on 20-24 January, the ESG is due to submit its recommendations for the approval of the CERN Council in May 2020 in Budapest, Hungary.

    “Now comes the most challenging part of the strategy update process: how to turn the exciting and well-motivated scientific proposals of the community into a viable and coherent strategy which will ensure progress and a bright future for particle physics in Europe,” says Abramowicz. “Its importance cannot be overestimated, coming at a time when the field faces several crossroads and decisions about how best to maintain progress in fundamental exploration, potentially for generations to come.”

    See the full article here.


    Please help promote STEM in your local schools.


    Stem Education Coalition

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS

    CERN/ATLAS detector

    ALICE

    CERN/ALICE Detector


    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    LHC

    CERN map

    CERN LHC Grand Tunnel

    CERN LHC particles

     
  • richardmitnick 2:41 pm on October 7, 2019 Permalink | Reply
    Tags: "Watching the top quark mass run", Accelerator Science

    From CERN CMS: “Watching the top quark mass run” 

    Cern New Bloc

    Cern New Particle Event

    CERN New Masthead

    From CERN CMS

    7 October 2019
    CMS Collaboration

    A candidate event for a top quark–antiquark pair recorded by the CMS detector. Such an event is expected to produce an electron (green), a muon (red) of opposite charge, two high-energy “jets” of particles (orange) and a large amount of missing energy (purple) (Image: CMS/CERN)

    For the first time, CMS physicists have investigated an effect called the “running” of the top quark mass, a fundamental quantum effect predicted by the Standard Model.

    Mass is one of the most complex concepts in fundamental physics, one that has gone through a long history of conceptual development. Mass was first understood in classical mechanics as a measure of inertia and was later interpreted in the theory of special relativity as a form of energy. Mass has a similar meaning in modern quantum field theories that describe the subatomic world. The Standard Model of particle physics is such a quantum field theory, and it can describe the interaction of all known fundamental particles at the energies of the Large Hadron Collider.

    Quantum Chromodynamics is the part of the Standard Model that describes the interactions of fundamental constituents of nuclear matter: quarks and gluons. The strength of the interaction between these particles depends on a fundamental parameter called the strong coupling constant. According to Quantum Chromodynamics, the strong coupling constant rapidly decreases at higher energy scales. This effect is called asymptotic freedom, and the scale evolution is referred to as the “running of the coupling constant.” The same is also true for the masses of the quarks, which can themselves be understood as fundamental couplings, for example, in connection with the interaction with the Higgs field. In Quantum Chromodynamics, the running of the strong coupling constant and of the quark masses can be predicted, and these predictions can be experimentally tested.
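    The scale evolution described above can be sketched numerically. This is an illustrative back-of-the-envelope calculation of the one-loop running of the strong coupling, not a CMS analysis tool; the reference value α_s(M_Z) ≈ 0.118 and the fixed five-flavour scheme are assumptions for the sketch.

```python
import math

def alpha_s(Q, alpha_mz=0.118, M_Z=91.19, nf=5):
    """One-loop QCD running of the strong coupling from the Z mass to scale Q (GeV):
    alpha_s(Q) = alpha_s(M_Z) / (1 + b0 * alpha_s(M_Z) * ln(Q^2 / M_Z^2)),
    with b0 = (33 - 2*nf) / (12*pi) for nf active quark flavours."""
    b0 = (33 - 2 * nf) / (12 * math.pi)
    return alpha_mz / (1 + b0 * alpha_mz * math.log(Q**2 / M_Z**2))

# Asymptotic freedom: the coupling shrinks as the energy scale grows.
for Q in (10, 91.19, 1000):
    print(f"alpha_s({Q:7.2f} GeV) = {alpha_s(Q):.4f}")
```

    Running the sketch shows the coupling decreasing monotonically with the scale, which is the asymptotic-freedom behaviour the text describes.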

    The experimental verification of the running mass is an essential test of the validity of Quantum Chromodynamics. At the energies probed by the Large Hadron Collider, the effects of physics beyond the Standard Model could lead to modifications of the running of mass. Therefore, a measurement of this effect is also a search for unknown physics. Over the past decades, the running of the strong coupling constant has been experimentally verified for a wide range of scales. Also, evidence was found for the running of the masses of the charm and beauty quarks.

    Figure 1: Display of an LHC collision detected by the CMS detector that contains a reconstructed top quark-antiquark pair. The display shows an electron (green) and a muon (red) of opposite charge, two highly energetic jets (orange) and a large amount of missing energy (purple).

    With a new measurement, the CMS Collaboration investigates for the first time the running of the mass of the heaviest of the quarks: the top quark. The production rate of top quark pairs (a quantity that depends on the top quark mass) was measured at different energy scales. From this measurement, the top quark mass is extracted at those energy scales using theoretical predictions of the rate at which top quark–antiquark pairs are produced.
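    As a hedged sketch of what "running mass" means quantitatively (not the CMS extraction itself): at leading order the MS-bar quark mass runs with the strong coupling as m(Q) = m(Q0) · (α_s(Q)/α_s(Q0))^(12/(33−2nf)). The reference values below (a top mass of 163 GeV at a 173 GeV scale, α_s ≈ 0.108 there, six active flavours) are illustrative assumptions.

```python
import math

def alpha_s(Q, alpha_ref=0.108, Q_ref=173.0, nf=6):
    """One-loop running coupling, referenced to a scale near the top quark mass."""
    b0 = (33 - 2 * nf) / (12 * math.pi)
    return alpha_ref / (1 + b0 * alpha_ref * math.log(Q**2 / Q_ref**2))

def running_mass(Q, m_ref=163.0, Q_ref=173.0, nf=6):
    """Leading-order running of the MS-bar top quark mass (GeV) to scale Q."""
    exponent = 12 / (33 - 2 * nf)
    return m_ref * (alpha_s(Q) / alpha_s(Q_ref)) ** exponent

# The extracted mass decreases as the probing energy scale increases.
for Q in (200, 500, 1000):
    print(f"m_top({Q} GeV) ~ {running_mass(Q):.1f} GeV")
```

    This is the qualitative behaviour shown by the red curve in Figure 2: the mass "runs" downward at higher scales.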

    Figure 2: The running of the top quark mass determined from the data (black points) compared to the theoretical prediction (red line). As the absolute scale of the top quark mass is not relevant for this measurement, the values have been normalised to the second data point.

    Experimentally, interesting top quark pair collisions are selected by searching for the specific decay products of a top quark-antiquark pair. In the overwhelming majority of cases, top quarks decay into an energetic jet and a W boson, which in turn can decay into a lepton and a neutrino. Jets and leptons can be identified and measured with high precision by the CMS detector, while neutrinos escape undetected and reveal themselves as missing energy. A collision that is likely the production of a top quark-antiquark pair as it is seen in the CMS detector is shown in Figure 1. Such a collision is expected to contain an electron, a muon, two energetic jets, and a large amount of missing energy.

    The measured running of the top quark mass is shown in Figure 2. The markers correspond to the measured points, while the red line represents the theoretical prediction according to Quantum Chromodynamics. The result provides the first indication of the validity of the fundamental quantum effect of the running of the top quark mass and opens a new window to test our understanding of the strong interaction. While a lot more data will be collected in the future LHC runs starting with Run 3 in 2021, this particular CMS result is mostly sensitive to uncertainties coming from the theoretical knowledge of the top quark in Quantum Chromodynamics. To witness the top quark mass running with even higher precision and maybe unveil signs of new physics, theory developments and experimental efforts will both be necessary. In the meantime, watch the top quark run!

    See the full article here.




     
  • richardmitnick 10:39 am on September 30, 2019 Permalink | Reply
    Tags: "Stanford physicists funded to pursue ‘tabletop’ physics experiments", Accelerator Science

    From Stanford University: “Stanford physicists funded to pursue ‘tabletop’ physics experiments” 

    Stanford University Name
    From Stanford University

    September 25, 2019
    Ker Than
    (650) 723-9820
    kerthan@stanford.edu

    Peter Graham and Savas Dimopoulos are among Stanford physicists working on smaller-scale devices to answer large questions. (Image credit: L.A. Cicero)

    With the future of large particle accelerators uncertain, Stanford theorists are exploring the use of smaller, more precise “tabletop” experiments to investigate fundamental questions in physics.

    The history of particle accelerators is one of seemingly constant one-upmanship. Ever since the 1920s, the machines – which spur charged particles to near light speeds before crashing them together – have grown ever larger, more complex and more powerful.

    Consider: When the 2-mile-long linear accelerator at SLAC National Accelerator Laboratory opened for business in 1966, it could boost electrons to energies of about 19 gigaelectronvolts. The Large Hadron Collider (LHC) at CERN, which finished construction in 2008, can boost protons to more than 700 times higher energy levels and resides in a massive elliptical tunnel wide enough to encircle a small town. Future supercolliders being planned by CERN, China and Japan promise to be even more immense and energetic (and also more expensive).

    CERN FCC Future Circular Collider map

    China Circular Electron Positron Collider (CEPC) map

    J-PARC Facility Japan Proton Accelerator Research Complex , located in Tokai village, Ibaraki prefecture, on the east coast of Japan

    The strategy has paid off handsomely with discoveries that have helped confirm the soundness of the Standard Model, our current best understanding of how nature’s fundamental forces and subatomic matter interact.

    As successful as particle accelerators have been, however, Stanford theorists Savas Dimopoulos and Peter Graham are betting that scientific treasures await discovery in the other direction as well. For years, the pair have argued that smaller and less expensive, but more sensitive, instruments could help answer stubborn mysteries in physics that have resisted the efforts of even the largest atom smashers – questions like “What is dark matter?” and “Do extra spatial dimensions exist?”

    “Peter and I and our group have been thinking about this for 15 years,” said Dimopoulos, who is the Hamamoto Family Professor at Stanford’s School of Humanities and Sciences. “We were sort of lonely but very happy because we were exploring new territory all the time and it was a lot of fun. We felt like eternal graduate students.”

    Scalpel vs. hammer

    But their ideas have been slowly gaining traction among physicists, and last fall the Gordon and Betty Moore Foundation awarded Stanford and SLAC researchers three grants totaling roughly $15 million to use quantum technologies to explore new fundamental physics. Key to these efforts are the kinds of small-scale, “tabletop” experiments (so-called because most of them would fit on a lab bench or in a modest-sized room) that Dimopoulos and Graham have long advocated for. “Everything is smaller, except for the ideas,” Dimopoulos quipped. “These types of experiments could help solve some very important problems in physics.”

    The instruments Dimopoulos and Graham have in mind exploit the weird properties of quantum mechanics – such as wave-particle duality and the seemingly telepathic link between entangled particles – to detect and measure minute signals and effects that particle accelerators are simply not attuned to.

    Tabletop experiments are considered high-risk, high-reward projects because they are generally cheaper to build and operate than colliders, said Asimina Arvanitaki, a theoretical physicist at the Perimeter Institute. “If you’re pitching a project that costs several billion dollars, you better have a very good reason for its existence and be reasonably sure you’re going to succeed,” added Arvanitaki, a former Stanford postdoc in Dimopoulos’ lab. “But the cost of tabletop experiments is so low, and the timescales for producing results are so short, that it takes some of that pressure off.”

    Building on existing technologies

    The Moore Foundation grants will fund three projects: Two are experimental and will focus on developing new technologies for detecting dark matter and measuring gravitational waves. But the third, worth about $2.5 million and awarded to Dimopoulos and Graham, will be used to further develop the theoretical underpinnings that will enable future experiments.

    “There’s been a history of particle accelerators discovering new physics and finding new particles, but it’s not clear that that can go on forever, so it’s important to think of other complementary ways to get at these underlying questions about nature,” said Ernie Glover, the Moore Foundation’s science program officer.

    Crucially, the experiments Dimopoulos and Graham are proposing rely on relatively mature, high-precision technologies that, for the most part, were developed with other uses in mind and for other fields, such as medicine and applied physics. “That’s what got us really excited,” Dimopoulos said. “We realized there were all these possibilities out there that particle theorists weren’t really thinking about.”

    A good example is nuclear magnetic resonance, or NMR, which forms the basis of magnetic resonance imaging (MRI), a common medical scanning technique.

    A few years ago, Graham and others theorized that a proposed ultralightweight dark matter candidate called an axion could influence the nuclear spin of normal matter. Dark matter is thought to make up the bulk of the matter in the universe, but it has evaded every attempt so far at characterization. Excited, Graham contacted an atomic physicist at the University of California, Berkeley, named Dmitry Budker to discuss designing a dark matter detector based on this effect – only to discover that the technology already exists.

    “He said it’s going to work because what we were describing was basically NMR,” said Graham, a theoretical physicist at the Stanford Institute for Theoretical Physics.

    Graham and Budker teamed up with other physicists to design the Cosmic Axion Spin Precession Experiment, or “CASPEr,” which uses NMR to detect axions and axion-like particles. These particles are predicted to have such weak interactions and low masses that they would never show up in colliders, which are better equipped to search for more massive dark matter candidates such as WIMPs (weakly interacting massive particles).

    Similarly, another Moore Foundation-funded tabletop experiment called MAGIS-100 relies on atom interferometry technology initially developed in the 1990s as a general-purpose tool for making precise measurements. The project, a collaboration between Stanford’s Mark Kasevich and Jason Hogan and researchers at Fermilab and other universities, could potentially detect ripples in spacetime known as gravitational waves around 1 hertz, a frequency range beyond the sensitivity of most existing or even proposed detectors.

    Current gravitational wave detectors like LIGO are sensitive to the very final moments of the black hole collisions that generate the spacetime ripples, but MAGIS-100 could provide scientists with a much longer viewing window.

    “LIGO saw just a fraction of a second of the event, but the black holes were twirling around each other and generating gravitational waves for millions or billions of years before that. Those waves were just in lower frequency bands,” Graham said. “By looking at other frequencies, we could observe the black holes for longer and perhaps discover new gravitational wave sources.”

    Intuition

    Dimopoulos and Graham plan to use the Moore Foundation-funding to continue devising new schemes for co-opting technologies like NMR and atom interferometry in the service of fundamental physics research.

    “It’s that connection that’s hard,” Graham said. “The experimental physicists and engineers who develop the technologies aren’t necessarily thinking about what other deep, fundamental questions could be tested, and the theorists are often unaware that tools for testing their ideas already exist.”

    But Dimopoulos and Graham are now old hands at making such connections. “In principle, you have to know all possible technologies,” Graham said. “In practice, you just have to know the right ones, but it takes a nontrivial intuition to realize something like ‘Oh, wait a minute, it looks like this technique might actually be able to observe extra dimensions or some other new physics.’”

    In one sense, what Dimopoulos and Graham are advocating for is a return to the way physics was done before colliders came to play such an important role and before physicists divided into primarily theoretical and experimental camps.

    “Before World War II, physics was just like what we’re doing right now,” Dimopoulos said. “Felix Bloch was both a theorist and an experimentalist, and so was Enrico Fermi. Even Einstein did experiments. There wasn’t a ready group of experimentalists that you could outsource your ideas to. You had to invent the techniques and look around at emerging technologies.”

    See the full article here.




    Stanford University

    Leland and Jane Stanford founded the University to “promote the public welfare by exercising an influence on behalf of humanity and civilization.” Stanford opened its doors in 1891, and more than a century later, it remains dedicated to finding solutions to the great challenges of the day and to preparing our students for leadership in today’s complex world. Stanford is an American private research university located in Stanford, California, on an 8,180-acre (3,310 ha) campus near Palo Alto. Since 1952, more than 54 Stanford faculty, staff, and alumni have won the Nobel Prize, including 19 current faculty members.


     
  • richardmitnick 12:25 pm on September 29, 2019 Permalink | Reply
    Tags: "Ask Ethan: Why Are There Only Three Generations Of Particles?", Accelerator Science

    From Ethan Siegel: “Ask Ethan: Why Are There Only Three Generations Of Particles?” 

    From Ethan Siegel
    Sep 28, 2019

    The particles of the standard model, with masses (in MeV) in the upper right. The Fermions make up the left three columns (three generations); the bosons populate the right two columns. If a speculative idea like mirror-matter is correct, there may be a mirror-matter counterpart for each of these particles. (WIKIMEDIA COMMONS USER MISSMJ, PBS NOVA, FERMILAB, OFFICE OF SCIENCE, UNITED STATES DEPARTMENT OF ENERGY, PARTICLE DATA GROUP)

    With the discovery of the Higgs boson, the Standard Model is now complete. Can we be sure there isn’t another generation of particles out there?

    The Universe, at a fundamental level, is made up of just a few different types of particles and fields that exist amidst the spacetime fabric that composes otherwise empty space. While there may be a few components of the Universe that we don’t understand — like dark matter and dark energy — the normal matter and radiation are not only well understood, they’re perfectly well described by our best theory of particles and their interactions: the Standard Model. There’s an intricate but ordered structure to the Standard Model, with three “generations” of particles. Why three? That’s what Peter Brouwer wants to know, asking:

    Particle families appear as a set of 3, characterised by the electron, muon and tau families. The last 2 being unstable and decaying. So my question is: Is it possible that higher order particles exist? And if so, what energies might such particles be found? If not, how do we know that they don’t exist.

    This is a big question. Let’s dive in.

    The particles and antiparticles of the Standard Model have now all been directly detected, with the last holdout, the Higgs Boson, falling at the LHC earlier this decade. All of these particles can be created at LHC energies, and the masses of the particles lead to fundamental constants that are absolutely necessary to describe them fully. These particles can be well-described by the physics of the quantum field theories underlying the Standard Model, but they do not describe everything, like dark matter. (E. SIEGEL / BEYOND THE GALAXY)

    There are two classes of particles in the Standard Model: the fermions, which have half-integer spins (±½, ±1½, ±2½, etc.) and where every fermion has an antimatter (anti-fermion) counterpart, and the bosons, which have integer spins (0, ±1, ±2, etc.) and are neither matter nor antimatter. The bosons simply are what they are: 1 Higgs boson, 1 boson (photon) for the electromagnetic force, 3 bosons (W+, W- and Z) for the weak force, and 8 gluons for the strong force.

    The bosons are the force-carrying particles that enable the fermions to interact, but the fermions (and anti-fermions) carry fundamental charges that dictate which forces (and bosons) they’re affected by. While the quarks couple to all three forces, the leptons (and anti-leptons) don’t feel the strong force, and the neutrinos (and anti-neutrinos) don’t feel the electromagnetic force, either.

    This diagram displays the structure of the standard model (in a way that displays the key relationships and patterns more completely, and less misleadingly, than in the more familiar image based on a 4×4 square of particles). In particular, this diagram depicts all of the particles in the Standard Model (including their letter names, masses, spins, handedness, charges, and interactions with the gauge bosons: i.e., with the strong and electroweak forces). It also depicts the role of the Higgs boson, and the structure of electroweak symmetry breaking, indicating how the Higgs vacuum expectation value breaks electroweak symmetry, and how the properties of the remaining particles change as a consequence. Note that the Z boson couples to both quarks and leptons, and can decay through neutrino channels. (LATHAM BOYLE AND MARDUS OF WIKIMEDIA COMMONS)

    But what’s perhaps most puzzling about the Standard Model is that unlike the bosons, there are “copies” of the fermions. In addition to the fermionic particles that make up the stable or quasi-stable matter we’re familiar with:

    protons and neutrons (made of bound states of up-and-down quarks along with the gluons),
    atoms (made of atomic nuclei, which is made of protons and neutrons, as well as electrons),
    and electron neutrinos and electron antineutrinos (created in the nuclear reactions that involve building up to or decaying down from pre-existing nuclear combinations),

    there are two additional generations of heavier particles for each of these. In addition to the up-and-down quarks and antiquarks in 3 colors apiece, there are also the charm-and-strange quarks plus the top-and-bottom quarks. In addition to the electron, the electron neutrino and their antimatter counterparts, there are also the muon and muon neutrino, plus the tau and the tau neutrino.

    A four-muon candidate event in the ATLAS detector at the Large Hadron Collider. (Technically, this decay involves two muons and two anti-muons.) The muon/anti-muon tracks are highlighted in red, as the long-lived muons travel farther than any other unstable particle. The energies achieved by the LHC are sufficient for creating Higgs bosons; previous electron-positron colliders could not achieve the necessary energies. (ATLAS COLLABORATION/CERN)

    For some reason, there are three copies, or generations, of fermionic particles that show up in the Standard Model. The heavier versions of these particles don’t spontaneously arise from conventional particle interactions, but will show up at very high energies.

    In particle physics, you can create any particle-antiparticle pair so long as you have enough available energy at your disposal. How much energy do you need? Whatever the mass of your particle is, you need enough energy to create both it and its partner antiparticle (which always has the same mass as its particle counterpart). Einstein’s E = mc², which details the conversion between mass and energy, tells us that so long as you have enough energy to make them, you can. This is exactly how we create particles of all types from high-energy collisions, like the kind occurring in cosmic rays or at the Large Hadron Collider.
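    The threshold argument above is simple arithmetic; here is a minimal sketch using round, PDG-style masses (illustrative values, in GeV/c²):

```python
# Pair-production threshold from E = mc^2: to create a particle-antiparticle
# pair you need at least twice the particle's rest-mass energy.
masses_gev = {
    "electron": 0.000511,
    "muon": 0.1057,
    "tau": 1.777,
    "top quark": 173.0,
}

def pair_threshold(mass_gev):
    """Minimum available collision energy (GeV) to create the particle pair."""
    return 2 * mass_gev

for name, m in masses_gev.items():
    print(f"{name:>10}: at least {pair_threshold(m):.4g} GeV")
```

    This is why heavier generations only appear at high-energy machines: a top quark pair needs at least 346 GeV of available energy, far beyond what everyday particle interactions provide.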

    Cosmic rays produced by high-energy astrophysics sources (ASPERA collaboration – AStroParticle ERAnet)


    A decaying B-meson, as shown here, may decay more frequently to one type of lepton pair than the other, contradicting Standard Model expectations. If this is the case, we’ll either have to modify the Standard Model or incorporate a new parameter (or set of parameters) into our understanding of how these particles behave, as we needed to do when we discovered that neutrinos had mass. (KEK / BELLE COLLABORATION)


    By the same token, whenever you create one of these unstable quarks or leptons (leaving neutrinos and antineutrinos aside), there’s always the possibility that they’ll decay to a lighter particle through the weak interaction. Because all the Standard Model fermions couple to the weak force, it’s only a matter of a fraction of a second before any of the following particles — strange, charm, bottom, or top quarks, as well as the muon or tau leptons — decay down to that stable first generation of particles.

    As long as it’s energetically allowed and not forbidden by any of the other quantum rules or symmetries that exist in our Universe, the heavier particles will always decay in this fashion. The big question, though, of why there are three generations, is driven not by theoretical motivations, but by experimental results.

    The first muon ever detected, along with other cosmic ray particles, was determined to be the same charge as the electron, but hundreds of times heavier, due to its speed and radius of curvature. The muon was the first of the heavier generations of particles to be discovered, dating all the way back to the 1930s. (PAUL KUNZE, IN Z. PHYS. 83 (1933))

    The muon is the lightest of the fermions to extend beyond the first generation of particles, and caused the famed physicist I. I. Rabi to exclaim, when he was shown evidence of this particle, “Who ordered that?” As particle accelerators became more ubiquitous and more energetic over the next decades, particles like mesons and baryons, including ones containing strange quarks and later charm quarks, soon surfaced.

    However, it was only with the advent of the Mark I experiment at SLAC in the 1970s (which co-discovered the charm quark) that evidence for a third generation arose: in the form of the tau (and anti-tau) lepton. That 1976 discovery is now 43 years old. In the time since, we’ve directly detected every particle in the Standard Model, including all of the quarks and neutrinos and anti-neutrinos. Not only have we found them, but we’ve measured their particle properties exquisitely.

    The rest masses of the fundamental particles in the Universe determine when and under what conditions they can be created, and also describe how they will curve spacetime in General Relativity. The properties of particles, fields, and spacetime are all required to describe the Universe we inhabit. (FIG. 15–04A FROM UNIVERSE-REVIEW.CA)

    Based on all we now know, we should be able to predict how these particles interact with themselves and one another, how they decay, and how they contribute to things like cross-sections, scattering amplitudes, branching ratios and event rates for any particle we choose to examine.

    The structure of the Standard Model is what enables us to do these calculations, and the particle content of the Standard Model enables us to predict which light particles the heavier ones will decay into. Perhaps the strongest example is the Z-boson, the neutral particle that mediates the weak force. The Z-boson is the third most massive particle known, with a rest mass of 91.187 GeV/c²: nearly 100 times more massive than a proton. Every time we create a Z-boson, we can experimentally measure the probability that it will decay into any particular particle or combinations of particles.
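    A quick check on the mass comparison above, using round values in GeV/c² (the proton mass of 0.938 GeV/c² is a standard round figure, not from this article):

```python
# Ratio of the Z-boson rest mass to the proton rest mass.
m_z = 91.187       # GeV/c^2, as quoted in the text
m_proton = 0.938   # GeV/c^2, round PDG-style value

ratio = m_z / m_proton
print(f"The Z boson is about {ratio:.0f} times as massive as the proton")
```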

    At LEP, the large electron-positron collider, thousands upon thousands of Z-bosons were created, and the decays of those Z particles were measured to reconstruct what fraction of Z-bosons became various quark and lepton combinations. The results clearly indicate that there are no fourth-generation particles below 45 GeV/c² in energy. (CERN / ALEPH COLLABORATION)

    CERN LEP Collider

    For detecting the direction and momenta of charged particles with extreme accuracy, the ALEPH detector had at its core a time projection chamber, for years the world’s largest. In the foreground from the left, Jacques Lefrancois, Jack Steinberger, Lorenzo Foa and Pierre Lazeyras. ALEPH was an experiment on the LEP accelerator, which studied high-energy collisions between electrons and positrons from 1989 to 2000.

    By examining what fraction of the Z-bosons we create in accelerators decay to:

    electron/positron pairs,
    muon/anti-muon pairs,
    tau/anti-tau pairs,
    and “invisible” channels (i.e., neutrinos),

    we can determine how many generations of particles there are. As it turns out, 1-out-of-30 Z-bosons decays to each of electron/positron, muon/anti-muon, and tau/anti-tau pairs, while a total of 1-in-5 Z-boson decays are invisible. According to the Standard Model and our theory of particles and their interactions, that translates to 1-in-15 Z-bosons (~6.66% odds) decaying to each of the three types of neutrinos that exist.
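    The counting logic above can be sketched in a couple of lines. This is a back-of-the-envelope illustration using the article's round fractions, not the actual LEP fit:

```python
# LEP-style neutrino counting: divide the measured invisible Z-decay fraction
# by the Standard Model prediction for a single neutrino species.
invisible_fraction = 1 / 5      # ~20% of Z decays are invisible
per_neutrino_fraction = 1 / 15  # SM prediction per neutrino species (~6.66%)

n_generations = invisible_fraction / per_neutrino_fraction
print(f"Inferred number of neutrino generations: {n_generations:.1f}")
```

    The answer comes out to three, which is the measurement shown in the LEP plot: three light neutrino species, and hence three generations of particles below the 45 GeV/c² threshold.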

    These results tell us that if there is a fourth (or more) generation of particles, every single one of them, including leptons and neutrinos, has a mass greater than 45 GeV/c²: a threshold that only the Z, W, Higgs, and top particles are known to exceed.

    The final results from many different particle accelerator experiments have definitively showed that the Z-boson decays to charged leptons about 10% of the time, neutral leptons about 20%, and hadrons (quark-containing particles) about 70% of the time. This is consistent with 3 generations of particles and no other number. (CERN / LEP COLLABORATION)

    Now, there’s nothing forbidding a fourth generation from existing and being much, much heavier than any of the particles we’ve observed so far; theoretically, it’s very much allowed. But experimentally, these collider results aren’t the only thing constraining the number of generational species in the Universe; there’s another constraint: the abundance of the light elements that were created in the early stages of the Big Bang.

    When the Universe was approximately one second old, it contained only protons, neutrons, electrons (and positrons), photons, and neutrinos and anti-neutrinos among the Standard Model particles. Over those first few minutes, protons and neutrons eventually fused to form deuterium, helium-3, helium-4, and lithium-7.

    The predicted abundances of helium-4, deuterium, helium-3 and lithium-7 as predicted by Big Bang Nucleosynthesis, with observations shown in the red circles. Note the key point here: a good scientific theory (Big Bang Nucleosynthesis) makes robust, quantitative predictions for what should exist and be measurable, and the measurements (in red) line up extraordinarily well with the theory’s predictions, validating it and constraining the alternatives. The curves and the red line are for 3 neutrino species; more or fewer lead to results that conflict with the data severely, particularly for deuterium and helium-3. (NASA / WMAP SCIENCE TEAM)

    But how much of each will they form? That depends on just a few parameters; the baryon-to-photon ratio is commonly the only parameter varied when predicting these abundances.

    But we can vary any number of parameters we typically assume are fixed, such as the number of neutrino generations. From Big Bang Nucleosynthesis, as well as from the imprint of neutrinos on the leftover radiation glow from the Big Bang (the cosmic microwave background), we can conclude that there are three — not two or fewer and not four or more — generations of particles in the Universe.

    The fit of the number of neutrino species required to match the CMB fluctuation data. Since we know there are three neutrino species, we can use this information to infer the temperature-equivalent of massless neutrinos at these early times, and arrive at a number: 1.96 K, with an uncertainty of just 0.02 K. (BRENT FOLLIN, LLOYD KNOX, MARIUS MILLEA, AND ZHEN PAN (2015) PHYS. REV. LETT. 115, 091301)
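The 1.96 K figure in the caption follows from a standard result: electron-positron annihilation heated the photons but not the already-decoupled neutrinos, leaving T_ν = (4/11)^(1/3) T_γ. A quick check against today's CMB temperature:

```python
# Neutrino background temperature from the standard (4/11)^(1/3) relation.
T_gamma = 2.725  # CMB photon temperature today, in kelvin
T_nu = (4 / 11) ** (1 / 3) * T_gamma
print(round(T_nu, 3))  # ~1.945 K, consistent with the fitted 1.96 +/- 0.02 K
```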

    It is eminently possible that there are more particles out there than the Standard Model, as we know it, presently predicts. In fact, given all the components of the Universe that aren’t accounted for in the Standard Model, from dark matter to dark energy to inflation to the origin of the matter-antimatter asymmetry, it’s practically unreasonable to conclude that there aren’t additional particles.

    But if the additional particles fit into the structure of the Standard Model as an additional generation, there are tremendous constraints. They could not have been created in great abundance during the early Universe. None of them can be less massive than 45.6 GeV/c². And they could not imprint an observable signature on the cosmic microwave background or in the abundance of the light elements.

    Experimental results are the way we learn about the Universe, but the way those results fit into our most successful theoretical frameworks is how we conclude what else does and doesn’t exist in our Universe. Unless a future accelerator result surprises us tremendously, three generations is all we get: no more, no less, and nobody knows why.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    “Starts With A Bang! is a blog/video blog about cosmology, physics, astronomy, and anything else I find interesting enough to write about. I am a firm believer that the highest good in life is learning, and the greatest evil is willful ignorance. The goal of everything on this site is to help inform you about our world, how we came to be here, and to understand how it all works. As I write these pages for you, I hope to not only explain to you what we know, think, and believe, but how we know it, and why we draw the conclusions we do. It is my hope that you find this interesting, informative, and accessible,” says Ethan

     
  • richardmitnick 12:21 pm on September 27, 2019 Permalink | Reply
    Tags: Accelerator Science   

    From Brookhaven National Lab: “U.S. ATLAS Phase I Upgrade Completed” 

    From Brookhaven National Lab

    September 27, 2019
    Stephanie Kossman
    skossman@bnl.gov
    (631) 344-8671

    Peter Genzer,
    genzer@bnl.gov
    (631) 344-3174

    Major upgrades to the ATLAS experiment at CERN will give unprecedented insight into open physics questions.

    Brookhaven physicist Shaochun Tang is shown with the new ATLAS trigger board he designed and engineered for the U.S. ATLAS Phase I Upgrade project.

    The ATLAS experiment at CERN’s Large Hadron Collider (LHC) is ready to begin another chapter in its search for new physics.

    CERN ATLAS Image Claudia Marcelloni

    A significant upgrade to the experiment, called the U.S. ATLAS Phase I Upgrade, has received Critical Decision-4 approval from the U.S. Department of Energy (DOE), signifying the completion of the project and a transition to operations.

    “This milestone will enable us to push the boundaries of our understanding, following the discovery of the Higgs boson at CERN, which resulted in the 2013 Nobel Prize in Physics,” said Project Director Jonathan Kotcher, a senior scientist at DOE’s Brookhaven National Laboratory. “The completion of this project is a major step in the physics campaign being mounted at the energy frontier, which integrates state-of-the-art accelerator and detector technology to probe the fundamental forces and particles of nature. We are very excited to turn to the physics and data analysis that all this hard work has enabled.”

    Led by Brookhaven Lab and Stony Brook University (SBU), the U.S. ATLAS Phase I Upgrade is the initial stage of a larger upgrade planned for the LHC—the High Luminosity Large Hadron Collider (HL-LHC) project.

    The goal is to substantially increase the LHC’s luminosity, enabling scientists to collect 10 times more data from particle collisions, observe very rare processes, and make new discoveries about the building blocks of matter. But first, long-running experiments like ATLAS needed initial upgrades to prepare for the years before the LHC transitions to HL-LHC operation.

    “The ATLAS experiment has been at the forefront of high-energy particle physics exploration and discoveries for a decade now,” said Deputy Project Manager Marc-André Pleier, a physicist at Brookhaven. “While we have learned a lot so far, our current understanding of the universe cannot explain phenomena such as dark matter, dark energy, or antimatter/matter asymmetry. Providing these detector upgrades for ATLAS will enable us to study even rarer processes than ever before and shed light on poorly understood or unexplored corners of our understanding of how the universe works.”

    “The U.S. ATLAS Phase I Upgrade involved building modern electronics to replace ageing elements with more efficient ones, but it also provided the experiment with new and improved functionalities,” said Project Manager Christopher Bee, a senior scientist at SBU.

    Specifically, the project focused on three components of ATLAS: the trigger/data acquisition system, the liquid argon calorimeter, and the forward muon detector (known as the New Small Wheel). Together, upgrades to these three components will let scientists collect data more efficiently and at higher rates.

    “Every second, there are several billion proton-proton collision events detected by ATLAS, but only a few hundred are recorded,” said Bee. Those events are selected by the trigger system, which sifts through a wealth of uninteresting events to find ones that may point to new physics or rare Standard Model events. “The data acquisition system moves data from the detector through the trigger system, and then it puts the selected events onto storage for further analysis. Upgrades to this system will improve its ability to select key events.”
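The idea behind a trigger is easy to sketch in code: scan a huge stream of mostly uninteresting events and keep only the rare ones that satisfy a physics-motivated selection. The snippet below is a toy illustration only; the event model, the single lepton-pT cut, and the 25 GeV threshold are invented for illustration and bear no relation to ATLAS's actual trigger menu.

```python
import random

random.seed(0)

def passes_trigger(event):
    """Toy single-object trigger: keep events with a high-pT lepton."""
    return event["lepton_pt"] > 25.0  # GeV threshold (illustrative)

# Simulate a batch of events; most have soft leptons, a few are hard.
events = [{"lepton_pt": random.expovariate(1 / 5.0)} for _ in range(100_000)]
selected = [e for e in events if passes_trigger(e)]
print(f"kept {len(selected)} of {len(events)} events "
      f"({100 * len(selected) / len(events):.2f}%)")
```

In the real system this filtering happens in custom hardware and a software farm at MHz rates, but the logic is the same: a steep reduction from billions of collisions per second down to a few hundred recorded events.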

    Upgrades to the calorimeter electronics will increase the precision of data coming out of the calorimeter detector. The New Small Wheel will dramatically improve ATLAS’s triggering capability and efficiency for events with muons, subatomic particles known as the “heavy cousins” of electrons.

    “In the experiment’s initial runs, the muon trigger had a substantial ‘fake’ muon rate,” said Pleier. “It was giving the ‘OK’ to accept a large fraction of events that were not interesting. The principal goal of the New Small Wheel is to reduce the fake trigger rate dramatically.”
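The way the New Small Wheel suppresses fakes is by a pointing requirement: a genuine muon from the collision point leaves a track segment in the wheel that lines up with the hit in the outer trigger chambers, while random background hits usually don't. The toy below illustrates the principle; the angles, resolutions, and tolerance are made-up numbers, not the detector's real geometry.

```python
import random

random.seed(1)

def points_back(segment_angle, track_angle, tol=0.01):
    """Toy pointing cut: real muons' inner segments line up with the
    outer trigger track; random (fake) hits usually don't."""
    return abs(segment_angle - track_angle) < tol

# Real muons: outer track closely correlated with the inner segment.
real = [(a, a + random.gauss(0, 0.003))
        for a in (random.uniform(0, 1) for _ in range(1000))]
# Fakes: inner and outer measurements are uncorrelated.
fake = [(random.uniform(0, 1), random.uniform(0, 1)) for _ in range(1000)]

real_eff = sum(points_back(s, t) for s, t in real) / 1000
fake_eff = sum(points_back(s, t) for s, t in fake) / 1000
print(f"real muons kept: {real_eff:.0%}, fakes kept: {fake_eff:.0%}")
```

Even in this crude model, nearly all real muons survive the cut while the overwhelming majority of uncorrelated fakes are rejected, which is the effect the upgrade is after.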

    Twelve U.S. universities and DOE’s Argonne National Laboratory collaborated with Brookhaven Lab and SBU to complete the U.S. ATLAS Phase I Upgrade on time and under budget. The $44 million upgrade project was supported by DOE ($33 million) and the National Science Foundation (NSF) ($11 million).

    “We very much appreciate the support from both DOE and NSF that has allowed us to realize our goals in helping prepare ATLAS for a bright future,” said Bee. “We look forward to capitalizing on the new scientific opportunities enabled by these upgrades.”

    See the full article here.



    BNL Campus


    BNL Center for Functional Nanomaterials

    BNL NSLS-II


    BNL RHIC Campus

    BNL/RHIC Star Detector

    BNL RHIC PHENIX

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

     