Tagged: CERN LHC

  • richardmitnick 10:09 am on March 14, 2021
    Tags: "Searching for elusive supersymmetric particles", , , , CERN LHC, , , , ,   

    From UC Riverside(US): “Searching for elusive supersymmetric particles” 


    From UC Riverside(US)

    March 10, 2021
    Iqbal Pittalwala
    Senior Public Information Officer
    (951) 827-6050
    iqbal.pittalwala@ucr.edu

    CMS. Credit: CERN


    European Organization for Nuclear Research [Organisation européenne pour la recherche nucléaire] (CH)

    The Standard Model of particle physics is the best explanation to date for how the universe works at the subnuclear level and has helped explain, correctly, the elementary particles and forces between them.

    Standard Model of Particle Physics (LATHAM BOYLE AND MARDUS OF WIKIMEDIA COMMONS).

    But the model is incomplete, requiring “extensions” to address its shortfalls.

    Owen Long, a professor of physics and astronomy at the University of California, Riverside, is a key member of an international team of scientists that has explored supersymmetry, or SUSY, as an extension of the Standard Model.

    He is also a member of the Compact Muon Solenoid, or CMS, Collaboration at the Large Hadron Collider at CERN in Geneva. CMS is one of CERN’s large particle detectors.

    “The data from our CMS experiments do not allow us to claim we have found SUSY,” Long said. “But in science, not finding something — a null result — can also be exciting.”

    A theory of physics beyond the Standard Model, SUSY refers to the symmetry between two kinds of elementary particles, bosons and fermions, and is tied to their spins. SUSY proposes that all known fundamental particles have heavier, supersymmetric counterparts, with each supersymmetric partner differing from its Standard Model counterpart by one-half unit in spin. This doubles the number of particle types in nature, allowing many new interactions between the regular particles and new SUSY particles.

    “This is a big change to the Standard Model,” Long said. “The extension can provide answers to some of the fundamental questions that are still unanswered, such as: What is dark matter?”

    The Standard Model explains neither gravity nor dark matter. But in the case of the latter, SUSY does offer a candidate in the form of the lightest supersymmetric particle, which is stable, electrically neutral, and weakly interacting. The invocation of SUSY also naturally explains the small mass of the Higgs boson.

    “The discovery of the elusive SUSY particles would provide an extraordinary insight into the nature of reality,” Long said. “And it would be a revolutionary moment in physics for experimentalists and theorists.”

    At CMS, Long and other scientists hoped to find evidence for SUSY particles by examining signs of their decay as measured by an energy imbalance called missing transverse energy. When they examined the data, they found no signs of the expected energy imbalance from producing SUSY particles.
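    For readers unfamiliar with the quantity, here is a minimal sketch (not CMS analysis code) of the idea behind missing transverse energy: it is the magnitude of the negative vector sum of the transverse momenta of everything the detector sees, so invisible particles such as SUSY candidates would show up as an imbalance. The event contents and numbers below are purely illustrative.

    ```python
    import math

    def missing_transverse_energy(visible):
        """visible: list of (pt, phi) pairs for reconstructed visible objects."""
        px = sum(pt * math.cos(phi) for pt, phi in visible)
        py = sum(pt * math.sin(phi) for pt, phi in visible)
        # The missing transverse momentum balances the visible vector sum,
        # so its magnitude equals that of the visible sum.
        return math.hypot(px, py)

    # Hypothetical event: three jets and a lepton, as (pt in GeV, phi in radians).
    event = [(120.0, 0.3), (85.0, 2.9), (40.0, -1.2), (25.0, 1.7)]
    print(f"MET ~ {missing_transverse_energy(event):.1f} GeV")
    ```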

    “We, therefore, have no evidence for SUSY,” Long said. “But perhaps SUSY is there, and it is just more hidden than initially thought. It’s true we did not find something new, which is disappointing. But it is still very important scientific progress. We now know a lot more about where SUSY does not exist. Our null result motivates us to do follow-up work and guides us where to look next.”

    Long explained that he and his fellow scientists have been looking for SUSY for a long time through a technique based on a connection to dark matter.

    “Those efforts did not find SUSY particles,” he said. “Our new result involves a completely different approach, developed over a couple of years and driven by our interest in looking for SUSY in novel ways. While we found no evidence for SUSY, there is still interest in exploring the idea that SUSY could exist in ways that are more difficult to find. We already have preliminary measurements we are working on.”

    Long was funded by a grant from the Department of Energy. He was joined in the research by three other senior scientists from other institutions.

    UCR is a founding member of the CMS experiment — one of only five U.S. institutions with that distinction.

    Science paper:
    Search for top squarks in final states with two top quarks and several light-flavor jets in proton-proton collisions at √s = 13 TeV.
    Physical Review D

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    UC Riverside Campus

    The University of California, Riverside(US) is a public land-grant research university in Riverside, California. It is one of the 10 campuses of the University of California(US) system. The main campus sits on 1,900 acres (769 ha) in a suburban district of Riverside with a branch campus of 20 acres (8 ha) in Palm Desert. In 1907, the predecessor to UC Riverside was founded as the UC Citrus Experiment Station, Riverside which pioneered research in biological pest control and the use of growth regulators responsible for extending the citrus growing season in California from four to nine months. Some of the world’s most important research collections on citrus diversity and entomology, as well as science fiction and photography, are located at Riverside.

    UC Riverside’s undergraduate College of Letters and Science opened in 1954. The Regents of the University of California declared UC Riverside a general campus of the system in 1959, and graduate students were admitted in 1961. To accommodate an enrollment of 21,000 students by 2015, more than $730 million was invested in new construction projects after 1999. Preliminary accreditation of the UC Riverside School of Medicine was granted in October 2012, and the first class of 50 students was enrolled in August 2013. It was the first new research-based public medical school opened in California in four decades.

    UC Riverside is classified among “R1: Doctoral Universities – Very high research activity.” The 2019 U.S. News & World Report Best Colleges rankings places UC Riverside tied for 35th among top public universities and ranks 85th nationwide. Over 27 of UC Riverside’s academic programs, including the Graduate School of Education and the Bourns College of Engineering, are highly ranked nationally based on peer assessment, student selectivity, financial resources, and other factors. Washington Monthly ranked UC Riverside 2nd in the United States in terms of social mobility, research and community service, while U.S. News ranks UC Riverside as the fifth most ethnically diverse and, by the number of undergraduates receiving Pell Grants (42 percent), the 15th most economically diverse student body in the nation. Over 70% of all UC Riverside students graduate within six years without regard to economic disparity. UC Riverside’s extensive outreach and retention programs have contributed to its reputation as a “university of choice” for minority students. In 2005, UCR became the first public university campus in the nation to offer a gender-neutral housing option.

    UC Riverside’s sports teams are known as the Highlanders and play in the Big West Conference of the National Collegiate Athletic Association (NCAA) Division I. Their nickname was inspired by the high altitude of the campus, which lies on the foothills of Box Springs Mountain. The UC Riverside women’s basketball team won back-to-back Big West championships in 2006 and 2007. In 2007, the men’s baseball team won its first conference championship and advanced to the regionals for the second time since the university moved to Division I in 2001.

    History

    At the turn of the 20th century, Southern California was a major producer of citrus, the region’s primary agricultural export. The industry developed from the country’s first navel orange trees, planted in Riverside in 1873. Lobbied by the citrus industry, the UC Regents established the UC Citrus Experiment Station (CES) on February 14, 1907, on 23 acres (9 ha) of land on the east slope of Mount Rubidoux in Riverside. The station conducted experiments in fertilization, irrigation and crop improvement. In 1917, the station was moved to a larger site, 475 acres (192 ha) near Box Springs Mountain.

    The 1944 passage of the GI Bill during World War II set in motion a rise in college enrollments that necessitated an expansion of the state university system in California. A local group of citrus growers and civic leaders, including many UC Berkeley(US) alumni, lobbied aggressively for a UC-administered liberal arts college next to the CES. State Senator Nelson S. Dilworth authored Senate Bill 512 (1949) which former Assemblyman Philip L. Boyd and Assemblyman John Babbage (both of Riverside) were instrumental in shepherding through the State Legislature. Governor Earl Warren signed the bill in 1949, allocating $2 million for initial campus construction.

    Gordon S. Watkins, dean of the College of Letters and Science at University of California at Los Angeles(US), became the first provost of the new college at Riverside. Initially conceived of as a small college devoted to the liberal arts, he ordered the campus built for a maximum of 1,500 students and recruited many young junior faculty to fill teaching positions. He presided at its opening with 65 faculty and 127 students on February 14, 1954, remarking, “Never have so few been taught by so many.”

    UC Riverside’s enrollment exceeded 1,000 students by the time Clark Kerr became president of the University of California(US) system in 1958. Anticipating a “tidal wave” in enrollment growth required by the baby boom generation, Kerr developed the California Master Plan for Higher Education and the Regents designated Riverside a general university campus in 1959. UC Riverside’s first chancellor, Herman Theodore Spieth, oversaw the beginnings of the school’s transition to a full university and its expansion to a capacity of 5,000 students. UC Riverside’s second chancellor, Ivan Hinderaker, led the campus through the era of the free speech movement and kept student protests peaceful in Riverside. According to a 1998 interview with Hinderaker, the city of Riverside received negative press coverage for smog after the mayor asked Governor Ronald Reagan to declare the South Coast Air Basin a disaster area in 1971; subsequent student enrollment declined by up to 25% through 1979. Hinderaker’s development of innovative programs in business administration and biomedical sciences created incentives for enough students to enroll at Riverside to keep the campus open.

    In the 1990s, UC Riverside experienced a new surge of enrollment applications, now known as “Tidal Wave II”. The Regents targeted UC Riverside for an annual growth rate of 6.3%, the fastest in the UC system, and anticipated 19,900 students at UC Riverside by 2010. By 1995, African American, American Indian, and Latino student enrollments accounted for 30% of the UC Riverside student body, the highest proportion of any UC campus at the time. The 1997 implementation of Proposition 209—which banned the use of affirmative action by state agencies—reduced the ethnic diversity at the more selective UC campuses but further increased it at UC Riverside.

    With UC Riverside scheduled for dramatic population growth, efforts have been made to increase its popular and academic recognition. The students voted for a fee increase to move UC Riverside athletics into NCAA Division I standing in 1998. In the 1990s, proposals were made to establish a law school, a medical school, and a school of public policy at UC Riverside, with the UC Riverside School of Medicine and the School of Public Policy becoming reality in 2012. In June 2006, UC Riverside received its largest gift, $15.5 million from two local couples, in trust towards building its medical school. The Regents formally approved UC Riverside’s medical school proposal in 2006. Upon its completion in 2013, it was the first new medical school built in California in 40 years.

    Academics

    As a campus of the University of California(US) system, UC Riverside is governed by a Board of Regents and administered by a president. The current president is Michael V. Drake, and the current chancellor of the university is Kim A. Wilcox. UC Riverside’s academic policies are set by its Academic Senate, a legislative body composed of all UC Riverside faculty members.

    UC Riverside is organized into three academic colleges, two professional schools, and two graduate schools. UC Riverside’s liberal arts college, the College of Humanities, Arts and Social Sciences, was founded in 1954, and began accepting graduate students in 1960. The College of Natural and Agricultural Sciences, founded in 1960, incorporated the CES as part of the first research-oriented institution at UC Riverside; it eventually also incorporated the natural science departments formerly associated with the liberal arts college to form its present structure in 1974. UC Riverside’s newest academic unit, the Bourns College of Engineering, was founded in 1989. Comprising the professional schools are the Graduate School of Education, founded in 1968, and the UCR School of Business, founded in 1970. These units collectively provide 81 majors and 52 minors, 48 master’s degree programs, and 42 Doctor of Philosophy (PhD) programs. UC Riverside is the only UC campus to offer undergraduate degrees in creative writing and public policy and one of three UCs (along with Berkeley and Irvine) to offer an undergraduate degree in business administration. Through its Division of Biomedical Sciences, founded in 1974, UC Riverside offers the Thomas Haider medical degree program in collaboration with UCLA.[29] UC Riverside’s doctoral program in the emerging field of dance theory, founded in 1992, was the first program of its kind in the United States, and UC Riverside’s minor in lesbian, gay and bisexual studies, established in 1996, was the first undergraduate program of its kind in the UC system. A new BA program in bagpipes was inaugurated in 2007.

    Research and economic impact

    UC Riverside operated under a $727 million budget in fiscal year 2014–15. The state government provided $214 million, student fees accounted for $224 million and $100 million came from contracts and grants. Private support and other sources accounted for the remaining $189 million. Overall, monies spent at UC Riverside have an economic impact of nearly $1 billion in California. UC Riverside research expenditure in FY 2018 totaled $167.8 million. Total research expenditures at UC Riverside are significantly concentrated in agricultural science, accounting for 53% of total research expenditures spent by the university in 2002. Top research centers by expenditure, as measured in 2002, include the Agricultural Experiment Station; the Center for Environmental Research and Technology; the Center for Bibliographical Studies; the Air Pollution Research Center; and the Institute of Geophysics and Planetary Physics.

    Throughout UC Riverside’s history, researchers have developed more than 40 new citrus varieties and invented new techniques to help the $960 million-a-year California citrus industry fight pests and diseases. In 1927, entomologists at the CES introduced two wasps from Australia as natural enemies of a major citrus pest, the citrophilus mealybug, saving growers in Orange County $1 million in annual losses. This event was pivotal in establishing biological control as a practical means of reducing pest populations. In 1963, plant physiologist Charles Coggins proved that application of gibberellic acid allows fruit to remain on citrus trees for extended periods. The ultimate result of his work, which continued through the 1980s, was the extension of the citrus-growing season in California from four to nine months. In 1980, UC Riverside released the Oroblanco grapefruit, its first patented citrus variety. Since then, the citrus breeding program has released other varieties such as the Melogold grapefruit, the Gold Nugget mandarin (or tangerine), and others that have yet to be given trademark names.

    To assist entrepreneurs in developing new products, UC Riverside is a primary partner in the Riverside Regional Technology Park, which includes the City of Riverside and the County of Riverside. It also administers six reserves of the University of California Natural Reserve System. UC Riverside recently announced a partnership with China Agricultural University[中国农业大学](CN) to launch a new center in Beijing, which will study ways to respond to the country’s growing environmental issues. UC Riverside is also the birthplace of two name reactions in organic chemistry: the Castro–Stephens coupling and the Midland Alpine borane reduction.

     
  • richardmitnick 11:56 am on March 5, 2021
    Tags: "Physicists Just Found 4 New Subatomic Particles That May Test The Laws of Nature", , , , CERN LHC, CERN(CH), , Hadrons, , Mesons, , , , Protons and neutrons, , Quarks and antiquarks, , , , Tetraquarks and pentaquarks, The four new particles we've discovered recently are all tetraquarks with a charm quark pair and two other quarks., The standard model is certainly not the last word in the understanding of particles., These models are crucial to achieve the ultimate goal of the LHC: find physics beyond the standard model.   

    From CERN(CH) via Science Alert(AU): “Physicists Just Found 4 New Subatomic Particles That May Test The Laws of Nature” 



    From CERN(CH)

    via

    Science Alert(AU)

    5 MARCH 2021
    PATRICK KOPPENBURG
    Research Fellow in Particle Physics
    Dutch National Institute for Subatomic Physics, Dutch Research Council (NWO – Nederlandse Organisatie voor Wetenschappelijk Onderzoek)(NL)

    Harry Cliff
    Particle physicist
    University of Cambridge(UK).

    The Large Hadron Collider. Credit: CERN.

    This month is a time to celebrate. CERN has just announced the discovery of four brand new particles [3 March 2021: Observation of two cc̄us̄ tetraquarks and two cc̄ss̄ tetraquarks.] at the Large Hadron Collider (LHC) in Geneva.

    This means that the LHC has now found a total of 59 new particles, in addition to the Nobel prize-winning Higgs boson, since it started colliding protons – particles that make up the atomic nucleus along with neutrons – in 2009.

    Excitingly, while some of these new particles were expected based on our established theories, some were altogether more surprising.

    The LHC’s goal is to explore the structure of matter at the shortest distances and highest energies ever probed in the lab – testing our current best theory of nature: the Standard Model of Particle Physics.

    Standard Model of Particle Physics (LATHAM BOYLE AND MARDUS OF WIKIMEDIA COMMONS).

    And the LHC has delivered the goods – it enabled scientists to discover the Higgs boson [below], the last missing piece of the model. That said, the theory is still far from being fully understood.

    One of its most troublesome features is its description of the strong interaction which holds the atomic nucleus together. The nucleus is made up of protons and neutrons, which are in turn each composed of three tiny particles called quarks (there are six different kinds of quarks: up, down, charm, strange, top and bottom).

    If we switched the strong force off for a second, all matter would immediately disintegrate into a soup of loose quarks – a state that existed for a fleeting instant at the beginning of the universe.

    Don’t get us wrong: the theory of the strong interaction, pretentiously called Quantum Chromodynamics, is on very solid footing. It describes how quarks interact through the strong interaction by exchanging particles called gluons. You can think of gluons as analogues of the more familiar photon, the particle of light and carrier of the electromagnetic interaction.

    However, the way gluons interact with quarks makes the strong interaction behave very differently from electromagnetism. While the electromagnetic interaction gets weaker as you pull two charged particles apart, the strong interaction actually gets stronger as you pull two quarks apart.

    As a result, quarks are forever locked up inside particles called hadrons – particles made of two or more quarks – which include protons and neutrons. Unless, of course, you smash them open at incredible speeds, as we are doing at CERN.

    To complicate matters further, all the particles in the standard model have antiparticles which are nearly identical to themselves but with the opposite charge (or other quantum property). If you pull a quark out of a proton, the force will eventually be strong enough to create a quark-antiquark pair, with the newly created quark going into the proton.

    You end up with a proton and a brand new “meson”, a particle made of a quark and an antiquark. This may sound weird but according to quantum mechanics, which rules the universe on the smallest of scales, particles can pop out of empty space.

    This has been shown repeatedly by experiments – we have never seen a lone quark. An unpleasant feature of the theory of the strong interaction is that calculations of what would be a simple process in electromagnetism can end up being impossibly complicated. We therefore cannot (yet) prove theoretically that quarks can’t exist on their own.

    Worse still, we can’t even calculate which combinations of quarks would be viable in nature and which would not.

    Illustration of a tetraquark. Credit: CERN.

    When quarks were first discovered, scientists realized that several combinations should be possible in theory. This included pairs of quarks and antiquarks (mesons); three quarks (baryons); three antiquarks (antibaryons); two quarks and two antiquarks (tetraquarks); and four quarks and one antiquark (pentaquarks) – as long as the number of quarks minus antiquarks in each combination was a multiple of three.
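    As a toy illustration of the counting rule just quoted (this snippet is not part of the original article), the allowed combinations are exactly those where the number of quarks minus antiquarks is a multiple of three:

    ```python
    # Check which quark/antiquark combinations satisfy the rule above:
    # (number of quarks - number of antiquarks) must be a multiple of 3.
    def allowed(n_quarks, n_antiquarks):
        return (n_quarks - n_antiquarks) % 3 == 0

    combos = {
        "meson (q qbar)": (1, 1),
        "baryon (qqq)": (3, 0),
        "antibaryon (qbar qbar qbar)": (0, 3),
        "tetraquark (qq qbar qbar)": (2, 2),
        "pentaquark (qqqq qbar)": (4, 1),
        "lone quark (q)": (1, 0),
    }
    for name, (nq, nqbar) in combos.items():
        print(f"{name:30s} allowed: {allowed(nq, nqbar)}")
    ```

    Every combination listed in the text passes the check; a lone quark does not.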

    For a long time, only baryons and mesons were seen in experiments. But in 2003, the Belle experiment in Japan discovered a particle that didn’t fit in anywhere.

    KEK Belle detector, at the High Energy Accelerator Research Organisation (KEK) in Tsukuba, Ibaraki Prefecture, Japan.

    Belle II KEK High Energy Accelerator Research Organization Tsukuba, Japan.

    It turned out to be the first of a long series of tetraquarks.

    In 2015, the LHCb experiment [below] at the LHC discovered two pentaquarks.

    Is a pentaquark tightly (above) or weakly bound (see image below)? Credit: CERN.

    The four new particles we’ve discovered recently are all tetraquarks with a charm quark pair and two other quarks. All these objects are particles in the same way as the proton and the neutron are particles. But they are not fundamental particles: quarks and electrons are the true building blocks of matter.

    Charming new particles

    The LHC has now discovered 59 new hadrons. These include the tetraquarks most recently discovered, but also new mesons and baryons. All these new particles contain heavy quarks such as “charm” and “bottom”.

    These hadrons are interesting to study. They tell us what nature considers acceptable as a bound combination of quarks, even if only for very short times.

    They also tell us what nature does not like. For example, why do all tetra- and pentaquarks contain a charm-quark pair (with just one exception)? And why are there no corresponding particles with strange-quark pairs? There is currently no explanation.

    Is a pentaquark a molecule? A meson (left) interacting with a proton (right). Credit: CERN.

    Another mystery is how these particles are bound together by the strong interaction. One school of theorists considers them to be compact objects, like the proton or the neutron.

    Others claim they are akin to “molecules” formed by two loosely bound hadrons. Each newly found hadron allows experiments to measure its mass and other properties, which tell us something about how the strong interaction behaves. This helps bridge the gap between experiment and theory. The more hadrons we can find, the better we can tune the models to the experimental facts.

    These models are crucial to achieve the ultimate goal of the LHC: find physics beyond the standard model. Despite its successes, the standard model is certainly not the last word in the understanding of particles. It is for instance inconsistent with cosmological models describing the formation of the universe.

    The LHC is searching for new fundamental particles that could explain these discrepancies. These particles could be visible at the LHC, but hidden in the background of particle interactions. Or they could show up as small quantum mechanical effects in known processes.

    In either case, a better understanding of the strong interaction is needed to find them. With each new hadron, we improve our knowledge of nature’s laws, leading us to a better description of the most fundamental properties of matter.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Meet CERN(CH) in a variety of places:

    Quantum Diaries

    Cern Courier(CH)

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS

    CERN ATLAS Image Claudia Marcelloni CERN/ATLAS


    ALICE

    CERN/ALICE Detector


    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    LHC

    CERN map

    CERN LHC Maximilien Brice and Julien Marius Ordan.


    SixTRack CERN LHC particles

    The European Organization for Nuclear Research (Organisation européenne pour la recherche nucléaire)(EU), known as CERN, is a European research organization that operates the largest particle physics laboratory in the world. Established in 1954, the organization is based in a northwest suburb of Geneva on the Franco–Swiss border and has 23 member states. Israel is the only non-European country granted full membership. CERN is an official United Nations Observer.

    The acronym CERN is also used to refer to the laboratory, which in 2019 had 2,660 scientific, technical, and administrative staff members, and hosted about 12,400 users from institutions in more than 70 countries. In 2016 CERN generated 49 petabytes of data.

    CERN’s main function is to provide the particle accelerators and other infrastructure needed for high-energy physics research – as a result, numerous experiments have been constructed at CERN through international collaborations. The main site at Meyrin hosts a large computing facility, which is primarily used to store and analyse data from experiments, as well as simulate events. Researchers need remote access to these facilities, so the lab has historically been a major wide area network hub. CERN is also the birthplace of the World Wide Web.

    The convention establishing CERN was ratified on 29 September 1954 by 12 countries in Western Europe. The acronym CERN originally represented the French words for Conseil Européen pour la Recherche Nucléaire (European Council for Nuclear Research), which was a provisional council for building the laboratory, established by 12 European governments in 1952. The acronym was retained for the new laboratory after the provisional council was dissolved, even though the name changed to the current Organisation Européenne pour la Recherche Nucléaire (European Organization for Nuclear Research)(EU) in 1954. According to Lew Kowarski, a former director of CERN, when the name was changed, the abbreviation could have become the awkward OERN, and Werner Heisenberg said that this could “still be CERN even if the name is [not]”.

    CERN’s first president was Sir Benjamin Lockspeiser. Edoardo Amaldi was the general secretary of CERN at its early stages when operations were still provisional, while the first Director-General (1954) was Felix Bloch.

    The laboratory was originally devoted to the study of atomic nuclei, but was soon applied to higher-energy physics, concerned mainly with the study of interactions between subatomic particles. Therefore, the laboratory operated by CERN is commonly referred to as the European laboratory for particle physics (Laboratoire européen pour la physique des particules), which better describes the research being performed there.

    Founding members

    At the sixth session of the CERN Council, which took place in Paris from 29 June – 1 July 1953, the convention establishing the organization was signed, subject to ratification, by 12 states. The convention was gradually ratified by the 12 founding Member States: Belgium, Denmark, France, the Federal Republic of Germany, Greece, Italy, the Netherlands, Norway, Sweden, Switzerland, the United Kingdom, and “Yugoslavia”.

    Scientific achievements

    Several important achievements in particle physics have been made through experiments at CERN. They include:

    1973: The discovery of neutral currents in the Gargamelle bubble chamber.
    1983: The discovery of W and Z bosons in the UA1 and UA2 experiments.
    1989: The determination of the number of light neutrino families at the Large Electron–Positron Collider (LEP) operating on the Z boson peak.
    1995: The first creation of antihydrogen atoms in the PS210 experiment.
    1999: The discovery of direct CP violation in the NA48 experiment.
    2010: The isolation of 38 atoms of antihydrogen.
    2011: Maintaining antihydrogen for over 15 minutes.
    2012: A boson with mass around 125 GeV/c² consistent with the long-sought Higgs boson.

    In September 2011, CERN attracted media attention when the OPERA Collaboration reported the detection of possibly faster-than-light neutrinos. Further tests showed that the results were flawed due to an incorrectly connected GPS synchronization cable.

    The 1984 Nobel Prize for Physics was awarded to Carlo Rubbia and Simon van der Meer for the developments that resulted in the discoveries of the W and Z bosons. The 1992 Nobel Prize for Physics was awarded to CERN staff researcher Georges Charpak “for his invention and development of particle detectors, in particular the multiwire proportional chamber”. The 2013 Nobel Prize for Physics was awarded to François Englert and Peter Higgs for the theoretical description of the Higgs mechanism in the year after the Higgs boson was found by CERN experiments.

    Computer science

    The World Wide Web began as a CERN project named ENQUIRE, initiated by Tim Berners-Lee in 1989 and Robert Cailliau in 1990. Berners-Lee and Cailliau were jointly honoured by the Association for Computing Machinery in 1995 for their contributions to the development of the World Wide Web.

    Current complex

    CERN operates a network of six accelerators and a decelerator. Each machine in the chain increases the energy of particle beams before delivering them to experiments or to the next more powerful accelerator. Currently (as of 2019) active machines are:

    The LINAC 3 linear accelerator generating low energy particles. It provides heavy ions at 4.2 MeV/u for injection into the Low Energy Ion Ring (LEIR).
    The Proton Synchrotron Booster increases the energy of particles generated by the proton linear accelerator before they are transferred to the other accelerators.
    The Low Energy Ion Ring (LEIR) accelerates the ions from the ion linear accelerator LINAC 3, before transferring them to the Proton Synchrotron (PS). This accelerator was commissioned in 2005, after having been reconfigured from the previous Low Energy Antiproton Ring (LEAR).
    The 28 GeV Proton Synchrotron (PS), built during 1954—1959 and still operating as a feeder to the more powerful SPS.
    The Super Proton Synchrotron (SPS), a circular accelerator with a diameter of 2 kilometres built in a tunnel, which started operation in 1976. It was designed to deliver an energy of 300 GeV and was gradually upgraded to 450 GeV. As well as having its own beamlines for fixed-target experiments (currently COMPASS and NA62), it has been operated as a proton–antiproton collider (the SppS collider), and for accelerating high energy electrons and positrons which were injected into the Large Electron–Positron Collider (LEP). Since 2008, it has been used to inject protons and heavy ions into the Large Hadron Collider (LHC).
    The On-Line Isotope Mass Separator (ISOLDE), which is used to study unstable nuclei. The radioactive ions are produced by the impact of protons at an energy of 1.0–1.4 GeV from the Proton Synchrotron Booster. It was first commissioned in 1967 and was rebuilt with major upgrades in 1974 and 1992.
    The Antiproton Decelerator (AD), which reduces the velocity of antiprotons to about 10% of the speed of light for research of antimatter.[50] The AD machine was reconfigured from the previous Antiproton Collector (AC) machine.
    The AWAKE experiment, which is a proof-of-principle plasma wakefield accelerator.
    The CERN Linear Electron Accelerator for Research (CLEAR) accelerator research and development facility.

    Large Hadron Collider

    Many activities at CERN currently involve operating the Large Hadron Collider (LHC) and the experiments for it. The LHC represents a large-scale, worldwide scientific cooperation project.

    The LHC tunnel is located 100 metres underground, in the region between the Geneva International Airport and the nearby Jura mountains. The majority of its length is on the French side of the border. It uses the 27 km circumference circular tunnel previously occupied by the Large Electron–Positron Collider (LEP), which was shut down in November 2000. CERN’s existing PS/SPS accelerator complexes are used to pre-accelerate protons and lead ions which are then injected into the LHC.

    Eight experiments (CMS, ATLAS, LHCb, MoEDAL, TOTEM, LHCf, FASER and ALICE) are located along the collider; each of them studies particle collisions from a different aspect, and with different technologies. Construction for these experiments required an extraordinary engineering effort. For example, a special crane was rented from Belgium to lower pieces of the CMS detector into its cavern, since each piece weighed nearly 2,000 tons. The first of the approximately 5,000 magnets necessary for construction was lowered down a special shaft at 13:00 GMT on 7 March 2005.

    The LHC has begun to generate vast quantities of data, which CERN streams to laboratories around the world for distributed processing (making use of a specialized grid infrastructure, the LHC Computing Grid). During April 2005, a trial successfully streamed 600 MB/s to seven different sites across the world.

    The initial particle beams were injected into the LHC in August 2008. The first beam was circulated through the entire LHC on 10 September 2008, but the system failed 10 days later because of a faulty magnet connection, and it was stopped for repairs on 19 September 2008.

    The LHC resumed operation on 20 November 2009 by successfully circulating two beams, each with an energy of 3.5 teraelectronvolts (TeV). The challenge for the engineers was then to try to line up the two beams so that they smashed into each other. This is like “firing two needles across the Atlantic and getting them to hit each other” according to Steve Myers, director for accelerators and technology.

    On 30 March 2010, the LHC successfully collided two proton beams with 3.5 TeV of energy per proton, resulting in a 7 TeV collision energy. However, this was just the start of what was needed for the expected discovery of the Higgs boson. When the 7 TeV experimental period ended, the LHC revved to 8 TeV (4 TeV per proton) starting March 2012, and soon began particle collisions at that energy. In July 2012, CERN scientists announced the discovery of a new sub-atomic particle that was later confirmed to be the Higgs boson.

    CERN CMS Higgs Event May 27, 2012.


    CERN ATLAS Higgs Event
    June 12, 2012.


    Peter Higgs

    In March 2013, CERN announced that the measurements performed on the newly found particle allowed it to conclude that this is a Higgs boson. In early 2013, the LHC was deactivated for a two-year maintenance period, to strengthen the electrical connections between magnets inside the accelerator and for other upgrades.

    On 5 April 2015, after two years of maintenance and consolidation, the LHC restarted for a second run. The first ramp to the record-breaking energy of 6.5 TeV was performed on 10 April 2015. In 2016, the design collision rate was exceeded for the first time. A second two-year shutdown began at the end of 2018.

    Accelerators under construction

    As of October 2019, construction is ongoing to upgrade the LHC’s luminosity in a project called the High-Luminosity LHC (HL-LHC).

    This project should see the LHC accelerator upgraded by 2026 to an order of magnitude higher luminosity.

    As part of the HL-LHC upgrade project, other CERN accelerators and their subsystems are also receiving upgrades. Among other work, the LINAC 2 linear accelerator injector was decommissioned and replaced by a new injector accelerator, LINAC 4, in 2020.

    Possible future accelerators

    CERN, in collaboration with groups worldwide, is investigating two main concepts for future accelerators: the Compact Linear Collider (CLIC), a linear electron–positron collider based on a new acceleration concept to reach higher energies, and a larger version of the LHC, a project currently named the Future Circular Collider (FCC).

    CLIC collider

    CERN FCC Future Circular Collider: details of the proposed 100 km-circumference successor to the LHC.

    Not discussed or described here, but worthy of consideration, is the International Linear Collider (ILC), in the planning stages for construction in Japan.

    ILC schematic, being planned for the Kitakami highland, in the Iwate prefecture of northern Japan.

    Participation

    Since its foundation by 12 members in 1954, CERN has regularly accepted new members. All new members have remained in the organization continuously since their accession, except Spain and Yugoslavia. Spain first joined CERN in 1961, withdrew in 1969, and rejoined in 1983. Yugoslavia was a founding member of CERN but quit in 1961. Of the 23 members, Israel joined CERN as a full member on 6 January 2014, becoming the first (and currently only) non-European full member.

    Enlargement

    Associate Members, Candidates:

    Turkey signed an association agreement on 12 May 2014 and became an associate member on 6 May 2015.
    Pakistan signed an association agreement on 19 December 2014 and became an associate member on 31 July 2015.
    Cyprus signed an association agreement on 5 October 2012 and became an associate Member in the pre-stage to membership on 1 April 2016.
    Ukraine signed an association agreement on 3 October 2013. The agreement was ratified on 5 October 2016.
    India signed an association agreement on 21 November 2016. The agreement was ratified on 16 January 2017.
    Slovenia was approved for admission as an Associate Member state in the pre-stage to membership on 16 December 2016. The agreement was ratified on 4 July 2017.
    Lithuania was approved for admission as an Associate Member state on 16 June 2017. The association agreement was signed on 27 June 2017 and ratified on 8 January 2018.
    Croatia was approved for admission as an Associate Member state on 28 February 2019. The agreement was ratified on 10 October 2019.
    Estonia was approved for admission as an Associate Member state in the pre-stage to membership on 19 June 2020. The agreement was ratified on 1 February 2021.

     
  • richardmitnick 9:51 am on February 22, 2021
    Tags: "Coffea speeds up particle physics data analysis", , , CERN LHC, , , , ,   

    From DOE’s Fermi National Accelerator Laboratory(US): “Coffea speeds up particle physics data analysis” 

    FNAL Art Image by Angela Gonzales

    From DOE’s Fermi National Accelerator Laboratory(US), an enduring source of strength for the US contribution to scientific research worldwide.

    February 19, 2021
    Scott Hershberger

    Analyzing the mountains of data generated by the Large Hadron Collider at the European laboratory CERN takes so much time that even the computers need coffee. Or rather, Coffea — Columnar Object Framework for Effective Analysis.

    A package in the programming language Python, Coffea (pronounced like the stimulating beverage) speeds up the analysis of massive data sets in high-energy physics research. Although Coffea streamlines computation, the software’s primary goal is to optimize scientists’ time.

    “The efficiency of a human being in producing scientific results is of course affected by the tools that you have available,” said Matteo Cremonesi, a postdoc at the U.S. Department of Energy’s Fermi National Accelerator Laboratory. “If it takes more than a day for me to get a single number out of a computation — which often happens in high-energy physics — that’s going to hamper my efficiency as a scientist.”

    Frustrated by the tedious manual work they faced when writing computer code to analyze LHC data, Cremonesi and Fermilab scientist Lindsey Gray assembled a team of Fermilab researchers in 2018 to adapt cutting-edge big data techniques to solve the most challenging questions in high-energy physics. Since then, around a dozen research groups on the CMS experiment — one of the LHC’s two large general-purpose detectors — have adopted Coffea for their work.

    CERN CMS. Around a dozen research groups on the CMS experiment at the Large Hadron Collider have adopted the Coffea data analysis tool for their work. Starting from information about the particles generated in collisions, Coffea enables large statistical analyses that hone researchers’ understanding of the underlying physics, enabling faster run times and more efficient use of computing resources. Credit: CERN.

    CERN map

    SixTrack CERN (CH) LHC particles.

    Starting from information about the particles generated in collisions, Coffea enables large statistical analyses that hone researchers’ understanding of the underlying physics. (Data processing facilities at the LHC carry out the initial conversion of raw data into a format particle physicists can use for analysis.) A typical analysis on the current LHC data set involves processing roughly 10 billion particle events, which can add up to over 50 terabytes of data. That’s the data equivalent of approximately 25,000 hours of streaming video on Netflix.

    At the heart of Fermilab’s analysis tool lies a shift from a method known as event loop analysis to one called columnar analysis.

    “You have a choice whether you want to iterate over each row and do an operation within the columns or if you want to iterate over the operations you’re doing and attack all the rows at once,” explained Fermilab postdoctoral researcher Nick Smith, the main developer of Coffea. “It’s sort of an order-of-operations thing.”

    For example, imagine that for each row, you want to add together the numbers in three columns. In event loop analysis, you would start by adding together the three numbers in the first row. Then you would add together the three numbers in the second row, then move on to the third row, and so on. With a columnar approach, by contrast, you would start by adding the first and second columns for all the rows. Then you would add that result to the third column for all the rows.
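    To make the row-versus-column distinction concrete, here is a small, hedged sketch in plain NumPy rather than Coffea itself; the table contents and sizes are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    table = rng.random((1_000_000, 3))   # one million rows, three columns

    def row_by_row(t):
        """Event-loop style: visit each row and add its three numbers."""
        totals = np.empty(len(t))
        for i, (a, b, c) in enumerate(t):
            totals[i] = a + b + c
        return totals

    def columnar(t):
        """Columnar style: operate on whole columns at once."""
        return t[:, 0] + t[:, 1] + t[:, 2]

    # Both give identical answers; the columnar version is typically orders of
    # magnitude faster in Python because the loop runs inside compiled code.
    assert np.allclose(row_by_row(table[:1000]), columnar(table[:1000]))
    ```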

    “In both cases, the end result would be the same,” Smith said. “But there are some trade-offs you make under the hood, in the machine, that have a big impact on efficiency.”

    In data sets with many rows, columnar analysis runs around 100 times faster than event loop analysis in Python. Yet prior to Coffea, particle physicists primarily used event loop analysis in their work — even for data sets with millions or billions of collisions.

    The Fermilab researchers decided to pursue a columnar approach, but they faced a glaring challenge: High-energy physics data cannot easily be represented as a table with rows and columns. One particle collision might generate a slew of muons and few electrons, while the next might produce no muons and many electrons. Building on a library of Python code called Awkward Array, the team devised a way to convert the irregular, nested structure of LHC data into tables compatible with columnar analysis. Generally, each row corresponds to one collision, and each column corresponds to a property of a particle created in the collision.
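    As a rough illustration of that jagged structure (the field names and values below are made up, and this is not Coffea’s actual data model), the Awkward Array library lets such data be manipulated column-wise even though every event has a different particle count:

    ```python
    import awkward as ak

    # Three hypothetical collision events with varying numbers of muons/electrons.
    events = ak.Array([
        {"muon_pt": [45.2, 33.1], "electron_pt": [12.0]},
        {"muon_pt": [],           "electron_pt": [25.4, 18.9, 10.2]},
        {"muon_pt": [60.0],       "electron_pt": []},
    ])

    # Columnar operations act on all events at once.
    n_muons = ak.num(events["muon_pt"])          # counts per event: [2, 0, 1]
    leading = ak.max(events["muon_pt"], axis=1)  # leading muon pt, None if no muons
    print(n_muons.tolist(), leading.tolist())
    ```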

    Coffea’s benefits extend beyond faster run times — minutes rather than the hours or days needed with plain interpreted Python code — and more efficient use of computing resources. The software takes mundane coding decisions out of the hands of the scientists, allowing them to work on a more abstract level with fewer chances to make errors.

    “Researchers are not here to be programmers,” Smith said. “They’re here to be data scientists.”

    Cremonesi, who searches for dark matter at CMS, was among the first researchers to use Coffea with no backup system. At first, he and the rest of the Fermilab team actively sought to persuade other groups to try the tool. Now, researchers frequently approach them asking how to apply Coffea to their own work.

    Soon, Coffea’s use will expand beyond CMS. Researchers at the Institute for Research and Innovation in Software for High Energy Physics, supported by the U.S. National Science Foundation, plan to incorporate Coffea into future analysis systems for both CMS and ATLAS, the LHC’s other large general-purpose experimental detector.

    CERN (CH) ATLAS Credit: Claudia Marcelloni.

    An upgrade to the LHC known as the High-Luminosity LHC, targeted for completion in the mid-2020s, will record about 100 times as much data, making the efficient data analysis offered by Coffea even more valuable for the LHC experiments’ international collaborators.

    In the future, the Fermilab team also plans to break Coffea into several Python packages, allowing researchers to use just the pieces relevant to them. For instance, some scientists use Coffea mainly for its histogram feature, Gray said.

    For the Fermilab researchers, the success of Coffea reflects a necessary shift in particle physicists’ mindset.

    “Historically, the way we do science focuses a lot on the hardware component of creating an experiment,” Cremonesi said. “But we have reached an era in physics research where handling the software component of our scientific process is just as important.”

    Coffea promises to bring high-energy physics into sync with recent advances in big data in other scientific fields. This cross-pollination may prove to be Coffea’s most far-reaching benefit.

    “I think it’s important for us as a community in high-energy physics to think about what kind of skills we’re imparting to the people that we’re training,” Gray said. “Making sure that we as a field are pertinent to the rest of the world when it comes to data science is a good thing to do.”

    U.S. participation in CMS is supported by the Department of Energy Office of Science.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition


    Fermi National Accelerator Laboratory(US), located just outside Batavia, Illinois, near Chicago, is a United States Department of Energy national laboratory specializing in high-energy particle physics. Since 2007, Fermilab has been operated by the Fermi Research Alliance, a joint venture of the University of Chicago, and the Universities Research Association (URA). Fermilab is a part of the Illinois Technology and Research Corridor.

    Fermilab’s Tevatron was a landmark particle accelerator; until the startup in 2008 of the Large Hadron Collider (LHC) near Geneva, Switzerland, it was the most powerful particle accelerator in the world, accelerating protons and antiprotons to energies of 980 GeV and producing proton–antiproton collisions with energies of up to 1.96 TeV, the first accelerator to reach one “tera-electron-volt” energy. At 3.9 miles (6.3 km), it was the world’s fourth-largest particle accelerator in circumference. One of its most important achievements was the 1995 discovery of the top quark, announced by research teams using the Tevatron’s CDF and DØ detectors. It was shut down in 2011.

    In addition to high-energy collider physics, Fermilab hosts fixed-target and neutrino experiments, such as MicroBooNE (Micro Booster Neutrino Experiment), NOνA (NuMI Off-Axis νe Appearance) and SeaQuest. Completed neutrino experiments include MINOS (Main Injector Neutrino Oscillation Search), MINOS+, MiniBooNE and SciBooNE (SciBar Booster Neutrino Experiment). The MiniBooNE detector was a 40-foot (12 m) diameter sphere containing 800 tons of mineral oil lined with 1,520 phototube detectors. An estimated 1 million neutrino events were recorded each year. SciBooNE sat in the same neutrino beam as MiniBooNE but had fine-grained tracking capabilities. The NOνA experiment uses, and the MINOS experiment used, Fermilab’s NuMI (Neutrinos at the Main Injector) beam, which is an intense beam of neutrinos that travels 455 miles (732 km) through the Earth to the Soudan Mine in Minnesota and the Ash River, Minnesota, site of the NOνA far detector. In 2017, the ICARUS neutrino experiment was moved from CERN to Fermilab, with plans to begin operation in 2020.

    In the public realm, Fermilab is home to a native prairie ecosystem restoration project and hosts many cultural events: public science lectures and symposia, classical and contemporary music concerts, folk dancing and arts galleries. The site is open from dawn to dusk to visitors who present valid photo identification.

    Asteroid 11998 Fermilab is named in honor of the laboratory.

    Weston, Illinois, was a community next to Batavia voted out of existence by its village board in 1966 to provide a site for Fermilab.

    The laboratory was founded in 1969 as the National Accelerator Laboratory; it was renamed in honor of Enrico Fermi in 1974. The laboratory’s first director was Robert Rathbun Wilson, under whom the laboratory opened ahead of time and under budget. Many of the sculptures on the site are of his creation. He is the namesake of the site’s high-rise laboratory building, whose unique shape has become the symbol for Fermilab and which is the center of activity on the campus.

    After Wilson stepped down in 1978 to protest the lack of funding for the lab, Leon M. Lederman took on the job. It was under his guidance that the original accelerator was replaced with the Tevatron, an accelerator capable of colliding protons and antiprotons at a combined energy of 1.96 TeV. Lederman stepped down in 1989. The science education center at the site was named in his honor.

    The later directors include:

    John Peoples, 1989 to 1996
    Michael S. Witherell, July 1999 to June 2005
    Piermaria Oddone, July 2005 to July 2013
    Nigel Lockyer, September 2013 to the present

    Fermilab continues to participate in the work at the Large Hadron Collider (LHC); it serves as a Tier 1 site in the Worldwide LHC Computing Grid.

     
  • richardmitnick 11:34 am on February 10, 2021
    Tags: "Turbocharging data", , , , CERN LHC, , High-performance computing (HPC) and concurrent software improvements are supercharging data-processing., Large-scale experiments such as ATLAS and NSLS-II have their own special data challenges that HPC can help solve., many-core CPUs and graphics processing units (GPUs)., , , So far GPUs don’t function stand alone-they’re attached to CPUs as accelerators., , To get maximum performance researchers must figure out how to move data-and how much to move-from CPU memory to GPUs., Today’s supercomputing systems have massively parallel architectures that simultaneously employ thousands of multicore central processing units (CPUs)   

    From DEIXIS: “Turbocharging data” 


    From DEIXIS

    February 2021
    Sally Johnson

    Pairing large-scale experiments with high-performance computing can reduce data processing time from several hours to minutes.

    High-performance computing (HPC) and concurrent software improvements are supercharging data-processing, just as giant experimental facilities such as CERN’s Large Hadron Collider (LHC) particle accelerator and Brookhaven National Laboratory’s National Synchrotron Light Source II (NSLS-II) seek ways to sift mountains of information.

    CERN LHC Maximilien Brice and Julien Marius Ordan.

    BNL NSLS-II.

    LHC’s ATLAS, which observes the highest-energy proton-proton collisions like those in the Higgs boson discovery, and imaging experiments at the NSLS-II generate “anywhere from several gigabytes to hundreds of petabytes in terms of data volume,” says Meifeng Lin, who leads the High Performance Computing Group for Brookhaven’s Computational Science Initiative (CSI).

    CERN (CH) ATLAS Image Claudia Marcelloni.


    CERN ATLAS Higgs Event
    June 12, 2012.

    The Scientific Computing and Data Center at Brookhaven National Laboratory. Credit: Brookhaven National Laboratory.

    These data typically must go through multiple processing stages to help scientists find meaningful physical results.

    “It’s a perfect marriage between HPC and these big experiments – in the sense that HPC can dramatically improve the speed of data processing,” Lin says.

    Today’s supercomputing systems have massively parallel architectures that simultaneously employ hundreds of thousands of multicore central processing units (CPUs), many-core CPUs and graphics processing units (GPUs), Lin says. “These architectures offer a lot of processing power, which was initially needed for large-scale simulations but is now also used” to quickly analyze large amounts of data.

    But “there are many different HPC architectures, and for both CPUs and GPUs there are several levels of parallelism,” Lin says. “The complicated part is to harness the power of all of these different levels.”

    GPUs offer thousands of processing units. Intricacies such as how data are stored in memory can affect their performance, Lin says.

    “So far, GPUs don’t function stand-alone,” she says. “They’re attached to CPUs as accelerators. So a GPU program will launch from the CPU side and data initially get stored on CPU memory. For GPUs to process the data, you need to first transfer the data from CPU memory to GPU memory, although there are newer technologies that can circumvent this step.”

    To get maximum performance, researchers must figure out how to move data – and how much to move – from CPU memory to GPUs. “Moving these data,” Lin notes, “incurs some computational overhead.” If the data transfer is large, “it requires quite a bit of time compared to the computational speed GPUs can offer.” Multiple issues “affect how you program your software and how much computational speedup you can get.”
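    A minimal CuPy sketch of the host-to-device traffic described here, assuming an NVIDIA GPU with CuPy installed; the array size and FFT workload are arbitrary stand-ins for a real analysis kernel:

    ```python
    import numpy as np
    import cupy as cp

    data_cpu = np.random.random((4096, 4096))   # lives in CPU (host) memory

    data_gpu = cp.asarray(data_cpu)             # explicit copy: CPU -> GPU memory
    result_gpu = cp.fft.fft2(data_gpu)          # the computation runs on the GPU
    result_cpu = cp.asnumpy(result_gpu)         # copy the answer back: GPU -> CPU

    # For small workloads the two copies can cost more than the computation
    # itself, which is why how much data to move (and how often) matters.
    ```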

    Large-scale experiments such as ATLAS and NSLS-II have their own special data challenges that HPC can help solve.

    Particle physics experiments can run a long time, accumulating loads of data. Algorithms must remove background noise from these troves so scientists can find the signals they seek. “To do this, they have a very large and complex software stack with millions of lines of code,” Lin says. “The signals the scientists want are usually very weak compared to the background noise they need to ignore. So their software undergoes a very rigorous validation and verification process, and changing it is a long and complicated process.”

    In NSLS-II imaging experiments, scientists shine powerful X-rays through materials to obtain structural or physical information. “The software is relatively stand-alone and compact in size, so it’s easier for us to adapt it to make use of the HPC platform,” Lin says.

    Lin cites ptychography, an X-ray microscopy technique that employs computation, as an example of how HPC can accelerate data processing. A computer program reconstructs physical sample information from X-ray scattering data gathered at the NSLS-II’s Hard X-ray Nanoprobe beamline. The code “did the job, but it took a very long time to reconstruct even one sample – up to 10 hours or days, depending on the imaging technique or algorithm used.”

    Lin helped Brookhaven physicist Xiaojing Huang parallelize the computation-heavy ptychography reconstruction algorithm over multiple GPUs. “This,” Huang explains, “increases the calculation speed by more than a thousand times, which shortens the image recovery process from days to minutes. The real-time image processing capability is critical for providing online feedback and optimizing our scientific productivity.”

    To achieve this speedup, Lin and her CSI and NSLS-II colleagues reprogrammed the code to make it run on multiple GPUs. It wasn’t easy. The original software was written for serial processing – running on a single CPU core.

    “We want to be able to break the calculation into several independent pieces part of the time,” Lin says. “And we have to combine them so we’re solving the original big problem, not just individual pieces.”
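    A hypothetical sketch of that decomposition pattern, again using CuPy and assuming several CUDA GPUs are available – this shows the general idea, not the actual NSLS-II ptychography reconstruction code:

```python
# Hypothetical sketch of splitting independent pieces of work across several GPUs
# with CuPy and recombining them on the CPU. This illustrates the general pattern,
# not the actual NSLS-II ptychography reconstruction code.
import numpy as np
import cupy as cp

def process_chunk(chunk_host: np.ndarray) -> np.ndarray:
    """Placeholder for the per-chunk reconstruction step."""
    chunk_dev = cp.asarray(chunk_host)        # move this chunk to the current GPU
    out_dev = cp.abs(cp.fft.fft2(chunk_dev))  # stand-in for the real computation
    return cp.asnumpy(out_dev)                # bring the partial result back

data = np.random.rand(8, 512, 512)            # 8 independent pieces of work
n_gpus = max(cp.cuda.runtime.getDeviceCount(), 1)

partial_results = []
for i, chunk in enumerate(data):
    with cp.cuda.Device(i % n_gpus):          # round-robin the chunks over GPUs
        partial_results.append(process_chunk(chunk))

combined = np.stack(partial_results)          # reassemble the original problem
print(combined.shape)
```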

    One of the most difficult parts was finding the right model to let scientists write the GPU code without going into low-level programming descriptions. “A common programming model for GPUs is CUDA, which has a learning curve,” Lin says. “We wanted to provide the scientists with an interface that’s independent of architecture-specific programming and easy to maintain.”

    The team tapped CuPy, an open-source project that lets users program GPUs with a language they’re already familiar with and use in production. The CuPy interface is compatible with NumPy, the core scientific computing library in the Python language, on which the code is based.
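    That NumPy compatibility is what makes the interface architecture-independent in practice. A generic sketch (not the Brookhaven code itself) of the common dispatch pattern:

```python
# Generic sketch of hardware-agnostic array code with NumPy/CuPy.
# cupy.get_array_module returns either numpy or cupy depending on where the
# input array lives, so the algorithm only has to be written once.
import numpy as np
import cupy as cp

def normalize(x):
    xp = cp.get_array_module(x)     # numpy for CPU arrays, cupy for GPU arrays
    return (x - xp.mean(x)) / xp.std(x)

cpu_array = np.random.rand(1_000_000)
gpu_array = cp.asarray(cpu_array)

print(normalize(cpu_array).dtype)   # runs on the CPU with NumPy
print(normalize(gpu_array).dtype)   # runs on the GPU with CuPy
```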

    One leading CuPy contributor is Leo Fang, a CSI assistant computational scientist who maintains the group’s ptychography software. CuPy and other open-source projects, he says, build “a critical, scalable infrastructure for our scientists and their applications.” The work “benefits the entire open-source community. With this capability, we shift the bottleneck of our end-to-end workflow elsewhere.”

    Lin and her colleagues are working to make the software portable across various HPC architectures, such as GPUs from different vendors, and to expand the user base. So far, they’ve run their code on NVIDIA and AMD GPUs as well as multicore CPUs. Fang hopes “our users can easily run the code, either on a workstation at home or in a cluster, without being tied to a specific hardware vendor. It’s cool to be able to do this with little to no refactoring in our code.”

    Lin next wants to bring portable HPC solutions to particle physics experiments such as those at the LHC. Its software has special challenges. She and her colleagues are eager to take them on.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Deixis Magazine

    DEIXIS: Computational Science at the National Laboratories is the frequently updated online companion to the eponymous annual publication of the Computational Science Graduate Fellowship. The Krell Institute manages the program for the U.S. Department of Energy.

    DOE and the Computational Science Graduate Fellowship

    The Department of Energy’s mission is to advance the national, economic, and energy security of the United States; to promote scientific and technological innovation in support of that mission; and to ensure the environmental cleanup of the national nuclear weapons complex. Its Computational Science Graduate Fellowship program provides outstanding benefits and opportunities to students pursuing a Ph.D. in scientific or engineering disciplines with an emphasis in high-performance computing.
    The Krell Institute

    Since its inception in 1997, the Krell Institute has provided superior technical resources, knowledge and experience in managing technology-based education and information programs, including two of the most successful fellowships offered by a U.S. science agency. Krell is named after the advanced civilization that once inhabited the planet Altair IV in the classic 1956 science fiction movie Forbidden Planet.

     
  • richardmitnick 1:36 pm on February 2, 2021 Permalink | Reply
    Tags: "What is luminosity?", , CERN LHC, Collisions are complicated. So physicists talk about luminosity instead., Even if two protons do interact does it count as a collision?, For integrated luminosity physicists switch from squared centimeters to a new unit of area: the barn., , In the HL-LHC 220 billion protons are expected to pass through another 220 billion protons every 25 nanoseconds at the accelerator’s four experimental intersections., , Protons are messy packages of fields and even smaller particles called quarks., Protons aren’t solid orbs that bounce break or shatter when they come into contact with each other., , Talking in barns—and an even smaller unit equal to 10^15 barns called the “femtobarn”., The barn was invented during the 1940s. Its actual size—10^24 centimeters squared-it is equivalent to the size of a uranium nucleus., The HL-LHC will increase the total number of potential collisions scientists have to study by at least a factor of 10., The rate at which particles are brought together to collide is called “instantaneous luminosity.”, Two protons could pass right through each other., Why luminosity and not collisions?   

    From Symmetry: “What is luminosity?” 

    Symmetry Mag
    From Symmetry

    02/02/21
    Sarah Charley

    1
    Illustration by Sandbox Studio, Chicago with Ariel Davis.

    Even on the hottest and driest days, rays from the sun are too weak to ignite a fire. But with a magnifying glass (or, in some unfortunate cases, a glass garden ornament), you can focus sunlight into a beam bright enough to set tinder ablaze.

    At the Large Hadron Collider, scientists apply this same principle when focusing beams of protons (or sometimes heavy ions) before passing them through the accelerator’s four collision points.

    LHC

    CERN map


    CERN LHC Maximilien Brice and Julien Marius Ordan


    CERN LHC particles

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS

    CERN ATLAS Image Claudia Marcelloni CERN/ATLAS

    ALICE

    CERN/ALICE Detector


    CMS

    CERN/CMS


    LHCb

    CERN/LHCb detector

    High-energy particle collisions allow scientists to study the fundamental laws of physics and search for new particles, fields and forces.

    By tightly focusing the proton beams right before colliding them, scientists can quickly grow the number of collision events they have to study.

    Scientists, engineers and technicians at CERN and around the world—including at Fermi National Accelerator Laboratory, Brookhaven National Laboratory and Lawrence Berkeley National Laboratory, together as part of the US Department of Energy Office of Science’s High-Luminosity LHC Accelerator Upgrade Program—are building new focusing magnets, which will squeeze the colliding protons into even smaller volumes. They’re also designing new kicker magnets, which will bump the trajectories of the incoming particles to help the two beams meet face-to-face at the collision point.

    In the late 2020s, scientists will turn on a turbocharged High-Luminosity LHC. The upgrade will increase the total number of potential collisions scientists have to study by at least a factor of 10.

    Why luminosity and not collisions?

    As you may have noticed, when physicists talk about particle collisions, they talk about a measurement called luminosity. It doesn’t tell scientists exactly how many particle collisions are happening inside a collider; rather, luminosity measures how tightly packed the particles are in the beams that cross. The tighter the squeeze, the more likely it is that some of the particles will collide.

    In the HL-LHC, 220 billion protons are expected to pass through another 220 billion protons every 25 nanoseconds at the accelerator’s four experimental intersections. But the vast majority of the protons will not actually interact with one another. Even with today’s best beam-focusing technology, the odds of a proton colliding with another proton inside the LHC ring are still significantly lower than the odds of winning the Mega Millions Jackpot.

    Protons aren’t solid orbs that bounce, break or shatter when they come into contact with each other. Rather, they are messy packages of fields and even smaller particles called quarks.

    Two protons could pass right through each other, and there’s a chance all they would do is replay that scene from the movie Ghost in which actor Patrick Swayze, playing the titular phantom, sticks his ethereal head into a moving train—to no effect. You can bring the protons into a head-on collision, but you can’t make them interact.

    Even if two protons do interact, does it count as a collision? If two protons zip past one another and the shockwave from their intersecting electromagnetic fields ejects a few photons, does that count? What if one of these stray photons plunges through the heart of another proton? What if two protons graze each other and shoot off a bunch of particles, but stay intact?

    Collisions are complicated. So physicists talk about luminosity instead.

    Collision rate

    The rate at which particles are brought together to collide is called “instantaneous luminosity.”

    “The instantaneous luminosity depends on the number of particles in each colliding beam and the area of the beams,” says Paul Lujan, a postdoc at the University of Canterbury who works on luminosity measurements for the CMS experiment. “A smaller beam size means more potential collisions per second.”

    In 2017, LHC physicists achieved a new record when they measured an instantaneous luminosity of 2.06 x 10^34 per square centimeter per second. (Multiply together the number of protons in each beam, then divide by the beam area—in square centimeters—over time.)

    “The units of luminosity are a bit non-intuitive,” Lujan says, “but it gives us exactly the information we need.”
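    For a rough sense of how those pieces combine, here is a back-of-the-envelope sketch of the standard formula for head-on Gaussian beams; the inputs are illustrative LHC-like numbers, not official machine parameters:

```python
# Back-of-the-envelope instantaneous luminosity for head-on Gaussian beams:
# L = (N1 * N2 * n_bunches * f_rev) / (4 * pi * sigma_x * sigma_y).
# The inputs are illustrative, LHC-like numbers, not official machine parameters.
import math

N1 = N2 = 1.15e11            # protons per bunch in each beam
n_bunches = 2500             # colliding bunch pairs per revolution
f_rev = 11245.0              # revolution frequency of the LHC ring, in Hz
sigma_x = sigma_y = 1.1e-3   # transverse beam size at the collision point, in cm

L_inst = (N1 * N2 * n_bunches * f_rev) / (4.0 * math.pi * sigma_x * sigma_y)
print(f"instantaneous luminosity ~ {L_inst:.1e} per cm^2 per second")
# ~2e34 cm^-2 s^-1, the same order of magnitude as the 2017 record quoted above.
```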

    When scientists load up the LHC with a new batch of particles to collide, they keep them running as long as the beams are in good enough condition with enough particles left to have a good instantaneous luminosity.

    Considering an average LHC fill lasts between 10 and 20 hours, the number of potential collisions can climb very quickly. So scientists don’t just care about instantaneous luminosity; they also care about “integrated luminosity,” how many potential collisions accumulate over those hours of running.
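    If the instantaneous luminosity were constant, the integrated luminosity would simply be that value multiplied by the running time; a rough sketch with illustrative numbers:

```python
# Integrated luminosity is instantaneous luminosity accumulated over running time
# (assumed constant here for simplicity). Numbers are illustrative only.
L_inst = 2.0e34              # instantaneous luminosity, in cm^-2 s^-1
hours_of_running = 10
seconds = hours_of_running * 3600

L_integrated = L_inst * seconds    # potential collisions per cm^2 for this fill
print(f"integrated luminosity ~ {L_integrated:.1e} cm^-2 over {hours_of_running} hours")
# ~7e38 cm^-2; numbers like this are what get converted into "inverse femtobarns" below.
```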

    2
    Illustration by Sandbox Studio, Chicago with Ariel Davis.

    Couldn’t hit the broad side of a barn door

    The difference between instantaneous luminosity and integrated luminosity is the difference between, “Right now I’m driving at 60 miles per hour,” and “Over ten hours, I drove 600 miles.”

    For integrated luminosity, physicists switch from squared centimeters to a new unit of area: the barn, a reference to the idiom, “Couldn’t hit the broad side of a barn.” From a subatomic particle’s point of view, “the barn” is so massive that it would be difficult to miss.

    The barn was invented during the 1940s. Its actual size—10^-24 square centimeters—was classified until the end of World War II. That’s because it is roughly the cross-sectional area of a uranium nucleus, a key ingredient in the then-newly developed atomic bomb.

    The barn stuck around after the war and became a standard way to measure area in nuclear and particle physics.

    Talking in barns—and an even smaller unit equal to 10^-15 barns called the “femtobarn”—allows physicists to take an enormous number and convert it, turning it from something too long to write out on the side of an actual barn into something that could fit on a postcard.
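    As a concrete example of that conversion (with an illustrative LHC-scale value, not an official total):

```python
# Converting an integrated luminosity from cm^-2 into inverse femtobarns.
# 1 barn = 1e-24 cm^2, so 1 femtobarn (fb) = 1e-39 cm^2 and 1 fb^-1 = 1e39 cm^-2.
# The 1.5e41 cm^-2 figure is an illustrative LHC-scale value, not an official total.
FB_IN_CM2 = 1e-39                      # one femtobarn, expressed in cm^2
FB_INV_IN_CM_MINUS2 = 1.0 / FB_IN_CM2  # one inverse femtobarn = 1e39 cm^-2

integrated_luminosity_cm2 = 1.5e41
integrated_luminosity_fb_inv = integrated_luminosity_cm2 / FB_INV_IN_CM_MINUS2

print(f"{integrated_luminosity_cm2:.1e} cm^-2 = {integrated_luminosity_fb_inv:.0f} fb^-1")
# The unwieldy 1.5e41 becomes a postcard-sized "150 inverse femtobarns".
```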

    Physicists also use femtobarns to measure the probability of a subatomic process, called its “cross section.”

    “Imagine a food fight in a cafeteria,” Lujan says. “We can predict the number of people who will get splattered with a stray meatball [a “meatball interaction,” if you will] based on the number of people present, the area and dimensions of the cafeteria, how long the food fight lasts [which can be used to calculate the “integrated luminosity” of all possible interactions, including meatball interactions] as well as the likelihood of that particular process [the “cross section” of a meatball interaction].”

    To test the laws of physics, physicists compare their predictions about the probability of certain processes to what they actually see in practice.
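    That comparison comes down to multiplying a process’s cross section by the integrated luminosity to get an expected event count; a minimal sketch with made-up numbers:

```python
# Expected number of events = cross section x integrated luminosity.
# Both numbers below are made up for illustration; real cross sections come from
# theory calculations and real integrated luminosities from the accelerator.
cross_section_fb = 50.0                  # hypothetical rare process, in femtobarns
integrated_luminosity_fb_inv = 150.0     # data collected, in inverse femtobarns

expected_events = cross_section_fb * integrated_luminosity_fb_inv
print(f"expected events: {expected_events:.0f}")   # 7500
# Physicists compare predicted counts like this with what the detectors record.
```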

    With the HL-LHC upgrade, scientists are increasing the number of protons, decreasing the diameter of the collision points, and better aligning the protons’ trajectories. All of these changes help to increase the likelihood that protons will interact with each other when they cruise through the LHC’s intersections. The increased number of collision opportunities will help physicists find and study rare processes and particles that are key to understanding the fundamental laws of physics.

    See the full article here .



    Please help promote STEM in your local schools.


    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 5:11 pm on January 29, 2021 Permalink | Reply
    Tags: "Discovery machines", , ADONE ( 1969-1993 ) at Frascati (IT), Antiproton Accumulator CERN, Axial Field Spectrometer (AFS) CERN, , Bevatron at Lawrence Berkeley National Laboratory, Brookhaven’s Alternating Gradient Synchrotron, , CERN Intersecting Storage Rings (ISR), CERN LHC, , CERN’s Proton Synchrotron, From CERN (CH) Courier, Gargamelle heavy-liquid bubble chamber CERN, , Initial Cooling Experiment (ICE) 1977–1978 CERN, Intersecting Storage Rings (ISR) CERN, New era CERN’s Intersecting Storage Rings in 1974, , , , Proton Synchrotron at Serpukov near Moscow, Soviet VEP-1; VEPP-2 and VEPP-2M between 1963 and 1974, SPEAR at SLAC, Split Field Magnet facility 1977-2021 CERN., Super collider CERN’s SppS in 1983, Superconducting Super Collider to be built in Texas killed off in 1993 by the US idiot Congress, The Cosmotron at Brookhaven National Laboratory, The Super Proton Synchrotron (SPS) CERN, The Tevatron at Fermilab 1983-2011, UA1 and UA2 at CERN,   

    From CERN (CH) Courier: “Discovery machines” 


    From CERN (CH) Courier

    1

    27 January 2021
    Lyn Evans (former LHC project director), Imperial College London (UK)
    Peter Jenni (former ATLAS spokesperson), Albert Ludwig University of Freiburg (DE) and CERN.

    1
    New era CERN’s Intersecting Storage Rings in 1974. Credit: CERN-PHOTO-7408061

    The ability to collide high-energy beams of hadrons under controlled conditions transformed the field of particle physics. Until the late 1960s, the high-energy frontier was dominated by the great proton synchrotrons. The Cosmotron at Brookhaven National Laboratory and the Bevatron at Lawrence Berkeley National Laboratory were soon followed by CERN’s Proton Synchrotron and Brookhaven’s Alternating Gradient Synchrotron, and later by the Proton Synchrotron at Serpukov near Moscow [image N/A].

    1
    BNL Cosmotron.

    LBNL Bevatron.

    CERN Proton Synchrotron

    2
    BNL Alternating Gradient Synchrotron (1960-present)

    In these machines protons were directed to internal or external targets in which secondary particles were produced.

    The kinematical inefficiency of this process, whereby the centre-of-mass energy only increases as the square root of the beam energy, was recognised from the outset. In 1943, Norwegian engineer Rolf Widerøe proposed the idea of colliding beams, keeping the centre of mass at rest in order to exploit the full energy for the production of new particles. One of the main problems was to get colliding beam intensities high enough for a useful event rate to be achieved. In the 1950s the prolific Midwestern Universities Research Association (MURA) group at the University of Wisconsin, led by Donald Kerst, worked on the problem of “stacking” particles, whereby successive pulses from an injector synchrotron are superposed to increase the beam intensity. They mainly concentrated on protons, where Liouville’s theorem (which states that for a continuous fluid under the action of conservative forces the density of phase space cannot be increased) was thought to apply. Only much later were ways found to beat Liouville and increase the beam density. At the 1956 International Accelerator Conference at CERN, Kerst made the first proposal to use stacking to produce colliding beams (not yet storage rings) of sufficient intensity.

    3
    Super collider CERN’s SppS in 1983. Credit: CERN-AC-7604110.

    At that same conference, Gerry O’Neill from Princeton presented a paper proposing that colliding electron beams could be achieved in storage rings by making use of the natural damping of particle amplitudes by synchrotron-radiation emission. A design for the 500 MeV Princeton–Stanford colliding beam experiment was published in 1958 and construction started that same year. At the same time, the Budker Institute of Nuclear Physics in Novosibirsk started work on VEP-1, a pair of rings designed to collide electrons at 140 MeV.

    3
    Between 1963 and 1974, Soviet physicists built three colliders named VEP-1, VEPP-2 and VEPP-2M. By the 1980s, however, the Soviet collider effort began to slow: while other countries kept pace with new technologies and built next-generation colliders, Soviet scientists remained stuck with the technology of the 1970s.

    Then, in March 1960, Bruno Touschek gave a seminar at Laboratori Nazionali di Frascati in Italy where he first proposed a single-ring, 0.6 m-circumference 250 MeV electron–positron collider. “AdA” produced the first stored electron and positron beams less than one year later – a far cry from the time it takes today’s machines to go from conception to operation! From these trailblazers evolved the production machines, beginning with ADONE at Frascati and SPEAR at SLAC. However, it was always clear that the gift of synchrotron-radiation damping would become a hindrance to achieving very high energy collisions in a circular electron–positron collider because the power radiated increases as the fourth power of the beam energy and the inverse fourth power of mass, so is negligible for protons compared with electrons.

    4
    ADONE ( 1969-1993 ) at Frascati (IT)

    5
    SPEAR at SLAC.

    A step into the unknown

    Meanwhile, in the early 1960s, discussion raged at CERN about the next best step for particle physics. Opinion was sharply divided between two camps, one pushing a very high-energy proton synchrotron for fixed-target physics and the other using the technique proposed at MURA to build an innovative colliding beam proton machine with about the same centre-of-mass energy as a conventional proton synchrotron of much larger dimensions. In order to resolve the conflict, in February 1964, 50 physicists from among Europe’s best met at CERN. From that meeting emerged a new committee, the European Committee for Future Accelerators, under the chairmanship of one of CERN’s founding fathers, Edoardo Amaldi. After about two years of deliberation, consensus was formed. The storage ring gained most support, although a high-energy proton synchrotron, the Super Proton Synchrotron (SPS), was built some years later and would go on to play an essential role in the development of hadron storage rings.

    The Super Proton Synchrotron (SPS), CERN’s second-largest accelerator.

    On 15 December 1965, with the strong support of Amaldi, the CERN Council unanimously approved the construction of the Intersecting Storage Rings (ISR), launching the era of hadron colliders.

    6
    Intersecting Storage Rings (ISR). Credit: CERN.

    First collisions

    Construction of the ISR began in 1966 and first collisions were observed on 27 January 1971. The machine, which needed to store beams for many hours without the help of synchrotron-radiation damping to combat inevitable magnetic field errors and instabilities, pushed the boundaries in accelerator science on all fronts. Several respected scientists doubted that it would ever work. In fact, the ISR worked beautifully, exceeding its design luminosity by an order of magnitude and providing an essential step in the development of the next generation of hadron colliders. A key element was the performance of its ultra-high-vacuum system, which was a source of continuous improvement throughout the 13 year-long lifetime of the machine.

    For the experimentalists, the ISR’s collisions (which reached an energy of 63 GeV) opened an exciting adventure at the energy frontier. But they were also learning what kind of detectors to build to fully exploit the potential of the machine – a task made harder by the lack of clear physics benchmarks known at the time in the ISR energy regime. The concept of general-purpose instruments built by large collaborations, as we know them today, was not in the culture of the time. Instead, many small collaborations built experiments with relatively short lifecycles, which constituted a fruitful learning ground for what was to come at the next generation of hadron colliders.

    There was initially a broad belief that physics action would be in the forward directions at a hadron collider. This led to the Split Field Magnet facility as one of the first detectors at the ISR, providing a high magnetic field in the forward directions but a negligible one at large angles with respect to the colliding beams (the transverse direction that is so important nowadays).

    7
    Split Field Magnet facility. © 1977-2021 CERN.

    It was with subsequent detectors featuring transverse spectrometer arms over limited solid angles that physicists observed a large excess of high transverse momentum particles above low-energy extrapolations. With these first observations of point-like parton scattering, the ISR made a fundamental contribution to strong-interaction physics. Solid angles were too limited initially, and single-particle triggers too biased, to fully appreciate the hadronic jet structure. That feat required third-generation detectors, notably the Axial Field Spectrometer (AFS) at the end of the ISR era, offering full azimuthal central calorimeter coverage.

    6
    Axial Field Spectrometer (AFS). Credit: CERN.

    The experiment provided evidence for the back-to-back two-jet structure of hard parton scattering.

    7
    TeV frontier The Tevatron at Fermilab in 2011. Credit: Fermilab.

    For the detector builders, the original AFS concept was interesting as it provided an unobstructed phi-symmetric magnetic field in the centre of the detector, however, at the price of massive Helmholtz coil pole tips obscuring the forward directions. Indeed, the ISR enabled the development of many original experimental ideas. A very important one was the measurement of the total cross section using very forward detectors in close proximity to the beam. These “Roman Pots”, named for their inventors, made their appearance in all later hadron colliders, confirming the rising total pp cross section with energy.

    It is easy to say after the fact, still with regrets, that with an earlier availability of more complete and selective (with electron-trigger capability) second- and third-generation experiments at the ISR, CERN would not have been left as a spectator during the famous November revolution of 1974 with the J/ψ discoveries at Brookhaven and SLAC. These, and the ϒ resonances discovered at Fermilab three years later, were clearly observed in the later-generation ISR experiments.

    SPS opens new era

    However, events were unfolding at CERN that would pave the way to the completion of the Standard Model.

    Standard Model of Particle Physics (LATHAM BOYLE AND MARDUS OF WIKIMEDIA COMMONS).

    At the ISR in 1972, the phenomenon of Schottky noise (density fluctuations due to the granular nature of the beam in a storage ring) was first observed. It was this very same noise that Simon van der Meer speculated in a paper a few years earlier could be used for what he called “stochastic cooling” of a proton beam, beating Liouville’s theorem by the fact that a beam of particles is not a continuous fluid. Although it is unrealistic to detect the motion of individual particles and damp them to the nominal orbit, van der Meer showed that by correcting the mean transverse motion of a sample of particles continuously, and as long as the statistical nature of the Schottky signal was continuously regenerated, it would be theoretically possible to reduce the beam size and increase its density. With the bandwidth of electronics available at the time, van der Meer concluded that the cooling time would be too long to be of practical importance. But the challenge was taken up by Wolfgang Schnell, who built a state-of-the-art feedback system that demonstrated stochastic cooling of a proton beam for the first time. This would open the door to the idea of stacking and cooling of antiprotons, which later led to the SPS being converted into a proton–antiproton collider.

    8
    Big beast The Large Hadron Collider in 2018. Credit: CERN-PHOTO-201802-030.

    Another important step towards the next generation of hadron colliders occurred in 1973 when the collaboration working on the Gargamelle heavy-liquid bubble chamber published two papers revealing the first evidence for weak neutral currents.

    9
    The Gargamelle heavy-liquid bubble chamber during its installation at the Proton Synchrotron in September 1970. Credit: CERN.

    These were important observations in support of the unified theory of electromagnetic and weak interactions, for which Sheldon Glashow, Abdus Salam and Steven Weinberg were to receive the Nobel Prize in Physics in 1979. The electroweak theory predicted the existence and approximate masses of two vector bosons, the W and the Z, which were too high to be produced in any existing machine. However, Carlo Rubbia and collaborators proposed that, if the SPS could be converted into a collider with protons and antiprotons circulating in opposite directions, there would be enough energy to create them.

    To achieve this the SPS would need to be converted into a storage ring like the ISR, but this time the beam would need to be kept “bunched” with the radio-frequency (RF) system working continuously to achieve a high enough luminosity (unlike the ISR where the beams were allowed to de-bunch all around the ring). The challenges here were two-fold. Noise in the RF system causes particles to diffuse rapidly from the bunch. This was solved by a dedicated feedback system. It was also predicted that the beam–beam interaction would limit the performance of a bunched-beam machine with no synchrotron-radiation damping due to the strongly nonlinear interactions between a particle in one beam with the global electromagnetic field in the other beam.

    A much bigger challenge was to build an accumulator ring in which antiprotons could be stored and cooled by stochastic cooling until a sufficient intensity of antiprotons would be available to transfer into the SPS, accelerate to around 300 GeV and collide with protons. This was done in two stages. First a proof-of-principle was needed to show that the ideas developed at the ISR transferred to a dedicated accumulator ring specially designed for stochastic cooling. This ring was called the Initial Cooling Experiment (ICE), and operated at CERN in 1977–1978.

    9
    Initial Cooling Experiment (ICE) 1977–1978. Credit: CERN.

    In ICE transverse cooling was applied to reduce the beam size and a new technique for reducing the momentum spread in the beam was developed. The experiment proved to be a big success and the theory of stochastic cooling was refined to a point where a real accumulator ring (the Antiproton Accumulator) could be designed to accumulate and store antiprotons produced at 3.5 GeV by the proton beam from the 26 GeV Proton Synchrotron.

    10
    Antiproton Accumulator. Credit: CERN.

    First collisions of protons and antiprotons at 270 GeV were observed on the night of 10 July 1981, signalling the start of a new era in colliding beam physics.

    11
    First steps The R702 experiment at the ISR in 1977. Credit: CERN-PHOTO-7708541X.

    A clear physics goal, namely the discovery of the W and Z intermediate vector bosons, drove the concepts for the two main SppS experiments UA1 and UA2 (in addition to a few smaller, specialised experiments).

    12
    UA1. Credit: CERN.

    13
    UA2. Credit: CERN.

    It was no coincidence that the leaders of both collaborations were pioneers of ISR experiments, and many lessons from the ISR were taken on board. UA1 pioneered the concept of a hermetic detector that covered as much as possible the full solid angle around the interaction region with calorimetry and tracking. This allows measurements of the missing transverse energy/momentum, signalling the escaping neutrino in the leptonic W decays. Both electrons and muons were measured, with tracking in a state-of-the-art drift chamber that provided bubble-chamber-like pictures of the interactions. The magnetic field was provided by a dipole-magnet configuration, an approach not favoured in later generation experiments because of its inherent lack of azimuthal symmetry. UA2 featured a (at the time) highly segmented electromagnetic and hadronic calorimeter in the central part (down to 40 degrees with respect to the beam axis), with 240 cells pointing to the interaction region. But it had no muon detection, and in its initial phase only limited electromagnetic coverage in the forward regions. There was no magnetic field except for the forward cones with toroids to probe the W polarisation.

    In 1983 the SppS experiments made history with the direct discoveries of the W and Z. Many other results were obtained, including the first evidence of neutral B-meson particle–antiparticle mixing at UA1 thanks to its tracking and muon detection. The calorimetry of UA2 provided immediate unambiguous evidence for a two-jet structure in events with large transverse energy. Both UA1 and UA2 pushed QCD studies far ahead. The lack of hermeticity in UA2’s forward regions motivated a major upgrade (UA2′) for the second phase of the collider, complementing the central part with new fully hermetic calorimetry (both electromagnetic and hadronic), and also inserting a new tracking cylinder employing novel technologies (fibre tracking and silicon pad detectors). This enabled the experiment to improve searches for top quarks and supersymmetric particles, as well as making almost background-free first precision measurements of the W mass.

    Meanwhile in America

    At the time the SppS was driving new studies at CERN, the first large superconducting synchrotron (the Tevatron, with a design energy close to 1 TeV) was under construction at Fermilab.

    FNAL/Tevatron map


    Tevatron Accelerator


    FNAL/Tevatron

    In view of the success of the stochastic cooling experiments, there was a strong lobby at the time to halt the construction of the Tevatron and to divert effort instead to emulate the SPS as a proton–antiproton collider using the Fermilab Main Ring. Wisely this proposal was rejected and construction of the Tevatron continued. It came into operation as a fixed-target synchrotron in 1984. Two years later it was also converted into a proton–antiproton collider and operated at the high-energy frontier until its closure in September 2011.

    A huge step was made with the detector concepts for the Tevatron experiments, in terms of addressed physics signatures, sophistication and granularity of the detector components. This opened new and continuously evolving avenues in analysis methods at hadron colliders. Already the initial CDF and DØ detectors for Run I (which lasted until 1996) were designed with cylindrical concepts, characteristic of what we now call general-purpose collider experiments, albeit DØ still without a central magnetic field in contrast to CDF’s 1.4 T solenoid.

    FNAL/Tevatron CDF detector

    FNAL/Tevatron DZero detector

    In 1995 the experiments delivered the first Tevatron highlight: the discovery of the top quark. Both detectors underwent major upgrades for Run II (2001–2011) – a theme now seen for the LHC experiments – which had a great impact on the Tevatron’s physics results. CDF was equipped with a new tracker, a silicon vertex detector, new forward calorimeters and muon detectors, while DØ added a 1.9 T central solenoid, vertexing and fibre tracking, and new forward muon detectors. Alongside the instrumentation was a breath-taking evolution in real-time event selection (triggering) and data acquisition to keep up with the increasing luminosity of the collider.

    The physics harvest of the Tevatron experiments during Run II was impressive, including a wealth of QCD measurements and major inroads in top-quark physics, heavy-flavour physics and searches for phenomena beyond the Standard Model. Still standing strong are its precision measurements of the W and top masses and of the electroweak mixing angle sin2θW. The story ended in around 2012 with a glimpse of the Higgs boson in associated production with a vector boson. The CDF and DØ experience influenced the LHC era in many ways: for example they were able to extract the very rare single-top production cross-section with sophisticated multivariate algorithms, and they demonstrated the power of combining mature single-experiment measurements in common analyses to achieve ultimate precision and sensitivity.

    For the machine builders, the pioneering role of the Tevatron as the first large superconducting machine was also essential for further progress. Two other machines – the Relativistic Heavy Ion Collider at Brookhaven and the electron–proton collider HERA at DESY – derived directly from the experience of building the Tevatron.

    BNL RHIC Campus.


    BNL/RHIC.


    BNL/RHIC Star Detector


    BNL/RHIC Phenix.

    Lessons learned from that machine and from the SppS were also integrated into the design of the most powerful hadron collider yet built: the LHC [below].

    The Large Hadron Collider

    The LHC had a difficult birth. Although the idea of a large proton–proton collider at CERN had been around since at least 1977, the approval of the Superconducting Super Collider (SSC) in the US in 1987 put the whole project into doubt. The SSC, with a centre-of-mass energy of 40 TeV, was almost three times more powerful than what could ever be built using the existing infrastructure at CERN. It was only the resilience and conviction of Carlo Rubbia, who shared the 1984 Nobel Prize in Physics with van der Meer for the project leading to the discovery of the W and Z bosons, that kept the project alive. Rubbia, who became Director-General of CERN in 1989, argued that, in spite of its lower energy, the LHC could be competitive with the SSC by having a luminosity an order of magnitude higher, and at a fraction of the cost. He also argued that the LHC would be more versatile: as well as colliding protons, it would be able to accelerate heavy ions to record energies at little extra cost.

    The SSC was eventually cancelled in 1993. This made the case for the LHC even stronger, but the financial climate in Europe at the time was not conducive to the approval of a large project. For example, CERN’s largest contributor, Germany, was struggling with the cost of reunification and many other countries were getting to grips with the introduction of the single European currency. In December 1993 a plan was presented to the CERN Council to build the machine over a 10-year period by reducing the other experimental programmes at CERN to the absolute minimum, with the exception of the full exploitation of the flagship Large Electron Positron (LEP) collider. Although the plan was generally well received, it became clear that Germany and the UK were unlikely to agree to the budget increase required. On the positive side, after the demise of the SSC, a US panel on the future of particle physics recommended that “the government should declare its intentions to join other nations in constructing the LHC”. Positive signals were also being received from India, Japan and Russia.

    In June 1994 the proposal to build the LHC was made once more. However, approval was blocked by Germany and the UK, which demanded substantial additional contributions from the two host states, France and Switzerland. This forced CERN to propose a “missing magnet” machine where only two thirds of the dipole magnets would be installed in a first stage, allowing operation at reduced energy for a number of years. Although costing more in the long run, the plan would save some 300 million Swiss Francs in the first phase. This proposal was put to Council in December 1994 by the new Director-General Christopher Llewellyn Smith and, after a round of intense discussions, the project was finally approved for two-stage construction, to be reviewed in 1997 after non-Member States had made known their contributions. The first country to do so was Japan in 1995, followed by India, Russia and Canada the next year. A final sting in the tail came in June 1996 when Germany unilaterally announced that it intended to reduce its CERN subscription by between 8% and 9%, prompting the UK to demand a similar reduction and forcing CERN to take out loans. At the same time, the two-stage plan was dropped and, after a shaky start, the construction of the full LHC was given the green light.

    The fact that the LHC was to be built at CERN, making full use of the existing infrastructure to reduce cost, imposed a number of strong constraints. The first was the 27 km-circumference of the LEP tunnel in which the machine was to be housed. For the LHC to achieve its design energy of 7 TeV per beam, its bending magnets would need to operate at a field of 8.3 T, about 60% higher than ever achieved in previous machines. This could only be done using affordable superconducting material by reducing the temperature of the liquid-helium coolant from its normal boiling point of 4.2 K to 1.9 K – where helium exists in a macroscopic quantum state with the loss of viscosity and a very large thermal conductivity. A second major constraint was the small (3.8 m) tunnel diameter, which made it impossible to house two independent rings like the ISR. Instead, a novel and elegant magnet design, first proposed by Bob Palmer at Brookhaven, with the two rings separated by only 19 cm in a common yoke and cryostat was developed. This also considerably reduced the cost.

    At precisely 09:30 on 10 September 2008, almost 15 years after the project’s approval, the first beam was injected into the LHC, amid global media attention. In the days that followed good progress was made until disaster struck: during a ramp to full energy, one of the 10,000 superconducting joints between the magnets failed, causing extensive damage from which it took more than a year to recover. Following repairs and consolidation, on 29 November 2009 beam was once more circulating and full commissioning and operation could start. Rapid progress in ramping up the luminosity followed, and the LHC physics programme, at an initial energy of 3.5 TeV per beam, began in earnest in March 2010.

    LHC experiments

    Yet a whole other level of sophistication was realised by the LHC detectors compared to those at previous colliders. The priority benchmark for the designs of the general-purpose detectors ATLAS [below] and CMS [below] was to unambiguously discover (or rule out) the Standard Model Higgs boson for all possible masses up to 1 TeV, which demanded the ability to measure a variety of final states. The challenges for the Higgs search also guaranteed the detectors’ potential for all kinds of searches for physics beyond the Standard Model, which was the other driving physics motivation at the energy frontier. These two very ambitious LHC detector designs integrated all the lessons learned from the experiments at the three predecessor machines, as well as further technology advances in other large experiments, most notably at HERA and LEP.

    H1 detector at DESY HERA ring.

    CERN LEP Collider.

    Just a few simple numbers illustrate the giant leap from the Tevatron to the LHC detectors. CDF and DØ, in their upgraded versions operating at a luminosity of up to 4 × 10^32 cm^-2 s^-1, typically had around a million channels and a triggered event rate of 100 Hz, with event sizes of 500 kB. The collaborations were each about 600 strong. By contrast, ATLAS and CMS operated during LHC Run 2 at a luminosity of 2 × 10^34 cm^-2 s^-1 with typically 100 million readout channels, and an event rate and size of 500 Hz and 1500 kB. Their publications have close to 3000 authors.

    For many major LHC-detector components, complementary technologies were selected. This is most visible for the superconducting magnet systems, with an elegant and unique large 4 T solenoid in CMS serving both the muon and inner tracking measurements, and an air-core toroid system for the muon spectrometer in ATLAS together with a 2 T solenoid around the inner tracking cylinder. These choices drove the layout of the active detector components, for instance the electromagnetic calorimetry. Here again, different technologies were implemented: a novel-configuration liquid-argon sampling calorimeter for ATLAS and lead-tungstate crystals for CMS.

    From the outset, the LHC was conceived as a highly versatile collider facility, not only for the exploration of high transverse-momentum physics. With its huge production of b and c quarks, it offered the possibility of a very fruitful programme in flavour physics, exploited with great success by the purposely designed LHCb experiment [below]. Furthermore, in special runs the LHC provides heavy-ion collisions for studies of the quark–gluon plasma – the field of action for the ALICE experiment [below].

    As the general-purpose experiments learned from the history of experiments in their field, the concepts of both LHCb and ALICE also evolved from a previous generation of experiments in their fields, which would be interesting to trace back. One remark is due: the designs of all four main detectors at the LHC have turned out to be so flexible that there are no strict boundaries between these three physics fields for them. All of them have learned to use features of their instruments to contribute at least in part to the full physics spectrum offered by the LHC, of which the highlight so far was the July 2012 announcement of the discovery of the Higgs boson by the ATLAS and CMS collaborations.

    Peter Higgs


    CERN CMS Higgs Event May 27, 2012.


    CERN ATLAS Higgs Event
    June 12, 2012.

    The following year the collaborations were named in the citation for the 2013 Nobel Prize in Physics awarded to François Englert and Peter Higgs.

    14
    CMS undergoing upgrades in 2019. Credit: CERN.

    Since then, the LHC has exceeded its design luminosity by a factor of two and delivered an integrated luminosity of almost 200 fb^-1 in proton–proton collisions, while its beam energy was increased to 6.5 TeV in 2015. The machine has also delivered heavy ion (lead–lead) and even lead–proton collisions. But the LHC still has a long way to go before its estimated end of operations in the mid-to-late 2030s. To this end, the machine was shut down in November 2018 for a major upgrade of the whole of the CERN injector complex as well as the detectors to prepare for operation at high luminosities, ultimately up to a “levelled” luminosity of 7 × 10^34 cm^-2 s^-1. The High Luminosity LHC (HL-LHC) upgrade is pushing the boundaries of superconducting magnet technology to the limit, particularly around the experiments where the present focusing elements will be replaced by new magnets built from high-performance Nb3Sn superconductor. The eventual objective is to accumulate 3000 fb^-1 of integrated luminosity.

    In parallel, the LHC-experiment collaborations are preparing and implementing major upgrades to their detectors using novel state-of-the-art technologies and revolutionary approaches to data collection to exploit the tenfold data volume promised by the HL-LHC. Hadron-collider detector concepts have come a long way in sophistication over the past 50 years. However, behind the scenes are other factors paramount to their success. These include an equally spectacular evolution in data-flow architectures, software, computing approaches and analysis methods – all of which have been driven into new territories by the extraordinary demands of dealing with rare events within the huge backgrounds of ordinary collisions at hadron colliders. Worthy of particular mention in the success of all LHC physics results is the Worldwide LHC Computing Grid.

    MonALISA LHC Computing GridMap monalisa.caltech.edu/ml/_client.beta

    This journey is now poised to continue, as we look ahead to what a general-purpose detector at a future 100 TeV hadron collider might look like.

    Beyond the LHC

    Although the LHC has at least 15 years of operations ahead of it, the question now arises, as it did in 1964: what is the next step for the field? The CERN Council has recently approved the recommendations of the 2020 update of the European strategy for particle physics, which includes, among other things, a thorough study of a very high-energy hadron collider to succeed the LHC.

    CERN FCC Future Circular Collider details of proposed 100km-diameter successor to LHC.

    A technical and financial feasibility study for a 100 km circular collider at CERN with a collision energy of at least 100 TeV is now under way. While a decision to proceed with such a facility is to come later this decade, one thing is certain: lessons learned from 50 years of experience with hadron colliders and their detectors will be crucial to the success of our next step into the unknown.

    A possible future elsewhere

    China Circular Electron Positron Collider (CEPC) map. It would be housed in a hundred-kilometer- (62-mile-) round tunnel at one of three potential sites. The documents work under the assumption that the collider will be located near Qinhuangdao City around 200 miles east of Beijing.

    See the full article here .


    Please help promote STEM in your local schools.


    Stem Education Coalition

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS

    CERN/ATLAS detector

    ALICE

    CERN/ALICE Detector


    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    LHC

    CERN map

    SixTRack CERN LHC particles

     
  • richardmitnick 10:17 pm on January 14, 2021 Permalink | Reply
    Tags: "HL-LHC magnets enter production in the US", , , , , CERN LHC, , , , , , , US LHC Accelerator Research Program (LARP)   

    From CERN (CH) Courier: “HL-LHC magnets enter production in the US” 


    From CERN (CH) Courier

    13 January 2021
    Matthew Chalmers, editor.


    1
    Next generation BNL technicians Ray Ceruti, Frank Teich, Pete Galioto, Pat Doutney and Dan Sullivan with the second US quadrupole magnet for the HL-LHC to have reached design performance. Credit: BNL.

    The significant increase in luminosity targeted by the high-luminosity LHC (HL-LHC) demands large-aperture quadrupole magnets that are able to focus the proton beams more tightly as they collide. A total of 24 such magnets are to be installed on either side of the ATLAS and CMS experiments [both below] in time for HL-LHC operations in 2027, marking the first time niobium-tin (Nb3Sn) magnet technology is used in an accelerator.

    Nb3Sn is a superconducting material with a critical magnetic field that far exceeds that of the niobium-titanium presently used in the LHC magnets, but once formed it becomes brittle and strain-sensitive, which makes it much more challenging to process and use.

    The milestone signals the end of the prototyping phase for the HL-LHC quadrupoles.

    Following the first successful test of a US-built HL-LHC quadrupole magnet at Brookhaven National Laboratory (BNL) in January last year—attaining a conductor peak field of 11.4 T and exceeding the required integrated gradient of 556 T in a 150 mm-aperture bore—a second quadrupole magnet has now been tested at BNL at nominal performance. Since the US-built quadrupole magnets must be connected in pairs before they can constitute fully operational accelerator magnets, the milestone signals the end of the prototyping phase for the HL-LHC quadrupoles, explains Giorgio Apollinari of Fermilab, who is head of the US Accelerator Upgrade Projects (AUP). “The primary importance is that we have entered the ‘production’ period that will make installation viable in early 2025. It also means we have satisfied the requirements from our funding agency and now the US Department of Energy has authorised the full construction for the US contribution to HL-LHC.”

    Joint venture

    The design and production of the HL-LHC quadrupole magnets are the result of a joint venture between CERN, BNL, Fermilab and Lawrence Berkeley National Laboratory, preceded by the 15 year-long US LHC Accelerator Research Program (LARP).

    The US labs are to provide a total of ten 9 m-long helium-tight vessels (eight for installation and two as spares) for the HL-LHC, each containing two 4.2 m-long magnets. CERN is also producing ten 9 m-long vessels, each containing a 7.5 m-long magnet. The six magnets to be placed on each side of ATLAS and CMS – four from the US and two from CERN – will be powered in series on the same electrical circuit.

    The synergy between CERN and the US laboratories allowed us to considerably reduce the risks.

    “The synergy between CERN and the US laboratories allowed us to considerably reduce the risks, have a faster schedule and a better optimisation of resources,” says Ezio Todesco of CERN’s superconductors and cryostats group. The quadrupole magnet programme at CERN is also making significant progress, he adds, with a short-model quadrupole having recently reached a record 13.4 T peak field in the coil, which is 2 T more than the project requirements. “The full series of magnets, sharing the same design and built on three sites, will also give very relevant information about the viability of future hadron colliders, which are expected to rely on massive, industrial production of Nb3Sn magnets with fields up to 16 T.”

    Since the second US quadrupole magnet was tested in October, the AUP teams have completed the assembly of a third magnet and are close to completing the assembly of a fourth. Next, the first two magnets will be assembled in a single cold mass before being tested in a horizontal configuration and then shipped to CERN in time for the “string test” planned in 2023.

    “In all activities at the forefront of technology, like in the case for these focusing Nb3Sn quadrupoles, the major challenge is probably the transition from an ‘R&D mentality’, where minor improvements can be a daily business, to a ‘production mentality’, where there is a need to build to specific procedures and criteria, with all deviations being formally treated and corrected or addressed,” says Apollinari. “And let’s not forget that the success of this second magnet test came with a pandemic raging across the world.”

    See the full article here .


    Please help promote STEM in your local schools.


    Stem Education Coalition

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS

    CERN/ATLAS detector

    ALICE

    CERN/ALICE Detector


    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    LHC

    CERN map

    SixTRack CERN LHC particles

     
  • richardmitnick 11:44 am on January 12, 2021 Permalink | Reply
    Tags: "Antimatter Mystery Continues To Perplex Scientists", , , CERN LHC, , , , , , ,   

    From Forbes Magazine: “Antimatter Mystery Continues To Perplex Scientists” 

    From Forbes Magazine

    Jan 11, 2021

    Don Lincoln-Fermi National Accelerator Laboratory.

    1
    Particle collisions like these in the Large Hadron Collider are helping scientists understand why our universe is made of only matter and no antimatter. Credit: Getty.

    One of the biggest mysteries of science is why there is something rather than nothing. According to our best scientific theories, the universe should consist of a featureless bath of energy. Yet that prediction is obviously wrong. While scientists have discovered a few hints on ways to solve the problem, it’s still an outstanding question. And a recent paper released by the LHCb collaboration, using the Large Hadron Collider, the world’s most powerful particle accelerator, has added to our confusion.

    CERN (CH) /LHCb detector.


    CERN (CH) LHC Map

    Probably the most famous scientific equation of all time is Einstein’s E = mc^2. This simple equation says that energy (E) is equal to mass (m), times a constant (c^2). More colloquially, it says that energy can be converted into matter and then back again. However, you need to be a bit more cautious. Technically, it says that energy can be converted into equal amounts of matter and antimatter.

    Antimatter is a cousin of matter. Actually, it’s really the opposite of matter – antimatter is yang to matter’s yin. If you combine matter and antimatter, it makes energy – and a lot of it. Combine a gram of matter and antimatter, and the result is an energy release comparable to the atomic explosion of Hiroshima.
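    The arithmetic behind that comparison is straightforward; a rough estimate, taking the Hiroshima yield to be on the order of 15 kilotons of TNT:

```python
# Energy released when roughly one gram of mass annihilates, via E = m * c^2,
# compared with an approximate Hiroshima yield of about 15 kilotons of TNT.
c = 3.0e8                   # speed of light, in m/s
m = 1.0e-3                  # one gram of annihilating mass, in kg
KILOTON_TNT_J = 4.184e12    # energy of one kiloton of TNT, in joules

energy_joules = m * c**2                      # ~9e13 J
energy_kilotons = energy_joules / KILOTON_TNT_J

print(f"E = {energy_joules:.1e} J  ~ {energy_kilotons:.0f} kilotons of TNT")
# Roughly 20 kilotons: the same ballpark as the Hiroshima explosion.
```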

    Antimatter was discovered in 1932 and its existence is universally accepted by the scientific community. However, it isn’t something that exists in large quantities in the universe, and that deficit is a huge scientific mystery. That’s because, according to our best understanding of how the universe came into existence, there should be as much matter as antimatter.

    The currently accepted theory of the origins of the universe is called the Big Bang.

    2
    Big Bang, conceptual image. Computer illustration representing the origin of the universe. The term Big Bang describes the initial expansion of all the matter in the universe from an infinitely compact state 13.7 billion years ago. The initial conditions are not known, but less than a second after the beginning, temperatures were trillions of degrees Celsius and the primordial universe was much smaller than an atom. It has been expanding and cooling ever since. Matter formed and coalesced into the galaxies, which are observed to be moving away from each other. Background radiation in the universe is considered a remnant of the Big Bang. Credit: Getty Images.

    Originally, the universe was smaller and hotter, and it has been expanding and cooling for nearly 14 billion years. During the earlier and hotter phase, the universe was full of energy. And, where there is a lot of energy, there should be matter and antimatter made in equal quantities. This means that shortly after the Big Bang, the universe should have contained matter and antimatter. Then, as the universe expanded, that matter and antimatter should have bumped into one another, leaving a vast and featureless bath of energy.

    Yet this obviously isn’t true. Our universe consists of matter and energy, but no large amounts of antimatter are to be found. And, given that matter is all around us, it seems that there must have been something in the early universe that favored matter over antimatter.

    In the 1960s, scientists found that when they made a form of matter called the K meson in particle accelerators, the processes involved slightly favored matter K mesons over antimatter ones.

    Recently, scientists at the Large Hadron Collider were investigating another form of mesons, called B mesons. Mesons, like protons and neutrons, are made of smaller particles called quarks. However, while the proton contains three quarks, mesons all contain a quark and an antimatter quark. Different types of mesons contain different mixtures of quarks and antiquarks.

    The electrically neutral B meson contains a d-type quark and an antimatter b-type quark. The antimatter neutral B meson has the opposite quark content – a b-type quark and antimatter d-type quark.

    3
    Antimatter B mesons can change into matter via a complicated interaction. Symbols with a line over them are antimatter. Ones without are matter.

    Because the two types of quarks (b-type and d-type) have the same electrical charge, inside the B meson, the two types of quarks can exchange which is matter and which is antimatter via a complicated interaction. Thus, a neutral B meson can convert into an antimatter neutral B meson and back again. In fact, the particle oscillates its identity back and forth trillions of times per second.

    Scientists have found that the oscillation back and forth isn’t completely even. By looking at a specific decay of neutral B mesons into two other, lighter mesons (a K meson and a pi meson), scientists observed that the B meson doesn’t spend equal time in its two states. The neutral B meson has a slight preference (roughly 55% versus 45%) for the state containing a d-type quark and a b-type antiquark.
    Now this asymmetry isn’t enough to explain why our universe is made of matter and not antimatter, but it is a powerful clue. It is also a clear demonstration that there are at least a few subatomic processes that favor matter over antimatter.
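
    As a rough numerical sketch of the two effects described above, the snippet below evaluates the standard oscillation formula and the 55%-versus-45% split; the mixing frequency and lifetime are approximate values from the particle-physics literature, not figures quoted in the article.

        import math

        # Probability that a neutral B meson has turned into its antiparticle after time t,
        # ignoring decay and CP violation: P(t) ~ sin^2(delta_m * t / 2).
        delta_m = 0.51e12      # approximate mixing frequency, radians per second (~0.51 per picosecond)
        lifetime = 1.5e-12     # approximate neutral B meson lifetime, seconds
        prob_flip = math.sin(delta_m * lifetime / 2) ** 2
        print(f"Chance of being the antiparticle after one lifetime: {prob_flip:.2f}")

        # The matter-antimatter preference quoted above, expressed as a raw asymmetry.
        asymmetry = (0.55 - 0.45) / (0.55 + 0.45)
        print(f"Asymmetry: {asymmetry:.2f}")   # about 0.10: small, but clearly not zero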

    However, another study investigated not neutral B mesons, but ones with electrical charge. When researchers again studied the decay of these particles into K and pi mesons, they found no preference in the decays. This was very mysterious. Scientists are quite perplexed as to why the universe should exhibit a preference for matter in decays of neutral B mesons, but not of charged ones.

    It is unclear exactly what this disagreement is telling us. It could just mean that the existing theory needs a small tweak, but it could well be an important clue in the question of why the universe is made entirely of matter – indeed, the question of why we exist at all.

    It’s too soon to tell what the answer will be, but that’s how science works much of the time. A clue here, a peculiarity there, the occasional loose thread: sometimes a small observation stays just that. And sometimes, when you tug that loose thread, the whole thing unravels, and you have to knit an entirely new sweater.

    Operations at the Large Hadron Collider have been on hold for two years now for refurbishments and upgrades. They were expected to resume in March 2021, but Covid caused a brief delay, and they are now scheduled to resume in May instead. With new data, maybe researchers will be able to sort it all out.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 4:10 pm on December 23, 2020 Permalink | Reply
    Tags: "Recreating Big Bang matter on Earth", , , , , CERN LHC, , , , ,   

    From CERN (CH): “Recreating Big Bang matter on Earth” 

    Cern New Bloc

    Cern New Particle Event


    From CERN (CH)

    13 NOVEMBER, 2020 [Just now in social media]
    Ana Lopes

    Our fifth story in the LHC Physics at Ten series looks at how the LHC has recreated the state of matter that is believed to have existed shortly after the Big Bang, and greatly advanced our knowledge of it.

    1
    Recreating Big Bang matter on Earth at CERN’s LHC.

    2
    Illustration of the history of the universe. About one microsecond (μs) from the Big Bang, protons formed from the quark–gluon plasma. Credit: BICEP2 Collaboration/CERN/NASA.

    The Large Hadron Collider (LHC) at CERN usually collides protons together. It is these proton–proton collisions that led to the discovery of the Higgs boson in 2012.

    CERN CMS Higgs Event May 27, 2012.


    CERN ATLAS Higgs Event June 12, 2012.

    But the world’s biggest accelerator was also designed to smash together heavy ions, primarily the nuclei of lead atoms, and it does so every year for about one month, for at least two good reasons. First, heavy-ion collisions at the LHC recreate in laboratory conditions the plasma of quarks and gluons that is thought to have existed shortly after the Big Bang. Second, the collisions can be used to test and study, at the highest man-made temperatures and densities, fundamental predictions of quantum chromodynamics, the theory of the strong force that binds quarks and gluons together into protons and neutrons and ultimately all atomic nuclei.

    The LHC wasn’t the first machine to recreate Big Bang matter: back in 2000, experiments at the Super Proton Synchrotron at CERN found compelling evidence of the quark–gluon plasma.

    The Super Proton Synchrotron (SPS), CERN’s second-largest accelerator. (Image: Julien Ordan/CERN).

    About five years later, experiments at the Relativistic Heavy Ion Collider (RHIC) at Brookhaven National Laboratory in the US started an era of detailed investigation of the quark–gluon plasma.

    BNL/RHIC.

    However, in the 10 years since it achieved collisions at higher energies than its predecessors, the LHC has taken studies of the quark–gluon plasma to incredible new heights. By producing a hotter, denser and longer-lived quark–gluon plasma as well as a larger number and assortment of particles with which to probe its properties and effects, the LHC has allowed physicists to study the quark–gluon plasma with an unprecedented level of detail. What’s more, the machine has delivered some surprising results along the way, stimulating new theoretical studies of this state of matter.

    Heavy collision course

    When heavy nuclei smash into one another in the LHC, the hundreds of protons and neutrons that make up the nuclei release a large fraction of their energy into a tiny volume, creating a fireball of quarks and gluons.

    3
    First proton-lead collision test at the LHC successful | Symmetry Magazine

    These tiny bits of quark–gluon plasma only exist for fleeting moments, with the individual quarks and gluons, collectively known as partons, quickly forming composite particles and antiparticles that fly out in all directions. By studying the zoo of particles produced in the collisions – before, during and after the plasma is created – researchers can follow the plasma from the moment it is produced to the moment it cools down and gives way to a state in which composite particles called hadrons can form. However, the plasma cannot be observed directly. Its presence and properties are deduced from the experimental signatures it leaves on the particles produced in the collisions, and from comparison of those signatures with theoretical models.

    Such studies can be divided into two distinct categories. The first kind of study investigates the thousands of particles that emerge from a heavy-ion collision collectively, providing information about the global, macroscopic properties of the quark-gluon plasma. The second kind focuses on various types of particle with large mass or momentum, which are produced more rarely and offer a window into the inner, microscopic workings of the medium.

    At the LHC, these studies are conducted by the collaborations behind all four main LHC experiments: ALICE, ATLAS, CMS and LHCb. Although ALICE was initially specifically designed to investigate the quark–gluon plasma, the other three experiments have also since joined this investigation.

    Global properties

    The LHC has delivered data that has enabled researchers to derive several global properties of the medium with higher precision than previously achieved.

    “If we listen to two different musical instruments with closed eyes, we can distinguish between the instruments even when they are playing the same note. The reason is that a note comes with a set of overtones that give the instrument a unique distinct sound. This is but one example of how simple but powerful overtones are in identifying material properties. Heavy-ion physicists have learnt how to make use of “overtones” in their study of the quark–gluon plasma. The initial stage of a heavy-ion collision produces ripples in the plasma that travel through the medium and excite overtones. Such overtones can be measured by analysing the collective flow of particles that fly out of the plasma and reach the detectors. While previous measurements had revealed only first indications of these overtones, the LHC experiments have mapped them out in detail. Combined with other strides in precision, these data have been used by theorists to characterise the plasma’s properties, such as its temperature, energy density and frictional resistance, which is smaller than that of any other known fluid,” explains Urs Wiedemann, a theoretical physicist at CERN.

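    The overtones described above are what heavy-ion physicists call flow harmonics: Fourier coefficients, usually written v_n, of the distribution of particle emission angles around the beam axis. The toy sketch below shows the core idea of recovering one such coefficient from a set of angles; the generated data, the fixed reference plane and the simple averaging are illustrative simplifications, not the experiments’ full procedure.

        import math, random

        # Toy flow-harmonic estimate: sample emission angles phi from a distribution with a
        # built-in second harmonic, dN/dphi ~ 1 + 2*v2*cos(2*phi), then recover v2 as the
        # average of cos(2*phi). Real analyses use event planes or multi-particle cumulants;
        # this sketch only shows the core idea of reading off an "overtone" coefficient.
        random.seed(1)
        true_v2 = 0.10
        angles = []
        while len(angles) < 200_000:
            phi = random.uniform(-math.pi, math.pi)
            if random.uniform(0.0, 1.0 + 2 * true_v2) < 1.0 + 2 * true_v2 * math.cos(2 * phi):
                angles.append(phi)          # accept/reject sampling of the modulated distribution

        v2_estimate = sum(math.cos(2 * p) for p in angles) / len(angles)
        print(f"recovered v2 = {v2_estimate:.3f} (input value {true_v2})")
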
    These findings have since been supported in multiple ways. For instance, the ALICE collaboration estimated the temperature of the plasma by studying photons that are emitted by the hot fireball. The estimated temperature, about 300 MeV (1 MeV is about 10^10 kelvin), is above the predicted temperature necessary for the plasma to be created (about 160 MeV), and is about 40% higher than the temperature measured in experiments at the RHIC collider.
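
    The conversion in parentheses is easy to check: dividing an energy by the Boltzmann constant gives the equivalent temperature. A quick sketch, using the standard value of the constant rather than anything quoted in the article:

        K_BOLTZMANN = 8.617e-11            # Boltzmann constant, in MeV per kelvin
        for energy_mev in (1, 160, 300):
            temperature_k = energy_mev / K_BOLTZMANN   # T = E / k_B
            print(f"{energy_mev:4d} MeV  ~  {temperature_k:.1e} K")
        # 1 MeV ~ 1.2e10 K, 160 MeV ~ 1.9e12 K, 300 MeV ~ 3.5e12 K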

    Another example is the estimation of the energy density of the plasma in the initial stage of the collisions. ALICE and CMS obtained a value in the range 12–14 GeV per cubic femtometre (1 femtometre is 10^-15 metres), about 2–3 times higher than that determined by RHIC, and again above the predicted energy density needed for the plasma to form (about 1 GeV/fm^3).
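
    To get a feel for what such an energy density means in everyday units, the short conversion below uses standard constants; the 13 GeV per cubic femtometre figure is simply the midpoint of the range quoted above.

        GEV_TO_JOULE = 1.602e-10           # one GeV expressed in joules
        FM3_TO_M3 = (1e-15) ** 3           # one cubic femtometre in cubic metres
        for label, density in (("LHC initial state", 13.0), ("plasma formation threshold", 1.0)):
            si_density = density * GEV_TO_JOULE / FM3_TO_M3
            print(f"{label}: {density} GeV/fm^3  ~  {si_density:.1e} J/m^3")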

    5
    Particle trajectories and energy deposition in the ALICE detector during the last lead–lead collisions of the second LHC run. Credit: CERN.

    Inner workings

    The LHC has supplied not just more particles but also more varied types of particle with which to probe the quark–gluon plasma.

    “Together with state-of-the-art particle detectors that cover more area around the collision points as well as sophisticated methods of identifying and tracking particles, this broad palette has offered unprecedented insight into the inner workings and effects of the quark–gluon plasma.”

    To give a few examples, soon after the LHC started, ATLAS and CMS made the first direct observation of the phenomenon of jet quenching, in which jets of particles formed in the collisions lose energy as they cross the quark–gluon plasma medium. The collaborations found a striking imbalance in the energies of pairs of jets, with one jet almost completely absorbed by the medium.
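
    The imbalance is commonly quantified by a dijet asymmetry of the form A_J = (pT1 - pT2) / (pT1 + pT2), where pT1 and pT2 are the transverse momenta of the more and less energetic jets in a pair. The sketch below illustrates the two regimes with made-up jet momenta; the numbers are illustrative only, not values from the measurements.

        def dijet_asymmetry(pt_leading, pt_subleading):
            """A_J = (pT1 - pT2) / (pT1 + pT2): zero for balanced jets, approaching one
            when the partner jet is almost completely absorbed by the medium."""
            return (pt_leading - pt_subleading) / (pt_leading + pt_subleading)

        # Illustrative (made-up) transverse momenta in GeV:
        print(f"balanced pair:  A_J = {dijet_asymmetry(210, 195):.2f}")   # ~0.04, typical of proton-proton collisions
        print(f"quenched pair:  A_J = {dijet_asymmetry(210, 70):.2f}")    # ~0.50, one jet largely absorbed by the plasma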

    Another example concerns heavy quarks. Such particles are excellent probes of the quark–gluon plasma because they are produced in the initial stages of a heavy-ion collision and therefore experience the entire evolution of the plasma. The ALICE collaboration has more recently shown that heavy quarks “feel” the shape and size of the quark–gluon plasma, indicating that even the heaviest quarks move with the medium, which is mostly made of light quarks and gluons.

    The LHC experiments, in particular ALICE and CMS, have also significantly improved our understanding of the hierarchical “melting” in the plasma of bound states of a heavy quark and its antiquark, called quarkonia. The more weakly bound the states are, the more easily they will melt, and as a result the less abundant they will be. CMS was the first to observe this so-called hierarchical suppression for bottomonium states, which consist of a bottom quark and its antiquark. And ALICE revealed that, while the most common form of charmonium states, which are composed of a charm quark and its antiquark, is highly suppressed due to the effect of the plasma, it is also regenerated by the recombination of charm quarks and antiquarks. This recombination phenomenon, observed for the first time at the LHC, provides an important testing ground for theoretical models and phenomenology, which forms a link between the theoretical models and experimental data.

    Surprises in smaller systems

    The LHC data have also revealed unexpected results. For example, the ALICE collaboration showed that the enhanced production of strange hadrons (particles containing at least one strange quark), which is traditionally viewed as a signature of the quark-gluon plasma, arises gradually in proton–proton and proton–lead collisions as the number of particles produced in the collisions, or “multiplicity”, increases.

    Another case in point is the gradual onset, with increasing multiplicity, of a flow-like feature with the shape of a ridge, which was first observed by CMS in proton–proton and proton–lead collisions. This result was further supported by ALICE and ATLAS observations of the emergence of double-ridge features in proton–lead collisions.

    6
    As the number of particles produced in proton–proton collisions increases (blue lines), so does the number of measured particles containing at least one strange quark (orange to red squares in the graph). Credit: CERN.

    “The LHC data have killed the long-held view that proton–proton collisions produce free-streaming sets of particles while heavy-ion collisions produce a fully developed quark–gluon plasma. And they tell us that in the small proton–proton collision systems there are more physical mechanisms at work than traditionally thought. The new challenge is to understand, within the theory of the strong force, how quark–gluon plasma-like properties emerge gradually with the size of the collision system.”

    These are just examples of how 10 years of the LHC have greatly advanced physicists’ knowledge of the quark–gluon plasma and thus of the early universe. And with data from the machine’s second run still being analysed and more data to come from the next run and from the High-Luminosity LHC, the machine’s upgrade, an even more detailed understanding of this unique state of matter is bound to emerge, perhaps with new surprises in the mix.

    “The coming decade at the LHC offers many opportunities for further exploration of the quark–gluon plasma,” says Luciano Musa, spokesperson of the ALICE collaboration. “The expected tenfold increase in the number of lead–lead collisions should both increase the precision of measurements of known probes of the medium and give us access to new probes. In addition, we plan to explore collisions between lighter nuclei, which could cast further light on the nature of the medium.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Meet CERN (CH) in a variety of places:

    Quantum Diaries

    Cern Courier (CH)

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS

    CERN ATLAS Image Claudia Marcelloni CERN/ATLAS


    ALICE

    CERN/ALICE Detector


    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    LHC

    CERN map

    CERN LHC Maximilien Brice and Julien Marius Ordan.


    SixTRack CERN LHC particles

     