Tagged: Particle Accelerators

  • richardmitnick 1:25 pm on June 22, 2018
    Tags: Particle Accelerators

    From Brookhaven Lab: “Upgrades to ATLAS and LHC Magnets for Run 2 and Beyond” 

    From Brookhaven Lab

    6.22.18

    Peter Genzer,
    genzer@bnl.gov
    (631) 344-3174

    The following news release was issued by CERN, the European Organization for Nuclear Research, home to the Large Hadron Collider (LHC). Scientists from the U.S. Department of Energy’s Brookhaven National Laboratory play multiple roles in the research at the LHC and are making major contributions to the high-luminosity upgrade described in this news release, including the development of new niobium tin superconducting magnets that will enable significantly higher collision rates; new particle tracking and signal readout systems for the ATLAS experiment that will allow scientists to capture and analyze the most significant details from vastly larger data sets; and increases in computing capacity devoted to analyzing and sharing that data with scientists around the world. Brookhaven Lab also hosts the Project Office for the U.S. contribution to the HL-LHC detector upgrades of the ATLAS experiment. For more information about Brookhaven’s roles in the high-luminosity upgrade or to speak with a Brookhaven/LHC scientist, contact Karen McNulty Walsh, (631) 344-8350, kmcnulty@bnl.gov.

    Brookhaven physicists play critical roles in LHC restart and plans for the future of particle physics.

    The ATLAS detector at the Large Hadron Collider, an experiment with large involvement from physicists at Brookhaven National Laboratory. Image credit: CERN

    July 6, 2015

    At the beginning of June, the Large Hadron Collider at CERN, the European research facility, began smashing together protons once again. The high-energy particle collisions taking place deep underground along the border between Switzerland and France are intended to allow physicists to probe the furthest edges of our knowledge of the universe and its tiniest building blocks.

    The Large Hadron Collider returns to operation after a two-year offline period, Long Shutdown 1, which allowed thousands of physicists worldwide to undertake crucial upgrades to the already cutting-edge particle accelerator. The LHC now begins its second multi-year operating period, Run 2, which will take the collider through 2018 with collision energies nearly double those of Run 1. In other words, Run 2 will nearly double the energies that allowed researchers to detect the long-sought Higgs boson in 2012.

    CERN CMS Higgs Event

    CERN ATLAS Higgs Event

    The U.S. Department of Energy’s Brookhaven National Laboratory is a crucial player in the physics program at the Large Hadron Collider, in particular as the U.S. host laboratory for the pivotal ATLAS experiment, one of the two large experiments that discovered the Higgs. Physicists at Brookhaven were busy throughout Long Shutdown 1, undertaking projects designed to maximize the LHC’s chances of detecting rare new physics as the collider reaches into a previously unexplored subatomic frontier.

    While the technology needed to produce a new particle is a marvel on its own terms, equally remarkable is everything the team at ATLAS and other experiments must do to detect these potentially world-changing discoveries. Because the production of such particles is a rare phenomenon, it isn’t enough to just be able to smash one proton into another. The LHC needs to be able to collide proton bunches, each bunch consisting of about a hundred billion particles, every 50 nanoseconds—eventually rising to every 25 nanoseconds in Run 2—and be ready to sort through the colossal amounts of data that all those collisions produce.
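    To put those numbers in perspective, here is a rough back-of-the-envelope sketch in Python. It assumes an uninterrupted bunch train, which the real LHC does not have (its fill pattern contains gaps), so the true average crossing rate is somewhat lower:

```python
# Rough bunch-crossing arithmetic using the figures quoted above.
# Assumes a uniform, gap-free bunch train (an idealization).
PROTONS_PER_BUNCH = 1e11  # "about a hundred billion" protons per bunch

for spacing_ns in (50, 25):
    crossings_per_second = 1e9 / spacing_ns  # nanoseconds per second / spacing
    protons_per_second = crossings_per_second * PROTONS_PER_BUNCH  # per beam
    print(f"{spacing_ns} ns spacing: {crossings_per_second:,.0f} crossings/s, "
          f"{protons_per_second:.1e} protons/s brought into collision per beam")
```

    Even at the longer 50-nanosecond spacing, that is tens of millions of bunch crossings every second, each a potential source of data.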

    It is with those interwoven challenges—maximizing the number of collisions within the LHC, capturing the details of potentially noteworthy collisions, and then managing the gargantuan amount of data those collisions produce—that scientists at Brookhaven National Laboratory are making their mark on the Large Hadron Collider and its search for new physics—and not just for the current Run 2, but looking forward to the long-term future operation of the collider.

    Restarting the Large Hadron Collider


    Brookhaven physicist Srini Rajagopalan, operation program manager for U.S. ATLAS, works to keep manageable the colossal amounts of data that are generated by the Large Hadron Collider and sent to Brookhaven’s RHIC and ATLAS Computing Facility.

    The Large Hadron Collider is the largest single machine in the world, so it’s tempting to think of its scale just in terms of its immense size. The twin beamlines of the particle accelerator sit about 300 to 600 feet underground in a circular tunnel nearly 17 miles around. Over 1,600 magnets, each weighing more than 25 tons, are required to keep the beams of protons focused and on the correct paths, and nearly 100 tons of liquid helium is necessary to keep the magnets operating at temperatures barely above absolute zero. Then there are the detectors, each of which stands several stories high.

    But the scale of the LHC extends not just in space, but in time as well. A machine of this size and complexity doesn’t just switch on or off with the push of a button, and even relatively simple maintenance can require weeks, if not months, to perform. That’s why the LHC recently completed Long Shutdown 1, a two-year offline period in which physicists undertook the necessary repairs and upgrades to get the collider ready for the next three years of near-continuous operation. As the U.S. host laboratory for the ATLAS experiment, Brookhaven National Laboratory was pivotal in upgrading and improving one of the cornerstones of the LHC apparatus.

    “After having run for three years, the detector needs to be serviced much like your car,” said Brookhaven physicist Srini Rajagopalan, operation program manager for U.S. ATLAS. “Gas leaks crop up that need to be fixed. Power supplies, electronic boards and several other components need to be repaired or replaced. Hence a significant amount of detector consolidation work occurs during the shutdown to ensure an optimal working detector when beam returns.”

    Beyond these vital repairs, the major goal of the upgrade work during Long Shutdown 1 was to increase the LHC’s center-of-mass energy from the previous 8 trillion electron volts (TeV) to 13 TeV, near the operational maximum of 14 TeV.

    “Upgrading the energy means you’re able to probe much higher mass ranges, and you have access to new particles that might be substantially heavier,” said Rajagopalan. “If you have a very heavy particle that cannot be produced, it doesn’t matter how much data you collect, you just cannot reach that. That’s why it was very important to go from 8 to 13 TeV. Doubling the energy allows us to access the new physics much more easily.”

    As the LHC probes higher and higher energies, the phenomena that the researchers hope to observe will happen more and more rarely, meaning the particle beams need to create many more collisions than they did before. Beyond this increase in collision rates, or luminosity, however, the entire infrastructure of data collection and management has to evolve to deal with the vastly increased volume of information the LHC can now produce.

    “Much of the software had to be evolved or rewritten,” said Rajagopalan, “from patches and fixes that are more or less routine software maintenance to implementing new algorithms and installing new complex data management systems capable of handling the higher luminosity and collision rates.”

    Making More Powerful Magnets

    Brookhaven physicist Peter Wanderer, head of the laboratory’s Superconducting Magnet Division, stands in front of the oven in which niobium tin is made into a superconductor.

    The Large Hadron Collider works by accelerating twin beams of protons to speeds close to that of light. The two beams, traveling in opposite directions along the path of the collider, both contain many bunches of protons, with each bunch containing about 100 billion protons. When the bunches of protons meet, not all of the protons inside them will interact, and only a tiny fraction of the colliding bunches are likely to yield potentially interesting physics. As such, it’s absolutely vital to control those beams to maximize the chances of useful collisions occurring.

    The best way to achieve that and the desired increase in luminosity—both during the current Run 2, and looking ahead to the long-term future of the LHC—is to tighten the focus of the beam. The more tightly packed protons are, the more likely they’ll smash into each other. This means working with the main tool that controls the beam inside the accelerator: the magnets.

    FNAL magnets such as this one, mounted on a test stand at Fermilab, are being developed for the High-Luminosity LHC. Photo: Reidar Hahn

    “Most of the length of the circumference along a circular machine like the LHC is taken up with a regular sequence of magnets,” said Peter Wanderer, head of Brookhaven Lab’s Superconducting Magnet Division, which made some of the magnets for the current LHC configuration and is working on new designs for future upgrades. “The job of these magnets is to bend the proton beams around to the next point or region where you can do something useful with them, like produce collisions, without letting the beam get larger.”

    A beam of protons is a bunch of positively charged particles that all repel one another, so they want to move apart, he explained. Physicists therefore use magnetic fields to keep the particles from moving away from the desired path.

    “You insert different kinds of magnets, different sequences of magnets, in order to make the beams as small as possible, to get the most collisions possible when the beams collide,” Wanderer said.

    The magnets currently in use in the LHC are made of the superconducting material niobium titanium (NbTi). When the electromagnets are cooled in liquid helium to temperatures of about 4 Kelvin (-452.5 degrees Fahrenheit), they lose all electric resistance and are able to achieve a much higher current density compared with a conventional conductor like copper. A magnetic field gets stronger as its current is more densely packed, meaning a superconductor can produce a much stronger field over a smaller radius than copper.

    But there’s an upper limit to how high a field the present niobium titanium superconductors can reach. So Wanderer and his team at Brookhaven have been part of a decade-long project to refine the next generation of superconducting magnets for a future upgrade to the LHC. These new magnets will be made from niobium tin (Nb3Sn).

    “Niobium tin can go to higher fields than niobium titanium, which will give us even stronger focusing,” Wanderer said. “That will allow us to get a smaller beam, and even more collisions.” Niobium tin can also function at a slightly higher temperature, so the new magnets will be easier to cool than those currently in use.

    There are a few catches. For one, niobium tin, unlike niobium titanium, isn’t initially superconducting. The team at Brookhaven has to first heat the material for two days at 650 degrees Celsius (1200 degrees Fahrenheit) before beginning the process of turning the raw materials into the wires and cables that make up an electromagnet.

    “And when niobium tin becomes a superconductor, then it’s very brittle, which makes it really challenging,” said Wanderer. “You need tooling that can withstand the heat for two days. It needs to be very precise, to within thousandths of an inch, and when you take it out of the tooling and want to put it into a magnet, and wrap it with iron, you have to handle it very carefully. All that adds a lot to the cost. So one of the things we’ve worked out over 10 years is how to do it right the first time, almost always.”

    Fortunately, there’s still time to work out any remaining kinks. The new niobium tin magnets aren’t set to be installed at the LHC until around 2022, when the changeover from niobium titanium to niobium tin will be a crucial part of converting the Large Hadron Collider into the High-Luminosity Large Hadron Collider (HL-LHC).

    Managing Data at Higher Luminosity

    As the luminosity of the LHC increases in Run 2 and beyond, perhaps the biggest challenge facing the ATLAS team at Brookhaven lies in recognizing a potentially interesting physics event when it occurs. That selectivity is crucial, because even CERN’s worldwide computing grid—which includes about 170 global sites, and of which Brookhaven’s RHIC and ATLAS Computing Facility is a major center—can only record the tiniest fraction of the more than 100 million collisions that occur each second. That means it’s just as important to quickly recognize the millions of events that don’t need to be recorded as it is to recognize the handful that do.

    “What you have to do is, on the fly, analyze each event and decide whether you want to save it to disk for later use or not,” said Rajagopalan. “And you have to be careful you don’t throw away good physics events. So you’re looking for signatures. If it’s a good signature, you say, ‘Save it!’ Otherwise, you junk it. That’s how you bring the data rate down to a manageable amount you can write to disk.”

    Physicists screen out unwanted data using what’s known as a trigger system. The principle is simple: as the data from each collision comes in, it’s analyzed for a preset signature pattern, or trigger, that would mark it as potentially interesting.
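    As a concrete, if drastically simplified, illustration, the sketch below filters a stream of fake events against a tiny "menu" of signatures. All names, fields and thresholds here are invented for the example; the real ATLAS trigger runs in custom electronics and large computing farms, not in a Python script:

```python
import random

# Hypothetical, simplified trigger menu: each signature is a predicate
# over a few coarse event quantities. Real signatures are far richer.
def high_energy_electron(event):
    return event["electron_et_gev"] > 25.0

def high_energy_jet(event):
    return event["leading_jet_et_gev"] > 100.0

TRIGGER_MENU = [high_energy_electron, high_energy_jet]

def passes_trigger(event):
    # Keep the event if any signature fires; otherwise it is gone forever.
    return any(signature(event) for signature in TRIGGER_MENU)

# Generate fake events with steeply falling energy spectra, then filter them.
random.seed(1)
events = [{"electron_et_gev": random.expovariate(1 / 5.0),
           "leading_jet_et_gev": random.expovariate(1 / 20.0)}
          for _ in range(100_000)]
kept = [event for event in events if passes_trigger(event)]
print(f"kept {len(kept):,} of {len(events):,} events ({len(kept) / len(events):.2%})")
```

    Only the events that fire at least one signature survive to be written to disk, which is the whole point: the overwhelming majority are discarded on the fly.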

    “We can change the trigger, or make the trigger more sophisticated to be more selective,” said Brookhaven’s Howard Gordon, a leader in the ATLAS physics program. “If we don’t select the right events, they are gone forever.”

    The current trigger system can handle the luminosities of Run 2, but with future upgrades it will no longer be able to screen out and reject enough collisions to keep the number of recorded events manageable. So the next generation of ATLAS triggers will have to be even more sophisticated in terms of what they can instantly detect—and reject.

    A more difficult problem comes with the few dozen overlapping interactions in each bunch crossing that look like they might be interesting, but aren’t.

    “Not all protons in a bunch interact, but it’s not necessarily going to be only one proton in a bunch that interacts with a proton from the opposite bunch,” said Rajagopalan. “You could have 50 of them interact. So now you have 50 events on top of each other. Imagine the software challenge when just one of those is the real, new physics we’re interested in discovering, but you have all these 49 others—junk!—sitting on top of it.”

    “We call it pileup!” Gordon quipped.

    Finding one good result among 50 is tricky enough, but in 10 years that number will be closer to 1 in 150 or 200, with all those additional extraneous results interacting with each other and adding exponentially to the complexity of the task. Being able to recognize instantly as many characteristics of the desired particles as possible will go a long way to keeping the data manageable.
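    A small simulation shows why this gets so hard. Assuming, as is roughly true, that the number of simultaneous interactions per bunch crossing follows a Poisson distribution, one can sample the pileup at the two means quoted above. Only the means come from the article; everything else is illustrative:

```python
import math
import random

def poisson_sample(mu):
    # Knuth's algorithm: multiply uniform draws until the running product
    # falls below e^-mu. Fine for a toy; slow for very large mu.
    threshold = math.exp(-mu)
    k, product = 0, 1.0
    while product > threshold:
        k += 1
        product *= random.random()
    return k - 1

random.seed(0)
for mu in (50, 200):  # today's pileup vs. the "1 in 150 or 200" era
    draws = [poisson_sample(mu) for _ in range(10_000)]
    print(f"mean {mu}: average pileup {sum(draws) / len(draws):.1f}, "
          f"worst crossing in 10,000 had {max(draws)} overlapping events")
```

    The fluctuations matter as much as the mean: the busiest crossings carry far more than the average number of overlapping events, and the one interesting interaction has to be untangled from all of them.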

    Further upgrades are planned over the next decade to cope with the ever-increasing luminosity and collision rates. For example, the Brookhaven team and collaborators will be working to develop an all-new silicon tracking system and a full replacement of the readout electronics with state-of-the-art technology that will allow physicists to collect and analyze ten times more data for LHC Run 4, scheduled for 2026.

    The physicists at CERN, Brookhaven, and elsewhere have strong motivation for meeting these challenges. Doing so will not only offer the best chance of detecting rare physics events and expanding the frontiers of physics, but will also allow the physicists to do it within a reasonable timespan.

    As Rajagopalan put it, “We are ready for the challenge. The next few years are going to be an exciting time as we push forward to explore a new uncharted energy frontier.”

    Brookhaven’s role in the LHC is supported by the DOE Office of Science.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition


    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

  • richardmitnick 2:41 pm on June 21, 2018
    Tags: Particle Accelerators

    From Fermilab: “New laser technology shows success in particle accelerators” 

    FNAL Art Image by Angela Gonzales

    From Fermilab, an enduring source of strength for the US contribution to scientific research worldwide.

    June 21, 2018
    Sarah Lawhun

    David Johnson, left, and Todd Johnson work on the recently installed laser notcher in the Fermilab accelerator complex. The laser notcher, the first application of its kind in an in-production particle accelerator, has helped boost particle beam production at the lab. Photo: Reidar Hahn

    Lasers — used in medicine and manufacturing, and made wildly popular by science fiction — are finding a new use in particle physics.

    Fermilab scientists and engineers have developed a tool called a laser notcher, which takes advantage of the laser’s famously precise targeting abilities to do something unexpected: boost the number of particles that accelerators send to experiments. It’s cranked up the lab’s particle output considerably — by an incredible 15 percent — giving scientists more opportunities to study nature’s tiniest constituents.

    While lasers have been used for accelerator tests and diagnostics, this is the first application of its kind in a fully operational accelerator.

    “For such a new design, the laser notcher has been remarkably reliable,” said Fermilab engineer Bill Pellico, who manages one of the laboratory’s major accelerator upgrade programs, called the Proton Improvement Plan. “It’s already shown it will provide a considerable increase in the number of particles we can produce.”

    The notcher increases particle production, counterintuitively, by removing particles from a particle beam.

    Bunching out

    The process of removing particles isn’t new. Typically, an accelerator generates a particle beam in bunches — compact packets that each contain hundreds of millions of particles. Imagine each bunch in a beam as a pearl on a strand. Bunches can be arranged in patterns according to the acceleration needs. Perhaps the needed pattern is an 80-bunch-long string followed by a three-bunch-long gap. Often, the best way to create the gap is to start with a regular, uninterrupted string of bunches and simply remove the unneeded ones, as the sketch below illustrates.
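    In code, the target pattern is easy to state. The sketch below just encodes the article's example (80 filled bunch slots followed by a 3-bunch gap, repeating); the representation is invented here and has nothing to do with Fermilab's actual control software:

```python
TRAIN_LENGTH = 80  # consecutive filled bunches, per the article's example
GAP_LENGTH = 3     # bunches removed to form the gap

def notched_pattern(n_slots):
    """True = bunch kept, False = bunch removed to create the gap."""
    period = TRAIN_LENGTH + GAP_LENGTH
    return [(slot % period) < TRAIN_LENGTH for slot in range(n_slots)]

pattern = notched_pattern(2 * (TRAIN_LENGTH + GAP_LENGTH))
print("".join("o" if kept else "." for kept in pattern))
# prints two periods: 80 'o' characters, '...', 80 more 'o' characters, '...'
```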

    But it isn’t so simple. Traditionally, beam bunches are kicked out by a fast-acting magnet, called a magnetic kicker. It’s a messy business: Particles fly off, strike beamline walls and generally create a subatomic obstacle course for the beam. While it’s not impossible for the beam to pass through such a scene, it also isn’t smooth sailing.

    Accelerator experts refer to the messy phenomenon as beam loss, and it’s a measurable, predictable predicament. They accommodate it by holding back on the amount of beam they accelerate in the first place, setting a ceiling on the number of particles they pack into the beam.

    That ceiling is a limitation for Fermilab’s new and upcoming experiments, which require ever greater numbers of particles, more than the accelerator complex could previously handle. So the lab’s accelerator specialists look for ways to raise the particle beam ceiling and meet the experimental needs for beam.

    The most straightforward way to do this is to eliminate the thing that’s keeping the ceiling low and stifling particle delivery — beam loss.

    Lasers against loss

    The new laser notcher works by directing powerful pulses of laser light at particle bunches, taking them out of commission. Both the position and precision of the notcher allow it to create gaps cleanly — delivering a one-two punch in curbing beam loss.

    First, the notcher is positioned early in the series of Fermilab’s accelerators, when the particle beam hasn’t yet achieved the close-to-light speeds it will attain by the time it exits the accelerator chain. (At this early stage, the beam lumbers along at 4 percent of the speed of light, about 27 million miles per hour.) This far upstream, the beam loss resulting from ejecting bunches doesn’t have much of an impact.

    “We moved the process to a place where, when we lose particles, it really doesn’t matter,” said David Johnson, Fermilab engineering physicist who led the laser notcher project.

    Second, the laser notcher is, like a scalpel, surgical in its bunch removal. It ejects bunches precisely, individually, bunch by bunch. That enables scientists to create gaps of exactly the right lengths needed by later acceleration stages.

    For Fermilab’s accelerator chain, the winning formula is for the notcher to create a gap that is 80 nanoseconds (billionths of a second) long every 2,200 nanoseconds. It’s the perfect-length gap needed by one of Fermilab’s later-stage accelerators, called the Booster.
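    A quick check with the article's numbers shows how little beam the notching itself costs, which is why the 15 percent net gain reported below is possible:

```python
# Duty-cycle arithmetic using only the figures quoted in the article.
GAP_NS = 80        # length of each notch
PERIOD_NS = 2_200  # one notch per 2,200 ns of beam

removed = GAP_NS / PERIOD_NS
print(f"fraction of beam removed by notching: {removed:.1%}")  # about 3.6%
# Sacrificing ~3.6% of the beam cleanly lets operators raise the intensity
# ceiling that messy beam loss used to impose, for a ~15% net gain.
```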

    A graceful exit

    The Fermilab Booster feeds beam to the next accelerator stages or directly to experiments.

    Prior to the laser notcher’s installation, a magnetic kicker would boot specified bunches as they entered the Booster, resulting in messy beam loss.

    With the laser notcher now on the scene, the Booster receives a beam that has prefab, well-defined gaps. These 80-nanosecond-long windows of opportunity mean that, as the beam leaves the Booster and heads toward its next stop, it can make a clean, no-fuss, no-loss exit.

    With Booster beam loss brought down to low levels, Fermilab accelerator operators can raise the ceiling on the numbers of particles they can pack into the beam. The results so far are promising: The notcher has already allowed beam power to increase by a whopping 15 percent.

    Thanks to this innovation and other upgrade improvements, the Booster accelerator is now operating at its highest efficiency ever and at record-setting beam power.

    “Although lasers have been used in proton accelerators in the past for diagnostics and tests, this is the first-of-its-kind application of lasers in an operational proton synchrotron, and it establishes a technological framework for using laser systems in a variety of other bunch-by-bunch applications, which would further advance the field of high-power proton accelerators,” said Sergei Nagaitsev, head of the Fermilab Office of Accelerator Science Programs.

    Plentiful protons and other particles

    The laser notcher, installed in January, is a key part of a larger program, the Proton Improvement Plan (PIP), to upgrade the lab’s chain of particle accelerators to produce powerful proton beams.

    As the name of the program implies, it starts with protons.

    Fermilab sends protons barreling through the lab’s accelerator complex, and they’re routed to various experiments. Along the way, some of them are transformed into other particles needed by experiments, for example into neutrinos—tiny, omnipresent particles that could hold the key to filling in gaps in our understanding of the universe’s evolution. Fermilab’s experiments need boatloads of these particles to carry out the lab’s scientific program. Some of the protons are transformed into muons, which can provide scientists with hints about the nature of the vacuum.

    With more protons coming down the pipe, thanks to PIP and the laser notcher, the accelerator can generate more neutrinos, muons and other particles, feeding Fermilab’s muon experiments, Muon g-2 and Mu2e, and its neutrino experiments, including its largest operating neutrino experiment, NOvA, and its flagship, the Deep Underground Neutrino Experiment and Long-Baseline Neutrino Facility.

    “Considering all the upgrades and improvements to Fermilab accelerators as a beautiful cake with frosting, the increase in particle production we managed to achieve with the laser notcher is like the cherry on top of the cake,” Nagaitsev said.

    “It’s a seemingly small change with a significant impact,” Johnson said.

    As the Fermilab team moves forward, they’ll continue to put the notcher through its paces, investigating paths for improvement.

    With this innovation, Fermilab adds another notch in the belt of what lasers can do.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition


    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a U.S. Department of Energy national laboratory specializing in high-energy particle physics, and America’s premier laboratory for particle physics and accelerator research. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.



     
  • richardmitnick 1:49 pm on June 19, 2018
    Tags: HL-LHC, Particle Accelerators

    From CERN: “Major work starts to boost the luminosity of the LHC” 


    From CERN

    Civil works have begun on the ATLAS and CMS sites to build new underground structures for the High-Luminosity LHC. (Image: Julien Ordan / CERN)


    The Large Hadron Collider (LHC) is officially entering a new stage. Today, a ground-breaking ceremony at CERN celebrates the start of the civil-engineering work for the High-Luminosity LHC (HL-LHC): a new milestone in CERN’s history. By 2026 this major upgrade will have considerably improved the performance of the LHC, by increasing the number of collisions in the large experiments and thus boosting the probability of the discovery of new physics phenomena.

    The LHC started colliding particles in 2010. Inside the 27-km LHC ring, bunches of protons travel at almost the speed of light and collide at four interaction points. These collisions generate new particles, which are measured by detectors surrounding the interaction points. By analysing these collisions, physicists from all over the world are deepening our understanding of the laws of nature.

    While the LHC is able to produce up to 1 billion proton-proton collisions per second, the HL-LHC will increase this number, referred to by physicists as “luminosity”, by a factor of between five and seven, allowing about 10 times more data to be accumulated between 2026 and 2036. This means that physicists will be able to investigate rare phenomena and make more accurate measurements. For example, the LHC allowed physicists to unearth the Higgs boson in 2012, thereby making great progress in understanding how particles acquire their mass. The HL-LHC upgrade will allow the Higgs boson’s properties to be defined more accurately, and to measure with increased precision how it is produced, how it decays and how it interacts with other particles. In addition, scenarios beyond the Standard Model will be investigated, including supersymmetry (SUSY), theories about extra dimensions and quark substructure (compositeness).

    “The High-Luminosity LHC will extend the LHC’s reach beyond its initial mission, bringing new opportunities for discovery, measuring the properties of particles such as the Higgs boson with greater precision, and exploring the fundamental constituents of the universe ever more profoundly,” said CERN Director-General Fabiola Gianotti.

    The HL-LHC project is an international endeavour involving 29 institutes from 13 countries. It began in November 2011, and two years later it was identified as one of the main priorities of the European Strategy for Particle Physics, before the project was formally approved by the CERN Council in June 2016. After successful prototyping, many new hardware elements will be constructed and installed in the years to come. Overall, more than 1.2 km of the current machine will need to be replaced with many new high-technology components such as magnets, collimators and radiofrequency cavities.

    Prototype of a quadrupole magnet for the High-Luminosity LHC. (Image: Robert Hradil, Monika Majer/ProStudio22.ch)

    FNAL magnets such as this one, mounted on a test stand at Fermilab, are being developed for the High-Luminosity LHC. Photo: Reidar Hahn

    The secret to increasing the collision rate is to squeeze the particle beam at the interaction points so that the probability of proton-proton collisions increases. To achieve this, the HL-LHC requires about 130 new magnets, in particular 24 new superconducting focusing quadrupoles to focus the beam and four superconducting dipoles. Both the quadrupoles and dipoles reach a field of about 11.5 tesla, as compared to the 8.3 tesla dipoles currently in use in the LHC. Sixteen brand-new “crab cavities” will also be installed to maximise the overlap of the proton bunches at the collision points. Their function is to tilt the bunches so that they appear to move sideways – just like a crab.

    FNAL Crab cavities for the HL-LHC

    CERN crab cavities that will be used in the HL-LHC

    Another key ingredient in increasing the overall luminosity in the LHC is to enhance the machine’s availability and efficiency. For this, the HL-LHC project includes the relocation of some equipment to make it more accessible for maintenance. The power converters of the magnets will thus be moved into separate galleries, connected by innovative new superconducting cables capable of carrying up to 100 kA with almost zero energy dissipation.

    “Audacity underpins the history of CERN and the High-Luminosity LHC writes a new chapter, building a bridge to the future,” said CERN’s Director for Accelerators and Technology, Frédérick Bordry. “It will allow new research and with its new innovative technologies, it is also a window to the accelerators of the future and to new applications for society.”

    To allow all these improvements to be carried out, major civil-engineering work at two main sites is needed, in Switzerland and in France. This includes the construction of new buildings, shafts, caverns and underground galleries. Tunnels and underground halls will house new cryogenic equipment, the electrical power supply systems and various plants for electricity, cooling and ventilation.

    During the civil engineering work, the LHC will continue to operate, with two long technical stop periods that will allow preparations and installations to be made for high luminosity alongside yearly regular maintenance activities. After completion of this major upgrade, the LHC is expected to produce data in high-luminosity mode from 2026 onwards. By pushing the frontiers of accelerator and detector technology, it will also pave the way for future higher-energy accelerators.


    The LHC will receive a major upgrade and transform into the High-Luminosity LHC over the coming years. But what does this mean and how will its goals be achieved? Find out in this video featuring several people involved in the project. (Video: Polar Media/CERN.)

    Fermilab is leading the U.S. contribution to the HL-LHC, in addition to building new components for the upgraded detector for the CMS experiment. The main innovation contributed by the United States for the HL-LHC is a novel type of accelerator cavity that uses a breakthrough superconducting technology.

    Fermilab is also contributing to the design and construction of superconducting magnets that will focus the particle beam much more tightly than the magnets currently in use in the LHC. Fermilab scientists and engineers have also partnered with other CMS collaborators on new designs for tracking modules in the CMS detector, enabling it to respond more quickly to the increased number of collisions in the HL-LHC.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition


     
  • richardmitnick 11:52 am on June 19, 2018
    Tags: Halina Abramowicz, Particle Accelerators

    From Symmetry: Women in STEM – “Q&A: Planning Europe’s physics future” with Halina Abramowicz

    From Symmetry

    06/13/18
    Lauren Biron

    Artwork by Sandbox Studio, Chicago

    Halina Abramowicz leads the group effort to decide the future of European particle physics.

    Physics projects are getting bigger, more global, more collaborative and more advanced than ever—with long lead times for complex physics machines. That translates into more international planning to set the course for the future.

    In 2014, the United States particle physics community set its priorities for the coming years using recommendations from the Particle Physics Project Prioritization Panel, or P5.


    In 2020, the European community will refresh its vision with the European Strategy Update for Particle Physics.

    The first European strategy launched in 2006 and was revisited in 2013. In 2019, teams will gather input through planning meetings in preparation for the next refresh.

    Halina Abramowicz, a physicist who works on the ATLAS experiment at CERN’s Large Hadron Collider and the FCAL research and development collaboration through Tel Aviv University, is the chair of the massive undertaking. During a visit to Fermilab to provide US-based scientists with an overview of the process, she sat down with Symmetry writer Lauren Biron to discuss the future of physics in Europe.

    LB: What do you hope to achieve with the next European Strategy Update for Particle Physics?
    HA: Europe is a very good example of the fact that particle physics is very international, because of the size of the infrastructure that we need to progress, and because of the financial constraints.

    The community of physicists working on particle physics is very large; Europe has probably about 10,000 physicists. They have different interests, different expertise, and somehow, we have to make sure to have a very balanced program, such that the community is satisfied, and that at the same time it remains attractive, dynamic, and pushing the science forward. We have to take into account the interests of various national programs, universities, existing smaller laboratories, CERN, and make sure that there is a complementarity, a spread of activities—because that’s the way to keep the field attractive, that is, to be able to answer more questions faster.

    LB: How do you decide when to revisit the European plan for particle physics?
    HA: Once the Higgs was discovered, it became clear that it was time to revisit the strategy, and the first update happened in 2013. The recommendation was to vigorously pursue the preparations for the high-luminosity upgrade of the [Large Hadron Collider].

    The high-luminosity LHC program was formally approved by the CERN Council in September 2016. By the end of 2018, the LHC experiments will have collected almost a factor of 10 more data. It will be a good time to reflect on the latest results, to think about mid-term plans, to discuss what are the different options to consider next and their possible timelines, and to ponder what would make sense as we look into the long-term future.


    The other aspect which is very important is the fact that the process is called “strategy,” rather than “roadmap,” because it is a discussion not only of the scientific goals and associated projects, but also of how to achieve them. The strategy basically is about everything that the community should be doing in order to achieve the roadmap.

    LB: What’s the difference between a strategy and a roadmap?
    HA: The roadmap is about prioritizing the scientific goals and about the way to address them, while the strategy covers also all the different aspects to consider in order to make the program a success. For example, outreach is part of the strategy. We have to make sure we are doing something that society knows about and is interested in. Education: making sure we share our knowledge in a way which is understandable. Detector developments. Technology transfer. Work with industry. Making sure the byproducts of our activities can also be used for society. It’s a much wider view.

    LB: What is your role in this process?
    HA: The role of the secretary of the strategy is to organize the process and to chair the discussions so that there is an orderly process. At this stage, we have one year to prepare all the elements of the process that are needed—i.e. to collect the input. In the near future we will have to nominate people for the physics preparatory group that will help us organize the open symposium, which is basically the equivalent of a town-hall meeting.

    The hope is that if it’s well organized and we can reach a consensus, especially on the most important aspects, the outcome will come from the community. We have to make sure through interaction with the European community and the worldwide community that we aren’t forgetting anything. The more inputs we have, the better. It is very important that the process be open.

    The first year we debate the physics goals and try to organize the community around a possible plan. Then comes the process that is maybe a little shorter than a year, during which the constraints related to funding and interests of various national communities have to be integrated. I’m of course also hoping that we will get, as an input to the strategy discussions, some national roadmaps. It’s the role of the chair to keep this process flowing.

    LB: Can you tell us a little about your background and how you came to serve as the chair for European Strategy Update?
    HA: That’s a good question. I really don’t know. I did my PhD in 1978; I was one of the youngest PhDs of Warsaw University, thus I’ve spent 40 years in the field. That means that I have participated in at least five large experiments and at least two or three smaller projects. I have a very broad view—not necessarily a deep view—but a broad view of what’s happening.

    LB: There are major particle physics projects going on around the world, like DUNE in the US and Belle II in Japan. How much will the panel look beyond Europe to coordinate activities, and how will it incorporate feedback from scientists on those projects?

    FNAL LBNF/DUNE from FNAL to SURF, Lead, South Dakota, USA

    KEK Belle 2 detector, in Tsukuba, Ibaraki Prefecture, Japan

    HA: This is one of the issues that was very much discussed during my visit. We shouldn’t try to organize the whole world—in fact, a little bit of competition is very healthy. And complementarity is also very important.

    At the physics-level discussions, we’ll make sure that we have representatives from the United States and other countries so we are provided with all the information. As I was discussing with many people here, if there are ideas, experiments or existing collaborations which already include European partners, then of course, there is no issue [because the European partners will provide input to the strategy].

    LB: How do you see Europe working with Asia, in particular China, which has ambitions for a major collider?
    HA: Collaboration is very important, and at the global level we have to find the right balance between competition, which is stimulating, and complementarity. So we’re very much hoping to have one representative from China in the physics preparatory group, because China seems to have ambitions to realize some of the projects which have been discussed. And I’m not talking only about the equivalent of [the Future Circular Collider]; they are also thinking about an [electron-positron] circular collider, and there are also other projects that could potentially be realized in China. I also think that if the Chinese community decides on one of these projects, it may need contributions from around the world. Funding is an important aspect for any future project, but it is also important to reach a critical mass of expertise, especially for large research infrastructures.

    LB: This is a huge effort. What are some of the benefits and challenges of meeting with physicists from across Europe to come up with a single plan?
    HA: The benefits are obvious. The more input we have, the fuller the picture we have, and the more likely we are to converge on something that satisfies maybe not everybody, but at least the majority—which I think is very important for a good feeling in the community.

    The challenges are also obvious. On one hand, we rely very much on individuals and their creative ideas. These are usually the people who also happen to be the big pushers and tend to generate most controversies. So we will have to find a balance to keep the process interesting but constructive. There is no doubt that there will be passionate and exciting discussions that will need to happen; this is part of the process. There would be no point in only discussing issues on which we all agree.

    The various physics communities, in the ideal situation, get organized. We have the neutrino community, [electron-positron collider] community, precision measurements community, the axion community—and here you can see all kinds of divisions. But if these communities can get organized and come up with what one could call their own white paper, or what I would call a 10-page proposal, of how various projects could be lined up, and what would be the advantages or disadvantages of such an approach, then the job will be very easy.

    LB: And that input is what you’re aiming to get by December 2018?
    HA: Yes, yes.

    LB: How far does the strategy look out?
    HA: It doesn’t have an end date. This is why one of the requests for the input is for people to estimate the time scale—how much time would be needed to prepare and to realize the project. This will allow us to build a timeline.

    We have at present a large project that is approved: the high-luminosity LHC. This will keep an important part of our community busy for the next 10 to 20 years. But will the entire community remain fully committed for the whole duration of the program if there are no major discoveries?

    I’m not sure that we can be fed intellectually by one project. I think we need more than one. There’s a diversity program—diversity in the sense of trying to maximize the physics output by asking questions which can be answered with the existing facilities. Maybe this is the time to pause and diversify while waiting for the next big step.

    LB: Do you see any particular topics that you think are likely to come up in the discussion?
    HA: There are many questions on the table. For example, should we go for a proton-proton or an [electron-positron] program? There are, for instance, voices advocating for a dedicated Higgs factory, which would allow us to make measurements of the Higgs properties to a precision that would be extremely hard to achieve at the LHC. So we will have to discuss if the next machine should be an [electron-positron] machine and check whether it is realistic and on what time scale.

    One of the subjects that I’m pretty sure will come up as well is about pushing the accelerating technologies. Are we getting to the limit of what we can do with the existing technologies, and is it time to think about something else?

    To learn more about the European Strategy Update for Particle Physics, watch Abramowicz’s colloquium at Fermilab.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 1:09 pm on June 11, 2018
    Tags: A wealth of ALICE results at Quark Matter 2018, Particle Accelerators

    From ALICE at CERN: “A wealth of ALICE results at Quark Matter 2018” 


    From ALICE at CERN

    Contributing 35 talks and almost 100 posters, the ALICE Collaboration had a strong presence at QM2018, which took place in Venice from 13 to 19 May, and presented an ample set of new results.


    The 27th edition of the Quark Matter conference took place in Venice during the week of 13–19 May. In ALICE, preparations for the conference started months ago, with new analyses in all the Physics Working Groups. More than 70 new preliminary results were reviewed and approved for the conference, and 16 new papers were made available on the preprint server right before the start of the conference.

    In the opening session, Alexander Kalweit presented the ALICE highlights talk, which covered many of the new results, and provided pointers to the 35 contributed talks and almost 100 posters that ALICE presented in the following days. The new results from ALICE covered a broad set of topics, including particle production in pp, p-Pb, and Pb-Pb collisions, and for the first time also Xe-Xe collisions, which were produced by the LHC in a short test run in October 2017. In the following, we will highlight a few of the new results that were presented by ALICE at Quark Matter.

    Alexander Kalweit giving his talk on the ALICE highlights.

    For the small-system collisions, pp and p-Pb, we reported on recent measurements of the production of resonances and (anti-)nuclei as a function of the total charged-particle multiplicity. These results show an intriguing dependence of the production of high-momentum particles on the overall multiplicity, which is likely due to the occurrence of multiple semi-hard scatterings in a single proton-proton collision. The goal of these measurements is to study the onset of effects such as strangeness enhancement and radial and elliptic flow, which are typically associated with Quark-Gluon Plasma formation.

    New results on heavy-flavour production in p-Pb collisions show that the charmed baryon production rate is much larger than was expected from electron-positron collisions, and the baryon-to-meson ratio is characterized by a maximum at intermediate pT, which is also seen for light-flavour baryon-to-meson ratios. This suggests that there is a common production mechanism for light-flavour baryons like the proton and Λ baryon and for the charmed Λc baryon. A first result on Λc baryon production in Pb-Pb collisions was presented, which also shows a large baryon/meson ratio. Improving the precision of these measurements is one of the goals of the detector upgrade programme, which was discussed in another session.

    New results on production of the quarkonia J/Ψ and Ψ(2s) in p-Pb collisions at √sNN = 8.2 TeV provide more precise information on the density distributions of quarks and gluons in the nucleus. The production of Ψ(2s) is significantly suppressed with respect to expectations from proton-proton collisions, even in the proton-going direction where no suppression is seen for the lower-mass J/Ψ. This suppression is not fully understood yet, but may be coming from final-state interactions with light particles at similar momenta.

    In October last year, the LHC collided Xe nuclei for a few hours and ALICE recorded about two million collisions, which allow studies of the dependence of particle production and QGP effects on the size of the colliding nuclei: the isotope of Xe that was used has 129 nucleons, whereas Pb has 208 nucleons. The total multiplicity (number of produced particles) in Xe-Xe collisions is found to be similar to that in Pb-Pb with the same number of participating nucleons, except for very central Xe-Xe collisions, where an increase of particle production per participant pair is found. The elliptic and triangular flow in Xe-Xe and Pb-Pb collisions are very similar when comparing analogous centralities, as expected because of the similarity of the initial shape of the system. The smaller number of nucleons in Xe leads to larger fluctuations of the initial geometry, which in turn lead to a larger flow signal in central events; the measured values agree with model calculations. The relative abundances of light-flavour hadrons in the new Xe-Xe data confirm the previously established picture that particle chemistry depends mostly on final-state particle multiplicity at LHC energies. Finally, for high-momentum particle production, we observe a similar nuclear modification factor in Xe and Pb when comparing collisions with the same multiplicity. This is qualitatively in line with expectations, since parton energy loss depends on the density and the volume of the system, but more detailed model comparisons are being pursued.

    A dedicated study of the nuclear modification factor of peripheral Pb-Pb collisions shows that, while the suppression of high-momentum particle production that is associated with parton energy loss initially decreases when the collisions become less central, it increases again for very peripheral collisions. This non-monotonic behavior suggests that there is a different mechanism that suppresses high-momentum particle production in very peripheral collisions; one possible explanation is that the individual nucleon-nucleon collisions in the nuclear collision have larger impact parameters and that this reduces the number of parton scatterings, and thus the particle production at high transverse momentum. It is also relevant for the interpretation of collisions of small systems, where the observed azimuthal anisotropy suggests that final state interactions are important, but no suppression of final state particle production is found.

    ALICE also presented a first attempt to measure azimuthal anisotropy of direct photons at the LHC, which probes the time evolution of the temperature and pressure in the Quark Gluon Plasma. The measured signal is large, suggesting the importance of late emission of photons. However, the uncertainties are still sizeable and further improvements are needed to firmly establish this conclusion.

    The Quark Gluon Plasma is also studied using high momentum particles that traverse the plasma and interact with it. At the conference, ALICE presented new results on the nuclear modification factor for jets, as well as studies of the substructure of jets, which aim to be directly sensitive to the radiation of gluons by fast partons as they go through the plasma. A suppression of large-angle symmetric splittings is found, which suggests that partons in a parton shower interact independently with the Quark Gluon Plasma if the angle between them is large enough.

    All in all, the Quark Matter conference was a very interesting meeting, with an unprecedented number of new results from ALICE and the other experiments, as well as discussions of new ideas from theorist colleagues. The topics presented above are only a small selection of what was shown at the conference. The release of a large number of new results has sparked many new discussions, which are being followed up in several places, and we are looking forward to the new insights into strongly interacting matter that this will bring.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition


     
  • richardmitnick 12:56 pm on June 10, 2018
    Tags: ATLAS Trigger, Particle Accelerators

    From CERN ATLAS: “In conversation with Nick Ellis, one of the architects of the ATLAS Trigger” 

    From CERN ATLAS

    10th June 2018
    Kate Shaw

    Nick Ellis with the ATLAS trigger system. (Image: K. Anthony/ATLAS Collaboration)

    A long-standing member of the ATLAS Collaboration, CERN physicist Nick Ellis was one of the original architects of the ATLAS Trigger. Working in the 1980s and 1990s, Nick led groups developing innovative ways to move and process huge quantities of data for the next generation of colliders. It was a challenge some thought was impossible to meet. Nick currently leads the CERN ATLAS Trigger and Data Acquisition Group and shared his wealth of experience as a key part of the ATLAS Collaboration.

    I first became involved in what was to become the ATLAS Collaboration in the mid- to late-1980s. I had been working on the UA1 experiment at CERN’s SPS proton–antiproton collider for several years on various physics analyses and also playing a leading role on the UA1 trigger.

    People were starting to think about experiments for higher-energy machines, such as the Large Hadron Collider (LHC) and the never-completed Superconducting Super Collider (SSC). Of course, at this point there was no ATLAS or CMS or even the precursors. There were just groups of people getting together to discuss ideas.

    I remember a first discussion I had about possibilities for the trigger in LHC experiments was over a coffee in CERN’s Restaurant 1 with Peter Jenni. He was on the UA2 experiment at the time and, together with a number of colleagues, was developing ideas for an LHC experiment. Peter later went on to lead the ATLAS Collaboration for over a decade. He told me that nobody was looking at how the trigger system might be designed, and he asked if I would like to develop something. So I did.

    The ATLAS trigger is a multilevel system that selects events that are potentially interesting for physics studies from a much larger number of events. It is very challenging, since we start off with an interaction rate of the order of a billion per second. In the first stage of the selection, which has to be completed within a few millionths of a second, the event rate must be reduced to about 100 kHz, four orders of magnitude below the interaction rate, i.e. only one in ten thousand collisions can give rise to a first-level trigger. Note that each event, corresponding to a given bunch crossing, contains many tens of interactions. The rate must then be brought down by a further two orders of magnitude before the data are recorded for offline analysis.
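
    As a back-of-the-envelope illustration of this cascade, the sketch below (in Python, using only the rounded, order-of-magnitude rates quoted above, not any actual trigger configuration) works out the rejection factor at each stage:

    # Rough illustration of the ATLAS multi-level trigger rate reduction,
    # using order-of-magnitude numbers only.
    interaction_rate_hz = 1e9   # ~a billion interactions per second
    level1_output_hz = 1e5      # first-level trigger output, ~100 kHz
    recorded_hz = 1e3           # rate recorded for offline analysis

    l1_rejection = interaction_rate_hz / level1_output_hz   # four orders of magnitude
    hlt_rejection = level1_output_hz / recorded_hz          # two further orders

    print(f"Level-1 keeps 1 in {l1_rejection:.0f} interactions")
    print(f"Later stages keep 1 in {hlt_rejection:.0f} Level-1 accepts")
    print(f"Overall reduction: 1 in {interaction_rate_hz / recorded_hz:.0f}")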

    When I start working on such a complex technical problem, I sit down with a pen and paper and draw diagrams. It’s important to visualise the system. A trigger and data-acquisition system is complicated – you have data being produced, data being processed, data being moved. So, I make a sketch with arrows, writing down order of magnitude numbers, what has to talk to what, what signals have to be sent. These are very rough notes! I doubt anyone other than me would be able to read my sketches that fed into the early designs of ATLAS’ trigger.

    Though I was specifically looking at the first-level calorimeter trigger, which was what I was working on at UA1, I was interested in the trigger more generally. At the time, we did not know that the future held so much possibility in terms of programmable logic. The early ideas for the first-level trigger were based on relatively primitive electronics: modules with discrete logic, memories and some custom integrated circuits.

    There was also concern that the second-level trigger processing would be hard to implement, because those triggers would require too much data to move and too much data to process. Here, the first thing I had to do was to demonstrate that it could be done at all! I carried out an intellectual exercise to try and factorise the problem, to the maximal extent possible. I was driven to do this because it was so interesting, and it was virgin territory. There were no constraints on ideas that could be explored.

    My initial studies were on a maximally-factorised model, the so-called “local–global scheme”. It was never my objective that one would necessarily implement this exact scheme, but I used it as the basis for brainstorming a region-of-interest (ROI) strategy for the trigger. The triggers would look at specified regions of the detector, identified by the first-level trigger, for features of interest, rather than trying to search for features everywhere in the event. This exercise demonstrated that, at any given point in the system, you could get the data movement and computation down to a manageable level.
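
    The data-movement argument behind the ROI strategy can be made concrete with a small sketch (all sizes below are invented placeholders, not ATLAS figures): instead of shipping the full event to the second-level processors, only the detector fragments around each Level-1 ROI are fetched.

    # Toy illustration of region-of-interest (ROI) guided readout.
    FULL_EVENT_KB = 1500        # assumed total event size
    FRAGMENT_KB = 2             # assumed size of one detector fragment
    FRAGS_PER_ROI = 10          # assumed fragments needed around one ROI

    def level2_data_volume_kb(rois):
        """Data moved when only fragments near the ROIs are fetched."""
        return len(rois) * FRAGS_PER_ROI * FRAGMENT_KB

    rois = [(0.4, 1.2), (-1.1, 2.9)]   # (eta, phi) seeds from Level-1
    moved = level2_data_volume_kb(rois)
    print(f"{moved} kB fetched out of {FULL_EVENT_KB} kB "
          f"({100 * moved / FULL_EVENT_KB:.0f}% of the event)")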

    2
    The ATLAS Level-1 Calorimeter Trigger, located underground in a cavern adjacent to the experiment. (Image: K. Anthony/ATLAS Collaboration)

    I, along with a few colleagues, developed this exercise into a study that we presented at the 1990 Large Hadron Collider workshop in Aachen, Germany. In the end, thanks to technological progress, it was not necessary to exploit all the ingredients used in the study. More specifically, instead of separating the processing for each ROI and for each detector, we were able to use a single processor to fully process all of the ROIs in an event. The use of the first-level trigger to guide the second-level data access and processing became a key part of the ATLAS trigger philosophy.

    In the years following the Aachen workshop, the ATLAS and CMS experiments began to take shape. It was a really exciting time, and the number of people involved was tiny in comparison to today. You could do anything and everything; you could come with completely new ideas!

    When first beams and first collisions finally came, things went more smoothly than I had ever dared to hope.

    3
    First collisions in ATLAS. A collision event. (Image: Claudia Marcelloni/ATLAS Experiment)

    We had spent a lot of time planning for the first single beams and the first collisions: what we would do, in what order, what might go wrong and how we could mitigate it. It has always been in my nature to think ahead about all the potential problems and to make plans that let us avoid future issues, ensuring that systems are robust so that a local problem does not become a global problem. Thanks to the work of excellent, dedicated colleagues, everything went really well for first collisions!

    Clearly ATLAS has a long future ahead of it, although we will always face challenges: the upgrades we have planned are by no means trivial! Even with our existing infrastructure and experience, there will no doubt be obstacles that we will have to overcome.

    And, of course, in the even longer term, CERN itself could change, depending on what happens in physics and on the global stage. It wouldn’t be the first laboratory to do so – just look at DESY and SLAC.


    DESY Helmholtz Centres & Networks


    SLAC Campus


    SLAC SSRL


    SLAC LCLS

    Even Fermilab has changed from a collider to a neutrino facility. We never know where the next big discovery will lead us!


    FNAL Short baseline neutrino detector

    FNAL LBNF/DUNE from FNAL to SURF, Lead, South Dakota, USA

    See the full article here.


  • richardmitnick 7:38 pm on June 5, 2018 Permalink | Reply
    Tags: , , Catching hadronic vector boson decays with a finer net, , , Particle Accelerators, ,   

    From CERN ATLAS: “Catching hadronic vector boson decays with a finer net” 

    CERN ATLAS Higgs Event

    CERN/ATLAS
    From CERN ATLAS

    5th June 2018
    ATLAS Collaboration

    1
    Figure 1: ATLAS event display showing two electroweak boson candidates with an invariant mass of 5 TeV, the highest observed in the analysis. Energy deposits in the ATLAS calorimeters are shown in green and yellow rectangles. The angular resolution limits the reconstruction of substructure of highly collimated jets. The view in the top left corner shows the higher angular resolution of the first calorimeter layer and the tracker (orange lines), revealing the striking two-prong substructure in the energy flow. (Image: ATLAS Collaboration/CERN)

    ATLAS has been collecting increasing amounts of data at a centre-of-mass energy of 13 TeV to unravel some of the big mysteries in physics today. For instance, why is the mass of the Higgs boson so much lighter than one would expect? Why is gravity so weak?

    Many theoretical models predict that new physics, which could provide answers to these questions, could manifest itself as yet-undiscovered massive particles. These include massive new particles that would decay to much lighter high-momentum electroweak bosons (W and Z). These in turn decay, and the most common signature would be pairs of highly collimated bundles of particles, known as jets. So far, no evidence of such new particles has been uncovered.

    The ability to distinguish jets initiated by decays of W or Z bosons from those initiated by other processes is critical for the success of these searches. While the energy flow from the bosons exhibits a distinct two-prong structure from the two-body decay of the boson, no such feature exists for jets from a single quark or gluon – the latter being the most frequent scattering products when colliding protons.

    In the past, ATLAS identified this two-prong structure using its fine-grained calorimeter, which measures the energy of the particles inside jets with good resolution. However, in very energetic jets from decays of particles with masses of multiple TeV, the average separation of these prongs is comparable to the segmentation of the ATLAS calorimeter. This creates confusion within the algorithms responsible for identifying the bosons, limiting our sensitivity to new physics at high masses. In contrast to the calorimeter, the ATLAS inner tracking detector reconstructs charged particles with excellent angular resolution, but it lacks sufficient momentum resolution.
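
    The scale of the problem can be estimated from the kinematics of a two-body decay: a boson of mass m and transverse momentum pT produces two prongs separated by roughly ΔR ≈ 2m/pT. The sketch below works this out for boosted W bosons (the comparison value of ~0.1 for a typical calorimeter cell size is an assumption used for illustration):

    # Approximate prong separation for a boosted W -> qq decay:
    # delta_R ~ 2 * m / pT (valid when pT >> m).
    M_W = 80.4  # W boson mass in GeV

    for pt_gev in (500, 1000, 2500):
        delta_r = 2 * M_W / pt_gev
        print(f"pT = {pt_gev:4d} GeV  ->  delta_R ~ {delta_r:.3f}")

    # Each boson from a 5 TeV resonance carries ~2.5 TeV, giving
    # delta_R ~ 0.06: below an assumed ~0.1 calorimeter cell, but well
    # within the angular resolution of the tracker.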

    2
    Figure 2 Comparison between the current and previous limits on the cross section times branching ratio for a hypothetical particle V versus its mass. Lower vertical values represent higher sensitivity to new physics. Due to the improvements on the analysis techniques, the current result improves our reach for new physics far beyond what we get from only increasing the size of the data set (middle blue line). (Image: ATLAS Collaboration/CERN)

    A new ATLAS analysis combines the angular information of charged particles reconstructed by the inner detector with the energy information from the calorimeter. This lets ATLAS physicists overcome the limitations in identifying very energetic jets from bosons. Similar to increasing the magnification of a microscope, this improvement to the ATLAS event-reconstruction software allows it to better resolve the energy flow in very energetic jets, and the improved resolution in turn allows physicists to optimize the analysis techniques.

    Making such improvements while collecting more data is necessary to maximize the potential for discovery when exploring new kinematic regimes. This time no new physics was seen, but the technique can be applied to many more searches – and still larger datasets.

    Related journal articles
    _________________________________________________
    See the full article for further references with links.

    See the full article here.


  • richardmitnick 9:23 am on June 4, 2018 Permalink | Reply
    Tags: , , , , , , Particle Accelerators, ,   

    From CERN Courier: “Higgs boson reaches the top” 


    From CERN Courier

    Jun 1, 2018
    No writer credit

    The CMS collaboration has published the first direct observation of the coupling between the Higgs boson and the top quark, offering an important probe of the consistency of the Standard Model (SM). In the SM, the Higgs boson interacts with fermions via a Yukawa coupling, the strength of which is proportional to the fermion mass. Since the top quark is the heaviest particle in the SM, its coupling to the Higgs boson is expected to be the largest and thus the dominant contribution to many loop processes, making it a sensitive probe of hypothetical new physics.
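
    Concretely, the SM Yukawa coupling of a fermion f is y_f = √2 m_f/v, where v ≈ 246 GeV is the Higgs-field vacuum expectation value. A quick numerical check (standard formula, with rounded mass values) shows why the top quark dominates:

    from math import sqrt

    V_GEV = 246.0  # Higgs vacuum expectation value
    masses_gev = {"top": 172.5, "bottom": 4.18, "tau": 1.777, "muon": 0.1057}

    for name, m in masses_gev.items():
        y = sqrt(2) * m / V_GEV      # y_f = sqrt(2) * m_f / v
        print(f"y_{name:6s} = {y:.4f}")
    # The top coupling comes out close to 1, far above all other fermions.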

    1
    Combined likelihood analysis

    The associated production of a Higgs boson with a top quark–antiquark pair (ttH) is the best direct probe of the top-Higgs Yukawa coupling with minimal model dependence, and thus a crucial element to verify the SM nature of the Higgs boson. However, its small production rate – constituting only about 1% of the total Higgs production cross-section – makes the ttH measurement a considerable challenge.

    The CMS and ATLAS collaborations reported first evidence for the process last year, based on LHC data collected at a centre-of-mass energy of 13 TeV (CERN Courier May 2017 p49 and December 2017 p12). The first observation, with a statistical significance above five standard deviations, is based on an analysis of the full 2016 CMS dataset recorded at 13 TeV, combined with results from data collected at lower energies.

    The ttH process gives rise to a wide variety of final states, and the new CMS analysis combines results from a number of them. Top quarks decay almost exclusively to a bottom quark (b) and a W boson, the latter subsequently decaying either to a quark and an antiquark or to a charged lepton and its associated neutrino. The Higgs-boson decay channels include the decay to a bb quark pair, a τ+τ– lepton pair, a photon pair, and combinations of quarks and leptons from the decay of intermediate on- or off-shell W and Z bosons. These five Higgs-boson decay channels were analysed by CMS using sophisticated methods, such as multivariate techniques, to separate signal from background events. Each channel poses different experimental challenges: the bb channel has the largest rate but suffers from a large background of events containing a top-quark pair and jets, while the photon and Z-boson pair channels offer the highest signal-to-background ratio at a very small rate.

    CMS observed an excess of events with respect to the background-only hypothesis at a significance of 5.2 standard deviations. The measured values of the signal strength in the considered channels are consistent with each other, and a combined value of 1.26 +0.31/–0.26 times the SM expectation is obtained (see figure). The measured production rate is thus consistent with the SM prediction within one standard deviation. The result establishes the direct Yukawa coupling of the Higgs boson to the top quark, marking an important milestone in our understanding of the properties of the Higgs boson.

    Further reading

    https://arxiv.org/abs/1804.02610
    https://arxiv.org/abs/1803.05485
    https://journals.aps.org/prd/abstract/10.1103/PhysRevD.97.072003

    See the full article here.


  • richardmitnick 8:58 am on June 4, 2018 Permalink | Reply
    Tags: , , , , , Particle Accelerators, ,   

    From CERN CMS and ATLAS: “The Higgs boson reveals its affinity for the top quark” 

    Cern New Bloc

    Cern New Particle Event

    CERN New Masthead

    New results from the ATLAS and CMS experiments at the LHC reveal how strongly the Higgs boson interacts with the heaviest known elementary particle, the top quark, corroborating our understanding of the Higgs and setting constraints on new physics.

    CERN CMS Event NOV 2010

    From CERN CMS

    By CMS

    The first observation of the simultaneous production of a Higgs boson with a top quark-antiquark pair is being published today in the journal Physical Review Letters. This major milestone, first reported by the CMS Collaboration in early April 2018, unambiguously demonstrates the interaction of the Higgs boson and top quarks, which are the heaviest known subatomic particles. It is an important step forward in our understanding of the origin of mass. The paper features as a PRL Editors' Suggestion and is accompanied by a Physics Viewpoint article.

    ________________________________________________________
    From CMS – first reported by the CMS Collaboration in early April 2018

    The observation of a Higgs boson in 2012 at the Large Hadron Collider marked the starting point of a broad experimental program to determine the properties of the newly discovered particle. In the standard model, the Higgs boson couples to fermions in a Yukawa-type interaction, with a coupling strength proportional to the fermion mass. While decays into γγ, ZZ, WW, and ττ final states have been observed and there is evidence for the direct decay of the particle to the bb (down-type quarks) final state, the decay to the tt (up-type quarks) final state is not kinematically possible. Therefore, it is of paramount importance to probe the coupling of the Higgs boson to the top quark, the heaviest known fermion, by producing the Higgs in the fusion of a top quark-antiquark pair (left diagram) or through radiation from a top quark (right diagram).

    1

    The associated production of a Higgs boson and a top quark-antiquark pair (ttH production) is a direct probe of the top–Higgs coupling. Hence the observation of this production mechanism is one of the primary objectives of the Higgs physics program at the LHC.

    The CMS experiment has searched for ttH production in the data collected at the center-of-mass energies of 7, 8, and 13 TeV with the Higgs boson decaying to pairs of W bosons, Z bosons, photons, τ leptons, or bottom quark jets. The results have been combined to maximize the sensitivity to this challenging and yet fundamental process.

    3
    An excess of events is observed, with a significance of 5.2 standard deviations, over the expectation from the background-only hypothesis. The corresponding expected significance for the standard model Higgs boson with a mass of 125.09 GeV is 4.2 standard deviations. The measured production rate is consistent with the standard model prediction within one standard deviation.
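
    For reference, a significance of Z standard deviations corresponds to a one-sided Gaussian tail probability p = 1 − Φ(Z). The standard conversion (sketched here with scipy) shows how unlikely a background-only fluctuation of this size is:

    from scipy.stats import norm

    for z in (4.2, 5.0, 5.2):
        p = norm.sf(z)   # one-sided tail probability, 1 - Phi(z)
        print(f"{z:.1f} sigma  ->  p = {p:.2e}")
    # 5 sigma (p ~ 3e-7) is the conventional threshold for an observation.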

    In addition to constituting the first observation of a new Higgs-boson production mechanism, this measurement establishes the tree-level coupling of the Higgs boson to the top quark, and hence to an up-type quark, and marks another milestone towards the measurement of the Higgs boson's couplings to fermions.
    ________________________________________________________

    4

    An event candidate for the production of a top quark and top anti-quark pair in conjunction with a Higgs Boson in the CMS detector. The Higgs decays into a tau+ lepton and a tau- lepton; the tau+ in turn decays into hadrons and the tau- decays into an electron. The decay product symbols are in blue. The top quark decays into three jets (sprays of lighter particles) whose names are given in purple. One of these is initiated by a b-quark. The top anti-quark decays into a muon and b-jet, whose names appear in red.

    Further reading:

    [1] CMS ttH observation journal article: Physical Review Letters, June 4, 2018

    See the full CMS article here.

    From CERN ATLAS

    6
    CERN ATLAS Event 2012

    New ATLAS result establishes production of Higgs boson in association with top quarks.

    This rare process is one of the most sensitive tests of the Higgs mechanism.

    By ATLAS Collaboration, 4th June 2018

    According to the Standard Model, quarks, charged leptons, and W and Z bosons obtain their mass through interactions with the Higgs field, a quantum fluctuation of which gives rise to the Higgs boson. To test this theory, ATLAS takes high-precision measurements of the interactions between the Higgs boson and these particles. While the ATLAS and CMS experiments at CERN’s Large Hadron Collider (LHC) had observed and measured the Higgs boson decaying to pairs of W or Z bosons, photons or tau leptons, the Higgs coupling to quarks had not – despite evidence – been observed.

    In results presented today at the LHCP2018 conference, the ATLAS Collaboration has observed the production of the Higgs boson together with a top-quark pair (known as “ttH” production). Only about 1% of all Higgs bosons are produced through this rare process. This result establishes a direct measurement of the interaction between the top quark and the Higgs boson (known as the “top quark Yukawa coupling”). As the top quark is the heaviest particle in the Standard Model, this measurement is one of the most sensitive tests of the Higgs mechanism.

    Previous ATLAS measurements using 2015 and 2016 data provided the first evidence for ttH production from a combination of channels where the Higgs boson decayed to two W or Z bosons (WW* or ZZ*), to a pair of tau leptons, to a pair of b-quarks, or to a pair of photons (“diphoton”). Those results have now been updated with the measurements of the diphoton and ZZ* decay modes that use the larger 2015-2017 dataset, and where improved reconstruction algorithms and new analysis techniques have increased the sensitivity of the measurements. The CMS Collaboration recently reported the observation of ttH production by combining 2015 and 2016 data with data taken at lower collision energies in earlier LHC runs.

    Evidence for ttH production in the diphoton channel in the 2015–2017 dataset

    The probability of a Higgs boson decaying to a diphoton pair is only about 0.2%, making the predicted rate for ttH production in this channel quite small. However, because the energy and direction of photons can be well measured with the ATLAS detector, the reconstructed mass peak obtained with this decay mode is narrow. It is therefore possible to observe a signal even when the number of events is low. Furthermore, regions with lower and higher reconstructed mass (called the “sidebands”) can be used to estimate the background under the signal peak using the data themselves, rendering this channel particularly robust.
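
    The sideband idea can be illustrated with a toy fit (synthetic data and a simple exponential background shape, invented purely for illustration): the background is fitted away from the peak and interpolated under it.

    import numpy as np

    rng = np.random.default_rng(1)

    # Toy diphoton masses: falling background plus a narrow peak at 125 GeV.
    bkg = rng.exponential(50.0, 20000) + 100.0
    sig = rng.normal(125.0, 1.7, 300)
    masses = np.concatenate([bkg[bkg < 160], sig])

    edges = np.arange(105.0, 161.0, 1.0)               # 1 GeV bins
    counts, _ = np.histogram(masses, edges)
    centers = 0.5 * (edges[:-1] + edges[1:])

    window = (centers > 120) & (centers < 130)         # signal region
    sideband = ~window

    # Fit log(counts) with a straight line on the sidebands only,
    # i.e. an exponential background, then interpolate under the peak.
    coef = np.polyfit(centers[sideband], np.log(counts[sideband]), 1)
    bkg_under_peak = np.exp(np.polyval(coef, centers[window])).sum()

    excess = counts[window].sum() - bkg_under_peak
    print(f"estimated signal yield ~ {excess:.0f} (300 injected)")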

    To optimize the measurement, ATLAS employs machine-learning techniques. Events consistent with the ttH kinematics are selected using "boosted decision tree" (BDT) algorithms that allow physicists to separate the events into multiple categories with different signal-to-background ratios. Depending on the top-quark decay channel considered, the inputs given to the BDT are the momenta of the "jets" (collimated groups of particles produced by a quark or gluon), leptons and photons observed in each event. As the decay of a top quark always produces a b-quark, identifying jets that arise from b-quarks is essential for reducing backgrounds. To achieve this, ATLAS developed a b-identification algorithm (also based on machine learning); the b-identification decision for each jet is included in the BDT inputs.
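
    For flavour, here is a toy version of such a classifier (using scikit-learn's gradient-boosted trees on synthetic data; the features and their distributions are invented and bear no relation to the actual ATLAS training):

    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(42)
    n = 5000

    def toy_events(is_signal):
        """Synthetic (photon pT, jet pT, b-tag) features with invented shapes."""
        shift = 1.0 if is_signal else 0.0
        photon_pt = rng.gamma(4.0 + shift, 20.0, n)
        jet_pt = rng.gamma(3.0 + shift, 25.0, n)
        btag = rng.binomial(1, 0.7 if is_signal else 0.2, n)
        return np.column_stack([photon_pt, jet_pt, btag])

    X = np.vstack([toy_events(True), toy_events(False)])
    y = np.concatenate([np.ones(n), np.zeros(n)])   # 1 = signal, 0 = background

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    bdt = GradientBoostingClassifier(n_estimators=200, max_depth=3)
    bdt.fit(X_tr, y_tr)

    scores = bdt.predict_proba(X_te)[:, 1]          # per-event BDT output
    print(f"toy ROC AUC = {roc_auc_score(y_te, scores):.3f}")
    # Binning events in this score yields categories of different signal purity.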

    4
    Figure 1: Time-lapse animation showing the increasing ttH signal in the diphoton mass spectrum as more data are included in the measurement. (Image: ATLAS Collaboration/CERN)

    Each category is analysed separately by studying the distribution of the invariant mass of the diphoton candidates in selected events. This distribution is fit to a combination of signal (Higgs boson decay to diphoton in events containing a top-quark pair) and background (cases where the diphoton candidate does not arise from a Higgs boson or where the event does not contain a true top-quark pair). The numbers of fitted signal events in the different categories are then statistically combined, taking into account correlated experimental and theoretical systematic uncertainties.

    The result of the above procedure, using 80 fb-1 of data recorded in 2015, 2016 and the recent 2017 run of the LHC, is summarised in Figure 1, which shows the diphoton invariant mass distribution, summed over categories weighted by their signal purity. The significance of the observed signal is 4.1 standard deviations; the expected significance for Standard Model production is 3.7 standard deviations.

    Search continues for ttH production in the ZZ* channel in the 2015–2017 dataset

    The decay of a Higgs boson to ZZ* with the subsequent decay of the ZZ* to four leptons is another channel where the Higgs mass peak is narrow. Due to the very clean detector signature of the four-lepton decay mode, this channel is essentially free of backgrounds apart from small contributions from Higgs bosons produced through other production modes than ttH. However, this decay mode is even rarer than that of diphotons, with less than one event expected from ttH production in the 80 fb-1 of the full 2015-2017 dataset. A dedicated search for this decay was performed, but no candidate events were found in the 2015-2017 ATLAS data.

    5
    Figure 2: Combined ttH production cross section, as well as the cross sections measured in the individual analyses, divided by the SM cross-section prediction. ML indicates the analysis of the two- and three-lepton final states (multilepton). The black lines show the total uncertainties, while the bands indicate the statistical and systematic uncertainties. The red line indicates the SM cross-section prediction, and the grey band represents the theoretical uncertainties on the prediction. For the γγ and ZZ* channels the full 13 TeV dataset (collected between 2015 and 2017) has been used, whereas the results of the other channels are based on the 2015 and 2016 data. (Image: ATLAS Collaboration/CERN)

    Combination with earlier ATLAS results

    The measurements described above have been combined with the previously reported searches for ttH that used 2015 and 2016 data. Decays of the Higgs boson to a b-quark pair and to a pair of W bosons or tau leptons had observed (expected) significances of 1.4 (1.6) and 4.2 (2.8) standard deviations, respectively.

    After the combination, the observed (expected) significance of the signal over the background is 5.8 (4.9) standard deviations. The ratios of the combined ttH cross-section measurement, and of the cross-section measurements separated by Higgs-boson decay mode, to the SM prediction are presented in Figure 2. The measured ratio of 1.32 ± 0.27 is slightly larger than, but consistent with, the Standard Model expectation.

    Further searches for the ttH process were performed using 7 and 8 TeV data collected during Run 1. When combined with the 2015-2017 results, the observed (expected) significance is 6.3 (5.1) standard deviations.

    Summary

    ATLAS has observed the production of the Higgs boson in association with a top-quark pair with a significance of 6.3 standard deviations over the background-only hypothesis. The measured ttH production cross section is consistent with the Standard Model prediction. This measurement provides direct evidence for the coupling of the Higgs boson to the top quark and supports the Standard Model mechanism whereby the top quark obtains its mass through interaction with the Higgs field.

    Evidence for the associated production of the Higgs boson and a top quark pair with the ATLAS detector (Physical Review D)

    See the full ATLAS article here.


  • richardmitnick 3:16 pm on June 2, 2018 Permalink | Reply
    Tags: , , , , Chinese CEPC project, , Particle Accelerators, ,   

    From CERN Courier: “China’s bid for a circular electron–positron collider” 


    From CERN Courier

    Jun 1, 2018
    Jie Gao
    Institute of High Energy Physics
    University of Chinese Academy of Sciences

    1
    A future collider in China

    Physicists in China have completed a conceptual design report for a 100 km-circumference collider that, in conjunction with a possible linear collider in Japan, would open a new era for high-energy physics in Asia.

    ILC schematic, being planned for the Kitakami highland, in the Iwate prefecture of northern Japan

    Chinese accelerator-based research in high-energy physics is a relatively recent affair. It began in earnest in October 1984 with the construction of the 240 m-circumference Beijing Electron Positron Collider (BEPC) at the Institute of High Energy Physics. BEPC’s first collisions took place in 1988 at a centre-of-mass energy of 1.89 GeV. At the time, SLAC in the US and CERN in Europe were operating their more energetic PEP and LEP electron–positron colliders, respectively, while the lower-energy electron–positron machines ADONE (Frascati), DORIS (DESY) and VEPP-4 (BINP Novosibirsk) were also in operation.



    Beijing Electron Positron Collider (BEPC)

    SLAC SSRL


    SLAC SSRL PEP collider map

    CERN LEP Collider


    CERN LEP Collider

    ADONE INFN-LNF synchrotron radiation beamline

    DESY DORIS


    DESY DORIS III

    VEPP-4 (BINP Novosibirsk)

    Beginning in 2006, the BEPCII upgrade project saw the previous machine replaced with a double-ring scheme capable of colliding electrons and positrons at the same beam energy as BEPC but with a luminosity 100 times higher (10³³ cm⁻² s⁻¹). BEPCII, whose collisions are recorded by the Beijing Spectrometer III (BESIII) detector, switched on two years later and continues to produce results today, with a particular focus on the study of charm and light-hadron decays.

    BESIII

    China also undertakes non-accelerator-based research in high-energy physics via the Daya Bay neutrino experiment, which was approved in 2006 and announced the first observation of the neutrino mixing angle θ13 in March 2012.

    Daya Bay, approximately 52 kilometers northeast of Hong Kong and 45 kilometers east of Shenzhen, China

    The discovery of the Higgs boson at CERN's Large Hadron Collider [see below] in July 2012 opened new opportunities for a large-scale accelerator.

    CERN CMS Higgs Event


    CERN ATLAS Higgs Event

    Thanks to the low mass of the Higgs, it is possible to produce it in the relatively clean environment of a circular electron–positron collider – in addition to linear electron–positron colliders such as the International Linear Collider (ILC) [see schematic above] and the Compact Linear Collider (CLIC) – with reasonable luminosity, technology, cost and power consumption.

    CLIC Collider annotated


    CERN/CLIC

    The Higgs boson is the cornerstone of the Standard Model (SM), yet is also responsible for most of its mysteries: the naturalness problem, the mass-hierarchy problem and the vacuum-stability problem, among others. Therefore, precise measurements of the Higgs boson serve as excellent probes of the fundamental physics principles underlying the SM and of exploration beyond the SM.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.


    Standard Model of Particle Physics from Symmetry Magazine

    4
    Fig. 1.

    In September 2012, Chinese scientists proposed a 50–70 km circumference 240 GeV Circular Electron Positron Collider (CEPC) in China, serving two large detectors for Higgs studies. The tunnel for such a machine could also host a Super Proton Proton Collider (SppC) to reach energies beyond the LHC (figure 1). CERN is also developing, via the Future Circular Collider (FCC) study, a proposal for a large (100 km circumference) tunnel, which could host high-energy electron–positron (FCC-ee), proton–proton (FCC-hh) or electron–proton (FCC-he) colliders (see CERN thinks bigger).

    CERN Future Circular Collider


    FCC Future Circular Collider at CERN

    Progress in both projects is proceeding fast, although many open questions remain – not least how to organise and fund these next great steps in our exploration of fundamental particles.

    China Circular Electron Positron Collider (CEPC) map

    Precision leap

    CEPC is a Higgs factory capable of producing one million clean Higgs bosons over a 10 year period. As a result, the couplings between the Higgs boson and other particles could be determined to an accuracy of 0.1–1% – roughly one order of magnitude better than that expected of the high-luminosity LHC upgrade and challenging the most advanced next-to-next-to-leading-order SM calculations (figure 2). By lowering the centre-of-mass energy to that of the Z pole at around 90 GeV, without the need to change hardware, CEPC could produce at least 10 billion Z bosons per year. As a super Z – and W – factory, CEPC would shed light on rare decays and heavy-flavour physics and mark a factor-10 leap in the precision of electroweak measurements.

    4
    Fig. 2.

    The latest CEPC baseline design is a 100 km double ring (figure 3, left) with a single-beam synchrotron-radiation power of 30 MW at the Higgs pole, using the same superconducting radio-frequency accelerator system for both the electron and positron beams. CEPC could run at both the Higgs-pole and Z-pole energies, with luminosities of 2 × 10³⁴ cm⁻² s⁻¹ and 16 × 10³⁴ cm⁻² s⁻¹, respectively. The alternative CEPC design is based on a so-called advanced partial double-ring scheme (figure 3, right), with the aim of reducing the construction cost. Preliminary designs for the two CEPC detectors are shown in figure 4.
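
    The 30 MW radiation budget ties the whole design together. For electrons, the energy lost per turn is U0 ≈ 8.85 × 10⁻⁵ E⁴/ρ (U0 in GeV, beam energy E in GeV, bending radius ρ in m), and the storable beam current follows from P = I·U0. A rough check (the ~10.7 km effective bending radius is an assumed value for a 100 km ring with a realistic dipole filling factor):

    # Synchrotron energy loss per turn and the beam current allowed by a
    # fixed radiation-power budget, for an electron storage ring.
    C_GAMMA = 8.85e-5          # GeV m / GeV^4, standard radiation constant

    e_beam_gev = 120.0         # beam energy at the Higgs pole (240 GeV c.m.)
    rho_m = 10_700.0           # assumed effective bending radius
    p_sr_watts = 30e6          # single-beam synchrotron-radiation budget

    u0_gev = C_GAMMA * e_beam_gev**4 / rho_m   # energy lost per turn, ~1.7 GeV
    i_amps = p_sr_watts / (u0_gev * 1e9)       # P = I * U0, with U0 in eV

    print(f"U0 ~ {u0_gev:.2f} GeV per turn")
    print(f"allowed beam current ~ {1e3 * i_amps:.1f} mA")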

    5
    Fig. 3.

    Concerning the SppC baseline, it has been decided to start with 12 T dipole magnets made from iron-based high-temperature superconductors, allowing proton–proton collisions at a centre-of-mass energy of 75 TeV with a luminosity of 10³⁵ cm⁻² s⁻¹. The SppC magnet design differs from the Nb3Sn-based magnets planned by the FCC-hh study, which target a field of 16 T to allow protons to collide at a centre-of-mass energy of 100 TeV. The Chinese design also envisages an upgrade to 20 T magnets, which would take the SppC collision energy beyond 100 TeV. Discovered just over a decade ago, iron-based superconductors have a much higher superconducting transition temperature than conventional superconductors, and therefore promise to reduce the cost of the magnets to an affordable level. To conduct the relevant R&D, a national network has been established in China, and more than 100 m of iron-based conductor cable has already been fabricated.
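
    The quoted collision energies follow from the bending relation p ≈ 0.3·B·ρ (beam momentum p in TeV, field B in tesla, bending radius ρ in km). The sketch below checks this for a 100 km ring (the dipole filling factor of ~0.65 is an assumed value, chosen so that the result reproduces the numbers quoted above):

    from math import pi

    # Beam energy in a ring: p [TeV] ~ 0.3 * B [T] * rho [km].
    CIRCUMFERENCE_KM = 100.0
    FILL_FACTOR = 0.65                     # assumed dipole filling factor
    rho_km = FILL_FACTOR * CIRCUMFERENCE_KM / (2 * pi)

    for b_tesla in (12.0, 16.0, 20.0):
        e_cm_tev = 2 * 0.3 * b_tesla * rho_km
        print(f"B = {b_tesla:4.1f} T  ->  ~{e_cm_tev:.0f} TeV centre of mass")
    # Close to the quoted 75 TeV (12 T) and 100 TeV (16 T) figures;
    # 20 T takes the collision energy beyond 100 TeV.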

    6
    Fig. 4.

    The CEPC is designed as a facility where both machines can coexist in the same tunnel (figure 5). It will have a total of four detector experimental halls, each with a floor area of 2000 m2 – two for CEPC and another two for SppC experiments. The tunnel is around 6 m wide and 4.8 m high, hosting the CEPC main ring (comprising two beam pipes), the CEPC booster and SppC. The SppC will be positioned outside of CEPC to accommodate other collision modes, such as an electron–proton, in the far future. The FCC study, which is aiming to complete a Conceptual Design Report (CDR) by the end of the year, adopts a similar staged approach (see CERN thinks bigger [link is above]).

    7
    Fig. 5.

    China on track

    Since the first CEPC proposal, momentum has grown. In June 2013, the 464th Fragrant Hill Meeting (a national meeting series started in 1994 for the long-term strategic development of China's science and technology) was held in Beijing and devoted to developing China's high-energy physics following the discovery of the Higgs boson. Two consensuses were reached: the first was to support the ILC and participate in its construction with in-kind contributions, with R&D funds to be requested from the Chinese government; the second was a recognition that a circular electron–positron Higgs factory – the next collider after BEPCII in China – followed by a super proton–proton collider built in the same tunnel represents an important historical opportunity for fundamental science.

    In 2014, the International Committee for Future Accelerators (ICFA) released statements supporting studies of energy-frontier circular colliders and encouraged global coordination. ICFA continues to support international studies of circular colliders, in addition to supporting linear machines, reflecting the strategic vision of the international high-energy community. In April 2016, during the AsiaHEP and Asian Committee for Future Accelerators (ACFA) meeting in Kyoto, positive statements were made regarding the ILC and a China-led effort on CEPC-SppC. In September that year, at a meeting of the Chinese Physics Society, it was concluded that CEPC is the first option for a future high-energy accelerator project in China, with the strategic aim of making it a large international scientific project. Pre-conceptual design reports (pre-CDRs) for CEPC-SppC, based on a single-ring "pretzel" orbit scheme, were completed at the beginning of 2015 and underwent an international review. A CEPC International Advisory Committee (IAC) was established and, in 2016, the Chinese Ministry of Science and Technology (MOST) allocated 36 million RMB (€4.6 million) for the CEPC study; in 2018 another 32 million RMB (€4.1 million) was approved by MOST.

    Ensuring that a large future circular collider maximises its luminosity is a major challenge. The CEPC project has studied the use of a crab-waist collision scheme, which is also being studied for FCC-ee. Each of the double-ring schemes for CEPC has been studied systematically with the aim of comparing their luminosity potentials. On 15 January last year, the CEPC-SppC baseline and alternative designs for the CDR were decided, laying the groundwork for the completion of the CEPC CDR at the end of 2017. Following an international review in June, the CEPC CDR will be published in July 2018.

    While technical R&D continues – both for the CEPC machine and its two large detectors – a crucial issue is how to pay for such a major international project. In addition to the initial funding from MOST, other potential channels include the National Science Foundation of China (NSFC), the Chinese Academy of Sciences (CAS) and local governments. For example, two years ago the Beijing municipal government allocated more than 500 million RMB (€65 million) to the Institute of High Energy Physics for superconducting RF development, and in 2018 CAS plans to allocate 200 million RMB (€26 million) to study high-temperature superconductors for magnets, including studies in materials science, industry and projects such as SppC. While not specifically intended for CEPC-SppC, such investments will have strong synergies with high-energy physics and, in November 2017, the CEPC-SppC Industrial Promotion Consortium was established with the aim of supporting mutual efforts between CEPC-SppC and industry.

    A five-year-long Technical Design Report (TDR) effort to optimise the CEPC-SppC design and technologies, and to prepare for industrial production, started this year. Construction of CEPC could begin as early as 2022 and be completed by the end of the decade. CEPC would operate for about 10 years, while SppC is planned to start construction in around 2040 and be completed by the mid-2040s. The TDR phase after the CDR is critical, both for key-component R&D and for industrialisation. R&D has already started towards high-Q, high-field 1.3 GHz and 650 MHz superconducting cavities; 650 MHz high-power, high-efficiency klystrons; 12 kW cryogenic systems; 12 T iron-based high-temperature superconducting dipoles; and other enabling technologies. Construction of a new 4500 m² superconducting RF facility in Beijing, called the Platform of Advanced Photon Source, began in May 2017 and is due to be completed in 2020; it could serve as a supporting facility for different projects.

    International ambition

    CEPC-SppC is a Chinese-proposed project to be built in China, but its nature is an international collaboration for the high-energy physics community worldwide. Following the creation of the CEPC-SppC IAC in 2015, more than 20 MoUs have been signed with many institutes and universities around the world, such as the Budker Institute of Nuclear Physics (BINP; Russia); National Research Nuclear University MEPhI (Moscow, Russia) and the University of Rostock (Germany).

    In August 2017, ICFA endorsed an ILC operating at a centre-of-mass energy of 250 GeV (ILC250), with energy-upgrade possibilities in the future (CERN Courier January/February 2018 p7). Although CEPC and ILC250 would start at the same energy to study the Higgs boson, their ultimate goals are quite different: SppC aims at a 100 TeV proton–proton collider, while the ILC could reach at most a 1 TeV electron–positron collider. The existence of both, however, would offer a highly complementary physics programme operating for a period of decades. Specific features of CEPC are its small-scale superconducting RF system and its relatively large AC power consumption (300 MW for CEPC, compared to 110 MW for ILC250). As for the cost, CEPC in its first phase includes part of the cost of SppC through its long tunnel, whereas the ILC would upgrade its energy later by increasing the tunnel length accordingly.

    8

    Deciding where to site CEPC-SppC involves numerous considerations. The technical criteria are roughly quantified as follows: earthquake intensity less than seven on the Richter scale; earthquake acceleration less than 0.1 g; ground surface-vibration amplitude less than 20 nm at 1–100 Hz; granite bedrock around 50–100 m deep; and others. The site-selection process started in February 2015, and so far six sites have been considered: Qinhuangdao in Hebei Province; Huangling county in Shanxi Province; Shenshan Special District in Guangdong Province; Baoding (Xiongan) in Hebei Province; Huzhou in Zhejiang Province; and Changchun in Jilin Province. The first three sites have been prospected underground (figure 6). More sites will be considered before a final selection is made. According to the Chinese civil-construction companies involved in the siting process, a 100 km tunnel will take less than five years to dig using drill-and-blast methods, and around three years if a tunnel-boring machine is employed.

    2018 is a milestone year for Higgs factories in Asia. As CEPC completes its CDR, the global high-energy physics community is waiting for a potential positive declaration from the Japanese government, by the end of the year, on their intention to host ILC250 in Japan, upgradable to higher energies. It is also a key moment for high-energy physics in Europe. FCC will complete its CDR by the end of the year, while CLIC released an updated 380 GeV baseline-staging scenario (CERN Courier November 2016 p20), and the European Strategy for Particle Physics update process will get under way (CERN Courier April 2018 p7). Hopefully, both ILC250 and CEPC-SppC will be included in the update together with FCC, while with respect to the US strategy we are looking forward to the next “P5” meeting following the European update.

    During the past five years, CEPC-SppC has kept to schedule both in design and R&D, together with strong team development and international collaboration. On 28 March this year, the Chinese government announced the “Implementation method to support China-initiated large international science projects and plans”, with the goal of identifying between three and five preparatory projects, one or two of which will be put to construction, by 2020. Hopefully, CEPC will be among those selected.

    Related journal articles
    _________________________________________________
    See the full article for further references with some links.

    See the full article here.

