Tagged: Accelerator Science

  • richardmitnick 1:25 pm on June 22, 2018
    Tags: Accelerator Science

    From Brookhaven Lab: “Upgrades to ATLAS and LHC Magnets for Run 2 and Beyond” 

    From Brookhaven Lab

    6.22.18

    Peter Genzer,
    genzer@bnl.gov
    (631) 344-3174

    The following news release was issued by CERN, the European Organization for Nuclear Research, home to the Large Hadron Collider (LHC). Scientists from the U.S. Department of Energy’s Brookhaven National Laboratory play multiple roles in the research at the LHC and are making major contributions to the high-luminosity upgrade described in this news release, including the development of new niobium tin superconducting magnets that will enable significantly higher collision rates; new particle tracking and signal readout systems for the ATLAS experiment that will allow scientists to capture and analyze the most significant details from vastly larger data sets; and increases in computing capacity devoted to analyzing and sharing that data with scientists around the world. Brookhaven Lab also hosts the Project Office for the U.S. contribution to the HL-LHC detector upgrades of the ATLAS experiment. For more information about Brookhaven’s roles in the high-luminosity upgrade or to speak with a Brookhaven/LHC scientist, contact Karen McNulty Walsh, (631) 344-8350, kmcnulty@bnl.gov.

    Brookhaven physicists play critical roles in LHC restart and plans for the future of particle physics.

    The ATLAS detector at the Large Hadron Collider, an experiment with large involvement from physicists at Brookhaven National Laboratory. Image credit: CERN

    July 6, 2015

    At the beginning of June, the Large Hadron Collider at CERN, the European research facility, began smashing together protons once again. The high-energy particle collisions taking place deep underground along the border between Switzerland and France are intended to allow physicists to probe the furthest edges of our knowledge of the universe and its tiniest building blocks.

    The Large Hadron Collider returns to operation after a two-year offline period, Long Shutdown 1, which allowed thousands of physicists worldwide to undertake crucial upgrades to the already cutting-edge particle accelerator. The LHC now begins its second multi-year operating period, Run 2, which will take the collider through 2018 with collision energies nearly double those of Run 1, the energies that allowed researchers to detect the long-sought Higgs boson in 2012.

    CERN CMS Higgs Event

    CERN ATLAS Higgs Event

    The U.S. Department of Energy’s Brookhaven National Laboratory is a crucial player in the physics program at the Large Hadron Collider, in particular as the U.S. host laboratory for the pivotal ATLAS experiment, one of the two large experiments that discovered the Higgs. Physicists at Brookhaven were busy throughout Long Shutdown 1, undertaking projects designed to maximize the LHC’s chances of detecting rare new physics as the collider reaches into a previously unexplored subatomic frontier.

    While the technology needed to produce a new particle is a marvel on its own terms, equally remarkable is everything the team at ATLAS and other experiments must do to detect these potentially world-changing discoveries. Because the production of such particles is a rare phenomenon, it isn’t enough to just be able to smash one proton into another. The LHC needs to be able to collide proton bunches, each bunch consisting of hundreds of billions of particles, every 50 nanoseconds—eventually every 25 nanoseconds in Run 2—and be ready to sort through the colossal amounts of data that all those collisions produce.
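    A quick back-of-the-envelope sketch of what those numbers imply. The 25-nanosecond spacing comes from the text above; the mean number of interactions per bunch crossing is an assumed figure for illustration, not a value from the article:

```python
# Back-of-the-envelope rates for LHC Run 2 bunch crossings.
# The 25 ns spacing comes from the text; the mean number of
# proton-proton interactions per crossing is an assumed figure.

BUNCH_SPACING_NS = 25      # Run 2 target bunch spacing
MEAN_INTERACTIONS = 25     # assumed average interactions per crossing

crossings_per_second = 1e9 / BUNCH_SPACING_NS
collisions_per_second = crossings_per_second * MEAN_INTERACTIONS

print(f"bunch crossings per second:  {crossings_per_second:,.0f}")   # 40,000,000
print(f"pp collisions per second:   ~{collisions_per_second:,.0f}")  # 1,000,000,000
```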

    It is with those interwoven challenges—maximizing the number of collisions within the LHC, capturing the details of potentially noteworthy collisions, and then managing the gargantuan amount of data those collisions produce—that scientists at Brookhaven National Laboratory are making their mark on the Large Hadron Collider and its search for new physics—and not just for the current Run 2, but looking forward to the long-term future operation of the collider.

    Restarting the Large Hadron Collider

    LHC

    CERN map


    CERN LHC Tunnel

    CERN LHC particles

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS
    CERN ATLAS New

    ALICE
    CERN ALICE New

    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    Brookhaven physicist Srini Rajagopalan, operation program manager for U.S. ATLAS, works to keep manageable the colossal amounts of data that are generated by the Large Hadron Collider and sent to Brookhaven’s RHIC and ATLAS Computing Facility.

    The Large Hadron Collider is the largest single machine in the world, so it’s tempting to think of its scale just in terms of its immense size. The twin beamlines of the particle accelerator sit about 300 to 600 feet underground in a circular tunnel more than 17 miles around. Over 1,600 magnets, each weighing more than 25 tons, are required to keep the beams of protons focused and on the correct paths, and nearly 100 tons of liquid helium is necessary to keep the magnets operating at temperatures barely above absolute zero. Then there are the detectors, each of which stands several stories high.

    But the scale of the LHC extends not just in space, but in time as well. A machine of this size and complexity doesn’t just switch on or off with the push of a button, and even relatively simple maintenance can require weeks, if not months, to perform. That’s why the LHC recently completed Long Shutdown 1, a two-year offline period in which physicists undertook the necessary repairs and upgrades to get the collider ready for the next three years of near-continuous operation. As the U.S. host laboratory for the ATLAS experiment, Brookhaven National Laboratory was pivotal in upgrading and improving one of the cornerstones of the LHC apparatus.

    “After having run for three years, the detector needs to be serviced much like your car,” said Brookhaven physicist Srini Rajagopalan, operation program manager for U.S. ATLAS. “Gas leaks crop up that need to be fixed. Power supplies, electronic boards and several other components need to be repaired or replaced. Hence a significant amount of detector consolidation work occurs during the shutdown to ensure an optimal working detector when beam returns.”

    Beyond these vital repairs, the major goal of the upgrade work during Long Shutdown 1 was to increase the LHC’s center of mass energies from the previous 8 trillion electron volts (TeV) to 13 TeV, near the operational maximum of 14 TeV.

    “Upgrading the energy means you’re able to probe much higher mass ranges, and you have access to new particles that might be substantially heavier,” said Rajagopalan. “If you have a very heavy particle that cannot be produced, it doesn’t matter how much data you collect, you just cannot reach that. That’s why it was very important to go from 8 to 13 TeV. Doubling the energy allows us to access the new physics much more easily.”

    As the LHC probes higher and higher energies, the phenomena that the researchers hope to observe will happen more and more rarely, meaning the particle beams need to create many more collisions than they did before. Beyond this increase in collision rates, or luminosity, however, the entire infrastructure of data collection and management has to evolve to deal with the vastly increased volume of information the LHC can now produce.

    “Much of the software had to be evolved or rewritten,” said Rajagopalan, “from patches and fixes that are more or less routine software maintenance to implementing new algorithms and installing new complex data management systems capable of handling the higher luminosity and collision rates.”

    Making More Powerful Magnets

    Brookhaven physicist Peter Wanderer, head of the laboratory’s Superconducting Magnet Division, stands in front of the oven in which niobium tin is made into a superconductor.

    The Large Hadron Collider works by accelerating twin beams of protons to speeds close to that of light. The two beams, traveling in opposite directions along the path of the collider, both contain many bunches of protons, with each bunch containing about 100 billion protons. When two bunches meet, only a small number of the protons inside them actually interact, and only a tiny fraction of those interactions are likely to yield potentially interesting physics. As such, it’s absolutely vital to control those beams to maximize the chances of useful collisions occurring.

    The best way to achieve that and the desired increase in luminosity—both during the current Run 2, and looking ahead to the long-term future of the LHC—is to tighten the focus of the beam. The more tightly packed protons are, the more likely they’ll smash into each other. This means working with the main tool that controls the beam inside the accelerator: the magnets.

    FNAL magnets such as this one, mounted on a test stand at Fermilab, are being developed for the High-Luminosity LHC. Photo: Reidar Hahn

    “Most of the length of the circumference along a circular machine like the LHC is taken up with a regular sequence of magnets,” said Peter Wanderer, head of Brookhaven Lab’s Superconducting Magnet Division, which made some of the magnets for the current LHC configuration and is working on new designs for future upgrades. “The job of these magnets is to bend the proton beams around to the next point or region where you can do something useful with them, like produce collisions, without letting the beam get larger.”

    A beam of protons is a stream of positively charged particles that all repel one another and so tend to drift apart, he explained. Physicists therefore use magnetic fields to keep the particles from straying off the desired path.

    “You insert different kinds of magnets, different sequences of magnets, in order to make the beams as small as possible, to get the most collisions possible when the beams collide,” Wanderer said.

    The magnets currently in use in the LHC are made of the superconducting material niobium titanium (NbTi). When the electromagnets are cooled in liquid helium to temperatures of about 4 Kelvin (-452.5 degrees Fahrenheit), they lose all electrical resistance and can carry a much higher current density than a conventional conductor like copper. A magnetic field gets stronger as its current is more densely packed, meaning a superconductor can produce a much stronger field over a smaller radius than copper.
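    As a rough illustration of that current-to-field relationship, here is a minimal sketch using the textbook "cos-theta" dipole approximation. The current density and coil thickness below are assumed, illustrative values, not the actual LHC coil specifications:

```python
import math

# Idealized "cos-theta" dipole: a cylindrical coil shell whose current
# density varies as J(theta) = J0*cos(theta) produces a uniform field
# inside of B = mu0 * J0 * t / 2, with t the shell thickness.  J0 and t
# below are illustrative assumptions, not real LHC coil parameters.

MU0 = 4 * math.pi * 1e-7    # vacuum permeability, T*m/A

J0 = 500e6                  # assumed peak current density, A/m^2 (500 A/mm^2)
t = 0.03                    # assumed coil shell thickness, m (30 mm)

B = MU0 * J0 * t / 2
print(f"central dipole field: {B:.1f} T")   # ~9.4 T, the right ballpark for LHC-class dipoles
```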

    But there’s an upper limit to how high a field the present niobium titanium superconductors can reach. So Wanderer and his team at Brookhaven have been part of a decade-long project to refine the next generation of superconducting magnets for a future upgrade to the LHC. These new magnets will be made from niobium tin (Nb3Sn).

    “Niobium tin can go to higher fields than niobium titanium, which will give us even stronger focusing,” Wanderer said. “That will allow us to get a smaller beam, and even more collisions.” Niobium tin can also function at a slightly higher temperature, so the new magnets will be easier to cool than those currently in use.

    There are a few catches. For one, niobium tin, unlike niobium titanium, isn’t initially superconducting. The team at Brookhaven must first heat the material for two days at 650 degrees Celsius (1,200 degrees Fahrenheit) as part of the process of turning the raw materials into the wires and cables that make up an electromagnet.

    “And when niobium tin becomes a superconductor, then it’s very brittle, which makes it really challenging,” said Wanderer. “You need tooling that can withstand the heat for two days. It needs to be very precise, to within thousandths of an inch, and when you take it out of the tooling and want to put it into a magnet, and wrap it with iron, you have to handle it very carefully. All that adds a lot to the cost. So one of the things we’ve worked out over 10 years is how to do it right the first time, almost always.”

    Fortunately, there’s still time to work out any remaining kinks. The new niobium tin magnets aren’t set to be installed at the LHC until around 2022, when the changeover from niobium titanium to niobium tin will be a crucial part of converting the Large Hadron Collider into the High-Luminosity Large Hadron Collider (HL-LHC).

    Managing Data at Higher Luminosity

    As the luminosity of the LHC increases in Run 2 and beyond, perhaps the biggest challenge facing the ATLAS team at Brookhaven lies in recognizing a potentially interesting physics event when it occurs. That selectivity is crucial, because even CERN’s worldwide computing grid—which includes about 170 global sites, and of which Brookhaven’s RHIC and ATLAS Computing Facility is a major center—can only record the tiniest fraction of the more than 100 million collisions that occur each second. That means it’s just as important to quickly recognize the millions of events that don’t need to be recorded as it is to recognize the handful that do.

    “What you have to do is, on the fly, analyze each event and decide whether you want to save it to disk for later use or not,” said Rajagopalan. “And you have to be careful you don’t throw away good physics events. So you’re looking for signatures. If it’s a good signature, you say, ‘Save it!’ Otherwise, you junk it. That’s how you bring the data rate down to a manageable amount you can write to disk.”

    Physicists screen out unwanted data using what’s known as a trigger system. The principle is simple: as the data from each collision comes in, it’s analyzed for a preset signature pattern, or trigger, that would mark it as potentially interesting.
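    In pseudocode terms, a trigger is just a fast predicate applied to every event as it arrives. A toy sketch; the event model and threshold below are invented purely for illustration and bear no relation to the real ATLAS trigger menus:

```python
import random

# Toy version of the trigger idea: examine each event as it arrives and
# keep it only if it shows a preset signature.  The event model and the
# threshold are invented for illustration.

def make_event():
    """Fake event: a list of particle transverse momenta in GeV."""
    return [random.expovariate(1 / 15.0) for _ in range(random.randint(1, 8))]

def trigger_fires(event, pt_threshold=60.0):
    """A 'good signature' here: any particle above the pT threshold."""
    return any(pt > pt_threshold for pt in event)

n_total = 100_000
n_kept = sum(trigger_fires(make_event()) for _ in range(n_total))
print(f"kept {n_kept} of {n_total} events ({n_kept / n_total:.1%})")
```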

    “We can change the trigger, or make the trigger more sophisticated to be more selective,” said Brookhaven’s Howard Gordon, a leader in the ATLAS physics program. “If we don’t select the right events, they are gone forever.”

    The current trigger system can handle the luminosities of Run 2, but with future upgrades it will no longer be able to screen out and reject enough collisions to keep the number of recorded events manageable. So the next generation of ATLAS triggers will have to be even more sophisticated in terms of what they can instantly detect—and reject.

    A more difficult problem comes with the few dozen events in each bunch crossing that look like they might be interesting, but aren’t.

    “Not all protons in a bunch interact, but it’s not necessarily going to be only one proton in a bunch that interacts with a proton from the opposite bunch,” said Rajagopalan. “You could have 50 of them interact. So now you have 50 events on top of each other. Imagine the software challenge when just one of those is the real, new physics we’re interested in discovering, but you have all these 49 others—junk!—sitting on top of it.”

    “We call it pileup!” Gordon quipped.

    Finding one good result among 50 is tricky enough, but in 10 years that number will be closer to 1 in 150 or 200, with all those additional extraneous results interacting with each other and adding exponentially to the complexity of the task. Being able to recognize instantly as many characteristics of the desired particles as possible will go a long way to keeping the data manageable.
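    The fluctuation in how many protons interact per crossing is commonly modeled as a Poisson process. A toy sketch, assuming the mean of 50 quoted above:

```python
import math
import random

# Toy pileup model: the number of proton-proton interactions in a single
# bunch crossing is commonly modeled as Poisson-distributed about a mean
# mu.  mu = 50 matches the "50 events on top of each other" quoted above.

def poisson(mu):
    """Draw a Poisson-distributed count (Knuth's method; fine for moderate mu)."""
    threshold, k, p = math.exp(-mu), 0, 1.0
    while p > threshold:
        k += 1
        p *= random.random()
    return k - 1

crossings = [poisson(50) for _ in range(100_000)]
print(f"mean interactions per crossing: {sum(crossings) / len(crossings):.1f}")  # ~50
print(f"busiest crossing seen: {max(crossings)} overlapping events")
```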

    Further upgrades are planned over the next decade to cope with the ever-increasing luminosity and collision rates. For example, the Brookhaven team and collaborators will be working to develop an all-new silicon tracking system and a full replacement of the readout electronics with state-of-the-art technology that will allow physicists to collect and analyze ten times more data for LHC Run 4, scheduled for 2026.

    The physicists at CERN, Brookhaven, and elsewhere have strong motivation for meeting these challenges. Doing so will not only offer the best chance of detecting rare physics events and expanding the frontiers of physics, but will also allow the physicists to do it within a reasonable timespan.

    As Rajagopalan put it, “We are ready for the challenge. The next few years are going to be an exciting time as we push forward to explore a new, uncharted energy frontier.”

    Brookhaven’s role in the LHC is supported by the DOE Office of Science.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    BNL Campus

    BNL RHIC Campus

    BNL/RHIC Star Detector

    BNL RHIC PHENIX

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

     
  • richardmitnick 2:41 pm on June 21, 2018
    Tags: Accelerator Science

    From Fermilab: “New laser technology shows success in particle accelerators” 

    FNAL II photo

    FNAL Art Image
    FNAL Art Image by Angela Gonzales

    From Fermilab, an enduring source of strength for the US contribution to scientific research worldwide.

    June 21, 2018
    Sarah Lawhun

    David Johnson, left, and Todd Johnson work on the recently installed laser notcher in the Fermilab accelerator complex. The laser notcher, the first application of its kind in an in-production particle accelerator, has helped boost particle beam production at the lab. Photo: Reidar Hahn

    Lasers — used in medicine and manufacturing, and made wildly popular by science fiction — are finding a new use in particle physics.

    Fermilab scientists and engineers have developed a tool called a laser notcher, which takes advantage of the laser’s famously precise targeting abilities to do something unexpected: boost the number of particles that accelerators send to experiments. It’s cranked up the lab’s particle output considerably — by an incredible 15 percent — giving scientists more opportunities to study nature’s tiniest constituents.

    While lasers have been used for accelerator tests and diagnostics, this is the first application of its kind in a fully operational accelerator.

    “For such a new design, the laser notcher has been remarkably reliable,” said Fermilab engineer Bill Pellico, who manages one of the laboratory’s major accelerator upgrade programs, called the Proton Improvement Plan. “It’s already shown it will provide a considerable increase in the number of particles we can produce.”

    The notcher increases particle production, counterintuitively, by removing particles from a particle beam.

    Bunching out

    The process of removing particles isn’t new. Typically, an accelerator generates a particle beam in bunches — compact packets that each contain hundreds of millions of particles. Imagine each bunch in a beam as a pearl on a strand. Bunches can be arranged in patterns according to the acceleration needs. Perhaps the needed pattern is an 80-bunch-long string followed by a three-bunch-long gap. Often, the best way to create the gap is to start with a regular, uninterrupted string of bunches and simply remove the unneeded ones.
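    The bookkeeping is simple enough to sketch. The 80/3 numbers below are the hypothetical example from the paragraph above, not an actual machine fill pattern:

```python
# Sketch of bunch-pattern bookkeeping: a repeating train of filled
# bunches followed by a short gap.  The 80/3 pattern is the hypothetical
# example from the text, not a real fill pattern.

TRAIN_LENGTH = 80   # filled bunches per period
GAP_LENGTH = 3      # empty slots per period

def bunch_pattern(n_periods):
    """Return a list of booleans: True = filled bunch, False = gap slot."""
    return ([True] * TRAIN_LENGTH + [False] * GAP_LENGTH) * n_periods

pattern = bunch_pattern(2)
print(f"{sum(pattern)} filled of {len(pattern)} slots")   # 160 of 166
```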

    But it isn’t so simple. Traditionally, beam bunches are kicked out by a fast-acting magnet, called a magnetic kicker. It’s a messy business: Particles fly off, strike beamline walls and generally create a subatomic obstacle course for the beam. While it’s not impossible for the beam to pass through such a scene, it also isn’t smooth sailing.

    Accelerator experts refer to the messy phenomenon as beam loss, and it’s a measurable, predictable predicament. They accommodate it by holding back on the amount of beam they accelerate in the first place, setting a ceiling on the number of particles they pack into the beam.

    That ceiling is a limitation for Fermilab’s new and upcoming experiments, which require more particles than the accelerator complex could previously deliver. So the lab’s accelerator specialists look for ways to raise the particle beam ceiling and meet the experimental needs for beam.

    The most straightforward way to do this is to eliminate the thing that’s keeping the ceiling low and stifling particle delivery — beam loss.

    Lasers against loss

    The new laser notcher works by directing powerful pulses of laser light at particle bunches, taking them out of commission. Both the position and precision of the notcher allow it to create gaps cleanly — delivering a one-two punch in curbing beam loss.

    First, the notcher is positioned early in the series of Fermilab’s accelerators, when the particle beam hasn’t yet achieved the close-to-light speeds it will attain by the time it exits the accelerator chain. (At this early stage, the beam lumbers along at 4 percent of the speed of light, a mere 27 million miles per hour.) This far upstream, the beam loss resulting from ejecting bunches doesn’t have much of an impact.
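    A quick unit check on that figure:

```python
# Quick unit check on the speed quoted above.

C_MPH = 670_616_629     # speed of light, miles per hour
beta = 0.04             # 4 percent of light speed, from the text

print(f"{beta * C_MPH:,.0f} mph")   # 26,824,665, i.e. roughly 27 million mph
```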

    “We moved the process to a place where, when we lose particles, it really doesn’t matter,” said David Johnson, Fermilab engineering physicist who led the laser notcher project.

    Second, the laser notcher is, like a scalpel, surgical in its bunch removal. It ejects bunches precisely, individually, bunch by bunch. That enables scientists to create gaps of exactly the right lengths needed by later acceleration stages.

    For Fermilab’s accelerator chain, the winning formula is for the notcher to create a gap that is 80 nanoseconds (billionths of a second) long every 2,200 nanoseconds. It’s the perfect-length gap needed by one of Fermilab’s later-stage accelerators, called the Booster.
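    Some rough bookkeeping on that pattern. The 80 ns and 2,200 ns figures come from the text; the upstream bunch repetition rate below is an assumption, roughly Fermilab's 201.25 MHz Linac RF frequency:

```python
# Rough bookkeeping for the notch pattern quoted above.  The gap and
# period come from the text; the bunch repetition rate is an assumed
# value (approximately the 201.25 MHz Linac RF frequency).

GAP_NS = 80
PERIOD_NS = 2200
BUNCH_RATE_HZ = 201.25e6    # assumed bunch repetition rate

bunches_per_notch = GAP_NS * 1e-9 * BUNCH_RATE_HZ
fraction_removed = GAP_NS / PERIOD_NS

print(f"bunches removed per notch: ~{bunches_per_notch:.0f}")   # ~16
print(f"fraction of beam removed:  {fraction_removed:.1%}")     # 3.6%
```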

    A graceful exit

    The Fermilab Booster feeds beam to the next accelerator stages or directly to experiments.

    Prior to the laser notcher’s installation, a magnetic kicker would boot specified bunches as they entered the Booster, resulting in messy beam loss.

    With the laser notcher now on the scene, the Booster receives a beam that has prefab, well-defined gaps. These 80-nanosecond-long windows of opportunity mean that, as the beam leaves the Booster and heads toward its next stop, it can make a clean, no-fuss, no-loss exit.

    With Booster beam loss brought down to low levels, Fermilab accelerator operators can raise the ceiling on the numbers of particles they can pack into the beam. The results so far are promising: The notcher has already allowed beam power to increase by a whopping 15 percent.

    Thanks to this innovation and other upgrade improvements, the Booster accelerator is now operating at its highest efficiency ever and at record-setting beam power.

    “Although lasers have been used in proton accelerators in the past for diagnostics and tests, this is the first-of-its-kind application of lasers in an operational proton synchrotron, and it establishes a technological framework for using laser systems in a variety of other bunch-by-bunch applications, which would further advance the field of high-power proton accelerators,” said Sergei Nagaitsev, head of the Fermilab Office of Accelerator Science Programs.

    Plentiful protons and other particles

    The laser notcher, installed in January, is a key part of a larger program, the Proton Improvement Plan (PIP), to upgrade the lab’s chain of particle accelerators to produce powerful proton beams.

    As the name of the program implies, it starts with protons.

    Fermilab sends protons barreling through the lab’s accelerator complex, and they’re routed to various experiments. Along the way, some of them are transformed into other particles needed by experiments, for example into neutrinos—tiny, omnipresent particles that could hold the key to filling in gaps in our understanding of the universe’s evolution. Fermilab experiments need boatloads of these particles to carry out the lab’s scientific program. Some of the protons are transformed into muons, which can provide scientists with hints about the nature of the vacuum.

    With more protons coming down the pipe, thanks to PIP and the laser notcher, the accelerator can generate more neutrinos, muons and other particles, feeding Fermilab’s muon experiments, Muon g-2 and Mu2e, and its neutrino experiments, including its largest operating neutrino experiment, NOvA, and its flagship, the Deep Underground Neutrino Experiment and Long-Baseline Neutrino Facility.

    “Considering all the upgrades and improvements to Fermilab accelerators as a beautiful cake with frosting, the increase in particle production we managed to achieve with the laser notcher is like the cherry on top of the cake,” Nagaitsev said.

    “It’s a seemingly small change with a significant impact,” Johnson said.

    As the Fermilab team moves forward, they’ll continue to put the notcher through its paces, investigating paths for improvement.

    With this innovation, Fermilab adds another notch in the belt of what lasers can do.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    FNAL Icon

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a U.S. Department of Energy national laboratory specializing in high-energy particle physics and America’s premier laboratory for particle physics and accelerator research. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.


    FNAL/MINERvA

    FNAL DAMIC

    FNAL Muon g-2 studio

    FNAL Short-Baseline Near Detector under construction

    FNAL Mu2e solenoid

    Dark Energy Camera [DECam], built at FNAL

    FNAL DUNE Argon tank at SURF

    FNAL/MicrobooNE

    FNAL Don Lincoln

    FNAL/MINOS

    FNAL Cryomodule Testing Facility

    FNAL Minos Far Detector

    FNAL LBNF/DUNE from FNAL to SURF, Lead, South Dakota, USA

    FNAL/NOvA experiment map

    FNAL NOvA Near Detector

    FNAL ICARUS

    FNAL Holometer

     
  • richardmitnick 1:49 pm on June 19, 2018
    Tags: Accelerator Science, HL-LHC

    From CERN: “Major work starts to boost the luminosity of the LHC” 

    Cern New Bloc

    Cern New Particle Event

    CERN New Masthead

    From CERN

    Civil works have begun on the ATLAS and CMS sites to build new underground structures for the High-Luminosity LHC. (Image: Julien Ordan / CERN)

    CERN map

    The Large Hadron Collider (LHC) is officially entering a new stage. Today, a ground-breaking ceremony at CERN celebrates the start of the civil-engineering work for the High-Luminosity LHC (HL-LHC): a new milestone in CERN’s history. By 2026 this major upgrade will have considerably improved the performance of the LHC, by increasing the number of collisions in the large experiments and thus boosting the probability of the discovery of new physics phenomena.

    The LHC started colliding particles in 2010. Inside the 27-km LHC ring, bunches of protons travel at almost the speed of light and collide at four interaction points. These collisions generate new particles, which are measured by detectors surrounding the interaction points. By analysing these collisions, physicists from all over the world are deepening our understanding of the laws of nature.

    While the LHC is able to produce up to 1 billion proton-proton collisions per second, the HL-LHC will increase this number, referred to by physicists as “luminosity”, by a factor of between five and seven, allowing about 10 times more data to be accumulated between 2026 and 2036. This means that physicists will be able to investigate rare phenomena and make more accurate measurements. For example, the LHC allowed physicists to unearth the Higgs boson in 2012, thereby making great progress in understanding how particles acquire their mass. The HL-LHC upgrade will allow the Higgs boson’s properties to be defined more accurately, and to measure with increased precision how it is produced, how it decays and how it interacts with other particles. In addition, scenarios beyond the Standard Model will be investigated, including supersymmetry (SUSY), theories about extra dimensions and quark substructure (compositeness).
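    The "1 billion collisions per second" figure follows from the standard relation rate = luminosity times cross-section. A minimal sketch with typical textbook values, assumed here rather than quoted from CERN:

```python
# rate = luminosity * cross-section.  The inelastic pp cross-section
# and the luminosity values below are typical textbook numbers,
# assumed here for illustration.

SIGMA_INEL_CM2 = 8e-26          # ~80 millibarn inelastic pp cross-section
LUMI = {
    "LHC (design)": 1e34,       # cm^-2 s^-1
    "HL-LHC (low end)": 5e34,   # roughly 5x the design value
}

for name, lum in LUMI.items():
    print(f"{name}: {lum * SIGMA_INEL_CM2:.1e} collisions per second")
# LHC (design): 8.0e+08 (about a billion); HL-LHC (low end): 4.0e+09
```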

    “The High-Luminosity LHC will extend the LHC’s reach beyond its initial mission, bringing new opportunities for discovery, measuring the properties of particles such as the Higgs boson with greater precision, and exploring the fundamental constituents of the universe ever more profoundly,” said CERN Director-General Fabiola Gianotti.

    The HL-LHC project is an international endeavour involving 29 institutes from 13 countries. It began in November 2011 and two years later was identified as one of the main priorities of the European Strategy for Particle Physics, before the project was formally approved by the CERN Council in June 2016. After successful prototyping, many new hardware elements will be constructed and installed in the years to come. Overall, more than 1.2 km of the current machine will need to be replaced with many new high-technology components such as magnets, collimators and radiofrequency cavities.

    Prototype of a quadrupole magnet for the High-Luminosity LHC. (Image: Robert Hradil, Monika Majer/ProStudio22.ch)

    FNAL magnets such as this one, mounted on a test stand at Fermilab, are being developed for the High-Luminosity LHC. Photo: Reidar Hahn

    The secret to increasing the collision rate is to squeeze the particle beam at the interaction points so that the probability of proton-proton collisions increases. To achieve this, the HL-LHC requires about 130 new magnets, in particular 24 new superconducting focusing quadrupoles to focus the beam and four superconducting dipoles. Both the quadrupoles and dipoles reach a field of about 11.5 tesla, as compared to the 8.3 tesla dipoles currently in use in the LHC. Sixteen brand-new “crab cavities” will also be installed to maximise the overlap of the proton bunches at the collision points. Their function is to tilt the bunches so that they appear to move sideways – just like a crab.

    FNAL Crab cavities for the HL-LHC

    CERN crab cavities that will be used in the HL-LHC

    Another key ingredient in increasing the overall luminosity in the LHC is to enhance the machine’s availability and efficiency. For this, the HL-LHC project includes the relocation of some equipment to make it more accessible for maintenance. The power converters of the magnets will thus be moved into separate galleries, connected by new innovative superconducting cables capable of carrying up to 100 kA with almost zero energy dissipation.

    “Audacity underpins the history of CERN and the High-Luminosity LHC writes a new chapter, building a bridge to the future,” said CERN’s Director for Accelerators and Technology, Frédérick Bordry. “It will allow new research and with its new innovative technologies, it is also a window to the accelerators of the future and to new applications for society.”

    To allow all these improvements to be carried out, major civil-engineering work at two main sites is needed, in Switzerland and in France. This includes the construction of new buildings, shafts, caverns and underground galleries. Tunnels and underground halls will house new cryogenic equipment, the electrical power supply systems and various plants for electricity, cooling and ventilation.

    During the civil engineering work, the LHC will continue to operate, with two long technical stop periods that will allow preparations and installations to be made for high luminosity alongside yearly regular maintenance activities. After completion of this major upgrade, the LHC is expected to produce data in high-luminosity mode from 2026 onwards. By pushing the frontiers of accelerator and detector technology, it will also pave the way for future higher-energy accelerators.


    The LHC will receive a major upgrade and transform into the High-Luminosity LHC over the coming years. But what does this mean and how will its goals be achieved? Find out in this video featuring several people involved in the project. (Video: Polar Media/CERN.)

    Fermilab is leading the U.S. contribution to the HL-LHC, in addition to building new components for the upgraded detector for the CMS experiment. The main innovation contributed by the United States for the HL-LHC is a novel type of accelerator cavity that uses a breakthrough superconducting technology.

    Fermilab is also contributing to the design and construction of superconducting magnets that will focus the particle beam much more tightly than the magnets currently in use in the LHC. Fermilab scientists and engineers have also partnered with other CMS collaborators on new designs for tracking modules in the CMS detector, enabling it to respond more quickly to the increased number of collisions in the HL-LHC.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Meet CERN in a variety of places:

    Quantum Diaries
    QuantumDiaries

    Cern Courier

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS
    CERN ATLAS New

    ALICE
    CERN ALICE New

    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    LHC

    CERN LHC Map
    CERN LHC Grand Tunnel

    CERN LHC particles

    OTHER PROJECTS AT CERN

    CERN AEGIS

    CERN ALPHA

    CERN AMS

    CERN ASACUSA

    CERN ATRAP

    CERN AWAKE

    CERN CAST Axion Solar Telescope

    CERN CLOUD

    CERN COMPASS

    CERN DIRAC

    CERN ISOLDE

    CERN LHCf

    CERN NA62

    CERN NTOF

    CERN TOTEM

    CERN UA9

     
  • richardmitnick 11:52 am on June 19, 2018
    Tags: Accelerator Science, Halina Abramowicz

    From Symmetry: Women in STEM - “Q&A: Planning Europe’s physics future” Halina Abramowicz 

    Symmetry Mag
    From Symmetry

    06/13/18
    Lauren Biron

    Artwork by Sandbox Studio, Chicago

    Halina Abramowicz leads the group effort to decide the future of European particle physics.

    Physics projects are getting bigger, more global, more collaborative and more advanced than ever—with long lead times for complex physics machines. That translates into more international planning to set the course for the future.

    In 2014, the United States particle physics community set its priorities for the coming years using recommendations from the Particle Physics Project Prioritization Panel, or P5.

    FNAL Particle Physics Project Prioritization Panel -P5

    In 2020, the European community will refresh its vision with the European Strategy Update for Particle Physics.

    The first European strategy launched in 2006 and was revisited in 2013. In 2019, teams will gather input through planning meetings in preparation for the next refresh.

    Halina Abramowicz, a physicist who works on the ATLAS experiment at CERN’s Large Hadron Collider and the FCAL research and development collaboration through Tel Aviv University, is the chair of the massive undertaking. During a visit to Fermilab to provide US-based scientists with an overview of the process, she sat down with Symmetry writer Lauren Biron to discuss the future of physics in Europe.

    LB: What do you hope to achieve with the next European Strategy Update for Particle Physics?
    HA: Europe is a very good example of the fact that particle physics is very international, because of the size of the infrastructure that we need to progress, and because of the financial constraints.

    The community of physicists working on particle physics is very large; Europe has probably about 10,000 physicists. They have different interests, different expertise, and somehow, we have to make sure to have a very balanced program, such that the community is satisfied, and that at the same time it remains attractive, dynamic, and pushing the science forward. We have to take into account the interests of various national programs, universities, existing smaller laboratories, CERN, and make sure that there is a complementarity, a spread of activities—because that’s the way to keep the field attractive, that is, to be able to answer more questions faster.

    LB: How do you decide when to revisit the European plan for particle physics?
    HA: Once the Higgs was discovered, it became clear that it was time to revisit the strategy, and the first update happened in 2013. The recommendation was to vigorously pursue the preparations for the high-luminosity upgrade of the [Large Hadron Collider].

    The high-luminosity LHC program was formally approved by the CERN Council in September 2016. By the end of 2018, the LHC experiments will have collected almost a factor of 10 more data. It will be a good time to reflect on the latest results, to think about mid-term plans, to discuss what are the different options to consider next and their possible timelines, and to ponder what would make sense as we look into the long-term future.

    CERN HL-LHC map

    Machines, Projects and Experiments operating at CERN LHC and CLIC at three levels of power

    The other aspect which is very important is the fact that the process is called “strategy,” rather than “roadmap,” because it is a discussion not only of the scientific goals and associated projects, but also of how to achieve them. The strategy basically is about everything that the community should be doing in order to achieve the roadmap.

    LB: What’s the difference between a strategy and a roadmap?
    HA: The roadmap is about prioritizing the scientific goals and about the way to address them, while the strategy covers also all the different aspects to consider in order to make the program a success. For example, outreach is part of the strategy. We have to make sure we are doing something that society knows about and is interested in. Education: making sure we share our knowledge in a way which is understandable. Detector developments. Technology transfer. Work with industry. Making sure the byproducts of our activities can also be used for society. It’s a much wider view.

    LB: What is your role in this process?
    HA: The role of the secretary of the strategy is to organize the process and to chair the discussions so that there is an orderly process. At this stage, we have one year to prepare all the elements of the process that are needed—i.e. to collect the input. In the near future we will have to nominate people for the physics preparatory group that will help us organize the open symposium, which is basically the equivalent of a town-hall meeting.

    The hope is that if it’s well organized and we can reach a consensus, especially on the most important aspects, the outcome will come from the community. We have to make sure through interaction with the European community and the worldwide community that we aren’t forgetting anything. The more inputs we have, the better. It is very important that the process be open.

    The first year we debate the physics goals and try to organize the community around a possible plan. Then comes the process that is maybe a little shorter than a year, during which the constraints related to funding and interests of various national communities have to be integrated. I’m of course also hoping that we will get, as an input to the strategy discussions, some national roadmaps. It’s the role of the chair to keep this process flowing.

    LB: Can you tell us a little about your background and how you came to serve as the chair for European Strategy Update?
    HA: That’s a good question. I really don’t know. I did my PhD in 1978; I was one of the youngest PhDs of Warsaw University, thus I’ve spent 40 years in the field. That means that I have participated in at least five large experiments and at least two or three smaller projects. I have a very broad view—not necessarily a deep view—but a broad view of what’s happening.

    LB: There are major particle physics projects going on around the world, like DUNE in the US and Belle II in Japan. How much will the panel look beyond Europe to coordinate activities, and how will it incorporate feedback from scientists on those projects?

    FNAL LBNF/DUNE from FNAL to SURF, Lead, South Dakota, USA

    KEK Belle 2 detector, in Tsukuba, Ibaraki Prefecture, Japan

    HA: This is one of the issues that was very much discussed during my visit. We shouldn’t try to organize the whole world—in fact, a little bit of competition is very healthy. And complementarity is also very important.

    At the physics-level discussions, we’ll make sure that we have representatives from the United States and other countries so we are provided with all the information. As I was discussing with many people here, if there are ideas, experiments or existing collaborations which already include European partners, then of course, there is no issue [because the European partners will provide input to the strategy].

    LB: How do you see Europe working with Asia, in particular China, which has ambitions for a major collider?
    HA: Collaboration is very important, and at the global level we have to find the right balance between competition, which is stimulating, and complementarity. So we’re very much hoping to have one representative from China in the physics preparatory group, because China seems to have ambitions to realize some of the projects which have been discussed. And I’m not talking only about the equivalent of [the Future Circular Collider]; they are also thinking about an [electron-positron] circular collider, and there are also other projects that could potentially be realized in China. I also think that if the Chinese community decides on one of these projects, it may need contributions from around the world. Funding is an important aspect for any future project, but it is also important to reach a critical mass of expertise, especially for large research infrastructures.

    LB: This is a huge effort. What are some of the benefits and challenges of meeting with physicists from across Europe to come up with a single plan?
    HA: The benefits are obvious. The more input we have, the fuller the picture we have, and the more likely we are to converge on something that satisfies maybe not everybody, but at least the majority—which I think is very important for a good feeling in the community.

    The challenges are also obvious. On one hand, we rely very much on individuals and their creative ideas. These are usually the people who also happen to be the big pushers and tend to generate most controversies. So we will have to find a balance to keep the process interesting but constructive. There is no doubt that there will be passionate and exciting discussions that will need to happen; this is part of the process. There would be no point in only discussing issues on which we all agree.

    The various physics communities, in the ideal situation, get organized. We have the neutrino community, [electron-positron collider] community, precision measurements community, the axion community—and here you can see all kinds of divisions. But if these communities can get organized and come up with what one could call their own white paper, or what I would call a 10-page proposal, of how various projects could be lined up, and what would be the advantages or disadvantages of such an approach, then the job will be very easy.

    LB: And that input is what you’re aiming to get by December 2018?
    HA: Yes, yes.

    LB: How far does the strategy look out?
    HA: It doesn’t have an end date. This is why one of the requests for the input is for people to estimate the time scale—how much time would be needed to prepare and to realize the project. This will allow us to build a timeline.

    We have at present a large project that is approved: the high-luminosity LHC. This will keep an important part of our community busy for the next 10 to 20 years. But will the entire community remain fully committed for the whole duration of the program if there are no major discoveries?

    I’m not sure that we can be fed intellectually by one project. I think we need more than one. There’s a diversity program—diversity in the sense of trying to maximize the physics output by asking questions which can be answered with the existing facilities. Maybe this is the time to pause and diversify while waiting for the next big step.

    LB: Do you see any particular topics that you think are likely to come up in the discussion?
    HA: There are many questions on the table. For example, should we go for a proton-proton or an [electron-positron] program? There are, for instance, voices advocating for a dedicated Higgs factory, which would allow us to make measurements of the Higgs properties to a precision that would be extremely hard to achieve at the LHC. So we will have to discuss if the next machine should be an [electron-positron] machine and check whether it is realistic and on what time scale.

    One of the subjects that I’m pretty sure will come up as well is about pushing the accelerating technologies. Are we getting to the limit of what we can do with the existing technologies, and is it time to think about something else?

    To learn more about the European Strategy Update for Particle Physics, watch Abramowicz’s colloquium at Fermilab.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 3:36 pm on June 18, 2018
    Tags: Accelerator Science, DESY HERA H1 and ZEUS, Heavy quark production

    From DESY: “Weighty insights from heavy quarks” 

    DESY
    From DESY

    HERA sheds light on heavy quarks and their properties

    DESY HERA


    The HERA accelerator at DESY in Hamburg was unique in that it smashed two totally different kinds of particles into each other – protons and electrons or positrons. HERA thus consisted of two different accelerator rings: a superconducting proton ring (top) and a normal-conducting electron ring (bottom). HERA ran from 1992 to 2007.

    The two HERA collaborations, H1 and ZEUS, have combined their measurements of heavy quark production. The paper, now published in The European Physical Journal, summarises an analysis of charm and beauty quark production in electron-proton collisions. A team led by DESY physicists has produced the final results, the culmination of over 20 years of work. Together with previously published physics results from HERA, it will appear in the textbooks of the future, as well as having application to the physics studied at the Large Hadron Collider at CERN in Geneva, to which DESY groups also make key contributions.

    From 1992 to 2007, the unique 6.3-kilometre underground storage ring HERA (Hadron Electron Ring Accelerator) at DESY collided electrons and protons accelerated close to the speed of light in order to resolve and study the proton’s structure. Protons sit at the core of every atomic nucleus in the universe. Their composition of three quarks – two up quarks and one down quark, held together by gluons, the carrier particles of the strong force – has been well known for decades. However, HERA showed a much more complicated picture of the proton: a sizzling soup in which gluons can produce more gluons and can split into pairs of quarks and antiquarks – the so-called sea quarks – all of them interacting again very quickly. Electrons were therefore used as probes that penetrated deeply into the proton and were scattered off one of the proton’s constituents via exchange of the weak or the electromagnetic force, two of the four basic forces in the universe. The particles emerging from the scattering processes – some of them heavy quarks called charm and beauty – were measured by the two multi-purpose detectors H1 and ZEUS.
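    For context, the centre-of-mass energy of such an asymmetric collider follows, neglecting the tiny particle masses, from sqrt(s) = 2*sqrt(E1*E2). A minimal sketch, assuming HERA's later-run beam energies of 27.5 GeV electrons on 920 GeV protons:

```python
import math

# Centre-of-mass energy of an asymmetric head-on collider, neglecting
# particle masses: sqrt(s) ~ 2*sqrt(E1*E2).  The beam energies below
# are assumed values for HERA's later running period.

E_ELECTRON = 27.5    # GeV
E_PROTON = 920.0     # GeV

sqrt_s = 2 * math.sqrt(E_ELECTRON * E_PROTON)
print(f"sqrt(s) = {sqrt_s:.0f} GeV")   # ~318 GeV
```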

    The HERA experiments showed that the proton is a very dynamic object consisting of quarks, antiquarks and gluons (picture: DESY).

    Compared to the up and down quarks, charm and beauty quarks are about 1000 times heavier. Consequently, they were produced less frequently in electron-proton interactions. In order to achieve the highest precision possible, the two collaborations have recently combined all their measurements on the production of these heavy quarks. The detailed analysis allows precision tests of the theory of the strong force, quantum chromodynamics (QCD), and constrains the structure of matter. The measurements have also been used to extract the masses of the charm and beauty quarks. DESY physicist Oleksandr Zenaiev, who led the analysis, comments: “The masses are two of the fundamental parameters of the Standard Model of particle physics and have been measured here to high precision.” The results compare well with other measurements in different processes, demonstrating their universality.

    Joachim Mnich, DESY’s Director in charge of Particle and Astroparticle Physics, notes: “The HERA experiments have collected a very valuable set of lepton-proton collisions and continue to produce high-level publications even 10 years after the end of data taking. The publication is also substantial proof that our efforts to preserve the HERA data for later analysis pay off.”

    Overall, the theoretical calculations, based on QCD, give a reasonable description of the data. “However, some tension is observed when simultaneously trying to describe these measurements of heavy quark production and similar inclusive measurements where no requirement on the nature of the quark is imposed. This demonstrates that HERA data are still challenging our best theoretical descriptions of the fundamental structure of matter. The collaborations look forward to further theoretical developments inspired by these unique results”, says Matthew Wing, spokesman of the ZEUS collaboration.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition


    DESY is one of the world’s leading accelerator centres. Researchers use the large-scale facilities at DESY to explore the microcosm in all its variety – from the interactions of tiny elementary particles and the behaviour of new types of nanomaterials to biomolecular processes that are essential to life. The accelerators and detectors that DESY develops and builds are unique research tools. The facilities generate the world’s most intense X-ray light, accelerate particles to record energies and open completely new windows onto the universe. 
That makes DESY not only a magnet for more than 3000 guest researchers from over 40 countries every year, but also a coveted partner for national and international cooperations. Committed young researchers find an exciting interdisciplinary setting at DESY. The research centre offers specialized training for a large number of professions. DESY cooperates with industry and business to promote new technologies that will benefit society and encourage innovations. This also benefits the metropolitan regions of the two DESY locations, Hamburg and Zeuthen near Berlin.

     
  • richardmitnick 5:07 pm on June 15, 2018
    Tags: Accelerator Science

    From Fermilab: “Fermilab develops forefront accelerator components for the High-Luminosity LHC” 

    FNAL II photo

    FNAL Art Image
    FNAL Art Image by Angela Gonzales

    From Fermilab, an enduring source of strength for the US contribution to scientific research worldwide.

    June 14, 2018
    Jordan Rice

    A groundbreaking ceremony will be held tomorrow to celebrate the start of civil engineering work for a major upgrade to the Large Hadron Collider at CERN in Geneva, Switzerland. When complete, the High-Luminosity LHC (HL-LHC) will produce five to seven times more proton-proton collisions than the currently operating LHC, powering new discoveries about our universe.

    CERN CMS Tracker for HL-LHC

    For the last decade, scientists, engineers and technicians from the U.S. Department of Energy’s Fermi National Accelerator Laboratory have been working with partners around the world to conduct R&D on new accelerator components that would make operations at the HL-LHC possible. The U.S. research was conducted via the LHC Accelerator Research Program, or LARP. Now the research turns into reality, as construction of the new components begins.

    The primary components contributed by the United States for the HL-LHC construction are powerful superconducting magnets and superconducting deflecting cavities, called crab cavities, of a novel compact design never before used in an accelerator.

    Fermilab is developing magnets such as this one, which is mounted on a test stand at Fermilab, for the High-Luminosity LHC. Photo: Reidar Hahn

    “This is a truly major milestone for the whole U.S. accelerator community,” said Fermilab scientist Giorgio Apollinari, who leads the DOE Office of Science-funded U.S. HL-LHC Accelerator Upgrade Project (AUP). “More than 10 years of research work funded by DOE under LARP have gone into developing these cutting-edge magnets and crab cavities and in demonstrating their technical feasibility for the intended application at HL-LHC. We now look forward with much anticipation to shipping the first components to CERN and seeing them operate as part of the world’s foremost particle collider.”

    In the LHC, superconducting quadrupole magnets focus the beams into collision at four points around the 27-kilometer ring. In the HL-LHC, these focusing magnets must be more powerful to focus the stream of particles much tighter than in the LHC. Fermilab, in collaboration with DOE’s Brookhaven and Lawrence Berkeley national laboratories, developed the basic technology for these new magnets through LARP. The final design was completed in collaboration with CERN for application in the HL-LHC upgrade.

    These new magnets are made of a niobium-tin alloy that allows the magnets to reach the desired high magnetic field of 12 tesla. This powerful field is created by running a very high electric current through coils of superconducting wire, which conduct electricity without resistance when cooled to almost absolute zero. Fermilab is the lead U.S. laboratory for this project and is fabricating half of the coils and conducting the final assembly and testing of 11 full cryoassembly magnet structures before shipping them to CERN. The U.S. in total is delivering half of the quadrupole magnets for the upgrade, while CERN is completing the other half.

    “These are the next generation of superconducting magnets for accelerators,” said Fermilab’s Ruben Carcagno, the deputy project manager for the HL-LHC AUP. “This is the first time that this new technology will be deployed in a working machine. So it’s a big step.”

    Fermilab is developing and constructing cavities like this one for the future HL-LHC. The cavity proper is the structure situated between the four rods. Photo: Leonardo Ristori

    In addition to the magnets, the United States will deliver half of the crab cavities to CERN for the HL-LHC, while CERN completes the remaining cavities. The cavities to be produced in the United States are of a radio-frequency dipole (RFD) design and are the product of more than 10 years of research through LARP by Old Dominion University and SLAC National Accelerator Laboratory, with contributions from Thomas Jefferson National Accelerator Facility and U.S. industry. Fermilab will be responsible for fabricating and testing the RFD cavities before delivering them to CERN. These novel cavities will kick or tilt the beams just before they pass through each other to maximize the beam overlap and therefore the possibility of proton collisions.

Once it’s up and running, the HL-LHC will produce up to 15 million Higgs bosons per year, compared to the 4 million produced during the LHC’s 2015-2017 run. The higher luminosity will mean big changes for the LHC experiments as well, and the ATLAS and CMS detectors are undergoing major upgrades of their own. Learn more about Fermilab’s contributions to the HL-LHC upgrades of the CMS detector.
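The yield quoted above follows from a one-line calculation: expected events = production cross-section x integrated luminosity. A quick sketch, assuming a total 13 TeV Higgs production cross-section of roughly 55 pb and an HL-LHC-like 250 fb⁻¹ per year (both numbers approximate):

    # Expected Higgs yield per year: N = sigma * integrated luminosity.
    # Both inputs are approximate, assumed values for illustration.
    sigma_higgs_pb = 55.0     # pb, rough total Higgs cross-section at 13 TeV
    lumi_fb_per_year = 250.0  # fb^-1 per year, assumed HL-LHC-like figure
    PB_TO_FB = 1000.0         # 1 pb = 1000 fb

    n_higgs = sigma_higgs_pb * PB_TO_FB * lumi_fb_per_year
    print(f"Expected Higgs bosons per year: {n_higgs:.2e}")
    # -> about 1.4e7, in line with the "up to 15 million per year" above.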

See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition


Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a U.S. Department of Energy national laboratory specializing in high-energy particle physics and America’s premier laboratory for particle physics and accelerator research. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.



     
  • richardmitnick 1:09 pm on June 11, 2018 Permalink | Reply
Tags: A wealth of ALICE results at Quark Matter 2018, Accelerator Science

    From ALICE at CERN: “A wealth of ALICE results at Quark Matter 2018” 


    From ALICE at CERN

Contributing 35 talks and almost 100 posters, the ALICE Collaboration had a strong presence at QM2018, held in Venice from 13 to 19 May, where it presented an ample set of new results.


The 27th edition of the Quark Matter conference took place in Venice during the week of 13-19 May. In ALICE, preparations started months in advance, with new analyses in all the Physics Working Groups. More than 70 new preliminary results were reviewed and approved for the conference, and 16 new papers were made available on the preprint server just before it began.

    In the opening session, Alexander Kalweit presented the ALICE highlights talk, which covered many of the new results, and provided pointers to the 35 contributed talks and almost 100 posters that ALICE presented in the following days. The new results from ALICE covered a broad set of topics, including particle production in pp, p-Pb, and Pb-Pb collisions, and for the first time also Xe-Xe collisions, which were produced by the LHC in a short test run in October 2017. In the following, we will highlight a few of the new results that were presented by ALICE at Quark Matter.

Alexander Kalweit giving his talk on the ALICE highlights.

For the small collision systems, pp and p-Pb, we reported recent measurements of the production of resonances and (anti-)nuclei as a function of the total charged-particle multiplicity. These results show an intriguing dependence of the production of high-momentum particles on the overall multiplicity, likely due to the occurrence of multiple semi-hard scatterings in a single proton-proton collision. The goal of these measurements is to study the onset of effects such as strangeness enhancement and radial and elliptic flow, which are typically associated with Quark-Gluon Plasma formation.

New results on heavy-flavour production in p-Pb collisions show that the charmed-baryon production rate is much larger than expected from electron-positron collisions, and that the baryon-to-meson ratio has a maximum at intermediate pT, as is also seen for light-flavour baryon-to-meson ratios. This suggests a common production mechanism for light-flavour baryons, such as the proton and the Λ baryon, and for the charmed Λc baryon. A first measurement of Λc production in Pb-Pb collisions was presented, which also shows a large baryon-to-meson ratio. Improving the precision of these measurements is one of the goals of the detector upgrade programme, which was discussed in another session.

New results on the production of the quarkonia J/Ψ and Ψ(2S) in p-Pb collisions at √sNN = 8.16 TeV provide more precise information on the density distributions of quarks and gluons in the nucleus. The production of Ψ(2S) is significantly suppressed with respect to expectations from proton-proton collisions, even in the proton-going direction, where no suppression is seen for the lower-mass J/Ψ. This suppression is not yet fully understood, but may arise from final-state interactions with light particles at similar momenta.

In October last year, the LHC collided Xe nuclei for a few hours and ALICE recorded about 2 million collisions, which allow studies of how particle production and QGP effects depend on the size of the colliding nuclei: the xenon isotope used has 129 nucleons, whereas lead has 208. The total multiplicity (number of produced particles) in Xe-Xe collisions is found to be similar to that in Pb-Pb collisions with the same number of participating nucleons, except for very central Xe-Xe collisions, where an increase of particle production per participant pair is found. The elliptic and triangular flow in Xe-Xe and Pb-Pb collisions are very similar at analogous centralities, as expected from the similarity of the initial shape of the system. The smaller number of nucleons in Xe leads to larger fluctuations of the initial geometry, which in turn lead to a larger flow signal in central events; the measured values agree with model calculations. The relative abundances of light-flavour hadrons in the new Xe-Xe data confirm the previously established picture that particle chemistry depends mostly on final-state particle multiplicity at LHC energies. Finally, for high-momentum particle production, we observe a similar nuclear modification factor in Xe and Pb when comparing collisions with the same multiplicity. This is qualitatively in line with expectations, since parton energy loss depends on the density and the volume of the system, but more detailed model comparisons are being pursued.
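Several of these comparisons use the nuclear modification factor, R_AA(pT) = (dN_AA/dpT) / (⟨N_coll⟩ · dN_pp/dpT): the per-event yield in nucleus-nucleus collisions divided by the proton-proton yield scaled by the average number of binary nucleon-nucleon collisions. A minimal sketch of the bookkeeping, with hypothetical yields invented purely for illustration:

    import numpy as np

    # Nuclear modification factor:
    #   R_AA(pT) = (dN_AA/dpT) / (<N_coll> * dN_pp/dpT)
    # R_AA = 1 means the nucleus behaves like a superposition of independent
    # pp collisions; R_AA < 1 at high pT signals parton energy loss in the QGP.
    # The yields below are hypothetical, purely for illustration.
    pt_bins = np.array([2.0, 5.0, 10.0, 20.0])        # GeV/c
    yield_aa = np.array([80.0, 9.0, 0.60, 0.030])     # per-event yield, AA
    yield_pp = np.array([0.50, 0.10, 0.010, 0.0006])  # per-event yield, pp
    n_coll = 260.0                                    # assumed <N_coll> for this centrality

    r_aa = yield_aa / (n_coll * yield_pp)
    for pt, r in zip(pt_bins, r_aa):
        print(f"pT = {pt:5.1f} GeV/c : R_AA = {r:.2f}")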

A dedicated study of the nuclear modification factor in peripheral Pb-Pb collisions shows that, while the suppression of high-momentum particle production associated with parton energy loss initially decreases as the collisions become less central, it increases again for very peripheral collisions. This non-monotonic behaviour suggests that a different mechanism suppresses high-momentum particle production in very peripheral collisions; one possible explanation is that the individual nucleon-nucleon collisions have larger impact parameters, which reduces the number of parton scatterings and thus particle production at high transverse momentum. This is also relevant for the interpretation of small-system collisions, where the observed azimuthal anisotropy suggests that final-state interactions are important, yet no suppression of high-momentum particle production is found.

    ALICE also presented a first attempt to measure azimuthal anisotropy of direct photons at the LHC, which probes the time evolution of the temperature and pressure in the Quark Gluon Plasma. The measured signal is large, suggesting the importance of late emission of photons. However, the uncertainties are still sizeable and further improvements are needed to firmly establish this conclusion.

    The Quark Gluon Plasma is also studied using high momentum particles that traverse the plasma and interact with it. At the conference, ALICE presented new results on the nuclear modification factor for jets, as well as studies of the substructure of jets, which aim to be directly sensitive to the radiation of gluons by fast partons as they go through the plasma. A suppression of large-angle symmetric splittings is found, which suggests that partons in a parton shower interact independently with the Quark Gluon Plasma if the angle between them is large enough.

All in all, the Quark Matter conference was a very interesting meeting, with an unprecedented number of new results from ALICE and the other experiments, as well as discussions of new ideas from theorist colleagues. The topics presented above are only a small selection of what was shown at the conference. The release of so many new results has sparked many new discussions, which are being followed up in several venues, and we look forward to the new insights into strongly interacting matter that this will bring.

See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition


     
  • richardmitnick 11:30 am on June 11, 2018 Permalink | Reply
Tags: Accelerator Science

    From SLAC Lab: “Work Begins on New SLAC Facility for Revolutionary Accelerator Science” 


    From SLAC Lab

    June 11, 2018
    Manuel Gnida

    The goal: develop plasma technologies that could shrink future accelerators up to 1,000 times, potentially paving the way for next-generation particle colliders and powerful light sources.

    The Department of Energy’s SLAC National Accelerator Laboratory has started to assemble a new facility for revolutionary accelerator technologies that could make future accelerators 100 to 1,000 times smaller and boost their capabilities.

    The project is an upgrade to the Facility for Advanced Accelerator Experimental Tests (FACET), a DOE Office of Science user facility that operated from 2011 to 2016.



    FACET-II will produce beams of highly energetic electrons like its predecessor, but with even better quality.

    These beams will primarily be used to develop plasma acceleration techniques, which could lead to next-generation particle colliders that enhance our understanding of nature’s fundamental particles and forces and novel X-ray lasers that provide us with unparalleled views of ultrafast processes in the atomic world around us.

“FACET-II will be a unique facility that will help keep the U.S. at the forefront of accelerator science,” said SLAC’s Vitaly Yakimenko, project director. “Its high-quality beams will enable us to develop novel acceleration methods,” he said. “In particular, those studies will bring us close to turning plasma acceleration into actual scientific applications.”

SLAC is upgrading its Facility for Advanced Accelerator Experimental Tests (FACET) – a test bed for new technologies that could revolutionize the way we build particle accelerators. FACET-II will use the middle third of the lab’s 2-mile-long linear accelerator (SLAC ground plan at top). It will send a beam of electrons (bottom, blue line) from the electron source (bottom left) to the experimental area (bottom right), where it will arrive with an energy of 10 billion electronvolts. The design allows for adding the capability to produce and accelerate positrons (bottom, red line) later. (Greg Stewart/SLAC National Accelerator Laboratory)

    The DOE has now approved the $26 million project (Critical Decisions 2 and 3). The new facility, which is expected to be completed by the end of 2019, will also operate as an Office of Science user facility – a federally sponsored research facility for advanced accelerator research available on a competitive, peer-reviewed basis to scientists from around the world.

    “As a strategically important national user facility, FACET-II will allow us to explore the feasibility and applications of plasma-driven accelerator technology,” said James Siegrist, associate director of the High Energy Physics (HEP) program of DOE’s Office of Science, which stewards advanced accelerator R&D in the U.S. for the development of applications in science and society. “We’re looking forward to seeing the groundbreaking science in this area that FACET-II promises, with the potential for significant reduction of the size and cost of future accelerators, including free-electron lasers and medical accelerators.”

    Bruce Dunham, head of SLAC’s Accelerator Directorate, said, “Our lab was built on accelerator technology and continues to push innovations in the field. We’re excited to see FACET-II move forward.”

    Surfing the Plasma Wake

    The new facility will build on the successes of FACET, where scientists already demonstrated that the plasma technique can very efficiently boost the energy of electrons and their antimatter particles, positrons. In this method, researchers send a bunch of very energetic particles through a hot ionized gas, or plasma, creating a plasma wake for a trailing bunch to “surf” on and gain energy.

Researchers will use FACET-II to develop the plasma wakefield acceleration method, in which researchers send a bunch of very energetic particles through a hot ionized gas, or plasma, creating a plasma wake for a trailing bunch to “surf” on and gain energy. (Greg Stewart/SLAC National Accelerator Laboratory)

In conventional accelerators, particles draw energy from a radiofrequency field inside metal structures. However, these structures can only support a limited energy gain per unit distance before breaking down, so accelerators that generate very high energies become very long and very expensive. The plasma wakefield approach promises to break new ground: future plasma accelerators could, for example, deliver the same accelerating power as SLAC’s historic 2-mile-long copper accelerator (linac) in just a few meters.
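The scale of the gain can be estimated from the commonly quoted cold wave-breaking field of a plasma, E0[V/m] ≈ 96·√(n0[cm⁻³]). The numbers below are an order-of-magnitude sketch, not FACET-II design values:

    import math

    # Order-of-magnitude comparison of accelerating gradients.
    # Cold wave-breaking field of a plasma: E0 [V/m] ~ 96 * sqrt(n0 [cm^-3]).
    n0 = 1e17                         # cm^-3, assumed plasma density
    e_plasma = 96.0 * math.sqrt(n0)   # V/m
    e_copper = 20e6                   # V/m, assumed typical conventional RF gradient

    print(f"Plasma gradient:       {e_plasma / 1e9:6.1f} GV/m")
    print(f"Conventional gradient: {e_copper / 1e9:6.3f} GV/m")
    print(f"Ratio: ~{e_plasma / e_copper:,.0f}x")
    # -> ~30 GV/m versus ~0.02 GV/m: a few meters of plasma can, in
    #    principle, match kilometers of conventional accelerating structure.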

Aerial view of SLAC’s 2-mile-long linac. The longest linear accelerator ever built, it produced its first particle beams in 1966 and has been the lab’s backbone for accelerator-driven science ever since. (SLAC National Accelerator Laboratory)

    Researchers will use FACET-II for crucial developments before plasma accelerators can become a reality. “We need to show that we’re able to preserve the quality of the beam as it passes through plasma,” said SLAC’s Mark Hogan, FACET-II project scientist. “High-quality beams are an absolute requirement for future applications in particle and X-ray laser physics.”

    The FACET-II facility is currently funded to operate with electrons, but its design allows adding the capability to produce and accelerate positrons later – a step that would enable the development of plasma-based electron-positron particle colliders for particle physics experiments.

Future particle colliders will require highly efficient acceleration methods for both electrons and positrons. Plasma wakefield acceleration of both particle types, as shown in this simulation, could lead to smaller and more powerful colliders than today’s machines. (F. Tsung/W. An/UCLA; Greg Stewart/SLAC National Accelerator Laboratory)

    Another important objective is the development of novel electron sources that could lead to next-generation light sources, such as brighter-than-ever X-ray lasers. These powerful discovery machines provide scientists with unprecedented views of the ever-changing atomic world and open up new avenues for research in chemistry, biology and materials science.

    Other science goals for FACET-II include compact wakefield accelerators that use certain electrical insulators (dielectrics) instead of plasma, as well as diagnostics and computational tools that will accurately measure and simulate the physics of the new facility’s powerful electron beams. Science goals are being developed with regular input from the FACET user community.

    “The approval for FACET-II is an exciting milestone for the science community,” said Chandrashekhar Joshi, a researcher from the University of California, Los Angeles, and longtime collaborator of SLAC’s plasma acceleration team. “The facility will push the boundaries of accelerator science, discover new and unexpected physics and substantially contribute to the nation’s coordinated effort in advanced accelerator R&D.”

    Fast Track to First Experiments

    To complete the facility, crews will install an electron source and magnets to compress electron bunches, as well as new shielding, said SLAC’s Carsten Hast, FACET-II technical director. “We’ll also upgrade the facility’s control systems and install tools to analyze the beam properties.”

    FACET-II will use one kilometer (one-third) of the SLAC linac – sending electrons from the source at one end to the experimental area at the other end – to generate an electron beam with an energy of 10 billion electronvolts that will drive the facility’s versatile research program.

    FACET-II has issued its first call for proposals for experiments that will run when the facility goes online in 2020.

    “The project team has done an outstanding job in securing DOE approval for the facility,” said DOE’s Hannibal Joma, federal project director for FACET-II. “We’ll now deliver the project on time for the user program at SLAC.”

    SLAC’s Selina Green, project manager, said, “After two years of very hard work, it’s very exciting to see the project finally come together. Thanks to the DOE’s continued support we’ll soon be able to open FACET-II for groundbreaking new science.”

Members of SLAC’s FACET-II project team. From left: Nate Lipkowitz, Kevin Turner, Carsten Hast, Lorenza Ladao, Gary Bouchard, Vitaly Yakimenko, Martin Johansson, Selina Green, Glen White, Eric Bong, Jerry Yocky. Not pictured: Lauren Alsberg, Jeff Chan, Karl Flick, Mark Hogan, John Seabury. (Dawn Harmer/SLAC National Accelerator Laboratory)

    For more information, please visit the website:

    FACET-II Website

    Press Office Contact:
    Andy Freeberg
    afreeberg@slac.stanford.edu
    (650) 926-4359

See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    SLAC is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. Located in Menlo Park, California, SLAC is operated by Stanford University for the DOE’s Office of Science.

     
  • richardmitnick 12:56 pm on June 10, 2018 Permalink | Reply
Tags: Accelerator Science, ATLAS Trigger

    From CERN ATLAS: “In conversation with Nick Ellis, one of the architects of the ATLAS Trigger” 

    From CERN ATLAS

    10th June 2018
    Kate Shaw

Nick Ellis with the ATLAS trigger system. (Image: K. Anthony/ATLAS Collaboration)

A long-standing member of the ATLAS Collaboration, CERN physicist Nick Ellis was one of the original architects of the ATLAS Trigger. Working in the 1980s and 1990s, Nick led groups developing innovative ways to move and process huge quantities of data for the next generation of colliders, a challenge some thought was impossible to meet. Nick currently leads the CERN ATLAS Trigger and Data Acquisition Group, and here he shares his wealth of experience as a key part of the ATLAS Collaboration.

    I first became involved in what was to become the ATLAS Collaboration in the mid- to late-1980s. I had been working on the UA1 experiment at CERN’s SPS proton–antiproton collider for several years on various physics analyses and also playing a leading role on the UA1 trigger.

    People were starting to think about experiments for higher-energy machines, such as the Large Hadron Collider (LHC) and the never-completed Superconducting Super Collider (SSC). Of course, at this point there was no ATLAS or CMS or even the precursors. There were just groups of people getting together to discuss ideas.

    I remember a first discussion I had about possibilities for the trigger in LHC experiments was over a coffee in CERN’s Restaurant 1 with Peter Jenni. He was on the UA2 experiment at the time and, together with a number of colleagues, was developing ideas for an LHC experiment. Peter later went on to lead the ATLAS Collaboration for over a decade. He told me that nobody was looking at how the trigger system might be designed, and he asked if I would like to develop something. So I did.

The ATLAS trigger is a multilevel system that selects events that are potentially interesting for physics studies from a much larger number of events. It is very challenging, since we start with an interaction rate of the order of a billion per second. In the first stage of the selection, which has to be completed within a few millionths of a second, the event rate must be reduced to about 100 kHz, four orders of magnitude below the interaction rate; in other words, only one in ten thousand collisions can give rise to a first-level trigger. Note that each event, corresponding to a given bunch crossing, contains many tens of interactions. The rate must then be brought down by a further two orders of magnitude before the data is recorded for offline analysis.
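The rate arithmetic is simple but worth making explicit; a small sketch using the order-of-magnitude numbers quoted above:

    # Rough trigger-rate bookkeeping for a multilevel trigger,
    # using the order-of-magnitude rates quoted in the text.
    interaction_rate = 1e9   # Hz, ~a billion interactions per second
    level1_rate = 1e5        # Hz, 100 kHz after the first-level trigger
    recorded_rate = 1e3      # Hz, ~two further orders of magnitude lower

    print(f"Level-1 rejection:  1 in {interaction_rate / level1_rate:,.0f}")
    print(f"Further rejection:  1 in {level1_rate / recorded_rate:,.0f}")
    print(f"Overall:            1 in {interaction_rate / recorded_rate:,.0f}")
    # -> only about one collision in a million is kept for offline analysis.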

    When I start working on such a complex technical problem, I sit down with a pen and paper and draw diagrams. It’s important to visualise the system. A trigger and data-acquisition system is complicated – you have data being produced, data being processed, data being moved. So, I make a sketch with arrows, writing down order of magnitude numbers, what has to talk to what, what signals have to be sent. These are very rough notes! I doubt anyone other than me would be able to read my sketches that fed into the early designs of ATLAS’ trigger.

    Though I was specifically looking at the first-level calorimeter trigger, which was what I was working on at UA1, I was interested in the trigger more generally. At the time, we did not know that the future held so much possibility in terms of programmable logic. The early ideas for the first-level trigger were based on relatively primitive electronics: modules with discrete logic, memories and some custom integrated circuits.

    There was also concern that the second-level trigger processing would be hard to implement, because those triggers would require too much data to move and too much data to process. Here, the first thing I had to do was to demonstrate that it could be done at all! I carried out an intellectual exercise to try and factorise the problem, to the maximal extent possible. I was driven to do this because it was so interesting, and it was virgin territory. There were no constraints on ideas that could be explored.

    My initial studies were on a maximally-factorised model, the so-called “local–global scheme”. It was never my objective that one would necessarily implement this exact scheme, but I used it as the basis for brainstorming a region-of-interest (ROI) strategy for the trigger. The triggers would look at specified regions of the detector, identified by the first-level trigger, for features of interest, rather than trying to search for features everywhere in the event. This exercise demonstrated that, at any given point in the system, you could get the data movement and computation down to a manageable level.
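A toy estimate shows why the region-of-interest idea tames the data volume: if the first-level trigger flags only a handful of small η-φ windows, the second level touches just a few percent of the event. Every number below is an assumption chosen for illustration:

    import math

    # Toy estimate of data-volume savings from a region-of-interest (ROI)
    # strategy. All numbers are assumed, purely for illustration.
    detector_area = 5.0 * 2 * math.pi   # eta range [-2.5, 2.5] x full phi
    roi_area = 0.4 * 0.4                # assumed eta-phi window per ROI
    n_rois = 5                          # assumed ROIs per event

    fraction_read = n_rois * roi_area / detector_area
    print(f"Fraction of detector data accessed at level 2: {fraction_read:.1%}")
    # -> a few percent of the event, instead of the full detector readout,
    #    which is what made the second-level processing tractable.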

The ATLAS Level-1 Calorimeter Trigger, located underground in a cavern adjacent to the experiment. (Image: K. Anthony/ATLAS Collaboration)

I, along with a few colleagues, developed this exercise into a study that we presented at the 1990 Large Hadron Collider workshop in Aachen, Germany. In the end, thanks to technological progress, it was not necessary to exploit all the ingredients used in the study. More specifically, instead of separating the processing for each ROI and for each detector, we were able to use a single processor to fully process all of the ROIs in an event. The use of the first-level trigger to guide the second-level data access and processing became a key part of the ATLAS trigger philosophy.

    In the years following the Aachen workshop, the ATLAS and CMS experiments began to take shape. It was a really exciting time, and the number of people involved was tiny in comparison to today. You could do anything and everything; you could come with completely new ideas!

    When first beams and first collisions finally came, things went more smoothly than I had ever dared to hope.

First collisions in ATLAS. (Image: Claudia Marcelloni/ATLAS Experiment)

    We had spent a lot of time planning for what would come when the first single beam came, when the first collisions came, what would we do, in what order, what might go wrong and how could we mitigate it. It has always been in my nature to think ahead about all the potential problems and make plans that let us avoid future issues, ensuring that systems are robust so that a local problem does not become a global problem. Thanks to the work of excellent, dedicated colleagues, everything went really well for first collisions!

    Clearly ATLAS has a long future ahead of it, although we will always face challenges: the upgrades we have planned are by no means trivial! Even with our existing infrastructure and experience, there will no doubt be obstacles that we will have to overcome.

    And, of course, in the even longer term, CERN itself could change, depending on what happens in physics and on the global stage. It wouldn’t be the first laboratory to do so – just look at DESY and SLAC.


    Even Fermilab has changed from a collider to a neutrino facility. We never know where the next big discovery will lead us!



See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition


     
  • richardmitnick 7:38 pm on June 5, 2018 Permalink | Reply
Tags: Accelerator Science, Catching hadronic vector boson decays with a finer net

    From CERN ATLAS: “Catching hadronic vector boson decays with a finer net” 

    From CERN ATLAS

    5th June 2018
    ATLAS Collaboration

Figure 1: ATLAS event display showing two electroweak boson candidates with an invariant mass of 5 TeV, the highest observed in the analysis. Energy deposits in the ATLAS calorimeters are shown in green and yellow rectangles. The angular resolution limits the reconstruction of substructure of highly collimated jets. The view in the top left corner shows the higher angular resolution of the first calorimeter layer and the tracker (orange lines), revealing the striking two-prong substructure in the energy flow. (Image: ATLAS Collaboration/CERN)

ATLAS has been collecting increasing amounts of data at a centre-of-mass energy of 13 TeV to unravel some of the big mysteries in physics today. For instance, why is the Higgs boson so much lighter than one would expect? Why is gravity so weak?

Many theoretical models predict that new physics, which could provide answers to these questions, could manifest itself as yet-undiscovered massive particles. Among these are massive new particles that would decay to much lighter, high-momentum electroweak bosons (W and Z). These in turn decay, and the most common signature would be pairs of highly collimated bundles of particles, known as jets. So far, no evidence of such new particles has been uncovered.

    The ability to distinguish jets initiated by decays of W or Z bosons from those initiated by other processes is critical for the success of these searches. While the energy flow from the bosons exhibits a distinct two-prong structure from the two-body decay of the boson, no such feature exists for jets from a single quark or gluon – the latter being the most frequent scattering products when colliding protons.

    In the past, ATLAS identified this two-prong structure using its fine-grained calorimeter, which measures the energy of the particles inside jets with good resolution. However, in very energetic jets from decays of particles with masses of multiple TeV, the average separation of these prongs is comparable to the segmentation of the ATLAS calorimeter. This creates confusion within the algorithms responsible for identifying the bosons, limiting our sensitivity to new physics at high masses. In contrast to the calorimeter, the ATLAS inner tracking detector reconstructs charged particles with excellent angular resolution, but it lacks sufficient momentum resolution.
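A rule of thumb makes the problem concrete: for a boosted two-body decay, the angular separation of the two prongs scales as ΔR ≈ 2m/pT. The sketch below compares this with a calorimeter cell size of roughly 0.1 x 0.1 in η-φ; the cell size and momenta are illustrative assumptions:

    # Rule of thumb for the angular separation of a boosted two-prong decay:
    #   Delta R ~ 2 * m / pT
    # compared with an assumed hadronic-calorimeter cell size of ~0.1 x 0.1.
    m_w = 80.4        # GeV, W boson mass
    cell_size = 0.1   # assumed calorimeter granularity in eta-phi

    for pt in (500.0, 1000.0, 2500.0):  # GeV, jet transverse momentum
        delta_r = 2.0 * m_w / pt
        note = "resolvable" if delta_r > cell_size else "below cell size"
        print(f"pT = {pt:6.0f} GeV : Delta R ~ {delta_r:.3f} ({note})")
    # -> above roughly 1.5 TeV the two prongs fall within a single calorimeter
    #    cell, which is why the tracker's finer angular information helps.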

Figure 2: Comparison between the current and previous limits on the cross section times branching ratio for a hypothetical particle V versus its mass. Lower vertical values represent higher sensitivity to new physics. Due to the improvements in the analysis techniques, the current result extends our reach for new physics far beyond what we would get from only increasing the size of the data set (middle blue line). (Image: ATLAS Collaboration/CERN)

A new ATLAS analysis combines the angular information of charged particles reconstructed by the inner detector with the energy information from the calorimeter. This lets ATLAS physicists overcome the limitations in identifying very energetic jets from bosons. Similar to increasing the magnification of a microscope, this improvement to the ATLAS event reconstruction software allows it to better resolve the energy flow in very energetic jets, and the improved resolution in turn allows physicists to optimize their analysis techniques.

    Making such improvements while collecting more data is necessary to maximize the potential for discovery when exploring new kinematic regimes. This time no new physics was seen, but the technique can be applied to many more searches – and still larger datasets.

Related journal articles: see the full article for further references with links.

See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition


     