Tagged: Accelerator Science

  • richardmitnick 11:37 am on August 15, 2019
    Tags: Accelerator Science, Azure ML

    From Fermi National Accelerator Lab: “A glimpse into the future: accelerated computing for accelerated particles” 

    FNAL Art Image by Angela Gonzales

    From Fermi National Accelerator Lab, an enduring source of strength for the US contribution to scientific research worldwide.

    August 15, 2019
    Leah Hesla

    Every proton collision at the Large Hadron Collider is different, but only a few are special. The special collisions generate particles in unusual patterns — possible manifestations of new, rule-breaking physics — or help fill in our incomplete picture of the universe.

    Finding these collisions is harder than the proverbial search for the needle in the haystack. But game-changing help is on the way. Fermilab scientists and other collaborators successfully tested a prototype machine-learning technology that speeds up processing by 30 to 175 times compared to traditional methods.

    Confronting 40 million collisions every second, scientists at the LHC use powerful, nimble computers to pluck the gems — whether it’s a Higgs particle or hints of dark matter — from the vast static of ordinary collisions.

    Rifling through simulated LHC collision data, the machine learning technology successfully learned to identify a particular postcollision pattern — a particular spray of particles flying through a detector — as it flipped through an astonishing 600 images per second. Traditional methods process less than one image per second.
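To put those rates in perspective, here is a back-of-the-envelope comparison using only the figures quoted above (the batch size is an invented, illustrative number):

```python
images = 1_000_000  # an illustrative batch of simulated LHC collision images

ml_rate = 600    # images per second with the accelerated machine learning setup
trad_rate = 1    # traditional methods process less than one image per second

ml_hours = images / ml_rate / 3600      # roughly half an hour
trad_hours = images / trad_rate / 3600  # roughly 278 hours, more than 11 days

print(f"accelerated ML: {ml_hours:.1f} h, traditional: {trad_hours:.0f} h")
```

At these rates, a job that once tied up a computing farm for days finishes between coffee breaks.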

    The technology could even be offered as a service on external computers. Using this offloading model would allow researchers to analyze more data more quickly and leave more LHC computing space available to do other work.

    It is a promising glimpse into how machine learning services are supporting a field in which already enormous amounts of data are only going to get bigger.

    Particles emerging from proton collisions at CERN’s Large Hadron Collider travel through this stories-high, many-layered instrument, the CMS detector. In 2026, the LHC will produce 20 times the data it does currently, and CMS is undergoing upgrades to read and process the data deluge. Photo: Maximilien Brice, CERN

    The challenge: more data, more computing power

    Researchers are currently upgrading the LHC to smash protons at five times its current rate.

    By 2026, the 17-mile circular underground machine at the European laboratory CERN will produce 20 times more data than it does now.

    LHC

    CERN map


    CERN LHC Maximilien Brice and Julien Marius Ordan


    CERN LHC particles

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS

    CERN ATLAS Image Claudia Marcelloni CERN/ATLAS

    ALICE

    CERN/ALICE Detector

    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    CMS is one of the particle detectors at the Large Hadron Collider, and CMS collaborators are in the midst of some upgrades of their own, enabling the intricate, stories-high instrument to take more sophisticated pictures of the LHC’s particle collisions. Fermilab is the lead U.S. laboratory for the CMS experiment.

    If LHC scientists wanted to save all the raw collision data they’d collect in a year from the High-Luminosity LHC, they’d have to find a way to store about 1 exabyte (roughly a million terabyte-sized personal external hard drives), of which only a sliver may unveil new phenomena. LHC computers are programmed to select this tiny fraction, making split-second decisions about which data is valuable enough to be sent downstream for further study.

    Currently, the LHC’s computing system keeps roughly one in every 100,000 particle events. But current storage protocols won’t be able to keep up with the future data flood, which will accumulate over decades of data taking. And the higher-resolution pictures captured by the upgraded CMS detector won’t make the job any easier. It all translates into a need for more than 10 times the computing resources the LHC has now.
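That one-in-100,000 selection can be sketched as a toy filter. The scores, threshold and event counts below are invented for illustration and are not the LHC’s actual trigger logic:

```python
import random

KEEP_FRACTION = 1 / 100_000  # roughly one collision event in 100,000 survives

def passes_trigger(score, threshold):
    """Toy trigger decision: keep only events whose 'interest' score clears the bar."""
    return score >= threshold

random.seed(0)
# Stand-in for a stream of collision events, each scored by reconstruction software.
event_scores = (random.random() for _ in range(1_000_000))

# Pick a threshold so that about KEEP_FRACTION of uniform random scores pass.
threshold = 1.0 - KEEP_FRACTION
kept = sum(1 for s in event_scores if passes_trigger(s, threshold))

print(f"kept {kept} of 1,000,000 simulated events")
```

The real trigger makes this decision in microseconds, per event, 40 million times a second.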

    The recent prototype test shows that, with advances in machine learning and computing hardware, researchers expect to be able to winnow the data emerging from the upcoming High-Luminosity LHC when it comes online.

    “The hope here is that you can do very sophisticated things with machine learning and also do them faster,” said Nhan Tran, a Fermilab scientist on the CMS experiment and one of the leads on the recent test. “This is important, since our data will get more and more complex with upgraded detectors and busier collision environments.”

    Particle physicists are exploring the use of computers with machine learning capabilities for processing images of particle collisions at CMS, teaching them to rapidly identify various collision patterns. Image: Eamonn Maguire/Antarctic Design

    Machine learning to the rescue: the inference difference

    Machine learning in particle physics isn’t new. Physicists use machine learning for every stage of data processing in a collider experiment.

    But with machine learning technology that can chew through LHC data up to 175 times faster than traditional methods, particle physicists are ascending a game-changing step on the collision-computation course.

    The rapid rates are thanks to cleverly engineered hardware in the platform, Microsoft’s Azure ML, which speeds up a process called inference.

    To understand inference, consider an algorithm that’s been trained to recognize the image of a motorcycle: The object has two wheels and two handles that are attached to a larger metal body. The algorithm is smart enough to know that a wheelbarrow, which has similar attributes, is not a motorcycle. As the system scans new images of other two-wheeled, two-handled objects, it predicts — or infers — which are motorcycles. And as the algorithm’s prediction errors are corrected, it becomes pretty deft at identifying them. A billion scans later, it’s on its inference game.
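The motorcycle analogy maps onto a minimal classifier sketch. The features, weights and bias below are invented for illustration; a real model would learn millions of such parameters during training:

```python
def infer(features, weights, bias):
    """Inference step: apply learned weights to new features and predict a label."""
    score = sum(f * w for f, w in zip(features, weights)) + bias
    return score > 0  # True -> "motorcycle"

# Values a training phase might have produced (illustrative only).
weights, bias = (0.5, 0.5, 2.0), -3.0

motorcycle = (2, 2, 1)   # two wheels, two handles, an engine
wheelbarrow = (1, 2, 0)  # one wheel, two handles, no engine

print(infer(motorcycle, weights, bias))   # True
print(infer(wheelbarrow, weights, bias))  # False
```

Training adjusts `weights` and `bias` as prediction errors are corrected; inference is the cheap, repeatable forward pass, and that is the part specialized hardware accelerates.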

    Most machine learning platforms are built to understand how to classify images, but not physics-specific images. Physicists have to teach them the physics part, such as recognizing tracks created by the Higgs boson or searching for hints of dark matter.

    Researchers at Fermilab, CERN, MIT, the University of Washington and other collaborators trained Azure ML to identify pictures of top quarks — a short-lived elementary particle that is about 180 times heavier than a proton — from simulated CMS data. Specifically, Azure was to look for images of top quark jets, clouds of particles pulled out of the vacuum by a single top quark zinging away from the collision.

    “We sent it the images, training it on physics data,” said Fermilab scientist Burt Holzman, a lead on the project. “And it exhibited state-of-the-art performance. It was very fast. That means we can pipeline a large number of these things. In general, these techniques are pretty good.”

    One of the techniques behind inference acceleration is to combine traditional processors with specialized ones, a marriage known as heterogeneous computing architecture.

    Different platforms use different architectures. The traditional processors are CPUs (central processing units). The best known specialized processors are GPUs (graphics processing units) and FPGAs (field programmable gate arrays). Azure ML combines CPUs and FPGAs.
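The division of labor in such an architecture can be sketched in a few lines. The function names and the stand-in “accelerator” below are hypothetical, not Azure ML’s API; they only illustrate the CPU-prepares, accelerator-computes split:

```python
from concurrent.futures import ThreadPoolExecutor

def cpu_preprocess(raw_pixels):
    """General-purpose work stays on the CPU: decode and normalize the data."""
    return [p / 255.0 for p in raw_pixels]

def accelerator_infer(tensor):
    """Stand-in for the FPGA kernel that performs the heavy inference math."""
    return sum(tensor)  # placeholder for billions of multiply-accumulates

def process(raw_pixels):
    # Heterogeneous pipeline: CPU prepares the input, accelerator computes.
    return accelerator_infer(cpu_preprocess(raw_pixels))

with ThreadPoolExecutor() as pool:
    results = list(pool.map(process, [[0, 128, 255], [255, 255, 0]]))
print(results)
```

The design point is that each processor does what it is best at: CPUs handle flexible orchestration and I/O, while the FPGA burns through the fixed, massively parallel arithmetic.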

    “The reason that these processes need to be accelerated is that these are big computations. You’re talking about 25 billion operations,” Tran said. “Fitting that onto an FPGA, mapping that on, and doing it in a reasonable amount of time is a real achievement.”

    And it’s starting to be offered as a service, too. The test was the first time anyone has demonstrated how this kind of heterogeneous, as-a-service architecture can be used for fundamental physics.

    Data from particle physics experiments are stored on computing farms like this one, the Grid Computing Center at Fermilab. Outside organizations offer their computing farms as a service to particle physics experiments, making more space available on the experiments’ servers. Photo: Reidar Hahn

    At your service

    In the computing world, using something “as a service” has a specific meaning. An outside organization provides resources — machine learning or hardware — as a service, and users — scientists — draw on those resources when needed. It’s similar to how your video streaming company provides hours of binge-watching TV as a service. You don’t need to own your own DVDs and DVD player. You use their library and interface instead.

    Data from the Large Hadron Collider is typically stored and processed on computer servers at CERN and partner institutions such as Fermilab. With machine learning offered up as easily as any other web service might be, intensive computations can be carried out anywhere the service is offered — including off site. This bolsters the labs’ capabilities with additional computing power and resources while sparing them from having to furnish their own servers.

    “The idea of doing accelerated computing has been around decades, but the traditional model was to buy a computer cluster with GPUs and install it locally at the lab,” Holzman said. “The idea of offloading the work to a farm off site with specialized hardware, providing machine learning as a service — that worked as advertised.”

    The Azure ML farm is in Virginia. It takes only 100 milliseconds for computers at Fermilab near Chicago, Illinois, to send an image of a particle event to the Azure cloud, process it, and return it. That’s a 2,500-kilometer, data-dense trip in the blink of an eye.
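A quick sanity check on that round trip: even at the speed of light, a 2,500-kilometer path imposes an irreducible delay, so the quoted 100 milliseconds also covers network routing, serialization and the inference itself.

```python
SPEED_OF_LIGHT_KM_PER_S = 299_792  # in vacuum; signals in fiber travel ~30% slower

distance_km = 2_500    # roughly Fermilab (Illinois) to the Azure farm (Virginia)
observed_rtt_ms = 100  # round-trip time reported for send, process, return

# Hard physical floor on the round trip, ignoring everything but distance:
light_rtt_ms = 2 * distance_km / SPEED_OF_LIGHT_KM_PER_S * 1000

print(f"light-speed floor: {light_rtt_ms:.1f} ms of the {observed_rtt_ms} ms observed")
```

In other words, well over 80 of the 100 milliseconds go to the computation and the plumbing, not the distance.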

    “The plumbing that goes with all of that is another achievement,” Tran said. “The concept of abstracting that data as a thing you just send somewhere else, and it just comes back, was the most pleasantly surprising thing about this project. We don’t have to replace everything in our own computing center with a whole bunch of new stuff. We keep all of it, send the hard computations off and get it to come back later.”

    Scientists look forward to scaling the technology to tackle other big-data challenges at the LHC. They also plan to test other platforms, such as Amazon AWS, Google Cloud and IBM Cloud, as they explore what else can be accomplished through machine learning, which has seen rapid evolution over the past few years.

    “The models that were state-of-the-art for 2015 are standard today,” Tran said.

    As a tool, machine learning continues to give particle physics new ways of glimpsing the universe. It’s also impressive in its own right.

    “That we can take something that’s trained to discriminate between pictures of animals and people, do some modest amount of computation, and have it tell me the difference between a top quark jet and background?” Holzman said. “That’s something that blows my mind.”

    This work is supported by the DOE.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    FNAL Icon

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics. Fermilab is America’s premier laboratory for particle physics and accelerator research, funded by the U.S. Department of Energy. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.

     
  • richardmitnick 12:48 pm on August 13, 2019
    Tags: Accelerator Science, MIT Plasma Science and Fusion Center (PSFC), megawatt gyrotron, plasma, University of Washington Advanced Propulsion Laboratory and Space Plasma Simulation Laboratory

    From MIT News: “Julian Picard: Chopping microwaves, sharpening instincts” 


    August 12, 2019
    Paul Rivenberg | Plasma Science and Fusion Center

    “One of the reasons I came back to grad school was to be steeped in something for a long time,” says Julian Picard, who works in MIT’s Plasma Science and Fusion Center. “After spending so long working hard on something, you start to develop a gut instinct.” Photo: Paul Rivenberg

    MIT graduate student slices microwave pulses to test advanced accelerators.

    “Looking through microscopes has never been my thing,” says Julian Picard.

    As a graduate student in the Department of Physics, Picard works with the invisible world of particles and electromagnetic waves every day, yet he is motivated by the goal of creating something very visible, “something you can hold in your hand.” His study of the microwaves that speed from the megawatt gyrotron at MIT’s Plasma Science and Fusion Center (PSFC) could lead the way to smaller and more powerful particle accelerators, the kind of finished product Picard finds rewarding.

    Picard became interested in plasma as an undergraduate at the University of Washington in Seattle. His student research at their Advanced Propulsion Laboratory and Space Plasma Simulation Laboratory prepared him for an internship, and later a research engineer position, at Eagle Harbor Technologies. Working there on plasma generation and pulsed power supplies, he admired the way the most experienced scientists seemed to solve problems “intuitively.”

    “That was inspiring to me,” he says. “One of the reasons I came back to grad school was to be steeped in something for a long time. After spending so long working hard on something, you start to develop a gut instinct.”

    Picard notes it was difficult to find a graduate program that would provide him with a deep physics background, along with the opportunity to apply his understanding to a practical plasma project.

    “That is what drives me,” Picard says. “I want to understand how something works well enough to apply it in a new way. To me, it feels vacuous to try to design something without understanding how it works. That’s why I wanted to find a program in physics: I wanted to continue developing my background in basic science, and then be able to apply it to a variety of things.”

    He discovered what he wanted at the PSFC in the Plasma Science and Technology Group, headed by Richard Temkin, who introduced him to the center’s megawatt gyrotron, the source of microwaves for a new project to test particle accelerator cavities.

    Particle accelerators, besides being essential tools for studying the universe, have practical applications including medical instrument sterilization, computer chip manufacture, material identification and radioisotope production for cancer treatment. Accelerators typically run successfully at low frequency (around 1 gigahertz), but researchers have long suspected that running them at higher frequencies would allow them to be made smaller and more efficient, improving convenience and possibly reducing expense.

    Although the PSFC megawatt gyrotron is capable of producing microwaves at the higher frequency of 110 GHz, the length of the pulse would melt any accelerator cavity it passed through. Researchers needed to find a way to shorten that pulse.

    In an article for Applied Physics Letters, Picard describes the experimental setup that allowed researchers to “chop” the pulse. The piece received the Outstanding Student Paper Award from the IEEE Nuclear and Plasma Sciences Society at the 2019 Pulsed Power and Plasma Science Conference in June.

    To shorten the pulse, PSFC researchers strategically arranged a wafer of silicon in the path of the microwaves. Typically, microwaves would pass straight through this. However, a laser directed onto the wafer creates a type of plasma inside the silicon that will reflect the microwaves for as long as the laser is on. Those reflected high-frequency microwaves can be directed into the accelerator, and the pulse chopped to a manageable length (10 nanoseconds) simply by turning off the laser.

    The laser-targeted wafer does not reflect all the microwaves; about 30 percent are absorbed by or pass through the silicon. Picard’s study showed, however, that as the gyrotron power increased toward a megawatt, the wafer reflected more. Instead of reflecting 70 percent of the microwaves, it reflected closer to 80 or 85 percent.
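Those percentages translate directly into usable pulse energy. A rough estimate with round numbers from the article (roughly 1 megawatt of incident power, a 10-nanosecond chopped pulse); the exact experimental figures will differ:

```python
def delivered_energy_mj(incident_mw, reflectivity, pulse_ns):
    """Energy in the chopped pulse that reaches the test cavity, in millijoules."""
    watts = incident_mw * 1e6
    seconds = pulse_ns * 1e-9
    return watts * reflectivity * seconds * 1e3  # joules -> millijoules

low = delivered_energy_mj(1.0, 0.70, 10)   # 70% reflection: about 7.0 mJ
high = delivered_energy_mj(1.0, 0.85, 10)  # 85% reflection: about 8.5 mJ

print(f"{low:.1f} mJ -> {high:.1f} mJ per pulse as reflectivity improves")
```

A 15-point gain in reflectivity means roughly a fifth more energy per pulse available for driving accelerator structures.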

    “This effect had never been seen before because nobody could test at the higher power levels,” says Picard. “Reflection becomes more efficient at higher powers compared to lower powers. That means there is more power available, so we can test more interesting accelerator structures.”

    The PSFC is working with a group from Stanford University that designs accelerator cavities, which can now be tested with the “Megawatt Microwave Pulse Chopper.”

    Picard is pleased with the experiment.

    “What I’ve really liked about this project is that, at the end of the day, we have a device that makes a short pulse,” he says. “That’s a deliverable. It’s satisfying and motivating.”

    See the full article here.


    Please help promote STEM in your local schools.


    Stem Education Coalition

    MIT Seal

    The mission of MIT is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the MIT community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

    MIT Campus

     
  • richardmitnick 11:03 am on August 12, 2019
    Tags: “Atomic ‘Trojan horse’ could inspire new generation of X-ray lasers and particle colliders”, Accelerator Science, plasma wake, plasma wakefield acceleration, Trojan horse technique

    From SLAC National Accelerator Lab: “Atomic ‘Trojan horse’ could inspire new generation of X-ray lasers and particle colliders” 

    From SLAC National Accelerator Lab

    August 12, 2019
    Manuel Gnida

    Illustration, based on simulations, of the Trojan horse technique for the production of high-energy electron beams. A laser beam (red, at left) strips electrons (blue dots) off of helium atoms. Some of the freed electrons (red dots) get accelerated inside a plasma bubble (white elliptical shape) created by an electron beam (green). (Thomas Heinemann/University of Strathclyde)


    At SLAC’s FACET facility, researchers have produced an intense electron beam by ‘sneaking’ electrons into plasma, demonstrating a method that could be used in future compact discovery machines that explore the subatomic world.


    SLAC FACET

    How do researchers explore nature on its most fundamental level? They build “supermicroscopes” that can resolve atomic and subatomic details. This won’t work with visible light, but they can probe the tiniest dimensions of matter with beams of electrons, either by using them directly in particle colliders or by converting their energy into bright X-rays in X-ray lasers. At the heart of such scientific discovery machines are particle accelerators that first generate electrons at a source and then boost their energy in a series of accelerator cavities.

    Now, an international team of researchers, including scientists from the Department of Energy’s SLAC National Accelerator Laboratory, has demonstrated a potentially much brighter electron source based on plasma that could be used in more compact, more powerful particle accelerators.

    The method, in which the electrons for the beam are released from neutral atoms inside the plasma, is referred to as the Trojan horse technique because it’s reminiscent of the way the ancient Greeks are said to have invaded the city of Troy by hiding their forceful soldiers (electrons) inside a wooden horse (plasma), which was then pulled into the city (accelerator).

    “Our experiment shows for the first time that the Trojan horse method actually works,” says Bernhard Hidding from the University of Strathclyde in Glasgow, Scotland, the principal investigator of a study published today in Nature Physics. “It’s one of the most promising methods for future electron sources and could push the boundaries of today’s technology.”

    Replacing metal with plasma

    In current state-of-the-art accelerators, electrons are generated by shining laser light onto a metallic photocathode, which kicks electrons out of the metal. These electrons are then accelerated inside metal cavities, where they draw more and more energy from a radiofrequency field, resulting in a high-energy electron beam. In X-ray lasers, such as SLAC’s Linac Coherent Light Source (LCLS), the beam drives the production of extremely bright X-ray light.

    But metal cavities can only support a limited energy gain over a given distance, or acceleration gradient, before breaking down, and therefore accelerators for high-energy beams become very large and expensive. In recent years, scientists at SLAC and elsewhere have looked into ways to make accelerators more compact. They demonstrated, for example, that they can replace metal cavities with plasma that allows much higher acceleration gradients, potentially shrinking the length of future accelerators 100 to 1,000 times.
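The scaling here is straightforward: for a fixed beam energy, the accelerator’s length shrinks in proportion to the gradient gain. With an illustrative kilometer-scale linac (the length below is an invented round number, not a specific machine):

```python
conventional_length_m = 3000  # a ~3-km conventional linac (illustrative figure)

for gradient_gain in (100, 1000):
    # Plasma gradients 100 to 1,000 times higher shrink the machine proportionally.
    plasma_length_m = conventional_length_m / gradient_gain
    print(f"{gradient_gain}x gradient -> ~{plasma_length_m:g} m")
```

That is how kilometers of tunnel become the “few meters” quoted later in the article.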

    The new paper expands the plasma concept to the electron source of an accelerator.

    “We’ve previously shown that plasma acceleration can be extremely powerful and efficient, but we haven’t been able yet to produce beams with high enough quality for future applications,” says co-author Mark Hogan from SLAC. “Improving beam quality is a top priority for the next years, and developing new types of electron sources is an important part of that.”

    According to previous calculations [Nature Communications] by Hidding and colleagues, the Trojan horse technique could make electron beams 100 to 10,000 times brighter than today’s most powerful beams. Brighter electron beams would also make future X-ray lasers brighter and further enhance their scientific capabilities.

    “If we’re able to marry the two major thrusts – high acceleration gradients in plasma and beam creation in plasma – we could be able to build X-ray lasers that unfold the same power over a distance of a few meters rather than kilometers,” says co-author James Rosenzweig, the principal investigator for the Trojan horse project at the University of California, Los Angeles.

    3
    Animation illustrating the concept of the Trojan horse method. An electron bunch from SLAC’s FACET facility (bright spot at right) passes through hydrogen plasma (purple), which creates a plasma bubble (blue). As the bubble moves through the plasma at nearly the speed of light, a laser pulse strips electrons (white dots) off of neutral helium atoms inside the plasma. The released electrons are trapped in the tail of the bubble where they gain energy (bright spot at left). (Greg Stewart/SLAC National Accelerator Laboratory)

    Producing superior electron beams

    The researchers carried out their experiment at SLAC’s Facility for Advanced Accelerator Experimental Tests (FACET). The facility, which is currently undergoing a major upgrade, generates pulses of highly energetic electrons for research on next-generation accelerator technologies, including plasma acceleration.

    First, the team flashed laser light into a mixture of hydrogen and helium gas. The light had just enough energy to strip electrons off hydrogen, turning neutral hydrogen into plasma. It wasn’t energetic enough to do the same with helium, whose electrons are more tightly bound than those of hydrogen, so the helium stayed neutral inside the plasma.

    Then, the scientists sent one of FACET’s electron bunches through the plasma, where it produced a plasma wake, much like a motorboat creates a wake when it glides through the water. Trailing electrons can “surf” the wake and gain tremendous amounts of energy.

    In this study, the trailing electrons came from within the plasma (see animation above and movie below). Just when the electron bunch and its wake passed by, the researchers zapped the helium in the plasma with a second, tightly focused laser flash. This time the light pulse had enough energy to kick electrons out of the helium atoms, and the electrons were then accelerated in the wake.

    The synchronization between the electron bunch, rushing through the plasma with nearly the speed of light, and the laser flash, lasting merely a few millionths of a billionth of a second, was particularly important and challenging, says UCLA’s Aihua Deng, one of the study’s lead authors: “If the flash comes too early, the electrons it produces will disturb the formation of the plasma wake. If it comes too late, the plasma wake has moved on and the electrons won’t get accelerated.”

    The researchers estimate that the brightness of the electron beam obtained with the Trojan horse method can already compete with the brightness of existing state-of-the-art electron sources.

    “What makes our technique transformative is the way the electrons are produced,” says Oliver Karger, the other lead author, who was at the University of Hamburg, Germany, at the time of the study. When the electrons are stripped off the helium, they get rapidly accelerated in the forward direction, which keeps the beam narrowly bundled and is a prerequisite for brighter beams.


    Computer simulation of the Trojan horse method. An electron bunch from SLAC’s FACET facility passed through hydrogen plasma and created a plasma bubble. A laser flash (dark red circular area) strips electrons off of neutral helium atoms (not shown) drifting inside the plasma. The released electrons are sucked into the tail of the bubble (trajectories shown in green), where they gain energy (color change from black to orange dots). The plasma bubble, shown stationary here, travels through the plasma with nearly the speed of light. (Daniel Ullmann and Andrew Beaton/University of Strathclyde)

    More R&D work ahead

    But before applications like compact X-ray lasers could become a reality, much more research needs to be done.

    Next, the researchers want to improve the quality and stability of their beam and work on better diagnostics that will allow them to measure the actual beam brightness, instead of estimating it.

    These developments will be done once the FACET upgrade, FACET-II, is completed. “The experiment relies on the ability to use a strong electron beam to produce the plasma wake,” says Vitaly Yakimenko, director of SLAC’s FACET Division. “FACET-II will be the only place in the world that will produce such beams with high enough intensity and energy.”


    Animation, based on simulations, of the Trojan horse technique for the production of high-energy electron beams in perpendicular geometry (90 degrees between laser and electron beams) as realized at SLAC. A laser beam (red, from right to left) strips electrons (blue dots) off of helium atoms. Some of the freed electrons (purplish to yellowish dots) get accelerated inside a plasma bubble (white elliptical shape) created by an electron beam (green). (Thomas Heinemann/University of Strathclyde)


    Animation, based on simulations, of the Trojan horse technique for the production of high-energy electron beams in collinear geometry (laser and electron beams aligned). A focused laser beam (orange-red) strips electrons (initially blue dots) off of helium atoms. All these freed electrons get accelerated (visualized by increasingly green color) inside a plasma bubble (white elliptical shape) created by an electron beam (green). (Thomas Heinemann, Andrew Beaton/University of Strathclyde)

    Other partners involved in the project were Sci-Tech Daresbury, UK; the German research center DESY; the University of Colorado Boulder; the University of Oslo, Norway; the University of Texas at Austin; RadiaBeam Technologies; RadiaSoft LLC; and Tech-X Corporation. Large parts of this work were funded by the DOE Office of Science. LCLS and FACET are DOE Office of Science user facilities.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    SLAC/LCLS


    SLAC/LCLS II projected view


    SLAC is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. Located in Menlo Park, California, SLAC is operated by Stanford University for the DOE’s Office of Science.

     
  • richardmitnick 10:18 am on August 12, 2019
    Tags: Accelerator Science, Cryomodules and Cavities, LCLS-II, SLAC’s linear particle accelerator

    From Fermi National Accelerator Lab: “A million pulses per second: How particle accelerators are powering X-ray lasers” 

    FNAL Art Image by Angela Gonzales

    From Fermi National Accelerator Lab, an enduring source of strength for the US contribution to scientific research worldwide.

    August 12, 2019
    Caitlyn Buongiorno

    About 10 years ago, the world’s most powerful X-ray laser — the Linac Coherent Light Source — made its debut at SLAC National Accelerator Laboratory. Now the next revolutionary X-ray laser in a class of its own, LCLS-II, is under construction at SLAC, with support from four other DOE national laboratories.

    SLAC LCLS-II

    Researchers in biology, chemistry and physics will use LCLS-II to probe fundamental pieces of matter and create 3-D movies of complex molecules in action, making it a powerful, versatile instrument at the forefront of discovery.

    The project is coming together thanks largely to a crucial advance in the fields of particle and nuclear physics: superconducting accelerator technology. DOE’s Fermilab and Thomas Jefferson National Accelerator Facility are building the superconducting modules necessary for the accelerator upgrade for LCLS-II.

    SLAC National Accelerator Laboratory is upgrading its Linac Coherent Light Source, an X-ray laser, to be a more powerful tool for science. Both Fermilab and Thomas Jefferson National Accelerator Facility are contributing to the machine’s superconducting accelerator, seen here in the left part of the diagram. Image: SLAC

    A powerful tool for discovery

    Inside SLAC’s linear particle accelerator today, bursts of electrons are accelerated to energies that allow LCLS to fire off 120 X-ray pulses per second. These pulses last for quadrillionths of a second – a time scale known as a femtosecond – providing scientists with a flipbook-like look at molecular processes.

    “Over time, you can build up a molecular movie of how different systems evolve,” said SLAC scientist Mike Dunne, director of LCLS. “That’s proven to be quite remarkable, but it also has a number of limitations. That’s where LCLS-II comes in.”

    Using state-of-the-art particle accelerator technology, LCLS-II will provide a staggering million pulses per second. The advance will provide a more detailed look into how chemical, material and biological systems evolve on a time scale in which chemical bonds are made and broken.
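The repetition-rate jump is easy to put in numbers. A quick back-of-the-envelope sketch, using only the rates quoted in the article (the variable names are ours):

```python
# Compare the pulse timing of LCLS (120 Hz) and LCLS-II (1 MHz),
# using the repetition rates quoted in the article.
lcls_rate = 120            # pulses per second
lcls2_rate = 1_000_000     # pulses per second

# Time between consecutive pulses, in seconds.
lcls_spacing = 1 / lcls_rate       # roughly 8.3 milliseconds
lcls2_spacing = 1 / lcls2_rate     # 1 microsecond

speedup = lcls2_rate / lcls_rate
print(f"LCLS pulse spacing:    {lcls_spacing * 1e3:.1f} ms")
print(f"LCLS-II pulse spacing: {lcls2_spacing * 1e6:.1f} us")
print(f"Repetition-rate increase: {speedup:.0f}x")
```

Each individual pulse still lasts only femtoseconds; what changes is how many snapshots per second the flipbook gets.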

    To really understand the difference, imagine you’re an alien visiting Earth. If you take one image a day of a city, you would notice roads and the cars that drive on them, but you couldn’t tell the speed of the cars or where the cars go. But taking a snapshot every few seconds would give you a highly detailed picture of how cars flow through the roads and would reveal phenomena like traffic jams. LCLS-II will provide this type of step-change information applied to chemical, biological and material processes.

    To reach this level of detail, SLAC needs to implement technology developed for particle physics – superconducting acceleration cavities – to power the LCLS-II free-electron laser, or XFEL.

    3
    This is an illustration of the electron accelerator of SLAC’s LCLS-II X-ray laser. The first third of the copper accelerator will be replaced with a superconducting one. The red tubes represent cryomodules, which are provided by Fermilab and Jefferson Lab. Image: SLAC

    Accelerating science

    Cavities are structures that impart energy to particle beams, accelerating the particles within them. LCLS-II, like many modern particle accelerators, will take advantage of superconducting radio-frequency cavity technology, also called SRF technology. When cooled to 2 Kelvin, the cavities become superconducting and carry electric current without any resistance. Like reducing the friction between a heavy object and the ground, eliminating electrical resistance saves energy, allowing accelerators to reach higher power at lower cost.

    “The SRF technology is the enabling step for LCLS-II’s million pulses per second,” Dunne said. “Jefferson Lab and Fermilab have been developing this technology for years. The core expertise to make LCLS-II possible lives at these labs.”

    Fermilab modified a cryomodule design from DESY in Germany and specially prepared the cavity surfaces to achieve the record-setting performance of the cavities and cryomodules that will be used for LCLS-II.

    The cylinder-shaped cryomodules, about a meter in diameter, act as specialized containers for housing the cavities. Inside, ultracold liquid helium continuously flows around the cavities to ensure they maintain the unwavering 2 Kelvin essential for superconductivity. Lined up end to end, 37 cryomodules will power the LCLS-II XFEL.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    FNAL Icon

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics. Fermilab is America’s premier laboratory for particle physics and accelerator research, funded by the U.S. Department of Energy. Thousands of scientists from universities and laboratories around the world
    collaborate at Fermilab on experiments at the frontiers of discovery.

     
  • richardmitnick 1:18 pm on August 5, 2019 Permalink | Reply
    Tags: "Fermilab’s HEPCloud goes live", Accelerator Science

    From Fermi National Accelerator Lab: “Fermilab’s HEPCloud goes live” 

    FNAL Art Image
    FNAL Art Image by Angela Gonzales

    From Fermi National Accelerator Lab, an enduring source of strength for the US contribution to scientific research worldwide.

    August 5, 2019
    Marcia Teckenbrock

    To meet the evolving needs of high-energy physics experiments, the underlying computing infrastructure must also evolve. Say hi to HEPCloud, the new, flexible way of meeting the peak computing demands of high-energy physics experiments using supercomputers, commercial services and other resources.

    Five years ago, Fermilab scientific computing experts began addressing the computing resource requirements for research occurring today and in the next decade. Back then, in 2014, some of Fermilab’s neutrino programs were just starting up. Looking further into the future, plans were under way for two big projects. One was Fermilab’s participation in the future High-Luminosity Large Hadron Collider at the European laboratory CERN.

    The other was the expansion of the Fermilab-hosted neutrino program, including the international Deep Underground Neutrino Experiment. All of these programs would be accompanied by unprecedented data demands.

    To meet these demands, the experts had to change the way they did business.

    HEPCloud, the flagship project pioneered by Fermilab, changes the computing landscape because it employs an elastic computing model. Tested successfully over the last couple of years, it officially went into production as a service for Fermilab researchers this spring.

    2
    Scientists on Fermilab’s NOvA experiment were able to execute around 2 million hardware threads at the Cori II supercomputer (named after Gerty Cori, the first American woman to win a Nobel Prize in science) at the Office of Science’s National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory. Scientists on the CMS experiment have been running workflows using HEPCloud at NERSC as a pilot project. Photo: Roy Kaltschmidt, Lawrence Berkeley National Laboratory

    Experiments currently maintain a fixed computing capacity that meets, but doesn’t overshoot, their everyday needs. For times of peak demand, HEPCloud enables elasticity, allowing experiments to rent computing resources from other sources, such as supercomputers and commercial clouds, and managing them to satisfy peak demand. The prior method was to purchase enough local resources to cover peak demand, overshooting the day-to-day needs. By renting resources only when they are needed, HEPCloud reduces the cost of providing computing capacity.

    “Traditionally, we would buy enough computers for peak capacity and put them in our local data center to cover our needs,” said Fermilab scientist Panagiotis Spentzouris, former HEPCloud project sponsor and a driving force behind HEPCloud. “However, the needs of experiments are not steady. They have peaks and valleys, so you want an elastic facility.”
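The elastic model Spentzouris describes can be caricatured in a few lines. This toy scheduler (an illustration only, not HEPCloud's actual interface or logic) serves each time slice from a fixed local pool and rents the overflow from external resources:

```python
def provision(demand, local_capacity):
    """Toy elastic provisioning: serve each time slice's demand from a
    fixed local pool and rent the overflow from external resources.
    Illustrative only -- not HEPCloud's actual scheduling logic."""
    plan = []
    for cores_needed in demand:
        local = min(cores_needed, local_capacity)
        rented = cores_needed - local
        plan.append({"local": local, "rented": rented})
    return plan

# A bursty demand profile (cores per hour): quiet baseline, one peak.
demand = [800, 900, 5000, 12000, 1000]
plan = provision(demand, local_capacity=2000)
print(plan)
# Peak hours overflow to rented resources; quiet hours rent nothing,
# so the local pool no longer has to be sized for the peak.
```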

    In addition, HEPCloud optimizes resource usage across all types, whether these resources are on site at Fermilab, on a grid such as Open Science Grid, in a cloud such as Amazon or Google, or at supercomputing centers like those run by the DOE Office of Science Advanced Scientific Computing Research program (ASCR). And it provides a uniform interface for scientists to easily access these resources without needing expert knowledge about where and how best to run their jobs.

    The idea to create a virtual facility to extend Fermilab’s computing resources began in 2014, when Spentzouris and Fermilab scientist Lothar Bauerdick began exploring ways to best provide resources for experiments at CERN’s Large Hadron Collider. The idea was to provide those resources based on the overall experiment needs rather than a certain amount of horsepower. After many planning sessions with computing experts from the CMS experiment at the LHC and beyond, and after a long period of hammering out the idea, a scientific facility called “One Facility” was born. DOE Associate Director of Science for High Energy Physics Jim Siegrist coined the name “HEPCloud” — a computing cloud for high-energy physics — during a general discussion about a solution for LHC computing demands. But interest beyond high-energy physics was also significant. DOE Associate Director of Science for Advanced Scientific Computing Research Barbara Helland was interested in HEPCloud for its relevancy to other Office of Science computing needs.

    3
    The CMS detector at CERN collects data from particle collisions at the Large Hadron Collider. Now that HEPCloud is in production, CMS scientists will be able to run all of their physics workflows on the expanded resources made available through HEPCloud. Photo: CERN

    The project was a collaborative one. In addition to many individuals at Fermilab, Miron Livny at the University of Wisconsin-Madison contributed to the design, enabling HEPCloud to use the workload management system known as Condor (now HTCondor), which is used for all of the lab’s current grid activities.

    Since its inception, HEPCloud has achieved several milestones as it moved through the development phases leading up to production. The project team first demonstrated the use of cloud computing on a significant scale in February 2016, when the CMS experiment used HEPCloud to achieve about 60,000 cores on the Amazon cloud, AWS. In November 2016, CMS again used HEPCloud to run 160,000 cores using Google Cloud Services, doubling the total size of the LHC’s computing worldwide. Most recently, in May 2018, NOvA scientists were able to execute around 2 million hardware threads at the Office of Science’s National Energy Research Scientific Computing Center (NERSC), increasing both the scale and the amount of resources provided. During these activities, the experiments were executing and benefiting from real physics workflows. NOvA was even able to report significant scientific results at the Neutrino 2018 conference in Germany, one of the most attended conferences in neutrino physics.

    CMS has been running workflows using HEPCloud at NERSC as a pilot project. Now that HEPCloud is in production, CMS scientists will be able to run all of their physics workflows on the expanded resources made available through HEPCloud.

    Next, HEPCloud project members will work to expand the reach of HEPCloud even further, enabling experiments to use the leadership-class supercomputing facilities run by ASCR at Argonne National Laboratory and Oak Ridge National Laboratory.

    Fermilab experts are working to ensure that, eventually, all Fermilab experiments are configured to use these extended computing resources.

    This work is supported by the DOE Office of Science.

    See the full article here.



     
  • richardmitnick 12:41 pm on August 5, 2019 Permalink | Reply
    Tags: Accelerator Science

    From CERN Courier: “Sixty years of the CERN Courier” 


    From CERN Courier

    5 August, 2019
    Matthew Chalmers

    The magazine has published over 600 issues and now reaches tens of thousands of readers.

    1
    From its first issue in 1959 to today, the CERN Courier has gone through several transformations, including a redesign for its 60th anniversary (Image: Cristina Agrigoroae/CERN)

    In August 1959, when CERN was just five years old, and the Proton Synchrotron was preparing for beams, Director-General Cornelis Bakker founded a new periodical to inform staff what was going on.

    CERN Proton Synchrotron

    It was just eight pages long with a print run of 1000, but already a section called “Other people’s atoms” reported news from other labs.

    The CERN Courier has since transformed into an international magazine of around 40 pages with a circulation of 22,000 print copies, covering the global high-energy physics scene. Its website, which receives about 30,000 monthly views, was relaunched this month and provides up-to-date news from the field.

    To celebrate its diamond jubilee, a feature in the latest issue reveals several gems from past editions and shows the ever-present challenges of predicting the next discovery in fundamental research.

    You can peruse the full archive of all CERN Courier issues via the CERN Document Server.

    See the full article here .



    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS

    CERN/ATLAS detector

    ALICE

    CERN/ALICE Detector


    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    LHC

    CERN map

    CERN LHC Grand Tunnel

    CERN LHC particles

     
  • richardmitnick 12:22 pm on August 5, 2019 Permalink | Reply
    Tags: "ATLAS releases new search for strong supersymmetry", Accelerator Science

    From CERN ATLAS: “ATLAS releases new search for strong supersymmetry” 

    CERN/ATLAS detector

    CERN ATLAS Higgs Event

    CERN ATLAS another view Image Claudia Marcelloni ATLAS CERN


    CERN ATLAS New II Credit CERN SCIENCE PHOTO LIBRARY


    From CERN ATLAS

    5th August 2019

    1
    Figure 1: Distributions of observed data events, compared to the Standard Model prediction, for (left) a subset of the bins used in the multi-bin search, or (right) one of the BDT search discriminants. (Image: ATLAS Collaboration/CERN)

    New particles sensitive to the strong interaction might be produced in abundance in the proton-proton collisions generated by the LHC – provided that they aren’t too heavy. These particles could be the partners of gluons and quarks predicted by supersymmetry (SUSY), a proposed extension of the Standard Model of particle physics that would expand its predictive power to include much higher energies. In the simplest scenarios, these “gluinos” and “squarks” would be produced in pairs, and decay directly into quarks and a new stable neutral particle (the “neutralino”), which would not interact with the ATLAS detector. The neutralino could be the main constituent of dark matter.

    The ATLAS Collaboration has been searching for such processes since the early days of LHC operation. Physicists have been studying collision events featuring “jets” of hadrons, where there is a large imbalance in the momenta of these jets in the plane perpendicular to the colliding protons (“missing transverse momentum”, ETmiss). This missing momentum would be carried away by the undetectable neutralinos. So far, ATLAS searches have led to increasingly tighter constraints on the minimum possible masses of squarks and gluinos.
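Missing transverse momentum is simply the negative vector sum of the visible transverse momenta. A minimal sketch with invented jet momenta (not ATLAS reconstruction code):

```python
import math

def missing_et(objects):
    """Magnitude of the missing transverse momentum: the negative of the
    vector sum of the visible objects' (px, py) components.
    Toy illustration with invented numbers, not ATLAS reconstruction code."""
    sum_px = sum(px for px, py in objects)
    sum_py = sum(py for px, py in objects)
    return math.hypot(-sum_px, -sum_py)

# Two back-to-back jets balance: essentially no missing momentum.
balanced = [(100.0, 0.0), (-100.0, 0.0)]
# An invisible particle recoiling against the jets leaves an imbalance.
unbalanced = [(100.0, 0.0), (-40.0, 30.0)]

print(missing_et(balanced))    # 0.0
print(missing_et(unbalanced))  # ~67.1
```

In a real SUSY event, that imbalance would be the momentum carried off by the undetected neutralinos.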

    Is it possible to do better, with more data? The probability of producing these heavy particles decreases exponentially with their masses, and thus repeating the previous analyses with a larger dataset only goes so far. New, sophisticated methods that help to better distinguish a SUSY signal from the background Standard Model events are needed to take these analyses further. Crucial improvements may come from increasing the efficiency for selecting signal events, improving the rejection of background processes, or looking into less-explored regions.

    Today, at the Lepton Photon Symposium in Toronto, Canada, the ATLAS Collaboration presented new results illustrating the benefits brought by more advanced analysis techniques, which were pioneered in other search channels. The sensitivity of the new analysis is significantly improved thanks to the use of two complementary approaches.

    In the first approach, referred to as the “multi-bin search”, the events are classified into bins defined by two observables: the effective mass and the ETmiss significance. These characterise the amount of energy involved in the interaction (large, if heavy particles were produced) and how unlikely the observed ETmiss is to be caused by the mismeasurement of jet energies rather than by escaping neutralinos. With up to 24 orthogonal bins defined at a time, the search is sensitive to a large variety of masses of gluinos, squarks and neutralinos (Figure 1 (left)).
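Conceptually, the multi-bin classification amounts to a two-dimensional histogram over the two observables. A sketch with invented event values and illustrative bin edges (not the analysis's actual boundaries or bin count):

```python
import numpy as np

# Invented per-event observables: effective mass (GeV) and the
# dimensionless ETmiss significance. Values and edges are illustrative.
m_eff = np.array([900.0, 1500.0, 2600.0, 3200.0, 1100.0, 2900.0])
met_sig = np.array([8.0, 12.0, 22.0, 30.0, 15.0, 9.0])

meff_edges = [0, 1000, 2000, 3000, np.inf]
sig_edges = [0, 10, 20, np.inf]

# Each event lands in exactly one (m_eff, significance) bin, so the
# bins are orthogonal and can be combined in a simultaneous fit.
counts, _, _ = np.histogram2d(m_eff, met_sig, bins=[meff_edges, sig_edges])
print(counts)
print(int(counts.sum()))  # all 6 toy events are counted once
```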

    The second approach, known as the “Boosted Decision Tree (BDT) search”, uses machine learning classification algorithms to better discriminate a potential signal. The BDTs are trained with some of the kinematic properties of the jets + ETmiss final states, predicted by the Monte Carlo simulation for signal and background events. Eight such discriminants are defined, each optimised for a different region of the parameter and model space (Figure 1 (right)).
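A BDT discriminant of this kind can be sketched with standard tools. Here the two features, the labels and the distributions are all invented stand-ins for the simulated kinematic properties, and scikit-learn's gradient-boosted trees stand in for the analysis's actual BDT configuration:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)
n = 2000

# Toy stand-ins for two jets + ETmiss kinematic features: "signal"
# events are drawn with systematically larger values than "background".
# Purely illustrative -- not the simulated ATLAS distributions.
background = rng.normal(loc=[1000.0, 10.0], scale=[300.0, 4.0], size=(n, 2))
signal = rng.normal(loc=[1800.0, 18.0], scale=[300.0, 4.0], size=(n, 2))

X = np.vstack([background, signal])
y = np.array([0] * n + [1] * n)

# Train a boosted-decision-tree discriminant on the labeled events.
bdt = GradientBoostingClassifier(n_estimators=100, max_depth=3, random_state=0)
bdt.fit(X, y)

# The trained discriminant assigns each event a signal-like score.
scores = bdt.predict_proba(X)[:, 1]
print(f"mean score, background: {scores[:n].mean():.2f}")
print(f"mean score, signal:     {scores[n:].mean():.2f}")
```

A cut on the score (or a fit to its distribution) then separates signal-enriched from background-like regions.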

    2
    Figure 2: 95% confidence level exclusion limits on the masses of gluinos, squarks and neutralinos, in simplified signal scenarios assuming (left) only the pair production of gluinos, or (right) the combined pair production of gluinos and squarks for a neutralino mass of 0 GeV. (Image: ATLAS Collaboration/CERN)

    The new results made use of the full LHC Run 2 dataset, corresponding to an integrated luminosity of 139 fb-1, and did not show any significant difference between the number of observed events and the Standard Model predictions in the signal-enriched regions. Exclusion limits were therefore set on the masses of gluinos, squarks and neutralinos, assuming different scenarios. Some examples are shown in Figure 2. For the multi-bin search, the strength of all the bins can be simultaneously brought to bear, increasing the exclusion power of the analysis.

    Links

    Search for squarks and gluinos in final states with jets and missing transverse momentum using 139 fb−1 of 13 TeV proton-proton collision data with the ATLAS detector (ATLAS-CONF-2019-040, link coming soon)
    Lepton Photon 2019 plenary presentation: Overview of the ATLAS Experiment by Pierre Savard
    Search for squarks and gluinos in final states with jets and missing transverse momentum using 36 fb−1 of 13 TeV proton-proton collision data with the ATLAS detector (Phys. Rev. D 97 (2018) 112001, see figures)
    Search for squarks and gluinos using final states with jets and missing transverse momentum with the ATLAS detector in 7 TeV proton-proton collisions (ATLAS-CONF-2011-086)
    Search for top-squark pair production in final states with one lepton, jets, and missing transverse momentum using 36 fb−1 of 13 TeV proton-proton collision data with the ATLAS detector (JHEP 06 (2018) 108, see figures)
    Search for supersymmetry using final states with one lepton, jets, and missing transverse momentum with the ATLAS detector in 7 TeV proton-proton collisions (Phys. Rev. Lett. 106 (2011) 131802, see figures)
    See also the full lists of ATLAS Conference Notes and ATLAS Physics Papers.

    See the full article here .



    CERN Courier

    Quantum Diaries
    QuantumDiaries

    CERN map


    CERN LHC Grand Tunnel
    CERN LHC particles

     
  • richardmitnick 1:55 pm on July 30, 2019 Permalink | Reply
    Tags: "CDF, DZero experiments presented with prestigious European Physics Society prize", Accelerator Science

    From Fermi National Accelerator Lab: “CDF, DZero experiments presented with prestigious European Physics Society prize” 

    FNAL Art Image
    FNAL Art Image by Angela Gonzales

    From Fermi National Accelerator Lab, an enduring source of strength for the US contribution to scientific research worldwide.

    July 30, 2019
    Edited by the esteemed Leah Hesla

    On July 15 in Ghent, Belgium, the European Physical Society formally presented the CDF and DZero collaborations with the 2019 High Energy and Particle Physics Prize “for the discovery of the top quark and the detailed measurement of its properties.”

    FNAL/Tevatron CDF detector

    FNAL/Tevatron DZero detector

    FNAL/Tevatron tunnel

    FNAL/Tevatron map

    Three of the four experiment co-spokespersons accepted the award on behalf of the collaborations at the biennial EPS conference. A number of CDF and DZero physicists were in attendance.

    EPS awards the prize every two years to one or more persons or to collaborations for an outstanding contribution to high-energy and particle physics in an experimental, theoretical or technological area.

    1
    CDF and DZero collaborators attended the award ceremony. Photo courtesy of EPS Conference

    2
    From left: EPS Chair of High Energy and Particle Physics Barbara Erazmus, CDF co-spokesperson Giorgio Chiarelli, DZero co-spokesperson Paul Grannis, DZero co-spokesperson Dmitri Denisov. Not pictured: CDF co-spokesperson David Toback. Photo courtesy of EPS Conference

    See the full article here.



     
  • richardmitnick 4:40 pm on July 29, 2019 Permalink | Reply
    Tags: Accelerator Science, Bianca Giaccone, By using plasma they can treat the cavities’ inner walls even as they sit inside a particle accelerator., Cavities are the components in particle accelerators that transfer energy to particle beams as they pass through-superconducting accelerating cavities., Giaccone is working on a different technique called plasma processing originally proposed and implemented at Oak Ridge National Laboratory., Scientists can reduce field emission in cavities by high-pressure water rinsing in cleanrooms., The main goal is to limit an unwanted effect called “field emission” during which the cavity’s inner surface emits electrons.

    From Fermi National Accelerator Lab: Women in STEM- “Bianca Giaccone, IIT student working at Fermilab, recognized for new technique to improve particle accelerator performance” 

    FNAL Art Image
    FNAL Art Image by Angela Gonzales

    From Fermi National Accelerator Lab, an enduring source of strength for the US contribution to scientific research worldwide.

    July 29, 2019
    edited by the esteemed journalist Leah Hesla

    1
    Bianca Giaccone’s award-winning research focuses on a technique called plasma processing. Here Giaccone is operating the vacuum and gas system used to flow gas through the accelerating cavities. Photo: Reidar Hahn

    Bianca Giaccone, a Ph.D. student from the Illinois Institute of Technology working at Fermilab, has received the Young Investigator Prize for Best Talk at the International Conference on RF Superconductivity.

    Her talk covered a technique for processing superconducting accelerating cavities.

    Cavities are the components in particle accelerators that transfer energy to particle beams as they pass through. Superconducting radio-frequency cavities, or SRF cavities, in particular are the technology of choice for many future accelerators.

    Along with partners at Oak Ridge National Laboratory and SLAC National Accelerator Laboratory, Giaccone and a group of SRF experts at Fermilab are working on a method for cleaning the inside of cavities made of niobium.

    The main goal is to limit an unwanted effect called “field emission,” during which the cavity’s inner surface emits electrons. The more the field emission is reduced, the better, since electrons flying off the cavity surface can cause the cavity’s efficiency to plummet.

    Scientists can reduce field emission in cavities by high-pressure water rinsing in cleanrooms. However, some contaminants may still fall on the cavity surface as the cavities are assembled into the larger building blocks that make up the final accelerator, called cryomodules.

    Giaccone is working on a different technique, called plasma processing, originally proposed and implemented at Oak Ridge National Laboratory. Giaccone and the multi-institution plasma processing team are now developing an extension of this technique for cleaning cavities for the upcoming Linac Coherent Light Source upgrade, called LCLS-II, an X-ray laser currently under construction at SLAC.

    LCLS-II

    By using plasma, they can treat the cavities’ inner walls even as they sit inside a particle accelerator. There would be no need to move or disassemble the cryomodules, which would be extremely costly. The technique has the potential for very high impact, as it can be applied to recover the performance of degraded cavities in accelerators worldwide.

    On behalf of the collaboration, Giaccone presented the promising first results from tests of this method, which they applied to 1.3-GHz cavities for LCLS-II. One of the main innovations brought by the Fermilab team is an easier and more effective way to ignite the plasma in the cavities, as detailed in a recent paper published by the group [Journal of Applied Physics].

    3
    This shows the inside of an accelerating cavity. A low-pressure, inert gas is necessary to ignite the glow discharge inside the cavity. Photo courtesy of Bianca Giaccone

    The group plans to implement the technique on an LCLS-II cryomodule at Fermilab and eventually at the LCLS-II site at SLAC.

    Giaccone is working on her thesis under the supervision of IIT professor John Zasadzinski and Fermilab scientist and Peoples fellow Martina Martinello.

    The Young Investigator Prize for Best Talk is given to an individual based on the relevance and impact of the scientific work; the novelty and quality of the scientific work; the quality of the poster and oral presentation; and the individual’s interaction and professionalism toward the program committee.

    This work is supported by a plasma processing grant from the Office of Basic Energy Science, and by the GARD facilities program of the Office of High Energy Physics in the DOE Office of Science.

    See the full article here.



     
  • richardmitnick 12:26 pm on July 27, 2019 Permalink | Reply
    Tags: A new way to measure the “distance” between high-energy particle collision events, Accelerator Science

    From “Physics”: “Viewpoint: Putting Distance Between Collider Events” 

    Physics LogoAbout Physics

    Physics Logo 2

    From “Physics”

    July 26, 2019
    Michael Schmitt
    Department of Physics and Astronomy
    Northwestern University, Evanston, IL 60208, USA

    A new way to measure the “distance” between high-energy particle collision events can help researchers interpret events involving, for example, the production of Higgs bosons or of top quarks.

    Using statistical tests to make sense of large datasets is an integral part of modern science and especially of collider physics. Model selection—selecting which candidate model provides a good explanation of a set of data—is an important case. For example, one might want to test whether the kinematics of particles produced in high-energy collisions imply the presence of a hypothetical particle. A related example is the classification of events. When two high-energy protons collide, they produce a huge number of subatomic particles with various energies and momenta. How can researchers, given measurements of these quantities, decide what type of collider event they are witnessing? Are they observing just a set of typical quantum chromodynamic (QCD) hadronic jets, or might the collision products contain top quarks or Higgs bosons? Patrick Komiske, of the Massachusetts Institute of Technology and Harvard University, and co-workers have proposed a “metric” that provides a new way to quantify how “distant” two collider events are [1]. As the authors show, this metric can be used to develop a relatively simple and easy-to-use classification tool that is nearly as effective as state-of-the-art machine-learning techniques requiring significantly more computational effort.

    1
    Figure 1: EMD quantifies the “distance” between different events by calculating the work needed to move the particles associated with one event (red) so that they match those associated with another event (blue). In this case, the two events are top quark jets plotted as a function of different sets of parameters (azimuthal angle and rapidity).

    Driven by the peculiarities of the problems they face, researchers have developed a wide palette of clever test statistics. Chi-squared and likelihood-ratio tests, which measure how well competing models fit a given set of data, are well known examples. Astronomers sometimes use the mean integrated squared error (MISE), similar to chi-squared. MISE measures the overlap between two probability density functions (PDFs). It is well suited for comparing similar PDFs, but when the two functions are well separated, MISE produces tiny numbers: it would be difficult to determine whether, for instance, two Gaussian PDFs were separated by 10 standard deviations or 20 because the two functions have essentially zero overlap. MISE is not well suited as the basis of a classification tool for high-energy collider events, since even similar events could have weakly overlapping PDFs.

    The “earth mover’s distance” (EMD) [2], based on the so-called Wasserstein metric [3, 4], is an interesting alternative to MISE. It can be described as the minimum amount of work needed to move a given “pile of earth” (that is, the first function) so that it turns into another pile (the second function). The EMD depends on the amount of earth that flows from the first pile to the second and on how far it flows. The EMD depends linearly on separation rather than exponentially, as MISE does, so the difference between 10 and 20 standard deviations is just a factor of 2. In other words, EMD emphasizes separation, rather than overlap (Fig. 1).
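Both behaviours are easy to check numerically for unit-width Gaussians: the overlap is already negligible at 10 standard deviations, while the one-dimensional Wasserstein (earth mover's) distance keeps growing linearly with the separation:

```python
from scipy.stats import norm, wasserstein_distance

def gaussian_overlap(separation):
    """Overlapping area of two unit-variance Gaussians whose means differ
    by `separation`: each curve's tail beyond the midpoint, counted once."""
    return 2 * norm.cdf(-separation / 2)

for d in (10, 20):
    overlap = gaussian_overlap(d)
    # For two point masses (or equal-width Gaussians) the 1-D Wasserstein
    # distance is simply the separation of the means.
    emd = wasserstein_distance([0.0], [float(d)])
    print(f"separation {d} sigma: overlap = {overlap:.2e}, EMD = {emd:.1f}")
```

The overlap drops by many orders of magnitude between 10 and 20 sigma, while the EMD merely doubles.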

    Can the EMD be used to quantify the difference between two collider events, for example between one involving the production of two top quarks and another associated with just a bunch of hadronic jets? Classifying events requires comparing their features. One approach is to define an abstract mathematical space whose dimensions correspond to different features of an event. Using a Monte Carlo event generator, one can simulate events, tagging each one as containing top quarks or just jets. Then, a real collision event is located in this space, based on its features, and the tags of the nearby simulated events are examined. If they are mainly top quark events, then the real collision event is probably a top quark event too. The success of this nearest-neighbor classification scheme depends critically on how the metric measuring the distance between events is defined. The choice of metric is not obvious—how does one compare quantitatively a difference in momentum with a difference in polar angles, given that they have different units?

    Komiske and his colleagues suggest using the EMD as this distance metric [1]. In their example, they aim to distinguish hadronic W boson decays, such as those found in top quark events, from ordinary QCD jets. The authors compute the EMD by taking the particles in the W boson jet and transporting them to match the particles in the QCD jet. Calculating this distance requires a smart algorithm, but once the distance has been calculated, the nearest-neighbor algorithm is easy to apply, without any “training” or extensive optimization process. One simply evaluates the fraction of nearest neighbors that are tagged as W jets and classifies the real event accordingly. This simplicity stands in stark contrast to highly sophisticated deep learning techniques that take the particle momenta as direct inputs or that represent an event as an “image,” where one “pixel” corresponds to one calorimeter detector element. Komiske and his co-workers show that a simple EMD-based nearest-neighbor classifier performs nearly as well as advanced deep learning techniques.
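
A toy version of such a nearest-neighbor classifier can be sketched in a few lines. Everything below is invented for illustration (the event model, class centers, tags, and k are not from the paper): each “event” is a handful of 1D particle positions, the distance between two equal-multiplicity, equal-weight events is the 1D earth mover’s distance (which reduces to the mean absolute difference of the sorted positions), and a new event gets the majority tag among its k closest simulated neighbors.

```python
import random

def w1(event_a, event_b):
    """1D EMD between two events with the same number of equal-weight
    particles: the mean absolute difference of the sorted positions."""
    return sum(abs(x - y) for x, y in zip(sorted(event_a), sorted(event_b))) / len(event_a)

random.seed(1)

def make_event(center, n_particles=5):
    """Toy event: particle positions scattered around a class-dependent center."""
    return [random.gauss(center, 1.0) for _ in range(n_particles)]

# Simulated, tagged library: "W" events cluster near 0, "QCD" events near 6.
library = [(make_event(0.0), "W") for _ in range(50)] + \
          [(make_event(6.0), "QCD") for _ in range(50)]

def classify(event, k=7):
    """Majority vote among the k library events closest in EMD."""
    nearest = sorted(library, key=lambda item: w1(event, item[0]))[:k]
    tags = [tag for _, tag in nearest]
    return max(set(tags), key=tags.count)

print(classify(make_event(0.0)))  # "W"
print(classify(make_event(6.0)))  # "QCD"
```

Note that there is no training step at all: the library of tagged simulated events and the metric are the entire “model”.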

    This paper introduces additional, exciting ideas. A collision event has structure at many levels and scales. First, there is the configuration of jets in the event, and second, there is the arrangement of particles inside a given jet. While jets and the particles within them are randomly distributed, they are not simply isotropic, and their nonuniform kinematic distributions contain interesting physics. Since the authors’ EMD is based on the individual particles in an event, one can expect that the EMD encodes information about such distributions. If so, can one use the EMD to distinguish, on a statistical basis, the three subjets produced in the decay of a top quark from a single large jet, for example?

    Komiske and his co-workers address this question by introducing a mathematical quantity called the correlation dimension [5, 6]. This quantity relates physical effects that determine the event’s detailed structure to the scales of the event (e.g., energy or momentum). It turns out that the EMD captures structural details in a remarkable way. For instance, the authors show that top quark events have a richer structure than QCD multijet events at certain energy scales (on the order of the mass of the W boson), even when the gross features of the events are the same. As a second application of the correlation dimension, the authors study hadronic jets with the same energy but with a wide range of jet masses and show that the jets with high mass have a more elaborate internal structure than jets with low mass. This new approach may enable new studies of QCD and insights into jet formation—currently topics of great interest at the Large Hadron Collider.
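
The correlation dimension can be estimated following Grassberger and Procaccia [5]: count the fraction C(Q) of point pairs closer than a scale Q and read off the slope of ln C versus ln Q. The sketch below is a generic illustration of that recipe, not the authors’ code; the point set and scales are invented, and in the paper the “points” are events and the distance is the EMD.

```python
import math
import random

def correlation_dimension(points, dist, q_lo, q_hi):
    """Grassberger-Procaccia estimate: the slope of ln C(Q) vs ln Q, where
    C(Q) is the fraction of distinct point pairs with distance below Q."""
    def c(q):
        n_pairs, n_close = 0, 0
        for i, a in enumerate(points):
            for b in points[i + 1:]:
                n_pairs += 1
                n_close += dist(a, b) < q
        return n_close / n_pairs
    return (math.log(c(q_hi)) - math.log(c(q_lo))) / (math.log(q_hi) - math.log(q_lo))

random.seed(0)
# Sanity check: points filling a unit square have correlation dimension near 2
# (slightly less in practice because of boundary effects and finite statistics).
square = [(random.random(), random.random()) for _ in range(400)]
euclid = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
print(correlation_dimension(square, euclid, 0.05, 0.2))  # close to 2
```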

    It will be interesting to see where the ideas and techniques presented in this short and thought-provoking paper will bring us. The new EMD-based metric may well lead to better event classification techniques that enable experimenters to discover new physics beyond the standard model. In addition, the application of the correlation dimension to their new metric might bring new insights into standard model physics, such as the formation and structure of hadronic jets.

    This research is published in Physical Review Letters.

    References

    [1] P. T. Komiske, E. M. Metodiev, and J. Thaler, “Metric space of collider events,” Phys. Rev. Lett. 123, 041801 (2019).
    [2] O. Pele and B. Taskar, “The tangent earth mover’s distance,” in Geometric Science of Information – First International Conference, Paris, France, August 2013, Proceedings, edited by F. Nielsen and F. Barbaresco (Springer, Berlin, 2013), p. 397.
    [3] L. N. Wasserstein, “Markov processes over denumerable products of spaces describing large systems of automata,” Problems Inform. Transmission 5, 47 (1969).
    [4] R. L. Dobrushin, “Prescribing a system of random variables by conditional distributions,” Theor. Probab. Appl. 15, 458 (1970).
    [5] P. Grassberger and I. Procaccia, “Characterization of strange attractors,” Phys. Rev. Lett. 50, 346 (1983).
    [6] B. Kégl, “Intrinsic dimension estimation using packing numbers,” in Advances in Neural Information Processing Systems 15, Proceedings of the 2002 Neural Information Processing Systems Conference, Vancouver, Canada, edited by S. Becker, S. Thrun, and K. Obermayer (MIT Press, Cambridge, 2003), p. 681.

    See the full article here.


     