Tagged: CERN LHC

  • richardmitnick 10:43 am on May 2, 2016
    Tags: CERN LHC

    From phys.org: “Physicists abuzz about possible new particle as CERN revs up” 


    May 2, 2016
    Jamey Keaten and Frank Jordans

    CERN/LHC Map
    CERN LHC Grand Tunnel
    CERN LHC particles
    LHC at CERN

    Scientists around the globe are revved up with excitement as the world’s biggest atom smasher—best known for revealing the Higgs boson four years ago—starts whirring again to churn out data that may confirm cautious hints of an entirely new particle.

    Higgs Boson Event

    Such a discovery would all but upend the most basic understanding of physics, experts say.

    The European Organization for Nuclear Research, known as CERN from its French-language acronym, has in recent months given more oomph to the machinery in a 27-kilometer (17-mile) underground circuit along the French-Swiss border known as the Large Hadron Collider.

    In a surprise development in December, two separate LHC detectors each turned up faint signs that could indicate a new particle, and since then theorizing has been rife.

    “It’s a hint at a possible discovery,” said theoretical physicist Csaba Csaki, who isn’t involved in the experiments. “If this is really true, then it would possibly be the most exciting thing that I have seen in particle physics in my career—more exciting than the discovery of the Higgs itself.”

    After a wintertime break, the Large Hadron Collider, or LHC, reopened on March 25 to prepare for a restart in early May. CERN scientists are doing safety tests and scrubbing clean the pipes before slamming together large bundles of particles in hopes of producing enough data to clear up that mystery. Firm answers aren’t expected for weeks, if not until an August conference of physicists in Chicago known as ICHEP.

    On Friday, the LHC was temporarily immobilized by a weasel, which invaded a transformer that helps power the machine and set off an electrical outage. CERN says it was one of a few small glitches that will delay by a few days plans to start the data collection at the $4.4 billion collider.

    The 2012 confirmation of the Higgs boson, dubbed the “God particle” by some laypeople, culminated a theory first floated decades earlier. The “Higgs” rounded out the Standard Model of physics, which aims to explain how the universe is structured at the infinitesimal level.

    The LHC’s ATLAS and Compact Muon Solenoid (CMS) particle detectors in December turned up preliminary readings that suggested a particle not accounted for by the Standard Model might exist at 750 gigaelectronvolts (GeV). This mystery particle would be nearly four times more massive than the top quark, the most massive particle in the model, and six times more massive than the Higgs, CERN officials say.
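    Those mass comparisons are easy to sanity-check. A minimal sketch; the top-quark and Higgs masses below are the standard measured values, rounded:

```python
# Rough check of the quoted mass ratios (all values in GeV).
hint_mass = 750.0   # location of the tentative December excess
top_mass = 173.0    # top quark, heaviest known Standard Model particle (approx.)
higgs_mass = 125.0  # Higgs boson (approx.)

ratio_top = hint_mass / top_mass      # ~4.3, "nearly four times more massive"
ratio_higgs = hint_mass / higgs_mass  # 6.0, "six times more massive than the Higgs"
print(ratio_top, ratio_higgs)
```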


    CERN/CMS Detector

    The Standard Model has worked well, but it has gaps, most notably concerning dark matter, which is believed to make up about a quarter of the mass-energy of the universe.

    The Standard Model of elementary particles, with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    Theorists say the December results, if confirmed, could help elucidate that enigma; or they could signal a graviton (a theorized particle associated with gravity) or another boson, or even hint at a new dimension.

    More data is needed to iron those possibilities out, and even then, the December results could just be a blip. But with so much still unexplained, physicists say discoveries of new particles—whether this year or later—may be inevitable as colliders get more and more powerful.

    Dave Charlton, who heads the Atlas team, said the December results could just be a “fluctuation” and “in that case, really for science, there’s not really any consequence … At this point, you won’t find any experimentalist who will put any weight on this: We are all very largely expecting it to go away again.”

    “But if it stays around, it’s almost a new ball game,” said Charlton, an experimental physicist at the University of Birmingham in Britain.

    The unprecedented power of the LHC has turned physics on its head in recent years. Whereas theorists once predicted behaviors that experimentalists would test in the lab, the vast energy being pumped into CERN’s collider means scientists are now seeing results for which there isn’t yet a theoretical explanation.

    “This particle—if it’s real—it would be something totally unexpected that tells us we’re missing something interesting,” he said.

    Whatever happens, experimentalists and theorists agree that 2016 promises to be exciting because of the sheer amount of data pumped out from the high-intensity collisions at record-high energy of 13 teraelectronvolts (TeV), a level first reached on a smaller scale last year, and up from 8 TeV previously. (CERN likens 1 TeV to the energy generated by a flying mosquito: That may not sound like much, but it’s being generated at a scale a trillion times smaller.)
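    CERN's mosquito comparison can be put in numbers. A quick sketch; the mosquito's mass and speed below are rough assumptions for illustration, not figures from the article:

```python
# One TeV expressed in joules, next to the kinetic energy of a mosquito.
EV_IN_JOULES = 1.602e-19            # joules per electronvolt
tev_in_joules = 1e12 * EV_IN_JOULES  # ~1.6e-7 J

mosquito_mass = 2.5e-6   # kg (assumed ~2.5 milligrams)
mosquito_speed = 0.4     # m/s (assumed cruising speed)
mosquito_ke = 0.5 * mosquito_mass * mosquito_speed ** 2  # ~2e-7 J

print(tev_in_joules, mosquito_ke)  # both around 10^-7 joules
```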

    In energy, the LHC will be nearly at full throttle—its maximum is 14 TeV—and over 2,700 bunches of particles will travel in each beam at nearly the speed of light, which is “nearly the maximum,” CERN spokesman Arnaud Marsollier said. He said the aim is to produce six times more collisions this year than in 2015.

    “When you open up the energies, you open up possibilities to find new particles,” he said. “The window that we’re opening at 13 TeV is very significant. If something exists between 8 and 13 TeV, we’re going to find it.”

    Still, both branches of physics are trying to stay skeptical despite the buzz that’s been growing since December.

    Csaki, a theorist at Cornell University in Ithaca, New York, stressed that the preliminary results don’t qualify as a discovery yet and there’s a good chance they may turn out not to be true. The Higgs boson had been predicted by physicists for a long time before it was finally confirmed, he noted.

    “Right now it’s a statistical game, but the good thing is that there will be a lot of new data coming in this year and hopefully by this summer we will know if this is real or not,” Csaki said, alluding to the Chicago conference. “No vacation in August.”
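    Csaki's "statistical game" can be sketched with the usual back-of-the-envelope estimate for a bump search: significance is roughly s/√b for s signal events over b expected background events. The counts below are invented for illustration, not the actual December yields:

```python
import math

def significance(signal, background):
    """Naive counting-experiment significance estimate, s / sqrt(b)."""
    return signal / math.sqrt(background)

# Invented numbers: a modest excess over background...
hint = significance(30.0, 100.0)
print(hint)  # 3.0 "sigma": an intriguing hint, far from the 5-sigma discovery bar

# ...and the same excess after six times more data, which scales both the
# expected signal and the expected background.
resolved = significance(6 * 30.0, 6 * 100.0)
print(resolved)  # ~7.3 sigma: with more data, a real signal becomes unmistakable
```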

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    About Phys.org in 100 Words

    Phys.org™ (formerly Physorg.com) is a leading web-based science, research and technology news service which covers a full range of topics. These include physics, earth science, medicine, nanotechnology, electronics, space, biology, chemistry, computer sciences, engineering, mathematics and other sciences and technologies. Launched in 2004, Phys.org’s readership has grown steadily to include 1.75 million scientists, researchers, and engineers every month. Phys.org publishes approximately 100 quality articles every day, offering some of the most comprehensive coverage of sci-tech developments world-wide. Quantcast 2009 includes Phys.org in its list of the Global Top 2,000 Websites. Phys.org community members enjoy access to many personalized features such as social networking, a personal home page set-up, RSS/XML feeds, article comments and ranking, the ability to save favorite articles, a daily newsletter, and other options.

  • richardmitnick 7:57 am on April 14, 2016
    Tags: CERN LHC

    From FNAL’s Don Lincoln on livescience: “Collider Unleashed! The LHC Will Soon Hit Its Stride” 


    April 12, 2016

    Don Lincoln, Senior Scientist, Fermi National Accelerator Laboratory; Adjunct Professor of Physics, University of Notre Dame


    If you’re a science groupie and would love nothing better than for a cornerstone scientific theory to be overthrown and replaced with something newer and better, then 2016 might well be your year. The world’s largest particle accelerator, the Large Hadron Collider (LHC), is resuming operations after a pause during the winter months, when the cost for electricity in France is highest.

    So why is it such a big deal that the LHC is coming back online? It’s because this is the year the accelerator will operate at something approaching its design specifications. Scientists will smash the gas pedal to the floor, crank the fire hose wide open, turn the amplifier up to eleven or enact whatever metaphor you like. This year is the first real year of full-scale LHC operations.

    A particle smasher reborn

    Now if you actually are a science groupie, you know what the LHC is and have probably heard about some of its accomplishments. You know it smashes together two beams of protons traveling at nearly the speed of light. You know scientists using the LHC found the Higgs boson.

    CERN ATLAS Higgs Event

    CERN CMS Higgs Event

    You know that this marvel is the largest scientific device ever built.

    So what’s different now? Well, let’s go back in time to 2008, when the LHC circulated its first beams. At the time, the world’s premier particle accelerator was the U.S. Department of Energy’s Fermilab Tevatron, which collided beams at a whopping 2 trillion electron volts (TeV) of energy and with a beam brightness of about 2 × 10³² cm⁻² s⁻¹.

    FNAL/Tevatron map

    FNAL/Tevatron CDF detector
    FNAL/DZero detector

    The technical term for beam brightness is “instantaneous luminosity,” and basically it’s a density. More precisely, when a beam passes through a target, the instantaneous luminosity (L) is the number of particles per second in a beam that pass a location (ΔN_B/Δt), divided by the area of the beam (A), multiplied by the number of targets (N_T): L = (ΔN_B/Δt) × (1/A) × N_T. (And the target can be another beam.)

    The simplest analogy that will help you understand this quantity is a light source and a magnifying glass. You can increase the “luminosity” of the light by turning up the brightness of the light source or by focusing the light more tightly. It is the same way with a beam. You can increase the instantaneous luminosity by increasing the number of beam or target particles, or by concentrating the beam into a smaller area.
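    Plugging rough LHC design numbers into that formula reproduces the design beam brightness. A minimal sketch; the bunch population, bunch count, revolution frequency, and beam spot size below are approximate published design values, and geometric corrections (crossing angle and the like) are ignored:

```python
import math

# Instantaneous luminosity L = (dN_B/dt) * (1/A) * N_T for two colliding
# beams, using approximate LHC design parameters.
n_per_bunch = 1.15e11   # protons per bunch (design)
n_bunches = 2808        # bunches per beam (design)
rev_freq = 11245        # revolutions per second around the 27 km ring
sigma = 16.7e-4         # transverse beam size at the collision point, cm

rate = n_per_bunch * n_bunches * rev_freq  # protons per second passing (dN_B/dt)
area = 4 * math.pi * sigma ** 2            # effective overlap area A, cm^2
targets = n_per_bunch                      # N_T: protons in the oncoming bunch

lumi = rate * (1 / area) * targets
print(lumi)  # ~1e34 cm^-2 s^-1, i.e. ~100 x 10^32, the LHC design figure
```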

    The LHC was built to replace the Tevatron and trounce that machine’s already-impressive performance numbers.

    [If our USA Congress was not filled with idiots, we would have built in Texas the Superconducting Super Collider and not lost this HEP race.]

    The new accelerator was designed to collide beams at a collision energy of 14 TeV and to have a beam brightness — instantaneous luminosity — of at least 100 × 10³² cm⁻² s⁻¹. So the beam energy was to be seven times higher, and the beam brightness would increase 50- to 100-fold.

    Sadly, in 2008, a design flaw was uncovered in the LHC when an electrical short caused severe damage, requiring two years to repair. Further, when the LHC actually did run, in 2010, it operated at half the design energy (7 TeV) and at a beam brightness basically the same as that of the Fermilab Tevatron. The lower energy was to give a large safety margin, as the design flaw had only been patched, not completely reengineered.

    The situation improved in 2011 when the beam brightness got as high as 30 × 10³² cm⁻² s⁻¹, although with the same beam energy. In 2012, the beam energy was raised to 8 TeV, and the beam brightness was higher still, peaking at about 65 × 10³² cm⁻² s⁻¹.

    The LHC was shut down during 2013 and 2014 to retrofit the accelerator to make it safe to run at closer to design specifications. The retrofits consisted mostly of additional industrial safety measures that allowed for better monitoring of the electrical currents in the LHC. This helps ensure there are no electrical shorts and that there is sufficient venting. The venting guarantees no catastrophic ruptures of the LHC magnets (which steer the beams) in the event that cryogenic liquids — helium and nitrogen — in the magnets warm up and turn into a gas. In 2015, the LHC resumed operations, this time at 13 TeV and with a beam brightness of 40 × 10³² cm⁻² s⁻¹.

    So what’s expected in 2016?

    The LHC will run at 13 TeV and with a beam brightness that is expected to approach 100 × 10³² cm⁻² s⁻¹ and possibly even slightly exceed that mark. Essentially, the LHC will be running at design specifications.

    In addition, there is a technical change in 2016. The protons in the LHC beams will be spread more uniformly around the ring, thus reducing the number of protons colliding simultaneously, resulting in better data that is easier to interpret.

    At a technical level, this is kind of interesting. A particle beam isn’t continuous like a laser beam or water coming out of a hose. Instead, the beam comes in a couple of thousand distinct “bunches.” A bunch looks a little bit like a stick of uncooked spaghetti, except it is about a foot long and much thinner — about 0.3 millimeters, most of the time. These bunches travel in the huge 16-mile-long (27 kilometers) circle that is the LHC, with each bunch separated from the other bunches by a distance that (until now) has been about 50 feet (15 meters).

    The technical change in 2016 is to take the same number of beam protons (roughly 3 × 10¹⁴ protons) and split them up into 2,808 bunches, each separated not by 50 feet, but by 25 feet (7.6 m). This doubles the number of bunches, but cuts the number of protons in each bunch in half. (Each bunch contains about 10¹¹ protons.)

    Because the LHC has the same number of protons but separated into more bunches, that means when two bunches cross and collide in the center of the detector, there are fewer collisions per crossing. Since most collisions are boring and low-energy affairs, having a lot of them at the same time that an interesting collision occurs just clutters up the data.

    Ideally, you’d like to have only an interesting collision and no simultaneous boring ones. This change of bunch separation distance from 50 feet to 25 feet brings the data collection closer to ideal.
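    The bunch arithmetic above is easy to check with nothing more than the speed of light and a unit conversion:

```python
# Protons per bunch after the 2016 change, and the bunch spacing in time.
c = 2.998e8            # speed of light, m/s
total_protons = 3e14   # protons per beam (approximate, from the article)
n_bunches = 2808       # bunches per beam after the change

per_bunch = total_protons / n_bunches
print(per_bunch)       # ~1.07e11, the "about 10^11 protons" per bunch

spacing_m = 7.6        # new bunch separation in meters (25 feet)
spacing_ns = spacing_m / c * 1e9
print(spacing_ns)      # ~25 nanoseconds between bunch crossings
```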

    Luminous beams

    Another crucial design element is the integrated beam. Beam brightness (instantaneous luminosity) is related to the number of proton collisions per second, while integrated beam (integrated luminosity) is related to the total number of collisions that occur as the two counter-rotating beams continually pass through the detector. Integrated luminosity is something that adds up over the days, months and years.

    The unit of integrated luminosity is a pb⁻¹. This unit is a bit confusing, but not so bad. The “b” in “pb” stands for a barn (more on that in a moment). A barn is 10⁻²⁴ cm². A picobarn (pb) is 10⁻³⁶ cm². The term “barn” is a unit of area and comes from another particle physics term called a cross section, which is related to how likely it is that two particles will interact and generate a specific outcome. Two objects that have large effective area will interact easily, while objects with a small effective area will interact rarely.

    An object with an area of a barn is a square with sides of 10⁻¹² cm. That’s about the size of the nucleus of a uranium atom.

    During World War II, physicists at Purdue University in Indiana were working with uranium and needed to mask their work for security reasons. So they invented the term “barn,” defining it as an area about the size of a uranium nucleus. Given how big this area is in the eyes of nuclear and particle physicists, the Purdue scientists were co-opting the phrase “as big as a barn.” In the luminosity world, with its units of inverse barns (1/barn), a smaller area unit means more luminosity: one inverse femtobarn, for example, represents a thousand times more collisions than one inverse picobarn.

    This trend is evident in the integrated luminosity seen in the LHC each year as scientists improved their ability to operate the accelerator. The integrated luminosity in 2010 was 45 pb⁻¹. In 2011 and 2012, it was 6,100 pb⁻¹ and 23,300 pb⁻¹, respectively. As time went on, the accelerator ran more reliably, resulting in far higher numbers of recorded collisions.

    Because the accelerator had been re-configured during the 2013 to 2014 shutdown, the luminosity was lower in 2015, coming in at 4,200 pb⁻¹, although, of course, at the much higher beam energy. The 2016 projection could be as high as 35,000 pb⁻¹. The predicted increase merely reflects the accelerator operators’ increased confidence in their ability to operate the facility.

    This means in 2016, we could actually record eight times as much data as we did in 2015. And it is expected that 2017 will bring even higher performance.
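    Integrated luminosity translates directly into expected event counts via N = σ × L_int. A hedged illustration using the article's luminosity figures; the Higgs cross section below is an approximate published value at 13 TeV, used only to set the scale:

```python
# Expected events = cross section (pb) x integrated luminosity (pb^-1).
sigma_higgs_pb = 50.0  # approx. total Higgs production cross section at 13 TeV
lumi_2015 = 4200.0     # pb^-1, 2015 (from the article)
lumi_2016 = 35000.0    # pb^-1, 2016 projection (from the article)

higgs_2015 = sigma_higgs_pb * lumi_2015
higgs_2016 = sigma_higgs_pb * lumi_2016
data_ratio = lumi_2016 / lumi_2015

print(higgs_2015)  # ~210,000 Higgs bosons produced in 2015
print(higgs_2016)  # ~1.75 million projected for 2016
print(data_ratio)  # ~8.3, the "eight times as much data" quoted above
```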

    Illuminating new science

    Let’s think about what these improvements mean. When the LHC first collided beams, in 2010, the Higgs boson had yet to be observed.

    Higgs Boson Event

    On the other hand, the particle was already predicted, and there was good circumstantial evidence to expect that the Higgs would be discovered. And, without a doubt, it must be admitted that the discovery of the Higgs boson was an enormous scientific triumph.

    But confirming previously predicted particles, no matter how impressive, is not why the LHC was built.

    Scientists’ current theory of the particle world is called the Standard Model, and it was developed in the late 1960s, half a century ago.

    The Standard Model of elementary particles, with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    While it is an incredibly successful theory, it is known to have holes. Although it explains why particles have mass, it doesn’t explain why some particles have more mass than others. It doesn’t explain why there are so many fundamental particles, given that only a handful of them are needed to constitute the ordinary matter of atoms and puppies and pizzas. It doesn’t explain why the universe is composed solely of matter, when the theory predicts that matter and antimatter should exist in equal quantities. It doesn’t identify dark matter, which is five times more prevalent than ordinary matter and is necessary to explain why galaxies rotate in a stately manner and don’t rip themselves apart.

    When you get right down to it, there is a lot the Standard Model doesn’t explain. And while there are tons of ideas about new and improved theories that could replace it, ideas are cheap. The trick is to find out which idea is right.

    That’s where the LHC comes in. The LHC can explore what happens if we expose matter to more and more severe conditions. Using Einstein’s equation E = mc², we can see how the high-collision energies only achievable in the LHC are converted into forms of matter never before seen. We can sift through the LHC data to find clues that point us in the right direction to hopefully figure out the next bigger and more effective theory. We can take another step toward our ultimate goal of finding a theory of everything.

    With the LHC now operating at essentially design spec, we can finally use the machine to do what we built it for: to explore new realms, to investigate phenomena never before seen and, stealing a line from my favorite television show, “to boldly go where no one has gone before.” We scientists are excited. We’re giddy. We’re pumped. In fact, there can be but one way to express how we view this upcoming year:

    See the full article here.


  • richardmitnick 3:23 pm on April 7, 2016
    Tags: CERN LHC

    From Symmetry: “Physicists build ultra-powerful accelerator magnet” 

    Symmetry Mag


    Sarah Charley

    Magnet built for LHC

    The next generation of cutting-edge accelerator magnets is no longer just an idea. Recent tests revealed that the United States and CERN have successfully co-created a prototype superconducting accelerator magnet that is much more powerful than those currently inside the Large Hadron Collider.


    Engineers will incorporate more than 20 magnets similar to this model into the next iteration of the LHC, which will take the stage in 2026 and increase the LHC’s luminosity by a factor of ten. That translates into a ten-fold increase in the data rate.

    “Building this magnet prototype was truly an international effort,” says Lucio Rossi, the head of the High-Luminosity (HighLumi) LHC project at CERN. “Half the magnetic coils inside the prototype were produced at CERN, and half at laboratories in the United States.”

    During the original construction of the Large Hadron Collider, US Department of Energy national laboratories foresaw the future need for stronger LHC magnets and created the LHC Accelerator Research Program (LARP): an R&D program committed to developing new accelerator technology for future LHC upgrades.

    MQXF1 quadrupole 1.5-meter prototype magnet sits at Fermilab before testing. G. Ambrosio (US-LARP and Fermilab), P. Ferracin and E. Todesco (CERN TE-MSC)

    This 1.5-meter-long model, which is a fully functioning accelerator magnet, was developed by scientists and engineers at Fermilab [FNAL], Brookhaven National Laboratory [BNL], Lawrence Berkeley National Laboratory [LBL], and CERN.



    The magnet recently underwent an intense testing program at Fermilab, which it passed in March with flying colors. It will now undergo a rigorous series of endurance and stress tests to simulate the arduous conditions inside a particle accelerator.

    This new type of magnet will replace about 5 percent of the LHC’s focusing and steering magnets when the accelerator is converted into the High-Luminosity LHC, a planned upgrade which will increase the number and density of protons packed inside the accelerator. The HL-LHC upgrade will enable scientists to collect data at a much faster rate.

    The LHC’s magnets are made by repeatedly winding a superconducting cable into long coils. These coils are then installed on all sides of the beam pipe and encased inside a superfluid helium cryogenic system. When cooled to 1.9 Kelvin, the coils can carry a huge amount of electrical current with zero electrical resistance. By modulating the amount of current running through the coils, engineers can manipulate the strength and quality of the resulting magnetic field and control the particles inside the accelerator.

    The magnets currently inside the LHC are made from niobium titanium, a superconductor that can operate inside a magnetic field of up to 10 teslas before losing its superconducting properties. This new magnet is made from niobium-tin (Nb₃Sn), a superconductor capable of carrying current through a magnetic field of up to 20 teslas.

    “We’re dealing with a new technology that can achieve far beyond what was possible when the LHC was first constructed,” says Giorgio Apollinari, Fermilab scientist and Director of US LARP. “This new magnet technology will make the HL-LHC project possible and empower physicists to think about future applications of this technology in the field of accelerators.”

    High-Luminosity LHC coil
    High-Luminosity LHC coil similar to those incorporated into the successful magnet prototype shows the collaboration between CERN and the LHC Accelerator Research Program, LARP.
    Photo by Reidar Hahn, Fermilab

    This technology is powerful and versatile—like upgrading from a moped to a motorcycle. But this new super material doesn’t come without its drawbacks.

    “Niobium-tin is much more complicated to work with than niobium titanium,” says Peter Wanderer, head of the Superconducting Magnet Division at Brookhaven National Lab. “It doesn’t become a superconductor until it is baked at 650 degrees Celsius. This heat-treatment changes the material’s atomic structure and it becomes almost as brittle as ceramic.”

    Building a moose-sized magnet from a material more fragile than a teacup is not an easy endeavor. Scientists and engineers at the US national laboratories spent 10 years designing and perfecting a new and internationally reproducible process to wind, form, bake and stabilize the coils.

    “The LARP-CERN collaboration works closely on all aspects of the design, fabrication and testing of the magnets,” says Soren Prestemon of the Berkeley Center for Magnet Technology at Berkeley Lab. “The success is a testament to the seamless nature of the collaboration, the level of expertise of the teams involved, and the ownership shown by the participating laboratories.”

    This model is a huge success for the engineers and scientists involved. But it is only the first step toward building the next big supercollider.

    “This test showed that it is possible,” Apollinari says. “The next step is it to apply everything we’ve learned moving from this prototype into bigger and bigger magnets.”

    See the full article here.


    Symmetry is a joint Fermilab/SLAC publication.

  • richardmitnick 10:58 am on March 24, 2016
    Tags: CERN LHC, New Software

    From Symmetry: “The next big LHC upgrade? Software.” 


    Sarah Charley

    Compatible and sustainable software could revolutionize high-energy physics research.

    Eamonn Maguire / Antarctic Design

    The World Wide Web may have been invented at CERN, but it was raised and cultivated abroad. Now a group of Large Hadron Collider physicists are looking outside academia to solve one of the biggest challenges in physics—creating a software framework that is sophisticated, sustainable and more compatible with the rest of the world.


    “The software we used to build the LHC and perform our analyses is 20 years old,” says Peter Elmer, a physicist at Princeton University. “Technology evolves, so we have to ask, does our software still make sense today? Will it still do what we need 20 or 30 years from now?”

    Elmer is part of a new initiative funded by the National Science Foundation called the DIANA/HEP project, or Data Intensive ANAlysis for High Energy Physics. The DIANA project has one main goal: improve high-energy physics software by incorporating best practices and algorithms from other disciplines.

    “We want to discourage physics from re-inventing the wheel,” says Kyle Cranmer, a physicist at New York University and co-founder of the DIANA project. “There has been an explosion of high-quality scientific software in recent years. We want to start incorporating the best products into our research so that we can perform better science more efficiently.”

    DIANA is the first project explicitly funded to work on sustainable software, but it is not alone in the endeavor to improve the way high-energy physicists perform their analyses. In 2010, physicist Noel Dawe started the rootpy project, a community-driven initiative to improve the interface between ROOT and Python.

    “ROOT is the central tool that every physicist in my field uses,” says Dawe, who was a graduate student at Simon Fraser University when he started rootpy and is currently a fellow at the University of Melbourne. “It does quite a bit, but sometimes the best tool for the job is something else. I started rootpy as a side project when I was a graduate student because I wanted to find ways to interface ROOT code with other tools.”

    Physicists began developing ROOT in the 1990s in the computing language C++. This software has evolved a lot since then, but has slowly become outdated, cumbersome and difficult to interface with new scientific tools written in languages such as Python or Julia. C++ has also evolved over the course of the last twenty years, but physicists must maintain a level of backward compatibility in order to preserve some of their older code.

    “It’s in a bubble,” says Gilles Louppe, a machine learning expert working on the DIANA project. “It’s hard to get in and it’s hard to get out. It’s isolated from the rest of the world.”

    Before coming to CERN, Louppe was a core developer of the machine learning platform scikit-learn, an open source library of versatile data mining and data analysis tools. He is now a postdoctoral researcher at New York University and working closely with physicists to improve the interoperability between common LHC software products and the scientific python ecosystem. Improved interoperability will make it easier for physicists to benefit from global advancements in machine learning and data analysis.

    “Software and technology are changing so fast,” Cranmer says. “We can reap the rewards of industry and everything the world is coming up with.”

    One trend that is spreading rapidly in the data science community is the computational notebook: a hybrid of analysis code, plots and narrative text. Project Jupyter is developing the technology that enables these notebooks. Two developers from the Jupyter team recently visited CERN to work with the ROOT team and further develop the ROOT version, ROOTbook.

    “ROOTbooks represent a confluence of two communities and two technologies,” says Cranmer.

    Physics patterns

    To perform tasks such as identifying and tagging particles, physicists use machine learning. They essentially train their LHC software to identify certain patterns in the data by feeding it thousands of simulations. According to Elmer, this task is like one big “needle in a haystack” problem.

    “Imagine the book Where’s Waldo. But instead of just looking for one Waldo in one picture, there are many different kinds of Waldos and 100,000 pictures every second that need to be analyzed.”

    But what if these programs could learn to recognize patterns on their own with only minimal guidance? Just one small step outside the LHC, a thriving multibillion-dollar industry is doing exactly that.

    “When I take a picture with my iPhone, it instantly interprets the thousands of pixels to identify people’s faces,” Elmer says. Companies like Facebook and Google are also incorporating more and more machine learning techniques to identify and catalogue information so that it is instantly accessible anywhere in the world.
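    The training-on-simulation idea can be sketched in a few lines of plain Python. Everything below is a toy (one invented feature per event, Gaussian "signal" and "background" shapes); real LHC classifiers use many features and far more sophisticated algorithms:

```python
import random
import statistics

random.seed(42)

# Toy version of training on simulation: "signal" and "background" events
# are each summarized by one feature, drawn from different Gaussians.
signal_train = [random.gauss(1.0, 1.0) for _ in range(5000)]
background_train = [random.gauss(-1.0, 1.0) for _ in range(5000)]

# "Training" here just estimates each class's mean; a new event is labeled
# by which mean it is closer to (for equal-width Gaussians this midpoint
# cut is the optimal likelihood-ratio test).
mu_s = statistics.mean(signal_train)
mu_b = statistics.mean(background_train)
threshold = (mu_s + mu_b) / 2

def is_signal(x):
    return x > threshold

# Evaluate on fresh simulated events.
test_s = [random.gauss(1.0, 1.0) for _ in range(2000)]
test_b = [random.gauss(-1.0, 1.0) for _ in range(2000)]
correct = sum(is_signal(x) for x in test_s) + sum(not is_signal(x) for x in test_b)
accuracy = correct / 4000
print(accuracy)  # ~0.84 for classes separated by two standard deviations
```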

    Organizations such as Google, Facebook and Russia’s Yandex are releasing more and more tools as open source. Scientists in other disciplines, such as astronomy, are incorporating these tools into the way they do science. Cranmer hopes that high-energy physics will move to a model that makes it easier to take advantage of these new offerings as well.

    “New software can expand the reach of what we can do at the LHC,” Cranmer says. “The potential is hard to guess.”

    See the full article here.



  • richardmitnick 12:12 pm on March 22, 2016
    Tags: CERN LHC

    From Symmetry: “Why are particle accelerators so large?” 



    Sarah Charley

    The Large Hadron Collider at CERN is a whopping 27 kilometers in circumference. Edda Gschwendtner, physicist and project leader for CERN’s plasma wakefield acceleration experiment (AWAKE), explains why scientists use such huge machines.

    Access mp4 video here.


    We can only see so much with the naked eye. To see things that are smaller, we use a microscope, and to see things that are further away, we use a telescope. The more powerful the tool, the more we can see.

    Particle accelerators are tools that allow us to probe both the fundamental components of nature and the evolution and origin of all matter in the visible (and maybe even the invisible?) universe. The more powerful the accelerator, the further we can see into the infinitely small and the infinitely large.

    You can think about particle accelerators like a racetrack for particles. Racecars don’t start out going 200 miles per hour—they must gradually accelerate over time on either a large circular racetrack or a long, straight road.

    In physics, these two types of “tracks” are circular accelerators and linear accelerators.

    Particles in circular accelerators gradually gain energy as they race through an accelerating structure at a certain position in the ring. For instance, the protons in the LHC make 11,000 laps every second for 20 minutes before they reach their collision energy. During their journey, magnets guide the particles around the bends in the accelerator and keep them on course.
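A quick sanity check on those figures: 11,000 laps per second around a 27-kilometer ring works out to very nearly the speed of light, and a 20-minute ramp amounts to millions of laps.

```python
# Quick check of the figures above: 11,000 laps per second around a 27 km
# ring means the protons move at essentially the speed of light.
C_KM_S = 299_792.458          # speed of light in km/s
RING_KM = 27                  # LHC circumference, as quoted in the text
LAPS_PER_S = 11_000

speed_km_s = LAPS_PER_S * RING_KM     # 297,000 km/s
print(round(speed_km_s / C_KM_S, 3))  # 0.991 -> about 99% of light speed

# Over the 20-minute energy ramp quoted above:
laps_total = LAPS_PER_S * 20 * 60
print(f"{laps_total:,} laps")         # 13,200,000 laps
```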

    But just like a car on a curvy mountain road, the particles’ energy is limited by the curves in the accelerators. If the turns are too tight or the magnets are too weak, the particles will eventually fly off course.

    Linear accelerators don’t have this problem, but they face an equally tough constraint: particles have only the length of the track, along which they pass through accelerating structures, in which to reach their desired energy. Once they reach the end, that’s it.

    So if we want to look deeper into matter and further back toward the start of the universe, we have to go higher in energy, which means we need more powerful tools.

    One option is to build larger accelerators—linear accelerators hundreds of miles long or giant circular accelerators with long, mellow turns.

    We can also invest in our technology. We can develop accelerating structure techniques to rapidly and effectively accelerate particles in linear accelerators over a short distance. We can also design and build incredibly strong magnets—stronger than anything that exists today—that can bend ultra-high energy particles around the turns in circular accelerators.

    Realistically, the future tools we use to look into the infinitely small and infinitely large will involve a combination of technological advancement and large-scale engineering to bring us closer to understanding the unknown.

    Have a burning question about particle physics? Let us know via email or Twitter (using the hashtag #AskSymmetry). We might answer you in a future video!


  • richardmitnick 11:45 am on March 22, 2016
    Tags: CERN LHC, The LHC wakes up from its winter break

    From CERN: “The LHC wakes up from its winter break” 



    21 Mar 2016
    Harriet Kim Jarlett

    These are distribution feed boxes, which bring power to the magnets. There are 52 of them around the LHC, in 4 different sizes and characteristics. Big copper cables transfer the current into tiny superconducting cables. (Image: Maximilien Brice/CERN)

    It’s March already, and time for the LHC to wake up from its short winter break. The LHC was the last machine to be handed back to operators after the completion of maintenance work carried out during the Year-End Technical Stop (YETS), which began on 14 December 2015.

    During the past eleven weeks several maintenance activities took place in all the accelerators and beam lines. They included the replacement of the LHC beam absorbers for injection (TDIs) that are used to absorb the SPS beam if a problem occurs, providing vital protection for the LHC, maintenance at several points of the cryogenic system, the replacement of 18 magnets in the Super Proton Synchrotron and an extensive campaign to identify and remove thousands of obsolete cables.

    The YETS also gave the experiments the opportunity to carry out repairs and maintenance work on their detectors. In particular, at CMS, the cold box, which had caused problems for the experiment’s magnet during 2015, was cleaned and a few water leaks on the site were fixed.

    Powering tests began on 4 March and finished on 18 March 2016, marking the first step towards the first beams of the year. It was a tight schedule, with the tests allotted just 14 days before moving on to machine checkout and then commissioning with beam around Easter. During that time, over 8500 tests were performed on the 1600 circuits. Even though the tests were executed automatically, the experts in charge of running and analysing them needed to pay careful attention to the thousands of multi-coloured signals on their screens.

    LHC machine operators at work during powering tests of the LHC superconducting circuits. Powering is the first milestone in the LHC machine restart process. In this stage, the LHC operators run current tests of the superconducting circuits to check the protection functionalities, the powering chain and the capability of the circuits to reach the values needed for operation. In total, more than 8500 test steps were performed on the 1600 circuits in less than 2 weeks. (Image: Maximilien Brice/CERN)

    Last year marked a great start to Run 2. The objective was to establish proton-proton collisions at 13 TeV with 25 nanosecond bunch spacing.

    2015 was a learning year for CERN, and by the time the machines were switched off for the end-of-year break a great deal was known about how to operate the LHC at the new higher energy, with shorter bunch spacing, allowing physicists to get many more bunches of particles into the beam and thereby deliver more data to the experiments.

    “This was a great result but, to put it into context, the goal for the whole of Run 2 is to deliver 100 fb-1 by the end of 2018, so we still have a long way to go”, says Frédérick Bordry, Director for accelerators and technology.

    “It would be easy to think that LHC running is becoming routine, and in many ways it is. Nevertheless, the year-end technical stop is a vital part of the running cycle and much has been accomplished over this short winter break,” he concludes.



  • richardmitnick 7:32 am on February 23, 2016
    Tags: CERN LHC

    From LBL: “Updated Workflows for New LHC” 


    February 22, 2016
    Linda Vu 510-495-2402

    After a massive upgrade, the Large Hadron Collider (LHC), the world’s most powerful particle collider, is now smashing particles at an unprecedented 13 tera-electron-volts (TeV)—nearly double the energy of its previous run from 2010-2012. In just one second, the LHC can now produce up to 1 billion collisions and generate up to 10 gigabytes of data in its quest to push the boundaries of known physics. And over the next decade, the LHC will be further upgraded to generate about 10 times more collisions and data.


    To deal with the new data deluge, researchers working on one of the LHC’s largest experiments—ATLAS—are relying on updated workflow management tools developed primarily by a group of researchers at the Lawrence Berkeley National Laboratory (Berkeley Lab). Papers highlighting these tools were recently published in the Journal of Physics: Conference Series.


    “The issue with High Luminosity LHC is that we are producing ever-increasing amounts of data, faster than Moore’s Law and cannot actually see how we can do all of the computing that we need to do with the current software that we have,” says Paolo Calafiura, a scientist in Berkeley Lab’s Computational Research Division (CRD). “If we don’t either find new hardware to run our software or new technologies to make our software run faster in ways we can’t anticipate, the only choice that we have left is to be more selective in the collision events that we record. But, this decision will of course impact the science and nobody wants to do that.”

    To tackle this problem, Calafiura and his colleagues in the Berkeley Lab ATLAS Software group are developing new software tools, called Yoda and AthenaMP, that speed up the analysis of ATLAS data by leveraging next-generation Department of Energy (DOE) supercomputers like the National Energy Research Scientific Computing Center’s (NERSC’s) Cori system, as well as DOE’s current Leadership Computing Facilities.

    NERSC Cray Cori supercomputer

    Yoda: Treating Single Supercomputers like the LHC Computing Grid

    Around the world, researchers rely on the LHC Computing Grid to process the petabytes of data collected by LHC detectors every year. The grid comprises 170 networked computing centers in 36 countries. CERN’s computing center, where the LHC is located, is ‘Tier 0’ of the grid. It processes the raw LHC data, and then divides it into chunks for the other Tiers. Twelve ‘Tier 1’ computing centers then accept the data directly from CERN’s computers, further process the information and then break it down into even more chunks for the hundreds of computing centers further down the grid. Once a computer finishes its analysis, it sends the findings to a centralized computer and accepts a new chunk of data.

    Like air traffic controllers, special software manages workflow on the computing grid for each of the LHC experiments. The software is responsible for breaking down the data, directing the data to its destination, telling systems on the grid when to execute an analysis and when to store information. To deal with the added deluge of data from the LHC’s upgraded ATLAS experiment, Vakhtang Tsulaia from the Berkeley Lab’s ATLAS Software group added another layer of software to the grid, called the Yoda Event Service system.

    The researchers note that the idea behind Yoda is to replicate the LHC Computing Grid workflow on a supercomputer. As soon as a job arrives at the supercomputer, Yoda breaks the data chunk down into even smaller units, representing individual events or event ranges, and then assigns those jobs to different compute nodes. Because only the portion of the job that will be processed is sent to a compute node, the node no longer needs to stage the entire file before executing a job, so processing starts relatively quickly.
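The event-splitting idea can be sketched roughly as follows. This is illustrative Python only; the function names, range sizes and round-robin assignment are invented for the sketch and do not reflect the real Yoda code.

```python
from collections import deque

# Illustrative sketch of the event-service idea: split a job into small
# event ranges and deal them out to compute nodes, so no node has to
# stage the whole input file first. Names and sizes are invented.

def make_ranges(first_event, n_events, range_size):
    """Return half-open (start, end) event ranges covering the job."""
    stop = first_event + n_events
    return [(i, min(i + range_size, stop))
            for i in range(first_event, stop, range_size)]

def dispatch(ranges, n_nodes):
    """Round-robin the event ranges over the available compute nodes."""
    queue = deque(ranges)
    assignments = {node: [] for node in range(n_nodes)}
    node = 0
    while queue:
        assignments[node].append(queue.popleft())
        node = (node + 1) % n_nodes
    return assignments

ranges = make_ranges(0, 1000, 100)   # 10 ranges of 100 events each
work = dispatch(ranges, 4)
print(len(work[0]), len(work[3]))    # 3 2  (nodes 0-1 get 3 ranges, 2-3 get 2)
```

Each node receives only small (start, end) descriptors rather than the full file, which is the point the paragraph above makes about avoiding staging.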

    To efficiently take advantage of available HPC resources, Yoda is also flexible enough to adapt to a variety of scheduling options—from backfilling to large time allocations. After processing the individual events or event ranges, Yoda saves the output to the supercomputer’s shared file system so that these jobs can be terminated at any time with minimal data loss. This means that Yoda jobs can now be submitted to the HPC batch queue in backfilling mode: if the supercomputer is not utilizing all of its cores for a certain amount of time, Yoda can automatically detect that and submit a properly sized job to the batch queue to use those resources.

    “Yoda acts like a daemon that is constantly submitting jobs to take advantage of available resources; this is what we call opportunistic computing,” says Calafiura.

    In early 2015, the team tested Yoda’s performance by running ATLAS jobs from the previous LHC run on NERSC’s Edison supercomputer and successfully scaled up to 50,000 processor cores.

    NERSC Cray Edison supercomputer

    AthenaMP: Adapting ATLAS Workloads for Massively Parallel Systems

    In addition to Yoda, the Berkeley Lab ATLAS software group also developed the AthenaMP software that allows the ATLAS reconstruction, simulation and data analysis framework to run efficiently on massively parallel systems.

    “Memory has always been a scarce resource for ATLAS reconstruction jobs. In order to optimally exploit all available CPU-cores on a given compute node, we needed to have a mechanism that would allow the sharing of memory pages between processes or threads,” says Calafiura.

    AthenaMP addresses the memory problem by leveraging the Linux fork and copy-on-write mechanisms. So when a node receives a task to process, the job is initialized on one core and sub-processes are forked to other cores, which then process all of the events assigned to the initial task. This strategy allows for the sharing of memory pages between event processors running on the same compute node.

    By running ATLAS reconstruction in one AthenaMP job with several worker processes, the team notes that they achieved a significantly reduced overall memory footprint when compared to running the same number of independent serial jobs. And, for certain configurations of the ATLAS production jobs they’ve managed to reduce the memory usage by a factor of two.
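The saving from sharing read-only pages can be illustrated with back-of-the-envelope arithmetic. The memory figures below are invented for illustration; the article reports a factor-of-two reduction for certain production configurations.

```python
# Back-of-the-envelope version of the saving described above: N serial
# jobs each load their own copy of the shared, read-only data, while one
# forked AthenaMP-style job keeps a single copy-on-write copy of it.
# The memory figures are invented for illustration.
shared_mb = 1500    # read-only data every worker needs (hypothetical)
private_mb = 500    # per-worker working memory (hypothetical)
workers = 8

serial_mb = workers * (shared_mb + private_mb)   # every job duplicates it all
forked_mb = shared_mb + workers * private_mb     # one shared copy
print(serial_mb, forked_mb, round(serial_mb / forked_mb, 2))  # 16000 5500 2.91
```

The actual ratio depends on how much of a job's memory is genuinely read-only and stays unmodified after the fork; pages that a worker writes to get copied and lose the saving.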

    “Our goal is to get onto more hardware and these tools help us do that. The massive scale of many high performance systems means that even a small fraction of computing power can yield large returns in processing throughput for high energy physics,” says Calafiura.

    This work was supported by DOE’s Office of Science.

    Read the papers:

    Fine grained event processing on HPCs with the ATLAS Yoda system

    Running ATLAS workloads within massively parallel distributed applications using Athena Multi-Process framework (AthenaMP): http://iopscience.iop.org/article/10.1088/1742-6596/664/7/072050

    About Computing Sciences at Berkeley Lab

    The Lawrence Berkeley National Laboratory (Berkeley Lab) Computing Sciences organization provides the computing and networking resources and expertise critical to advancing the Department of Energy’s research missions: developing new energy sources, improving energy efficiency, developing new materials and increasing our understanding of ourselves, our world and our universe.

    ESnet, the Energy Sciences Network, provides the high-bandwidth, reliable connections that link scientists at 40 DOE research sites to each other and to experimental facilities and supercomputing centers around the country. The National Energy Research Scientific Computing Center (NERSC) powers the discoveries of 6,000 scientists at national laboratories and universities, including those at Berkeley Lab’s Computational Research Division (CRD). CRD conducts research and development in mathematical modeling and simulation, algorithm design, data storage, management and analysis, computer system architecture and high-performance software implementation.


    A U.S. Department of Energy National Laboratory Operated by the University of California


  • richardmitnick 5:11 pm on December 22, 2015
    Tags: CERN LHC, Fabiola Gianotti

    From Nature: “CERN’s next director-general on the LHC and her hopes for international particle physics” 


    22 December 2015
    Elizabeth Gibney

    Fabiola Gianotti talks to Nature ahead of taking the helm at Europe’s particle-physics laboratory on 1 January.

    Fabiola Gianotti is the incoming director-general of CERN. Maximilien Brice/CERN

    Fabiola Gianotti, the Italian physicist who announced the discovery of the Higgs boson in 2012, will from 1 January take charge at CERN, the laboratory near Geneva, Switzerland, where the particle was found.

    Gianotti spoke to Nature ahead of taking up the post, to discuss hints of new physics at the upgraded Large Hadron Collider (LHC), China’s planned accelerators and CERN’s worldwide ambitions — as well as how to deal with egos.


    How excited should we be about the latest LHC results, which already hint at signals that could turn out to be due to new physics phenomena?

    At the moment, experiments are seeing some fluctuations and hints, which, if they are due to signals from new physics, will next year consolidate with the huge amount of data the LHC will deliver. On the other hand, if they are just fluctuations, they will disappear. We have to be patient. In addition to looking for new physics, we are going to study the Higgs boson with very high precision.

    Will any of the hints that we’ve already seen be directing the physicists’ searches?

    I don’t think that the direction of exploration is being guided by the hints people see here and there. The correct approach is to be totally open and not be driven by our prejudices, because we don’t know where new physics is, or how it will look.

    Following the LHC’s energy upgrade, data collection in the 2015 run has been slower than hoped. How would you characterize it so far?

    Run 2 has been extremely successful. We have recorded about 4 inverse femtobarns of data [roughly equivalent to 400 trillion proton–proton collisions]. The initial goal was between 8 and 10 femtobarns, so it’s less. However, a huge number of challenges have been addressed and solved. So for me, this is more important than accumulating collisions. We could have accumulated more, but only by not addressing the challenges that will allow us to make a big jump in terms of intensity of the beams next year.
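The bracketed conversion in that answer (4 inverse femtobarns to roughly 400 trillion collisions) follows from events = integrated luminosity x cross-section, taking the inelastic proton-proton cross-section at 13 TeV to be about 80 millibarns, an approximate figure assumed here for illustration:

```python
# Checking the bracketed conversion: number of collisions = integrated
# luminosity x cross-section. The ~80 mb inelastic pp cross-section at
# 13 TeV is an approximate, assumed figure for illustration.
lumi_fb_inv = 4.0                    # inverse femtobarns recorded in 2015
lumi_cm2_inv = lumi_fb_inv * 1e39    # 1 fb^-1 = 1e39 cm^-2
sigma_inel_cm2 = 80e-3 * 1e-24       # 80 millibarns in cm^2

collisions = lumi_cm2_inv * sigma_inel_cm2
print(f"{collisions:.2e}")           # 3.20e+14, a few hundred trillion
```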

    In 2015, one LHC paper had more than 5,000 authors. There must be some people on such experiments who want more credit for their efforts. How do you deal with the clash of egos?

    I think the collaborations accept very well this idea that everybody signs the paper, and I am also a strong supporter of that. The reason is simple: you can be the guy who has a good idea to do a very cute analysis, and so get very nice results. But you would not have been able to do the analysis if many other people had not built the detectors that gave you the data. None of these experiments is a one-man show; they are the work of thousands of people who have all contributed in their domain and all equally deserve to sign the paper.

    I hope that universities, advancement committees and boards that hire people understand that just because there are many authors, that does not mean the individual did not make an important contribution.

    CERN is currently at the heart of international particle physics, but China is designing a future collider that could succeed the LHC after 2035. Do you think that China could become the world’s centre for particle physics in the 2040s?

    At the moment there are many conceptual design studies for future big accelerators around the world. Of course conceptual studies are important, but there is a big step between studies and future reality. I think it is very good that all regions in the world show an interest and commitment to thinking about the future of particle physics. It’s a very good sign of a healthy discipline.

    Is there a chance that China might become a CERN member?

    Before becoming a full member, you become an associate member, and associate membership is something that can be conceived [for China]. So we will see in the coming years if this can become a reality. It’s an interesting option to explore.

    Do you plan to encourage more countries to become CERN members?

    Of course. A lot has been done since 2010 to enlarge CERN membership, in terms of associate members in particular, but also [full] members: we got Israel, for instance, and soon we will get Romania. I will continue along this direction.

    Some people think that future governments will be unwilling to fund larger and more expensive facilities. Do you think a collider bigger than the LHC will ever be built? And will it depend on the LHC finding something new?

    The outstanding questions in physics are important and complex and difficult, and they require the deployment of all the approaches the discipline has developed, from high-energy colliders to precision experiments and cosmic surveys. High-energy accelerators have been our most powerful tools of exploration in particle physics, so we cannot abandon them. What we have to do is push the research and development in accelerator technology, so that we will be able to reach higher energy with compact accelerators.

    Nature doi:10.1038/nature.2015.19040


    Nature is a weekly international journal publishing the finest peer-reviewed research in all fields of science and technology on the basis of its originality, importance, interdisciplinary interest, timeliness, accessibility, elegance and surprising conclusions. Nature also provides rapid, authoritative, insightful and arresting news and interpretation of topical and coming trends affecting science, scientists and the wider public.

  • richardmitnick 2:19 pm on December 18, 2015
    Tags: CERN LHC

    From Symmetry: “CERN and US increase cooperation” 


    Sarah Charley

    Eric Bridiers, US Mission

    The United States and the European physics laboratory have formally agreed to partner on continued LHC research, upcoming neutrino research and a future collider.

    Today in a ceremony at CERN, US Ambassador to the United Nations Pamela Hamamoto and CERN Director-General Rolf Heuer signed five formal agreements that will serve as the framework for future US-CERN collaboration.

    These protocols augment the US-CERN cooperation agreement signed in May 2015 in a White House ceremony and confirm the United States’ continued commitment to research at the Large Hadron Collider.


    They also officially expand the US-CERN partnership to include work on a US-based neutrino research program and on the study of a future circular collider at CERN.

    “This is truly a good day for the relationship between CERN and the United States,” says Hamamoto, US permanent representative to the United Nations in Geneva. “By working together across borders and cultures, we challenge our knowledge and push back the frontiers of the unknown.”

    The partnership between the United States and CERN dates back to the 1950s, when American scientist Isidor Rabi served as one of CERN’s founding members.

    “Today’s agreements herald a new era in CERN-US collaboration in particle physics,” Heuer says. “They confirm the US commitment to the LHC project, and for the first time, they set down in black and white European participation through CERN in pioneering neutrino research in the US. They are a significant step towards a fully connected trans-Atlantic research program.”

    Today, the United States is the most represented nation in both the ATLAS and CMS collaborations at the LHC.



    Its contributions are sponsored through the US Department of Energy’s Office of Science and the National Science Foundation.

    According to the new protocols, the United States will continue to support the LHC program through participation in the ATLAS, CMS and ALICE experiments.


    The LHC Accelerator Research Program, an R&D partnership between five US national laboratories, plans to develop powerful new magnets and accelerating cavities for an upgrade to the accelerator called the High-Luminosity LHC, scheduled to begin at the end of this decade.

    In addition, a joint neutrino-research protocol will enable a new type of reciprocal relationship to blossom between CERN and the US.

    “The CERN neutrino platform is an important development for CERN,” says Marzio Nessi, its coordinator. “It embodies CERN’s undertaking to foster and contribute to fundamental research in neutrino physics at particle accelerators worldwide, notably in the US.”

    The agreement will enable scientists and engineers working at CERN to participate in the design and development of technology for the Deep Underground Neutrino Experiment, a Fermilab-hosted experiment that will explore the mystery of neutrino oscillations and neutrino mass.


    For the first time, CERN will serve as a platform for scientists participating in a major research program hosted on another continent. CERN will serve as a European base for scientists working on the DUNE experiment and on short-baseline neutrino research projects also hosted by the United States.

    Finally, the protocols pave the way beyond the LHC research program. The United States and CERN will collaborate on physics and technology studies aimed at the development of a proposed new circular accelerator, with the aim of reaching seven times higher energies than the LHC.

    The protocols take effect immediately and will be renewed automatically on a five-year basis.


  • richardmitnick 5:17 pm on December 9, 2015
    Tags: CERN LHC

    From Symmetry: “Save the particles” 


    Sarah Charley

    To learn more about the particles they collide, physicists turn their attention to a less destructive type of collision in the LHC.

    CMS. Maximilien Brice, CERN

    Every second, the Large Hadron Collider generates millions of particle collisions. Scientists watching these interactions usually look out for only the most spectacular ones.


    But recently they’ve also taken an interest in some gentler moments, during which the accelerated particles interact with photons, quanta of light.

    When charged particles—like the protons the LHC usually collides or the lead ions it is colliding right now—are forced around bends in an accelerator, they lose energy in the form of light radiation.

    Originally, physicists perceived this photon leak as a nuisance. But today, laboratories around the world specifically build accelerators to produce it. They can use this high-energy light to take high-speed images of materials and processes in the tiniest detail.

    Scientists are now using the LHC as a kind of light source to figure out what’s going on inside the protons and ions they collide.

    The LHC’s accelerated particles are chock-full of energy. When protons collide—or, more specifically, when the quarks and gluons that make up protons interact—their energy is converted into mass, which manifests as other particles, such as Higgs bosons.

    Those particles decay back into energy as they sail through particle detectors set up around the collision points, leaving their signatures behind. Physicists usually study these particles, the ones created in collisions.

    In proton-photon collisions, however, they can study the protons themselves. That’s because photons can traverse a particle’s core without rupturing its structure. They pass harmlessly through the proton, creating new particles along the way.

    “When a high-energy light wave hits a proton, it produces particles—all kinds of particles—without breaking the proton,” says Daniel Tapia Takaki, an assistant professor at the University of Kansas who is a part of the CMS collaboration. “These particles are recorded by our detector and allow us to reconstruct an unprecedentedly high-quality picture of what’s inside.”

    Tapia Takaki is interested in using these photon-induced interactions to study the density of gluons inside high-energy protons and nuclei.

    As a proton is accelerated to close to the speed of light, its gluons swell and eventually split—like cells dividing in an embryo. Scientists want to know: Just how packed are gluons inside these protons? And what can that tell us about what happens when they collide?

    The Standard Model—a well-vetted theory that describes the properties of subatomic particles—predicts that the density of gluons inside a proton is directly related to the likelihood that a proton will spit out a pair of charm quarks in the form of a J/psi particle during a proton-photon interaction.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    “So by measuring the J/psi’s production rate very precisely, we can automatically have access to the density of gluons,” Tapia Takaki says.

    Prior to joining the CMS experiment, Tapia Takaki worked with colleagues on the ALICE experiment to conduct a similar study of photon-lead interactions.


    Tapia Takaki plans to study the lead ions currently being collided in the LHC in more detail with his current team.

    The trickiest part of these studies isn’t applying the equation, but identifying the collisions, Tapia Takaki says.

    To identify subtle proton-photon and photon-lead collisions, Tapia Takaki and his colleagues must carefully program their experiments to cherry-pick and record events in which there’s no evidence of protons colliding—yet there is still evidence of the production of low-energy particles.

    “It’s challenging because the interactions of light with protons or lead ions take place all the time,” Tapia Takaki says. “We had to find a way to record these events without overloading the detector’s bandwidth.”
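The cherry-picking logic described above can be sketched as a simple event filter: keep only events with no sign of a hard proton-proton collision but with some low-energy activity. This is a hypothetical illustration; the thresholds and event fields are invented and are not the real CMS trigger configuration.

```python
# Sketch of the selection described above: quiet forward detectors (no
# sign of a hard collision) plus a little soft central activity.
# Thresholds and field names are invented for illustration.

def is_photon_induced(event):
    quiet_forward = event["forward_energy_gev"] < 3.0   # nothing broke apart
    few_tracks = 1 <= event["n_tracks"] <= 4            # a handful of particles
    soft = event["leading_track_gev"] < 2.0             # all of them low-energy
    return quiet_forward and few_tracks and soft

events = [
    {"forward_energy_gev": 250.0, "n_tracks": 90, "leading_track_gev": 45.0},
    {"forward_energy_gev": 1.2,   "n_tracks": 2,  "leading_track_gev": 0.8},
]
print([is_photon_induced(e) for e in events])   # [False, True]
```

In practice such a predicate has to run inside the trigger at very high rate, which is exactly the bandwidth problem the quote describes.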

