Tagged: CERN LHC

  • richardmitnick 9:20 am on October 20, 2016
    Tags: CERN LHC, Missing transverse energy (MET), What Happens When Energy Goes Missing?

    From particlebites: “What Happens When Energy Goes Missing?” 

    October 11, 2016
    Julia Gonski

    Article: Performance of algorithms that reconstruct missing transverse momentum in √s = 8 TeV proton-proton collisions in the ATLAS detector
    Authors: The ATLAS Collaboration
    Reference: arXiv:1609.09324

    CERN/ATLAS detector

    The ATLAS experiment recently released a note detailing the nature and performance of algorithms designed to calculate what is perhaps the most difficult quantity in any LHC event: missing transverse energy.

    Figure 1: LHC momentum conservation.

    Figure 2: ATLAS event display showing MET balancing two jets.

    Missing transverse energy (MET) is so difficult to measure because, by its very nature, it is missing and therefore unobservable in the detector. So where does this missing energy come from, and why do we even need to reconstruct it?

    The LHC accelerates protons towards one another on the same axis, so that they collide head on.

    Therefore, the incoming partons have net momentum along the direction of the beamline, but no net momentum in the transverse direction (see Figure 1). MET is then defined as the negative vector sum (in the transverse plane) of the momenta of all recorded particles. Any nonzero MET indicates a particle that escaped the detector. This escaping particle could be a regular Standard Model neutrino, or something much more exotic, such as the lightest supersymmetric particle or a dark matter candidate.
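    Written out (standard notation for the definition just given):

```latex
% The missing transverse momentum vector is the negative vector sum of the
% transverse momenta of all reconstructed particles; MET is its magnitude.
\vec{E}_T^{\,\text{miss}} = -\sum_{i \in \text{particles}} \vec{p}_{T,i},
\qquad
E_T^{\text{miss}} = \bigl| \vec{E}_T^{\,\text{miss}} \bigr|
```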

    Figure 2 shows an event display where the calculated MET balances the visible objects in the detector. In this case, these visible objects are jets, but they could also be muons, photons, electrons, or taus. These objects constitute the “hard term” in the MET calculation. Often there are also contributions of energy in the detector that are not associated with a particular physics object but may still be necessary for an accurate measurement of MET. This momentum is known as the “soft term”.
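    As a minimal sketch of that bookkeeping (illustrative Python, not ATLAS code; the object lists and values are made up), the calculation sums the visible transverse-momentum vectors, hard term plus soft term, and flips the sign:

```python
import math

# Toy MET reconstruction: negative vector sum of everything visible.
def met(hard_objects, soft_term=(0.0, 0.0)):
    """hard_objects: list of (pt [GeV], phi [rad]) for jets, leptons, photons, taus.
    soft_term: (px, py) of energy not associated with any physics object."""
    px = soft_term[0] + sum(pt * math.cos(phi) for pt, phi in hard_objects)
    py = soft_term[1] + sum(pt * math.sin(phi) for pt, phi in hard_objects)
    mex, mey = -px, -py  # MET points opposite the visible momentum sum
    return math.hypot(mex, mey), math.atan2(mey, mex)

# Two jets that do not quite balance -> nonzero MET recoiling against them.
met_value, met_phi = met([(120.0, 0.1), (80.0, math.pi)])
```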

    In the course of looking at all the energy in the detector for a given event, inevitably some pileup will sneak in. The pileup could be contributions from additional proton-proton collisions in the same bunch crossing, or from scattering of protons upstream of the interaction point. Either way, the MET reconstruction algorithms have to take this into account. Adding up energy from pileup could lead to more MET than was actually in the collision, which could mean the difference between an observation of dark matter and just another Standard Model event.

    One of the ways to suppress pileup is to use a quantity called jet vertex fraction (JVF), which uses the additional information from tracks associated with jets. If the tracks do not point back to the initial hard scatter, they can be tagged as pileup and excluded from the calculation. This is the idea behind the Track Soft Term (TST) algorithm. Another way to remove pileup is to estimate the average energy density in the detector due to pileup using event-by-event measurements, and then subtract this baseline energy. This is used in the Extrapolated Jet Area with Filter (EJAF) algorithm.
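    A minimal sketch of the JVF idea (the function names and the 0.5 cut are illustrative assumptions, not the values used by ATLAS):

```python
# Jet vertex fraction: the share of a jet's matched track pT that points
# back to the hard-scatter vertex. A low JVF suggests a pileup jet.
def jet_vertex_fraction(tracks, hard_scatter_vertex):
    """tracks: list of (pt, vertex_id) for tracks matched to the jet."""
    total_pt = sum(pt for pt, _ in tracks)
    if total_pt == 0.0:
        return -1.0  # no tracks, e.g. jet outside the tracker acceptance
    hs_pt = sum(pt for pt, vtx in tracks if vtx == hard_scatter_vertex)
    return hs_pt / total_pt

def is_pileup_jet(tracks, hard_scatter_vertex, threshold=0.5):
    jvf = jet_vertex_fraction(tracks, hard_scatter_vertex)
    # Tag jets whose tracks mostly come from other vertices; such jets can
    # then be excluded from the MET calculation.
    return 0.0 <= jvf < threshold
```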

    Once these algorithms are designed, they are tested in two different types of events. The first is W to lepton + neutrino decay signatures. These events should all have some amount of real missing energy from the neutrino, so they can easily reveal how well the reconstruction is working. The second is Z boson to two-lepton events. These events should have no real missing energy (no neutrinos), so with them it is possible to see if and how the algorithm reconstructs fake missing energy. Fake MET often comes from miscalibration or mismeasurement of physics objects in the detector. Figures 3 and 4 show the calorimeter soft MET distributions in these two samples; here it is easy to see the shape difference between real and fake missing energy.

    Figure 3: Distribution of the sum of missing energy in the calorimeter soft term (“fake MET”) shown in Z to μμ data and Monte Carlo events.

    Figure 4: Distribution of the sum of missing energy in the calorimeter soft term (“real MET”) shown in W to eν data and Monte Carlo events.

    This note evaluates the performance of these algorithms in 8 TeV proton-proton collision data collected in 2012. Perhaps the most important metric of MET reconstruction performance is the resolution, since this tells you how well you know your MET value. Intuitively, the resolution depends on the detector resolution of the objects that went into the calculation, and because of pileup, it gets worse as the number of vertices grows. The resolution is technically defined as the RMS of the combined distribution of MET in the x and y directions, covering the full transverse plane of the detector. Figure 5 shows the resolution as a function of the number of vertices in Z to μμ data for several reconstruction algorithms. Here you can see that the TST algorithm has a very small dependence on the number of vertices, implying good stability of the resolution against pileup.
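    A sketch of how such a resolution curve could be computed from per-event quantities (illustrative numpy, not the analysis code from the note):

```python
import numpy as np

# Resolution = RMS of the pooled MET(x), MET(y) distribution, computed in
# bins of the number of primary vertices (NPV).
def met_resolution_vs_npv(met_x, met_y, npv, npv_bin_edges):
    met_x, met_y, npv = map(np.asarray, (met_x, met_y, npv))
    resolutions = []
    for lo, hi in zip(npv_bin_edges[:-1], npv_bin_edges[1:]):
        in_bin = (npv >= lo) & (npv < hi)
        pooled = np.concatenate([met_x[in_bin], met_y[in_bin]])
        resolutions.append(np.sqrt(np.mean(pooled**2)) if pooled.size else np.nan)
    return resolutions
```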

    Figure 5: Resolution obtained from the combined distribution of MET(x) and MET(y) for five algorithms as a function of NPV in 0-jet Z to μμ data.

    Another important quantity to measure is the angular resolution, which is important in the reconstruction of kinematic variables such as the transverse mass of the W. It can be measured in W to μν simulation by comparing the direction of the MET, as reconstructed by the algorithm, to the direction of the true MET. The resolution is then defined as the RMS of the distribution of the phi difference between these two vectors. Figure 6 shows the angular resolution of the same five algorithms as a function of the true missing transverse energy. Note the feature between 40 and 60 GeV, where there is a transition region into events with high pT calibrated jets. Again, the TST algorithm has the best angular resolution for this topology across the entire range of true missing energy.
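    The corresponding calculation is a one-liner once the φ difference is wrapped into a single period (again an illustrative sketch, not the ATLAS implementation):

```python
import numpy as np

# Angular resolution: RMS of the phi difference between reconstructed and
# true MET, with the difference wrapped into [-pi, pi).
def dphi_resolution(phi_reco, phi_true):
    dphi = np.asarray(phi_reco) - np.asarray(phi_true)
    dphi = (dphi + np.pi) % (2.0 * np.pi) - np.pi
    return np.sqrt(np.mean(dphi**2))
```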

    Figure 6: Resolution of ΔΦ(reco MET, true MET) for 0 jet W to μν Monte Carlo.

    As the High Luminosity LHC (HL-LHC) looms larger and larger, MET reconstruction will become a hot topic in the ATLAS collaboration. In particular, the HL-LHC will be a very high-pileup environment, and many new pileup-subtraction studies are underway. Additionally, there is no lack of exciting theories predicting new particles in Run 3 that are invisible to the detector. As long as these hypothetical invisible particles are being discussed, the MET teams will be working hard to catch them, so we can safely expect some innovation in these methods in the next few years.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    What is ParticleBites?

    ParticleBites is an online particle physics journal club written by graduate students and postdocs. Each post presents an interesting paper in a brief format that is accessible to undergraduate students in the physical sciences who are interested in active research.

    The papers are accessible on the arXiv preprint server. Most of our posts are based on papers from hep-ph (high energy phenomenology) and hep-ex (high energy experiment).

    Why read ParticleBites?

    Reading a technical paper from an unfamiliar subfield is intimidating. It may not be obvious how the techniques used by the researchers really work or what role the new research plays in answering the bigger questions motivating that field, not to mention the obscure jargon! For most people, it takes years for scientific papers to become meaningful.

    Our goal is to solve this problem, one paper at a time. With each brief ParticleBite, you should not only learn about one interesting piece of current work, but also get a peek at the broader picture of research in particle physics.

    Who writes ParticleBites?

    ParticleBites is written and edited by graduate students and postdocs working in high energy physics. Feel free to contact us if you’re interested in applying to write for ParticleBites.

    ParticleBites was founded in 2013 by Flip Tanedo following the Communicating Science (ComSciCon) 2013 workshop.

    Flip Tanedo, UCI Chancellor’s ADVANCE postdoctoral scholar in theoretical physics and, as of July 2016, assistant professor of physics at the University of California, Riverside.

    It is now organized and directed by Flip and Julia Gonski, with ongoing guidance from Nathan Sanders.

     
  • richardmitnick 12:12 pm on October 18, 2016
    Tags: CERN LHC, Physicists Recover From a Summer’s Particle ‘Hangover’

    From NYT: “Physicists Recover From a Summer’s Particle ‘Hangover’” 

    The New York Times

    OCT. 17, 2016
    George Johnson

    James Yang

    As I sat last month in the cafeteria at CERN, the nuclear research center near Geneva that is home to the Large Hadron Collider, I looked out at the expanse of tables and wondered what all those young physicists were talking about.

    Judging from their enthusiasm, they had recovered from the summer’s “diphoton hangover,” the nickname given to the disappointment that followed the coming apart, weeks earlier, of a striking observation — an excess number of photons hinting that some exotic new particle might be lurking behind the scenes, an encore to the Higgs boson.

    CERN CMS Higgs Event

    CERN ATLAS Higgs Event

    Could it be a cousin of the Higgs, or a long sought particle of dark matter? The excitement led to a speculative bubble of papers seeking to explain what turned out to be a nonevent. What had jumped out as a pattern in the data was apparently a mirage, like seeing a pyramid on Mars.

    A victim of its own success, particle physics has come to a turning point. For decades, the theorists have been calling the shots, predicting particles like the Higgs for the experimenters to find, plugging the holes in the cosmic puzzle. Now, with the pieces in place, in the form of the Standard Model, the theorists are hoping to push further, looking again to the experimenters to confront them with new things to theorize about — clues, perhaps, to an even deeper order.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    As plates and utensils clattered around me, I thought of an earlier changing of the guard. In 1962 in this same cafeteria, a 32-year-old theorist named Murray Gell-Mann wrote a prediction on a napkin that helped set the course for the next half-century of research.

    During the early decades of the 1900s, swashbuckling experimenters had run the show. Launching their instruments in balloons and carrying them to mountaintops, they brought back snapshots of cosmic ray particles that made no sense.

    “Who ordered that?” Isidor I. Rabi, a renowned theorist, famously said after he learned of the muon — a fat, short-lived cousin of the electron. Then came interlopers called pions, kaons, lambdas, sigmas and xis.

    Just three particles — electrons, protons and neutrons — seemed like enough to make the world. What were all of these extras? They didn’t fit into any existing scheme.

    With pencil and paper, Dr. Gell-Mann devised one, doing for physics what Mendeleev, with his periodic table of the elements, had done for chemistry. Sorting the particles into clusters of eight and 10, Dr. Gell-Mann came up with a framework he called the Eightfold Way. It had nothing to do with Buddhism. He just liked the name.

    Among the rows and columns of Mendeleev’s table there had been empty spaces — place holders for elements like germanium (a kin to carbon, silicon, tin and lead) that would not be discovered for years. And so it was with the gaps in the Eightfold Way.

    If Dr. Gell-Mann’s math was right — a pretty good bet — there had to be a particle he called the omega minus. He described what to look for on a cafeteria napkin and handed it to a colleague, Nicholas Samios.

    Two years later, Dr. Samios, one of the great experimenters of his generation, discovered the particle at Brookhaven National Laboratory on Long Island. (It had also been predicted by the Israeli physicist Yuval Ne’eman, who was sitting at lunch that day with Dr. Gell-Mann at CERN.)

    From then on, the theorists were ascendant. With its bona fides established, the Eightfold Way led to quarks, gluons and ultimately the Standard Model, a chart with its own holes to fill. One by one, the experimenters obliged until the keystone, the Higgs, was put in place, discovered with the Large Hadron Collider in 2012.

    And that, for the theorists, led to a postpartum depression. Though now complete, the Standard Model (available on a T-shirt at the CERN gift shop) lacks the elegance one might like in a well-made universe.

    There are matter particles and force particles with masses ranging from zero (photons and gluons) or near zero (neutrinos) to the top quark, which is as hefty as an entire atom of tungsten — an element whose name, in Swedish, means “heavy stone.”

    The Higgs explains how particles acquired mass, but not why they were spit forth with such a hodgepodge of different values. Least satisfying of all, the Standard Model leaves out the most salient of forces, gravity, which is described by an entirely different theory.

    Is this how the universe just happens to be? Or is there a grander theory that would demand that things be precisely this way? And so the search goes on.

    There is no reason other than sheer stubbornness for human brains to assume that they are neurologically equipped to understand the finest details of creation. But those vibrant physicists in the CERN cafeteria didn’t seem burdened with existential angst.

    As they lined up to bus their lunch trays, placing the dishes on a conveyor belt, I wondered for a moment about the fate of all those discarded napkins. All it might take is one, marked in pencil or ink with a unique scribble, to set particle physics off on its next adventure.

    Beneath our feet, particles collided silently in the tunnels, striking more sparks to puzzle over.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 8:46 am on October 13, 2016
    Tags: CERN LHC

    From Harvard: “They ponder the universe” 

    Harvard University

    October 12, 2016
    Alvin Powell

    Harvard students join faculty at CERN in Europe to tackle physics’ mysteries


    Access mp4 video here.

    Once you know enough math, Harvard Ph.D. student Tony Tong said, you get to know physics. And physics, he said, is simply amazing.

    “[Physics] is always helpful to answer the question of ‘Why?’ Why the skies are blue, why the universe is so big, basic stuff,” Tong said. “I’m always curious about those questions and the solution is always so beautiful.”

    Tong, it seems, had come to the right place. He was speaking on a warm July day in a small courtyard at the European Organization for Nuclear Research, known as CERN, the scientific campus on the outskirts of Geneva that is the world’s beating heart for high-energy particle physics.

    Home of the world’s most powerful particle accelerator, the Large Hadron Collider (LHC), CERN made world headlines in 2012 when scientists announced the discovery of the Higgs boson, the final undiscovered particle in the theoretical framework of the universe called the Standard Model.

    CERN CMS Higgs Event

    CERN ATLAS Higgs Event

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    The eyes of the scientific world remain focused on CERN today because the LHC is back in operation after a major upgrade that boosted its collision energy to 13 teraelectronvolts (TeV), allowing it to crash beams of protons into each other more powerfully than ever before. Now that the Standard Model is complete, scientists are looking for what’s still mysterious, sometimes called the “new physics” or “physics beyond the Standard Model.” Its form, presumably, would involve a particle born of these high-energy collisions, one that points the way to an even broader understanding of the universe, shedding light on such puzzling areas as dark matter, supersymmetry, dark energy, and even gravity, which has stubbornly refused to fit neatly into our understanding of the universe’s basic forces.

    Standard model of Supersymmetry DESY

    CERN fired up its first accelerator in 1957. Among its milestones are the discovery of the elementary particles called W and Z bosons, the production of antihydrogen — the antimatter version of the common element — and the creation of the World Wide Web to share massive amounts of information among scientists scattered at institutions around the world.

    The CERN campus, which straddles the Switzerland-France border amid breathtaking views of the distant Alps, produces more than just science, however. In ways technological, theoretical, educational, and inspirational, it also produces scientists.


    Access mp4 video here.
    Inside the Antimatter Factory at CERN, the ATRAP antimatter experiment seeks to slow and trap antimatter for comparison with ordinary matter.

    CERN ATRAP

    “Those four years at CERN doing research were a very important part of my training,” said Harvard Physics Department chair Masahiro Morii, who was a research scientist at CERN early in his career. “It taught me things that are a bit difficult to quantify, but changed my perspective very drastically on what it means to be a scientist, what it means to be a high-energy physicist.”

    Year-round, the graduate students and postdoctoral fellows taking their initial career steps work among established scientists, learning and gaining experience difficult to get outside of CERN or a handful of other facilities around the world. Harvard’s Donner Professor of Science John Huth said what becomes apparent is science’s messiness.

    “They see the process as it unfolds, with all its warts. Science is pretty messy when you get into the nitty-gritty,” Huth said. “It’s just an invaluable experience. Even if you become a scientist in a different discipline or you leave science entirely, understanding that intrinsic messiness is really important.”

    In an environment focused on the practice of physics rather than the teaching of it, CERN puts the onus for learning onto the student, Morii said. Students build and test equipment, make sure what’s installed is running properly, and pluck the most meaningful pieces from the resulting data tsunami. They analyze it at all hours of the day and sometimes deep into the night, since there’s always someone awake and logged onto Skype to answer a question or share an insight.

    “People are really passionate, so it doesn’t really feel like you’re up until 11 doing your job. Maybe you’re thinking about something on the train home and you wanted to look into it. It’s not regular hours, but I don’t think that deters anyone,” said Harvard physics Ph.D. student Julia Gonski. “People like the work and it’s fun. Twenty-four hours a day, you can get on Skype and someone you know is on Skype and working.”

    While fellows and graduate students are at CERN year-round, each summer the campus’ population swells as undergraduates eager to take part in the world’s most famous science experiment step off the plane in Geneva.

    At CERN, they become part of a unique city of physicists from around the world, with different educational and cultural backgrounds but the same passions and similar goals.

    “It was this enormous scientific laboratory, with thousands of people working all hours of the night trying to understand the fundamentals of the universe, as corny as that is to say,” said Harvard postdoctoral fellow Alexander Tuna, who first came to CERN as a summer undergrad from Duke University in 2009. “It was really immersive and fun. There’s always someone around with an interesting insight or an answer to a question.”

    The secrets of the universe

    As a visitor approaches CERN, the giant brown orb of the multistory Globe of Science and Innovation comes into view.

    The globe, looking like an enormous particle half-buried in the earth, serves as a CERN welcome center and is far more visually appealing than the main campus across the street. Protected by fences with access limited through guard stations, the campus’ narrow, twisting roadways wind between boxy, industrial-looking buildings numbered instead of named, as if creativity there is reserved for science instead of infrastructure. Even the cafeteria that serves as a central gathering spot is named simply “Restaurant 1.”

    “It was different than I expected,” said Harvard junior Matthew Bledsoe. “I figured a place on the forefront of physics would look fresher and newer, new buildings and stuff. But [they are] 1950s and ’60s-era buildings, so the buildings are pretty old. It looks like a factory.”

    Visitors quickly learn to look past the boxy exteriors to what’s inside. There they find thousands of people working on 18 experiments, seven associated with the LHC and the others with smaller accelerators and a decelerator, which is used for antimatter experiments like those run by Harvard Physics Professor Gerald Gabrielse’s ATRAP collaboration.

    ATRAP, short for “antihydrogen trap,” relies on the LHC’s high energy to make protons collide with a target to create antiprotons. The experiment then cools and slows the antiprotons, and combines them with positrons, the antimatter equivalent of electrons, to create antihydrogen for study and comparison with ordinary hydrogen. Gabrielse, who pioneered antimatter experiments at CERN, said that for students who want to go into high-energy physics, getting a taste of the enormous collaborations that are behind such experiments is key.

    “If you’re interested in making a career in doing those kinds of things [experimental particle physics], it’s extremely important to have this experience,” Gabrielse said.

    The LHC, with its potential to pierce the veil between the known world of the Standard Model and the mysteries that the model does not address, takes center stage. Yet to visitors wandering the halls and sidewalks of CERN, the LHC is nowhere to be seen.

    That’s because the LHC is buried 300 feet underground in a massive tunnel that runs 17 miles from Switzerland into France and back again. Its twin proton beams circle in opposite directions, crossing four times on their journey. At those crossings are four major particle detectors, one of which is ATLAS, a massive machine backed by a worldwide collaboration in which Harvard scientists play lead roles, and which was one of two experiments to detect the Higgs boson.

    CERN/ATLAS detector

    Outside the ATLAS control room at the LHC. Joe Sherman/Harvard Staff Photographer

    “You can think of it (ATLAS) as a really large camera surrounding the collision point where protons collide,” Tuna said.

    ATLAS, which stands for A Toroidal LHC Apparatus, is 180 feet long, 82 feet in diameter, and weighs 7,000 tons. When the proton beams collide, they scatter particles in all directions. ATLAS dutifully records these collisions, producing far more data than current computing technology can store, so filters are employed that screen out more mundane results and keep only the most promising for analysis.
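    Conceptually, such a filter is just a yes/no decision applied to each event before anything is written to storage. A toy illustration (hypothetical event fields and thresholds; the real ATLAS trigger is a multi-level hardware and software system):

```python
# Keep only events that look "promising"; discard the mundane majority.
def passes_trigger(event):
    # Thresholds here are invented for illustration only.
    leading_pt = max((p["pt"] for p in event["particles"]), default=0.0)
    return leading_pt > 100.0 or event["met"] > 150.0

def filtered(events):
    # A generator, so events are screened on the fly rather than stored first.
    return (e for e in events if passes_trigger(e))
```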

    The complex undertaking requires a collaboration that is as massive as the task the researchers have set for themselves. It includes about 3,000 physicists from 175 institutions in 38 countries.

    “This is the center of particle physics right now,” said Harvard Ph.D. student Karri DiPetrillo. “As a scientist, you like asking nature questions and seeing what the answer is. Because we have thousands of people working on a single experiment, you know we’re asking some of the hardest questions in the universe. If it takes thousands of people to find the answer, you know that it’s a good question.”

    For decades, physicists exploring the most basic particles that make up the universe were guided by the Standard Model, which held that everything is made of a limited number of quarks, leptons, and bosons. Over the years, one by one, experimental physicists, including Harvard faculty members, found the particles predicted by the theory: bottom quark, W boson, Z boson, top quark. In 2012, they found the Higgs boson, the last theorized particle.

    When the huge hubbub over the Higgs discovery faded, particle physicists began to assess the field’s new reality. After decades in which theoretical physicists were leading, telling experimental physicists what new particle to look for, the roles are now reversed.

    As reliable as the Standard Model has been, it doesn’t explain everything. And, while theoretical physicists have several ideas of where those mysteries might fit into current knowledge, no evidence exists to tip the scales toward one idea or another.

    Even the Higgs boson still holds secrets, as detecting it didn’t completely explain it. Scientists who continue to probe the Higgs boson hope that the particle may yet reveal clues — inconsistencies from what is expected from the Standard Model — that will outline the broader path forward.

    “There are really two paths. One path is to really push on what we understand about the Higgs boson because that has the strangest properties associated with it and if you push the theory at all the Higgs creates the most problems for it,” Huth said. “The other is the discovery region for something new, like dark matter.”

    The undergraduate summer

    A scientist’s path to CERN usually starts with a passion for physics. Graduate student Nathan Jones credits a family road trip to Colorado during which he read a library book about the universe. Undergrad Bledsoe was wowed by a trip to Fermilab outside Chicago as a high school freshman, while grad student Gonski traces it to the annoyance she felt when she learned her high school chemistry teacher had gotten the science wrong.

    “I remember being in chemistry class in high school when they told us protons and neutrons are indivisible,” said Gonski, who learned otherwise from Stephen Hawking’s “A Brief History of Time.” “I was so offended … I remember being frustrated and asking my parents, ‘Did you guys know?’ At that point I wanted to see how far down we can go [in particle size].”

    After that initial spark, students take classes and often work in a campus laboratory before heading overseas. Some undergraduates go to CERN through the Undergraduate Summer Research Experience program run by the University of Michigan for students across the country. Several Harvard students benefited instead from the Weissman International Internship Program Grant, established in 1994 to provide faraway opportunities for them.


    Access mp4 video here.
    A field of sunflowers stands at the roadside on the approach to CERN.

    Once the funding is set, there’s nothing left but the plane ride and moving into their new digs. Undergraduates live in settings ranging from downtown Geneva to the French countryside. Last summer, three Harvard students — Ben Garber ’17, Gary Putnam ’17, and Bledsoe — rented an apartment over the border in France and commuted to work each day by bike, while Katie Fraser ’18 stayed closer, at CERN’s on-campus hostel.

    Days consisted of morning lectures on topics relevant to their work. After those lectures — and the occasional pickup basketball game at lunchtime — they’d spend afternoons working on a project. Garber worked with Tuna and DiPetrillo on an analysis of Higgs boson decay into two W bosons (the Higgs itself exists for only a tiny fraction of a second). Bledsoe worked on hardware, building and testing a circuit board to be used in the planned 2018 ATLAS upgrade, in the cavernous Building 188 under the tutelage of Theo Alexopoulos from the Technical University of Athens. Wherever they were, whether doing project tasks or having cafeteria conversations, the students were steeped in physics.

    “It was a lot of fun, different than I expected. You learn stuff just by being there, pick up vocabulary in lunchtime conversations,” Fraser said. “It definitely solidified my desire to go into high-energy physics.”

    Melissa Franklin, Mallinckrodt Professor of Physics, said lessons can be found behind almost every door at CERN.

    “I was just amazed, it was unbelievable,” said Franklin, who first visited between her undergraduate and graduate years. “I went to every place I could on site and just knocked on doors and bugged people … You learn so much by osmosis. You have to learn to hang around and ask good questions.”

    Jennifer Roloff, a Harvard physics Ph.D. student, first came to CERN in 2011 as an undergraduate and has been back every summer. Now she helps manage the University of Michigan summer undergraduate program, which gives her a broad view of the student experience.

    “There are definitely some students who do miss home,” Roloff said. “For a lot of them it’s the first time out of the country [or] the first time long-term out of the country. For a lot of them, they realize this is not what they want to do. CERN is not for everyone. There are challenges and difficulties that are not in other physics.”

    That understanding, Gabrielse said, is as important a lesson as finding your intellectual home.

    “Some decide, based on it, to go into the field. Some decide not to,” Gabrielse said. “That guidance too is valuable.”

    Yet being at CERN is not just about science. Students have their weekends free and can explore their new surroundings. Some hike the Alps or the closer Jura Mountains. Others walk the ancient streets of Geneva, visiting its lakefront, restaurants, museums, and other attractions. Putnam loved a park near the University of Geneva where people played on large chessboards with giant pieces. He also soaked up the area’s natural splendor.

    “It’s so beautiful here,” Putnam said. “Sometimes I forget and do the normal thing of looking down and not paying attention, but being able to look up and see the mountains is really special.”

    On call for a particle emergency

    Life at CERN as graduate students is not quite so fancy-free. Visits are limited to summers early in graduate careers as they complete coursework, but once that’s done, they can come and stay to conduct dissertation research.

    To keep the ATLAS collaboration running, graduate students are required to spend a year of research time doing work to benefit the experiment itself, to ensure that high-quality data is collected, for example, or that potentially significant collision events aren’t lost in the data.

    “We have to make sure the data we’re receiving is like you expect it, ready for analysis,” DiPetrillo said. “[It’s taken] probably half of my time in the last year; the other half has been working on Standard Model measurement of the Z boson.”

    A little physicist humor written on a CERN blackboard. Joe Sherman/Harvard Staff Photographer

    Part of DiPetrillo’s duty is assisting in ATLAS’ day-to-day operation, working in the ATLAS control room — with its Mission Control feel, and dominated by a wall-sized screen — and monitoring one of several subsystems that make the whole operation work. Monitoring those subsystems makes ATLAS a 24/7 proposition.

    In addition to working overnight in the control room, DiPetrillo is often on call to back up someone on site. While on call, she has to stay near her phone and within an hour’s drive of the facility in case something goes wrong. If that happens, she troubleshoots the problem with the person in the control room or pushes the problem up to someone more senior.

    “You can think of ATLAS as always taking data so we always need people watching it, making sure ATLAS is working in a way that we want [it] to, that the detector is working … and that data looks the way we expect,” DiPetrillo said.

    When not on call or manning the control room overnight, a graduate student’s life at CERN is full of meetings to share and hear the latest findings, and of hours poring over the latest data looking for the kind of statistical bump that might indicate a new particle — or a new something else.

    The LHC’s recent upgrade has made scientists hopeful that a new particle will be discovered soon. But if not, another upgrade planned for about 2018 may do the trick. While the recent upgrade made the energy of the proton beam higher, the next one will increase luminosity, or the number of protons in the beam, multiplying the number of collisions at any given moment and improving the odds of detecting extremely rare events.
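    In rough terms, the payoff follows from the standard event-count relation (a textbook formula, not quoted from the article): the expected number of events of a given process is its cross section times the integrated luminosity, so raising the luminosity proportionally raises the chance of catching a rare process.

```latex
% sigma: cross section of the process; L: instantaneous luminosity.
N_{\text{events}} = \sigma \int L \, dt
```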

    “We’re all here to … discover stuff, but it’s so difficult. It’s impossible to do as one person,” Gonski said. “I would love to be one person on the 500-person team to discover [supersymmetry’s] stop quark. It’d be great for physics if we all discovered this and for me to say I want to do this — be a tiny fraction of a large group effort.”

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Harvard University campus

    Harvard is the oldest institution of higher education in the United States, established in 1636 by vote of the Great and General Court of the Massachusetts Bay Colony. It was named after the College’s first benefactor, the young minister John Harvard of Charlestown, who upon his death in 1638 left his library and half his estate to the institution. A statue of John Harvard stands today in front of University Hall in Harvard Yard, and is perhaps the University’s best known landmark.

    Harvard University has 12 degree-granting Schools in addition to the Radcliffe Institute for Advanced Study. The University has grown from nine students with a single master to an enrollment of more than 20,000 degree candidates including undergraduate, graduate, and professional students. There are more than 360,000 living alumni in the U.S. and over 190 other countries.

     
  • richardmitnick 7:56 pm on October 6, 2016
    Tags: CERN LHC, Superconducting Super Collider

    From Physics Today: “A bridge too far: The demise of the Superconducting Super Collider” A very important article. 

    Physics Today

    October 6, 2016
    Michael Riordan

    The largest basic scientific project ever attempted, the supercollider proved to be beyond the management capacity of the US high-energy physics community. A smaller proton collider would have been substantially more achievable.

    Map of the proposed Superconducting Super Collider.

    When the US Congress terminated the Superconducting Super Collider (SSC) in October 1993 after about $2 billion had been spent on the project, it ended more than four decades of American leadership in high-energy physics. To be sure, US hegemony in the discipline had been deteriorating for more than a decade, but the SSC cancellation was the ultimate blow that put Europe unquestionably in the driver’s seat and opened the door to the discovery of the Higgs boson at CERN (see Physics Today, September 2012, page 12). The causes and consequences of the SSC’s collapse, a watershed event in the history of science, have been discussed and debated ever since it happened.

    At least a dozen good reasons have been suggested for the demise of the SSC. Primary among them are the project’s continuing cost overruns, its lack of significant foreign contributions, and the end of the Cold War. But recent research and documents that have come to light have led me to an important new conclusion: The project was just too large and too expensive to have been pursued primarily by a single nation, however wealthy and powerful. Wolfgang “Pief” Panofsky, founding director of SLAC, voiced that possibility during a private conversation in the months after the project’s demise; he suggested that perhaps the SSC project was “a bridge too far” for US high-energy physics. That phrase became lodged firmly in my mind throughout the many years I was researching its history.

    View along the Superconducting Super Collider main-ring tunnel, in early 1993. (Courtesy of Fermilab Archives.)

    Some physicists will counter that the SSC was in fact being pursued as an international project, with the US taking the lead in anticipation that other nations would follow; it had done so on large physics projects in the past and was doing so with the much costlier International Space Station. But that argument ignores the inconvenient truth that the gargantuan project was launched by the Reagan administration as a deliberate attempt to reestablish US leadership in a scientific discipline the nation had long dominated. If other nations were to become involved, they would have had to do so as junior partners in a multibillion-dollar enterprise led by US physicists.

    That fateful decision, made by the leader of the world’s most powerful government, established the founding rhetoric for the SSC project, which proved difficult to abandon when it came time to enlist foreign partners.

    The SSC and the LHC

    In contrast, CERN followed a genuinely international approach in the design and construction of its successful Large Hadron Collider (LHC), albeit at a much more leisurely pace than had been the case for the SSC.

    Serious design efforts begun during the late 1980s and early 1990s ramped up after the SSC’s termination. Although the LHC project also experienced trying growth problems and cost overruns—its cost increased from an estimated 2.8 billion Swiss francs ($2.3 billion at the time) in 1996 to more than 4.3 billion Swiss francs in 2009—it managed to survive and become the machine that allowed the Higgs-boson discovery using only about half of its originally designed 14 TeV energy.

    CERN CMS Higgs Event

    CERN ATLAS Higgs Event

    (The SSC, by comparison, was designed for 40 TeV collision energy.) When labor costs and in-kind contributions from participating nations are included, the total LHC price tag approached $10 billion, a figure often given in the press. Having faced problems similar to, though not as severe as, what the SSC project experienced, the LHC’s completion raises an obvious question: Why did CERN and its partner nations succeed where the US had failed?

    From the SSC’s early days, many scientists thought it should have been sited at or near Fermilab to take advantage of the existing infrastructure, both physical and human. University of Chicago physicist and Nobel laureate James Cronin explicitly stated that opinion in a letter he circulated to his fellow high-energy physicists in August 1988. CERN has followed that approach for decades, building one machine after another as extensions of its existing facilities and reusing parts of the older machines in new projects, thereby reducing costs. Perhaps as important, CERN had also gathered and developed some of the world’s most experienced accelerator physicists and engineers, who work together well. During the late 1980s, Fermilab had equally adept machine builders—plus substantial physical infrastructure—who could have turned to other productive endeavors when the inevitable funding shortfalls occurred during the annual congressional appropriations process.

    Troublesome clashes occurred at the SSC between the high-energy physicists and engineers who had been recruited largely from the shrinking US military–industrial complex as the Cold War wound down during the late 1980s and early 1990s. For example, the methods by which SSC general manager Edward Siskin and magnet division director Thomas Bush managed large projects and developed sophisticated components differed greatly from those customarily employed by high-energy physicists. A particular bone of contention was the project’s initial lack of a cost-and-schedule control system, which by then had become mandatory practice for managing large military-construction and development projects overseen by the Department of Defense. Such clashes would probably not have erupted in the already well-integrated Fermilab high-energy physicist culture, nor would the disagreements have been as severe.

    Those pro-Fermilab arguments, however, ignore the grim realities of the American political process. A lucrative new project that was to cost more than $5 billion and promised more than 2000 high-tech jobs could not be sole-sourced to an existing US laboratory, no matter how powerful its state congressional delegation. As politically astute leaders of the Department of Energy recognized, the SSC project had to be offered up to all states able to provide a suitable site, with the decision based (at least publicly) on objective, rational criteria. Given the political climate of the mid 1980s, a smaller project costing less than $1 billion and billed as an upgrade of existing facilities might have been sole-sourced to Fermilab, but not one as prominent and costly as the SSC. It had to be placed on the US auction block, and Texas made the best bid according to the official DOE criteria.

    Unlike the SSC, the LHC project benefited from the project management skills of a single physicist, Lyndon Evans, who came to the task with decades of experience on proton colliders. Despite the facility’s major problems and cost overruns, Evans enjoyed the strong support of the CERN management and a deeply experienced cadre of physicists and engineers. On the LHC project, engineers reported ultimately to physicists, who as the eventual users of the machine were best able to make the required tradeoffs when events did not transpire as originally planned. The project encountered daunting difficulties and major delays, including the September 2008 quench of dozens of superconducting magnets. But the core management team led by Evans worked through those problems, shared a common technological culture, and understood and supported the project’s principal scientific goals.

    Similar observations cannot be made regarding the military–industrial engineers who came to dominate the SSC lab’s collider construction. Until 1992 a succession of acting or ineffectual project managers could not come to grips with the demands of such a complex, enormous project that involved making countless decisions weekly. Secretary of energy James D. Watkins deliberately had Siskin inserted into the SSC management structure in late 1990 in an effort to wrest control of the project from the high-energy physicists. After SLAC physicist John Rees stepped in as the SSC project manager in 1992, he and Siskin began working together effectively and finally got a computerized cost-and-schedule control system up and running—and thus the project under better control. But it proved to be too late, as the SSC had already gained a hard-to-shake reputation in Congress as being mismanaged and out of control.

    CERN also enjoys an enviable internal structure, overseen by its governing council, that largely insulates its leaders and scientists from the inevitable political infighting and machinations of member nations. Unlike in the US, the director general or project manager could not be subpoenaed to appear before a parliamentary investigations subcommittee or be required to testify under oath about its management lapses or cost overruns—as SSC director Roy Schwitters had to do before Congress. Nor did the LHC project face annual congressional appropriations battles and threats of termination, as did major US projects like the SSC and the space station. Serious problems that arose with the LHC—for example, a large cost overrun in 2001—were addressed in the council, which represents the relevant ministries of its member nations and generally operates by consensus, especially on major laboratory initiatives. That supple governing structure helps keep control of a project within the hands of the scientists involved and hinders government officials from intervening directly.

    Because the council must also address the wider interests of individual European ministries, CERN leaders have to be sensitive to the pressures that the annual budget, new projects, and cost overruns can exert on other worthy science. In that manner, European scientists in other disciplines have a valuable voice in CERN governing circles. The LHC project consequently had to be tailored to address such concerns before the council would grant it final approval. In the US, the only mechanism available was for disgruntled scientists to complain openly, which Philip Anderson of Princeton University, Theodore Geballe of Stanford University, Rustum Roy of the Pennsylvania State University, and others did in prominent guest editorials or in congressional hearings when SSC costs got out of hand between 1989 and 1991. The resulting polarization of the US physics community helped undermine what had been fairly broad support for the SSC project in the House of Representatives, which in 1989 had voted 331–92 to proceed with construction.

    Because of financial pressures, CERN had to effectively internationalize the LHC project—obtaining monetary and material commitments from such nonmember nations as Canada, China, India, Japan, Russia and the US—before the council would give approval to go ahead with it. When that approval finally came in 1996, the LHC was a truly international scientific project with firm financial backing from more than 20 nations. Those contributions enabled Evans and his colleagues to proceed with the design of a collider able to reach the full 14 TeV collision energy as originally planned.

    Scale matters

    In hindsight, the LHC was (somewhat fortuitously) more appropriately sized to its primary scientific goal: the discovery of the Higgs boson. The possibility that this elusive quarry could turn up at a mass as low as 125 GeV was not widely appreciated until the late 1980s, when theories involving supersymmetry began to suggest the possibility of such a light Higgs boson emerging from collisions. But by then the SSC die had been cast in favor of a gargantuan 40 TeV collider, 87 km in circumference, that would be able to uncover the roots of spontaneous symmetry breaking even if the long-anticipated phenomenon required the protons’ constituent quarks and gluons to collide with energies as high as 2 TeV. When it became apparent in late 1989 that roughly $2 billion more would be needed to reduce design risks that could make it difficult for the SSC to attain its intended collision rate, Panofsky argued that the project should be down-scoped to 35 TeV to save hundreds of millions of dollars. But nearly everyone else countered that the full 40 TeV was required to make sure users could discover the Higgs boson—or whatever else was responsible for spontaneous symmetry breaking and elementary-particle masses.

    3
    Schematic of the Superconducting Super Collider, depicting its main 87 km ring—designed to circulate and collide twin proton beams, each at energies up to 20 TeV—the injector accelerators, and experimental halls, where the protons were to collide. That ring circumference is more than three times the 27 km circumference of CERN’s Large Hadron Collider (orange). The footprints of yet smaller particle colliders at Fermilab (purple) and SLAC (green) are also shown for comparison.

    A US High-Energy Physics Advisory Panel (HEPAP) subpanel, chaired by SLAC deputy director Sidney Drell, unanimously endorsed that fateful decision in 1990. The US high-energy physics community had thus committed itself to an enormous project that became increasingly difficult to sustain politically amid the worsening fiscal climate of the early 1990s. With the end of the Cold War and subsequent absence of a hoped-for peace dividend during a stubborn recession, the US entered a period of fiscal austerity not unlike what is now occurring in many developed Western nations. In that constrained environment, a poorly understood basic-science project experiencing large, continuing cost overruns and lacking major foreign contributions presented an easy political target for congressional budget cutters.

    A 20 TeV proton collider—or perhaps just a billion-dollar extension of existing facilities such as the 4–5 TeV Dedicated Collider proposed by Fermilab in 1983—would likely have survived the budget axe and discovered the light Higgs boson long ago. Indeed, another option on the table during the 1983 meetings of a HEPAP subpanel chaired by Stanford physicist Stanley Wojcicki was for Brookhaven National Laboratory to continue construction of its Isabelle collider while Fermilab began the design work on that intermediate-energy proton–antiproton collider, whose costs were then projected at about $600 million.

    That more conservative, gradual approach would have maintained the high-energy physics research productivity of the DOE laboratories for at least another decade. And such smaller projects would certainly have been more defensible during the economic contractions of the early 1990s, for they aligned better with the high-energy physics community’s diminishing political influence in Washington. Their construction would also have been far easier for physicists to manage and control by themselves without having to involve military–industrial engineers.

    The Wojcicki subpanel had originally recommended that the US design a 20–40 TeV collider, but that was before European physicists led by CERN decided in 1984 to focus their long-range plans on a 14 TeV proton collider that they could eventually achieve by adding superconducting magnets to the Large Electron–Positron Collider (LEP) then under construction. (Actually, they considered 18 TeV achievable when they made this decision.) Lowering the SSC energy as Panofsky suggested thus risked Congress raising the awkward question that had already been voiced by SSC opponents, “Why don’t US physicists just join the LHC project and save US taxpayers billions of dollars?” Although justified on purely physics grounds, the 1990 decision to keep the original SSC energy clearly had a significant political dimension, too.

    The US high-energy physics community therefore elected to “bet the company” on an extremely ambitious 40 TeV collider, so large that it ultimately had to be sited at a new laboratory in the American Southwest, as was originally envisioned in 1982. Such a choice, however, meant abandoning the three-laboratory DOE system that had worked well for nearly two decades and had fostered US leadership in high-energy physics. (That was Cronin’s primary concern when he urged his fellow physicists and DOE to site the SSC at Fermilab.) But perceived European threats to US hegemony and Reagan administration encouragement tipped the balance toward making the SSC a national project and away from it becoming the truly international “world laboratory” that others had long been advocating.

    Infrastructure problems

    In retrospect, the SSC leadership faced two daunting tasks in establishing a new high-energy physics laboratory in Waxahachie, Texas:

    ► Building the physical infrastructure for a laboratory that would cost billions of US taxpayer dollars and was certain to be a highly visible, contentious project.

    ► Organizing the human infrastructure needed to ensure that the SSC became a world-class laboratory where scientists could do breakthrough high-energy physics research.

    Addressing those tasks meant having to draw resources away from other worthy programs and projects that competed with the SSC during a period of tight annual budgets. Reagan administration officials had insisted that the project would be funded by new money, but that was only a convenient fiction. Congress, not the president, holds the federal purse strings, so the SSC always had to compete against other powerful interests—especially energy and water projects—for its annual funding. And it usually came up short, which further delayed the project and increased its costs.

    Schwitters and other managers attempted to attract top-notch physicists to staff the laboratory, but after 1988 many of its original, primary advocates in the SSC Central Design Group (CDG) returned to their tenured positions in universities and national labs. For example, CDG director Maury Tigner, who returned to Cornell University, might have been the best choice for the original project manager. (Second-tier CDG managers did go to Texas, however, as did many younger, untenured physicists.) Despite the promise and likely prestige of building a world-class scientific laboratory, the Dallas–Fort Worth area was viewed as an intellectual backwater by many older, accomplished high-energy physicists. They might have come to work there on a temporary or consulting basis, as did Rees originally, but making a permanent, full-time commitment and bringing their spouses and families with them proved a difficult choice for many.

    Achieving the first daunting task in a cost-effective way thus required bringing in an alien, military–industrial culture that made realizing the second task much more difficult. Teaming with EG&G and Sverdrup Corporations helped the SSC laboratory to tap the growing surplus of military–industrial engineers. It was crucial to get capable engineers working on the project quickly so that all the detailed design and construction work could occur on schedule and costs could be controlled. But the presence of military–industrial engineers at high levels in the SSC organization served as an added deterrent to established physicists who might otherwise have moved to Texas to help organize and build the laboratory.

    Estimates of the infrastructure costs that could have been saved by siting the SSC adjacent to Fermilab range from $495 million to $3.28 billion. The official DOE figures came in at the lower end, from $495 million to $1.03 billion, but they ignored the value of the human infrastructure then available at Fermilab. In hindsight, the costs of establishing such infrastructure anew at a green-field site were not small. In Tunnel Visions, my coauthors and I estimate that the total added infrastructure costs—physical plus human—of building the SSC in Texas would have been about $2 billion.

    Unlike historians gazing into the past, however, physicists do not enjoy the benefit of hindsight when planning a new machine. Guided in part by the dominant theoretical paradigm, they work with a cloudy crystal ball through which they can only guess at phenomena likely to occur in a new energy range, and they must plan accordingly. And few can foresee what may transpire in the economic or political realms that could jeopardize an enormous project that requires about a decade to complete and will cost billions of dollars, euros, or Swiss francs—or, relevant today, a trillion yen. That climate of uncertainty thus argues for erring on the side of fiscal conservatism and for trying to reduce expenses by building a new machine at or near an existing laboratory. Such a gradual, incremental approach has been followed successfully at CERN for six decades now, and to a lesser extent at other high-energy physics labs.

    But US physicists, perhaps enticed by Reagan administration promises, elected to stray from that well-worn path in the case of the SSC. It took a giant leap of faith to imagine that they could construct an enormous new collider at a green-field site where everything had to be assembled from scratch—including the SSC management team—and defend the project before Congress in times of increasing fiscal austerity. A more modest project sited at Fermilab would likely have weathered less opposition and still be operating today.

    In the multibillion-dollar realm it had entered, the US high-energy physics community had to form uneasy alliances with powerful players in Washington and across the nation. And those alliances involved uncomfortable compromises that led, directly or indirectly, to the SSC project’s demise. That community of a few thousand physicists had a small and diminishing supply of what Beltway insiders recognize as “political capital.” It could not by itself lay claim to more than 5 billion taxpayer dollars when many other pressing demands were being made on the federal purse. Thus for the SSC to move forward as a principally national project meant that those physicists had to give up substantial control to powerful partners with their own competing agendas. The Texans’ yearning for high-tech jobs, for example, helped congressional opponents paint the SSC as a pork-barrel project in the public mind. In the process, the high-energy physics community effectively lost control of its most important project.

    A personal perspective

    Part of the problem driving up the SSC costs was the project’s founding rhetoric: the intention to leapfrog European advances and reassert US leadership in high-energy physics. The Reagan administration in particular was promoting US competitiveness over international cooperation; treating other nations as equal partners would not have gained the administration’s support. And a smaller, say 20 TeV, proton collider would not have sufficed either, for that was much too close in energy to what CERN could eventually achieve in the 27 km LEP tunnel then under construction. The SSC therefore had to shoot for 40 TeV, which was presented as a scientific necessity but was in fact mainly a political choice. That energy was more than 20 times the energy of the Fermilab Tevatron, and the SSC proved to be nearly 20 times as expensive. And along with its onerous price tag came other, unanticipated complications—managerial as well as political—that US physicists were ill-equipped to confront. As Panofsky suggested, the SSC was indeed “a bridge too far”—a phrase he probably borrowed from the title of Cornelius Ryan’s 1974 book about a disastrous Allied campaign to capture the Arnhem Bridge over the Rhine River during World War II.

    I became convinced of that interpretation only in April 2014, when previously suppressed documents surfaced at the William J. Clinton Presidential Library. The documents were memos to Clinton’s chief of staff regarding a draft letter being circulated among top administration officials in early 1993 by new secretary of energy Hazel O’Leary. In the letter, Clinton was to request a billion-dollar SSC contribution from Japanese prime minister Kiichi Miyazawa. Such a contribution would have helped tremendously to reassure House members that major foreign support was indeed forthcoming and perhaps would have kept the project alive. But the memos, one from science adviser John Gibbons and the other from assistant to the president John Podesta and staff secretary Todd Stern, recommended against the president sending such a letter. The latter memo was particularly adamant:

    NSC [the National Security Council] agrees that we should convey to the Japanese our firm backing for the SSC, but still objects strongly [emphasis in the original] to sending a letter to Miyazawa. Such a letter could be seen as suggesting that we attach greater importance to Japanese participation in the SSC than we do to Japanese efforts on other fronts, such as aid to Russia.

    The document underscored for me what insurmountable competition the SSC faced in securing the required billions of dollars in federal and foreign funding. Despite their political influence reaching back to the years after World War II, high-energy physicists were not accustomed to playing in the major leagues of US politics. No such letter was ever sent.

    In the final analysis, the Cold War model of doing Big Science projects, with the US taking the lead unilaterally and expecting other Western nations to follow in its footsteps, was no longer appropriate. By the 1980s the global scientific community had begun an epochal transition into a multipolar world in which other nations expect to be treated as equal partners in such major scientific endeavors—especially considering the large financial contributions involved. As US high-energy physicists have hopefully learned from the 1993 termination of the SSC, it should have been promoted from day one as a genuinely international world-laboratory project.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    “Our mission

    The mission of Physics Today is to be a unifying influence for the diverse areas of physics and the physics-related sciences.

    It does that in three ways:

    • by providing authoritative, engaging coverage of physical science research and its applications without regard to disciplinary boundaries;
    • by providing authoritative, engaging coverage of the often complex interactions of the physical sciences with each other and with other spheres of human endeavor; and
    • by providing a forum for the exchange of ideas within the scientific community.”

     
  • richardmitnick 2:55 pm on September 29, 2016 Permalink | Reply
    Tags: CERN LHC, LHC smashes old collision records

    From Symmetry: “LHC smashes old collision records” 

    Symmetry

    09/29/16
    Sarah Charley

    Tai Sakuma, CMS experiment

    The Large Hadron Collider is now producing about a billion proton-proton collisions per second.


    The LHC is colliding protons at a faster rate than ever before, approximately 1 billion times per second. Those collisions are adding up: This year alone the LHC has produced roughly the same number of collisions as it did during all of the previous years of operation together.

    This faster collision rate enables scientists to learn more about rare processes and particles such as Higgs bosons, which the LHC produces about once every billion collisions.

    CERN CMS Higgs Event

    CERN ATLAS Higgs Event

    “Every time the protons collide, it’s like the spin of a roulette wheel with several billion possible outcomes,” says Jim Olsen, a professor of physics at Princeton University working on the CMS experiment.

    CERN/CMS Detector

    “From all these possible outcomes, only a few will teach us something new about the subatomic world. A high rate of collisions per second gives us a much better chance of seeing something rare or unexpected.”

    Since April, the LHC has produced roughly 2.4 quadrillion particle collisions in both the ATLAS and CMS experiments.
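
    To get a feel for those numbers, here is a rough back-of-the-envelope check in Python. It is a sketch that simply takes the figures quoted in this article at face value; it is not an official ATLAS or CMS calculation.

        # Rough arithmetic using only the figures quoted in this article.
        collisions_2016 = 2.4e15      # proton-proton collisions since April
        higgs_per_collision = 1e-9    # "about once every billion collisions"

        higgs_produced = collisions_2016 * higgs_per_collision
        print(f"Rough Higgs count in the 2016 data: {higgs_produced:.1e}")
        # -> ~2.4e+06, a few million Higgs bosons buried in the dataset

        # The article notes below that this is about 1 percent of the
        # planned LHC program, implying a lifetime total of roughly:
        print(f"Implied lifetime total: {collisions_2016 / 0.01:.1e} collisions")

    Only a small fraction of those Higgs bosons decay in channels clean enough to identify, which is why the raw collision rate matters so much.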

    CERN/ATLAS detector

    The unprecedented performance this year is the result of both the incremental increases in collision rate and the sheer amount of time the LHC is up and running.

    “This year the LHC is stable and reliable,” says Jorg Wenninger, the head of LHC operations. “It is working like clockwork. We don’t have much downtime.”

    Scientists predicted that the LHC would produce collisions around 30 percent of the time during its operation period. They expected to use the rest of the time for maintenance, rebooting, refilling and ramping the proton beams up to their collision energy. However, these numbers have flipped; the LHC is actually colliding protons 70 percent of the time.

    “The LHC is like a juggernaut,” says Paul Laycock, a physicist from the University of Liverpool working on the ATLAS experiment. “We took around a factor of 10 more data compared to last year, and in total we already have more data in Run 2 than we took in the whole of Run 1. Of course the biggest difference between Run 1 and Run 2 is that the data is at twice the energy now, and that’s really important for our physics program.”

    This unexpected performance comes after a slow start-up in 2015, when scientists and engineers still needed to learn how to operate the machine at that higher energy.

    “With more energy, the machine is much more sensitive,” says Wenninger. “We decided not to push it too much in 2015 so that we could learn about the machine and how to operate at 13 [trillion electronvolts]. Last year we had good performance and no real show-stoppers, so now we are focusing on pushing up the luminosity.”

    The increase in collision rate doesn’t come without its difficulties for the experiments.

    “The number of hard drives that we buy and store the data on is determined years before we take the data, and it’s based on the projected LHC uptime and luminosity,” Olsen says. “Because the LHC is outperforming all estimates and even the best rosy scenarios, we started to run out of disk space. We had to quickly consolidate the old simulations and data to make room for the new collisions.”

    The increased collision rate also increased the importance of vigilant detector monitoring and adjustments of experimental parameters in real time. All the LHC experiments are planning to update and upgrade their experimental infrastructure in winter 2017.

    “Even though we were kept very busy by the deluge of data, we still managed to improve on the quality of that data,” says Laycock. “I think the challenges that arose thanks to the fantastic performance of the LHC really brought the best out of ATLAS, and we’re already looking forward to next year.”

    Astonishingly, 2.4 quadrillion collisions represent just 1 percent of the total amount planned during the lifetime of the LHC research program. The LHC is scheduled to run through 2037 and will undergo several rounds of upgrades to further increase the collision rate.

    “Do we know what we will find? Absolutely not,” Olsen says. “What we do know is that we have a scientific instrument that is unprecedented in human history, and if new particles are produced at the LHC, we will find them.”

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 10:57 am on September 23, 2016 Permalink | Reply
    Tags: CERN LHC

    From Yale via Vox: “Why physicists really, really want to find a new subatomic particle” 

    Yale University


    Vox

    Sep 21, 2016
    Brian Resnick

    The latest search for a new particle has fizzled. Scientists are excited, and a bit scared.

    Particle physicists are begging nature to reveal the secrets of the universe. The universe isn’t talking back. FABRICE COFFRINI/AFP/Getty Images

    Particle physicists are rather philosophical when describing their work.

    “Whatever we find out, that is what nature chose,” Kyle Cranmer, a physics professor at New York University, tells me. It’s a good attitude to have when your field yields great disappointments.

    For months, evidence was mounting that the Large Hadron Collider, the biggest and most powerful particle accelerator in the world, had found something extraordinary: a new subatomic particle, which would be a discovery surpassing even the LHC’s discovery of the Higgs boson in 2012, and perhaps the most significant advance since Einstein’s theory of relativity.


    And yet, nature had other plans.

    In August, the European Organization for Nuclear Research (CERN) reported that the evidence for the new particle had run thin. What looked like a promising “bump” in the data, indicating the presence of a particle with a unique mass, was just noise.

    But to Cranmer — who has analyzed LHC data in his work — the news did not equate to failure. “You have to keep that in mind,” he says. “Because it can feel that way. It wasn’t there to be discovered. It’s like being mad that someone didn’t find an island when they were sailing in the middle of the ocean.”

    What’s more, the LHC’s journey is far from over. The machine is expected to run for another 20 or so years. There will be more islands to look for.

    “We’re either going to discover a bunch of new particles or we will not,” Cranmer says. “If we find new particles, we can study them, and then we have a foothold to make progress. And if we don’t, then [we’ll be] staring at a flat wall in front figuring out how to climb it.”

    This is a dramatic moment, one that could provoke “a crisis at the edge of physics,” according to a New York Times op-ed. Because if the superlative LHC can’t find answers, it will cast doubt on whether answers can be found experimentally at all.

    From here, there are two broad scenarios that could play out, both of which will vastly increase our understanding of nature. One scenario will open up physics to a new world of understanding about the universe; the other could end particle physics as we know it.

    The physicists themselves can’t control the outcome. They’re waiting for nature to tell them the answers.

    Why do we care about new subatomic particles anyway?

    A graphic showing particle-collision tracks at the Compact Muon Solenoid (CMS) experiment, pictured in a slow-shutter-speed exposure at the Universe of Particles exhibition of the European Organization for Nuclear Research (CERN) on December 13, 2011, in Geneva. FABRICE COFFRINI/AFP/Getty Images

    The LHC works by smashing together protons at incredibly high energies. These collisions can produce any of the particles that have existed in the universe from the Big Bang onward.

    When the Higgs boson was confirmed in 2012, it was a cause for celebration and unease.

    CERN CMS Higgs Event

    CERN ATLAS Higgs Event

    The Higgs was the last piece of a puzzle called the standard model, which is a theory that connects all the known components of nature (except gravity) together in a balanced, mathematical equation.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    The Higgs was the final piece that had been theorized to exist but never seen.

    After the Higgs discovery, the scientists at the LHC turned their hopes in a new direction. They hoped the accelerator could begin to find particles that had never been theorized nor ever seen. It was like going from a treasure hunt with a map to charting a new ocean.

    They want to find these new subatomic particles because even though the standard model is now complete, it still can’t answer a lot of lingering questions about the universe. Let’s go through the two scenarios step by step.

    Scenario 1: There are more subatomic particles! Exciting!

    If the LHC finds new subatomic particles, it would lend evidence to a theory known as supersymmetry. Supersymmetry posits that all the particles in the standard model must have a shadow “super partner” with a slightly different spin.

    Standard model of Supersymmetry DESY

    Scientists have never seen one of these supersymmetrical particles, but they’re keen to. Supersymmetry could neatly solve some of the biggest problems vexing physicists right now.

    Such as:

    1) No one knows what dark matter is

    One of these particles could be what scientists call “dark matter,” which is theorized to make up 27 percent of the universe. But we’ve never seen dark matter, and that leaves a huge gaping hole in our understanding of how the universe formed and exists today.

    “It could be that one particle is responsible for dark matter,” Cranmer explains. Simple enough.

    2) The Higgs boson is much too light

    The Higgs discovery was an incredible triumph, but it also contained a mystery to solve. The boson — at 126 GeV (giga electron volts) — was much lighter than the standard model and the math of quantum mechanics suggest it should be.

    Why is that a problem? Because it’s a wrinkle to be ironed out in our understanding of the universe. It suggests the standard model can’t explain everything. And physicists want to know everything.

    “Either nature is sort of ugly, which is entirely conceivable, and we just have to live with the fact that the Higgs boson mass is light and we don’t know why,” Ray Brock, a Michigan State University physicist who has worked on the LHC, says, “or nature is trying to tell us something.”

    It could be that a yet-to-be-discovered subatomic particle interacts with the Higgs, making it lighter than it ought to be.

    3) The standard model doesn’t unify the forces of the universe

    There are four major forces that make the universe tick: the strong nuclear force (which holds atoms together), the weak nuclear force (what makes Geiger counters tick), electromagnetism (you’re using it right now, reading this article on an electronic screen), and gravity (don’t look down.)

    Scientists aren’t content with the four forces. For decades, they have been trying to prove that the universe works more elegantly: that, deep down, all these forces are just manifestations of one great force that permeates the universe.

    Physicists call this unification, and the standard model doesn’t provide it.

    “If we find supersymmetry at the LHC, it is a huge boost to the dream that three of the fundamental forces we have [all of them except gravity] are all going to unify,” Cranmer says.

    4) Supersymmetry would lead to more particle hunting

    If scientists find one new particle, supersymmetry means they’ll find many more. That’s exciting. “It’s not going to be just one new particle that we discovered, and yay!” Cranmer says. “We’re going to be finding new forces, or learn something really deep about the nature of space and time. Whatever it is, it’s going to be huge.”

    Scenario 2: There are no new subatomic particles. Less exciting! But still interesting. And troubling.

    The LHC is going to run for around another 20 years, at least. There’s a lot of time left to find new particles, even if there is no supersymmetry. “This is what always blows my mind,” Brock says. “We’ve only taken about 5 percent of the total planned data that the LHC is going to deliver until the middle 2020s.”

    But the accelerator also might not find anything. If the new particles aren’t there to find, the LHC won’t find them. (Hence, the notion that physicists are looking for “what nature chose.”)

    But again, this doesn’t represent a failure. It will actually yield new insights about the universe.

    “It would be a profound discovery to find that we’re not going to see anything else,” Cranmer says.

    1) For one, it would suggest that supersymmetry isn’t the answer

    If supersymmetry is dead, then theoretical physicists will have to go back to the drawing board to figure out how to solve the mysteries left open by the standard model.

    “If we’re all coming up empty, we would have to question our fundamental assumptions,” Sarah Demers, a Yale physicist, tells me. “Which is something we’re trying to do all the time, but that would really force us.”

    2) The answers exist, but they exist in a different universe

    If the LHC can’t find answers to questions like “why is the Higgs so light?” scientists might grow to accept a more out-of-the-box idea: the multiverse.

    That’s the idea where there are tons of universes all existing parallel to one another. It could be that “in most of [the universes], the Higgs boson is really heavy, and in only in very unusual universes [like our own] is the Higgs boson so light that life can form,” Cranmer says.

    Basically: On the scale of our single universe, it might not make sense for the Higgs to be light. But if you put it together with all the other possible universes, the math might check out.

    There’s a problem with this theory, however: If heavier Higgs bosons exist in different universes, there’s no possible way to observe them. They’re in different universes!

    “Which is why a lot of people hate it, because they consider it to be anti-science,” Cranmer says. “It might be impossible to test.”

    3) The new subatomic particles do exist, but the LHC isn’t powerful enough to find them

    In 20 years, if the LHC doesn’t find any new particles, there might be a simple reason: These particles are too heavy for the LHC to detect.

    This is basic E=mc2 Einstein: The more energy in the particle accelerator, the heavier the particles it can create. The LHC is the most powerful particle accelerator in the history of man, but even it has its limits.
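
    To make that limit concrete, a minimal sketch follows; the only physics in it is the kinematic ceiling set by the collision energy, and the numbers are illustrative. In practice the reach is lower still, because only a fraction of each proton's energy participates in any given collision.

        # Kinematic ceiling: a collision of energy E cannot create a single
        # particle heavier than E (masses quoted in GeV/c^2, illustrative).
        collision_energy_gev = 13_000.0   # LHC Run 2: 13 TeV
        higgs_mass_gev = 125.0

        print(f"Heaviest particle kinematics allows: {collision_energy_gev:.0f} GeV/c^2")
        print(f"That is roughly {collision_energy_gev / higgs_mass_gev:.0f} Higgs masses")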

    So what will physicists do? Build an even bigger, even smashier particle collider? That’s an option. There are currently preliminary plans in China for a collider double the size of the LHC.

    Building a bigger collider might be a harder sell for international funding agencies. The LHC was funded in part because of the quest to confirm the Higgs. Will governments really spend billions on a machine that may not yield epic insights?

    “Maybe we were blessed as a field that we always had a target or two to shoot for. We don’t have that anymore,” says Markus Klute, an MIT physicist stationed at CERN in Europe. “It’s easier to explain to the funding agencies specifically that there’s a specific endpoint.”

    The LHC will keep running for the foreseeable future. But it could prove a harder task to make the case to build a new collider.

    Either way, these are exciting times for physics

    Dean Mouhtaropoulos/Getty Images

    “I think we have had a tendency to be prematurely depressed,” Demers says. “It’s never a step backward to learn something new,” even if the news is negative. “Ruling out ideas teaches us an incredible amount.”

    And she says that even if the LHC can never find another particle, it can still produce meaningful insights. Already, her colleagues are using it to help determine why there’s so much more matter than antimatter in the universe. And she reminds me the LHC can still teach us more about the mysterious Higgs: we will be able to measure its properties with ever greater precision.

    Brock, the MSU physicist, notes that since the 1960s, physicists have been chasing the standard model. Now they don’t quite know what they’re chasing. But they know it will change the world.

    “I can’t honestly say in all those 40 years, I’ve been exploring,” Brock says. “I’ve been testing the standard model. The Higgs boson was the last missing piece. Now, we have to explore.”

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    Yale University Campus

    Yale University comprises three major academic components: Yale College (the undergraduate program), the Graduate School of Arts and Sciences, and the professional schools. In addition, Yale encompasses a wide array of centers and programs, libraries, museums, and administrative support offices. Approximately 11,250 students attend Yale.

     
  • richardmitnick 9:01 am on September 23, 2016 Permalink | Reply
    Tags: CERN LHC

    From AARNet: “New record set for elephant data flow over AARNet” 

    AARNet

    A new record for the ARC Centre of Excellence in Particle Physics at the University of Melbourne: 53 terabytes transferred in 24 hours, sustained at nearly 5 gigabits per second.

    “I think we’ve made a new record for us. 53 terabytes transferred in 24hrs, at 100% efficiency,” reported Sean Crosby to AARNet’s eResearch team. Crosby is a research computing scientist working at the ARC Centre of Excellence for Particle Physics at the Terascale (CoEPP) at the University of Melbourne. He is referring to a recent elephant data flow over the AARNet network between the University of Melbourne and a research network-connected site located in Germany.

    Data processing for the ATLAS Experiment

    This huge data flow forms part of the CoEPP’s activities as an ATLAS experiment Tier2 site for the Worldwide Large Hadron Collider Computing Grid (WLCG). The CoEPP is one of the 170+ grid-connected computing centres in 42 countries worldwide that provide the linked-up computing and storage facilities required for analysing the ~30 Petabytes (30 million gigabytes) of data CERN’s Large Hadron Collider (LHC) produces annually.

    Helping scientists further our understanding of the Universe

    Physicists are using the LHC to recreate the conditions of the Universe just after the ‘Big Bang’. They are searching for new discoveries in the head-on collisions of protons of extraordinarily high energy to further our understanding of energy and matter. Following the discovery of the Higgs boson in 2012, data from the ATLAS experiment allows in-depth investigation of the boson’s properties and the origin of mass.

    The reported 100% efficiency of this particular big data transfer between Australia and Germany, clocked at nearly 5 gigabits per second sustained over 24 hours, is a great example of the reliability and scalability of the AARNet network to meet the needs of data-intensive research on demand.
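
    The quoted figures are easy to cross-check. Below is a minimal sketch, assuming decimal terabytes and petabytes (powers of 10, not the binary convention):

        SECONDS_PER_DAY = 24 * 60 * 60

        # The reported transfer: 53 terabytes in 24 hours.
        bits_transferred = 53e12 * 8                      # bytes -> bits
        rate_gbps = bits_transferred / SECONDS_PER_DAY / 1e9
        print(f"Sustained rate: {rate_gbps:.2f} Gbit/s")  # ~4.91 Gbit/s

        # For context, the LHC's ~30 PB/year averaged over a full year:
        avg_gbps = 30e15 * 8 / (365.25 * SECONDS_PER_DAY) / 1e9
        print(f"Average LHC data rate: {avg_gbps:.1f} Gbit/s")  # ~7.6 Gbit/s

    In other words, for one day a single Tier2 site sustained a rate of the same order as the LHC's entire averaged annual data production.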

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    AARNet provides critical infrastructure for driving innovation in today’s knowledge-based economy

    Australia’s Academic and Research Network (AARNet) is a national resource – a National Research and Education Network (NREN). AARNet provides unique information communications technology capabilities to enable Australian education and research institutions to collaborate with each other and their international peer communities.

     
  • richardmitnick 3:29 pm on September 20, 2016 Permalink | Reply
    Tags: CERN LHC, LHC hits 2016 luminosity target

    From CERN Courier: “LHC hits 2016 luminosity target” 

    CERN Courier

    Sep 16, 2016
    No writer credit found

    Integrated luminosity

    At the end of August, two months ahead of schedule, the integrated luminosity delivered by the LHC reached the 2016 target value of 25 fb⁻¹ in both the ATLAS and CMS experiments. The milestone is the result of the efforts of a large group of scientists and technical experts who work behind the scenes to keep the 27 km-circumference machine operating at the highest possible performance.

    Following a push to produce as many proton–proton collisions as possible before the summer conferences, several new ideas, such as a novel beam-production technique in the injectors, have been incorporated to boost the LHC performance. Thanks to these improvements, over the summer the LHC was routinely operating with peak luminosities 10%–15% above the design value of 10³⁴ cm⁻² s⁻¹.

    This is a notable success, especially considering that a temporary limitation in the Super Proton Synchrotron only allows the injection of 2220 bunches per beam instead of the foreseen 2750, and that the LHC energy is currently limited to 6.5 TeV instead of the nominal 7 TeV. The excellent availability of all the key systems of the LHC is one of the main reasons behind these achievements.
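
    The link between luminosity and raw collision rate is a one-line calculation: rate = luminosity × cross-section. In the sketch below, the inelastic proton–proton cross-section of roughly 80 millibarn at 13 TeV is an assumed round number, not a value taken from this article.

        # Collision rate from instantaneous luminosity.
        peak_luminosity = 1.15e34   # cm^-2 s^-1, ~15% above the design value
        sigma_inelastic = 80e-27    # cm^2 (80 mb; 1 mb = 1e-27 cm^2), assumed

        rate_per_second = peak_luminosity * sigma_inelastic
        print(f"Collision rate: {rate_per_second:.1e} per second")  # ~9.2e+08

    That recovers the figure of about a billion proton–proton collisions per second often quoted for the 2016 LHC.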

    The accelerator team is now gearing up for the season finale. Following a technical stop, a forward proton–proton physics run took place in mid-September. Proton–proton physics is scheduled to continue until the last week in October, after which proton–lead physics will take over for a period of one month. The LHC and its experiments can look forward to the completion of what is already a very successful year.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS

    ALICE

    CMS

    LHCb
     
  • richardmitnick 2:03 pm on September 16, 2016 Permalink | Reply
    Tags: CERN LHC

    From Symmetry: “The secret lives of long-lived particles” 

    Symmetry

    09/16/16
    Sarah Charley

    A theoretical species of particle might answer nearly every question about our cosmos—if scientists can find it.

    ATLAS collaboration

    The universe is unbalanced.

    Gravity is tremendously weak. But the weak force, which allows particles to interact and transform, is enormously strong. The mass of the Higgs boson is suspiciously petite. And the catalog of the makeup of the cosmos? Ninety-six percent incomplete.

    Almost every observation of the subatomic universe can be explained by the Standard Model of particle physics—a robust theoretical framework bursting with verifiable predictions.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    But because of these unsolved puzzles, the math is awkward, incomplete and filled with restrictions.

    A few more particles would solve almost all of these frustrations. Supersymmetry (nicknamed SUSY) is a colossal model that introduces new particles into the Standard Model’s equations.

    Standard model of Supersymmetry DESY

    It rounds out the math and ties up loose ends. The only problem is that after decades of searching, physicists have found none of these new friends.

    But maybe the reason physicists haven’t found SUSY (or other physics beyond the Standard Model) is because they’ve been looking through the wrong lens.

    “Beautiful sets of models keep getting ruled out,” says Jessie Shelton, a theorist at the University of Illinois, “so we’ve had to take a step back and consider a whole new dimension in our searches, which is the lifetime of these particles.”

    In the past, physicists assumed that new particles produced in particle collisions would decay immediately, almost precisely at their points of origin. Scientists can catch particles that behave this way—for example, Higgs bosons—in particle detectors built around particle collision points. But what if new particles had long lifetimes and traveled centimeters—even kilometers—before transforming into something physicists could detect?

    This is not unprecedented. Bottom quarks, for instance, can travel a few tenths of a millimeter before decaying into more stable particles. And muons can travel several kilometers (with the help of special relativity) before transforming into electrons and neutrinos. Many theorists are now predicting that there may be clandestine species of particles that behave in a similar fashion. The only catch is that these long-lived particles must rarely interact with ordinary matter, thus explaining why they’ve escaped detection for so long. One possible explanation for this aloof behavior is that long-lived particles dwell in a hidden sector of physics.
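
    The bottom-quark and muon examples above follow from a single formula: the mean lab-frame decay length is L = βγcτ, where τ is the rest-frame lifetime. Here is a minimal sketch with illustrative muon energies:

        import math

        C = 3.0e8            # speed of light, m/s
        TAU_MUON = 2.2e-6    # muon rest-frame lifetime, s
        M_MUON_GEV = 0.1057  # muon mass, GeV/c^2

        def mean_decay_length_m(energy_gev, mass_gev, tau_s):
            """Mean lab-frame decay length, L = beta * gamma * c * tau."""
            gamma = energy_gev / mass_gev
            beta = math.sqrt(1.0 - 1.0 / gamma**2)
            return beta * gamma * C * tau_s

        for e_gev in (1.0, 10.0, 100.0):   # illustrative energies
            km = mean_decay_length_m(e_gev, M_MUON_GEV, TAU_MUON) / 1e3
            print(f"{e_gev:6.1f} GeV muon: ~{km:.0f} km mean decay length")

    A hypothetical long-lived hidden-sector particle would obey the same formula with an unknown τ, which is why the range of possible decay lengths is so enormous.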

    “Hidden-sector particles are separated from ordinary matter by a quantum mechanical energy barrier—like two villages separated by a mountain range,” says Henry Lubatti from the University of Washington. “They can be right next to each other, but without a huge boost in energy to get over the peak, they’ll never be able to interact with each other.”

    High-energy collisions generated by the Large Hadron Collider could kick these hidden-sector particles over this energy barrier into our own regime. And if the LHC can produce them, scientists should be able to see the fingerprints of long-lived particles imprinted in their data.


    Long-lived particles jolted into our world by the LHC would most likely fly at close to the speed of light for between a few micrometers and a few hundred thousand kilometers before transforming into ordinary and measurable matter. This incredibly generous range makes it difficult for scientists to pin down where and how to look for them.

    But the lifetime of a subatomic particle is much like that of any living creature. Each type of particle has an average lifespan, but the exact lifetime of an individual particle varies. If these long-lived particles can travel thousands of kilometers before decaying, scientists are hoping that they’ll still be able to catch a few of the unlucky early-transformers before they leave the detector. Lubatti and his collaborators have also proposed a new LHC surface detector, which would extend their search range by many orders of magnitude.
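
    That intuition can be made concrete with a toy simulation (a sketch, not an ATLAS analysis): individual decay distances follow an exponential distribution around the mean, so even a species with a mean decay length of, say, 1 kilometer leaves a small fraction of decays inside a detector roughly 10 meters across. Both of those numbers are assumptions chosen purely for illustration.

        import random

        MEAN_DECAY_LENGTH_M = 1000.0   # assumed mean lab-frame decay length
        DETECTOR_SIZE_M = 10.0         # assumed outer radius of the detector
        N_PARTICLES = 1_000_000

        decays_inside = sum(
            1 for _ in range(N_PARTICLES)
            if random.expovariate(1.0 / MEAN_DECAY_LENGTH_M) < DETECTOR_SIZE_M
        )
        print(f"Fraction decaying inside: {decays_inside / N_PARTICLES:.3%}")
        # -> about 1%, matching 1 - exp(-10/1000): rare, but nonzero
        #    given enough collisions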

    Because these long-lived particles themselves don’t interact with the detector, their signal would look like a stream of ordinary matter spontaneously appearing out of nowhere.

    “For instance, if a long-lived particle decayed into quarks while inside the muon detector, it would mimic the appearance of several muons closely clustered together,” Lubatti says. “We are triggering on events like this in the ATLAS experiment.” After recording the events, scientists use custom algorithms to reconstruct the origins of these clustered particles to see if they could be the offspring of an invisible long-lived parent.

    If discovered, this new breed of matter could help answer several lingering questions in physics.

    “Long-lived particles are not a prediction of a single new theory, but rather a phenomenon that could fit into almost all of our frameworks for beyond-the-Standard-Model physics,” Shelton says.

    In addition to rounding out the Standard Model’s mathematics, inert long-lived particles could be cousins of dark matter—an invisible form of matter that only interacts with the visible cosmos through gravity. They could also help explain the origin of matter after the Big Bang.

    “So many of us have spent a lifetime studying such a tiny fraction of the universe,” Lubatti says. “We’ve understood a lot, but there’s still a lot we don’t understand—an enormous amount we don’t understand. This gives me and my colleagues pause.”

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 11:38 am on September 7, 2016 Permalink | Reply
    Tags: CERN LHC

    From particlebites: “Searching for Magnetic Monopoles with MoEDAL” 

    particlebites

    September 7, 2016
    Julia Gonski
    Article: Search for magnetic monopoles with the MoEDAL prototype trapping detector in 8 TeV proton-proton collisions at the LHC
    Authors: The MoEDAL Collaboration
    Reference: arXiv:1604.06645v4 [hep-ex]

    Somewhere in a tiny corner of the massive LHC cavern, nestled next to the veteran LHCb detector, a new experiment is coming to life.

    The Monopole & Exotics Detector at the LHC, nicknamed the MoEDAL experiment, recently published its first ever results on the search for magnetic monopoles and other highly ionizing new particles. The data collected for this result is from the 2012 run of the LHC, when the MoEDAL detector was still a prototype. But it’s still enough to achieve the best limit to date on the magnetic monopole mass.

    Figure 1. Image credit: CERN.

    Magnetic monopoles are a very appealing idea. From basic electromagnetism, we expect that swapping electric and magnetic fields under duality should leave Maxwell’s equations unchanged, provided magnetic charges exist. Furthermore, Dirac showed that a magnetic monopole is not inconsistent with quantum electrodynamics (although one does not appear there naturally). The only problem is that in the history of scientific experimentation, we’ve never actually seen one. We know that if we break a magnet in half, we will get two new magnets, each with its own North and South pole (see Figure 1).
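
    Dirac’s argument also fixes the minimum size of a magnetic charge: consistency of quantum mechanics requires eg = nħc/2 in Gaussian units, so the smallest allowed charge, the Dirac charge gD, depends only on the fine-structure constant. A quick sketch:

        # Dirac quantization: e*g = n*hbar*c/2, so g_D = e / (2*alpha).
        ALPHA = 1.0 / 137.036    # fine-structure constant

        g_dirac = 1.0 / (2.0 * ALPHA)   # Dirac charge in units of e
        print(f"g_D = {g_dirac:.1f} e")            # ~68.5 e
        print(f"0.5 g_D = {0.5 * g_dirac:.1f} e")  # ~34.3 e

    That enormous effective charge is why a monopole would be so heavily ionizing, and why the limits quoted below are expressed in units of gD.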

    The monopole’s continued absence is proving to be a thorn in the side of many physicists. Finding a magnetic monopole would be great from a theoretical standpoint. Many Grand Unified Theories predict monopoles as a natural byproduct of symmetry breaking in the early universe. In fact, that prediction is confident enough that the absence of monopoles is known as the “monopole problem”, a puzzle that cosmological inflation was proposed in part to resolve. There have been occasional blips of evidence for monopoles in the past (such as a single event in a detector), but nothing has been reproducible to date.

    Enter MoEDAL. It is the seventh addition to the LHC family, having been approved in 2010. If the monopole is a fundamental particle, it could be produced in proton-proton collisions. It is also expected to be very massive and long-lived. MoEDAL is designed to search for such a particle with a three-subdetector system.


    CERN: The LHC MoEDAL Experiment

    The Nuclear Track Detector is composed of plastics that are damaged when a charged particle passes through them. The size and shape of the damage can then be observed with an optical microscope. Next is the TimePix Radiation Monitor system, a pixel detector which records charge deposits induced by ionizing radiation. The newest addition is the Trapping Detector system, which is simply a large aluminum volume that can trap a monopole, thanks to aluminum’s large nuclear magnetic moment.

    The collaboration collected data using these distinct technologies in 2012, and studied the resulting materials and signals. The ultimate limit in the paper excludes spin-0 and spin-1/2 monopoles with masses between 100 GeV and 3500 GeV and magnetic charges above 0.5 gD (where gD is the Dirac magnetic charge). See Figures 3 and 4 for the exclusion curves. It’s worth noting that this mass range extends well beyond the mass of any fundamental particle known to date. So this is a pretty stringent result.

    Figure 3: Cross-section upper limits at 95% confidence level for DY spin-1/2 monopole production as a function of mass, with different charge models.

    Figure 4: Cross-section upper limits at 95% confidence level for DY spin-1/2 monopole production as a function of charge, with different mass models.

    As for moving forward, we’ve only talked about monopoles here, but the physics programme for MoEDAL is vast. Since the detector technology is fairly broad-based, it is possible to find anything from SUSY to Universal Extra Dimensions to doubly charged particles. Furthermore, this paper uses only LHC data from September to December of 2012, which is not a whole lot. In fact, we’ve collected over 25 times that much data in this year’s run alone (although this detector was not in use this year). More data means better statistics and more extensive limits, so this is definitely a measurement that will be greatly improved in future runs. A new version of the detector was installed in 2015, and we can expect to see new results within the next few years.

    Further Reading:

    CERN press release
    The MoEDAL collaboration website
    “The Physics Programme of the MoEDAL experiment at the LHC”. arXiv:1405.7662v4 [hep-ph]
    “Introduction to Magnetic Monopoles”. arXiv:1204.3077 [hep-th]
    Condensed matter physics has recently made strides in the study of a different sort of monopole; see “Observation of Magnetic Monopoles in Spin Ice”, arXiv:0908.3568 [cond-mat.dis-nn]

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    What is ParticleBites?

    ParticleBites is an online particle physics journal club written by graduate students and postdocs. Each post presents an interesting paper in a brief format that is accessible to undergraduate students in the physical sciences who are interested in active research.

    The papers are accessible on the arXiv preprint server. Most of our posts are based on papers from hep-ph (high energy phenomenology) and hep-ex (high energy experiment).

    Why read ParticleBites?

    Reading a technical paper from an unfamiliar subfield is intimidating. It may not be obvious how the techniques used by the researchers really work or what role the new research plays in answering the bigger questions motivating that field, not to mention the obscure jargon! For most people, it takes years for scientific papers to become meaningful.

    Our goal is to solve this problem, one paper at a time. With each brief ParticleBite, you should not only learn about one interesting piece of current work, but also get a peek at the broader picture of research in particle physics.

    Who writes ParticleBites?

    ParticleBites is written and edited by graduate students and postdocs working in high energy physics. Feel free to contact us if you’re interested in applying to write for ParticleBites.

    ParticleBites was founded in 2013 by Flip Tanedo following the Communicating Science (ComSciCon) 2013 workshop.

    Flip Tanedo, UCI Chancellor’s ADVANCE postdoctoral scholar in theoretical physics and, as of July 2016, an assistant professor of physics at the University of California, Riverside.

    It is now organized and directed by Flip and Julia Gonski, with ongoing guidance from Nathan Sanders.

     