Tagged: Accelerator Science

  • richardmitnick 3:48 pm on October 11, 2018
    Tags: Accelerator Science, European Strategy for Particle Physics

    From CERN: “European Strategy for Particle Physics” 


    From CERN

    The European Strategy for Particle Physics is the cornerstone of Europe’s decision-making process for the long-term future of the field. Mandated by the CERN Council, it is formed through broad consultation of the grass-roots particle physics community; it actively solicits the opinions of physicists from around the world; and it is developed in close coordination with similar processes in the US and Japan to ensure coordination between regions and optimal use of resources globally.

    The European Strategy process was initiated by the CERN Council in 2005, resulting in a document being adopted by the Council in 2006. Unsurprisingly, this document placed the LHC at the top of European particle physics’ scientific priorities, with a significant luminosity upgrade already being mooted. A ramp-up of R&D into future accelerators also featured high on the priority list, followed by coordination with a potential International Linear Collider and participation in a global neutrino programme.

    ILC schematic, being planned for the Kitakami highland, in the Iwate prefecture of northern Japan

    The original Strategy also foresaw increased collaboration with neighbouring fields such as astroparticle and nuclear physics, and it recognised the importance of complementary issues such as communications and technology transfer.

    The original European Strategy prescribed regular updates to take into account the evolution of the field. The first of these was prepared in 2012 and adopted in 2013. By this time, the LHC had proved its capacity with the discovery of the long-sought Higgs boson, evidence for the Brout-Englert-Higgs mechanism through which fundamental particles acquire their mass.

    CERN CMS Higgs Event


    CERN ATLAS Higgs Event

    Again, it came as no surprise that the LHC topped the list of scientific priorities for European particle physics, with the high-luminosity upgrade increasing in importance, and preparations for the post-LHC future taking shape. “Europe”, said the Strategy document, “needs to be in a position to propose an ambitious post-LHC accelerator project at CERN by the time of the next Strategy update.”

    Post LHC map

    The remainder of the updated recommendations represented logical and evidence-based evolutions of those contained in the initial European Strategy. All have been, or are in the process of being, implemented.

    As the second update of the European Strategy gets underway, the stakes are high. Europe, in collaboration with partners from around the world, is engaged in R&D projects for a range of ambitious post-LHC facilities under the CLIC and FCC umbrellas.


    CERN/CLIC

    It is time to check progress on these, matching their expected performance to physics needs. The discussions will be based on scientific evidence gleaned from the impressive results coming in from the LHC, as well as from technological and resourcing considerations.

    In other areas of particle physics, much has changed since the last strategy update. Europe, through CERN, is now contributing fully to a globally-coordinated neutrino programme with experiments to be carried out in the USA and Japan. The International Linear Collider, which would be complementary to the LHC, remains on the table with a site having been identified in Japan and a decision on whether to go forward expected soon. There are ambitious plans to build a large collider in China. And at CERN, a study to investigate the potential for physics beyond colliders, maximising the potential for CERN’s unique accelerator complex, was launched in 2016. All of these factors will feed into the deliberations soon to get underway to update the European Strategy for Particle Physics.

    The current update of the European Strategy was initiated by the CERN Council in December 2016, to be carried out between 2018 and 2020, a timeframe deemed optimal for the major decisions that need to be taken for the future of particle physics in Europe. A call for input was made in March 2018, and dates for the key information-gathering and drafting stages were fixed in August 2018.

    Strategic planning in European particle physics is an open, inclusive and evidence-driven process. Follow the current strategy update as it evolves, and join us on the unfolding adventure of research at the frontier of knowledge.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Meet CERN in a variety of places:

    Quantum Diaries

    Cern Courier

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS
    CERN ATLAS New

    ALICE
    CERN ALICE New

    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    LHC

    CERN map

    CERN LHC Grand Tunnel

    CERN LHC particles

    OTHER PROJECTS AT CERN

    CERN AEGIS

    CERN ALPHA

    CERN AMS

    CERN ASACUSA

    CERN ATRAP

    CERN AWAKE

    CERN CAST Axion Solar Telescope

    CERN CLOUD

    CERN COMPASS

    CERN DIRAC

    CERN ISOLDE

    CERN LHCf

    CERN NA62

    CERN NTOF

    CERN TOTEM

    CERN UA9

    CERN Proto Dune

  • richardmitnick 4:03 pm on October 9, 2018
    Tags: Accelerator Science

    From Symmetry: “Progress in plasma wakefield acceleration for positrons” 

    From Symmetry

    SLAC is upgrading its Facility for Advanced Accelerator Experimental Tests (FACET) into FACET-II, a test bed for new technologies that could revolutionize the way we build particle accelerators.

    Researchers will use FACET-II to develop the plasma wakefield acceleration method, in which researchers send a bunch of very energetic particles through a hot ionized gas, or plasma, creating a plasma wake for a trailing bunch to “surf” on and gain energy. Credit: Greg Stewart/SLAC National Accelerator Laboratory.

    FACET-II will produce beams of highly energetic electrons like its predecessor FACET, but with much better quality. SLAC

    Researchers will use FACET-II for crucial developments before plasma accelerators can become a reality. SLAC

    Future particle colliders will require highly efficient acceleration methods for both electrons and positrons. Plasma wakefield acceleration of both particle types, as shown in this simulation, could lead to smaller and more powerful colliders than today’s machines. Credit: F. Tsung/W. An/UCLA; Greg Stewart/SLAC National Accelerator Laboratory.

    10/09/18
    Angela Anderson

    Three new studies show the promise and challenge of using plasma wakefield acceleration to build a future electron-positron collider.

    Matter is known to exist in four different states: solid, liquid, gas or—under circumstances such as very high temperatures—plasma. A plasma is an ionized gas, a gas with enough energy that some of its atoms have lost their electrons, and those negatively charged electrons are floating along with the now positively charged nuclei they left behind.

    If you send two bunches of particles speeding through plasma about a hair’s width apart, the first creates a wake that feeds the second with energy. That’s the basic idea behind a powerful technology under development called plasma wakefield acceleration, which promises to make future particle colliders more compact and affordable.

    Three recent studies have advanced accelerator physicists’ efforts to design a powerful future matter-antimatter collider using plasma wakefield technology.

    The current most powerful particle accelerator in the world is the Large Hadron Collider, which measures about 17 miles in circumference and cost more than $4 billion to construct. To get higher-energy particle collisions that could further our understanding of nature’s fundamental building blocks, accelerators conventionally must increase in size and cost.

    But plasma wakefield acceleration, also known as PWFA, could buck that trend. The technology has already been shown to significantly increase the energy gained by accelerated particles over shorter distances.

    “With plasma wakefield acceleration, we are trying to do something analogous to making better computer chips—the phones in our pockets can now do the same thing that football fields of computers did before,” explains PWFA researcher Carl Lindstrøm from the University of Oslo.

    A plasma wakefield accelerator could accomplish in just a few meters what it takes the copper linear accelerator at the US Department of Energy’s SLAC National Accelerator Laboratory 2 miles to do.

    “Of all known particle accelerator mechanisms, plasmas provide the most energy gained over a set distance—what’s known as accelerating gradient,” says Spencer Gessner, an accelerator physicist at CERN and formerly SLAC. “We’ve already demonstrated gradients that are almost 10,000 times larger than the conventional radio frequency cavities used in SLAC’s current linear accelerator.”
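The quoted gradient advantage can be sanity-checked with round numbers. The figures below are illustrative assumptions for the sake of the arithmetic, not values from the article:

```python
# Back-of-the-envelope comparison of accelerating gradients.
# Round illustrative numbers (assumptions, not figures from the article):
#   - SLAC's copper linac: ~50 GeV gained over ~3.2 km (the "2 miles" quoted above)
#   - a plasma wakefield gradient of order ~50 GeV per metre

linac_energy_gev = 50.0    # assumed approximate energy of the full SLAC linac
linac_length_m = 3200.0    # roughly 2 miles

linac_gradient = linac_energy_gev / linac_length_m   # GeV per metre
plasma_gradient = 50.0                               # GeV per metre (assumed)

print(f"linac gradient:  {linac_gradient * 1000:.0f} MeV/m")
print(f"plasma gradient: {plasma_gradient * 1000:.0f} MeV/m")
print(f"plasma/linac ratio: ~{plasma_gradient / linac_gradient:,.0f}x")
print(f"plasma length to match the linac: {linac_energy_gev / plasma_gradient:.0f} m")
```

Even with these rough inputs, the conclusion in the text falls out: a metre-scale plasma stage delivers what the copper linac needs kilometres to do.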

    If successful, PWFA could dramatically increase the energy of a future linear collider in the same footprint, or make it possible to build a smaller collider. “It’s unlikely that you would build tons of these machines, because they consume a lot of power,” Gessner explains. “But if even one existed, it would be a big improvement over where we are today.”

    The problem with positrons

    The cleanest collisions for particle physics research are produced by smashing together electrons and positrons. That’s because both electrons and positrons are fundamental particles; they cannot be broken down into smaller parts. And it’s because electrons and positrons are a matter-antimatter pair; when they collide, they annihilate one another and convert neatly into new particles and energy, leaving no leftover particle mess behind.

    Electron-positron colliders of the past produced numerous insights in particle physics, including Nobel Prize-winning discoveries of quarks, the tau lepton and the J/psi meson (co-discovered with scientists using a proton accelerator). These collisions are also preferred in the design of next-generation discovery machines, including plasma wakefield accelerators.

    The problem is with positrons.

    Whereas electrons can be accelerated as a tightly focused particle bunch in the plasma wake, positron bunches tend to lose their compact shape and focus in the plasma environment. PWFA scientists refer to this difference as asymmetry, and the latest research explores strategies for overcoming it.

    “For electrons, plasma wakefield acceleration achieves the two things we need from it to build the machines we would like to build: They accelerate quickly and maintain their quality,” Lindstrøm says. “It’s just unlucky, really, that the same is not true for positrons, and that is the huge challenge we are facing.”

    Wave vs. tsunami

    A conventional accelerator accelerates particles using radio-frequency cavities. RF cavities often look like a series of beads on a string. Electromagnetic waves build up inside the cavities, continuously flipping from positive to negative and back again. Scientists send charged particles through the RF cavities, where they receive a series of pushes and pulls from the electromagnetic wave, gaining speed and energy along the way.

    The accelerating wave in a conventional accelerator varies in a regular and predictable way, making it simple to place electrons or positrons in the right location to get a boost.
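As a rough numeric sketch of that push-and-pull picture (gradient, cavity length and cavity count are all assumed round numbers, not figures from the article), the energy gained across a chain of cavities scales with the cosine of the particle's arrival phase:

```python
import math

# Toy model of energy gain in a chain of RF cavities (all numbers assumed):
# a particle crossing each cavity "on crest" (phase = 0) gets the full push;
# arriving off-crest reduces the per-cavity gain by cos(phase).
gradient_mv_per_m = 20.0   # assumed accelerating gradient of a copper cavity
cavity_length_m = 1.0      # assumed cavity length
n_cavities = 100
phase_rad = 0.0            # on-crest arrival

energy_gain_mev = n_cavities * gradient_mv_per_m * cavity_length_m * math.cos(phase_rad)
print(f"energy gained over {n_cavities} cavities: {energy_gain_mev:.0f} MeV")
```

The regularity of the wave is exactly what makes this bookkeeping trivial for RF cavities, and it is what a plasma wake lacks.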

    Plasma, on the other hand, creates what scientists refer to as a “non-linear” environment: one that is difficult to predict mathematically because there is no uniform variation.

    “When you send a very strong beam into plasma, it’s going to cause something like a tsunami, making all your equations invalid,” Gessner explains. “It’s no longer simply perturbing the ocean, it’s completely remaking it.”

    This non-linear plasma environment offers high acceleration gradients and focusing for electrons, but the effect on positrons is more perilous: While experiments have demonstrated acceleration of positrons in plasma, the quality of the beam cannot hold.

    According to Gessner, there are two ways to approach the asymmetry challenge: “We can either embrace the asymmetry and see where it takes us—although this turns out to be very complicated. Or we can try to create symmetry, for example, by creating a hollow channel inside the plasma where focusing is no longer an issue.”

    Learning from the roadblocks

    During the past several years, scientists working at SLAC’s Facility for Advanced Accelerator Experimental Tests, or FACET, have done a series of studies on positron acceleration in plasma. In 2015, a team of SLAC and UCLA researchers accelerated antimatter in a plasma wake using only one bunch of positrons [Nature Scientific Reports]. The tail of that bunch was fed by the wake created by the head.

    Single-bunch positron acceleration could potentially be put to use in a plasma-based “afterburner” for existing or future RF accelerators. One plasma accelerator structure could be added onto the end of a linear accelerator to boost energy without having to make it much longer.

    However, a complete PWFA accelerator would need to be built with many consecutive accelerator structures that require a separate trailing positron bunch.

    SLAC’s Mark Hogan, who has been studying PWFA for more than two decades, explains: “With a single bunch you are losing in one half and gaining in the other. By the time you get through multiple plasma cells, there won’t be any particles left because you are always dividing the bunch in half. You’d have to start with an enormous number of particles.”
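Hogan's halving argument compounds quickly. A toy tally (starting particle count and number of cells are assumed, purely for illustration) shows how little of a single bunch would survive a multi-cell machine:

```python
# Sketch of the single-bunch bookkeeping Hogan describes: if every plasma
# cell consumes the leading half of the bunch to drive the wake, the usable
# charge shrinks geometrically with the number of cells (numbers assumed).
n0 = 1_000_000_000   # assumed starting particle count
cells = 20

remaining = n0
for _ in range(cells):
    remaining //= 2   # half drives the wake, half is accelerated onward

print(f"after {cells} cells: {remaining:,} of {n0:,} particles remain")
```

A billion particles dwindle to under a thousand after twenty halvings, which is why a separate trailing bunch is needed for a staged collider.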

    In October 2017, the researchers started investigating techniques that might work for multiple plasma cells and were able to accelerate a distinct bunch of positrons using PWFA.

    “We used a strong, dense positron beam to accelerate a separate bunch of trailing positrons for the first time,” says Antoine Doche of Paris-Saclay University, first author of the paper in Nature Scientific Reports. “This was one important and necessary step for future colliders.”

    In the same study the scientists showed they could accelerate positrons in a “quasilinear” wave, demonstrating that the driving bunch does not necessarily need to be positrons: Electrons or a laser driver could create a similar wake for the trailing positrons.

    The study opens promising paths to explore the first approach, embracing the problem with positrons, though technical challenges persist.

    “Colliders require particle beams with very specific properties,” Doche explains: “High charge, meaning a lot of particles in each bunch, and a small bunch size. When a positron beam drives a plasma wave, the wave evolves toward a nonlinear regime, all the more quickly as the charge of the bunch increases. One solution might be to more fully understand these positron-driven nonlinear waves.”

    Rolling off a hill

    In 2016, the research team eliminated the asymmetry issue by creating a narrow tube of plasma with neutral gas inside where the positrons stayed tightly focused as they flew through. That same research showed that the positron beam created an energetic wake that could accelerate a trailing bunch of positrons, and in the latest experiments the team achieved this two-bunch acceleration in what they call the “hollow channel.”

    While the hollow channel approach avoids the problem of asymmetry, it brings its own obstacles.

    “If the beam is not perfectly aligned in the tube, it will start to drift to the side that it is offset,” Lindstrøm says. “It’s like putting a ball on a hilltop—if it’s slightly to one side, it will roll off to that side. It’s an effect that we call the transverse wakefield, and it is something that has been seen in past accelerators as a weak effect. But here, because we have a very, very narrow plasma tube, the effect grows really fast. Our latest research [Physical Review Letters] measured and verified that the effect is very strong.”

    When the positrons are deflected away from the axis through this effect, the beam is lost.

    “The most recent studies verify where we currently stand, with this large challenge in front of us,” Lindstrøm says. “But in the process of getting there, we learned a lot about how this technology works.”

    Gessner concurs, “We study the problem, we see how well we can make it work, and we identify the most challenging roadblocks. And then we go back to the drawing board.”

    Encouraging signals

    Despite the challenges, international momentum to achieve high-energy accelerators based on plasma is growing.

    In research roadmaps, both the DOE and the International Committee for Future Accelerators have included positron acceleration in plasma as a goal for the next decade. Gessner and Sebastien Corde, a Paris-Saclay University PWFA researcher, are heading up a working group on positron acceleration in plasma that is tasked with making recommendations for the European Strategy for Particle Physics.

    Since the earliest experiments, SLAC has been the only laboratory in the world with the infrastructure needed to provide positron beams for PWFA research. FACET operated from 2011 to 2016 as a DOE Office of Science user facility. And the DOE recently gave the green light to its upgrade, FACET-II, which is set to come online for experiments in 2020.

    While FACET-II will initially operate with electrons only, its design allows for adding capability to produce and accelerate positrons in the future.

    “We’re at a point where people are taking this knowledge that we’ve amassed in this field and figuring out what to do next. Can we take one of these approaches, like the hollow channel, and make it more forgiving?” Hogan says. “There are a lot of things for people to look at and study going forward.”

    See the full article here.



    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 12:05 pm on October 3, 2018
    Tags: Accelerator Science, Leon Lederman Nobel laureate former laboratory director and passionate advocate of science education dies at age 96, Nobel laureate

    From Fermi National Accelerator Lab: “Leon Lederman, Nobel laureate, former laboratory director and passionate advocate of science education, dies at age 96” 

    FNAL Art Image by Angela Gonzales

    From Fermi National Accelerator Lab, an enduring source of strength for the US contribution to scientific research worldwide.

    October 3, 2018

    Leon Lederman from Kane County Connects

    Leon Lederman, a trail-blazing researcher with a passion for science education who served as Fermilab’s director from 1978 to 1989 and won the Nobel Prize for the discovery of the muon neutrino, died peacefully on Oct. 3 at a nursing home in Rexburg, Idaho. He was 96.

    He is survived by his wife of 37 years, Ellen, and three children, Rena, Jesse and Rachel, from his first marriage, to Florence Gordon.

    With a career that spanned more than 60 years, Lederman became one of the most important figures in the history of particle physics. He was responsible for several breakthrough discoveries, uncovering new particles that elevated our understanding of the fundamental universe. But perhaps his most critical achievements were his influence on the field and his efforts to improve science education.

    “Leon Lederman provided the scientific vision that allowed Fermilab to remain on the cutting edge of technology for more than 40 years,” said Nigel Lockyer, the laboratory’s current director. “Leon’s leadership helped to shape the field of particle physics, designing, building and operating the Tevatron and positioning the laboratory to become a world leader in accelerator and neutrino science. Today, we continue to develop and build the next generation of particle accelerators and detectors and help to advance physics globally. Leon had an immeasurable impact on the evolution of our laboratory and our commitment to future generations of scientists, and his legacy will live on in our daily work and our outreach efforts.”

    Through Lederman’s early award-winning work, he rose to prominence as a researcher and began to influence science policy. In the early 1960s, he proposed the idea for the National Accelerator Laboratory, which eventually became Fermi National Accelerator Laboratory (Fermilab). He worked with laboratory founder Robert R. Wilson to establish a community of users, credentialed individuals from around the world who could use the facilities and join experimental collaborations.

    According to Fermilab scientist Alvin Tollestrup, who worked with Lederman for more than 40 years, Lederman’s success was in part due to his ability to bring people together and get them to work cohesively.

    “One of his greatest skills was getting good people to work with him,” Tollestrup said. “He wasn’t selfish about his ideas. What he accomplished came about from his ability to put together a great team.”

    Lederman began his tenure as Fermilab director in 1978, at a time when both the laboratory staff and the greater particle physics community were deeply divided. As a charismatic leader and a respected researcher, Lederman unified the Fermilab staff and rallied the U.S. particle physics community around the idea of building a proton-antiproton collider. Originally called the Energy Doubler, the particle accelerator eventually became the Tevatron, the world’s highest-energy particle collider from 1983 until 2010.

    “Leon gave U.S. and world physicists a step up, a unique facility, a very high-energy collider, and his successors keep working for these things,” said Director Emeritus John Peoples, who worked with Lederman for more than 40 years and served as Lederman’s deputy director from 1988 to 1989. “Leon made that happen. He set things in motion.”

    In order to begin plans for a high-energy proton-antiproton collider, Lederman convinced the greater physics community, the Department of Energy, President Reagan’s science advisor and Congress.

    “Leon had the ability to lead. He was unifying and convincing,” Peoples said. “He had the ability to listen to people carefully and could synthesize things well. He was very persuasive. In some sense, I was manipulated at every level.”

    Lederman’s ability to convince others stemmed in part from his charm and his sense of humor, Peoples said.

    “He seemed to have an enormous storehouse of jokes,” Peoples said. “He had a lighthearted personality, he could have been a stand-up comic at times.”

    Leon Lederman celebrates his birthday with children from the Fermilab daycare center.

    Lederman was born on July 15, 1922, to Russian-Jewish immigrant parents in New York City. His father, who operated a hand laundry, revered learning. Lederman graduated from the City College of New York with a degree in chemistry in 1943, although by that point he had become friends with a group of physicists and become interested in the topic. He served three years with the United States Army in World War II and then returned to Columbia University in New York to pursue his Ph.D. in particle physics, which he received in 1951. During graduate school, Lederman joined the Columbia physics department in constructing a 385-MeV synchrotron at Nevis Laboratory in Irvington-on-Hudson, New York. He remained part of that collaboration for 28 years, eventually serving as director of Nevis Labs from 1961 to 1978.

    In 1956, while working as part of a Columbia team at Brookhaven National Laboratory, Lederman discovered the long-lived neutral K meson. In 1962, Lederman, along with colleagues Jack Steinberger and Melvin Schwartz, produced a beam of neutrinos using a high-energy accelerator. They discovered that sometimes, instead of producing an electron, a muon is produced, showing the existence of a new type of neutrino, the muon neutrino. That discovery eventually earned them the 1988 Nobel Prize in physics.

    Leon M. Lederman, Nobel laureate and director of Fermilab after R.R. Wilson, stands outside Wilson Hall at Fermilab on the day he learned he was awarded the 1988 Nobel Prize.

    The advancement of particle accelerators continued to spur discoveries. At Brookhaven in 1965, Lederman and his team found the first antinucleus, the antideuteron, composed of an antiproton and an antineutron. In 1977, at Fermilab, Lederman led the team that discovered the bottom quark, at the time the first of a suspected new family of heavy particles.

    “All of those experiments were important because they set the stage for learning that we have at least two generations of leptons and something else,” Tollestrup said.

    Lederman served as director of Fermilab from 1978 to 1989. During his tenure as laboratory director, Lederman had a significant impact on laboratory culture. He was responsible for establishing new amenities that set Fermilab apart from other labs, such as the first daycare facility at a Department of Energy national laboratory and an art gallery that continues to host rotating exhibits.

    He also had significant impact on the next generation of scientists. It was during his years at Columbia, an institution that required students to teach, that Lederman developed a passion for science education and outreach, which became a theme throughout his career. Between 1951 and 1978 he mentored 50 Ph.D. students. He liked to joke about their success, saying that not a single one was in jail.

    As director of Fermilab, Lederman established the ongoing Saturday Morning Physics program, which for decades has attracted students from around the Chicago area to learn more about particle physics from experts, originally Lederman himself and later a long list of leading scientists. The program has inspired generations of high school students.

    Leon Lederman in 1982

    Recognizing the need for more focused education in science and math, Lederman focused on creating learning spaces and opportunities for students. In the early 1980s, Lederman worked with members of the Illinois state government to start the Illinois Math and Science Academy, which was founded in 1985, and worked with officials to try to adjust the science curriculum in Chicago’s public schools so that students learned physics first, forming the foundation for their future scientific education. He founded and chaired the Teachers Academy for Mathematics and Science and was active in the professional development of primary school teachers in Chicago. He also helped to found the nonprofit Fermilab Friends for Science Education, a leading national organization in precollege science education.

    In later years, Lederman continued his outreach efforts, often in memorable ways. In 2008, he set up shop on the corner of 34th Street and 8th Avenue in New York City and answered science questions from passersby.

    During his career, Lederman received some of the highest national and international awards and honors given to scientists. These include the 1965 National Medal of Science, the 1972 Elliott Cresson Medal from the Franklin Institute, the Wolf Prize in 1982 and the Nobel Prize in 1988. He received the Enrico Fermi Award in 1992 for his career contributions to science, technology and medicine related to nuclear energy and the science and technology of energy, and was given the Vannevar Bush Award, which honors exceptional lifelong leaders in science and technology, in 2012.

    In addition to his appointments at Columbia, Nevis and Fermilab, Lederman also served as the Pritzker professor of science at Illinois Institute of Technology and chairman of the State of Illinois Governor’s Science Advisory Committee. He also served on the Board of the Chicago Museum of Science and Industry, the Secretary of Energy Advisory Board and others.

    When Lederman stepped down as Fermilab’s director in 1989 and Peoples took the role, Lederman shared some sage advice. A desk nameplate, which sits on Peoples’s desk more than 25 years later, reads “I’m listening.”

    See the full article here.



    FNAL Icon

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics, and America’s premier laboratory for particle physics and accelerator research. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.


    FNAL/MINERvA

    FNAL DAMIC

    FNAL Muon g-2 studio

    FNAL Short-Baseline Near Detector under construction

    FNAL Mu2e solenoid

    Dark Energy Camera [DECam], built at FNAL

    FNAL DUNE Argon tank at SURF

    FNAL/MicrobooNE

    FNAL Don Lincoln

    FNAL/MINOS

    FNAL Cryomodule Testing Facility

    FNAL Minos Far Detector

    FNAL LBNF/DUNE from FNAL to SURF, Lead, South Dakota, USA

    FNAL/NOvA experiment map

    FNAL NOvA Near Detector

    FNAL ICARUS

    FNAL Holometer

     
  • richardmitnick 1:48 pm on September 27, 2018
    Tags: Accelerator Science, LHCb experiment discovers two perhaps three new particles

    From CERN: “LHCb experiment discovers two, perhaps three, new particles” 


    From CERN

    27 Sep 2018
    Ana Lopes

    The LHCb experiment at CERN. (Image: CERN)

    It could be three for the price of one. The LHCb collaboration has found two never-before-seen particles, as well as hints of another new particle, in high-energy proton collisions at the Large Hadron Collider (LHC). Future studies of the properties of these new particles will shed light on the strong force that binds subatomic particles called quarks together.

    The new particles are predicted by the well-established quark model, and belong to the same family of particles as the protons that the LHC accelerates and collides: baryons, which are made up of three quarks. But the type of quarks they contain are different: whereas protons contain two up quarks and one down quark, the new particles, dubbed Σb(6097)+ and Σb(6097)-, are bottom baryons composed of one bottom quark and two up quarks (buu) or one bottom quark and two down quarks (bdd) respectively. Four relatives of these particles, known as Σb+, Σb-, Σb*+ and Σb*-, were first observed at a Fermilab experiment, but this is the first time that their two higher-mass counterparts, Σb(6097)+ and Σb(6097)-, have been detected.
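The quark assignments stated above can be tabulated in one place (a trivial sketch; the plain-ASCII names stand in for the Σb symbols used in the text):

```python
# Quark content of the baryons mentioned above, as stated in the article.
quark_content = {
    "proton":          ("u", "u", "d"),
    "Sigma_b(6097)+":  ("b", "u", "u"),
    "Sigma_b(6097)-":  ("b", "d", "d"),
}

for name, quarks in quark_content.items():
    # Each entry has three quarks, which is what makes it a baryon.
    print(f"{name}: {'-'.join(quarks)}")
```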

    The LHCb collaboration found these particles using the classic particle-hunting technique of looking for an excess of events, or bump, over a smooth background of events in data from particle collisions. In this case, the researchers looked for such bumps in the mass distribution of a two-particle system consisting of a neutral baryon called Λb0 and a charged quark-antiquark particle called the π meson. They found two bumps corresponding to the Σb(6097)+ and Σb(6097)- particles, with whopping significances of 12.7 and 12.6 standard deviations respectively; five standard deviations is the usual threshold to claim the discovery of a new particle. The 6097 in the names refers to the approximate masses of the new particles in MeV, about six times more massive than the proton.
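The "bump over a smooth background" idea can be illustrated with a toy significance estimate. The counts below are invented, and the naive s/√b formula is far cruder than the full fits LHCb actually performs, but it shows why a large excess over a well-understood background translates into many standard deviations:

```python
import math

# Toy version of a bump hunt: compare events counted in a mass window
# against the smooth-background expectation. Numbers are made up for
# illustration; real analyses use full fits with systematic uncertainties
# and look-elsewhere corrections.
background_expected = 400.0   # assumed events under the bump from the smooth fit
observed = 900.0              # assumed events counted in the same mass window

excess = observed - background_expected
significance = excess / math.sqrt(background_expected)   # naive s/sqrt(b)

print(f"excess: {excess:.0f} events, ~{significance:.1f} standard deviations")
print("conventional discovery threshold: 5 standard deviations")
```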

    The third particle, named Zc-(4100) by the LHCb collaboration, is a possible candidate for a different type of quark beast, one made not of the usual two or three quarks but of four quarks (strictly speaking, two quarks and two antiquarks), two of which are heavy charm quarks. Such exotic mesons, sometimes described as “tetraquarks”, as well as five-quark particles called “pentaquarks”, have long been predicted to exist but have only relatively recently been discovered. Searching for structures in the decays of heavier B mesons, the LHCb researchers detected evidence for Zc-(4100) with a significance of more than three standard deviations, short of the threshold for discovery. Future studies with more data, at LHCb or at other experiments, may be able to boost or disprove this evidence.

    The new findings, described in two papers posted online and submitted for publication to physics journals, represent another step in physicists’ understanding of the strong force, one of the four fundamental forces of nature.

    For more information, see the LHCb website.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition


     
  • richardmitnick 12:57 pm on September 27, 2018 Permalink | Reply
    Tags: Accelerator Science, , , , , , ,   

    From Symmetry: “Life as an accelerator operator” 

    From Symmetry

    09/27/18
    Shannon Brescher Shea

    Photo by Fermilab

    Behind some of the world’s biggest scientific instruments are teams with a set of skills you can’t find anywhere else.

    One Friday night in March, accelerator operator Alyssa Miller was chatting with fellow staff members in the rec center at the US Department of Energy’s Fermi National Accelerator Laboratory right after she finished her shift. As they talked, several of her colleagues’ cell phones started buzzing. Scrambling to check their messages, they realized the local utility company had abruptly stopped providing the accelerator complex with electricity. The lights were still on, thanks to back-up power, but the complex requires too much energy to keep running by generator alone.

    The Fermilab Accelerator Complex, a DOE Office of Science user facility, comprises seven particle accelerators and storage rings and hosts more than 2000 scientists a year. Fermilab’s Booster provides a low-energy beam of neutrinos for the MicroBooNE experiment. The Main Injector provides the world’s highest intensity neutrino beams for the MINOS, MINERvA and NOvA experiments. In the future it will send neutrinos to the Deep Underground Neutrino Experiment detectors in South Dakota. It provides beams of muons to the Muon g-2 experiment and will send them to the Mu2e experiment as well. It also sends particles to the SeaQuest fixed-target experiment and to a facility for testing detector technologies.

    “It’s an enormously complicated chain,” says Mike Lindgren, head of Fermilab’s accelerator division.

    On that Friday, the chain was broken. The facility’s accelerator operators and technical specialists ran to their stations.

    “Accelerators like to be turned off nicely,” says Miller, a former operator who is now an engineering physicist at Fermilab. “When they’re shut off all at once, that leaves things in kind of a bad state.”

    As power returned, the operators and technical specialists combined forces to retune the giant machine. It was an all-hands-on-deck effort. Shift leaders called workers in early from their days off. The entire team pushed hard for nearly 24 hours to get the entire complex running again.

    While this was a memorable incident, it was all part of the job. For accelerator operators at the DOE’s national laboratories, there’s no such thing as an average day.

    A day in the life

    More than 3000 people work on or with accelerators at DOE’s 17 national labs. About 300 of them are operators.

    Because the accelerators generally run 24 hours a day, seven days a week, operators work a shift schedule, changing between the day, night, and overnight (or “owl”) shifts.

    An operator’s shift starts with a briefing from the crew chief about what happened since the team was last at work. After the meeting, operators settle down in the main control room. The walls there are covered with screens showing multi-color lines, graphs, and maps.

    The amount of incoming information can be overwhelming. During Miller’s first few months at Fermilab, she says, “I kept thinking to myself, ‘Oh my goodness, how am I going to learn all of this?’ Now I sit down and I feel comfortable at my console.”

    Operators use the data to judge how well the accelerator is delivering beams to the scientists’ experiments. They constantly check the many console displays to ensure everything is running smoothly. If the beam is getting “loose” and shedding too many particles, they tweak the magnets to tighten it. Depending on the lab and researchers’ needs, they may also adjust the magnets to change the beam’s parameters such as its energy, size and shape.
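    The tweak-and-check cycle operators describe is, at heart, a feedback loop. The toy below is purely illustrative (real beam controls are vastly more sophisticated): it nudges a single hypothetical setting proportionally toward a target value, the way an operator iteratively trims a magnet:

    ```python
    def tune(position, target, gain=0.5, steps=8):
        """Proportional correction: each step closes a fixed fraction
        of the gap between the current reading and the target."""
        history = [position]
        for _ in range(steps):
            position += gain * (target - position)
            history.append(position)
        return history

    trace = tune(position=2.0, target=0.0)
    print(round(trace[-1], 3))  # 0.008 -- converging toward the target
    ```

    Each iteration halves the remaining error, so the beam parameter settles quickly onto the desired value.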

    The first line of defense

    Operators are also accelerators’ first responders, identifying and fixing problems as they arise. From electrical ground faults that bring down power supplies to imperfections in the vacuum around the particle beam, operators must recognize and deal with a whole host of issues.

    “Operators are the first line of defense for any problems at the accelerator,” says Fermilab operator KelliAnn Rubrecht.

    Operators have to be familiar with a multitude of topics, from charged particle dynamics to cryogenic systems maintenance. “We have to know a little bit of everything,” Rubrecht says. “That’s one of the things I really like about this job.”

    Each facility has unique needs. They often require one-of-a-kind equipment and devices that can’t be purchased. “We take a lot of pride in building things ourselves,” Miller says. “It requires ingenuity for sure.”

    Sometimes, operators can solve an issue at their computers with a twist of a knob. Other times, they face physical problems that require hands-on inspection. If they don’t have the expertise to repair the problem themselves, they call in people who specialize in particular sections of the accelerator. “Once you know what the problem is, you find the right people to fix it,” says Mary Convery, Deputy Division Head of Accelerator Systems at Fermilab.

    Descending into the accelerator tunnel (always in groups of at least two for safety), operators move into a strange, underground world. The walls are thick concrete; the ceilings are low; water that runs through the cooling pipes humidifies the air.

    For operators, it can be a welcome change of pace from gazing at computer screens.

    “There’s so much going on behind the scenes that you can’t get a true appreciation for until you go down there,” Miller says.

    The learning curve

    Because most operators have no prior experience running accelerators, there’s extensive on-the-job training. It takes up to two years for a rookie operator to cover it all.

    To help get new and aspiring operators up to speed, eight DOE labs partner with two universities to run the US Particle Accelerator School. Twice a year, the school offers intensive sessions for lab staff, university students, and those in industry who want to learn more about accelerators.

    “We have instructors that are leaders in their field, and they bring a lot of state-of-the-art material,” says Steve Lund, a professor at Michigan State University and the head of the Particle Accelerator School. “Students are learning from people who are established pillars of the community.”

    The school offers classes on topics available nowhere else. The universities hosting the sessions provide students with academic credits.

    Running a particle accelerator requires a niche set of skills. But the challenge of acquiring them is worth the chance to use them, Miller says.

    “If you want to get your hands dirty literally and figuratively and you want to learn a lot and you want to be a little confused and frustrated the whole time, operations is for you,” she says. “It’s an experience like no other.”

    See the full article here.



    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 4:07 pm on September 25, 2018 Permalink | Reply
    Tags: Accelerator Science, , , , New ATLAS result of ultra-rare B-meson decay to muon pair, , ,   

    From CERN ATLAS: “New ATLAS result of ultra-rare B-meson decay to muon pair” 

    From CERN ATLAS

    25th September 2018
    ATLAS Collaboration

    Figure 1: Measured dimuon mass distributions in the selection channel with highest expected signal purity. Superimposed is the result of a fit to the data. The total fit result is shown (black continuous line), with the observed signal component (dashed red line), b→μμ X background (dashed blue), and continuum background (dashed green). Signal components are grouped in one single curve, including the B0s → μ+μ- and B0 → μ+μ- components. The peaking B0(s) → hh′ background (brown dashed line) is, for all BDT bins, very close to the x-axis. (Image: ATLAS Collaboration/CERN)

    The study of hadrons – particles that combine together quarks to form mesons or baryons – is a vital part of the ATLAS physics programme. Their analysis has not only perfected our understanding of the Standard Model, it has also provided excellent opportunities for discovery.

    Among the variety of hadrons available in nature and produced in LHC proton-proton collisions, B-mesons play a fundamental role. They are bound states of a quark and an antiquark – one of them a bottom quark, the other a lighter quark (up, down, strange or charm) – that decay through the weak interaction to lighter hadrons and/or leptons. Over the past decades, physicists have examined rare and precisely predicted phenomena involving neutral B-mesons (i.e. B0 or Bs mesons), searching for slight discrepancies from theory predictions that could be a signal of new physics.

    On 20 September 2018, at the International Workshop on the CKM Unitarity Triangle (CKM 2018), ATLAS revealed the most stringent experimental constraint on the very rare decay of the B0 meson into two muons (μ). The result is a new milestone that complements analyses [Physical Review Letters] previously published by LHC experiments dedicated to the study of B-mesons in a quest spanning almost three decades.

    ATLAS revealed the most stringent experimental constraint of the very rare decay of the B0 meson into two muons.

    Figure 2: Likelihood contours for the combination of the Run 1 and 2015-2016 Run 2 results (in black). The solid, dashed and dashed-dotted contours delimit the one, two and three standard deviation regions, respectively. The contours for the Run 2 2015-2016 result alone are overlaid in blue. The Standard Model prediction and its uncertainties are shown by the solid cross. (Image: ATLAS Collaboration/CERN)

    The rareness of this decay is due to the coincidence of two factors: first, the decay can proceed only through quantum loops with several weak-interaction vertices, some of which have a low probability of occurring; second, angular momentum conservation forces the decay products of the spin-zero B0 or Bs meson into a highly unlikely configuration.

    According to the Standard Model, the probability of this decay is about 1.1 in 10 billion. The new ATLAS result gets very close, with the tightest available upper limit of 2.1 occurrences in ten billion at the 95% confidence level. The result was obtained using data collected in 2015 and 2016 combined with an analogous analysis of Run 1 data. The analysis also provided 4.6-standard-deviation evidence for the Bs → μμ decay, whose branching fraction is measured to be 2.8 +0.8/−0.7 × 10−9. It confirms previous measurements from the LHCb and CMS collaborations.
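    To get a feel for what a branching fraction of order 10⁻¹⁰ means in practice, one can translate it into expected decay counts. The sample size below is invented purely for illustration; the two branching fractions are the SM prediction and the ATLAS 95% CL upper limit quoted above:

    ```python
    def expected_decays(n_mesons, branching_fraction):
        """Expected number of decays in a given sample of produced mesons."""
        return n_mesons * branching_fraction

    n_b0 = 1e11  # hypothetical sample of 100 billion B0 mesons

    print(round(expected_decays(n_b0, 1.1e-10)))  # 11 : SM prediction
    print(round(expected_decays(n_b0, 2.1e-10)))  # 21 : at the upper limit
    ```

    Even with a hundred billion B0 mesons, only a handful of these decays would occur, which is why enormous datasets and careful background rejection are needed.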

    This new ATLAS result is the first milestone towards a more precise measurement that will be obtained with the full Run 2 dataset, which is expected to improve the current precision by about 30%. Further projections towards the high-luminosity LHC (HL-LHC) era predict that ATLAS will be able to further improve the precision of this result by about a factor 3.

    Related journal articles
    _________________________________________________
    See the full article for further references with links.

    See the full article here.



     
  • richardmitnick 3:39 pm on September 25, 2018 Permalink | Reply
    Tags: Accelerator Science, , , Argonne's Theta supercomputer, Aurora exascale supercomputer, , , , , , ,   

    From Argonne National Laboratory ALCF: “Argonne team brings leadership computing to CERN’s Large Hadron Collider” 


    From Argonne National Laboratory ALCF


    September 25, 2018
    Madeleine O’Keefe

    CERN’s Large Hadron Collider (LHC), the world’s largest particle accelerator, expects to produce around 50 petabytes of data this year. This is equivalent to nearly 15 million high-definition movies—an amount so enormous that analyzing it all poses a serious challenge to researchers.
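    The movie comparison is simple arithmetic. Assuming roughly 3.5 GB per high-definition movie (the assumed size is ours, not the article's), 50 petabytes works out to about 14 million films:

    ```python
    petabyte = 10**15             # bytes, SI definition
    lhc_data = 50 * petabyte      # expected 2018 LHC data volume
    hd_movie = 3.5 * 10**9        # assumed size of one HD movie, ~3.5 GB

    movies = lhc_data / hd_movie
    print(round(movies / 1e6, 1))  # 14.3 (million movies)
    ```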

    A team of collaborators from the U.S. Department of Energy’s (DOE) Argonne National Laboratory is working to address this issue with computing resources at the Argonne Leadership Computing Facility (ALCF), a DOE Office of Science User Facility. Since 2015, this team has worked with the ALCF on multiple projects to explore ways supercomputers can help meet the growing needs of the LHC’s ATLAS experiment.

    The efforts are especially important given what is coming up for the accelerator. In 2026, the LHC will undergo an ambitious upgrade to become the High-Luminosity LHC (HL-LHC). The aim of this upgrade is to increase the LHC’s luminosity—the number of events detected per second—by a factor of 10. “This means that the HL-LHC will be producing about 20 times more data per year than what ATLAS will have on disk at the end of 2018,” says Taylor Childers, a member of the ATLAS collaboration and computer scientist at the ALCF who is leading the effort at the facility. “CERN’s computing resources are not going to grow by that factor.”

    Luckily for CERN, the ALCF already operates some of the world’s most powerful supercomputers for science, and the facility is in the midst of planning for an upgrade of its own. In 2021, Aurora—the ALCF’s next-generation system, and the first exascale machine in the country—is scheduled to come online.

    It will provide the ATLAS experiment with an unprecedented resource for analyzing the data coming out of the LHC—and soon, the HL-LHC.


    Why ALCF?

    CERN may be best known for smashing particles, which physicists do to study the fundamental laws of nature and gather clues about how the particles interact. This involves a lot of computationally intense calculations that benefit from the use of the DOE’s powerful computing systems.

    The ATLAS detector is an 82-foot-tall, 144-foot-long cylinder with magnets, detectors, and other instruments layered around the central beampipe like an enormous 7,000-ton Swiss roll. When protons collide in the detector, they send a spray of subatomic particles flying in all directions, and this particle debris generates signals in the detector’s instruments. Scientists can use these signals to discover important information about the collision and the particles that caused it in a computational process called reconstruction. Childers compares this process to arriving at the scene of a car crash that has nearly completely obliterated the vehicles and trying to figure out the makes and models of the cars and how fast they were going. Reconstruction is also performed on simulated data in the ATLAS analysis framework, called Athena.

    An ATLAS physics analysis consists of three steps. First, in event generation, researchers use the physics that they know to model the kinds of particle collisions that take place in the LHC. In the next step, simulation, they generate the subsequent measurements the ATLAS detector would make. Finally, reconstruction algorithms are run on both simulated and real data, the output of which can be compared to see differences between theoretical prediction and measurement.
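    The three-step chain above can be sketched as a trivial pipeline. The function bodies below are placeholders standing in for ATLAS's actual event generation, simulation, and reconstruction code, which runs inside the Athena framework:

    ```python
    def generate_events(n):
        """Event generation: model particle collisions from known physics."""
        return [{"event_id": i} for i in range(n)]

    def simulate(events):
        """Simulation: predict the detector's response to each event."""
        return [{**e, "hits": "simulated detector signals"} for e in events]

    def reconstruct(events):
        """Reconstruction: turn detector signals back into particles."""
        return [{**e, "particles": "reconstructed tracks"} for e in events]

    # Running the steps back-to-back on one machine avoids shipping the
    # intermediate data between computing sites.
    output = reconstruct(simulate(generate_events(3)))
    print(len(output))  # 3
    ```

    Reconstruction is run on both simulated and real data, so the two outputs can be compared event by event.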

    “If we understand what’s going on, we should be able to simulate events that look very much like the real ones,” says Tom LeCompte, a physicist in Argonne’s High Energy Physics division and former physics coordinator for ATLAS.

    “And if we see the data deviate from what we know, then we know we’re either wrong, we have a bug, or we’ve found new physics,” says Childers.

    Some of these simulations, however, are too complicated for the Worldwide LHC Computing Grid, which LHC scientists have used to handle data processing and analysis since 2002.


    The Grid is an international distributed computing infrastructure that links 170 computing centers across 42 countries, allowing data to be accessed and analyzed in near real-time by an international community of more than 10,000 physicists working on various LHC experiments.

    The Grid has served the LHC well so far, but as demand for new science increases, so does the required computing power.

    That’s where the ALCF comes in.

    In 2011, when LeCompte returned to Argonne after serving as ATLAS physics coordinator, he started looking for the next big problem he could help solve. “Our computing needs were growing faster than it looked like we would be able to fulfill them, and we were beginning to notice that there were problems we were trying to solve with existing computing that just weren’t able to be solved,” he says. “It wasn’t just an issue of having enough computing; it was an issue of having enough computing in the same place. And that’s where the ALCF really shines.”

    LeCompte worked with Childers and ALCF computer scientist Tom Uram to use Mira, the ALCF’s 10-petaflops IBM Blue Gene/Q supercomputer, to carry out calculations to improve the performance of the ATLAS software.


    Together they scaled Alpgen, a Monte Carlo-based event generator, to run efficiently on Mira, enabling the generation of millions of particle collision events in parallel. “From start to finish, we ended up processing events more than 20 times as fast, and used all of Mira’s 49,152 processors to run the largest-ever event generation job,” reports Uram.

    But they weren’t going to stop there. Simulation, which takes up around five times more Grid computing than event generation, was the next challenge to tackle.
    Moving forward with Theta

    In 2017, Childers and his colleagues were awarded a two-year allocation from the ALCF Data Science Program (ADSP), a pioneering initiative designed to explore and improve computational and data science methods that will help researchers gain insights into very large datasets produced by experimental, simulation, or observational methods. The goal is to deploy Athena on Theta, the ALCF’s 11.69-petaflops Intel-Cray supercomputer, and to develop an end-to-end workflow that couples all the steps together, improving on the current execution model for ATLAS jobs, which involves a many-step workflow executed on the Grid.


    “Each of those steps—event generation, simulation, and reconstruction—has input data and output data, so if you do them in three different locations on the Grid, you have to move the data with it,” explains Childers. “Ideally, you do all three steps back-to-back on the same machine, which reduces the amount of time you have to spend moving data around.”

    Enabling portions of this workload on Theta promises to expedite the production of simulation results, discovery, and publications, as well as increase the collaboration’s data analysis reach, thus moving scientists closer to new particle physics.

    One challenge the group has encountered so far is that, unlike other computers on the Grid, Theta cannot reach out to the job server at CERN to receive computing tasks. To solve this, the ATLAS software team developed Harvester, a Python edge service that can retrieve jobs from the server and submit them to Theta. In addition, Childers developed Yoda, an MPI-enabled wrapper that launches these jobs on each compute node.
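    The division of labour between Harvester and Yoda can be caricatured as a poll-and-dispatch loop. Everything below is an illustrative sketch under our own assumptions, not the real ATLAS interfaces: a local queue stands in for the CERN job server (which Theta cannot contact directly), and round-robin assignment stands in for the MPI launch on compute nodes:

    ```python
    import queue

    # A stand-in for the central job server that the edge service polls.
    job_server = queue.Queue()
    for job_id in range(4):
        job_server.put({"job_id": job_id, "task": "simulate"})

    def harvest_and_dispatch(server, n_nodes):
        """Drain the job server and assign jobs round-robin to compute nodes,
        mimicking the retrieve-then-launch split described in the text."""
        assignments = {node: [] for node in range(n_nodes)}
        node = 0
        while not server.empty():
            assignments[node].append(server.get())
            node = (node + 1) % n_nodes
        return assignments

    work = harvest_and_dispatch(job_server, n_nodes=2)
    print({n: len(jobs) for n, jobs in work.items()})  # {0: 2, 1: 2}
    ```

    The key design point is the indirection: the supercomputer never talks to the outside server itself; the edge service does, then hands work inward.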

    Harvester and Yoda are now being integrated into the ATLAS production system. The team has just started testing this new workflow on Theta, where it has already simulated over 12 million collision events. Simulation is the only step that is “production-ready,” meaning it can accept jobs from the CERN job server.

    The team also has a running end-to-end workflow—which includes event generation and reconstruction—for ALCF resources. For now, the local ATLAS group is using it to run simulations investigating if machine learning techniques can be used to improve the way they identify particles in the detector. If it works, machine learning could provide a more efficient, less resource-intensive method for handling this vital part of the LHC scientific process.

    “Our traditional methods have taken years to develop and have been highly optimized for ATLAS, so it will be hard to compete with them,” says Childers. “But as new tools and technologies continue to emerge, it’s important that we explore novel approaches to see if they can help us advance science.”
    Upgrade computing, upgrade science

    As CERN’s quest for new science gets more and more intense, as it will with the HL-LHC upgrade in 2026, the computational requirements to handle the influx of data become more and more demanding.

    “With the scientific questions that we have right now, you need that much more data,” says LeCompte. “Take the Higgs boson, for example. To really understand its properties and whether it’s the only one of its kind out there takes not just a little bit more data but takes a lot more data.”

    This makes the ALCF’s resources—especially its next-generation exascale system, Aurora—more important than ever for advancing science.


    Aurora, scheduled to come online in 2021, will be capable of one billion billion calculations per second—that’s 100 times more computing power than Mira. It is just starting to be integrated into the ATLAS efforts through a new project selected for the Aurora Early Science Program (ESP) led by Jimmy Proudfoot, an Argonne Distinguished Fellow in the High Energy Physics division. Proudfoot says that the effective utilization of Aurora will be key to ensuring that ATLAS continues delivering discoveries on a reasonable timescale. Since increasing compute resources increases the analyses that can be done, systems like Aurora may even enable new analyses not yet envisioned.

    The ESP project, which builds on the progress made by Childers and his team, has three components that will help prepare Aurora for effective use in the search for new physics: enable ATLAS workflows for efficient end-to-end production on Aurora, optimize ATLAS software for parallel environments, and update algorithms for exascale machines.

    “The algorithms apply complex statistical techniques which are increasingly CPU-intensive and which become more tractable—and perhaps only possible—with the computing resources provided by exascale machines,” explains Proudfoot.

    In the years leading up to Aurora’s run, Proudfoot and his team, which includes collaborators from the ALCF and Lawrence Berkeley National Laboratory, aim to develop the workflow to run event generation, simulation, and reconstruction. Once Aurora becomes available in 2021, the group will bring their end-to-end workflow online.

    The stated goals of the ATLAS experiment—from searching for new particles to studying the Higgs boson—only scratch the surface of what this collaboration can do. Along the way to groundbreaking science advancements, the collaboration has developed technology for use in fields beyond particle physics, like medical imaging and clinical anesthesia.

    These contributions and the LHC’s quickly growing needs reinforce the importance of the work that LeCompte, Childers, Proudfoot, and their colleagues are doing with ALCF computing resources.

    “I believe DOE’s leadership computing facilities are going to play a major role in the processing and simulation of the future rounds of data that will come from the ATLAS experiment,” says LeCompte.

    This research is supported by the DOE Office of Science. ALCF computing time and resources were allocated through the ASCR Leadership Computing Challenge, the ALCF Data Science Program, and the Early Science Program for Aurora.

    See the full article here.


    Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation’s first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America’s scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science. For more visit http://www.anl.gov.

    About ALCF

    The Argonne Leadership Computing Facility’s (ALCF) mission is to accelerate major scientific discoveries and engineering breakthroughs for humanity by designing and providing world-leading computing facilities in partnership with the computational science community.

    We help researchers solve some of the world’s largest and most complex problems with our unique combination of supercomputing resources and expertise.

    ALCF projects cover many scientific disciplines, ranging from chemistry and biology to physics and materials science. Examples include modeling and simulation efforts to:

    Discover new materials for batteries
    Predict the impacts of global climate change
    Unravel the origins of the universe
    Develop renewable energy technologies

    Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science


     
  • richardmitnick 2:32 pm on September 25, 2018 Permalink | Reply
    Tags: Accelerator Science, , , , , , , ,   

    From ALICE at CERN: “What the LHC upgrade brings to CERN” 

    From ALICE at CERN

    25 September 2018
    Rashmi Raniwala
    Sudhir Raniwala

    Six years after its discovery, the Higgs boson has validated another prediction. Soon, an upgrade to the Large Hadron Collider will allow CERN scientists to produce more of these particles for testing the Standard Model of physics.

    Magnets such as this one, mounted on a test stand at Fermilab, are being developed for the High-Luminosity LHC. Photo: Reidar Hahn

    Six years after the Higgs boson was discovered at the CERN Large Hadron Collider (LHC), particle physicists announced last week that they have observed how the elusive particle decays.


    The finding, presented by the ATLAS and CMS collaborations, was the observation of the Higgs boson decaying to fundamental particles known as bottom quarks.

    In 2012, the Nobel-winning discovery of the Higgs boson validated the Standard Model of physics, which also predicts that about 60% of the time a Higgs boson will decay to a pair of bottom quarks.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.



    According to CERN, “testing this prediction is crucial because the result will either lend support to the Standard Model — which is built upon the idea that the Higgs field endows quarks and other fundamental particles with mass — or rock its foundations and point to new physics”.

    The Higgs boson was detected by studying collisions of particles at different energies. But Higgs bosons last for only about one zeptosecond (0.000000000000000000001 seconds), so detecting them and studying their properties requires an incredible amount of energy and advanced detectors. CERN announced earlier this year that the LHC is getting a massive upgrade, which will be completed by 2026.

    Why study particles?

    Particle physics probes nature at extreme scales, to understand the fundamental constituents of matter. Just like grammar and vocabulary guide (and constrain) our communication, particles communicate with each other in accordance with certain rules which are embedded in what are known as the ‘four fundamental interactions’. The particles and three of these interactions are successfully described by a unified approach known as the Standard Model. The SM is a framework that required the existence of a particle called the Higgs boson, and one of the major aims of the LHC was to search for the Higgs boson.

    How are such tiny particles studied?

    Protons are collected in bunches, accelerated to nearly the speed of light and made to collide. Many particles emerge from such a collision, which is termed an event. The emergent particles exhibit an apparently random pattern but follow underlying laws that govern part of their behaviour. Studying the patterns in the emission of these particles helps us understand the properties and structure of particles.

    Initially, the LHC provided collisions at unprecedented energies, allowing physicists to explore new territory. But it is now time to increase the discovery potential of the LHC by recording a larger number of events.


    So, what will an upgrade mean?

    After discovering the Higgs boson, it is imperative to study the properties of the newly discovered particle and its effect on all other particles. This requires a large number of Higgs bosons. The SM has its shortcomings, and there are alternative models that fill these gaps. The validity of these models can be tested experimentally by checking their predictions. Some of these predicted phenomena, including signals for "dark matter", "supersymmetric particles" and other deep mysteries of nature, are very rare and hence difficult to observe, making a High Luminosity LHC (HL-LHC) all the more necessary.

    Imagine trying to find a rare variety of diamond amongst a very large number of apparently similar-looking pieces. The time taken to find the coveted diamond depends on the number of pieces provided per unit time for inspection and on the time taken to inspect each one. To complete the task faster, we need both to increase the supply of pieces and to inspect them faster. In the process, some new varieties of diamond, hitherto unobserved and unknown, may be discovered, changing our perspective on rare diamonds.

    Once upgraded, the rate of collisions will increase, and so will the probability of observing the rarest events. In addition, discerning the properties of the Higgs boson will require a copious supply of them. After the upgrade, the total number of Higgs bosons produced in one year may be about 5 times the number produced currently, and in the same duration the total data recorded may be more than 20 times as large.

    With the proposed luminosity (a measure of the number of protons crossing per unit area per unit time) of the HL-LHC, the experiments will be able to record about 25 times more data in the same period as for LHC running. The beam in the LHC has about 2,800 bunches, each of which contains about 115 billion protons. The HL-LHC will have about 170 billion protons in each bunch, contributing to an increase in luminosity by a factor of 1.5.
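    As a quick consistency check, the bunch figures above work out as follows (a sketch using only the article's own illustrative numbers, not official machine parameters):

```python
# Quick arithmetic behind the beam figures quoted above. All numbers are
# the article's illustrative values, not official machine parameters.
bunches = 2800           # bunches per LHC beam
protons_now = 115e9      # protons per bunch in the LHC today
protons_hl = 170e9       # protons per bunch planned for the HL-LHC

total_now = bunches * protons_now
gain = protons_hl / protons_now

print(f"Protons per beam today:   {total_now:.2e}")
print(f"Bunch-intensity increase: {gain:.2f}x")  # ~1.5x, the factor cited above
```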

    How will it be upgraded?

    The protons are kept together in the bunch using strong magnetic fields of a special kind, formed using quadrupole magnets. Focusing the bunch into a smaller size requires stronger fields, and therefore greater currents, necessitating the use of superconducting cables. Newer technologies and a new material, niobium-tin, will be used to produce the required strong magnetic fields, about 1.5 times the present ones (going from roughly 8 to 12 tesla).

    The creation of long coils for such fields is being tested. New equipment will be installed over 1.2 km of the 27-km LHC ring close to the two major experiments (ATLAS and CMS), for focusing and squeezing the bunches just before they cross.

    CERN crab cavities that will be used in the HL-LHC


    FNAL Crab cavities for the HL-LHC

    Hundred-metre cables of superconducting material (superconducting links) with the capacity to carry up to 100,000 amperes will be used to connect the power converters to the accelerator. The LHC gets the protons from an accelerator chain, which will also need to be upgraded to meet the requirements of the high luminosity.

    Since each bunch is a few centimetres long, a slight tilt will be imparted to the bunches just before they collide, increasing the effective area of overlap and hence the number of collisions. This is being done using 'crab cavities'.
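    The geometric effect that crab cavities compensate can be estimated with a standard accelerator-physics formula; the beam parameters below are illustrative values chosen for the example, not HL-LHC design numbers:

```python
import math

# Geometric luminosity loss from a beam crossing angle, using the standard
# Piwinski-angle estimate. The parameter values are illustrative only,
# not HL-LHC design numbers.
def reduction_factor(crossing_angle, bunch_length, beam_size):
    """Fraction of head-on luminosity retained: 1 / sqrt(1 + phi^2)."""
    phi = (crossing_angle / 2) * (bunch_length / beam_size)  # Piwinski angle
    return 1.0 / math.sqrt(1.0 + phi ** 2)

# Illustrative values: ~7.5 cm bunches, ~10 micron beam size at the
# interaction point, ~500 microradian full crossing angle.
r = reduction_factor(crossing_angle=500e-6, bunch_length=0.075, beam_size=10e-6)
print(f"Luminosity retained without crabbing: {r:.0%}")
# Crab cavities tilt the bunches so they still overlap head-on,
# recovering most of this loss.
```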

    The experimental particle physics community in India has actively participated in the ALICE and CMS experiments. The HL-LHC will require an upgrade of these too. Indian scientists will contribute significantly to the design and fabrication of the new detectors and to the ensuing data analysis.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Meet CERN in a variety of places:


    Cern Courier

    THE FOUR MAJOR PROJECT COLLABORATIONS
    ATLAS
    CERN/ATLAS detector

    ALICE
    CERN ALICE New

    CMS
    CERN/CMS Detector

    LHCb

    CERN/LHCb

    LHC

    CERN map

    CERN LHC Grand Tunnel

    CERN LHC particles


    Quantum Diaries

     
  • richardmitnick 2:02 pm on September 25, 2018 Permalink | Reply
    Tags: Accelerator Science, , , , , , , LSND, , , ,   

    From Symmetry: “How not to be fooled in physics” 

    Symmetry Mag
    From Symmetry

    09/25/18
    Laura Dattaro

    Illustration by Sandbox Studio, Chicago with Ana Kova

    Particle physicists and astrophysicists employ a variety of tools to avoid erroneous results.

    In the 1990s, an experiment conducted in Los Alamos, about 35 miles northwest of the capital of New Mexico, appeared to find something odd.

    Scientists designed the Liquid Scintillator Neutrino Detector experiment at the US Department of Energy’s Los Alamos National Laboratory to count neutrinos, ghostly particles that come in three types and rarely interact with other matter.

    LSND experiment at Los Alamos National Laboratory and Virginia Tech

    LSND was looking for evidence of neutrino oscillation, or neutrinos changing from one type to another.

    Several previous experiments had seen indications of such oscillations, which show that neutrinos have small masses not incorporated into the Standard Model, the ruling theory of particle physics. LSND scientists wanted to double-check these earlier measurements.

    By studying a nearly pure source of one type of neutrinos—muon neutrinos—LSND did find evidence of oscillation to a different type of neutrinos, electron neutrinos. However, they found many more electron neutrinos in their detector than predicted, creating a new puzzle.

    This excess could have been a sign that neutrinos oscillate between not three but four different types, suggesting the existence of a possible new type of neutrino, called a sterile neutrino, which theorists had suggested as a possible way to incorporate tiny neutrino masses into the Standard Model.

    Or there could be another explanation. The question is: What? And how can scientists guard against being fooled in physics?

    Brand new thing

    Many physicists are looking for results that go beyond the Standard Model. They come up with experiments to test its predictions; if what they find doesn’t match up, they have potentially discovered something new.

    “Do we see what we expected from the calculations if all we have there is the Standard Model?” says Paris Sphicas, a researcher at CERN.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.


    Standard Model of Particle Physics from Symmetry Magazine

    “If the answer is yes, then it means we have nothing new. If the answer is no, then you have the next question, which is, ‘Is this within the uncertainties of our estimates? Could this be a result of a mistake in our estimates?’ And so on and so on.”

    A long list of possible factors can trick scientists into thinking they’ve made a discovery. A big part of scientific research is identifying them and finding ways to test what’s really going on.

    “The community standard for discovery is a high bar, and it ought to be,” says Yale University neutrino physicist Bonnie Fleming. “It takes time to really convince ourselves we’ve really found something.”

    In the case of the LSND anomaly, scientists wonder whether unaccounted-for background events tipped the scales or if some sort of mechanical problem caused an error in the measurement.

    Scientists have designed follow-up experiments to see if they can reproduce the result. An experiment called MiniBooNE, hosted by Fermi National Accelerator Laboratory, recently reported seeing signs of a similar excess. Other experiments, such as the MINOS experiment, also at Fermilab, have not seen it, complicating the search.

    FNAL/MiniBooNE

    FNAL Minos map


    FNAL/MINOS


    FNAL MINOS Far Detector in the Soudan Mine in northern Minnesota

    “[LSND and MiniBooNE] are clearly measuring an excess of events over what they expect,” says MINOS co-spokesperson Jenny Thomas, a physicist at University College London. “Are those important signal events, or are they a background they haven’t estimated properly? That’s what they are up against.”

    Managing expectations

    Much of the work in understanding a signal involves preparatory work before one is even seen.

    In designing an experiment, researchers need to understand what physics processes can produce or mimic the signal being sought, events that are often referred to as “background.”

    Physicists can predict backgrounds through simulations of experiments. Some types of detector backgrounds can be identified through “null tests,” such as pointing a telescope at a blank wall. Other backgrounds can be identified through tests with the data itself, such as so-called “jack-knife tests,” which involve splitting data into subsets—say, data from Monday and data from Tuesday—which by design must produce the same results. Any inconsistencies would warn scientists about a signal that appears in just one subset.
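    The jack-knife idea can be sketched in a few lines (a toy illustration on simulated data, not any experiment's actual analysis code):

```python
import random

# Toy "jack-knife" consistency test: split the data into two subsets that
# should be statistically identical and compare their means against the
# spread expected from statistics alone. Simulated data, not a real pipeline.
random.seed(42)
data = [random.gauss(5.0, 1.0) for _ in range(10000)]  # toy measurements

monday, tuesday = data[0::2], data[1::2]  # arbitrary split, e.g. by day taken

def mean(xs):
    return sum(xs) / len(xs)

def std_error(xs):
    m = mean(xs)
    var = sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    return (var / len(xs)) ** 0.5

diff = mean(monday) - mean(tuesday)
sigma = (std_error(monday) ** 2 + std_error(tuesday) ** 2) ** 0.5
print(f"Subset difference: {diff / sigma:+.1f} sigma")
# A difference of many sigma would flag a "signal" present in only one subset.
```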

    Researchers looking at a specific signal work to develop a deep understanding of what other physics processes could produce the same signature in their detector. MiniBooNE, for example, studies a beam primarily made of muon neutrinos to measure how often those neutrinos oscillate to other flavors. But it will occasionally pick up stray electron neutrinos, which look like muon neutrinos that have transformed. Beyond that, other physics processes can mimic the signal of an electron neutrino event.

    “We know we’re going to be faked by those, so we have to do the best job to estimate how many of them there are,” Fleming says. “Whatever excess we find has to be in addition to those.”

    Even more variable than a particle beam: human beings. While science strives to be an objective measurement of facts, the process itself is conducted by a collection of people whose actions can be colored by biases, personal stories and emotion. A preconceived notion that an experiment will (or won’t) produce a certain result, for example, could influence a researcher’s work in subtle ways.

    “I think there’s a stereotype that scientists are somehow dispassionate, cold, calculating observers of reality,” says Brian Keating, an astrophysicist at University of California San Diego and author of the book Losing the Nobel Prize, which chronicles how the desire to make a prize-winning discovery can steer a scientist away from best practices. “In reality, the truth is we actually participate in it, and there are sociological elements at work that influence a human being. Scientists, despite the stereotypes, are very much human beings.”

    Staying cognizant of this fact and incorporating methods for removing bias are especially important if a particular claim upends long-standing knowledge—such as, for example, our understanding of neutrinos. In these cases, scientists know to adhere to the adage: Extraordinary claims require extraordinary evidence.

    “If you’re walking outside your house and you see a car, you probably think, ‘That’s a car,’” says Jonah Kanner, a research scientist at Caltech. “But if you see a dragon, you might think, ‘Is that really a dragon? Am I sure that’s a dragon?’ You’d want a higher level of evidence.”
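    In Bayesian terms, the car-versus-dragon intuition is just Bayes' rule applied with different priors; the numbers below are invented purely for illustration:

```python
# Bayes' rule version of "extraordinary claims require extraordinary
# evidence". The priors and likelihoods are invented for illustration.
def posterior(prior, p_obs_if_true, p_obs_if_false):
    """Probability the claim is true, given the observation."""
    num = prior * p_obs_if_true
    return num / (num + (1 - prior) * p_obs_if_false)

# The same quality of evidence (the observation is ~100x more likely if
# the claim is true) supports a mundane claim far better than a wild one.
car = posterior(prior=0.5, p_obs_if_true=0.99, p_obs_if_false=0.01)
dragon = posterior(prior=1e-6, p_obs_if_true=0.99, p_obs_if_false=0.01)
print(f"P(car | sighting)    = {car:.4f}")
print(f"P(dragon | sighting) = {dragon:.6f}")
# The dragon claim stays implausible: it needs far stronger evidence.
```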


    Dragon or discovery?

    Physicists have been burned by dragons before. In 1969, for example, a scientist named Joe Weber announced that he had detected gravitational waves: ripples in the fabric of space-time first predicted by Albert Einstein in 1916. Such a detection, which many had thought was impossible to make, would have proved a key tenet of relativity. Weber rocketed to momentary fame, until other physicists found they could not replicate his results.

    The false discovery rocked the gravitational wave community, which, over the decades, became increasingly cautious about making such announcements.

    So in 2009, as the Laser Interferometer Gravitational Wave Observatory, or LIGO, came online for its next science run, the scientific collaboration came up with a way to make sure collaboration members stayed skeptical of their results. They developed a method of adding a false or simulated signal into the detector data stream without alerting the majority of the 800 or so researchers on the team. They called it a blind injection. The rest of the members knew an injection was possible, but not guaranteed.

    “We’d been not detecting signals for 30 years,” Kanner, a member of the LIGO collaboration, says. “How clear or obvious would the signature have to be for everyone to believe it?… It forced us to push our algorithms and our statistics and our procedures, but also to test the sociology and see if we could get a group of people to agree on this.”

    In late 2010, the team got the alert they had been waiting for: The computers detected a signal. For six months, hundreds of scientists analyzed the results, eventually concluding that the signal looked like gravitational waves. They wrote a paper detailing the evidence, and more than 400 team members voted on its approval. Then a senior member told them it had all been faked.

    Picking out and spending so much time examining such an artificial signal may seem like a waste of time, but the test worked just as intended. The exercise forced the scientists to work through all of the ways they would need to scrutinize a real result before one ever came through. It forced the collaboration to develop new tests and approaches to demonstrating the consistency of a possible signal in advance of a real event.

    “It was designed to keep us honest in a sense,” Kanner says. “Everyone to some extent goes in with some guess or expectation about what’s going to come out of that experiment. Part of the idea of the blind injection was to try and tip the scales on that bias, where our beliefs about whether we thought nature should produce an event would be less important.”
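    A blind injection can be caricatured in a few lines (a toy model: real LIGO injections used simulated waveforms in the actual detector data stream, and the analysis is far more sophisticated):

```python
import random

# Toy sketch of a blind injection: a small team secretly adds a simulated
# signal to the noise stream, and analysts must flag candidates without
# knowing whether any injection happened. Purely illustrative, not LIGO code.
random.seed(7)
noise = [random.gauss(0.0, 1.0) for _ in range(2000)]  # detector noise stand-in

def inject(stream, start, amplitude=8.0, width=5):
    """Add a simple rectangular 'signal' (a stand-in for a real waveform)."""
    out = list(stream)
    for i in range(start, start + width):
        out[i] += amplitude
    return out

data = inject(noise, start=1200)  # only the injection team knows this happened

# Analysts' side: scan for excursions well above the noise level.
candidates = [i for i, x in enumerate(data) if abs(x) > 5.0]
print("Candidate samples:", candidates)
# Every candidate is scrutinised identically, real or injected; the
# injection is only revealed once the analysis is frozen.
```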

    All of the hard work paid off: In September 2015, when an authentic signal hit the LIGO detectors, scientists knew what to do. In 2016, the collaboration announced the first confirmed direct detection of gravitational waves. One year later, the discovery won the Nobel Prize.


    VIRGO Gravitational Wave interferometer, near Pisa, Italy

    Caltech/MIT Advanced aLigo Hanford, WA, USA installation


    Caltech/MIT Advanced aLigo detector installation Livingston, LA, USA

    Cornell SXS, the Simulating eXtreme Spacetimes (SXS) project

    Gravitational waves. Credit: MPI for Gravitational Physics/W.Benger

    ESA/eLISA the future of gravitational wave research

    Skymap showing how adding Virgo to LIGO helps in reducing the size of the source-likely region in the sky. (Credit: Giuseppe Greco, Virgo Urbino group)

    No easy answers

    While blind injections worked for the gravitational waves community, each area of physics presents its own unique challenges.

    Neutrino physicists have an extremely small sample size with which to work, because their particles interact so rarely. That's why experiments such as the NOvA experiment and the upcoming Deep Underground Neutrino Experiment use such enormous detectors.

    FNAL/NOvA experiment map


    FNAL NOvA detector in northern Minnesota


    FNAL NOvA Near Detector

    Astronomers have even fewer samples: They have just one universe to study, and no way to conduct controlled experiments. That’s why they conduct decades-long surveys, to collect as much data as possible.

    Researchers working at the Large Hadron Collider have no shortage of interactions to study—an estimated 600 million events are detected every second.

    LHC

    CERN map


    CERN LHC Tunnel

    CERN LHC particles

    But due to the enormous size, cost and complexity of the technology, scientists have built only one LHC. That’s why inside the collider sit multiple different detectors, which can check one another’s work by measuring the same things in a variety of ways with detectors of different designs.

    CERN ATLAS


    CERN/CMS Detector



    CERN ALICE detector


    CERN LHCb chamber, LHC

    While there are many central tenets to checking a result—knowing your experiment and background well, running simulations and checking that they agree with your data, testing alternative explanations of a suspected result—there’s no comprehensive checklist that every physicist performs. Strategies vary from experiment to experiment, among fields and over time.

    Scientists must do everything they can to test a result, because in the end, it will need to stand up to the scrutiny of their peers. Fellow physicists will question the new result, subject it to their own analyses, try out alternative interpretations, and, ultimately, try to repeat the measurement in a different way. Especially if they’re dealing with dragons.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 12:46 pm on September 23, 2018 Permalink | Reply
    Tags: Accelerator Science, At CERN-The hunt for leptoquarks is on, , , ,   

    From CERN: “The hunt for leptoquarks is on” 

    Cern New Bloc

    Cern New Particle Event

    CERN New Masthead

    From CERN

    19 Sep 2018
    Achintya Rao

    A collision event recorded by CMS at the start of the data-taking run of 2018. CMS sifts through such collisions up to 40 million times per second looking for signs of hypothetical particles like leptoquarks (Image: Thomas McCauley/Tai Sakuma/CMS/CERN)

    Matter is made of elementary particles, and the Standard Model of particle physics states that these particles occur in two families: leptons (such as electrons and neutrinos) and quarks (which make up protons and neutrons).

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.


    Standard Model of Particle Physics from Symmetry Magazine

    Under the Standard Model, these two families are totally distinct, with different electric charges and quantum numbers, but have the same number of generations (see image above).

    However, some theories that go beyond the Standard Model, including certain “grand unified theories”, predict that leptons and quarks merge at high energies to become leptoquarks. These leptoquarks are proposed in theories attempting to unify the strong, weak and electromagnetic forces.

    Such “unifications” are not unusual in physics. Electricity and magnetism were famously unified in the 19th century into a single force known as electromagnetism, via Maxwell’s elegant mathematical formulae. In the case of leptoquarks, these hybrid particles are thought to have the properties of both leptons and quarks, as well as the same number of generations. This would not only allow them to “split” into the two types of particles but would also allow leptons to change into quarks and vice versa. Indeed, anomalies detected by the LHCb experiment as well as by Belle and Babar in measurements of the properties of B mesons could be also explained by the existence of these hypothesised particles.

    Standard Model Image by Daniel Dominguez-CERN

    KEK Belle 2 detector, in Tsukuba, Ibaraki Prefecture, Japan


    SLAC/Babar

    If leptoquarks exist, they would be very heavy and quickly transform, or “decay”, into more stable leptons or quarks. Previous experiments at the SPS and LEP at CERN, HERA at DESY and the Tevatron at Fermilab have looked at decays to first- and second-generation particles. Searches for third-generation leptoquarks (LQ3) were first performed at the Tevatron, and are now being explored at the Large Hadron Collider (LHC).

    The Super Proton Synchrotron (SPS), CERN’s second-largest accelerator.


    CERN LEP Collider

    DESY HERA , 1992 to 2007

    FNAL/Tevatron map


    FNAL/Tevatron

    Since leptoquarks would transform into a lepton and a quark, searches at the LHC look for telltale signatures in the distributions of these "decay products". In the case of third-generation leptoquarks, the lepton could be a tau or a tau neutrino, while the quark could be a top or a bottom quark.

    In a recent paper [The European Physical Journal C], using data collected in 2016 at a collision energy of 13 TeV, the Compact Muon Solenoid (CMS) collaboration at the LHC presented the results of searches for third-generation leptoquarks, where every LQ3 produced in the collisions initially transformed into a tau-top pair.

    CERN/CMS Detector

    Because colliders produce particles and antiparticles at the same time, CMS specifically searched for the presence of leptoquark-antileptoquark pairs in collision events containing the remnants of a top quark, an antitop quark, a tau lepton and an antitau lepton. Further, because leptoquarks have never been seen before and their properties remain a mystery, physicists rely on sophisticated calculations based on known parameters to look for them. These parameters include the energy of the collisions and expected background levels, constrained by the possible values for the mass and spin of the hypothetical particle. Through these calculations, the scientists can estimate how many leptoquarks might have been produced in a particular data set of proton-proton collisions and how many might have been transformed into the end products their detectors can look for.
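    The counting logic described here can be written as a back-of-envelope estimate; all numerical inputs below are invented placeholders for illustration, not CMS values:

```python
# Back-of-envelope version of the estimate described above: expected signal
# events in a data set. All numerical inputs are invented placeholders,
# not CMS values.
def expected_events(sigma_fb, lumi_invfb, branching=1.0, efficiency=1.0):
    """N = cross-section x integrated luminosity x branching^2 x efficiency.

    The branching fraction enters squared because the leptoquarks are
    produced (and must be identified) in pairs.
    """
    return sigma_fb * lumi_invfb * branching ** 2 * efficiency

# E.g. a 10 fb pair-production cross-section, a 36 fb^-1 data set,
# decays always to top + tau, and a 10% selection efficiency:
n = expected_events(sigma_fb=10.0, lumi_invfb=36.0, branching=1.0, efficiency=0.1)
print(f"Expected signal events: {n:.0f}")
```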

    “Leptoquarks have become one of the most tantalising ideas for extending our calculations, as they make it possible to explain several observed anomalies. At the LHC we are making every effort to either prove or exclude their existence,” says Roman Kogler, a physicist on CMS who worked on this search.

    After sifting through collision events looking for specific characteristics, CMS saw no excess in the data that might point to the existence of third-generation leptoquarks. The scientists were therefore able to conclude that any LQ3 that transforms exclusively to a top-tau pair would need to be at least 900 GeV in mass, around five times as heavy as the top quark, the heaviest particle we have observed.

    The limits placed by CMS on the mass of third-generation leptoquarks are the tightest so far. CMS has also searched for third-generation leptoquarks that transform into a tau lepton and a bottom quark, concluding that such leptoquarks would need to be at least 740 GeV in mass. However, it is important to note that this result comes from the examination of only a fraction of LHC data at 13 TeV, from 2016. Further searches from CMS and ATLAS that take into account data from 2017 as well as the forthcoming run of 2018 will ensure that the LHC can continue to test theories about the fundamental nature of our universe.

    CERN/ATLAS detector

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Meet CERN in a variety of places:

    Quantum Diaries

    Cern Courier

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS
    CERN ATLAS New

    ALICE
    CERN ALICE New

    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    LHC

    CERN map

    CERN LHC Grand Tunnel

    CERN LHC particles

    OTHER PROJECTS AT CERN

    CERN AEGIS

    CERN ALPHA

    CERN AMS

    CERN ASACUSA

    CERN ATRAP

    CERN AWAKE

    CERN CAST Axion Solar Telescope

    CERN CLOUD

    CERN COMPASS

    CERN DIRAC

    CERN ISOLDE

    CERN LHCf

    CERN NA62

    CERN NTOF

    CERN TOTEM

    CERN UA9

    CERN Proto Dune

     