Tagged: HEP Toggle Comment Threads | Keyboard Shortcuts

  • richardmitnick 8:25 am on October 23, 2017 Permalink | Reply
    Tags: , , , HEP, ,   

    From FNAL: “Three Fermilab scientists awarded $17.5 million in SciDAC funding” 

    FNAL II photo

    FNAL Art Image
    FNAL Art Image by Angela Gonzales

    Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

    October 23, 2017
    Troy Rummler

    Three Fermilab-led collaborations have been awarded a combined $17.5 million over three years by the U.S. Department of Energy’s Scientific Discovery through Advanced Computing (SciDAC) program. Researchers James Amundson, Giuseppe Cerati and James Kowalkowski will use the funds to support collaborations with external partners in computer science and applied mathematics to address problems in high-energy physics with advanced computing solutions.

    The awards, two of which can be extended to five years, mark the fourth consecutive cycle of successful bids from Fermilab scientists, who this year also bring home the majority of high-energy physics SciDAC funding disbursed. The series of computational collaborations has enabled Fermilab to propose progressively more sophisticated projects. One, an accelerator simulation project, builds directly on previous SciDAC-funded projects, while the other two projects are new: one to speed up event reconstruction and one to design new data analysis workflows.

    “Not only have we had successful projects for the last decade,” said Panagiotis Spentzouris, head of Fermilab’s Scientific Computing Division, “but we acquired enough expertise that we’re now daring to do things that we wouldn’t have dared before.”

    James Amundson

    SciDAC is enabling James Amundson and his team to enhance both the depth and accuracy of simulation software to meet the challenges of emerging accelerator technology.

    His project ComPASS4 will do this by first developing integrated simulations of whole accelerator complexes, ensuring the success of PIP-II upgrades, for example, by simulating the effects of unwanted emitted radiation. PIP-II is the lab’s plan for providing powerful, high-intensity proton beams for the international Long-Baseline Neutrino Facility and Deep Underground Neutrino Experiment. The work also supports long-term goals for accelerators now in various stages of development.

    “We will be able to study plasma acceleration in much greater detail than currently possible, then combine those simulations with simulations of the produced beam in order to create a virtual prototype next-generation accelerator,” Amundson said. “None of these simulations would have been tractable with current software and high-performance computing hardware.”

    Giuseppe Cerati

    The next generation of high-energy physics experiments, including the Deep Underground Neutrino Experiment, will produce an unprecedented amount of data, which needs to be reconstructed into useful information, including a particle’s energy and trajectory. Reconstruction takes an enormous amount of computing time and resources.

    “Processing this data in real time, and even offline, will become unsustainable with the current computing model,” Giuseppe Cerati said. He has therefore proposed to lead an exploration of modern computing architectures to speed up reconstruction.

    “Without a fundamental transition to faster processing, we would face significant reductions in efficiency and accuracy, which would have a big impact on an experiment’s discovery potential,” he added.

    James Kowalkowski

    James Kowalkowski’s group will aim to redefine data analysis, enhancing optimization procedures to use computing resources in ways that have been unavailable in the past. This means fundamental changes in computational techniques and software infrastructure.

    In this new way of working, rather than treating data sets as collections of files used to transfer chunks of information from one processing or analysis stage to the next, researchers can view data as immediately available and movable around a unified, large-scale distributed application. This will permit scientists within collaborations to process large portions of collected experimental data in short order — nearly on demand.

    “Without the special funding from SciDAC to pull people from diverse backgrounds together, it would be nearly impossible to carry out this work,” Kowalkowski said.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    FNAL Icon
    Fermilab Campus

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics. Fermilab is America’s premier laboratory for particle physics and accelerator research, funded by the U.S. Department of Energy. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.

     
  • richardmitnick 1:16 pm on October 20, 2017 Permalink | Reply
    Tags: , , , HEP, , , Scientists make rare achievement in study of antimatter,   

    From Symmetry: “Scientists make rare achievement in study of antimatter” 


    Symmetry

    10/19/17
    Kimber Price

    Maximilien Brice, Julien Marius Ordan, CERN

    Through hard work, ingenuity and a little cooperation from nature, scientists on the BASE experiment vastly improved their measurement of a property of protons and antiprotons.

    BASE: Baryon Antibaryon Symmetry Experiment. Maximilien Brice

    Scientists at CERN are celebrating a recent, rare achievement in precision physics: Collaborators on the BASE experiment measured a property of antimatter 350 times as precisely as it had ever been measured before.

    The BASE experiment looks for undiscovered differences between protons and their antimatter counterparts, antiprotons. The result, published in the journal Nature, uncovered no such difference, but BASE scientists say they are hopeful the leap in the effectiveness of their measurement has potentially brought them closer to a discovery.

    “According to our understanding of the Standard Model [of particle physics], the Big Bang should have created exactly the same amount of matter and antimatter, but [for the most part] only matter remains,” says BASE Spokesperson Stefan Ulmer.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    This is strange because when matter and antimatter meet, they annihilate one another. Scientists want to know how matter came to dominate our universe.

    “One strategy to try to get hints to understand the mechanisms behind this matter-antimatter symmetry is to compare the fundamental properties of matter and antimatter particles with ultra-high precision,” Ulmer says.

    Scientists on the BASE experiment study a property called the magnetic moment. The magnetic moment is an intrinsic value of particles such as protons and antiprotons that determines how they will orient in a magnetic field, like a compass. Protons and antiprotons should behave exactly the same, other than their charge and direction of orientation; any differences in how they respond to the laws of physics could help explain why our universe is made mostly of matter.

    This is a challenging measurement to make with a proton. Measuring the magnetic moment of an antiproton is an even bigger task. To prevent antiprotons from coming into contact with matter and annihilating, scientists need to house them in special electromagnetic traps.
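
    To pin down what is actually being measured: the proton’s magnetic moment is conventionally expressed through its g-factor and the nuclear magneton, and in a Penning-type trap the g-factor follows from the ratio of two measurable frequencies. A minimal sketch of these standard textbook relations (the actual BASE trap setup is considerably more elaborate):

        \mu_p = \frac{g_p}{2}\,\mu_N, \qquad \mu_N = \frac{e\hbar}{2 m_p}, \qquad \frac{g_p}{2} = \frac{\nu_L}{\nu_c},

    where \nu_L is the spin-precession (Larmor) frequency and \nu_c the cyclotron frequency of the trapped particle in the same magnetic field; the analogous frequency ratio for a trapped antiproton gives its magnetic moment, and comparing the two is the matter–antimatter test described here.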

    While antiprotons generally last less than a second, the ones used in this study were placed in a unique reservoir trap in 2015 and used one by one, as needed, for experiments. The trapped antimatter survived for more than 400 days.

    During the last year, Ulmer and his team worked to improve the precision of the most sophisticated techniques developed for this measurement in the last decade.

    They did this by improving their cooling methods. Antiprotons at temperatures close to absolute zero move less than room-temperature ones, making them easier to measure.

    Previously, BASE scientists had cooled each individual antiproton before measuring it and moving on to the next. With the improved trap, the antiprotons stayed cool long enough for the scientists to swap an antiproton for a new one as soon as it became too hot.

    “Developing an instrument stable enough to keep the antiproton close to absolute zero for 4-5 days was the major goal,” says Christian Smorra, the first author of the study.

    This allowed them to collect data more rapidly than ever before. Combining this instrument with a new technique that measures two particles simultaneously allowed them to break their own record from last year’s measurement by a long shot.

    “This is very rare in precision physics, where experimental efforts seldom report improvements by factors greater than 100,” Ulmer says.

    The results confirm that the two particles behave exactly the same, as the laws of physics would predict. So the mystery of the imbalance between matter and antimatter remains.

    Ulmer says that the group will continue to improve the precision of their work. He says that, in five to 10 years, they should be able to make a measurement at least twice as precise as this latest one. It could be within this range that they will be able to detect subtle differences between protons and antiprotons.

    “Antimatter is a very unique probe,” Ulmer says. “It kind of watches the universe through very different glasses than any matter experiments. With antimatter research, we may be the only ones to uncover physics treasures that would help explain why we don’t have antimatter anymore.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 2:17 pm on October 17, 2017 Permalink | Reply
    Tags: , , , HBOOK and PAW and ROOT and GEANT, HEP, , , , René Brun   

    From ALICE at CERN: “40 Years of Large Scale Data Analysis in HEP: Interview with René Brun” 

    CERN
    CERN New Masthead

    16 October 2017
    Virginia Greco

    Over a 40-year career at CERN, René Brun developed a number of software packages that became widely used in high-energy physics. For these fundamental contributions he was recently awarded a special prize of the EPS High Energy Particle Physics Division. We talked with him about the key events of this (hi)story.

    René Brun giving a seminar at CERN (on October 4, 2017) about “40 Years of Large Scale Data Analysis in HEP – the HBOOK, Paw and Root Story”. [Credit: Virginia Greco]

    It is hard to imagine that one and the same person can be behind many of the most important and most widely used software packages developed at CERN and in high-energy physics: HBOOK, PAW, ROOT and GEANT. This passionate and visionary person is René Brun, now an honorary member of CERN, who was recently awarded a special prize of the EPS High Energy Particle Physics Division “for his outstanding and original contributions to the software tools for data management, detector simulation, and analysis that have shaped particle and high energy physics experiments for many decades”. Over a 40-year career at CERN, he worked with many brilliant scientists, and we cannot forget that the realization of such endeavors is always the product of a collaborative effort. Nevertheless, René has had the undeniable merit of conceiving new ideas, proposing projects and working hard and enthusiastically to turn them into reality.

    One of his creations, ROOT, is a data analysis tool widely used in high-energy and nuclear physics experiments, at CERN and in other laboratories. It has already spread beyond physics and is now being applied in other scientific fields and even in finance. GEANT is an extremely successful software package developed by René Brun, which allows physicists to simulate experiments and particle interactions in detectors. Its latest version, GEANT4, is currently the first choice of particle physicists dealing with detector simulations.
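
    To give a concrete flavour of what “a data analysis tool” means here, below is a minimal ROOT macro in C++ showing the two ingredients this story keeps returning to: histograms (the direct descendants of HBOOK) and tree structures for storing event data. The file name, variable names and toy “resonance” are purely illustrative, not taken from any real experiment.

        // histo_and_tree.C -- minimal ROOT macro: book a histogram, fill a tree,
        // and save both to a file. Run with:  root -l -q histo_and_tree.C
        #include "TFile.h"
        #include "TH1F.h"
        #include "TTree.h"
        #include "TRandom3.h"

        void histo_and_tree() {
          TFile out("example.root", "RECREATE");      // output file (illustrative name)

          // Histogram: 100 bins between 0 and 10 GeV, axis titles in ROOT's ";x;y" style.
          auto h = new TH1F("h_mass", "Toy invariant mass;m [GeV];events", 100, 0.0, 10.0);

          // Tree with a single branch holding one float per "event".
          float mass = 0.f;
          auto tree = new TTree("events", "toy event record");
          tree->Branch("mass", &mass, "mass/F");

          TRandom3 rng(42);
          for (int i = 0; i < 10000; ++i) {           // generate toy events
            mass = rng.Gaus(3.1, 0.3);                // fake resonance peak
            h->Fill(mass);
            tree->Fill();
          }

          out.Write();                                // persist histogram and tree
          out.Close();                                // the file owns and cleans up both
        }

    Opening example.root in a ROOT session and drawing h_mass or the events tree is, in spirit, exactly what HBOOK and PAW users did on line printers and Tektronix screens decades earlier.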

    But before ROOT and GEANT4, which are well known even among the youngest physicists, many other projects had been proposed and software tools developed. It is a fascinating story, which René was invited to tell in a recent colloquium organized at CERN by the EP department.

    As he recounts, it all started in 1973, when he was hired into the Data Handling (DD) division at CERN to work with Carlo Rubbia on the R602 experiment at the ISR. His duty was to help develop a special hardware processor for the online reconstruction of the collision patterns. But since this development was moving slowly and was not occupying much of his work time, René was asked to write some software for the event reconstruction in multiwire proportional chambers. “At that time, I hated software,” René confesses smiling, “I had written software during my PhD thesis, while studying in Clermont-Ferrand and working at CERN during the weekends, and I hadn’t really enjoyed it. I had joined Rubbia’s group with the ‘promise’ that I would work on hardware, but very quickly I became a software guy again…”

    In a short time, René implemented in software (programming in Fortran IV) what they could not realize in hardware and, in addition, he developed a histogram package called HBOOK. This made it possible to perform a very basic analysis of the data: creating histograms, filling them and sending the output to a line printer. He also wrote a program called HPLOT, which specialized in drawing the histograms generated by HBOOK.

    At that time, there were no graphic devices, so the only way to visualize histograms was printing them using a line printer, and programs were written in the form of punched cards.

    René remembers with affection the time spent punching cards, not for the procedure itself, which was slow and quite tedious, but for the long chats he used to have in the room where the card punchers and printers of the DD department were sitting, as well as in the cafeteria nearby. In those long hours, he could discuss ideas and comment on new technologies with colleagues.

    Huge progress was made possible by the introduction of the teletype, which replaced card punchers. Users could generate programs on a disk file and communicate with a central machine, called FOCUS, while – at the same time – seeing on a roll of paper what they were doing, as on a normal typewriter. “The way it worked can make people smile today,” René recounts. “To log in to FOCUS, one had to type a command which caused a red light to flash in the computer centre. Seeing the light, the operator would mount the tape of the connected person into the memory of the machine, and that person could thus run a session on the disk. When the user logged out, the session was again dumped to tape. You can imagine the traffic! But this was still much faster than punching cards.”

    Some time later, the teletype was in turn replaced by a Tektronix 4010 terminal, which brought a big revolution, since it made it possible to display results in graphical form. This new, very expensive device allowed René to speed up the development of his software: HBOOK first, then another package called ZBOOK and the first version of GEANT. Created in 1974 with his colleagues in the Electronic Experiments (EE) group, GEANT1 was a tool for performing simple detector simulations. Gradually, they added features to this software and became able to generate collision simulations: GEANT2 was born.

    In 1975 René joined the NA4 experiment, a deep inelastic muon scattering experiment in the North Area, led by Carlo Rubbia. There he collaborated on the development of new graphics tools that allowed histograms to be printed on a device called a CalComp plotter. This machine, which worked with a 10-meter-long roll of paper, offered much better resolution than line printers, but was very expensive. In 1979 a microfilm system was introduced: histograms saved on film could be inspected before being sent to the plotter, so that only the interesting ones were printed. This reduced the cost of using the CalComp.

    René was then supposed to follow Rubbia to the UA1 experiment, for which he had been doing many simulations – “Without knowing that I was simulating for UA1,” René highlights. But instead, at the end of 1980, he joined the OPAL experiment, where he performed all the simulations and created GEANT3.

    While working on the HBOOK system, in 1974 René had developed a memory management and I/O system called ZBOOK. This tool was an alternative to the HYDRA system, which was being developed in the bubble chambers group by the late Julius Zoll (also author of another management system called Patchy).

    Thinking that it made no sense to have two competing systems, in 1981 the late Emilio Pagiola proposed the development of a new software package called GEM. While three people were working hard on the GEM project, René and Julius together started running benchmarks to compare their systems, ZBOOK and HYDRA, with GEM. Through these tests, they came to the conclusion that the new system was far slower than theirs.

    In 1983 Ian Butterworth, the then Director for Computing, decided that only the ZBOOK system would be supported at CERN and that GEM had to be stopped, and HYDRA was frozen. “My group leader, Hans Grote, came to my office, shook my hand and told me: ‘Congratulations René, you won.’ But I immediately thought that this decision was not fair, because actually both systems had good features and Julius Zoll was a great software developer.”

    As a consequence of this decision, René and Julius joined forces to develop a package integrating the best features of both ZBOOK and HYDRA. The new project was called ZEBRA, from the combination of the names of the two original systems. “When Julius and I announced that we were collaborating, Ian Butterworth immediately called both of us to his office and told us that, if in six months the ZEBRA system was not functioning, we would be fired from CERN. But indeed, less than two months later we were already able to show a running primary version of the ZEBRA system.”

    At the same time, histogram and visualization tools were under development. René put together an interactive version of HBOOK and HPLOT, called HTV, which ran on Tektronix machines. But in 1982 the advent of personal workstations marked a revolution. The first personal workstation introduced in Europe, the Apollo, represented a leap in terms of characteristics and performance: it was faster and had more memory and a better user interface than any previous device. “I was invited by the Apollo company to go to Boston and visit them,” René recounts. “When I first saw the Apollo workstation, I was shocked. I immediately realized that it could speed up our development by a factor of 10. I put myself at work and I think that in just three days I adapted some 20000 lines of code for it.”

    René’s work adapting HTV for the Apollo workstation attracted the interest of the late Rudy Böck, Luc Pape and Jean-Pierre Revol from the UA1 collaboration, who also suggested some improvements. Therefore, in 1984 they drew up a proposal for a new package, based on HBOOK and ZEBRA, that they called PAW, for Physics Analysis Workstation.

    The PAW team: (from the left) René Brun, Pietro Zanarini, Olivier Couet (standing) and Carlo Vandoni.

    After an initial period of uncertainty, the PAW project developed quickly and many new features were introduced, thanks also to the increasing memory of the workstations. “At a certain point, the PAW software was growing so fast that we started to receive complaints from users who could not keep up with the development,” says René smiling. “Maybe we were a bit naïve, but certainly full of enthusiasm.”

    The programming language generally used for scientific computing was FORTRAN. In particular, at that time FORTRAN 77 (introduced in 1977) was widespread in the high-energy physics community and the main reason for its success was the fact that it was well structured and quite easy to learn. Besides, very efficient implementations of it were available on all the machines used at the time. As a consequence, when the new FORTRAN 90 appeared, it seemed obvious that it would replace FORTRAN 77 and that it would be as successful as the previous version. “I remember well the leader of the computing division, Paolo Zanella, saying: ‘I don’t know what the next programming language will do but I know its name: FORTRAN.’”

    In 1990 and 1991 René, together with Mike Metcalf, a great expert in FORTRAN, worked hard to adapt the ZEBRA package to FORTRAN 90. But this effort did not lead to a satisfactory result, and discussions arose about whether to keep working with FORTRAN or to move to another language. It was the period when object-oriented programming was taking its first steps, and also when Tim Berners-Lee joined René’s group.

    Berners-Lee was supposed to develop a documentation system, called XFIND, to replace the previous FIND system, which could run only on IBM machines, with something usable on other devices. He believed, though, that the procedure he was supposed to implement was a bit clumsy and certainly not the best approach to the problem. So he proposed a different solution with a more decentralized and adaptable approach, which first of all required work on standardization. In this context, Berners-Lee developed the now-famous idea of World Wide Web servers and clients, implemented using an object-oriented language (Objective-C).

    It was a very intense period, because the phase of design and simulation of the experiments for the new LHC accelerator had been launched. It was important to make a decision about the programming language and the software tools to use in these new projects.

    At the Erice workshop, organized by INFN in November 1990, and then at the Computing in High Energy Physics (CHEP) conference in Annecy, France, in September 1992, the world’s high-energy physics “software gurus” gathered to discuss programming languages and possible directions for software in HEP. Among the many languages proposed were Eiffel, Prolog, Modula-2 and others.

    In 1994 two Research and Development (RD) projects were launched: RD44, with the objective of implementing a new version of GEANT in C++ (which would become GEANT4), and RD45, aiming to investigate object-oriented database solutions for the LHC experiments.

    According to René, his division was split into three opinion groups: those who wanted to stay with FORTRAN 90, those who bet on C++ and those who were interested in using commercial products. “I presented a proposal to develop a package that would take PAW to the OO world. But the project, which I called ZOO, was rejected and I was even invited to take a sabbatical leave,” René admits.

    This blow, though, later proved to be a stroke of luck for René. His division leader, David Williams, suggested that he join the NA49 experiment in the North Area, which needed somebody to help develop its software. At first, he refused. He had been leading both the GEANT and PAW projects for years, doing simulation and developing software for many different groups and applications, so going back to work on a single experiment seemed to him a big limitation.

    But on second thoughts he realized that it was an opportunity to take some time to develop new software, with total freedom. He went to visit the NA49 building on the Prevessin site and, seeing pine trees and squirrels from the windows, he felt that it was indeed the kind of quiet environment he needed for his new project. Therefore, he moved his workstation from his office to the Prevessin site (“I did it during a weekend, without even telling David Williams”) and, while working for NA49, he taught himself C++ by converting a large part of his HBOOK software into this new OO language.

    At the beginning of 1995, René was joined in NA49 by Fons Rademakers, with whom he had already collaborated. The two of them worked very hard for several months and produced the first version of what became the famous ROOT system. The name comes simply from combining the initial letters of the two founders’ email addresses (René and Rdm, for Rademakers), the double O of Object Oriented and the T of Technology. But the meaning of the word ‘root’ also fitted well with its being a basic framework on which more software could be developed, and with the use of tree structures in its architecture.

    In November of the same year, René gave a seminar to present the ROOT system. “The Computing Division auditorium was unexpectedly crowded!” René recalls, “I think it was because people thought that Fons and I had disappeared from the software arena, while all of a sudden we were back again!” And actually the ROOT system generated considerable interest.

    But while René and Fons were completely absorbed by the work on their new software package, the RD45 project, which had the mandate to decide what new software had to be adopted by the new LHC experiments, had proposed to use the commercial product “Objectivity”, and a lot of work was ongoing to develop applications to meet HEP needs. According to René, there was a clear intention to obstruct the development and diffusion of ROOT. In spring 1996 the CERN director for computing, Lorenzo Foa, declared that the ROOT project was considered a private initiative of NA49, not supported by the CERN management, and that the official line of development was the one based on Objectivity.

    “I think that the LHC Computing Board didn’t have the right insight into the architecture of these software tools to be able to judge which solution was the best. Thus, they had to trust what they were told,” René comments. “It is always a problem when there is such a divide between the experts – and users – working on something and the people who are to take important decisions.”

    Nevertheless, René and Fons continued developing ROOT and implementing new features, taking advantage of the lessons learnt with the previous software packages (in particular the requests and criticisms of the users). In addition, they closely followed the development of the official line with Objectivity, in order to know what the people using it were looking for and what the problems or difficulties were. “The more we looked into Objectivity, the more we realized it could not meet the needs of our community,” René adds. “We knew that the system would fail and that eventually people would realize it. This gave us even more energy and motivation to work hard and improve our product.”

    They had continuous support from the NA49 and ALICE collaborations, as well as from many people in ATLAS and CMS, who saw good potential in their software package. At the time, René was collaborating with many people in both experiments, including Fabiola Gianotti and Daniel Froidevaux, in particular on detector simulations. Besides, many users trusted them because of the relationship built over many years through the user support of PAW and GEANT.

    Things started to change when interest in ROOT grew outside CERN. In 1998, the two Fermilab experiments, CDF and D0, decided to discuss the future of their software approach in view of the upcoming Run II of the Tevatron. Hence, they opened two calls for proposals for software solutions, one for data storage and one for data analysis and visualization. René submitted ROOT to both calls. During the CHEP conference in Chicago the proposals were discussed, and on the last day it was publicly announced that CDF and D0 would adopt ROOT. “I was not expecting it,” says René. “I remember that when the communication was given, everybody turned their faces and looked at me.” Soon after, the experiments at RHIC at Brookhaven National Laboratory took the same decision. The BaBar experiment at SLAC, after years spent attempting to use Objectivity, had realized that it was not as good a system as expected, so it moved to ROOT as well.

    Gradually, it became clear that the HEP community was ‘naturally’ going towards ROOT, so the CERN management had to accept this situation and, eventually, support it. But this happened only in 2002. With more manpower allocated to the project, ROOT continued developing fast and the number of users increased dramatically. It also started to spread to other branches of science and into the financial world. “In 2010, we had on average 12,000 downloads of the software package per month, and the ROOT website had more visitors than the CERN one.”

    The logo of the ROOT software package.

    René retired in 2012, but his two most important brainchildren, ROOT and GEANT, keep growing thanks to the work of many young scientists. “I think that it is essential to have a continuous stimulus that pushes you to improve your products and come out with new solutions. For this, the contribution of young people is very important,” comments René. But, as he admits, what really made him and his colleagues work hard for so many years is the fact that the software packages they were developing always had competitors and, in many cases, were challenged and even obstructed. “When you are contrasted, but you know you are right, you are condemned to succeed.”

    The great attention to the users’ needs has also been very important, because it helped to shape the software and build a trust relationship with people. “I have always said that you have to put the user support at the highest priority,” René explains. “If you reply to a request in 10 minutes you get 10 points, in one hour you get 2 points, and in one day you go already to -10 points. Answering questions and comments is fundamental, because if the users are satisfied with the support you give them, they are willing to trust what you propose next.”

    Now that he is retired, René still follows the software development at CERN, but only as an external observer. This does not mean that he has set aside his scientific interests; on the contrary, he is now dedicating most of his energy to a more theoretical project, as he is developing a physics model. In his spare time, he likes gardening. He loves flowers, but he cannot help looking at them with a scientific eye: “A colleague of mine, who is a mathematician, and I developed a mathematical model of the way flowers are structured and grow.”

    Brilliant minds are always at work.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Meet CERN in a variety of places:


    Cern Courier

    THE FOUR MAJOR PROJECT COLLABORATIONS
    ATLAS
    CERN/ATLAS detector

    ALICE
    CERN ALICE New

    CMS
    CERN/CMS Detector

    LHCb

    CERN/LHCb

    LHC

    CERN/LHC Map
    CERN LHC Grand Tunnel

    CERN LHC particles


    Quantum Diaries

     
  • richardmitnick 8:09 pm on October 13, 2017 Permalink | Reply
    Tags: , Baby MIND, , , HEP, , ,   

    From CERN: “Baby MIND born at CERN now ready to move to Japan” 

    Cern New Bloc

    Cern New Particle Event

    CERN New Masthead

    CERN

    13 Oct 2017
    Stefania Pandolfi

    Baby MIND under test on the T9 beamline at the Proton Synchrotron experimental hall in the East Area, summer 2017 (Image: Alain Blondel/University of Geneva)

    A member of the CERN Neutrino Platform family of neutrino detectors, Baby MIND, is now ready to be shipped from CERN to Japan in 4 containers to start the experimental endeavour it has been designed and built for. The containers are being loaded on 17 and 18 October and scheduled to arrive by mid-December.

    Baby MIND is a 75-tonne neutrino detector prototype for a Magnetised Iron Neutrino Detector (MIND). Its goal is to precisely identify and track positively or negatively charged muons – the products of muon neutrinos from the T2K (Tokai to Kamioka) beam line interacting with matter in the WAGASCI neutrino detector in Japan.

    T2K map, T2K Experiment, Tokai to Kamioka, Japan

    The more detailed the identification of the muon that crosses the Baby MIND detector, the more we can learn about the original neutrino, contributing to a more precise understanding of the neutrino oscillation phenomenon*.

    The journey of these muon neutrinos starts from the Japan Proton Accelerator Research Complex (J-PARC) in Tokai. They travel all the way to the Super-Kamiokande Detector in Kamioka, some 295 km away.

    Super-Kamiokande Detector, located under Mount Ikeno near the city of Hida, Gifu Prefecture, Japan

    On their journey, the neutrinos pass through the near detector complex building, located 280 m downstream of the neutrino production target at J-PARC in Tokai, where the WAGASCI + Baby MIND suite of detectors sits. Baby MIND aims to measure the velocity and charge of muons produced by neutrino interactions with matter in the WAGASCI detector. Precise tracking of the muons will help test our ability to reconstruct important characteristics of their parent neutrinos. This, in turn, is important because, in studying muon neutrino oscillations on their journey from Tokai to Kamioka, it is crucial to know how strongly and how often they interact with matter.

    Born from prototyping activities launched within the AIDA project, and since its approval in December 2015 by the CERN Research Board, the Baby MIND collaboration – comprising CERN, the University of Geneva, the Institute for Nuclear Research in Moscow, and the Universities of Glasgow, Kyoto, Sofia, Tokyo, Uppsala and Valencia – has been busy designing, prototyping, constructing and testing this detector. The magnet construction phase, which lasted six months, was completed in mid-February 2017, two weeks ahead of schedule.

    The fully assembled Baby MIND detector was tested on a beam line at the experimental zone of the Proton Synchrotron in the East Hall during summer 2017. These tests showed that the detector is working as expected and is therefore ready to go.

    Baby MIND under test on the T9 beamline at the Proton Synchrotron experimental hall in the East Area, summer 2017 (Image: Alain Blondel/University of Geneva)

    *Neutrino oscillations

    Neutrinos are everywhere. Each second, several billion of these particles coming from the Sun, the Earth and our galaxy, pass through our bodies. And yet, they fly past unnoticed. Indeed, despite their cosmic abundance and ubiquity, neutrinos are extremely difficult to study because they hardly interact with matter. For this reason, they are among the least understood particles in the Standard Model (SM) of particle physics.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    What we know is that they come in three types or ‘flavours’ – electron neutrino, muon neutrino and tau neutrino. From their first detection in 1956 until the late 1990s, neutrinos were thought to be massless, in line with the SM predictions. However, the Super-Kamiokande experiment in Japan and then the Sudbury Neutrino Observatory in Canada independently demonstrated that neutrinos can change (oscillate) from one flavour to another spontaneously.

    Sudbury Neutrino Observatory, no longer operating

    This is only possible if neutrinos have masses, however small; the probability of changing flavour depends on the difference between the squares of their masses and on the distance they travel relative to their energy. This ground-breaking discovery was recognized with the 2015 Nobel Prize in Physics.
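
    In the standard two-flavour simplification (the full three-flavour treatment is more involved), that dependence is usually written as

        P(\nu_\mu \to \nu_\tau) = \sin^2(2\theta)\,\sin^2\!\left(\frac{1.27\,\Delta m^2\,[\mathrm{eV}^2]\;L\,[\mathrm{km}]}{E\,[\mathrm{GeV}]}\right),

    where \theta is the mixing angle, \Delta m^2 is the difference of the squared masses, L is the distance travelled and E is the neutrino energy – which is why an experiment such as T2K is characterised by its baseline (295 km) and its beam energy.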

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Meet CERN in a variety of places:

    Quantum Diaries
    QuantumDiaries

    Cern Courier

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS
    CERN ATLAS New

    ALICE
    CERN ALICE New

    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    LHC

    CERN LHC Map
    CERN LHC Grand Tunnel

    CERN LHC particles

    Quantum Diaries

     
  • richardmitnick 4:16 pm on October 13, 2017 Permalink | Reply
    Tags: , , CLEAR, HEP, ,   

    From CERN Courier: “CLEAR prospects for accelerator research” 


    CERN Courier

    Oct 13, 2017

    CLEAR’s plasma-lens experiment will test ways to drive strong currents through a plasma for particle-beam transverse focusing.
    Image credit: M Volpi.

    A new user facility for accelerator R&D, the CERN Linear Electron Accelerator for Research (CLEAR), started operation in August and is ready to provide beam for experiments. CLEAR evolved from the former CTF3 test facility for the Compact Linear Collider (CLIC), which ended a successful programme in December 2016. Following approval of the CLEAR proposal, the necessary hardware modifications started in January and the facility is now able to host and test a broad range of ideas in the accelerator field.

    CLEAR’s primary goal is to enhance and complement the existing accelerator R&D programme at CERN, as well as offering a training infrastructure for future accelerator physicists and engineers. The focus is on general accelerator R&D and component studies for existing and possible future accelerator applications. This includes studies of high-gradient acceleration methods, such as CLIC X-band and plasma technologies, as well as prototyping and validation of accelerator components for the high-luminosity LHC upgrade.

    The scientific programme for 2017 includes: a combined test of critical CLIC technologies, continuing previous tests performed at CTF3; measurements of radiation effects on electronic components to be installed on space missions in a Jovian environment and for dosimetry tests aimed at medical applications; beam instrumentation R&D; and the use of plasma for beam focusing. Further experiments, such as those exploring THz radiation for accelerator applications and direct impedance measurements of equipment to be installed in CERN accelerators, are also planned.

    The experimental programme for 2018 and beyond is still open to new and challenging proposals. An international scientific committee is currently being formed to prioritise proposals, and a user request form is available at the CLEAR website: http://clear.web.cern.ch/.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition
    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS
    CERN ATLAS New

    ALICE
    CERN ALICE New

    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    LHC

    CERN LHC Map
    CERN LHC Grand Tunnel

    CERN LHC particles

     
  • richardmitnick 9:53 pm on October 12, 2017 Permalink | Reply
    Tags: , , , HEP, , , , Xenon is a heavy noble gas that exists in trace quantities in the air, Xenon takes a turn in the LHC   

    From Symmetry: “Xenon takes a turn in the LHC” 

    Symmetry Mag
    Symmetry

    10/12/17
    Sarah Charley

    For the first time, the Large Hadron Collider is accelerating xenon nuclei for experiments.

    LHC

    CERN/LHC Map

    CERN LHC Tunnel

    CERN LHC particles

    Most of the year, the Large Hadron Collider at CERN collides protons. LHC scientists have also accelerated lead nuclei stripped of their electrons. Today, for just about eight hours, they are experimenting with a different kind of nucleus: xenon.

    Xenon is a heavy noble gas that exists in trace quantities in the air. Xenon nuclei are about 40 percent lighter than lead nuclei, so xenon-xenon collisions have a different geometry and energy distribution than lead-lead collisions.
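
    The “about 40 percent” figure is simple mass-number arithmetic, assuming the isotopes typically used in these runs, ¹²⁹Xe and ²⁰⁸Pb:

        \frac{A_{\mathrm{Xe}}}{A_{\mathrm{Pb}}} = \frac{129}{208} \approx 0.62,

    i.e. a xenon nucleus carries roughly 38 percent less mass than a lead nucleus.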

    “When two high-energy nuclei collide, they can momentarily form a droplet of quark gluon plasma, the primordial matter that filled our universe just after the big bang,” says Peter Steinberg, a physicist at the US Department of Energy’s Brookhaven National Laboratory and a heavy-ion coordinator for the ATLAS experiment at CERN. “The shape of the colliding nuclei influences the initial shape of this droplet, which in turn influences how the plasma flows and finally shows up in the angles of the particles we measure. We’re hoping that these smaller droplets from xenon-xenon collisions give us deeper insight into how this still-mysterious process works at truly subatomic length scales.”

    Not all particles that travel through CERN’s long chain of interconnected accelerators wind up in the LHC. Earlier this year, scientists were loading xenon ions into the accelerator and firing them at a fixed-target experiment instead.

    “We can have particles from two different sources feeding into CERN’s accelerator complex,” says Michaela Schaumann, a physicist in LHC operation working on the heavy-ion program. “The LHC’s injectors are so flexible that, once everything is set up properly, they can alternate between accelerating protons and accelerating ions a few times a minute.”

    Having the xenon beam already available provided an opportunity to send xenon into the LHC for the first (and potentially only) time. It took some serious additional work to bring the beam quality up to collider levels, Schaumann says, but today it was ready to go.

    “We are keeping the intensities very low in order to fulfil machine protection requirements and be able to use the same accelerator configuration we apply during the proton-proton runs with xenon beams,” Schaumann says. “We needed to adjust the frequency of the accelerator cavities [because more massive xenon ions circulate more slowly than protons], but many of the other machine settings stayed roughly the same.”
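
    Why the cavity frequency has to move can be sketched with textbook beam-dynamics relations: at injection the magnets define a fixed magnetic rigidity, so the momentum per unit charge is fixed, and a nucleus with a larger mass-to-charge ratio ends up slightly slower. A rough sketch (the choice of ¹²⁹Xe⁵⁴⁺ versus protons is illustrative):

        B\rho = \frac{p}{q}, \qquad \beta = \frac{pc}{\sqrt{(pc)^2 + (mc^2)^2}}, \qquad f_{\mathrm{rev}} = \frac{\beta c}{C},

    so for the same rigidity a xenon ion, with charge q = 54e but mass m of roughly 129 atomic mass units, has a smaller \beta and a slightly lower revolution frequency around the ring circumference C than a proton, and the RF system must be retuned to match.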

    This novel run tests scientists’ knowledge of beam physics and shows the flexibility of the LHC. Scientists say they are hopeful it could reveal something new.

    “We can learn a lot about the properties of the hot, dense matter from smaller collision systems,” Steinberg says. “They are a valuable bridge to connect what we observe in lead-lead collisions to strikingly similar observations in proton-proton interactions.”

    The LHC screen during the xenon-ion run. (Image: CERN)

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 7:02 am on October 12, 2017 Permalink | Reply
    Tags: , , , , HEP, , , , The Math That’s Too Difficult for Physics   

    From Quanta: “The Math That’s Too Difficult for Physics” 

    Quanta Magazine
    Quanta Magazine

    November 18, 2016 [Wow!!]
    Kevin Hartnett

    Christian Gwiozda

    CERN CMS Higgs Event


    CERN ATLAS Higgs Event


    Higgs. Always the last place you look.

    How do physicists reconstruct what really happened in a particle collision? Through calculations that are so challenging that, in some cases, they simply can’t be done. Yet.

    It’s one thing to smash protons together. It’s another to make scientific sense of the debris that’s left behind.

    This is the situation at CERN, the laboratory that houses the Large Hadron Collider, the largest and most powerful particle accelerator in the world.

    LHC

    CERN/LHC Map

    CERN LHC Tunnel

    CERN LHC particles

    In order to understand all the data produced by the collisions there, experimental physicists and theoretical physicists engage in a continual back and forth. Experimentalists come up with increasingly intricate experimental goals, such as measuring the precise properties of the Higgs boson. Ambitious goals tend to require elaborate theoretical calculations, which the theorists are responsible for. The experimental physicists’ “wish list is always too full of many complicated processes,” said Pierpaolo Mastrolia, a theoretical physicist at the University of Padua in Italy. “Therefore we identify some processes that can be computed in a reasonable amount of time.”

    By “processes,” Mastrolia is referring to the chain of events that unfolds after particles collide. For example, a pair of gluons might combine through a series of intermediate steps — particles morphing into other particles — to form a Higgs boson, which then decays into still more particles. In general, physicists prefer to study processes involving larger numbers of particles, since the added complexity assists in searches for physical effects that aren’t described by today’s best theories. But each additional particle requires more math.

    To do this math, physicists use a tool called a Feynman diagram, which is essentially an accounting device that has the look of a stick-figure drawing: Particles are represented by lines that collide at vertices to produce new particles.

    Feynman Diagrams Depicting Possible Formations of the Higgs Boson. Image Credit: scienceblogs.com. astrobites

    Physicists then take the integral of every possible path an experiment could follow from beginning to end and add those integrals together. As the number of possible paths goes up, the number of integrals that theorists must compute — and the difficulty of calculating each individual integral — rises precipitously.
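
    As an illustration of what “taking the integral of every possible path” looks like in practice, the simplest loop-level object in such calculations is the one-loop scalar “bubble” with two external legs (the masses and momenta here are generic placeholders):

        \mathcal{A} = \sum_{\text{diagrams}} \mathcal{A}_i, \qquad \mathcal{A}_{\text{bubble}} \propto \int \frac{d^4k}{(2\pi)^4}\;\frac{1}{\big(k^2 - m_1^2\big)\big((k+p)^2 - m_2^2\big)},

    where k is the unobserved loop momentum that must be integrated over, p is the external momentum and m_1, m_2 are the masses of the internal particles. Every additional loop adds another four-dimensional integration of this kind, which is why the difficulty grows so steeply.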

    When deciding on the kinds of collisions they want to study, physicists have two main choices to make. First, they decide on the number of particles they want to consider in the initial state (coming in) and the final state (going out). In most experiments, it’s two incoming particles and anywhere from one to a dozen outgoing particles (referred to as “legs” of the Feynman diagram). Then they decide on the number of “loops” they’ll take into account. Loops represent all the intermediate collisions that could take place between the initial and final states. Adding more loops increases the precision of the measurement. They also significantly add to the burden of calculating Feynman diagrams. Generally speaking, there’s a trade-off between loops and legs: If you want to take into account more loops, you need to consider fewer legs. If you want to consider more legs, you’re limited to just a few loops.

    “If you go to two loops, the largest number [of legs] going out is two. People are pushing toward three particles going out at two loops — that’s the boundary that’s really beyond the state of the art,” said Gavin Salam, a theoretical physicist at CERN.

    Physicists already have the tools to calculate probabilities for tree-level (zero loop) and one-loop diagrams featuring any number of particles going in and out. But accounting for more loops than that is still a major challenge and could ultimately be a limiting factor in the discoveries that can be achieved at the LHC.

    “Once we discover a particle and want to determine its properties, its spin, mass, angular momentum or couplings with other particles, then higher-order calculations” with loops become necessary, said Mastrolia.

    And that’s why many are excited about the emerging connections between Feynman diagrams and number theory that I describe in the recent article “Strange Numbers Found in Particle Collisions.” If mathematicians and physicists can identify patterns in the values generated from diagrams of two or more loops, their calculations would become much simpler — and experimentalists would have the mathematics they need to study the kinds of collisions they’re most interested in.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Formerly known as Simons Science News, Quanta Magazine is an editorially independent online publication launched by the Simons Foundation to enhance public understanding of science. Why Quanta? Albert Einstein called photons “quanta of light.” Our goal is to “illuminate science.” At Quanta Magazine, scientific accuracy is every bit as important as telling a good story. All of our articles are meticulously researched, reported, edited, copy-edited and fact-checked.

     
  • richardmitnick 8:18 am on October 4, 2017 Permalink | Reply
    Tags: , , HEP, , , ,   

    From ALICE: Women in STEM – “Focus on Ester Casula” 

    CERN
    CERN New Masthead

    18 September 2017 [Just found in social media.]

    Ester Casula

    Ester Anna Rita Casula is a postdoctoral researcher at the Italian National Institute for Nuclear Physics (INFN) in Cagliari – her hometown.

    INFN Laboratori Nazionali del Gran Sasso, located in L’Aquila, Italy

    She has been ALICE Run Manager for two weeks between June and August of this year.

    During her second week on shift, I meet Ester at Point 2, where she spends most of her time monitoring the data taking and making sure everything runs smoothly.

    Sitting with me in the kitchen next to the control room, she talks smiling and laughing. I can see that she has a very extroverted personality. Besides telling me about her work, she unveils an uncommon passion of hers…

    What’s your background and your career path up to now?

    I studied physics at the University of Cagliari, in Italy, and I have been a member of the ALICE collaboration since I was working on my Bachelor’s degree thesis. At that time, we didn’t have data yet, so I used Monte Carlo simulations. Then, for my Master’s degree thesis and during my PhD I focused on the analysis of low-mass resonances in the di-muon channel – thus, mainly the φ – in pp, Pb-Pb and p-Pb collisions at all of the energies we have taken data at. I started with the data from pp collisions at 7 TeV – for my Master’s thesis – and then continued with the other energies and with p-Pb and Pb-Pb data (in detail: pp at 2.76 and 5 TeV, p-Pb at 5 TeV, Pb-Pb at 2.76 and 5 TeV).

    After completing my PhD in 2014, I started a first postdoc with the University of Cagliari and now I am concluding a second postdoc with the INFN in the same town.

    I am based in Cagliari, but in the last months I have spent most of my time at CERN and, in particular, in the control room, since I have also followed some runs as a shift leader.

    How do you like being the run manager?

    It is an interesting experience: every day you might have to face a different problem. For example, once during my shift we were called by the LHC control room and informed that ALICE was causing the beam to be dumped. Of course, we had to solve the issue very quickly. It happened in the dead of night and I was at home. As soon as I received the call from the shift leader I got up and went to the control room. Luckily I am staying nearby, in Saint-Genis.

    In situations like this you have to react quickly, try to understand the issue as fast as you can and make decisions. In this specific case, the problem was caused by the threshold of the Beam Condition Monitors (BCM), which are basically protection devices. We called the on-call expert for the BCM, who checked the situation and fixed the issue. Even though the problem seemed to be solved, I stayed in the control room until 5 am, because I was worried that something else could happen.

    What do you like most about this role?

    Certainly this: the fact that you need to keep different kinds of issues under control and solve them. In addition, you have to give instructions and make decisions: this is quite challenging if you are not used to it. Actually, you already start training in taking responsibility when you are the shift leader. When you become run manager, you go a step further. I spend a lot of time in the control room and, when I am at home, I continuously check the electronic logbook to know how the run is proceeding. When I wake up in the morning, the first thing I do – even before getting up – is check online the status of the accelerator, to know if it is working, and of the experiment.

    It sounds a bit stressful…

    Well, it can be stressful sometimes, indeed. In particular because you have to be ready and react quickly; but actually I am finding it easier this week, since it is my second time as run manager.

    You can count on the run coordinator anyway, right?

    Sure. But we call her only if something very important happens. For normal issues, such as a shift leader having some doubts about the operations to perform, the run manager takes on the responsibility. Certainly, it is important to know what the most common issues are. That is why, before starting my first shift, I overlapped with the previous run manager for some days.

    What’s your main field of interest?

    I work on the analysis of the φ in Pb-Pb collisions. An article on this topic, based on data at 2.76 TeV, is in preparation, and now we are analyzing data from collisions at 5 TeV. I am quite specialized in this topic.

    Would you like to change topic to do something different?

    Yes, why not?

    Actually, when I was taking my first steps in analysis, I did some studies on the Υ, but they were based on simulations only, so it was more of an exercise than a real analysis.

    Anyway, I will see. I will have to evaluate the opportunities.

    What are your plans for the future?

    My postdoctoral contract at INFN will come to an end soon, so I will have to look for another job. I would prefer to stay in Cagliari, but I am also considering the possibility of gaining experience in another country.

    Where? Or where absolutely not?

    Well, preferably in Europe, but not necessarily. Certainly I would avoid cold places… [She laughs].

    Would you like to teach?

    I don’t know. I have been a tutor for two courses at the University, which means that I helped the professor with the laboratory lessons. It was an interesting experience, but I am not particularly attracted to teaching, mainly because it takes a lot of time to prepare classes and find the right way to explain complex topics.

    Thus, I guess you would prefer to work for a Laboratory, as you are doing at INFN?

    Ideally yes, I would prefer to focus only on research.

    Nevertheless, I don’t exclude the academic career either. I think that I can enjoy part of the process of training students, even though I think it can be hard and tiring.

    What are your interests outside work?

    Well, my main hobby is breeding dogs. I raise them and make them compete in dog shows, which are dog beauty contests. [She laughs.]

    How many dogs do you have?

    I have three at my place, in Cagliari. Three more are looked after by some friends of mine but I make them participate in competitions as well.

    I get a litter of puppies once every three years and I keep some of them. They are all Italian Greyhounds with pedigree. I own the mother and select a father when I decide to have new puppies. [She laughs again.]

    What moves you to do this?

    I love them. I have even created the worldwide online database of Italian Greyhounds, which didn’t exist before. I started it by myself, then I got some help from three other breeders in the US and France. We have registered about 60,000 dogs. Unfortunately, we could only go back to the end of the 19th century. Lately, the national dog clubs have been putting information online, but in order to collect old data I had to rely on the original documentation. So, I went personally to the headquarters of the Italian National Dog Institution (ENCI) in Milan and photocopied all the certificates they have, from 1912 up to now.

    This is cool, but why did you do it?

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Meet CERN in a variety of places:


    Cern Courier

    THE FOUR MAJOR PROJECT COLLABORATIONS
    ATLAS
    CERN/ATLAS detector

    ALICE
    CERN ALICE New

    CMS
    CERN/CMS Detector

    LHCb

    CERN/LHCb

    LHC

    CERN/LHC Map
    CERN LHC Grand Tunnel

    CERN LHC particles


    Quantum Diaries

     
  • richardmitnick 8:05 pm on September 25, 2017 Permalink | Reply
    Tags: DM axions, HEP, The origin of solar flares

    From CERN Courier: “Study links solar activity to exotic dark matter” 


    CERN Courier

    1
    Solar-flare distributions

    The origin of solar flares, powerful bursts of radiation appearing as sudden flashes of light, has puzzled astrophysicists for more than a century. The temperature of the Sun’s corona, measuring several hundred times hotter than its surface, is also a long-standing enigma.

    A new study suggests that the solution to these solar mysteries is linked to a local action of dark matter (DM). If true, it would challenge the traditional picture of DM as being made of weakly interacting massive particles (WIMPs) or axions, and suggest that DM is not uniformly distributed in space, as is traditionally thought.

    The study is not based on new experimental data. Rather, lead author Sergio Bertolucci, a former CERN research director, and collaborators base their conclusions on freely available data recorded over a period of decades by geosynchronous satellites. The paper presents a statistical analysis of the occurrences of around 6500 solar flares in the period 1976–2015 and of the continuous solar emission in the extreme ultraviolet (EUV) in the period 1999–2015. The temporal distribution of these phenomena, finds the team, is correlated with the positions of the Earth and two of its neighbouring planets: Mercury and Venus. Statistically significant (above 5σ) excesses of the number of flares with respect to randomly distributed occurrences are observed when one or more of the three planets find themselves in a slice of the ecliptic plane with heliocentric longitudes of 230°–300°. Similar excesses are observed in the same range of longitudes when the solar irradiance in the EUV region is plotted as a function of the positions of the planets.
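
    To make the scale of such an excess concrete, here is a minimal Python sketch (with made-up counts, not the paper’s numbers) of how an observed number of flares coinciding with a planet in the 230°–300° longitude slice could be compared with the expectation from uniformly distributed occurrences, using a simple Poisson significance estimate.

    ```python
    import math

    def poisson_excess_sigma(observed, expected):
        """Approximate significance (in sigma) of an excess over a Poisson
        expectation, using (observed - expected) / sqrt(expected); this simple
        estimator is adequate when the expected count is large."""
        return (observed - expected) / math.sqrt(expected)

    # Hypothetical illustration only (NOT the numbers from the paper):
    # if flares were uniform in time, the fraction coinciding with a planet in a
    # 70-degree heliocentric-longitude slice would be roughly 70/360.
    total_flares = 6500
    expected_in_slice = total_flares * 70.0 / 360.0

    observed_in_slice = 1450  # invented count, purely for illustration
    sigma = poisson_excess_sigma(observed_in_slice, expected_in_slice)
    print(f"expected: {expected_in_slice:.0f}, observed: {observed_in_slice}, "
          f"excess significance: {sigma:.1f} sigma")
    ```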

    These results suggest that active-Sun phenomena are not randomly distributed, but instead are modulated by the positions of the Earth, Venus and Mercury. One possible explanation, says the team, is the existence of a stream of massive DM particles with a preferred direction, coplanar to the ecliptic plane, that is gravitationally focused by the planets towards the Sun when one or more of the planets enter the stream. Such particles would need to have a wide velocity spectrum centred around 300 km s⁻¹ and interact with ordinary matter much more strongly than typical DM candidates such as WIMPs. The non-relativistic velocities of such DM candidates make planetary gravitational lensing more efficient and can enhance the flux of the particles by up to a factor of 10⁶, according to the team.
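
    For orientation, the textbook gravitational-focusing enhancement for a slow stream falling into the Sun’s potential well is 1 + v_esc²/v². The short sketch below evaluates it for a 300 km s⁻¹ stream at the solar surface; it yields only a factor of a few, underlining that the enhancements of up to 10⁶ quoted by the authors rely on planetary gravitational lensing of a well-collimated stream, which this simple estimate does not capture.

    ```python
    G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
    M_SUN = 1.989e30   # solar mass, kg
    R_SUN = 6.957e8    # solar radius, m

    def focusing_factor(v_stream, mass, radius):
        """Gravitational-focusing flux enhancement 1 + v_esc^2 / v^2 for
        particles of asymptotic speed v_stream reaching radius `radius`
        of a body of mass `mass` (all SI units)."""
        v_esc_sq = 2.0 * G * mass / radius
        return 1.0 + v_esc_sq / v_stream**2

    v_stream = 300e3  # 300 km/s, the stream speed discussed in the article
    print(f"flux enhancement at the solar surface: "
          f"{focusing_factor(v_stream, M_SUN, R_SUN):.1f}x")
    ```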

    Co-author Konstantin Zioutas, spokesperson for the CAST experiment at CERN, accepts that this interpretation of the solar and planetary data is speculative – particularly regarding the mechanism by which a temporarily increased influx of DM actually triggers solar activity.

    CERN CAST Axion Solar Telescope

    However, he says, the long persisting failure to detect the ubiquitous DM might be due to the widely assumed small cross-section of its constituents with ordinary matter, or to erroneous DM modelling. “Hence, the so-far-adopted direct-detection concepts can lead us towards a dead end, and we might find that we have overlooked a continuous communication between the dark and the visible sector.”

    Models of massive DM streaming particles that interact strongly with normal matter are few and far between, although the authors suggest that “antiquark nuggets” are best suited to explain their results. “In a few words, there is a large ‘hidden’ energy in the form of the nuggets,” says Ariel Zhitnitsky, who first proposed the quark-nugget dark-matter model in 2003. “In my model, this energy can be precisely released in the form of the EUV radiation when the anti-nuggets enter the solar corona and get easily annihilated by the light elements present in such a highly ionised environment.”

    The study calls for further investigation, say researchers. “It seems that the statistical analysis of the paper is accurate and the obtained results are rather intriguing,” says Rita Bernabei, spokesperson of the DAMA experiment, which in 1998 was the first to claim a detection of dark matter in the form of WIMPs, on the basis of an observed seasonal modulation of the signal in its scintillation detector.

    DAMA-LIBRA at Gran Sasso

    “However, the paper appears to be mostly hypothetical in terms of this new type of dark matter.”

    The team now plans to produce a full simulation of planetary lensing taking into account the simultaneous effect of all the planets in the solar system, and to extend the analysis to include sunspots, nano-flares and other solar observables. CAST, the axion solar telescope at CERN, will also dedicate a special data-taking period to the search for streaming DM axions.

    “If true, our findings will provide a totally different view about dark matter, with far-reaching implications in particle and astroparticle physics,” says Zioutas. “Perhaps the demystification of the Sun could lead to a dark-matter solution also.”

    Further reading

    S Bertolucci et al. 2017 Phys. Dark Universe 17 13. Elsevier

    http://www.elsevier.com/locate/dark

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition
    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS
    CERN ATLAS New

    ALICE
    CERN ALICE New

    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    LHC

    CERN LHC Map
    CERN LHC Grand Tunnel

    CERN LHC particles

     
  • richardmitnick 2:46 pm on September 22, 2017 Permalink | Reply
    Tags: HEP, Other CERN Linacs

    From CERN Courier: “Injecting new life into the LHC” 

    CERN Courier

    Sep 22, 2017

    Malika Meddahi
    Giovanni Rumolo

    1
    Beam transfer magnets. No image credit

    The Large Hadron Collider (LHC) is the most famous and powerful of all CERN’s machines, colliding intense beams of protons at an energy of 13 TeV.

    LHC

    CERN/LHC Map

    CERN LHC Tunnel

    CERN LHC particles

    But its success relies on a series of smaller machines in CERN’s accelerator complex that serve it. The LHC’s proton injectors have already been providing beams with characteristics exceeding the LHC’s design specifications. This decisively contributed to the excellent performance of the 2010–2013 LHC physics operation and, since 2015, has allowed CERN to push the machine beyond its nominal beam performance.

    Built between 1959 and 1976, the CERN injector complex accelerates proton beams to a kinetic energy of 450 GeV. It does this via a succession of accelerators: a linear accelerator called Linac 2 followed by three synchrotrons – the Proton Synchrotron Booster (PSB), the Proton Synchrotron (PS) and the Super Proton Synchrotron (SPS).

    2
    CERN Linac 2. No image credit

    3
    CERN The Proton Synchrotron Booster

    CERN Proton Synchrotron

    CERN Super Proton Synchrotron

    The complex also provides the LHC with ion beams, which are first accelerated through a linear accelerator called Linac 3 and the Low Energy Ion Ring (LEIR) synchrotron before being injected into the PS and the SPS.

    6
    CERN Linac 3

    5
    CERN Low Energy Ion Ring (LEIR) synchrotron

    The CERN injectors, besides providing beams to the LHC, also serve a large number of fixed-target experiments at CERN – including the ISOLDE radioactive-beam facility and many others.

    CERN ISOLDE

    Part of the LHC’s success lies in the flexibility of the injectors to produce various beam parameters, such as the intensity, the spacing between proton bunches and the total number of bunches in a bunch train. This was clearly illustrated in 2016 when the LHC reached peak luminosity values 40% higher than the design value of 10³⁴ cm⁻² s⁻¹, although the number of bunches in the LHC was still about 27% below the maximum achievable. This gain was due to the production of a brighter beam with roughly the same intensity per bunch but in a beam envelope of just half the size.
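
    A rough way to see the arithmetic is the standard scaling of peak luminosity with the number of bunches n_b, the bunch intensity N and the transverse emittance ε, namely L ∝ n_b N²/ε (with revolution frequency, beam energy, β* and the geometric factor held fixed). The sketch below, using illustrative ratios only, shows that halving the emittance more than compensates for running about 27% fewer bunches.

    ```python
    def luminosity_ratio(n_bunches_ratio, bunch_intensity_ratio, emittance_ratio):
        """Relative change in peak luminosity for round beams, using the
        scaling L ~ n_b * N^2 / emittance with all other factors unchanged."""
        return n_bunches_ratio * bunch_intensity_ratio**2 / emittance_ratio

    # Illustration only: ~27% fewer bunches than the maximum, the same
    # intensity per bunch, and the transverse emittance reduced by half.
    ratio = luminosity_ratio(n_bunches_ratio=0.73,
                             bunch_intensity_ratio=1.0,
                             emittance_ratio=0.5)
    print(f"peak luminosity relative to nominal: {ratio:.2f}x")  # ~1.46x
    ```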

    Despite the excellent performance of today’s injectors, the beams produced are not sufficient to meet the very demanding proton beam parameters specified by the high-luminosity upgrade of the LHC (HL-LHC).

    Indeed, as of 2025, the HL-LHC aims to accumulate an integrated luminosity of around 250 fb⁻¹ per year, to be compared with the 40 fb⁻¹ achieved in 2016. For heavy-ion operations, the goals are just as challenging: with lead ions the objective is to obtain an integrated luminosity of 10 nb⁻¹ during four runs starting from 2021 (compared to the 2015 achievement of less than 1 nb⁻¹). This has demanded a significant upgrade programme that is now being implemented.
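
    To get a feel for the scale of that target, here is a back-of-envelope conversion from a levelled peak luminosity and yearly running time to an annual integrated luminosity. The input values are assumptions chosen purely for illustration, not official HL-LHC parameters.

    ```python
    def integrated_lumi_fb(peak_lumi_cm2_s, days_of_physics, efficiency):
        """Rough annual integrated luminosity in fb^-1: levelled peak luminosity
        times physics time times an overall efficiency factor.
        Conversion: 1 fb^-1 = 1e39 cm^-2."""
        seconds = days_of_physics * 24 * 3600
        return peak_lumi_cm2_s * seconds * efficiency / 1e39

    # Assumed numbers for illustration: 5e34 cm^-2 s^-1 levelled luminosity,
    # 160 days of physics per year, 36% overall efficiency.
    print(f"{integrated_lumi_fb(5e34, 160, 0.36):.0f} fb^-1 per year")  # ~250
    ```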

    Immense challenges

    To prepare the CERN accelerator complex for the immense challenges of the HL-LHC, the LHC Injectors Upgrade project (LIU) was launched in 2010. In addition to enabling the necessary proton and ion injector chains to deliver beams of ions and protons required for the HL-LHC, the LIU project must ensure the reliable operation and lifetime of the injectors throughout the HL-LHC era, which is expected to last until around 2035. Hence, the LIU project is also tasked with replacing ageing equipment (such as power supplies, magnets and radio-frequency cavities) and improving radioprotection measures such as shielding and ventilation. [See https://sciencesprings.wordpress.com/2017/09/21/from-cern-next-stop-the-superconducting-magnets-of-the-future/]

    One of the first challenges faced by the LIU team members was to define the beam-performance limitations of all the accelerators in the injector chain and identify the actions needed to overcome them by the required amount. Significant machine and simulation studies were carried out over a period of years, while functional and engineering specifications were prepared to provide clear guidelines to the equipment groups. This was followed by the production of the first hardware prototype devices and their installation in the machines for testing and, where possible, early exploitation.

    Significant progress has already been made concerning the production of ion beams. Thanks to the modifications in Linac 3 and LEIR implemented after 2015 and the intensive machine studies conducted within the LIU programme over the last three years, the excellent performance of the ion injector chain could be further improved in 2016 (figure 1). This enabled the recorded luminosity for the 2016 proton–lead run to exceed the target value by a factor of almost eight. The main remaining challenges for the ion beams will be to more than double the number of bunches in the LHC through complex RF manipulations in the SPS known as “momentum slip stacking”, as well as to guarantee continued and stable performance of the ion injector chain without constant expert monitoring.
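
    Momentum slip stacking relies on two RF systems running at slightly different frequencies: one batch slips azimuthally past the other until the two bunch trains interleave and can be recaptured by a single RF system. A minimal sketch of the timing involved is below; the frequency offset is an assumed, illustrative value, not the actual SPS parameter.

    ```python
    def slip_time_seconds(buckets_to_slip, delta_f_hz):
        """Time for one batch to slip a given number of RF buckets relative to
        the other when the two RF systems differ in frequency by delta_f_hz
        (the relative slip rate is delta_f buckets per second)."""
        return buckets_to_slip / delta_f_hz

    # Illustrative numbers only: slipping by 2 buckets with an assumed 1 kHz
    # offset between the two RF systems takes a few milliseconds.
    print(f"slip time: {slip_time_seconds(2, 1.0e3) * 1e3:.1f} ms")
    ```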

    Along the proton injector chain, the higher-intensity beams within a comparatively small beam envelope required by the HL-LHC can only be demonstrated after the installation of all the LIU equipment during Long Shutdown 2 (LS2) in 2019–2020. The main installations are: a new injection region, a new main power supply and a new RF system in the PSB; a new injection region and an RF system to stabilise the future beams in the PS; and, in the SPS, an upgraded main RF system and the shielding of vacuum flanges together with partial coating of the beam chambers to stabilise future beams against parasitic electromagnetic interaction and electron clouds. Beam instrumentation, protection devices and beam dumps also need to be upgraded in all the machines to match the new beam parameters. The baseline goals of the LIU project to meet the challenging HL-LHC requirements are summarised in the panel (final page of feature).

    Execution phase

    Having defined, designed and endorsed all of the baseline items during the last seven years, the LIU project is presently in its execution phase. New hardware is being produced, installed and tested in the different machines. Civil-engineering work is proceeding for the buildings that will host the new PSB main power supply and the upgraded SPS RF equipment, and to prepare the area in which the new SPS internal beam dump will be located.

    The 86 m-long Linac 4, which will eventually replace Linac 2, is an essential component of the HL-LHC upgrade.

    7
    CERN Linac 4

    The machine, based on newly developed technology, became operational at the end of 2016 following the successful completion of acceleration tests at its nominal energy of 160 MeV. It is presently undergoing an important reliability run that will be instrumental in reaching beams with characteristics matching the requirements of the LIU project and in achieving an operational availability higher than 95%, an essential level for the first link in the proton injector chain. On 26 October 2016, the first 160 MeV negative hydrogen-ion beam was successfully sent to the injection test stand, which operated until the beginning of April 2017 and demonstrated the correct functioning of this new and critical CERN injection system as well as of the related diagnostics and controls.

    Most of the equipment needed for the injection of negative hydrogen ions from Linac 4 into the PSB has now been completed, and the 2 GeV energy upgrade of the PSB rings and extraction is progressing, with a planned installation date of 2019–2020 during LS2. On the beam-physics side, studies have mainly focused on the deployment of the new wideband RF system, commissioning of beam diagnostics and investigation of space-charge effects. During the 2016–2017 technical stop, the principal LIU-related activities were the removal of a large volume of obsolete cables and the installation of new beam instrumentation (e.g. a prototype transverse size measurement device and turn-by-turn orbit measurement systems). The unused cables, which had been individually identified and labelled beforehand, could be safely removed from the machine to allow cables for the new LIU equipment to be pulled.

    The procurement, construction, installation and testing of upgrade items for the PS are also progressing. Some hardware, such as new corrector magnets and power supplies, a newly developed beam gas-ionisation monitor and new injection vacuum chambers to remove aperture limitations, was already installed during past technical stops. Mitigating anticipated longitudinal beam instabilities in the PS is essential for achieving the LIU baseline beam parameters. This requires reducing the parasitic electromagnetic interaction of the beam with the multiple RF systems and deploying a new feedback system to keep the beam stable. Beam-dynamics studies will determine the present intensity reach of the PS and identify any remaining needs to comfortably achieve the value required for the HL-LHC. Improved schemes of bunch rotation are also under investigation to better match the beam extracted from the PS to the SPS RF system and thus limit the beam losses at injection energy in the SPS.

    In the SPS, the LIU deployment in the tunnel has begun in earnest, with the re-arrangement and improvement of the extraction kicker system, the start of civil engineering for the new beam-dump system in LSS5 and the shielding of vacuum flanges in 10 half-cells together with the amorphous carbon coating of the adjacent beam chambers (to mitigate electron-cloud effects). In a notable first, eight dipole and 10 focusing quadrupole magnet chambers were coated in situ with amorphous carbon during the 2016–2017 technical stop, proving the industrialisation of this process (figure 2). The new overground RF building needed to accommodate the power amplifiers of the upgraded main RF system has been completed, while procurement and testing of the solid-state amplifiers has also commenced. The prototyping and engineering for the LIU beam dump is in progress with the construction and installation of a new SPS beam-dump block, which will be able to cope with the higher beam intensities of the HL-LHC and minimise radiation issues.

    Regarding diagnostics, the development of beam-size measurement devices based on flying wire, gas ionisation and synchrotron radiation, all of which are part of the LIU programme, is already providing meaningful results (figure 3), addressing the challenges of measuring the high-intensity, high-brightness operational beams with high precision. On the machine-performance and beam-dynamics side, measurements in 2015–2016 made with the very high intensities available from the PS meant that new regimes were probed in terms of electron-cloud instabilities, RF power and losses at injection. More studies are planned in 2017–2018 to clearly identify a path for the mitigation of the injection losses when operating with higher beam currents.

    Looking forward to LS2

    The success of LIU in delivering beams with the desired parameters is the key to achieving the HL-LHC luminosity target. Without the LIU beams, all of the other necessary HL-LHC developments – including high-field triplet magnets [see above], crab cavities and new collimators – would only allow a fraction of the desired luminosity to be delivered to experiments.

    Whenever possible, LIU installation work is taking place during CERN’s regular year-end technical stops. But the great majority of the upgrade requires an extended machine stop and therefore will have to wait until LS2 for implementation. The duration of access to the different accelerators during LS2 is being defined and careful preparation is ongoing to manage the work on site, ensure safety and level the available resources among the different machines in the CERN accelerator complex. After all of the LIU upgrades are in place, beams will be commissioned with the newly installed systems. The LIU goals in terms of beam characteristics are, by definition, uncharted territory. Reaching them will require not only a high level of expertise, but also careful optimisation and extensive beam-physics and machine-development studies in all of CERN’s accelerators.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition
    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS
    CERN ATLAS New

    ALICE
    CERN ALICE New

    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    LHC

    CERN LHC Map
    CERN LHC Grand Tunnel

    CERN LHC particles

     