Tagged: Particle Accelerators

  • richardmitnick 1:16 pm on October 20, 2017
    Tags: Particle Accelerators, Scientists make rare achievement in study of antimatter

    From Symmetry: “Scientists make rare achievement in study of antimatter” 


    Symmetry

    10/19/17
    Kimber Price

    Maximilien Brice, Julien Marius Ordan, CERN

    Through hard work, ingenuity and a little cooperation from nature, scientists on the BASE experiment vastly improved their measurement of a property of protons and antiprotons.

    BASE: Baryon Antibaryon Symmetry Experiment. Maximilien Brice

    Scientists at CERN are celebrating a recent, rare achievement in precision physics: Collaborators on the BASE experiment measured a property of antimatter 350 times as precisely as it had ever been measured before.

    The BASE experiment looks for undiscovered differences between protons and their antimatter counterparts, antiprotons. The result, published in the journal Nature, uncovered no such difference, but BASE scientists say they are hopeful that this leap in measurement precision has brought them closer to a discovery.

    “According to our understanding of the Standard Model [of particle physics], the Big Bang should have created exactly the same amount of matter and antimatter, but [for the most part] only matter remains,” says BASE Spokesperson Stefan Ulmer.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    This is strange because when matter and antimatter meet, they annihilate one another. Scientists want to know how matter came to dominate our universe.

    “One strategy to try to get hints to understand the mechanisms behind this matter-antimatter symmetry is to compare the fundamental properties of matter and antimatter particles with ultra-high precision,” Ulmer says.

    Scientists on the BASE experiment study a property called the magnetic moment. The magnetic moment is an intrinsic value of particles such as protons and antiprotons that determines how they will orient in a magnetic field, like a compass. Protons and antiprotons should behave exactly the same, other than their charge and direction of orientation; any differences in how they respond to the laws of physics could help explain why our universe is made mostly of matter.

    This is a challenging measurement to make with a proton. Measuring the magnetic moment of an antiproton is an even bigger task. To prevent antiprotons from coming into contact with matter and annihilating, scientists need to house them in special electromagnetic traps.

    While antiprotons generally last less than a second, the ones used in this study were placed in a unique reservoir trap in 2015 and used one by one, as needed, for experiments. The trapped antimatter survived for more than 400 days.

    During the last year, Ulmer and his team worked to improve the precision of the most sophisticated techniques developed for this measurement over the last decade.

    They did this by improving their cooling methods. Antiprotons at temperatures close to absolute zero move less than room-temperature ones, making them easier to measure.

    Previously, BASE scientists had cooled each individual antiproton before measuring it and moving on to the next. With the improved trap, the antiprotons stayed cool long enough for the scientists to swap an antiproton for a new one as soon as it became too hot.

    “Developing an instrument stable enough to keep the antiproton close to absolute zero for 4-5 days was the major goal,” says Christian Smorra, the first author of the study.

    This allowed them to collect data more rapidly than ever before. Combining this instrument with a new technique that measures two particles simultaneously allowed them to break their own record from last year’s measurement by a long shot.

    “It is very rare in precision physics for experimental efforts to report improvements by factors greater than 100,” Ulmer says.

    The results confirm that the two particles behave exactly the same, as the laws of physics would predict. So the mystery of the imbalance between matter and antimatter remains.

    Ulmer says that the group will continue to improve the precision of their work. He says that, in five to 10 years, they should be able to make a measurement at least twice as precise as this latest one. It could be within this range that they will be able to detect subtle differences between protons and antiprotons.

    “Antimatter is a very unique probe,” Ulmer says. “It kind of watches the universe through very different glasses than any matter experiments. With antimatter research, we may be the only ones to uncover physics treasures that would help explain why we don’t have antimatter anymore.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 2:17 pm on October 17, 2017
    Tags: HBOOK and PAW and ROOT and GEANT, Particle Accelerators, René Brun

    From ALICE at CERN: “40 Years of Large Scale Data Analysis in HEP: Interview with René Brun” 


    16 October 2017
    Virginia Greco

    Over a 40-year career at CERN, René Brun developed a number of software packages that became widely used in High Energy Physics. For these fundamental contributions he was recently awarded a special prize of the EPS High Energy Particle Physics Division. We talked with him about the key events of this (hi)story.

    René Brun giving a seminar at CERN (on October 4, 2017) about “40 Years of Large Scale Data Analysis in HEP – the HBOOK, Paw and Root Story”. [Credit: Virginia Greco]

    It is hard to imagine that one and the same person can be behind many of the most important and most widely used software packages developed at CERN and in high-energy physics: HBOOK, PAW, ROOT and GEANT. This passionate and visionary person is René Brun, now an honorary member of CERN, who was recently awarded a special prize of the EPS High Energy Particle Physics Division “for his outstanding and original contributions to the software tools for data management, detector simulation, and analysis that have shaped particle and high energy physics experiments for many decades”. Over a 40-year career at CERN, he worked with various brilliant scientists, and we cannot forget that the realization of such endeavors is always the product of a collaborative effort. Nevertheless, René has had the undoubted merit of conceiving new ideas, proposing projects and working hard and enthusiastically to transform them into reality.

    One of his creations, ROOT, is a data analysis tool widely used in high-energy and nuclear physics experiments, at CERN and in other laboratories. It has already spread beyond the limits of physics and is now being applied in other scientific fields and even in finance. GEANT, also developed by René Brun, is an extremely successful software package that allows physicists to simulate experiments and particle interactions in detectors. Its latest version, GEANT4, is currently the first choice of particle physicists dealing with detector simulations.

    But before ROOT and GEANT4, which are well known even among the youngest physicists, many other projects had been proposed and many software tools developed. It is a fascinating story, which René was invited to tell in a recent colloquium organized at CERN by the EP department.

    As he recounts, it all started in 1973, when he was hired in the Data Handling (DD) division at CERN to work with Carlo Rubbia on the R602 experiment at the ISR. His duty was to help develop a special hardware processor for the online reconstruction of the collision patterns. But since this development was moving slowly and was not occupying much of his work time, René was asked to write some software for event reconstruction in multiwire proportional chambers. “At that time, I hated software,” René confesses, smiling. “I had written software during my PhD thesis, while studying in Clermont-Ferrand and working at CERN during the weekends, and I hadn’t really enjoyed it. I had joined Rubbia’s group with the ‘promise’ that I would work on hardware, but very quickly I became a software guy again…”

    In a short time, René implemented in software (programming in FORTRAN IV) what they could not realize in hardware and, in addition, he developed a histogram package called HBOOK. This made possible a very basic analysis of the data: creating histograms, filling them and sending the output to a line printer. He also wrote a program called HPLOT, specialized in drawing the histograms generated by HBOOK.

    At that time, there were no graphic devices, so the only way to visualize histograms was printing them using a line printer, and programs were written in the form of punched cards.

    René remembers with affection the time spent punching cards, not for the procedure itself, which was slow and quite tedious, but for the long chats he used to have in the room where the card punchers and printers of the DD department were sitting, as well as in the cafeteria nearby. In those long hours, he could discuss ideas and comment on new technologies with colleagues.

    A huge step forward was made possible by the introduction of the teletype, which replaced card punchers. Users could generate programs on a disk file and communicate with a central machine, called FOCUS, while – at the same time – seeing on a roll of paper what they were doing, as on a normal typewriter. “The way it worked can make people smile today,” René recounts. “To log in to FOCUS, one had to type a command which caused a red light to flash in the computer centre. Seeing the light, the operator would load the tape of the connected user into the memory of the machine, and the user could thus run a session on the disk. When the user logged out, the session was again dumped to tape. You can imagine the traffic! But this was still much faster than punching cards.”

    Some time later, the teletype was in turn replaced by a Tektronix 4010 terminal, which brought a big revolution, since it made it possible to display results in graphic form. This new, very expensive device allowed René to speed up the development of his software: HBOOK first, then another package called ZBOOK and the first version of GEANT. Created in 1974 with his colleagues in the Electronic Experiments (EE) group, GEANT1 was a tool for performing simple detector simulations. Gradually, they added features to this software and were able to generate collision simulations: GEANT2 was born.

    In 1975 René joined the NA4 experiment, a deep inelastic muon scattering experiment in the North Area led by Carlo Rubbia. There he collaborated on the development of new graphic tools that allowed histograms to be printed on a device called a CalComp plotter. This machine, which worked with a 10-meter-long roll of paper, offered much better resolution than line printers, but was very expensive. In 1979 a microfilm system was introduced: histograms saved on film could be inspected before being sent to the plotter, so that only the interesting ones were printed. This reduced the expense of using the CalComp.

    René was then supposed to follow Rubbia in the UA1 experiment, for which he had been doing many simulations – “Without knowing that I was simulating for UA1,” René highlights. But instead, at the end of 1980, he joined the OPAL experiment, where he performed all the simulations and created GEANT3.

    While working on the HBOOK system, in 1974 René had developed a memory management and I/O system called ZBOOK. This tool was an alternative to the HYDRA system, which was being developed in the bubble chambers group by the late Julius Zoll (also author of another management system called Patchy).

    Thinking that it was pointless to have two competing systems, in 1981 the late Emilio Pagiola proposed the development of a new software package called GEM. While three people were working hard on the GEM project, René and Julius started running benchmarks to compare their systems, ZBOOK and HYDRA, with GEM. Through these tests, they came to the conclusion that the new system was far slower than theirs.

    In 1983 Ian Butterworth, the then Director for Computing, decided that only the ZBOOK system would be supported at CERN, that GEM had to be stopped, and that HYDRA was to be frozen. “My group leader, Hans Grote, came to my office, shook my hand and told me: ‘Congratulations René, you won.’ But I immediately thought that this decision was not fair, because actually both systems had good features and Julius Zoll was a great software developer.”

    As a consequence of this decision, René and Julius joined forces to develop a package integrating the best features of both ZBOOK and HYDRA. The new project was called ZEBRA, a combination of the names of the two original systems. “When Julius and I announced that we were collaborating, Ian Butterworth immediately called us both to his office and told us that, if in six months the ZEBRA system was not functioning, we would be fired from CERN. But in fact, less than two months later we were already able to show a running first version of the ZEBRA system.”

    At the same time, histogram and visualization tools were under development. René put together an interactive version of HBOOK and HPLOT, called HTV, which ran on Tektronix machines. But in 1982 the advent of personal workstations marked a revolution. The first personal workstation introduced in Europe, the Apollo, represented a leap in characteristics and performance: it was faster and had more memory and a better user interface than any previous device. “I was invited by the Apollo company to go to Boston and visit them,” René recounts. “When I first saw the Apollo workstation, I was shocked. I immediately realized that it could speed up our development by a factor of 10. I set to work and I think that in just three days I adapted some 20,000 lines of code for it.”

    The work René did adapting HTV for the Apollo workstation attracted the interest of the late Rudy Böck, Luc Pape and Jean-Pierre Revol from the UA1 collaboration, who also suggested some improvements. In 1984 they elaborated a proposal for a new package, based on HBOOK and ZEBRA, that they called PAW, for Physics Analysis Workstation.

    The PAW team: (from the left) René Brun, Pietro Zanarini, Olivier Couet (standing) and Carlo Vandoni.

    After a first period of uncertainties, the PAW project developed quickly and many new features were introduced, thanks also to the increasing memory space of the workstations. “At a certain point, the PAW software was growing so fast that we started to receive complaints from users who could not keep up with the development,” says René smiling. “Maybe we were a bit naïve, but certainly full of enthusiasm.”

    The programming language generally used for scientific computing was FORTRAN. In particular, at that time FORTRAN 77 (introduced in 1977) was widespread in the high-energy physics community and the main reason for its success was the fact that it was well structured and quite easy to learn. Besides, very efficient implementations of it were available on all the machines used at the time. As a consequence, when the new FORTRAN 90 appeared, it seemed obvious that it would replace FORTRAN 77 and that it would be as successful as the previous version. “I remember well the leader of the computing division, Paolo Zanella, saying: ‘I don’t know what the next programming language will do but I know its name: FORTRAN.’”

    In 1990 and 1991 René, together with Mike Metcalf, a great expert on FORTRAN, worked hard to adapt the ZEBRA package to FORTRAN 90. But this effort did not lead to a satisfactory result, and discussions arose about whether to keep working with FORTRAN or to move to another language. It was the period when object-oriented programming was taking its first steps, and also when Tim Berners-Lee joined René’s group.

    Berners-Lee was supposed to develop a documentation system, called XFIND, to replace the previous FIND, which could run only on IBM machines, so that it would be usable on other devices. He believed, though, that the procedure he was supposed to implement was a bit clumsy and certainly not the best approach to the problem. So he proposed a different solution with a more decentralized and adaptable approach, which required first of all a work of standardization. In this context, Berners-Lee developed the by-now-very-famous idea of World Wide Web servers and clients, implemented using an object-oriented language (Objective-C).

    It was a very intense period, because the design and simulation phase of the experiments for the new LHC accelerator had been launched. It was important to make a decision about the programming language and the software tools to use in these new projects.

    At the ERICE workshop, organized by INFN in November 1990, and then at the Computing in High Energy Physics (CHEP) conference in Annecy (France) in September 1992, the high-energy physics “software gurus” of the world gathered to discuss programming languages and possible directions for software in HEP. Among the many languages proposed were Eiffel, Prolog, Modula-2 and others.

    In 1994 two Research and Development (RD) projects were launched: RD44, with the objective of implementing in C++ a new version of GEANT (which would become GEANT4), and RD45, aiming to investigate object-oriented database solutions for the LHC experiments.

    According to René, his division was split into three camps: those who wanted to stay with FORTRAN 90, those who bet on C++, and those who were interested in using commercial products. “I presented a proposal to develop a package that would take PAW to the OO world. But the project, which I called ZOO, was rejected, and I was even invited to take a sabbatical leave,” René admits.

    This blow, though, later proved to be a stroke of luck for René. His division leader, David Williams, suggested that he join the NA49 experiment in the North Area, which needed somebody to help develop its software. At first he refused. He had been leading both the GEANT and PAW projects for years, doing simulations and developing software for different groups and applications, so going back to work on a single experiment seemed to him a big limitation.

    But on second thought he realized that it was an opportunity to take some time to develop new software, with total freedom. He went to visit the NA49 building on the Prevessin site and, seeing pine trees and squirrels from the windows, he felt that it was indeed the kind of quiet environment he needed for his new project. So he moved his workstation from his office to the Prevessin site (“I did it during a weekend, without even telling David Williams”) and, while working for NA49, taught himself C++ by converting a large part of his HBOOK software into this new OO language.

    At the beginning of 1995, René was joined in NA49 by Fons Rademakers, with whom he had already collaborated. The two of them worked very hard for several months and produced the first version of what became the famous ROOT system. The name comes simply from the combination of the starting letters of the email addresses of the two founders (René and Rdm, for Rademakers), the double O of Object Oriented and the word Technology. But the meaning of the word ‘root’ also fitted well with its being a basic framework for more software to be developed and with the use of tree structures in its architecture.

    In November of the same year, René gave a seminar to present the ROOT system. “The Computing Division auditorium was unexpectedly crowded!” René recalls, “I think it was because people thought that Fons and I had disappeared from the software arena, while all of a sudden we were back again!” And actually the ROOT system generated considerable interest.

    But while René and Fons were completely absorbed in the work on their new software package, the RD45 project, which had the mandate to decide what new software would be adopted by the new LHC experiments, had proposed using the commercial product “Objectivity”, and a lot of work was ongoing to develop applications to meet HEP needs. According to René, there was a clear intention to obstruct the development and diffusion of ROOT. In spring 1996 the CERN director for computing, Lorenzo Foa, declared that the ROOT project was considered a private initiative of NA49, not supported by the CERN management, and that the official line of development was the one around Objectivity.

    “I think that the LHC Computing Board didn’t have the right insight into the architecture of these software tools to be able to judge which solution was the best. Thus, they had to trust what they were told,” René comments. “It is always a problem when there is such a divide between the experts – and users – working on something and the people who are to take important decisions.”

    Nevertheless, René and Fons continued developing ROOT and implementing new features, taking advantage of the lessons learnt with the previous software packages (in particular the requests and criticisms of the users). In addition, they followed closely the development of the official line with Objectivity, in order to know what people using it were looking for and what the problems or difficulties were. “The more we looked into Objectivity, the more we realized it could not meet the needs of our community,” René adds, “we knew that the system would fail and that eventually people would realize it. This gave us even more energy and motivation to work hard and improve our product.”

    They had continuous support from the NA49 and ALICE collaborations, as well as from many people in ATLAS and CMS, who saw great potential in their software package. At the time, René was collaborating with many people in both experiments, including Fabiola Gianotti and Daniel Froidevaux, in particular on detector simulations. Besides, many users trusted them because of the relationship built over many years through the user support of PAW and GEANT.

    Things started to change when interest in ROOT rose outside CERN. In 1998, the two Fermilab experiments, CDF and D0, decided to discuss the future of their software approach in view of the upcoming Run II of the Tevatron. Hence, they opened two calls for proposals of software solutions, one for data storage and one for data analysis and visualization. René submitted ROOT to both calls. The proposals were discussed during the CHEP conference in Chicago, and on the last day it was publicly announced that CDF and D0 would adopt ROOT. “I was not expecting it,” says René. “I remember that when the announcement was made, everybody turned and looked at me.” Soon after, the RHIC experiments at Brookhaven National Laboratory took the same decision. The BaBar experiment at SLAC, after years spent attempting to use Objectivity, had realized that it was not as good a system as expected, and moved to ROOT as well.

    Gradually, it became clear that the HEP community was ‘naturally’ moving towards ROOT, so the CERN management had to accept this situation and, eventually, support it. But this happened only in 2002. With more manpower allocated to the project, ROOT continued to develop fast and the number of users increased dramatically. It also started to spread to other branches of science and into the financial world. “In 2010, we had on average 12,000 downloads of the software package per month, and the ROOT website had more visitors than the CERN one.”

    The logo of the ROOT software package.

    René retired in 2012, but his two most important brainchildren, ROOT and GEANT, keep growing thanks to the work of many young scientists. “I think that it is essential to have a continuous stimulus that pushes you to improve your products and come up with new solutions. For this, the contribution of young people is very important,” comments René. But, as he admits, what really made him and his colleagues work hard for so many years is the fact that the software packages they were developing always had competitors and, in many cases, were challenged and even obstructed. “When you are opposed, but you know you are right, you are condemned to succeed.”

    Great attention to users’ needs has also been very important, because it helped to shape the software and build a relationship of trust. “I have always said that you have to give user support the highest priority,” René explains. “If you reply to a request in 10 minutes you get 10 points, in one hour you get 2 points, and in one day you are already at -10 points. Answering questions and comments is fundamental, because if the users are satisfied with the support you give them, they are willing to trust what you propose next.”

    Now that he is retired, René still follows software development at CERN, but only as an external observer. This does not mean that he has set aside his scientific interests; on the contrary, he is now dedicating most of his energy to a more theoretical project: he is developing a physics model. In his spare time, he likes gardening. He loves flowers, but he cannot help looking at them with a scientific eye: “A colleague of mine, who is a mathematician, and I developed a mathematical model of the way flowers are structured and grow.”

    Brilliant minds are always at work.

    See the full article here.


     
  • richardmitnick 1:59 pm on October 17, 2017
    Tags: Eliane Epple, Particle Accelerators

    From ALICE at CERN: Women in STEM – “Focus on Eliane Epple” 


    16 October 2017
    Virginia Greco

    Eliane Epple
    A postdoctoral researcher at Yale University, Eliane is working on an analysis involving hard scattering events that produce direct photons and has recently done her first shift as Run Manager for ALICE.

    When she started studying physics in Stuttgart, her hometown, Eliane Epple was already passionate about particle physics. But since it was not possible to specialize in this field at her university, after two years she moved to Munich and attended the Technical University of Munich (TUM). There she followed courses for two more years before joining a research project led by Prof. Laura Fabbietti, who had just received a big grant and was starting her research group. The subject of Eliane’s Diploma thesis was the study of the interactions of kaons – and other particles containing strange quarks – with nuclear matter (protons and neutrons). In more detail, for her Diploma she analyzed the decay products of a resonance called Λ(1405), which some theories treat as a molecular bound state of an anti-kaon and a nucleon. It is, in this sense, a precursor of the kaonic nuclear clusters that she later studied during her PhD, still working with Prof. Fabbietti.

    In particular, Eliane and colleagues were investigating the possible existence of anti-kaonic bound states formed by, for example, two nucleons and one anti-kaon. Besides Fabbietti’s team, other groups all over the world were working on this topic, since a number of theoretical physicists had hypothesized that the attraction between nucleons and anti-kaons should be strong enough to give rise to such a bound state, at least for a short time. “I analyzed data from the High Acceptance Di-Electron Spectrometer (HADES) at GSI.

    GSI HADES

    In particular, I looked for particles produced in p+p collisions that could originate from the decay of this anti-kaon-nucleon bound state,” explains Eliane. “It was a very controversial topic at the time, because there were groups that, analyzing a certain set of data, could see a signal compatible with the detection of such a bound state, while others couldn’t. I didn’t find any signal proving this hypothesis, but at the same time my results set an upper limit for the existence of this bound state at a beam energy of 3.5 GeV.”

    “In order to set a limit,” Eliane continues, “you compare the result of your data analysis with the outcome of a simulation performed under the hypothesis that the signal you looked for but didn’t see actually exists. In other words, you develop a model for this case and study how much signal you can introduce while remaining consistent with your data. You add more and more signal strength to your model in small steps, until you reach a threshold: beyond it, the model no longer fits the data. This threshold is the upper limit.”
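    The signal-injection procedure Eliane describes can be sketched in a few lines. The toy example below is only an illustration of the idea, not HADES analysis code: all the numbers, the background shape, the Gaussian signal shape and the mass window are invented. It injects an ever larger hypothetical signal into a smooth background and stops when a simple chi-square comparison says the model no longer fits the pseudo-data.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy "invariant mass" spectrum: smooth background only (no real signal).
    bins = np.linspace(2.2, 2.6, 41)            # hypothetical mass window, GeV
    centers = 0.5 * (bins[:-1] + bins[1:])
    background = 5000 * np.exp(-3.0 * (centers - 2.2))
    data = rng.poisson(background)              # pseudo-data drawn from background

    def gaussian_peak(strength, mean=2.38, width=0.03):
        """Hypothetical bound-state signal shape with a given total yield."""
        shape = np.exp(-0.5 * ((centers - mean) / width) ** 2)
        return strength * shape / shape.sum()

    def chi2(model):
        return np.sum((data - model) ** 2 / np.maximum(model, 1.0))

    # Inject more and more signal until the model no longer fits the data.
    base = chi2(background)
    limit = None
    for strength in np.arange(0, 2000, 10):
        if chi2(background + gaussian_peak(strength)) - base > 3.84:  # ~95% CL, 1 dof
            limit = strength
            break

    print(f"95% CL upper limit on signal yield: {limit} counts")
    ```

    Real analyses use likelihood-based methods and full detector simulation, but the logic is the same: the limit is the largest signal strength still compatible with the observed data.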

    She also combined her results with data from other experiments and showed that it was very unlikely that the signal seen by some other groups could be due to an anti-kaon-nucleon bound state. “Actually, I think that this signal exists because there are many compelling reasons from our theory colleagues, but it is very challenging to see, first of all because the production cross section of this state is probably very small, which means that it occurs rarely, so we need to take a lot of data. In addition, it might be a very broad state, so we are not going to find a narrow peak. As a consequence, understanding the background well is essential.”

    When she completed her PhD in 2014, she decided to change field. “In that situation, you have two possible choices,” explains Eliane. “Either you stay on the same topic and become an expert in a very specific field, or you change and broaden your horizons. In this second case, you do not become a specialist in one topic but rather increase your ‘portfolio’. I preferred to go for this second option and do something completely new. This way is much harder, because you basically start from the beginning, but I think it benefits a researcher in the long term to look at a field, in this case QCD, from many perspectives. I thus also encourage young researchers to give low-energy QCD research a chance and see what people do beyond the TeV scale.” Therefore, she joined the research group led by John Harris and Helen Caines at Yale University, in New Haven (US), where she has been working for two and a half years now, and entered the ALICE collaboration.

    Her present research activities focus on hard probes in high-energy collisions. “The proton is a very fascinating object; there is a lot going on in it,” Eliane comments. “When you scatter two protons at low energy (an energy range in which I previously worked), you see how the ‘entire’ proton behaves; you are not able to resolve its internal structure. On the contrary, at the high energies of the LHC, when you collide two protons you start seeing what happens inside: you can observe how partons collide with each other.”

    In these hard scattering events, particles with a high transverse momentum are present in the final state. Eliane is analyzing Pb-Pb events in which a parton and a photon (a gamma) are produced. Photons do not interact with strongly-interacting matter; hence, when the Quark Gluon Plasma (QGP) is created in ALICE by smashing lead nuclei together, a photon produced in the collision can traverse this medium and emerge unaffected. In the opposite direction, a parton moves away from the collision vertex and fragments into a particle shower. The sum of the momenta of the particles in this shower has to balance the momentum of the photon (combining these fragments with the gamma on the other side is called gamma-hadron correlation), and altogether they carry the total momentum of the mother parton.

    The objective of this research is to measure the fragmentation function, which describes the correlation between the momentum of the mother parton and that of each particle in the shower. Normally, most of the daughter particles carry a small fraction (less than 20%) of the momentum of the mother, whereas very few of them carry a high fraction of it. “By studying the behaviour of the particle shower in Pb-Pb collisions, in comparison with pp and p-Pb collisions, we can understand how the QGP medium modifies it,” explains Eliane. “We may have, for example, fewer of these very high momentum fragments and therefore more of the low momentum ones, or the shower might be broader. This study gives information about the properties of the medium that is created.”
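    To make the shape of a fragmentation function concrete, here is a deliberately crude toy Monte Carlo (an illustration only; the splitting rule, momenta and thresholds are invented, and this is in no way the ALICE analysis): a parton’s momentum is peeled off into fragments, and the momentum fractions z carried by the fragments are tallied.

```python
import random

def toy_shower(parton_p, p_min=0.5):
    """Crude stand-in for fragmentation: peel a uniformly random share of
    the remaining momentum off as a new 'hadron' until little is left.
    Momentum is conserved by construction."""
    fragments, remaining = [], parton_p
    while remaining > p_min:
        frag = random.uniform(0.0, 1.0) * remaining
        fragments.append(frag)
        remaining -= frag
    fragments.append(remaining)  # final soft fragment
    return fragments

random.seed(1)
parton_p = 50.0  # GeV, hypothetical mother-parton momentum
zs = [f / parton_p
      for _ in range(10_000)
      for f in toy_shower(parton_p)]

soft = sum(z < 0.2 for z in zs) / len(zs)
hard = sum(z > 0.5 for z in zs) / len(zs)
print(f"fragments with z < 0.2: {soft:.0%}")  # the large majority
print(f"fragments with z > 0.5: {hard:.0%}")  # a small minority
```

    Even this crude rule reproduces the qualitative statement in the text: most daughters carry a small momentum fraction and very few a large one. Medium modifications would show up as a change in the shape of this z distribution.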

    Measurements of gamma-hadron correlations performed at PHENIX (at RHIC, Brookhaven National Laboratory) at 200 GeV show that in gold-gold collisions the fragmentation function changes, giving fewer particles with high momentum fractions and many more particles with low momentum fractions. ALICE is investigating what happens at higher energies.

    Eliane is now working in collaboration with a graduate student at her institute and other colleagues in Berkeley. “We are performing a very complex analysis. In our events, we have to identify gammas on one side and the hadron showers on the other. But gammas can also be decay products of other particles, such as pions and other mesons. Thus, it is important to reject this background signal and take into consideration only events in which the gamma is produced at the primary vertex. This is not easy and requires a number of successive steps.”

    Eliane will continue working at Yale for some time. Then, she will either look for another post-doctoral position in ALICE or will directly apply for some grants, most likely in Europe. “There are various opportunities in Germany to get research funding to start your own research group.”

    Even though she likes her present analysis topic, in the future she might move to something more fundamental: the substructure and dynamics of the proton. “The proton is a very complex and fascinating object in its own right; we still do not know much about its internal dynamics,” she highlights. In any case, the most important thing for her is to settle on a research topic that will give her deeper insight into the properties of QCD, something she is very intrigued by.

    In addition to doing data analysis, Eliane coordinates the activities of the EMCal calibration group and the EMCal photon object group and has lately been Run Manager for data taking. With so much work and a four-year-old daughter, there is not much time left. Nevertheless, when she can, she takes modern dance classes to de-stress and relax.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Meet CERN in a variety of places:


    Cern Courier

    THE FOUR MAJOR PROJECT COLLABORATIONS
    ATLAS
    CERN/ATLAS detector

    ALICE
    CERN ALICE New

    CMS
    CERN/CMS Detector

    LHCb

    CERN/LHCb

    LHC

    CERN/LHC Map
    CERN LHC Grand Tunnel

    CERN LHC particles


    Quantum Diaries

     
  • richardmitnick 4:16 pm on October 13, 2017 Permalink | Reply
    Tags: CLEAR, Particle Accelerators

    From CERN Courier: “CLEAR prospects for accelerator research” 


    CERN Courier

    Oct 13, 2017

    CLEAR’s plasma-lens experiment will test ways to drive strong currents through a plasma for particle-beam transverse focusing.
    Image credit: M Volpi.

    A new user facility for accelerator R&D, the CERN Linear Electron Accelerator for Research (CLEAR), started operation in August and is ready to provide beam for experiments. CLEAR evolved from the former CTF3 test facility for the Compact Linear Collider (CLIC), which ended a successful programme in December 2016. Following approval of the CLEAR proposal, the necessary hardware modifications started in January and the facility is now able to host and test a broad range of ideas in the accelerator field.

    CLEAR’s primary goal is to enhance and complement the existing accelerator R&D programme at CERN, as well as offering a training infrastructure for future accelerator physicists and engineers. The focus is on general accelerator R&D and component studies for existing and possible future accelerator applications. This includes studies of high-gradient acceleration methods, such as CLIC X-band and plasma technologies, as well as prototyping and validation of accelerator components for the high-luminosity LHC upgrade.

    The scientific programme for 2017 includes: a combined test of critical CLIC technologies, continuing previous tests performed at CTF3; measurements of radiation effects on electronic components to be installed on space missions in a Jovian environment and for dosimetry tests aimed at medical applications; beam instrumentation R&D; and the use of plasma for beam focusing. Further experiments, such as those exploring THz radiation for accelerator applications and direct impedance measurements of equipment to be installed in CERN accelerators, are also planned.

    The experimental programme for 2018 and beyond is still open to new and challenging proposals. An international scientific committee is currently being formed to prioritise proposals, and a user request form is available at the CLEAR website: http://clear.web.cern.ch/.

    See the full article here.

     
  • richardmitnick 9:53 pm on October 12, 2017 Permalink | Reply
    Tags: Particle Accelerators, Xenon is a heavy noble gas that exists in trace quantities in the air, Xenon takes a turn in the LHC

    From Symmetry: “Xenon takes a turn in the LHC” 

    Symmetry

    10/12/17
    Sarah Charley

    For the first time, the Large Hadron Collider is accelerating xenon nuclei for experiments.


    Most of the year, the Large Hadron Collider at CERN collides protons. LHC scientists have also accelerated lead nuclei stripped of their electrons. Today, for just about eight hours, they are experimenting with a different kind of nucleus: xenon.

    Xenon is a heavy noble gas that exists in trace quantities in the air. Xenon nuclei are about 40 percent lighter than lead nuclei, so xenon-xenon collisions have a different geometry and energy distribution than lead-lead collisions.
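    The “about 40 percent lighter” figure can be checked from the mass numbers alone (taking xenon-129 and lead-208, the isotopes typically quoted; an approximation that ignores binding-energy differences):

```python
A_XE, A_PB = 129, 208  # mass numbers of xenon-129 and lead-208
lighter = 1 - A_XE / A_PB
print(f"xenon nuclei are ~{lighter:.0%} lighter than lead nuclei")  # ~38%
```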

    “When two high-energy nuclei collide, they can momentarily form a droplet of quark gluon plasma, the primordial matter that filled our universe just after the big bang,” says Peter Steinberg, a physicist at the US Department of Energy’s Brookhaven National Laboratory and a heavy-ion coordinator for the ATLAS experiment at CERN. “The shape of the colliding nuclei influences the initial shape of this droplet, which in turn influences how the plasma flows and finally shows up in the angles of the particles we measure. We’re hoping that these smaller droplets from xenon-xenon collisions give us deeper insight into how this still-mysterious process works at truly subatomic length scales.”

    Not all particles that travel through CERN’s long chain of interconnected accelerators wind up in the LHC. Earlier this year, scientists were loading xenon ions into the accelerator and firing them at a fixed-target experiment instead.

    “We can have particles from two different sources feeding into CERN’s accelerator complex,” says Michaela Schaumann, a physicist in LHC operation working on the heavy-ion program. “The LHC’s injectors are so flexible that, once everything is set up properly, they can alternate between accelerating protons and accelerating ions a few times a minute.”

    Having the xenon beam already available provided an opportunity to send xenon into the LHC for the first (and potentially only) time. It took some serious additional work to bring the beam quality up to collider levels, Schaumann says, but today it was ready to go.

    “We are keeping the intensities very low in order to fulfil machine protection requirements and to be able to use with xenon beams the same accelerator configuration we apply during proton-proton runs,” Schaumann says. “We needed to adjust the frequency of the accelerator cavities [because the more massive xenon ions circulate more slowly than protons], but many of the other machine settings stayed roughly the same.”
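    Schaumann’s cavity-frequency remark can be turned into a back-of-the-envelope estimate. All numbers below are assumptions for illustration: 6.5 TeV protons, fully stripped xenon-129 (charge 54) at the same magnetic rigidity (so its momentum is 54 times the proton’s), and the LHC’s roughly 400.79 MHz RF system.

```python
import math

F_RF = 400.79e6        # Hz, approximate LHC RF frequency (assumed)
M_P = 0.9383           # GeV, proton mass
M_XE = 129 * 0.9315    # GeV, xenon-129 mass (A times one atomic mass unit, approx.)
Z_XE = 54              # charge of a fully stripped xenon ion

def beta(p, m):
    """Relativistic speed factor v/c for momentum p and mass m (GeV units)."""
    return p / math.hypot(p, m)

p_p = 6500.0         # GeV, proton momentum (assumed)
p_xe = Z_XE * p_p    # same magnetic rigidity -> momentum scales with charge

b_p, b_xe = beta(p_p, M_P), beta(p_xe, M_XE)
df = F_RF * (b_xe - b_p) / b_p  # RF shift needed to stay in sync with the slower ions

print(f"1 - beta(proton) = {1 - b_p:.2e}")
print(f"1 - beta(xenon)  = {1 - b_xe:.2e}")
print(f"required RF shift: about {df:.0f} Hz out of 400.79 MHz")
```

    The shift comes out at a few tens of hertz, a tiny relative change, which is why “many of the other machine settings stayed roughly the same.”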

    This novel run tests scientists’ knowledge of beam physics and shows the flexibility of the LHC. Scientists say they are hopeful it could reveal something new.

    “We can learn a lot about the properties of the hot, dense matter from smaller collision systems,” Steinberg says. “They are a valuable bridge to connect what we observe in lead-lead collisions to strikingly similar observations in proton-proton interactions.”

    The LHC screen during the xenon-ion run. (Image: CERN)

    See the full article here.

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 7:02 am on October 12, 2017 Permalink | Reply
    Tags: Particle Accelerators, The Math That’s Too Difficult for Physics

    From Quanta: “The Math That’s Too Difficult for Physics” 

    Quanta Magazine

    November 18, 2016 [Wow!!]
    Kevin Hartnett

    Christian Gwiozda


    How do physicists reconstruct what really happened in a particle collision? Through calculations that are so challenging that, in some cases, they simply can’t be done. Yet.

    It’s one thing to smash protons together. It’s another to make scientific sense of the debris that’s left behind.

    This is the situation at CERN, the laboratory that houses the Large Hadron Collider, the largest and most powerful particle accelerator in the world.

    LHC

    CERN/LHC Map

    CERN LHC Tunnel

    CERN LHC particles

    In order to understand all the data produced by the collisions there, experimental physicists and theoretical physicists engage in a continual back and forth. Experimentalists come up with increasingly intricate experimental goals, such as measuring the precise properties of the Higgs boson. Ambitious goals tend to require elaborate theoretical calculations, which the theorists are responsible for. The experimental physicists’ “wish list is always too full of many complicated processes,” said Pierpaolo Mastrolia, a theoretical physicist at the University of Padua in Italy. “Therefore we identify some processes that can be computed in a reasonable amount of time.”

    By “processes,” Mastrolia is referring to the chain of events that unfolds after particles collide. For example, a pair of gluons might combine through a series of intermediate steps — particles morphing into other particles — to form a Higgs boson, which then decays into still more particles. In general, physicists prefer to study processes involving larger numbers of particles, since the added complexity assists in searches for physical effects that aren’t described by today’s best theories. But each additional particle requires more math.

    To do this math, physicists use a tool called a Feynman diagram, which is essentially an accounting device that has the look of a stick-figure drawing: Particles are represented by lines that collide at vertices to produce new particles.

    Feynman diagrams depicting possible formations of the Higgs boson. Image credit: scienceblogs.com/astrobites.

    Physicists then take the integral of every possible path an experiment could follow from beginning to end and add those integrals together. As the number of possible paths goes up, the number of integrals that theorists must compute — and the difficulty of calculating each individual integral — rises precipitously.
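    One way to see why the number of terms “rises precipitously”: in the simplest scalar toy theory, the raw bookkeeping counts the perfect pairings (Wick contractions) of 2n field operators, and there are (2n - 1)!! = 1 · 3 · 5 ⋯ (2n - 1) of them. (This counts raw terms before any symmetry factors or physics cuts, so it is only a rough indication of the growth.)

```python
def pairings(n):
    """Number of ways to pair up 2n objects: the double factorial (2n-1)!!"""
    result = 1
    for k in range(1, 2 * n, 2):
        result *= k
    return result

for n in range(1, 9):
    print(f"{2*n:2d} fields -> {pairings(n):>9,} pairings")
# 2 fields give 1 pairing; 16 fields already give 2,027,025.
```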

    When deciding on the kinds of collisions they want to study, physicists have two main choices to make. First, they decide on the number of particles they want to consider in the initial state (coming in) and the final state (going out). In most experiments, it’s two incoming particles and anywhere from one to a dozen outgoing particles (referred to as “legs” of the Feynman diagram). Then they decide on the number of “loops” they’ll take into account. Loops represent all the intermediate collisions that could take place between the initial and final states. Adding more loops increases the precision of the calculation, but it also significantly adds to the burden of computing Feynman diagrams. Generally speaking, there’s a trade-off between loops and legs: if you want to take into account more loops, you need to consider fewer legs; if you want to consider more legs, you’re limited to just a few loops.
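    And a flavour of what evaluating a loop actually involves: the standard first step combines two propagator denominators with a Feynman parameter, 1/(AB) = ∫₀¹ dx / [xA + (1 - x)B]², after which the loop momentum can be integrated. The identity itself is easy to verify numerically (a toy check, not a real loop computation):

```python
def feynman_combine(a, b, n=200_000):
    """Midpoint-rule integral of 1/(x*a + (1-x)*b)**2 over x in [0, 1];
    by the Feynman-parameter identity this equals 1/(a*b)."""
    h = 1.0 / n
    return sum(h / ((i + 0.5) * h * a + (1.0 - (i + 0.5) * h) * b) ** 2
               for i in range(n))

a, b = 2.0, 3.0
print(feynman_combine(a, b))  # ~0.166667
print(1.0 / (a * b))          # 1/6 exactly
```

    In real calculations every extra propagator adds another Feynman parameter, and every extra loop adds another momentum integral on top, which is where the difficulty compounds.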

    “If you go to two loops, the largest number [of legs] going out is two. People are pushing toward three particles going out at two loops — that’s the boundary that’s really beyond the state of the art,” said Gavin Salam, a theoretical physicist at CERN.

    Physicists already have the tools to calculate probabilities for tree-level (zero loop) and one-loop diagrams featuring any number of particles going in and out. But accounting for more loops than that is still a major challenge and could ultimately be a limiting factor in the discoveries that can be achieved at the LHC.

    “Once we discover a particle and want to determine its properties, its spin, mass, angular momentum or couplings with other particles, then higher-order calculations” with loops become necessary, said Mastrolia.

    And that’s why many are excited about the emerging connections between Feynman diagrams and number theory that I describe in the recent article “Strange Numbers Found in Particle Collisions.” If mathematicians and physicists can identify patterns in the values generated from diagrams of two or more loops, their calculations would become much simpler — and experimentalists would have the mathematics they need to study the kinds of collisions they’re most interested in.

    See the full article here.

    Formerly known as Simons Science News, Quanta Magazine is an editorially independent online publication launched by the Simons Foundation to enhance public understanding of science. Why Quanta? Albert Einstein called photons “quanta of light.” Our goal is to “illuminate science.” At Quanta Magazine, scientific accuracy is every bit as important as telling a good story. All of our articles are meticulously researched, reported, edited, copy-edited and fact-checked.

     
  • richardmitnick 8:18 am on October 4, 2017 Permalink | Reply
    Tags: Particle Accelerators

    From ALICE: Women in STEM – “Focus on Ester Casula” 


    18 September 2017 [Just found in social media.]

    Ester Casula

    Ester Anna Rita Casula is a postdoctoral researcher at the Italian National Institute of Nuclear Physics (INFN) of Cagliari – her hometown.


    She served as ALICE Run Manager for two weeks between June and August of this year.

    During her second week on shift, I meet Ester at Point 2, where she spends most of her time monitoring the data taking and making sure everything runs smoothly.

    Sitting with me in the kitchen next to the control room, she talks, smiling and laughing. I can see that she has a very extroverted personality. Besides telling me about her work, she unveils an uncommon passion of hers…

    What’s your background and your career path up to now?

    I studied Physics at the University of Cagliari, in Italy, and I have been a member of the ALICE collaboration since I was working on my Bachelor’s Degree thesis. At that time, we didn’t have data yet, so I used Monte Carlo simulations. Then, for my Master’s Degree thesis and during my PhD, I focused on the analysis of low-mass resonances in the di-muon channel – mainly the φ meson – in pp, Pb-Pb and p-Pb collisions at all of the energies we have taken data at. I started with the data from pp collisions at 7 TeV – for my Master’s thesis – and then continued with the other energies and with p-Pb and Pb-Pb data (in detail: pp at 2.76 and 5 TeV, p-Pb at 5 TeV, Pb-Pb at 2.76 and 5 TeV).

    After completing my PhD in 2014, I started a first postdoc with the University of Cagliari and now I am concluding a second postdoc with the INFN in the same town.

    I am based in Cagliari, but in the last months I have spent most of my time at CERN and, in particular, in the control room, since I have also followed some runs as a shift leader.

    How do you like being the run manager?

    It is an interesting experience: every day you might have to face a different problem. For example, during my shift we were once called by the LHC control room to be informed that ALICE was causing a dump of the beam. Of course, we had to solve the issue very quickly. It happened in the dead of night and I was at home. As soon as I received the call from the shift leader, I got up and went to the control room. Luckily I am staying nearby, in Saint-Genis.

    In situations like this you have to react quickly, try to understand the issue as fast as you can and take decisions. In this specific case, the problem was caused by the threshold of the Beam Condition Monitors (BCM), which are basically protection devices. We called the on-call expert for the BCM, who checked the situation and fixed the issue. Even though the problem seemed to be solved, I stayed in the control room until 5 am, because I was worried that something else could happen.

    What do you like most about this role?

    Certainly this: the fact that you need to keep different kinds of issues under control and solve them. In addition, you have to give instructions and take decisions: this is quite challenging if you are not used to it. Actually, you start training in taking responsibility already when you are the shift leader. When you become run manager, you go a little step further. I spend a lot of time in the control room and, when I am at home, I continuously check the electronic logbook to know how the run is proceeding. When I wake up in the morning, the first thing I do – even before standing up – is check online the status of the accelerator, to know if it is working, and of the experiment.

    It sounds a bit stressful…

    Well, it can be stressful sometimes, indeed. In particular because you have to be ready and react quickly; but, actually, I am finding it easier this week, since it is my second time as run manager.

    You can count on the run coordinator anyway, right?

    Sure. But we call her only if something very important happens. For normal issues, such as a shift leader having some doubts about the operations to perform, the run manager takes on the responsibility. Certainly, it is important to know what the most common issues are. That is why, before starting my first shift, I overlapped with the previous run manager for some days.

    What’s your main field of interest?

    I work on the analysis of the φ in Pb-Pb collisions. An article on this topic based on data at 2.76 TeV is in preparation and now we are analyzing data from collisions at 5 TeV. I am quite specialized in this topic.

    Would you like to change topic to do something different?

    Yes, why not?

    Actually, when I was taking my first steps in analysis, I did some studies on the Υ, but they were based on simulations only, so it was more of an exercise than a real analysis.

    Anyway, I will see. I will have to evaluate the opportunities.

    What are your plans for the future?

    My postdoctoral contract at INFN will come to an end soon, so I will have to look for another job. I would prefer to stay in Cagliari, but I am also considering the possibility of gaining experience in another country.

    Where? Or where absolutely not?

    Well, preferably in Europe, but not necessarily. Certainly I would avoid cold places… [She laughs].

    Would you like to teach?

    I don’t know. I have been a tutor for two courses at the University, which means that I helped the professor with the laboratory lessons. It was an interesting experience, but I am not particularly attracted to teaching, mainly because it takes a lot of time to prepare classes and find the right way to explain complex topics.

    Thus, I guess you would prefer to work for a Laboratory, as you are doing at INFN?

    Ideally yes, I would prefer to focus only on research.

    Nevertheless, I don’t exclude the academic career either. I think that I can enjoy part of the process of training students, even though I think it can be hard and tiring.

    What are your interests outside work?

    Well, my main hobby is breeding dogs. I raise them and make them compete in dog shows, which are dog beauty contests. [She laughs.]

    How many dogs do you have?

    I have three at my place, in Cagliari. Three more are looked after by some friends of mine but I make them participate in competitions as well.

    I get a litter of puppies once every three years and I keep some of them. They are all Italian Greyhounds with pedigree. I own the mother and select a father when I decide to have new puppies. [She laughs again.]

    What moves you to do this?

    I love them. I have even created the worldwide online database of Italian Greyhounds, which didn’t exist before. I started it by myself, then I got some help from three other breeders in the US and France. We have registered about 60,000 dogs. Unfortunately, we could only go back as far as the end of the 19th century. Lately, the national dog clubs have been putting information online, but in order to collect old data I had to rely on the original documentation. So, I personally went to the headquarters of the Italian National Dog Institution (ENCI) in Milan and photocopied all the certificates they have, from 1912 up to now.

    This is cool, but why did you do it?

    See the full article here.


     
  • richardmitnick 7:12 am on October 3, 2017 Permalink | Reply
    Tags: Particle Accelerators, SESAME - also known as the International Centre for Synchrotron-Light for Experimental Science and Applications

    From Symmetry: “Shining with possibility” 


    Symmetry

    09/26/17
    Signe Brewster

    As Jordan-based SESAME nears its first experiments, members are connecting in new ways.

    Artwork by Ana Kova


    SESAME Particle Accelerator, Jordan campus, an independent laboratory located in Allan in the Balqa governorate of Jordan

    Early in the morning, physicist Roy Beck Barkai boards a bus in Tel Aviv bound for Jordan. By 10:30 a.m., he is on site at SESAME, a new scientific facility where scientists plan to use light to study everything from biology to archaeology. He is back home by 7 p.m., in time to have dinner with his children.

    Before SESAME opened, the closest facility like it was in Italy. Beck Barkai often traveled for two days by airplane, train and taxi for a day or two of work—an inefficient and expensive process that limited his ability to work with specialized equipment from his home lab and required him to spend days away from his family.

    “For me, having the ability to kiss them goodbye in the morning and just before they went to sleep at night is a miracle,” Beck Barkai says. “It felt like a dream come true. Having SESAME at our doorstep is a big plus.”

    SESAME, also known as the International Centre for Synchrotron-Light for Experimental Science and Applications in the Middle East, opened its doors in May and is expected to host its first beams of synchrotron light this year. Scientists from around the world will be able to apply for time to use the facility’s powerful light source for their experiments. It’s the first synchrotron in the region.

    Beck Barkai says SESAME provides a welcome dose of convenience, as scientists in the region can now drive to a research center instead of flying with sensitive equipment to another country. It’s also more cost-effective.

    Located in Jordan to the northwest of the city of Amman, SESAME was built by a collaboration made up of Cyprus, Egypt, Iran, Israel, Jordan, Pakistan, Turkey and the Palestinian Authority—a partnership members hope will improve relations among the eight neighbors.

    “SESAME is a very important step in the region,” says SESAME Scientific Advisory Committee Chair Zehra Sayers. “The language of science is objective. It’s based on curiosity. It doesn’t need to be affected by the differences in cultural and social backgrounds. I hope it is something that we will leave the next generations as a positive step toward stability.”


    Protein researcher and University of Jordan professor Areej Abuhammad says she hopes SESAME will provide an environment that encourages collaboration.

    “I think through having the chance to interact, the scientists from around this region will learn to trust and respect each other,” she says. “I don’t think that this will result in solving all the problems in the region from one day to the next, but it will be a big step forward.”

    The $100 million center is a state-of-the-art research facility that should provide some relief to scientists seeking time at other, overbooked facilities. SESAME plans to eventually host 100 to 200 users at a time.

    SESAME’s first two beamlines will open later this year. About twice per year, SESAME will announce calls for research proposals, the next of which is expected for this fall. Sayers says proposals will be evaluated for originality, preparedness and scientific quality.

    Groups of researchers hoping to join the first round of experiments submitted more than 50 applications. Once the lab is at full operation, Sayers says, the selection committee expects to receive four to five times more than that.

    Opening up a synchrotron in the Middle East means that more people will learn about these facilities and have a chance to use them. Because some scientists in the region are new to using synchrotrons or writing the style of applications SESAME requires, Sayers asked the selection committee to provide feedback with any rejections.

    Abuhammad is excited for the learning opportunity SESAME presents for her students—and for the possibility that experiences at SESAME will spark future careers in science.

    She plans to apply for beam time at SESAME to conduct protein crystallography, a field that involves peering inside proteins to learn about their function and aid in pharmaceutical drug discovery.

    Another scientist vying for a spot at SESAME is Iranian chemist Maedeh Darzi, who studies the materials of ancient manuscripts and how they degrade. Synchrotrons are of great value to archaeologists because they minimize the damage to irreplaceable artifacts. Instead of cutting them apart, scientists can take a less damaging approach by probing them with particles.

    Darzi sees SESAME as a chance to collaborate with scientists from the Middle East and to promote science, peace and friendship. For her and others, SESAME could be a place where particles put things back together.

    See the full article here.



     
  • richardmitnick 8:05 pm on September 25, 2017 Permalink | Reply
    Tags: DM axions, Particle Accelerators, The origin of solar flares

    From CERN Courier: “Study links solar activity to exotic dark matter” 


    CERN Courier

    Solar-flare distributions

    The origin of solar flares, powerful bursts of radiation appearing as sudden flashes of light, has puzzled astrophysicists for more than a century. The temperature of the Sun’s corona, measuring several hundred times hotter than its surface, is also a long-standing enigma.

    A new study suggests that the solution to these solar mysteries is linked to a local action of dark matter (DM). If true, it would challenge the traditional picture of DM as being made of weakly interacting massive particles (WIMPs) or axions, and it would suggest that DM is not uniformly distributed in space, as is usually assumed.

    The study is not based on new experimental data. Rather, lead author Sergio Bertolucci, a former CERN research director, and collaborators base their conclusions on freely available data recorded over a period of decades by geosynchronous satellites. The paper presents a statistical analysis of the occurrences of around 6500 solar flares in the period 1976–2015 and of the continuous solar emission in the extreme ultraviolet (EUV) in the period 1999–2015. The temporal distribution of these phenomena, finds the team, is correlated with the positions of the Earth and two of its neighbouring planets: Mercury and Venus. Statistically significant (above 5σ) excesses of the number of flares with respect to randomly distributed occurrences are observed when one or more of the three planets find themselves in a slice of the ecliptic plane with heliocentric longitudes of 230°–300°. Similar excesses are observed in the same range of longitudes when the solar irradiance in the EUV region is plotted as a function of the positions of the planets.
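    The null hypothesis behind such a significance test is easy to state: if flare times were uncorrelated with planetary positions, a given planet would sit inside the 230°–300° slice a fraction 70/360 ≈ 19.4% of the time, and the flare count in the slice would follow binomial statistics. A toy version of the test (the observed count below is hypothetical, invented purely for illustration; only the roughly 6500-flare total comes from the article):

```python
import math

slice_deg = 300 - 230      # width of the heliocentric-longitude window, degrees
p_null = slice_deg / 360   # chance a randomly timed flare lands in it

n_flares = 6500            # total flares analysed (from the article)
observed = 1450            # HYPOTHETICAL in-window count, for illustration only

expected = n_flares * p_null
sigma = math.sqrt(n_flares * p_null * (1 - p_null))  # binomial standard deviation
z = (observed - expected) / sigma
print(f"expected {expected:.0f} +/- {sigma:.0f} in the window; "
      f"observing {observed} would be a {z:.1f} sigma excess")
```

    The paper’s actual analysis is of course more involved (it accounts for the solar cycle and correlated planetary positions), but this is the basic logic of quoting an excess “above 5σ”.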

    These results suggest that active-Sun phenomena are not randomly distributed, but instead are modulated by the positions of the Earth, Venus and Mercury. One possible explanation, says the team, is the existence of a stream of massive DM particles with a preferred direction, coplanar with the ecliptic plane, that is gravitationally focused by the planets towards the Sun when one or more of the planets enter the stream. Such particles would need to have a wide velocity spectrum centred around 300 km s⁻¹ and interact with ordinary matter much more strongly than typical DM candidates such as WIMPs. The non-relativistic velocities of such DM candidates make planetary gravitational lensing more efficient and can enhance the flux of the particles by up to a factor of 10⁶, according to the team.

    Co-author Konstantin Zioutas, spokesperson for the CAST experiment at CERN, accepts that this interpretation of the solar and planetary data is speculative – particularly regarding the mechanism by which a temporarily increased influx of DM actually triggers solar activity.

    [Image: CERN CAST Axion Solar Telescope]

    However, he says, the long-standing failure to detect the ubiquitous DM might be due to the widely assumed small cross-section of its constituents with ordinary matter, or to erroneous DM modelling. “Hence, the so-far-adopted direct-detection concepts can lead us towards a dead end, and we might find that we have overlooked a continuous communication between the dark and the visible sector.”

    Models of massive DM streaming particles that interact strongly with normal matter are few and far between, although the authors suggest that “antiquark nuggets” are best suited to explain their results. “In a few words, there is a large ‘hidden’ energy in the form of the nuggets,” says Ariel Zhitnitsky, who first proposed the quark-nugget dark-matter model in 2003. “In my model, this energy can be precisely released in the form of the EUV radiation when the anti-nuggets enter the solar corona and get easily annihilated by the light elements present in such a highly ionised environment.”

    The study calls for further investigation, researchers say. “It seems that the statistical analysis of the paper is accurate and the obtained results are rather intriguing,” says Rita Bernabei, spokesperson of the DAMA experiment, which in 1998 made the first claim of a dark-matter detection, in the form of WIMPs, based on an observed seasonal modulation of the signal in its scintillation detector.

    [Image: DAMA-LIBRA at Gran Sasso]

    “However, the paper appears to be mostly hypothetical in terms of this new type of dark matter.”

    The team now plans to produce a full simulation of planetary lensing taking into account the simultaneous effect of all the planets in the solar system, and to extend the analysis to include sunspots, nano-flares and other solar observables. CAST, the axion solar telescope at CERN, will also dedicate a special data-taking period to the search for streaming DM axions.

    “If true, our findings will provide a totally different view about dark matter, with far-reaching implications in particle and astroparticle physics,” says Zioutas. “Perhaps the demystification of the Sun could lead to a dark-matter solution also.”

    Further reading

    S. Bertolucci et al. 2017 Phys. Dark Universe 17 13 (Elsevier).

    http://www.elsevier.com/locate/dark

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition
    THE FOUR MAJOR PROJECT COLLABORATIONS: ATLAS, ALICE, CMS and LHCb

    [Images: CERN LHC map, Grand Tunnel, particles]
     
  • richardmitnick 3:36 pm on September 22, 2017 Permalink | Reply
    Tags: , ATLAS hunts for new physics with dibosons, , , , Particle Accelerators,   

    From CERN Courier: “ATLAS hunts for new physics with dibosons” 


    CERN Courier

    Sep 22, 2017

    [Figure: WZ data]

    Beyond the Standard Model of particle physics (SM), crucial open questions remain, such as the nature of dark matter, the overabundance of matter compared to antimatter in the universe, and the mass scale of the scalar sector (what makes the Higgs boson so light?). Theorists have extended the SM with new symmetries or forces that address these questions, and many such extensions predict new resonances that can decay into a pair of bosons (diboson), for example: VV, Vh, Vγ and γγ, where V stands for a weak boson (W and Z), h for the Higgs boson, and γ is a photon.

    The ATLAS collaboration has a broad search programme for diboson resonances, and the most recent results, using 36 fb⁻¹ of proton–proton collision data taken at the LHC at a centre-of-mass energy of 13 TeV in 2015 and 2016, have now been released. Six different final states characterised by different boson decay modes were considered in searches for a VV resonance: 4ℓ, ℓℓνν, ℓℓqq, ℓνqq, ννqq and qqqq, where ℓ, ν and q stand for charged leptons (electrons and muons), neutrinos and quarks, respectively. For the Vh resonance search, the dominant Higgs boson decay into a pair of b-quarks (branching fraction of 58%) was exploited together with four different V decays, leading to ℓℓbb, ℓνbb, ννbb and qqbb final states. A Zγ resonance was sought in final states with two leptons and a photon.

    A new resonance would appear as an excess (bump) over the smoothly distributed SM background in the invariant mass distribution reconstructed from the final-state particles. The left figure shows the observed WZ mass distribution in the qqqq channel together with simulations of some example signals. A key to probing very high-mass signals is identifying high-momentum, hadronically decaying V and h bosons. ATLAS developed a new technique to reconstruct the invariant mass of such bosons, combining information from the calorimeters and the central tracking detectors. The resulting improved mass resolution for reconstructed V and h bosons increased the sensitivity to very heavy signals.
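    The basic ingredient of such a bump hunt is textbook relativistic kinematics: the resonance mass is the invariant mass of its decay products' summed four-momenta. A minimal sketch, with toy numbers rather than ATLAS data:

    ```python
    import math

    def invariant_mass(p4s):
        """Invariant mass of a system of four-vectors (E, px, py, pz),
        all in GeV: m^2 = (sum E)^2 - |sum p|^2."""
        E = sum(p[0] for p in p4s)
        px = sum(p[1] for p in p4s)
        py = sum(p[2] for p in p4s)
        pz = sum(p[3] for p in p4s)
        m2 = E * E - (px * px + py * py + pz * pz)
        return math.sqrt(max(m2, 0.0))  # guard against negative rounding

    # Toy event: two back-to-back boson candidates with ~1 TeV momentum
    # each, one W-like (m ~ 80 GeV) and one Z-like (m ~ 91 GeV).
    w = (1003.2, 1000.0, 0.0, 0.0)
    z = (1004.1, -1000.0, 0.0, 0.0)
    print(f"m(WZ) = {invariant_mass([w, z]):.0f} GeV")
    ```

    In the qqqq channel each boson four-vector is itself reconstructed from a large-radius jet, which is where the calorimeter-plus-tracking mass resolution mentioned above matters.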

    No evidence for a new resonance was observed in these searches, allowing ATLAS to set stringent exclusion limits. For example, a graviton signal predicted in a model with extra spatial dimensions was excluded up to masses of 4 TeV, while heavy weak-boson-like resonances (as predicted in composite Higgs boson models) decaying to WZ bosons are excluded for masses up to 3.3 TeV. Heavier Higgs partners can be excluded up to masses of about 350 GeV, assuming specific model parameters.

    See the full article here.

     